A self-improving, modular AI suite for local model hosting, runtime learning, and offline dataset curation.
> **Note**
> Heidi-CLI has evolved into the Unified Learning Suite. Legacy agent orchestration features (Copilot SDK, Jules, and OpenWebUI integrations) have been removed to focus on a local-first learning loop.
Heidi now provides a complete feedback loop for local AI development:
- Multi-Model Host: Serve multiple local LLMs via an OpenAI-compatible API.
- Runtime Learning: Models gain experience through memory, reflection, and reward scoring.
- Data Pipeline: Raw runs are captured, redacted, and curated into training datasets.
- Self-Improvement: Background retraining, evaluation gates, and atomic hot-swapping.
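Atomic hot-swapping of a promoted model is typically done with an atomic rename, so readers never observe a half-promoted state. A minimal sketch of that pattern (the `promote` helper and its paths are illustrative, not Heidi's actual API):

```python
import os

def promote(model_dir: str, current_link: str) -> None:
    """Atomically repoint `current_link` at a newly promoted model.

    A symlink is created under a temporary name, then os.replace()
    swaps it into place atomically (on POSIX), so readers always see
    either the old or the new model directory, never a broken link.
    """
    tmp = f"{current_link}.tmp"
    if os.path.lexists(tmp):  # clean up any stale swap attempt
        os.remove(tmp)
    os.symlink(model_dir, tmp)
    os.replace(tmp, current_link)
```

Serving code that always resolves the `current` link therefore picks up a promotion without a restart.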
- `model_host/`: OpenAI-compatible multi-model routing.
- `runtime/`: Memory layers, reflection, and strategy ranking.
- `pipeline/`: Data capture, curation, and secret redaction.
- `registry/`: Versioning, eval gates, and promotion logic.
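Secret redaction in a pipeline like this is commonly pattern-based: scrub anything that matches a known secret shape before a run is curated into a dataset. A minimal sketch under that assumption (the patterns below are illustrative, not Heidi's actual rule set):

```python
import re

# Illustrative patterns only; a real pipeline's rules are project-specific.
SECRET_PATTERNS = [
    (re.compile(r"sk-[A-Za-z0-9]{20,}"), "[REDACTED_API_KEY]"),
    (re.compile(r"(?i)bearer\s+[A-Za-z0-9._\-]+"), "[REDACTED_TOKEN]"),
]

def redact(text: str) -> str:
    """Replace every substring matching a known secret pattern."""
    for pattern, replacement in SECRET_PATTERNS:
        text = pattern.sub(replacement, text)
    return text
```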
```shell
python -m pip install -e .
heidi status
```

This will initialize your persistent state root (default: `./state`).
```shell
heidi model serve
```

The host will listen on `http://127.0.0.1:8000` and provide:
- `GET /v1/models`
- `POST /v1/chat/completions`
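Assuming the host follows the standard OpenAI chat-completions request schema, a minimal stdlib-only client sketch (the model name `local-model` is a placeholder; use one returned by `GET /v1/models`):

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:8000/v1"

def build_chat_payload(prompt: str, model: str = "local-model") -> dict:
    """Build a standard chat-completions request body."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def chat(prompt: str, model: str = "local-model") -> str:
    """POST the payload and return the first completion's text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the API is OpenAI-compatible, any OpenAI client library pointed at `http://127.0.0.1:8000/v1` should work as well.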
Verify your installation and modular structure:
```shell
export PYTHONPATH=$PYTHONPATH:$(pwd)/src
python3 scripts/doctor.py
```

Heidi is building the future of local, autonomous AI learning.