Your OpenClaw still has "goldfish memory"?
Session ends → forgets everything. Cross-agent → memory lost. Context explodes → 💸 token bill skyrockets. hawk-bridge gives your AI persistent memory: autoCapture + autoRecall, zero manual work.
English | 中文 | 繁體中文 | 日本語 | 한국어 | Français | Español | Deutsch | Italiano | Русский | Português (Brasil)
AI agents forget everything after each session. hawk-bridge bridges OpenClaw's hook system with hawk's Python memory, giving agents a persistent, self-improving memory that works automatically:
- Every response → hawk extracts and stores meaningful memories
- Every new session → hawk injects relevant memories before thinking begins
- No manual operation → it just works
Without hawk-bridge:
User: "I prefer concise replies, not paragraphs" Agent: "Sure thing!" β (next session β agent forgets again)
With hawk-bridge:
User: "I prefer concise replies" Agent: stored as
preference:communicationβ (next session β injected automatically, applies immediately)
| Scenario | ❌ Without hawk-bridge | ✅ With hawk-bridge |
|---|---|---|
| New session starts | Blank: knows nothing about you | ✅ Injects relevant memories automatically |
| User repeats a preference | "I told you before..." | Remembers from session 1 |
| Long task runs for days | Restart = start over | Task state persists, resumes seamlessly |
| Context gets large | Token bill skyrockets 💸 | 5 compression strategies keep it lean |
| Duplicate info | Same fact stored 10 times | SimHash dedup → stored once |
| Memory recall | All similar, redundant injection | MMR diverse recall → no repetition |
| Memory management | Everything piles up forever | 4-tier decay → noise fades, signal stays |
| Self-improvement | Repeats the same mistakes | importance + access_count tracking → smart promotion |
| Multi-agent team | Each agent starts fresh, no shared context | Shared LanceDB → all agents learn from each other |
Without hawk-bridge: AI agents forget everything across sessions and across agents, and spend too much on LLM tokens.
With hawk-bridge: Persistent memory, shared context, and lower costs.
| Pain Point | ❌ Without | ✅ With hawk-bridge |
|---|---|---|
| AI forgets everything after session ends | ❌ New session starts blank | ✅ Cross-session memory injection |
| Team context lost | ❌ Each agent starts fresh | ✅ Shared LanceDB, all agents access same memories |
| Multiple agents repeat same mistakes | ❌ Agent A doesn't know Agent B's decisions | ✅ Memory is shared, not siloed |
| LLM costs spiral out of control | ❌ Unlimited context growth, 💸 token bills explode | ✅ Compression + dedup + MMR shrinks context |
| Context overflow / token limit hit | ❌ Session history grows until crash | ✅ Auto-pruning + 4-tier decay keeps context lean |
| Important decisions forgotten | ❌ Only in old session, lost forever | ✅ Stored in LanceDB with importance scoring |
| Duplicate memories pile up | ❌ Same info stored many times | ✅ SimHash dedup, 64-bit fingerprint |
| Repetitive recall | ❌ "Tell me about X" → 5 similar memories injected | ✅ MMR ensures diverse, non-repeating injection |
| No self-improving memory | ❌ Nothing gets better over time | ✅ importance + access_count tracking → smart promotion |
Problem 1: Session context window limits. Context has a token limit (e.g. 32k); long history crowds out important content. → hawk-bridge compresses and archives, injecting only the most relevant memories.

Problem 2: AI forgets across sessions. When a session ends, its context disappears and the next conversation starts fresh. → hawk-recall injects memories from LanceDB before every new session.

Problem 3: Multiple agents share nothing. Agent A knows nothing about Agent B's context; decisions made by one agent are invisible to others. → Shared LanceDB memory: all agents read and write the same store, no silos.

Problem 4: Context grows too large before it is sent to the LLM. Recall without optimization produces a large, repetitive context. → After compression + SimHash dedup + MMR, the context is much smaller before the LLM is called, saving tokens and cost.

Problem 5: Memory never self-manages. Without hawk-bridge, all messages pile up in session history until the context overflows. → hawk-capture auto-extracts to LanceDB: unimportant memories are deleted, important ones are promoted to long-term.
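To make Problem 5 concrete, here is a minimal Python sketch of what one 4-tier maintenance pass could look like. The tier names follow this README (Working → Short → Long → Archive), but the `Memory` class, the scoring formula, and the threshold are illustrative assumptions, not context-hawk's actual implementation.

```python
from dataclasses import dataclass

# Illustrative tiers, ordered from most volatile to fully faded.
TIERS = ["working", "short", "long", "archive"]

@dataclass
class Memory:
    text: str
    importance: float   # 0.0-1.0, assigned at extraction time
    access_count: int   # incremented whenever recall injects this memory
    tier: str = "working"

def decay_step(mem: Memory, keep_threshold: float = 0.8) -> Memory:
    """One maintenance pass: promote signal, fade noise."""
    score = mem.importance + 0.1 * mem.access_count
    if score >= keep_threshold:
        # Important and frequently recalled: promote to long-term.
        if mem.tier != "archive":
            mem.tier = "long"
    else:
        # Rarely used: fade one tier toward archive.
        idx = TIERS.index(mem.tier)
        if idx < len(TIERS) - 1:
            mem.tier = TIERS[idx + 1]
    return mem
```

Under these assumed numbers, a stored preference with importance 0.9 that has been recalled three times is promoted to `long`, while an untouched scratch note fades from `working` to `short`.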
Session (persistent, on disk)
  │
  ├──► History messages
  │
  ▼
Context Assembly (in memory)
  │
  ├──► hawk-recall injects memories  ◄── from LanceDB
  │
  ├──► Skills descriptions
  ├──► Tools list
  ├──► System Prompt
  │
  ▼
LLM Reply
  │
  ▼
hawk-capture extracts ──► stored in LanceDB
How it works:
- Every response → `hawk-capture` extracts meaningful content → saves to LanceDB
- Every new session → `hawk-recall` retrieves relevant memories → injects into context
- Old memories → auto-managed via 4-tier decay (Working → Short → Long → Archive)
- Duplicate memories → SimHash dedup prevents storage waste
- Redundant recall → MMR ensures diverse, non-repetitive injection
| # | Feature | Description |
|---|---|---|
| 1 | Auto-Capture Hook | message:sent → hawk extracts 6 categories of memories automatically |
| 2 | Auto-Recall Hook | agent:bootstrap → hawk injects relevant memories before the first response |
| 3 | Hybrid Retrieval | BM25 + vector search + RRF fusion, no API key required for baseline |
| 4 | Zero-Config Fallback | Works out of the box in BM25-only mode, no API keys needed |
| 5 | 5 Embedding Providers | Ollama (local) / sentence-transformers (CPU) / Jina AI (free API) / OpenAI / Minimax |
| 6 | Graceful Degradation | Automatically falls back when API keys are unavailable |
| 7 | Context-Aware Injection | BM25 rank score used directly when no embedder is available |
| 8 | Seed Memory | Pre-populated with generic AI agent team concepts; customize after install |
| 9 | Sub-100ms Recall | LanceDB ANN index for instant retrieval |
| 10 | Cross-Platform Install | One command, works on Ubuntu/Debian/Fedora/Arch/Alpine/openSUSE |
| 11 | SimHash Auto-Dedup | 64-bit fingerprint dedup prevents duplicate memories from being stored |
| 12 | MMR Diverse Recall | Maximal Marginal Relevance: relevant AND diverse, reduces context size |
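Feature 11's SimHash dedup can be illustrated with a self-contained Python sketch. The details here (MD5-derived 64-bit token hashes, a Hamming-distance threshold of 3) are assumptions for illustration; context-hawk's actual tokenization and threshold may differ:

```python
import hashlib

def simhash(text: str) -> int:
    """64-bit SimHash: near-duplicate texts yield fingerprints
    that differ in only a few bits."""
    v = [0] * 64
    for token in text.lower().split():
        # 64-bit hash per token (first 8 bytes of MD5, for illustration).
        h = int.from_bytes(hashlib.md5(token.encode()).digest()[:8], "big")
        for bit in range(64):
            v[bit] += 1 if (h >> bit) & 1 else -1
    # Majority vote per bit position produces the fingerprint.
    return sum(1 << bit for bit in range(64) if v[bit] > 0)

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_duplicate(a: str, b: str, max_distance: int = 3) -> bool:
    """Treat two memories as duplicates if their fingerprints are close."""
    return hamming(simhash(a), simhash(b)) <= max_distance
```

Storing only the 64-bit fingerprint makes the duplicate check a cheap XOR + popcount instead of a full-text or embedding comparison.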
┌─────────────────────────────────────────┐
│            OpenClaw Gateway             │
├─────────────────────────────────────────┤
│ agent:bootstrap          message:sent   │
└────────┬──────────────────────┬─────────┘
         └──────────┬───────────┘
                    ▼
┌─────────────────────────────────────────┐
│ hawk-recall (before first response)     │
│ injects relevant memories into context  │
└───────────────────┬─────────────────────┘
                    ▼
┌─────────────────────────────────────────┐
│ LanceDB                                 │
│ Vector search + BM25 + RRF fusion       │
└───────────────────┬─────────────────────┘
                    ▼
┌─────────────────────────────────────────┐
│ context-hawk (Python)                   │
│ MemoryManager + Extractor               │
│ Extraction / scoring / decay            │
└─────────────────────────────────────────┘
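The "BM25 + RRF fusion" box can be sketched generically. Reciprocal Rank Fusion merges ranked lists using only ranks, so BM25 and vector scores never need to be on a common scale. This is the standard formulation with the conventional constant k = 60; hawk-bridge's exact constant is an assumption here:

```python
def rrf_fuse(rankings, k=60):
    """Merge ranked lists (best first) by summing 1 / (k + rank + 1)
    for every list a document appears in."""
    scores = {}
    for ranked in rankings:
        for rank, doc_id in enumerate(ranked):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    # Highest fused score first.
    return sorted(scores, key=scores.get, reverse=True)

bm25_hits   = ["mem-a", "mem-b", "mem-c"]   # keyword ranking
vector_hits = ["mem-b", "mem-c", "mem-d"]   # embedding ranking
fused = rrf_fuse([bm25_hits, vector_hits])
# fused == ["mem-b", "mem-c", "mem-a", "mem-d"]: mem-b ranks highly in both lists
```

Documents that appear near the top of both rankings win, which is why hybrid retrieval beats either signal alone.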
Choose the method that works best for you:
# Most convenient: one command
clawhub install hawk-bridge

# or via OpenClaw
openclaw skills install hawk-bridge

→ Auto-updates, easy to manage, no manual setup

# Downloads and runs the install script automatically
bash <(curl -fsSL https://github.com/relunctance/hawk-bridge/master/install.sh)

→ Works on all Linux distros, fully automatic

git clone https://github.com/relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge
npm install && npm run build
# Then add to openclaw.json:
openclaw plugins install /tmp/hawk-bridge

→ Full control, for advanced users

- Open OpenClaw dashboard → Skills → Browse
- Search for "hawk-bridge"
- Click Install

→ No command line needed
That's it. The installer handles:
| Step | What it does |
|---|---|
| 1 | Detects and installs Node.js, Python3, git, curl |
| 2 | Installs npm dependencies (lancedb, openai) |
| 3 | Installs Python packages (lancedb, rank-bm25, sentence-transformers) |
| 4 | Clones context-hawk workspace into ~/.openclaw/workspace/context-hawk |
| 5 | Creates ~/.openclaw/hawk symlink |
| 6 | Installs Ollama (if not present) |
| 7 | Pulls nomic-embed-text embedding model |
| 8 | Builds TypeScript hooks and seeds initial memories |
Supported distros: Ubuntu · Debian · Fedora · CentOS · Arch · Alpine · openSUSE
If you prefer to install manually instead of using the one-command script:
Ubuntu / Debian
# 1. System deps
sudo apt-get update && sudo apt-get install -y nodejs npm python3 python3-pip git curl
# 2. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge
# 3. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers --break-system-packages
# 4. Ollama (optional)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull nomic-embed-text
# 5. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk
# 6. npm + build
npm install && npm run build
# 7. Seed memory
node dist/seed.js
# 8. Activate
openclaw plugins install /tmp/hawk-bridge

Fedora / RHEL / CentOS / Rocky / AlmaLinux
# 1. System deps
sudo dnf install -y nodejs npm python3 python3-pip git curl
# 2. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge
# 3. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers --break-system-packages
# 4. Ollama (optional)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull nomic-embed-text
# 5. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk
# 6. npm + build
npm install && npm run build
# 7. Seed memory
node dist/seed.js
# 8. Activate
openclaw plugins install /tmp/hawk-bridge

Arch / Manjaro / EndeavourOS
# 1. System deps
sudo pacman -Sy --noconfirm nodejs npm python python-pip git curl
# 2. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge
# 3. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers --break-system-packages
# 4. Ollama (optional)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull nomic-embed-text
# 5. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk
# 6. npm + build
npm install && npm run build
# 7. Seed memory
node dist/seed.js
# 8. Activate
openclaw plugins install /tmp/hawk-bridge

Alpine
# 1. System deps
apk add --no-cache nodejs npm python3 py3-pip git curl
# 2. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge
# 3. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers --break-system-packages
# 4. Ollama (optional)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull nomic-embed-text
# 5. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk
# 6. npm + build
npm install && npm run build
# 7. Seed memory
node dist/seed.js
# 8. Activate
openclaw plugins install /tmp/hawk-bridge

openSUSE / SUSE Linux Enterprise
# 1. System deps
sudo zypper install -y nodejs npm python3 python3-pip git curl
# 2. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge
# 3. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers --break-system-packages
# 4. Ollama (optional)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull nomic-embed-text
# 5. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk
# 6. npm + build
npm install && npm run build
# 7. Seed memory
node dist/seed.js
# 8. Activate
openclaw plugins install /tmp/hawk-bridge

macOS
# 1. Install Homebrew (if not present)
/bin/bash -c "$(curl -fsSL https://github.com/Homebrew/install/HEAD/install.sh)"
# 2. System deps
brew install node python git curl
# 3. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge
# 4. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers
# 5. Ollama (optional)
brew install ollama
ollama pull nomic-embed-text
# 6. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk
# 7. npm + build
npm install && npm run build
# 8. Seed memory
node dist/seed.js
# 9. Activate
openclaw plugins install /tmp/hawk-bridge

Note: `pip install --break-system-packages` is required on Linux to bypass PEP 668. The Ollama install script auto-detects macOS and uses Homebrew if available.
After install, choose your embedding mode, all via environment variables:

# ① sentence-transformers (local CPU, no API key needed)
export USE_LOCAL_EMBEDDING=1

# ② Ollama local GPU (recommended for quality: free, no API key)
export OLLAMA_BASE_URL=http://localhost:11434

# ③ Jina AI free tier (requires free API key from jina.ai)
export JINA_API_KEY=your_free_key
# ⚠️ Proxy required in China: set an HTTP/SOCKS proxy
export HTTPS_PROXY=http://YOUR_PROXY_HOST:PORT

# ④ BM25-only fallback (default: keyword search only, no embedding)
# No environment variables needed

Jina AI offers a generous free tier, no credit card required:
- Register at https://jina.ai/ (GitHub login supported)
- Get Key: go to https://jina.ai/settings/ → API Keys → Create API Key
- Copy Key: it starts with `jina_`
- Configure: export `JINA_API_KEY` as shown above

⚠️ Important: Jina AI requires a proxy in China (api.jina.ai is blocked). Set `HTTPS_PROXY` to your proxy URL (e.g. `http://192.168.1.109:10808`).
For best results with Jina, create ~/.hawk/config.json:
{
"openai_api_key": "jina_YOUR_KEY_HERE",
"embedding_model": "jina-embeddings-v3",
"embedding_dimensions": 1024,
"base_url": "https://api.jina.ai/v1",
"proxy": "http://YOUR_PROXY_HOST:PORT"
}

| Field | Description |
|---|---|
| `openai_api_key` | Your Jina API key (starts with `jina_`) |
| `embedding_model` | Model name: `jina-embeddings-v3` (recommended) |
| `embedding_dimensions` | Vector size: 1024 for `jina-embeddings-v3` |
| `base_url` | Must be `https://api.jina.ai/v1` |
| `proxy` | HTTP proxy URL (required in China) |
Then register the plugin in openclaw.json:

{
"plugins": {
"load": {
"paths": ["/tmp/hawk-bridge"]
},
"allow": ["hawk-bridge"]
}
}

No API keys in config files: environment variables only.
| Mode | Provider | API Key | Quality | Speed |
|---|---|---|---|---|
| BM25-only | Built-in | ❌ | ⭐⭐ | ⚡⚡⚡ |
| sentence-transformers | Local CPU | ❌ | ⭐⭐⭐ | ⚡⚡ |
| Ollama | Local GPU | ❌ | ⭐⭐⭐⭐ | ⚡⚡⚡⚡ |
| Jina AI | Cloud | ✅ free | ⭐⭐⭐⭐ | ⚡⚡⚡⚡ |
| Minimax | Cloud | ✅ | ⭐⭐⭐⭐⭐ | ⚡⚡⚡⚡⚡ |
Default: BM25-only, which works immediately with zero configuration.
- Has `OLLAMA_BASE_URL`? → full hybrid: vector + BM25 + RRF
- Has `USE_LOCAL_EMBEDDING=1`? → sentence-transformers + BM25 + RRF
- Has `JINA_API_KEY`? → Jina embeddings + BM25 + RRF
- Has `MINIMAX_API_KEY`? → Minimax embeddings + BM25 + RRF
- Nothing configured? → BM25-only (pure keyword, no API calls)

No API key = no crash: graceful degradation.
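The fallback chain amounts to a first-match check over environment variables. A minimal Python sketch (the real detection lives in hawk-bridge's `config.ts`; the variable names are taken from the chain above):

```python
import os

def pick_embedding_mode(env=None):
    """First matching environment variable wins; nothing set -> BM25-only."""
    env = os.environ if env is None else env
    if env.get("OLLAMA_BASE_URL"):
        return "ollama"
    if env.get("USE_LOCAL_EMBEDDING") == "1":
        return "sentence-transformers"
    if env.get("JINA_API_KEY"):
        return "jina"
    if env.get("MINIMAX_API_KEY"):
        return "minimax"
    return "bm25-only"
```

Because the final branch never raises, a missing or revoked API key degrades to keyword search instead of crashing the hook.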
On first install, 11 foundational memories are seeded automatically:
- Team structure (main/wukong/bajie/bailong/tseng roles)
- Collaboration norms (GitHub inbox → done workflow)
- Project context (hawk-bridge, qujingskills, gql-openclaw)
- Communication preferences
- Operating principles
These ensure hawk-recall has something to inject from day one.
hawk-bridge/
├── README.md
├── LICENSE
├── install.sh             # One-command installer (curl | bash)
├── package.json
├── openclaw.plugin.json   # Plugin manifest + configSchema
├── src/
│   ├── index.ts           # Plugin entry point
│   ├── config.ts          # OpenClaw config reader + env detection
│   ├── lancedb.ts         # LanceDB wrapper
│   ├── embeddings.ts      # 5 embedding providers
│   ├── retriever.ts       # Hybrid search (BM25 + vector + RRF)
│   ├── seed.ts            # Seed memory initializer
│   └── hooks/
│       ├── hawk-recall/   # agent:bootstrap hook
│       │   ├── handler.ts
│       │   └── HOOK.md
│       └── hawk-capture/  # message:sent hook
│           ├── handler.ts
│           └── HOOK.md
└── python/                # context-hawk (installed by install.sh)
| Spec | Details |
|---|---|
| Runtime | Node.js 18+ (ESM), Python 3.12+ |
| Vector DB | LanceDB (local, serverless) |
| Retrieval | BM25 + ANN vector search + RRF fusion |
| Embedding | Ollama / sentence-transformers / Jina AI / OpenAI / Minimax |
| Hook Events | agent:bootstrap (recall), message:sent (capture) |
| Dependencies | Zero hard dependencies; all optional with auto-fallback |
| Persistence | Local filesystem, no external DB required |
| License | MIT |
| | hawk-bridge | context-hawk |
|---|---|---|
| Role | OpenClaw hook bridge | Python memory library |
| What it does | Triggers hooks, manages lifecycle | Memory extraction, scoring, decay |
| Interface | TypeScript hooks → LanceDB | Python MemoryManager, VectorRetriever |
| Installs | npm packages, system deps | Cloned into ~/.openclaw/workspace/ |
They work together: hawk-bridge decides when to act, context-hawk handles how.
- context-hawk: Python memory library
- gql-openclaw: Team collaboration workspace
- qujingskills: Laravel development standards