🦅 hawk-bridge

Your OpenClaw still has "goldfish memory"?

Session ends → forgets everything. Cross-agent → memory lost. Context explodes → 💸 token bill skyrockets. hawk-bridge gives your AI persistent memory: autoCapture + autoRecall, zero manual work.

License: MIT · OpenClaw Compatible · Node.js · Python

English | 中文 | 繁體中文 | 日本語 | 한국어 | Français | Español | Deutsch | Italiano | Русский | Português (Brasil)


What does it do?

AI agents forget everything after each session. hawk-bridge bridges OpenClaw's hook system with hawk's Python memory, giving agents a persistent, self-improving memory that works automatically:

  • Every response → hawk extracts and stores meaningful memories
  • Every new session → hawk injects relevant memories before thinking begins
  • No manual operation: it just works

Without hawk-bridge:

User: "I prefer concise replies, not paragraphs" Agent: "Sure thing!" ✅ (next session: agent forgets again)

With hawk-bridge:

User: "I prefer concise replies" Agent: stored as preference:communication ✅ (next session: injected automatically, applies immediately)


❌ Without vs ✅ With hawk-bridge

| Scenario | ❌ Without hawk-bridge | ✅ With hawk-bridge |
|---|---|---|
| New session starts | Blank: knows nothing about you | Injects relevant memories automatically |
| User repeats a preference | "I told you before..." | Remembers from session 1 |
| Long task runs for days | Restart = start over | Task state persists, resumes seamlessly |
| Context gets large | Token bill skyrockets 💸 | 5 compression strategies keep it lean |
| Duplicate info | Same fact stored 10 times | SimHash dedup: stored once |
| Memory recall | All similar, redundant injection | MMR diverse recall, no repetition |
| Memory management | Everything piles up forever | 4-tier decay: noise fades, signal stays |
| Self-improvement | Repeats the same mistakes | importance + access_count tracking → smart promotion |
| Multi-agent team | Each agent starts fresh, no shared context | Shared LanceDB: all agents learn from each other |

🦅 What problem does it solve?

Without hawk-bridge: AI agents forget everything, across sessions and across agents, and spend too much on LLM tokens.

With hawk-bridge: persistent memory, shared context, and lower costs.

Pain Points hawk-bridge Solves

| Pain Point | ❌ Without | ✅ With hawk-bridge |
|---|---|---|
| AI forgets everything after session ends | New session starts blank | Cross-session memory injection |
| Team context lost | Each agent starts fresh | Shared LanceDB, all agents access the same memories |
| Multiple agents repeat the same mistakes | Agent A doesn't know Agent B's decisions | Memory is shared, not siloed |
| LLM costs spiral out of control | Unlimited context growth, token bills explode 💸 | Compression + dedup + MMR shrinks context |
| Context overflow / token limit hit | Session history grows until crash | Auto-pruning + 4-tier decay keeps context lean |
| Important decisions forgotten | Only in an old session, lost forever | Stored in LanceDB with importance scoring |
| Duplicate memories pile up | Same info stored many times | SimHash dedup, 64-bit fingerprint |
| Repetitive recall | "Tell me about X" → 5 similar memories injected | MMR ensures diverse, non-repeating injection |
| No self-improving memory | Nothing gets better over time | importance + access_count tracking → smart promotion |
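The SimHash dedup mentioned above can be sketched in a few lines. This is an illustrative Python version, not the actual context-hawk code; the token hashing scheme and the 3-bit Hamming threshold are assumptions:

```python
import hashlib

def simhash(text: str, bits: int = 64) -> int:
    """64-bit SimHash: near-duplicate texts get fingerprints
    that differ in only a few bits."""
    v = [0] * bits
    for token in text.lower().split():
        h = int.from_bytes(hashlib.md5(token.encode()).digest()[:8], "big")
        for i in range(bits):
            v[i] += 1 if (h >> i) & 1 else -1
    return sum(1 << i for i in range(bits) if v[i] > 0)

def is_duplicate(a: str, b: str, max_distance: int = 3) -> bool:
    """Treat two memories as duplicates when their fingerprints
    are within a small Hamming distance."""
    return bin(simhash(a) ^ simhash(b)).count("1") <= max_distance
```

An identical memory always lands at Hamming distance 0, so re-storing the same fact is rejected, while unrelated texts end up roughly 32 bits apart on average.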

hawk-bridge solves 5 core problems:

Problem 1: Session context window limits. Context has a token limit (e.g. 32k); long history crowds out important content. → hawk-bridge compresses/archives and injects only the most relevant memories.

Problem 2: AI forgets across sessions. When a session ends, its context disappears and the next conversation starts fresh. → hawk-recall injects memories from LanceDB before every new session.

Problem 3: Multiple agents share nothing. Agent A knows nothing about Agent B's context; decisions made by one agent are invisible to others. → Shared LanceDB memory: all agents read and write the same store. No silos.

Problem 4: Context grows too large before sending to the LLM. Recall without optimization yields a large, repetitive context. → After compression + SimHash dedup + MMR, the context is much smaller before the LLM is called, saving tokens and cost.

Problem 5: Memory never self-manages. Without hawk-bridge, all messages pile up in session history until the context overflows. → hawk-capture auto-extracts to LanceDB: unimportant memories are deleted, important ones are promoted to long-term.
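Problem 1's compress-and-inject step boils down to fitting scored memories into a token budget. Here is a naive greedy sketch (word count stands in for real token counting; the actual pipeline layers five compression strategies plus dedup and MMR on top):

```python
def trim_context(memories, budget):
    """Greedily keep the highest-scoring memories until the token
    budget is spent. `memories` is a list of (score, text) pairs;
    word count approximates token count."""
    kept, used = [], 0
    for score, text in sorted(memories, key=lambda m: m[0], reverse=True):
        cost = len(text.split())
        if used + cost <= budget:
            kept.append(text)
            used += cost
    return kept
```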


🔄 hawk-bridge in the Session/Context Lifecycle

Session (persistent, on disk)
    │
    └─► History messages
            │
            ▼
    Context Assembly (in memory)
            │
            ├──► hawk-recall injects memories  ← from LanceDB
            │
            ├──► Skills descriptions
            ├──► Tools list
            └──► System Prompt
                    │
                    ▼
                LLM Reply
                    │
                    ▼
            hawk-capture extracts → stored in LanceDB

How it works:

  1. Every response → hawk-capture extracts meaningful content → saves to LanceDB
  2. Every new session → hawk-recall retrieves relevant memories → injects into context
  3. Old memories → auto-managed via 4-tier decay (Working → Short → Long → Archive)
  4. Duplicate memories → SimHash dedup prevents storage waste
  5. Redundant recall → MMR ensures diverse, non-repetitive injection
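Step 3's tier demotion can be pictured as a clock on each memory. The thresholds and importance cutoff below are invented for illustration; only the Working → Short → Long → Archive ladder comes from this README:

```python
TIERS = ["working", "short", "long", "archive"]
# Hypothetical idle thresholds in hours, NOT context-hawk's real values
DEMOTE_AFTER_H = {"working": 1, "short": 24, "long": 24 * 30}

def decay(memory, now):
    """Demote a memory one tier once it has idled past its threshold;
    high-importance memories resist demotion (signal stays, noise fades)."""
    idle_h = (now - memory["last_access"]) / 3600
    tier = memory["tier"]
    if tier != "archive" and idle_h > DEMOTE_AFTER_H[tier]:
        if memory.get("importance", 0.0) < 0.8:
            memory["tier"] = TIERS[TIERS.index(tier) + 1]
    return memory
```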

✨ Core Features

| # | Feature | Description |
|---|---|---|
| 1 | Auto-Capture | Hook message:sent → hawk extracts 6 categories of memories automatically |
| 2 | Auto-Recall | Hook agent:bootstrap → hawk injects relevant memories before the first response |
| 3 | Hybrid Retrieval | BM25 + vector search + RRF fusion; no API key required for the baseline |
| 4 | Zero-Config Fallback | Works out of the box in BM25-only mode, no API keys needed |
| 5 | 5 Embedding Providers | Ollama (local) / sentence-transformers (CPU) / Jina AI (free API) / OpenAI / Minimax |
| 6 | Graceful Degradation | Automatically falls back when API keys are unavailable |
| 7 | Context-Aware Injection | BM25 rank score used directly when no embedder is available |
| 8 | Seed Memory | Pre-populated with generic AI agent team concepts; customize after install |
| 9 | Sub-100ms Recall | LanceDB ANN index for instant retrieval |
| 10 | Cross-Platform Install | One command, works on Ubuntu/Debian/Fedora/Arch/Alpine/openSUSE |
| 11 | SimHash Auto-Dedup | 64-bit fingerprint dedup prevents duplicate memories from being stored |
| 12 | MMR Diverse Recall | Maximal Marginal Relevance: relevant AND diverse, reduces context size |
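Feature 12's MMR step trades relevance against redundancy. A minimal sketch over precomputed similarity scores (the λ weight and toy matrices are assumptions, not context-hawk's actual parameters):

```python
def mmr(query_scores, pairwise_sim, k=3, lam=0.5):
    """Maximal Marginal Relevance: greedily pick items that score
    high against the query but low against already-picked items."""
    candidates = list(range(len(query_scores)))
    selected = []
    while candidates and len(selected) < k:
        def gain(i):
            redundancy = max((pairwise_sim[i][j] for j in selected), default=0.0)
            return lam * query_scores[i] - (1 - lam) * redundancy
        best = max(candidates, key=gain)
        selected.append(best)
        candidates.remove(best)
    return selected
```

With two near-identical top hits, plain top-k would inject both; MMR penalizes the clone and picks a diverse third memory instead.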

πŸ—οΈ Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚                     OpenClaw Gateway                             β”‚
β”œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€
β”‚                   β”‚                                               β”‚
β”‚  agent:bootstrap β”‚  message:sent                               β”‚
β”‚         ↓         β”‚         ↓                                   β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”΄β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                                β”‚
β”‚  β”‚       πŸ¦… hawk-recall       β”‚  ← Injects relevant memories  β”‚
β”‚  β”‚    (before first response)  β”‚     into agent context       β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                                β”‚
β”‚                   ↓                                               β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                β”‚
β”‚  β”‚              LanceDB                         β”‚                β”‚
β”‚  β”‚   Vector search + BM25 + RRF fusion          β”‚                β”‚
β”‚  β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                β”‚
β”‚                   ↓                                               β”‚
β”‚         β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”                                β”‚
β”‚         β”‚  context-hawk (Python) β”‚  ← Extraction / scoring     β”‚
β”‚         β”‚  MemoryManager + Extractor β”‚   / decay               β”‚
β”‚         β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜                                β”‚
β”‚                                                               β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

🚀 One-Command Install

Choose the method that works best for you:

Option A: ClawHub (Recommended)

# Most convenient: one command
clawhub install hawk-bridge
# or via OpenClaw
openclaw skills install hawk-bridge

✅ Auto-updates, easy to manage, no manual setup

Option B: Install Script

# Downloads and runs the install script automatically
bash <(curl -fsSL https://github.com/relunctance/hawk-bridge/master/install.sh)

✅ Works on all Linux distros, fully automatic

Option C: Manual Install

git clone https://github.com/relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge
npm install && npm run build
# Then register the plugin:
openclaw plugins install /tmp/hawk-bridge

✅ Full control, for advanced users

Option D: OpenClaw UI

  1. Open OpenClaw dashboard → Skills → Browse
  2. Search for "hawk-bridge"
  3. Click Install

✅ No command line needed


That's it. The installer handles:

| Step | What it does |
|---|---|
| 1 | Detects and installs Node.js, Python3, git, curl |
| 2 | Installs npm dependencies (lancedb, openai) |
| 3 | Installs Python packages (lancedb, rank-bm25, sentence-transformers) |
| 4 | Clones the context-hawk workspace into ~/.openclaw/workspace/context-hawk |
| 5 | Creates the ~/.openclaw/hawk symlink |
| 6 | Installs Ollama (if not present) |
| 7 | Pulls the nomic-embed-text embedding model |
| 8 | Builds TypeScript hooks and seeds initial memories |

Supported distros: Ubuntu · Debian · Fedora · CentOS · Arch · Alpine · openSUSE

🔧 Manual Install (per Distro)

If you prefer to install manually instead of using the one-command script:

Ubuntu / Debian

# 1. System deps
sudo apt-get update && sudo apt-get install -y nodejs npm python3 python3-pip git curl

# 2. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge

# 3. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers --break-system-packages

# 4. Ollama (optional)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull nomic-embed-text

# 5. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk

# 6. npm + build
npm install && npm run build

# 7. Seed memory
node dist/seed.js

# 8. Activate
openclaw plugins install /tmp/hawk-bridge

Fedora / RHEL / CentOS / Rocky / AlmaLinux

# 1. System deps
sudo dnf install -y nodejs npm python3 python3-pip git curl

# 2. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge

# 3. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers --break-system-packages

# 4. Ollama (optional)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull nomic-embed-text

# 5. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk

# 6. npm + build
npm install && npm run build

# 7. Seed memory
node dist/seed.js

# 8. Activate
openclaw plugins install /tmp/hawk-bridge

Arch / Manjaro / EndeavourOS

# 1. System deps
sudo pacman -Sy --noconfirm nodejs npm python python-pip git curl

# 2. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge

# 3. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers --break-system-packages

# 4. Ollama (optional)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull nomic-embed-text

# 5. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk

# 6. npm + build
npm install && npm run build

# 7. Seed memory
node dist/seed.js

# 8. Activate
openclaw plugins install /tmp/hawk-bridge

Alpine

# 1. System deps
apk add --no-cache nodejs npm python3 py3-pip git curl

# 2. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge

# 3. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers --break-system-packages

# 4. Ollama (optional)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull nomic-embed-text

# 5. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk

# 6. npm + build
npm install && npm run build

# 7. Seed memory
node dist/seed.js

# 8. Activate
openclaw plugins install /tmp/hawk-bridge

openSUSE / SUSE Linux Enterprise

# 1. System deps
sudo zypper install -y nodejs npm python3 python3-pip git curl

# 2. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge

# 3. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers --break-system-packages

# 4. Ollama (optional)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull nomic-embed-text

# 5. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk

# 6. npm + build
npm install && npm run build

# 7. Seed memory
node dist/seed.js

# 8. Activate
openclaw plugins install /tmp/hawk-bridge


macOS

# 1. Install Homebrew (if not present)
/bin/bash -c "$(curl -fsSL https://github.com/Homebrew/install/HEAD/install.sh)"

# 2. System deps
brew install node python git curl

# 3. Clone repo
git clone git@github.com:relunctance/hawk-bridge.git /tmp/hawk-bridge
cd /tmp/hawk-bridge

# 4. Python deps
pip3 install lancedb openai tiktoken rank-bm25 sentence-transformers

# 5. Ollama (optional)
brew install ollama
ollama pull nomic-embed-text

# 6. context-hawk
git clone git@github.com:relunctance/context-hawk.git ~/.openclaw/workspace/context-hawk
ln -sf ~/.openclaw/workspace/context-hawk/hawk ~/.openclaw/hawk

# 7. npm + build
npm install && npm run build

# 8. Seed memory
node dist/seed.js

# 9. Activate
openclaw plugins install /tmp/hawk-bridge

Note: pip install --break-system-packages is required on Linux distros whose system Python is externally managed (PEP 668). The Ollama install script auto-detects macOS and uses Homebrew when available.


🔧 Configuration

After install, choose your embedding mode, all via environment variables:

# ① Default: BM25-only fallback (no embedding; keyword search only)
# No environment variables needed — works out of the box

# ② sentence-transformers (local CPU, no API key needed)
export USE_LOCAL_EMBEDDING=1

# ③ Ollama local GPU (recommended for quality; free, no API key)
export OLLAMA_BASE_URL=http://localhost:11434

# ④ Jina AI free tier (requires a free API key from jina.ai)
export JINA_API_KEY=your_free_key
# ⚠️ Proxy required in China: set an HTTP/SOCKS proxy
export HTTPS_PROXY=http://YOUR_PROXY_HOST:PORT

🔑 Get Your Free Jina API Key (Recommended)

Jina AI offers a generous free tier, no credit card required:

  1. Register at https://jina.ai/ (GitHub login supported)
  2. Get a key: go to https://jina.ai/settings/ → API Keys → Create API Key
  3. Copy the key: it starts with jina_
  4. Configure it as shown below

⚠️ Important: Jina AI requires a proxy in China (api.jina.ai is blocked). Set HTTPS_PROXY to your proxy URL (e.g. http://192.168.1.109:10808).

~/.hawk/config.json (Recommended for Jina)

For best results with Jina, create ~/.hawk/config.json:

{
  "openai_api_key": "jina_YOUR_KEY_HERE",
  "embedding_model": "jina-embeddings-v3",
  "embedding_dimensions": 1024,
  "base_url": "https://api.jina.ai/v1",
  "proxy": "http://YOUR_PROXY_HOST:PORT"
}
| Field | Description |
|---|---|
| openai_api_key | Your Jina API key (starts with jina_) |
| embedding_model | Model name: jina-embeddings-v3 (recommended) |
| embedding_dimensions | Vector size: 1024 for jina-embeddings-v3 |
| base_url | Must be https://api.jina.ai/v1 |
| proxy | HTTP proxy URL (required in China) |
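A sketch of how such a file can be read with env-var fallback. This mirrors the field table above but is illustrative, not the real config reader; the `path` parameter is added here only for testability:

```python
import json
import os
from pathlib import Path

def load_hawk_config(path=None):
    """Load ~/.hawk/config.json if it exists, then let environment
    variables fill in anything the file does not set."""
    cfg_path = Path(path) if path else Path.home() / ".hawk" / "config.json"
    cfg = json.loads(cfg_path.read_text()) if cfg_path.exists() else {}
    cfg.setdefault("openai_api_key", os.environ.get("JINA_API_KEY", ""))
    cfg.setdefault("base_url", "https://api.jina.ai/v1")
    return cfg
```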

openclaw.json

{
  "plugins": {
    "load": {
      "paths": ["/tmp/hawk-bridge"]
    },
    "allow": ["hawk-bridge"]
  }
}

No API keys in openclaw.json; credentials stay in environment variables or ~/.hawk/config.json.


📊 Retrieval Modes

| Mode | Provider | API Key | Quality | Speed |
|---|---|---|---|---|
| BM25-only | Built-in | ❌ | ⭐⭐ | ⚡⚡⚡ |
| sentence-transformers | Local CPU | ❌ | ⭐⭐⭐ | ⚡⚡ |
| Ollama | Local GPU | ❌ | ⭐⭐⭐⭐ | ⚡⚡⚡⚡ |
| Jina AI | Cloud | ✅ free | ⭐⭐⭐⭐ | ⚡⚡⚡⚡ |
| Minimax | Cloud | ✅ | ⭐⭐⭐⭐⭐ | ⚡⚡⚡⚡⚡ |

Default: BM25-only; works immediately with zero configuration.
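The hybrid modes merge the BM25 and vector rankings with Reciprocal Rank Fusion, which needs only ranks, never raw scores. A standard sketch (k = 60 is the customary RRF constant, not necessarily hawk-bridge's value):

```python
def rrf_fuse(rankings, k=60):
    """Reciprocal Rank Fusion: each ranked list contributes
    1/(k + rank) per document, so documents that rank well in
    several lists float to the top of the fused order."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)
```

A memory ranked 2nd by BM25 and 1st by vector search beats one ranked 1st and 3rd.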


🔄 Degradation Logic

Has OLLAMA_BASE_URL?       → full hybrid: Ollama vector + BM25 + RRF
Has USE_LOCAL_EMBEDDING=1? → sentence-transformers + BM25 + RRF
Has JINA_API_KEY?          → Jina embeddings + BM25 + RRF
Has MINIMAX_API_KEY?       → Minimax embeddings + BM25 + RRF
Nothing configured?        → BM25-only (pure keyword, no API calls)

No API key = no crash; graceful degradation.
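The chain above reads as a first-match-wins check over environment variables. A sketch of the selection order (the env var names come from this README; the function name is hypothetical):

```python
import os

def pick_embedder():
    """First-match-wins provider selection; falls through to
    BM25-only when nothing is configured."""
    if os.environ.get("OLLAMA_BASE_URL"):
        return "ollama"
    if os.environ.get("USE_LOCAL_EMBEDDING") == "1":
        return "sentence-transformers"
    if os.environ.get("JINA_API_KEY"):
        return "jina"
    if os.environ.get("MINIMAX_API_KEY"):
        return "minimax"
    return "bm25-only"
```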


🌱 Seed Memory

On first install, 11 foundational memories are seeded automatically:

  • Team structure (main/wukong/bajie/bailong/tseng roles)
  • Collaboration norms (GitHub inbox → done workflow)
  • Project context (hawk-bridge, qujingskills, gql-openclaw)
  • Communication preferences
  • Operating principles

These ensure hawk-recall has something to inject from day one.


πŸ“ File Structure

hawk-bridge/
β”œβ”€β”€ README.md
β”œβ”€β”€ LICENSE
β”œβ”€β”€ install.sh                   # One-command installer (curl | bash)
β”œβ”€β”€ package.json
β”œβ”€β”€ openclaw.plugin.json         # Plugin manifest + configSchema
β”œβ”€β”€ src/
β”‚   β”œβ”€β”€ index.ts               # Plugin entry point
β”‚   β”œβ”€β”€ config.ts              # OpenClaw config reader + env detection
β”‚   β”œβ”€β”€ lancedb.ts             # LanceDB wrapper
β”‚   β”œβ”€β”€ embeddings.ts           # 5 embedding providers
β”‚   β”œβ”€β”€ retriever.ts            # Hybrid search (BM25 + vector + RRF)
β”‚   β”œβ”€β”€ seed.ts                # Seed memory initializer
β”‚   └── hooks/
β”‚       β”œβ”€β”€ hawk-recall/       # agent:bootstrap hook
β”‚       β”‚   β”œβ”€β”€ handler.ts
β”‚       β”‚   └── HOOK.md
β”‚       └── hawk-capture/      # message:sent hook
β”‚           β”œβ”€β”€ handler.ts
β”‚           └── HOOK.md
└── python/                    # context-hawk (installed by install.sh)

🔌 Tech Specs

| Spec | Value |
|---|---|
| Runtime | Node.js 18+ (ESM), Python 3.12+ |
| Vector DB | LanceDB (local, serverless) |
| Retrieval | BM25 + ANN vector search + RRF fusion |
| Embedding | Ollama / sentence-transformers / Jina AI / OpenAI / Minimax |
| Hook Events | agent:bootstrap (recall), message:sent (capture) |
| Dependencies | Zero hard dependencies; all optional with auto-fallback |
| Persistence | Local filesystem, no external DB required |
| License | MIT |

🤝 Relationship with context-hawk

| | hawk-bridge | context-hawk |
|---|---|---|
| Role | OpenClaw hook bridge | Python memory library |
| What it does | Triggers hooks, manages lifecycle | Memory extraction, scoring, decay |
| Interface | TypeScript hooks → LanceDB | Python MemoryManager, VectorRetriever |
| Installs | npm packages, system deps | Cloned into ~/.openclaw/workspace/ |

They work together: hawk-bridge decides when to act, context-hawk handles how.

