Commit 85025ad

[FEATURE]: Add support for embedding models in OpenRouter (#1359)
* Add support for embedding models in OpenRouter
* Update doc for OpenRouter embedding
* Run cargo fmt
1 parent: 98e9e6e

File tree

2 files changed: +28 −7 lines

docs/docs/ai/llm.mdx

Lines changed: 23 additions & 2 deletions
````diff
@@ -26,7 +26,7 @@ We support the following types of LLM APIs:
 | [Anthropic](#anthropic) | `LlmApiType.ANTHROPIC` |||
 | [Voyage](#voyage) | `LlmApiType.VOYAGE` |||
 | [LiteLLM](#litellm) | `LlmApiType.LITE_LLM` |||
-| [OpenRouter](#openrouter) | `LlmApiType.OPEN_ROUTER` || |
+| [OpenRouter](#openrouter) | `LlmApiType.OPEN_ROUTER` || |
 | [vLLM](#vllm) | `LlmApiType.VLLM` |||
 | [Bedrock](#bedrock) | `LlmApiType.BEDROCK` |||
@@ -400,7 +400,7 @@ You can find the full list of models supported by LiteLLM [here](https://docs.li
 To use the OpenRouter API, you need to set the environment variable `OPENROUTER_API_KEY`.
 You can generate the API key from [here](https://openrouter.ai/settings/keys).
 
-A spec for OpenRouter looks like this:
+A text generation spec for OpenRouter looks like this:
 
 <Tabs>
 <TabItem value="python" label="Python" default>
@@ -415,6 +415,27 @@ cocoindex.LlmSpec(
 </TabItem>
 </Tabs>
 
+OpenRouter also supports some text embedding models. Note that for OpenRouter embedding
+models, you need to explicitly provide the `output_dimension` parameter in the spec.
+Here's how you can define the spec to use an OpenRouter embedding model:
+
+<Tabs>
+<TabItem value="python" label="Python" default>
+
+```python
+cocoindex.functions.EmbedText(
+    api_type=cocoindex.LlmApiType.OPEN_ROUTER,
+    model="openai/text-embedding-3-small",
+    # Task type for embedding model
+    task_type="SEMANTICS_SIMILARITY",
+    # Required: the number of output dimensions for the embedding model
+    output_dimension=1536,
+)
+```
+
+</TabItem>
+</Tabs>
+
 You can find the full list of models supported by OpenRouter [here](https://openrouter.ai/models).
 
 ### vLLM
````
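The `output_dimension` requirement described in the doc change above can be illustrated with a small, self-contained sketch. `EmbedTextSpec` here is a hypothetical stand-in, not the real `cocoindex.functions.EmbedText`; it only models the stated rule that an OpenRouter embedding spec must carry an explicit output dimension.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical stand-in for cocoindex.functions.EmbedText, modeling the rule
# from the doc change: OpenRouter embedding specs must state output_dimension.
@dataclass
class EmbedTextSpec:
    api_type: str
    model: str
    task_type: Optional[str] = None
    output_dimension: Optional[int] = None

    def validate(self) -> None:
        # Per the doc, the dimension must be provided explicitly for
        # OpenRouter embedding models.
        if self.api_type == "OPEN_ROUTER" and self.output_dimension is None:
            raise ValueError(
                "output_dimension is required for OpenRouter embedding models"
            )

spec = EmbedTextSpec(
    api_type="OPEN_ROUTER",
    model="openai/text-embedding-3-small",
    task_type="SEMANTICS_SIMILARITY",
    output_dimension=1536,
)
spec.validate()  # no error: the dimension is provided
```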

rust/cocoindex/src/llm/mod.rs

Lines changed: 5 additions & 5 deletions
````diff
@@ -166,6 +166,10 @@ pub async fn new_llm_embedding_client(
         LlmApiType::Ollama => {
             Box::new(ollama::Client::new(address).await?) as Box<dyn LlmEmbeddingClient>
         }
+        LlmApiType::OpenRouter => {
+            Box::new(openrouter::Client::new_openrouter(address, api_key).await?)
+                as Box<dyn LlmEmbeddingClient>
+        }
         LlmApiType::Gemini => {
             Box::new(gemini::AiStudioClient::new(address, api_key)?) as Box<dyn LlmEmbeddingClient>
         }
@@ -178,11 +182,7 @@ pub async fn new_llm_embedding_client(
             Box::new(gemini::VertexAiClient::new(address, api_key, api_config).await?)
                 as Box<dyn LlmEmbeddingClient>
         }
-        LlmApiType::OpenRouter
-        | LlmApiType::LiteLlm
-        | LlmApiType::Vllm
-        | LlmApiType::Anthropic
-        | LlmApiType::Bedrock => {
+        LlmApiType::LiteLlm | LlmApiType::Vllm | LlmApiType::Anthropic | LlmApiType::Bedrock => {
             api_bail!("Embedding is not supported for API type {:?}", api_type)
         }
     };
````
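The shape of the Rust change can be mirrored in a hedged Python sketch (stand-in names and string "clients", not the real implementation): `OpenRouter` moves out of the catch-all unsupported arm and gains its own constructor arm, while the remaining types still produce an error.

```python
# Hypothetical Python sketch mirroring the match in new_llm_embedding_client:
# supported API types construct a client, unsupported ones raise instead.
class EmbeddingNotSupported(Exception):
    pass

def new_llm_embedding_client(api_type: str) -> str:
    # Stand-in "clients" are plain strings; the real Rust code builds e.g.
    # openrouter::Client::new_openrouter(address, api_key).
    constructors = {
        "Ollama": lambda: "ollama-client",
        "OpenRouter": lambda: "openrouter-client",  # newly supported in this commit
        "Gemini": lambda: "gemini-client",
    }
    if api_type in ("LiteLlm", "Vllm", "Anthropic", "Bedrock"):
        raise EmbeddingNotSupported(
            f"Embedding is not supported for API type {api_type!r}"
        )
    return constructors[api_type]()
```

The benefit of the explicit per-type arm (rather than leaving OpenRouter in the fallthrough) is that unsupported types fail with a clear error instead of silently sharing a code path.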
