feat: Add multi-provider LLM support via LiteLLM integration

Replace the OpenAI-only implementation with LiteLLM to support 100+ LLM providers, including Anthropic Claude, Google Gemini, Azure OpenAI, AWS Bedrock, Groq, and local Ollama models.

Changes:
- Add litellm>=1.0.0 dependency
- Refactor ChatGPT_API functions to use litellm.completion()
- Enhance count_tokens() for multi-provider token counting
- Update config.yaml with provider-specific model examples
- Update README.md with multi-provider setup instructions

Backward compatible: existing OPENAI_API_KEY and CHATGPT_API_KEY still work.
Default model remains gpt-4o-2024-11-20.
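The commit view does not show the refactored code itself. A minimal sketch of what routing requests through LiteLLM can look like, assuming LiteLLM's convention of provider-prefixed model strings (`anthropic/…`, `gemini/…`, `ollama/…`; unprefixed names are treated as OpenAI models). `resolve_model()` and `chat()` are illustrative helpers, not names from this commit:

```python
# Sketch only: LiteLLM infers the provider from the model string, so the
# main change is passing a provider-prefixed name to litellm.completion().
# resolve_model() is a hypothetical helper for this example.

def resolve_model(model: str) -> str:
    """Map a model name to something LiteLLM can route.

    Unprefixed names are treated as OpenAI models; other providers use
    prefixes such as "anthropic/", "gemini/", "groq/", or "ollama/".
    """
    known_prefixes = ("anthropic/", "gemini/", "azure/", "bedrock/", "groq/", "ollama/")
    if model.startswith(known_prefixes) or model.startswith("gpt-"):
        return model  # already routable as-is
    # Fall back to the commit's stated default model for unknown input.
    return "gpt-4o-2024-11-20"


def chat(model: str, prompt: str) -> str:
    """Single-turn completion via LiteLLM (needs the provider's API key set)."""
    import litellm  # deferred import so resolve_model() works without litellm installed

    response = litellm.completion(
        model=resolve_model(model),
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```

With keys configured, the same `chat()` call then works across providers, e.g. `chat("anthropic/claude-3-5-sonnet-20240620", "Hello")` or `chat("ollama/llama3", "Hello")`.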
README.md (+46 −3)
@@ -147,14 +147,49 @@ You can follow these steps to generate a PageIndex tree from a PDF document.
 pip3 install --upgrade -r requirements.txt
 ```
 
-### 2. Set your OpenAI API key
+### 2. Set your API key
 
-Create a `.env` file in the root directory and add your API key:
+PageIndex now supports multiple LLM providers via [LiteLLM](https://docs.litellm.ai/). Create a `.env` file in the root directory and add your API key:
 
+**OpenAI (default):**
 ```bash
+OPENAI_API_KEY=your_openai_key_here
+# or
 CHATGPT_API_KEY=your_openai_key_here
 ```
 
+**Anthropic Claude:**
+```bash
+ANTHROPIC_API_KEY=your_anthropic_key_here
+```
+
+**Google Gemini:**
+```bash
+GEMINI_API_KEY=your_google_key_here
+```
+
+**Azure OpenAI:**
+```bash
+AZURE_API_KEY=your_azure_key_here
+AZURE_API_BASE=your_azure_endpoint
+AZURE_API_VERSION=2024-02-01
+```
+
+**AWS Bedrock:**
+```bash
+AWS_ACCESS_KEY_ID=your_access_key
+AWS_SECRET_ACCESS_KEY=your_secret_key
+AWS_REGION_NAME=us-east-1
+```
+
+**Groq:**
+```bash
+GROQ_API_KEY=your_groq_key_here
+```
+
+**Ollama (local):**
+No API key needed. Just ensure Ollama is running locally.
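The environment variables above determine which providers are usable. As a rough illustration of how a default model could follow whichever key is configured — the actual selection in this commit lives in config.yaml, and both the helper name `default_model()` and the specific model IDs below are assumptions for the example:

```python
import os

# Illustrative sketch: pick a LiteLLM model string for the first provider
# whose API key is set. Model IDs are examples, not values from the commit.
PROVIDER_DEFAULTS = [
    ("OPENAI_API_KEY", "gpt-4o-2024-11-20"),
    ("CHATGPT_API_KEY", "gpt-4o-2024-11-20"),
    ("ANTHROPIC_API_KEY", "anthropic/claude-3-5-sonnet-20240620"),
    ("GEMINI_API_KEY", "gemini/gemini-1.5-pro"),
    ("AZURE_API_KEY", "azure/your-deployment-name"),
    ("AWS_ACCESS_KEY_ID", "bedrock/anthropic.claude-3-sonnet-20240229-v1:0"),
    ("GROQ_API_KEY", "groq/llama-3.1-8b-instant"),
]


def default_model(env=None) -> str:
    """Return a model string for the first configured provider.

    OpenAI keys are checked first, matching the commit's stated default.
    """
    env = os.environ if env is None else env
    for key, model in PROVIDER_DEFAULTS:
        if env.get(key):
            return model
    # Ollama needs no key at all, so it serves as the local fallback.
    return "ollama/llama3"
```

Passing a dict instead of reading `os.environ` directly keeps the selection logic easy to test without touching the real environment.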