Description
TanStack AI version
0.8.1
Framework/Library version
TanStack Start v1.154.7, Node.js
Describe the bug and the steps to reproduce it
The `@tanstack/ai-ollama` adapter's `mapCommonOptionsToOllama()` method does not forward the `systemPrompts` field to the Ollama API. The core `TextEngine.streamModelResponse()` correctly passes `systemPrompts` in the options object to `adapter.chatStream()`, but the Ollama adapter only maps `messages`, `model`, `options`, and `tools`; `options.systemPrompts` is silently discarded.
This means calling `chat()` with `systemPrompts` has no effect when using the Ollama adapter:

```ts
import { chat } from '@tanstack/ai';
import { createOllamaChat } from '@tanstack/ai-ollama';

const stream = chat({
  adapter: createOllamaChat('llama3', 'http://localhost:11434'),
  messages: [{ role: 'user', content: 'Who are you?' }],
  systemPrompts: ['You are a helpful pirate assistant. Always respond in pirate speak.'],
});
// Model responds normally; the system prompt is never delivered to Ollama.
```

**Expected:** The model responds in pirate speak, because the system prompt was forwarded.

**Actual:** The model responds normally with no awareness of the system prompt.
Root cause
In `packages/typescript/ai-ollama/src/adapters/text.ts`, the `mapCommonOptionsToOllama()` method constructs the `ChatRequest` without including `systemPrompts`. Other adapters handle this correctly; for example, the Anthropic adapter forwards system prompts via `system: options.systemPrompts?.join('\n')`.
The Ollama `ChatRequest` type supports a `system` string field, so the fix is a one-line addition:

```ts
...(options.systemPrompts?.length
  ? { system: options.systemPrompts.join('\n') }
  : {}),
```

I will be submitting a PR with this fix.
Workaround
Prepend the system prompt as a `{ role: 'system' }` message in the `messages` array instead of using `systemPrompts`. This works because Ollama's API treats system-role messages as system instructions, but it bypasses the TanStack AI `systemPrompts` contract.
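The workaround can be sketched as a small helper that prepends the prompts before calling `chat()`. The helper name and `Message` shape here are mine for illustration, not part of the library:

```ts
interface Message {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Workaround sketch: prepend systemPrompts as a single system-role message,
// so Ollama still receives the instructions even though the adapter drops
// options.systemPrompts. Returns messages unchanged when no prompts are given.
function withSystemMessage(
  messages: Message[],
  systemPrompts?: string[],
): Message[] {
  if (!systemPrompts?.length) return messages;
  return [
    { role: 'system', content: systemPrompts.join('\n') },
    ...messages,
  ];
}
```

You would then pass `withSystemMessage(messages, prompts)` as the `messages` option and leave `systemPrompts` unset.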
Your Minimal, Reproducible Example - (Sandbox Highly Recommended)
The bug is visible by reading `mapCommonOptionsToOllama()` in `packages/typescript/ai-ollama/src/adapters/text.ts`: it never accesses `options.systemPrompts`. No external reproduction needed.
Screenshots or Videos (Optional)
No response
Do you intend to try to help solve this bug with your own PR?
Yes, I am also opening a PR that solves the problem alongside this issue
Terms & Code of Conduct
- I agree to follow this project's Code of Conduct
- I understand that if my bug cannot be reliably reproduced in a debuggable environment, it will probably not be fixed and this issue may even be closed.