## Bug Description
Session title generation has been silently failing since v1.3.3. All new sessions keep their default "New session - <timestamp>" title instead of receiving an LLM-generated one.

The root cause: when the user's selected model has a variant that includes an `effort` parameter (e.g. `anthropic/claude-opus-4-6` with variant `max`), that parameter leaks into the `LLM.stream` call for the `title` agent. The title agent resolves to a small model (`claude-haiku-4-5-20251001`) which does not support the `effort` parameter, causing a 400 error from the Anthropic API. The error is swallowed by `Effect.ignore` on the fork, so the failure is completely silent.
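The silent-failure pattern can be illustrated with a minimal sketch in plain TypeScript (this is an analogy, not the actual Effect API or opencode code; all names here are hypothetical):

```typescript
// Simplified analogy of the bug's failure path: title generation is
// forked fire-and-forget and any rejection is discarded, mirroring
// what Effect.ignore does to the forked fiber in opencode.
let sessionTitle = "New session - <timestamp>";

async function generateTitle(): Promise<void> {
  // Stands in for the 400 the Anthropic API returns when the small
  // model receives the unsupported effort parameter.
  throw new Error("This model does not support the effort parameter.");
}

// Fire-and-forget fork that swallows errors: nothing is logged and
// the caller never learns the title was not generated.
function forkIgnored(task: () => Promise<void>): Promise<void> {
  return task().catch(() => {});
}

export async function demo(): Promise<string> {
  await forkIgnored(generateTitle);
  return sessionTitle; // still the default: the failure was invisible
}
```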
## Reproduction

- Set your model to `anthropic/claude-opus-4-6` with variant `max` (or any variant that maps to `output_config.effort`)
- Start a new session and send a message
- Observe the session title remains "New session - <timestamp>"
## Evidence from logs

```
INFO  service=llm providerID=anthropic modelID=claude-haiku-4-5-20251001
      sessionID=ses_xxx small=true agent=title mode=primary stream
ERROR service=llm
  requestBodyValues: {
    "model": "claude-haiku-4-5-20251001",
    "output_config": { "effort": "high" },   ← should not be here
    ...
  }
  responseBody: {
    "type": "error",
    "error": {
      "type": "invalid_request_error",
      "message": "This model does not support the effort parameter."
    }
  }
ERROR service=session.prompt
  error=No output generated. Check the stream for errors.
  failed to generate title
```
## Impact data (from local DB)

| Version | Sessions with title | Sessions without title | Success rate |
|---------|--------------------:|-----------------------:|-------------:|
| 1.3.9   | 0                   | 4                      | 0%           |
| 1.3.5   | 0                   | 3                      | 0%           |
| 1.3.3   | 0                   | 6                      | 0%           |
| 1.3.2   | 29                  | 2                      | 93.5%        |
| 1.3.0   | 40                  | 3                      | 93.0%        |
| 1.2.27  | 42                  | 2                      | 95.5%        |
Title generation drops to 0% success starting from v1.3.3 for users with an effort-bearing variant.
## Root cause

In `packages/opencode/src/session/prompt.ts`, the `ensureTitle` function passes `user: firstInfo` to `LLM.stream`:

```ts
const result = await LLM.stream({
  agent: ag,
  user: firstInfo, // ← carries variant from user's main model
  // ...
  model: mdl, // ← this is the small model (haiku)
})
```
The user message's `variant` (e.g. `max` → `effort: high`) is applied to the API call even though the resolved model (`claude-haiku-4-5`) doesn't support it.
## Suggested fix

Strip or ignore the user variant when calling the title agent's small model. For example:

- Pass `user: { ...firstInfo, variant: undefined }` in the title generation call
- Have `LLM.stream` skip unsupported parameters when `small: true`
- Catch the specific error and retry without the variant
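A minimal sketch of the first option, with a hypothetical `MessageInfo` shape standing in for opencode's real message type:

```typescript
// Hypothetical shape: only the field relevant to the bug is modeled.
interface MessageInfo {
  role: string;
  variant?: string; // e.g. "max", which maps to output_config.effort
}

// Drop the user's variant before the message reaches the title agent's
// small model, so main-model parameters cannot leak into the request.
export function stripVariant(info: MessageInfo): MessageInfo {
  return { ...info, variant: undefined };
}
```

At the call site in `ensureTitle`, this would amount to passing `user: stripVariant(firstInfo)` instead of `user: firstInfo`.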
## Workaround

Override the title agent's model in config to one that isn't affected:
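For example, in the opencode config file (the exact keys below are assumed from opencode's per-agent config convention; verify against your version's docs). `anthropic/claude-opus-4-6` is used here only because the report confirms it supports `effort`:

```json
{
  "agent": {
    "title": {
      "model": "anthropic/claude-opus-4-6"
    }
  }
}
```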
## Environment

- opencode: v1.3.3 → v1.3.9 (all affected)
- Provider: anthropic (confirmed); likely affects any provider where a variant maps to unsupported model params
- OS: macOS (arm64)