Description
GPT 5.2 Codex via GitHub Copilot compacts the session at ~100k tokens even though the model's context window is 272k.
Plugins
None
OpenCode version
latest
Steps to reproduce
- Use gpt-5.2-codex via GitHub Copilot
- The context indicator in the top right shows a 100k maximum when the model's actual context window is 272k
- Reach ~90k+ tokens in a session and compaction is triggered (a possible config workaround is sketched below)
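A possible workaround until this is fixed, as a minimal sketch assuming OpenCode honors per-model limit overrides in opencode.json; the github-copilot provider key, the limit fields, and the 128000 output value are assumptions, not confirmed against the current schema:

```jsonc
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "github-copilot": {
      "models": {
        "gpt-5.2-codex": {
          // Assumed override: report the real 272k context window so
          // compaction triggers near the actual limit, not at 100k.
          "limit": {
            "context": 272000,
            "output": 128000
          }
        }
      }
    }
  }
}
```

The comments are illustrative only; strip them if your config parser requires strict JSON.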
Screenshot and/or share link
Operating System
Windows 10 latest
Terminal
Windows Terminal