fix: add missing context overflow pattern for OpenAI-compatible providers #27628
realcarsonterry wants to merge 1 commit into
Conversation
Fixes anomalyco#27519

Some OpenAI-compatible providers return the error message "tokens in request more than max tokens allowed" when the context window is exceeded. This pattern was not recognized by the overflow detection logic, causing the system to retry indefinitely instead of triggering context compaction. This adds the missing pattern to `OVERFLOW_PATTERNS` to ensure proper handling of this error across all OpenAI-compatible providers.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
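A minimal sketch of the detection logic described above, assuming `OVERFLOW_PATTERNS` is an array of case-insensitive regexes matched against provider error messages; the helper name and the pre-existing patterns shown here are hypothetical, and the real implementation lives in `packages/opencode/src/provider/error.ts`:

```typescript
// Hypothetical reconstruction of the overflow-pattern check.
const OVERFLOW_PATTERNS: RegExp[] = [
  /context length exceeded/i,                        // hypothetical existing pattern
  /maximum context length/i,                         // hypothetical existing pattern
  /tokens in request more than max tokens allowed/i, // pattern added by this PR
];

// Returns true when a provider error message indicates a context overflow,
// signalling that the caller should compact context instead of retrying.
function isContextOverflow(message: string): boolean {
  return OVERFLOW_PATTERNS.some((pattern) => pattern.test(message));
}

// The error message that previously caused infinite retries now matches:
console.log(isContextOverflow("tokens in request more than max tokens allowed")); // true
```

With the new pattern in place, a matching error routes into the compaction path rather than the retry loop.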
Hey! Your PR title doesn't match our naming convention. Please update it to start with one of the accepted prefixes. See CONTRIBUTING.md for details.
The following comment was made by an LLM; it may be inaccurate:

Related PRs Found

I found two related PRs that address similar context overflow detection improvements. They relate to the same system for detecting and handling context overflow errors across different providers, and may provide context on how similar patterns have been added previously or could complement your changes.
Thanks for updating your PR! It now meets our contributing guidelines. 👍
Issue for this PR
Closes #27519
Type of change
What does this PR do?
This PR adds a missing overflow pattern to prevent infinite retry loops when OpenAI-compatible providers return context overflow errors.
Some OpenAI-compatible providers return "tokens in request more than max tokens allowed" when the context window is exceeded. This pattern wasn't recognized in the `OVERFLOW_PATTERNS` array, causing the system to retry indefinitely instead of triggering context compaction.

The fix adds the pattern `/tokens in request more than max tokens allowed/i` to the `OVERFLOW_PATTERNS` array in `packages/opencode/src/provider/error.ts`.

How did you verify your code works?
Screenshots / recordings
N/A - This is a backend error handling fix
Checklist