feat: add Astraflow provider support #4614
Conversation
Signed-off-by: ucloudnb666 <ucloudnb666@users.noreply.github.com>
PR Summary (Medium Risk): Registers both provider IDs in the provider registry and model/provider definitions (with reseller capabilities and model ID patterns), and documents the optional ASTRAFLOW_API_KEY and ASTRAFLOW_CN_API_KEY environment variables. Reviewed by Cursor Bugbot for commit a66fcc0.
Greptile Summary: This PR adds Astraflow (UCloud's OpenAI-compatible aggregation platform) as a new provider, exposing both a global and a China endpoint. The registration, type definitions, and environment variable documentation are all consistent with existing provider patterns.
Confidence Score: 3/5. The provider wiring is safe, but the core execution logic has multiple defects affecting the tool-use and streaming paths. Three defects in the implementation file: forced-tool iteration always re-forces the same first tool, follow-up API calls in forced-tool mode omit the tools array entirely, and streaming responses lack the isStreaming flag. All three affect live execution paths. apps/sim/providers/astraflow/index.ts should be reconciled against apps/sim/providers/xai/index.ts as the closest reference.
Sequence Diagram

```mermaid
sequenceDiagram
    participant C as Caller
    participant AP as AstraflowProvider
    participant OAI as OpenAI-compat API
    C->>AP: executeRequest(request)
    alt stream=true AND no tools
        AP->>OAI: chat.completions.create(stream:true)
        OAI-->>AP: AsyncIterable chunks
        AP-->>C: StreamingExecution (isStreaming missing)
    else non-streaming / tool loop
        AP->>OAI: chat.completions.create(initialPayload)
        OAI-->>AP: response
        loop tool iterations
            AP->>AP: execute tools in parallel
            AP->>OAI: chat.completions.create(nextPayload)
            OAI-->>AP: next response
            note over AP: usedForcedTools never updated
        end
        AP-->>C: ProviderResponse
    end
```
Reviews (1): Last reviewed commit: "feat: add Astraflow provider support"
```ts
const thisToolsTime = Date.now() - toolsStartTime
toolsTime += thisToolsTime

let nextPayload: any = { ...basePayload, messages: currentMessages }
if (
  typeof originalToolChoice === 'object' &&
  forcedTools.length > 0
) {
  const remaining = forcedTools.filter((t) => !usedForcedTools.includes(t))
  nextPayload.tool_choice =
    remaining.length > 0
      ? { type: 'function', function: { name: remaining[0] } }
      : 'auto'
```
usedForcedTools never updated — forced-tool cycling is broken
usedForcedTools is initialised as [] and never mutated inside the tool loop, unlike in the reference xAI provider where checkForForcedToolUsage updates it after every model response. Because the filter forcedTools.filter((t) => !usedForcedTools.includes(t)) always sees an empty exclusion set, remaining always equals the full forcedTools list, so the provider forces the same first tool on every single iteration instead of advancing through the sequence. Every iteration will re-force forcedTools[0], making the tool-usage-control feature entirely non-functional.
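The missing bookkeeping could look like the following sketch. The helper name is borrowed from the xAI reference the comment mentions; its exact signature in the codebase is an assumption:

```typescript
// Sketch of the missing bookkeeping: after each model response, record which
// forced tool was actually called, so the next iteration's filter advances
// past it instead of re-forcing forcedTools[0] forever.

interface ToolCall {
  function: { name: string }
}

function trackForcedToolUsage(
  toolCalls: ToolCall[],
  forcedTools: string[],
  usedForcedTools: string[]
): void {
  for (const call of toolCalls) {
    const name = call.function.name
    if (forcedTools.includes(name) && !usedForcedTools.includes(name)) {
      usedForcedTools.push(name)
    }
  }
}

// Example: two forced tools; the model calls the first one.
const forcedTools = ['search', 'summarize']
const usedForcedTools: string[] = []
trackForcedToolUsage([{ function: { name: 'search' } }], forcedTools, usedForcedTools)

// With the tracker in place, the provider's existing filter now advances:
const remaining = forcedTools.filter((t) => !usedForcedTools.includes(t))
// remaining === ['summarize'], so the next iteration forces the second tool
```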
```ts
let nextPayload: any = { ...basePayload, messages: currentMessages }
if (
  typeof originalToolChoice === 'object' &&
  forcedTools.length > 0
) {
  const remaining = forcedTools.filter((t) => !usedForcedTools.includes(t))
  nextPayload.tool_choice =
    remaining.length > 0
      ? { type: 'function', function: { name: remaining[0] } }
      : 'auto'
} else {
  nextPayload.tools = preparedTools?.tools
  nextPayload.tool_choice = 'auto'
}
```
nextPayload.tools missing in the forced-tool branch
When originalToolChoice is an object (forced tool mode), the payload is built from { ...basePayload, messages: currentMessages } and only tool_choice is appended — tools is never added. The OpenAI-compatible endpoint receives a tool_choice that references a function but no tools array to resolve it against, which will either cause an API error or silently ignore the constraint. The reference xAI provider always sets both tools and tool_choice together in this branch.
Suggested change:

```diff
 let nextPayload: any = { ...basePayload, messages: currentMessages }
 if (
   typeof originalToolChoice === 'object' &&
   forcedTools.length > 0
 ) {
   const remaining = forcedTools.filter((t) => !usedForcedTools.includes(t))
+  nextPayload.tools = preparedTools?.tools
   nextPayload.tool_choice =
     remaining.length > 0
       ? { type: 'function', function: { name: remaining[0] } }
       : 'auto'
 } else {
   nextPayload.tools = preparedTools?.tools
   nextPayload.tool_choice = 'auto'
 }
```
```ts
      logs: [],
      metadata: {
        startTime: providerStartTimeISO,
        endTime: new Date().toISOString(),
        duration: Date.now() - providerStartTime,
      },
    },
  } as StreamingExecution

  return streamingResult
```
isStreaming: true missing from streaming execution result
Every other provider (xAI, etc.) sets isStreaming: true inside the execution object returned as StreamingExecution. Callers that check this flag to distinguish a streaming result from a completed ProviderResponse will treat this as a non-streaming response and may attempt to read .content directly instead of consuming the stream.
Suggested change:

```diff
     logs: [],
     metadata: {
       startTime: providerStartTimeISO,
       endTime: new Date().toISOString(),
       duration: Date.now() - providerStartTime,
     },
+    isStreaming: true,
   },
 } as StreamingExecution
 return streamingResult
```
Cursor Bugbot has reviewed your changes and found 3 potential issues.
Reviewed by Cursor Bugbot for commit a66fcc0.
```ts
const initialPayload = { ...basePayload }
let originalToolChoice: any
const forcedTools = preparedTools?.forcedTools || []
let usedForcedTools: string[] = []
```
Forced tools never tracked, causing infinite re-forcing
High Severity
usedForcedTools is initialized as an empty array and never updated after tool calls are executed. Every other provider (xAI, OpenRouter, Deepseek) calls checkForForcedToolUsage or trackForcedToolUsage to populate this array. Because it stays empty, the filter on line 371 never removes any tools from remaining, so the same forcedTools[0] is re-forced every iteration until MAX_TOOL_ITERATIONS is hit.
```ts
} else {
  nextPayload.tools = preparedTools?.tools
  nextPayload.tool_choice = 'auto'
}
```
Missing tools in forced-tool branch payload
High Severity
When the forced-tool branch is taken (lines 367–375), nextPayload is spread from basePayload which only contains model and messages. The code sets tool_choice to force a specific function but never sets nextPayload.tools. Sending tool_choice without tools definitions will cause the API call to fail. The else branch correctly sets nextPayload.tools = preparedTools?.tools, but the forced-tool branch omits it.
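A minimal runnable sketch of the corrected branch; the shapes of `basePayload` and `preparedTools` are assumptions taken loosely from the diff, not the actual provider code:

```typescript
// Sketch: building the follow-up payload so that `tools` always accompanies
// a function-forcing `tool_choice`. Shapes below are illustrative only.

type Tool = { type: 'function'; function: { name: string } }

const basePayload = { model: 'example-model', messages: [] as unknown[] }
const preparedTools = {
  tools: [{ type: 'function', function: { name: 'search' } }] as Tool[],
}
const forcedTools = ['search']
const usedForcedTools: string[] = []

const nextPayload: any = { ...basePayload }
const remaining = forcedTools.filter((t) => !usedForcedTools.includes(t))

// The fix: set `tools` in the forced branch too, not only in the 'auto' branch.
nextPayload.tools = preparedTools.tools
nextPayload.tool_choice =
  remaining.length > 0
    ? { type: 'function', function: { name: remaining[0] } }
    : 'auto'
```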
```ts
}

// ── Streaming (no tools) ────────────────────────────────────────────────
if (request.stream && (!tools || tools.length === 0)) {
```
Missing streaming-after-tools response path
Medium Severity
When request.stream is true and tools are present, the provider falls through to the non-streaming code path and returns a ProviderResponse instead of a StreamingExecution. Every comparable provider (xAI, OpenRouter, Deepseek, Cerebras) includes a post-tool-loop check for request.stream that creates a final streaming response. The caller expecting a StreamingExecution will receive the wrong response type.
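A self-contained sketch of the pattern the review describes. The client shape, `StreamingExecution`, and `finishRequest` are all illustrative assumptions, not the actual provider code:

```typescript
// Sketch: after the tool loop, honor `request.stream` by issuing one final
// streaming call instead of returning the accumulated non-streaming response.

type Message = { role: string; content: string }

interface StreamingExecution {
  isStreaming: true
  stream: AsyncIterable<string>
}

interface ProviderResponse {
  content: string
}

// Minimal fake client so the sketch runs standalone.
const client = {
  createStream(_messages: Message[]): AsyncIterable<string> {
    async function* gen() {
      yield 'chunk-1'
      yield 'chunk-2'
    }
    return gen()
  },
}

async function finishRequest(
  stream: boolean,
  messages: Message[],
  accumulated: ProviderResponse
): Promise<StreamingExecution | ProviderResponse> {
  if (stream) {
    // The missing path: a final streaming response after tools have run.
    return { isStreaming: true, stream: client.createStream(messages) }
  }
  return accumulated
}

// Usage: streaming requested after a tool loop yields a StreamingExecution.
finishRequest(true, [{ role: 'user', content: 'hi' }], { content: 'done' }).then(
  async (result) => {
    if ('isStreaming' in result) {
      const chunks: string[] = []
      for await (const c of result.stream) chunks.push(c)
      console.log(chunks.join(',')) // prints "chunk-1,chunk-2"
    }
  }
)
```

Callers can then distinguish the two result types with an `'isStreaming' in result` check, which is exactly what the review says breaks when the flag is omitted.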


Summary
Adds Astraflow (by UCloud / 优刻得) as a new LLM provider in sim. Astraflow is an OpenAI-compatible AI model aggregation platform supporting 200+ models.
Two endpoints are registered:
- `astraflow` — Global endpoint (https://api-us-ca.umodelverse.ai/v1), env var: `ASTRAFLOW_API_KEY`
- `astraflow-cn` — China endpoint (https://api.modelverse.cn/v1), env var: `ASTRAFLOW_CN_API_KEY`

Changes
- `apps/sim/providers/astraflow/index.ts` — New provider implementation (OpenAI-compatible, same pattern as xAI/OpenRouter)
- `apps/sim/providers/types.ts` — Added `'astraflow'` and `'astraflow-cn'` to the `ProviderId` union
- `apps/sim/providers/models.ts` — Added `astraflow` and `astraflow-cn` entries to `PROVIDER_DEFINITIONS`
- `apps/sim/providers/registry.ts` — Imported and registered both providers
- `apps/sim/.env.example` — Documented `ASTRAFLOW_API_KEY` and `ASTRAFLOW_CN_API_KEY`

References
Fixes #(issue)
Type of Change
Testing
Provider follows the same OpenAI-compatible pattern as existing providers (xAI, OpenRouter). Reviewers should verify that both `astraflow` and `astraflow-cn` endpoints are correctly registered and that the env var names (`ASTRAFLOW_API_KEY`, `ASTRAFLOW_CN_API_KEY`) are properly documented in `.env.example`.

Checklist
Screenshots/Videos
N/A