
feat: add Astraflow provider support#4614

Open
ucloudnb666 wants to merge 1 commit into simstudioai:main from ucloudnb666:feat/astraflow-1778817979

Conversation

@ucloudnb666

Summary

Adds Astraflow (by UCloud / 优刻得) as a new LLM provider in sim. Astraflow is an OpenAI-compatible AI model aggregation platform supporting 200+ models.

Two endpoints are registered:

  • astraflow — Global endpoint (https://api-us-ca.umodelverse.ai/v1), env var: ASTRAFLOW_API_KEY
  • astraflow-cn — China endpoint (https://api.modelverse.cn/v1), env var: ASTRAFLOW_CN_API_KEY
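The two registrations above can be sketched as a small lookup, purely for illustration (the function and type names here are hypothetical, not the PR's actual code; only the endpoint URLs and env var names come from the description):

```typescript
// Illustrative sketch: map each Astraflow provider ID to its endpoint
// and API-key env var, mirroring the two registrations listed above.
type AstraflowEndpoint = { baseURL: string; envVar: string }

const ASTRAFLOW_ENDPOINTS: Record<string, AstraflowEndpoint> = {
  'astraflow': {
    baseURL: 'https://api-us-ca.umodelverse.ai/v1',
    envVar: 'ASTRAFLOW_API_KEY',
  },
  'astraflow-cn': {
    baseURL: 'https://api.modelverse.cn/v1',
    envVar: 'ASTRAFLOW_CN_API_KEY',
  },
}

// Resolve the endpoint config for a provider ID, throwing on unknown IDs.
function resolveAstraflowEndpoint(providerId: string): AstraflowEndpoint {
  const endpoint = ASTRAFLOW_ENDPOINTS[providerId]
  if (!endpoint) throw new Error(`Unknown Astraflow provider: ${providerId}`)
  return endpoint
}
```

An OpenAI-compatible client would then be constructed with the resolved `baseURL` and the key read from the resolved env var.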

Changes

  • apps/sim/providers/astraflow/index.ts — New provider implementation (OpenAI-compatible, same pattern as xAI/OpenRouter)
  • apps/sim/providers/types.ts — Added 'astraflow' and 'astraflow-cn' to ProviderId union
  • apps/sim/providers/models.ts — Added astraflow and astraflow-cn entries to PROVIDER_DEFINITIONS
  • apps/sim/providers/registry.ts — Imported and registered both providers
  • apps/sim/.env.example — Documented ASTRAFLOW_API_KEY and ASTRAFLOW_CN_API_KEY

References

Fixes #(issue)

Type of Change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Other: ___________

Testing

Provider follows the same OpenAI-compatible pattern as existing providers (xAI, OpenRouter). Reviewers should verify that both astraflow and astraflow-cn endpoints are correctly registered and that the env var names (ASTRAFLOW_API_KEY, ASTRAFLOW_CN_API_KEY) are properly documented in .env.example.

Checklist

  • Code follows project style guidelines
  • Self-reviewed my changes
  • Tests added/updated and passing
  • No new warnings introduced
  • I confirm that I have read and agree to the terms outlined in the Contributor License Agreement (CLA)

Screenshots/Videos

N/A

Signed-off-by: ucloudnb666 <ucloudnb666@users.noreply.github.com>
@vercel

vercel Bot commented May 15, 2026


1 Skipped Deployment

  • Project: docs | Deployment: Skipped | Updated (UTC): May 15, 2026 4:06am


@cursor

cursor Bot commented May 15, 2026

PR Summary

Medium Risk
Adds a new third-party LLM provider integration (including streaming and tool-calling loops), which can affect request execution, token/cost accounting, and error handling for the new provider IDs, but is largely additive to existing providers.

Overview
Adds Astraflow provider support (global astraflow and China astraflow-cn) using the OpenAI-compatible chat.completions API, including streaming responses, tool-calling iteration (MAX_TOOL_ITERATIONS), structured output prompting via json_object, and timing/token/cost accounting.

Registers both provider IDs in the provider registry and model/provider definitions (with reseller capabilities and model ID patterns), and documents the optional ASTRAFLOW_API_KEY / ASTRAFLOW_CN_API_KEY env vars in .env.example.
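The tool-calling iteration the overview mentions can be sketched as follows. This is a minimal illustration of the general pattern, not the PR's actual code; `callModel`, `executeTool`, and the message shapes are stand-ins (a real OpenAI-compatible loop also threads `tool_call_id` through the tool messages):

```typescript
// Minimal sketch of an OpenAI-compatible tool-calling loop with an
// iteration cap. `callModel` stands in for chat.completions.create.
const MAX_TOOL_ITERATIONS = 10

type ToolCall = { name: string; args: string }
type ModelResponse = { content: string | null; toolCalls: ToolCall[] }

async function runToolLoop(
  callModel: (messages: any[]) => Promise<ModelResponse>,
  executeTool: (call: ToolCall) => Promise<string>,
  messages: any[]
): Promise<string> {
  for (let i = 0; i < MAX_TOOL_ITERATIONS; i++) {
    const response = await callModel(messages)
    // No tool calls: the model produced its final answer.
    if (response.toolCalls.length === 0) return response.content ?? ''
    // Execute requested tools in parallel and feed results back.
    const results = await Promise.all(response.toolCalls.map(executeTool))
    messages.push({ role: 'assistant', tool_calls: response.toolCalls })
    results.forEach((r) => messages.push({ role: 'tool', content: r }))
  }
  throw new Error('MAX_TOOL_ITERATIONS exceeded')
}
```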

Reviewed by Cursor Bugbot for commit a66fcc0. Bugbot is set up for automated code reviews on this repo.

@greptile-apps

greptile-apps Bot commented May 15, 2026

Greptile Summary

This PR adds Astraflow (UCloud's OpenAI-compatible aggregation platform) as a new provider, exposing both a global and a China endpoint. The registration, type definitions, and environment variable documentation are all consistent with existing provider patterns.

  • apps/sim/providers/astraflow/index.ts — New provider implementation following the xAI/OpenRouter pattern, but with three deviations from the reference: forced-tool cycling is broken because usedForcedTools is never updated, subsequent tool-loop calls in the forced-tool branch omit the tools array, and the streaming execution result is missing isStreaming: true.
  • apps/sim/providers/models.ts, registry.ts, types.ts, .env.example — Wiring and documentation are correct and follow project conventions.

Confidence Score: 3/5

The provider wiring is safe but the core execution logic has multiple defects affecting tool-use and streaming paths.

Three defects in the implementation file: forced-tool iteration always re-forces the same first tool, follow-up API calls in forced-tool mode omit the tools array entirely, and streaming responses lack the isStreaming flag. All three affect live execution paths.

apps/sim/providers/astraflow/index.ts — should be reconciled against apps/sim/providers/xai/index.ts as the closest reference.

Important Files Changed

  • apps/sim/providers/astraflow/index.ts — New OpenAI-compatible provider; three logic bugs: usedForcedTools never updated (tool cycling broken), missing tools list in forced-tool nextPayload branch, and isStreaming flag absent from streaming result.
  • apps/sim/providers/models.ts — Adds astraflow and astraflow-cn provider definitions with empty model lists and correct isReseller flag; straightforward and consistent with existing patterns.
  • apps/sim/providers/registry.ts — Imports and registers both astraflow providers correctly; no issues.
  • apps/sim/providers/types.ts — Adds astraflow and astraflow-cn to the ProviderId union type; correct.
  • apps/sim/.env.example — Documents ASTRAFLOW_API_KEY and ASTRAFLOW_CN_API_KEY with helpful endpoint comments; consistent with existing .env.example style.

Sequence Diagram

sequenceDiagram
    participant C as Caller
    participant AP as AstraflowProvider
    participant OAI as OpenAI-compat API

    C->>AP: executeRequest(request)
    alt "stream=true AND no tools"
        AP->>OAI: chat.completions.create(stream:true)
        OAI-->>AP: AsyncIterable chunks
        AP-->>C: StreamingExecution (isStreaming missing)
    else non-streaming / tool loop
        AP->>OAI: chat.completions.create(initialPayload)
        OAI-->>AP: response
        loop tool iterations
            AP->>AP: execute tools in parallel
            AP->>OAI: chat.completions.create(nextPayload)
            OAI-->>AP: next response
            note over AP: usedForcedTools never updated
        end
        AP-->>C: ProviderResponse
    end

Reviews (1): Last reviewed commit: "feat: add Astraflow provider support"

Comment on lines +362 to +375

    const thisToolsTime = Date.now() - toolsStartTime
    toolsTime += thisToolsTime

    let nextPayload: any = { ...basePayload, messages: currentMessages }
    if (typeof originalToolChoice === 'object' && forcedTools.length > 0) {
      const remaining = forcedTools.filter((t) => !usedForcedTools.includes(t))
      nextPayload.tool_choice =
        remaining.length > 0
          ? { type: 'function', function: { name: remaining[0] } }
          : 'auto'

P1 usedForcedTools never updated — forced-tool cycling is broken

usedForcedTools is initialised as [] and never mutated inside the tool loop, unlike in the reference xAI provider where checkForForcedToolUsage updates it after every model response. Because the filter forcedTools.filter((t) => !usedForcedTools.includes(t)) always sees an empty exclusion set, remaining always equals the full forcedTools list, so the provider forces the same first tool on every single iteration instead of advancing through the sequence. Every iteration will re-force forcedTools[0], making the tool-usage-control feature entirely non-functional.
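A minimal sketch of the missing bookkeeping, modelled on the `checkForForcedToolUsage` behaviour the comment attributes to the xAI provider (the function name and signature here are illustrative, not the actual helper):

```typescript
// Sketch: after each model response, record which forced tools the model
// actually invoked, so the next iteration advances to the next unused
// tool instead of re-forcing the first one. Names are illustrative.
function trackForcedToolUsage(
  usedForcedTools: string[],
  forcedTools: string[],
  calledToolNames: string[]
): { type: 'function'; function: { name: string } } | 'auto' {
  // Record any forced tools the model just called.
  for (const name of calledToolNames) {
    if (forcedTools.includes(name) && !usedForcedTools.includes(name)) {
      usedForcedTools.push(name)
    }
  }
  // Force the next unused tool, or fall back to 'auto' once all are used.
  const remaining = forcedTools.filter((t) => !usedForcedTools.includes(t))
  return remaining.length > 0
    ? { type: 'function', function: { name: remaining[0] } }
    : 'auto'
}
```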

Comment on lines +366 to +379
let nextPayload: any = { ...basePayload, messages: currentMessages }
if (
typeof originalToolChoice === 'object' &&
forcedTools.length > 0
) {
const remaining = forcedTools.filter((t) => !usedForcedTools.includes(t))
nextPayload.tool_choice =
remaining.length > 0
? { type: 'function', function: { name: remaining[0] } }
: 'auto'
} else {
nextPayload.tools = preparedTools?.tools
nextPayload.tool_choice = 'auto'
}

P1 nextPayload.tools missing in the forced-tool branch

When originalToolChoice is an object (forced tool mode), the payload is built from { ...basePayload, messages: currentMessages } and only tool_choice is appended — tools is never added. The OpenAI-compatible endpoint receives a tool_choice that references a function but no tools array to resolve it against, which will either cause an API error or silently ignore the constraint. The reference xAI provider always sets both tools and tool_choice together in this branch.

Suggested change

Before:

    let nextPayload: any = { ...basePayload, messages: currentMessages }
    if (typeof originalToolChoice === 'object' && forcedTools.length > 0) {
      const remaining = forcedTools.filter((t) => !usedForcedTools.includes(t))
      nextPayload.tool_choice =
        remaining.length > 0
          ? { type: 'function', function: { name: remaining[0] } }
          : 'auto'
    } else {
      nextPayload.tools = preparedTools?.tools
      nextPayload.tool_choice = 'auto'
    }

After (adds nextPayload.tools to the forced-tool branch):

    let nextPayload: any = { ...basePayload, messages: currentMessages }
    if (typeof originalToolChoice === 'object' && forcedTools.length > 0) {
      const remaining = forcedTools.filter((t) => !usedForcedTools.includes(t))
      nextPayload.tools = preparedTools?.tools
      nextPayload.tool_choice =
        remaining.length > 0
          ? { type: 'function', function: { name: remaining[0] } }
          : 'auto'
    } else {
      nextPayload.tools = preparedTools?.tools
      nextPayload.tool_choice = 'auto'
    }

Comment on lines +191 to +200
logs: [],
metadata: {
startTime: providerStartTimeISO,
endTime: new Date().toISOString(),
duration: Date.now() - providerStartTime,
},
},
} as StreamingExecution

return streamingResult

P1 isStreaming: true missing from streaming execution result

Every other provider (xAI, etc.) sets isStreaming: true inside the execution object returned as StreamingExecution. Callers that check this flag to distinguish a streaming result from a completed ProviderResponse will treat this as a non-streaming response and may attempt to read .content directly instead of consuming the stream.

Suggested change

Before:

        logs: [],
        metadata: {
          startTime: providerStartTimeISO,
          endTime: new Date().toISOString(),
          duration: Date.now() - providerStartTime,
        },
      },
    } as StreamingExecution
    return streamingResult

After (adds isStreaming: true to the execution object):

        logs: [],
        metadata: {
          startTime: providerStartTimeISO,
          endTime: new Date().toISOString(),
          duration: Date.now() - providerStartTime,
        },
        isStreaming: true,
      },
    } as StreamingExecution
    return streamingResult


@cursor cursor Bot left a comment


Cursor Bugbot has reviewed your changes and found 3 potential issues.



    const initialPayload = { ...basePayload }
    let originalToolChoice: any
    const forcedTools = preparedTools?.forcedTools || []
    let usedForcedTools: string[] = []

Forced tools never tracked, causing infinite re-forcing

High Severity

usedForcedTools is initialized as an empty array and never updated after tool calls are executed. Every other provider (xAI, OpenRouter, Deepseek) calls checkForForcedToolUsage or trackForcedToolUsage to populate this array. Because it stays empty, the filter on line 371 never removes any tools from remaining, so the same forcedTools[0] is re-forced every iteration until MAX_TOOL_ITERATIONS is hit.


    } else {
      nextPayload.tools = preparedTools?.tools
      nextPayload.tool_choice = 'auto'
    }

Missing tools in forced-tool branch payload

High Severity

When the forced-tool branch is taken (lines 367–375), nextPayload is spread from basePayload which only contains model and messages. The code sets tool_choice to force a specific function but never sets nextPayload.tools. Sending tool_choice without tools definitions will cause the API call to fail. The else branch correctly sets nextPayload.tools = preparedTools?.tools, but the forced-tool branch omits it.


    }

    // ── Streaming (no tools) ──────────────────────────────────────────────
    if (request.stream && (!tools || tools.length === 0)) {

Missing streaming-after-tools response path

Medium Severity

When request.stream is true and tools are present, the provider falls through to the non-streaming code path and returns a ProviderResponse instead of a StreamingExecution. Every comparable provider (xAI, OpenRouter, Deepseek, Cerebras) includes a post-tool-loop check for request.stream that creates a final streaming response. The caller expecting a StreamingExecution will receive the wrong response type.
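One possible shape for the missing path is sketched below. This is a hedged illustration of the payload the post-tool-loop streaming call would send, not the actual fix (which should mirror the xAI implementation the comment cites); `basePayload` and `currentMessages` are stand-ins for the provider's local variables:

```typescript
// Sketch: once the tool loop finishes and the caller asked for a
// stream, build one final streaming request for the answer. Tool
// definitions are dropped since all tool work is already done.
function buildFinalStreamingPayload(
  basePayload: Record<string, any>,
  currentMessages: any[]
): Record<string, any> {
  return {
    ...basePayload,
    messages: currentMessages,
    stream: true,
    // No tools/tool_choice: the final turn is plain text generation.
    tools: undefined,
    tool_choice: undefined,
  }
}
```

The provider would pass this payload to one last chat.completions call and wrap the resulting stream in a StreamingExecution, as the comparable providers do.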

