A Model Context Protocol (MCP) server that provides tools for interacting with Sunra.ai services.
```bash
npx @sunra/mcp-server --help
```

Or install from source:

```bash
git clone https://github.com/sunra-ai/sunra-clients.git
cd sunra-clients/mcp-server
npm install
npm run build
```

```
sunra-mcp-server [options]

Options:
  -t, --transport <type>   Transport type: 'stdio' or 'http' (default: stdio)
  -p, --port <number>      Port for HTTP transport (default: 3000)
  -h, --host <string>      Host for HTTP transport (default: localhost)
  --help                   Show this help message

Examples:
  sunra-mcp-server                     # Start with stdio transport
  sunra-mcp-server --transport http    # Start with HTTP transport on port 3000
  sunra-mcp-server -t http -p 8080     # Start with HTTP transport on port 8080
```

Add the following to your `.cursor/mcp.json` file:
```json
{
  "mcpServers": {
    "sunra-mcp-server": {
      "command": "npx",
      "args": ["@sunra/mcp-server"],
      "env": {
        "SUNRA_KEY": "${SUNRA_KEY}"
      }
    }
  }
}
```

Add the following to your Claude Desktop configuration file:
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
```json
{
  "mcpServers": {
    "sunra-mcp-server": {
      "command": "npx",
      "args": ["@sunra/mcp-server"],
      "env": {
        "SUNRA_KEY": "${SUNRA_KEY}"
      }
    }
  }
}
```

- Base Tools: Submit, status, result, cancel, and subscribe operations
- Model Management: List, search, and get schema information for AI models
- File Management: Upload files to Sunra.ai
- Authentication: Secure API key management
- Multiple Transports: Supports both stdio (for Claude Desktop) and HTTP (for Cursor)
- `submit` - Submit a request to a model endpoint
- `status` - Check the status of a request
- `result` - Get the result of a completed request
- `cancel` - Cancel a pending request
- `subscribe` - Submit a request and wait for completion
- `list-models` - List all available models
- `search-models` - Search for models by name or description
- `model-schema` - Get input and output schemas for a specific model endpoint
- `upload` - Upload files to Sunra.ai storage
- `set-sunra-key` - Configure your Sunra.ai API key
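MCP clients invoke these tools with JSON-RPC `tools/call` requests. As a minimal sketch, the following builds such a request for the `submit` tool; the argument names (`model`, `prompt`) are hypothetical — use the `model-schema` tool to discover the real input schema for an endpoint.

```python
import json

def make_tool_call(request_id, tool, arguments):
    """Build an MCP 'tools/call' JSON-RPC request payload."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# Hypothetical arguments -- consult model-schema for the actual input schema.
request = make_tool_call(1, "submit", {
    "model": "black-forest-labs/flux-kontext-max/text-to-image",
    "prompt": "a watercolor fox",
})
wire = json.dumps(request)  # the stdio transport sends one JSON message per line
```

Over the stdio transport each message is written as a single line; over HTTP the same payload is sent in a request body.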
The `model-schema` tool accepts a model slug in the format `owner/model/endpoint` and returns only the input and output schemas:
```bash
# Get schema for a specific model endpoint
model-schema --modelSlug "black-forest-labs/flux-kontext-max/text-to-image"
```

The tool automatically resolves OpenAPI `$ref` references to provide fully expanded schemas. For example, if the original OpenAPI schema contains:
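The three slug components map directly onto the `owner`, `model`, and `endpoint` fields of the response shown below. A minimal sketch of that parsing (not the server's actual code):

```python
def parse_model_slug(slug: str) -> dict:
    """Split an 'owner/model/endpoint' slug into its three parts."""
    parts = slug.split("/")
    if len(parts) != 3:
        raise ValueError(f"expected owner/model/endpoint, got {slug!r}")
    owner, model, endpoint = parts
    return {"owner": owner, "model": model, "endpoint": endpoint}
```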
```json
{
  "schema": {
    "$ref": "#/components/schemas/TextToVideoInput"
  }
}
```

The tool will resolve this reference and return the actual schema definition:
```json
{
  "inputSchema": {
    "type": "object",
    "properties": {
      "prompt": {
        "type": "string",
        "description": "Text prompt for video generation"
      },
      "duration": {
        "type": "integer",
        "enum": [5, 10],
        "description": "Duration of the video in seconds"
      }
    },
    "required": ["prompt"]
  }
}
```

The tool handles:
- ✅ Simple references (`#/components/schemas/SchemaName`)
- ✅ Nested references within objects and arrays
- ✅ Circular references (marked with `$circular: true`)
- ✅ Missing references (graceful fallback to the original `$ref`)
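The behaviors above can be illustrated with a minimal resolver sketch — this is not the server's implementation, just one way the `$circular` marker and missing-reference fallback can work:

```python
def resolve_refs(node, root, seen=None):
    """Recursively expand local '#/...' $refs against the root document."""
    seen = seen or set()
    if isinstance(node, dict):
        ref = node.get("$ref")
        if isinstance(ref, str) and ref.startswith("#/"):
            if ref in seen:
                return {"$circular": True}   # circular reference marker
            target = root
            for key in ref[2:].split("/"):   # walk e.g. components/schemas/Name
                if not isinstance(target, dict) or key not in target:
                    return node              # missing reference: keep original $ref
                target = target[key]
            return resolve_refs(target, root, seen | {ref})
        # no $ref here: recurse into nested objects
        return {k: resolve_refs(v, root, seen) for k, v in node.items()}
    if isinstance(node, list):
        return [resolve_refs(item, root, seen) for item in node]
    return node
```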
Response format:
```json
{
  "success": true,
  "modelSlug": "black-forest-labs/flux-kontext-max/text-to-image",
  "owner": "black-forest-labs",
  "model": "flux-kontext-max",
  "endpoint": "text-to-image",
  "inputSchema": {
    "type": "object",
    "properties": {
      "prompt": {
        "type": "string",
        "description": "Text prompt for image generation"
      }
    },
    "required": ["prompt"]
  },
  "outputSchema": {
    "type": "object",
    "properties": {
      "id": {
        "type": "string",
        "description": "Request ID"
      },
      "status": {
        "type": "string",
        "description": "Request status"
      },
      "output": {
        "type": "object",
        "description": "Generated output"
      }
    }
  }
}
```

Run the tests:

```bash
npm test
```

Build and start the server:

```bash
npm run build
npm start
```

Set your Sunra.ai API key as an environment variable:
```bash
export SUNRA_KEY="your-api-key-here"
```

Or use the `set-sunra-key` tool at runtime.
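A tool or script consuming this server can read the same variable and fail early when it is missing — a small sketch, not part of the package:

```python
import os

def get_sunra_key() -> str:
    """Read the API key from the environment; fail early if it is unset."""
    key = os.environ.get("SUNRA_KEY")
    if not key:
        raise RuntimeError("SUNRA_KEY is not set; export it or use the set-sunra-key tool")
    return key
```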
To publish to npm:
```bash
npm run build
npm publish
```

For detailed API documentation, see the Sunra.ai API documentation.