OGX


Quick Start | Documentation | OpenAI API Compatibility | Slack

Open-source agentic API server for building AI applications. OpenAI-compatible. Any model, any infrastructure.

OGX Architecture

OGX is a drop-in replacement for the OpenAI API that you can run anywhere — your laptop, your datacenter, or the cloud. Use any OpenAI-compatible client or agentic framework. Swap between Llama, GPT, Gemini, Mistral, or any model without changing your application code.

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8321/v1", api_key="fake")
response = client.chat.completions.create(
    model="llama-3.3-70b",
    messages=[{"role": "user", "content": "Hello"}],
)

What you get

  • Chat Completions & Embeddings — standard /v1/chat/completions, /v1/completions, and /v1/embeddings endpoints, compatible with any OpenAI client
  • Responses API — server-side agentic orchestration with tool calling, MCP server integration, and built-in file search (RAG) in a single API call (learn more)
  • Vector Stores & Files — /v1/vector_stores and /v1/files for managed document storage and search
  • Batches — /v1/batches for offline batch processing
  • Open Responses conformant — the Responses API implementation passes the Open Responses conformance test suite
  • Multi-SDK support — use the Anthropic SDK (/v1/messages) or Google GenAI SDK (/v1alpha/interactions) natively alongside the OpenAI API
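To make the Responses API item above concrete, the sketch below assembles the JSON body such a request might carry, with a file-search (RAG) tool attached. The field names follow the OpenAI Responses API shape; the model name, vector store ID, and endpoint path are illustrative placeholders, not values taken from this project.

```python
import json

def build_responses_request(model: str, prompt: str, vector_store_id: str) -> dict:
    """Assemble a Responses API payload with built-in file search enabled."""
    return {
        "model": model,
        "input": prompt,
        "tools": [
            # The file_search tool asks the server to retrieve from the given
            # vector store as part of generating the answer.
            {"type": "file_search", "vector_store_ids": [vector_store_id]},
        ],
    }

# Placeholder values; a real client would POST this body to /v1/responses.
payload = build_responses_request(
    model="llama-3.3-70b",
    prompt="Summarize the uploaded design doc.",
    vector_store_id="vs_example123",
)
print(json.dumps(payload, indent=2))
```

With the official OpenAI SDK, the equivalent call is client.responses.create(...) against the same base URL used in the chat-completions snippet above.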

Use any model, use any infrastructure

OGX has a pluggable provider architecture. Develop locally with Ollama, deploy to production with vLLM, or connect to a managed service — the API stays the same.

See the provider documentation for the full list.
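One way to picture the pluggable-provider idea: application code keeps a single code path, and only a small configuration entry changes per environment. The endpoint URLs and model names below are hypothetical examples for illustration, not OGX defaults.

```python
# Hypothetical per-environment settings; only this config changes between
# dev and prod, never the application code that calls the OpenAI-compatible API.
PROVIDERS = {
    "dev":  {"base_url": "http://localhost:8321/v1", "model": "llama-3.2-3b"},      # e.g. Ollama, local
    "prod": {"base_url": "http://ogx.internal:8321/v1", "model": "llama-3.3-70b"},  # e.g. vLLM cluster
}

def client_settings(env: str) -> dict:
    """Resolve the OpenAI-client constructor arguments for an environment."""
    cfg = PROVIDERS[env]
    return {"base_url": cfg["base_url"], "api_key": "fake"}  # key is unused locally

print(client_settings("dev"))
```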

Get started

Install and run an OGX server:

# One-line install
curl -LsSf https://github.com/ogx-ai/ogx/raw/main/scripts/install.sh | bash

# Or install via uv
uv pip install ogx[starter]

# Start the server (uses the starter distribution with Ollama)
uv run ogx run starter

Then connect with any OpenAI, Anthropic, or Google GenAI client — Python, TypeScript, curl, or any framework that speaks these APIs.

See the Quick Start guide for detailed setup.

Resources

Client SDKs:

Language     SDK Package
Python       ogx-client-python
TypeScript   ogx-client-typescript

Community

We hold regular community calls every Thursday at 09:00 AM PST — see the Community Event on Slack for details.


Thanks to all our amazing contributors!

