Starting with Streamlit 1.53, we're introducing experimental support for running Streamlit with Starlette as the underlying web framework. This is the first step toward our long-term goal of fully migrating from Tornado to Starlette/ASGI, and we need your help to identify any issues or gaps before the full transition.
## What's New
Note: To use the experimental Starlette features, you need to install the optional dependencies:

```bash
pip install streamlit[starlette]
```
### 1. Starlette Server Mode (Experimental)
You can now run Streamlit using Starlette instead of Tornado by setting a config option:

```toml
# .streamlit/config.toml
[server]
useStarlette = true
```
Or via the command line:

```bash
streamlit run app.py --server.useStarlette=true
```
### 2. ASGI App Entry Point (Experimental)
For advanced use cases, you can now create an ASGI-compatible Streamlit application using the new `App` class:

```python
from streamlit.starlette import App

app = App("main.py")
```

Run it with `streamlit run` (preferred): Streamlit will automatically detect the `App` instance and run it in Starlette mode. Alternatively, you can use any ASGI server directly:

```bash
uvicorn myapp:app --host 0.0.0.0 --port 8501
```
The `App` class supports advanced configuration, including:
- Custom HTTP routes — Add REST API endpoints alongside your Streamlit app
- Middleware — Add security headers, authentication, logging, etc.
- Lifespan hooks — Run code on startup/shutdown (e.g., pre-warm caches, initialize resources)
- Framework integration — Mount Streamlit inside FastAPI, Django, or other ASGI frameworks
📖 For more details on the `App` interface and its full capabilities, see the specification in PR #13449.
#### Example: Adding custom routes

```python
from streamlit.starlette import App
from starlette.routing import Route
from starlette.responses import JSONResponse

async def health_check(request):
    return JSONResponse({"status": "healthy", "service": "streamlit-app"})

async def get_data(request):
    return JSONResponse({"items": ["apple", "banana", "cherry"], "count": 3})

app = App(
    "main.py",
    routes=[
        Route("/api/health", health_check),
        Route("/api/data", get_data),
    ],
)
```
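If it helps to reason about what those `Route` objects do, here is a dependency-free sketch of the same idea: a table mapping paths to async handlers. The dict-based responses and the `dispatch` helper are stand-ins for illustration only, not Starlette's actual router.

```python
import asyncio

# Stand-in handlers mirroring the example above; plain dicts take the
# place of JSONResponse so this runs without Starlette installed.
async def health_check(request):
    return {"status": "healthy", "service": "streamlit-app"}

async def get_data(request):
    return {"items": ["apple", "banana", "cherry"], "count": 3}

# A route table is conceptually just path -> handler.
routes = {"/api/health": health_check, "/api/data": get_data}

async def dispatch(path):
    # Look up the handler for the path and await it, as a router would.
    handler = routes.get(path)
    if handler is None:
        return {"status": 404, "error": "not found"}
    return await handler({"path": path})

result = asyncio.run(dispatch("/api/health"))
```

The real router also handles path parameters, methods, and mounts; this only shows the lookup-and-await core.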
#### Example: Adding custom middleware

```python
from streamlit.starlette import App
from starlette.middleware import Middleware
from starlette.middleware.base import BaseHTTPMiddleware

class SecurityHeadersMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request, call_next):
        response = await call_next(request)
        response.headers["X-Content-Type-Options"] = "nosniff"
        response.headers["X-Frame-Options"] = "SAMEORIGIN"
        return response

app = App(
    "main.py",
    middleware=[Middleware(SecurityHeadersMiddleware)],
)
```
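To see what the `dispatch` hook is doing without starting a server, here is a stdlib-only sketch of the same wrap-then-mutate pattern. The dict-based request/response objects are hypothetical stand-ins, not Starlette's types.

```python
import asyncio

# Innermost "app": returns a bare response with no security headers.
async def app_handler(request):
    return {"status": 200, "headers": {}}

def security_headers_middleware(call_next):
    # Mirrors BaseHTTPMiddleware.dispatch: await the downstream handler,
    # then mutate the response headers before returning it upstream.
    async def dispatch(request):
        response = await call_next(request)
        response["headers"]["X-Content-Type-Options"] = "nosniff"
        response["headers"]["X-Frame-Options"] = "SAMEORIGIN"
        return response
    return dispatch

wrapped = security_headers_middleware(app_handler)
response = asyncio.run(wrapped({"path": "/"}))
```

Every response that passes through the wrapper picks up the headers, which is why middleware is a good fit for cross-cutting concerns like security headers and logging.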
#### Example: Mounting FastAPI in Streamlit

```python
from fastapi import FastAPI
from starlette.routing import Mount
from streamlit.starlette import App

# Create FastAPI sub-application
api = FastAPI()

@api.get("/health")
async def health():
    return {"status": "healthy"}

@api.post("/predict")
async def predict(data: dict):
    return {"prediction": data.get("value", 0) * 2}

# Mount FastAPI into the Streamlit app:
# - Streamlit UI at /
# - FastAPI endpoints at /api/*
# - FastAPI docs at /api/docs
app = App(
    "dashboard.py",
    routes=[Mount("/api", app=api)],
)
```
#### Example: Mounting Streamlit in FastAPI

```python
from fastapi import FastAPI
from streamlit.starlette import App

# Create the Streamlit sub-application
streamlit_app = App("dashboard.py")

# Create FastAPI with Streamlit's lifespan to manage the runtime lifecycle
api = FastAPI(lifespan=streamlit_app.lifespan())

@api.get("/api/data")
async def get_data():
    return {"data": [1, 2, 3]}

# Mount Streamlit under /dashboard
api.mount("/dashboard", streamlit_app)

# Run with: uvicorn myapp:api
```
#### Example: Lifespan hooks for startup/shutdown

```python
from contextlib import asynccontextmanager
from streamlit.starlette import App

@asynccontextmanager
async def lifespan(app):
    # Startup: runs before accepting connections
    print("🚀 Starting up...")
    model = load_ml_model()  # Pre-warm caches
    db = init_db_connection()
    # Yield state accessible via app.state
    yield {"model": model, "db": db}
    # Shutdown: clean up resources
    print("👋 Shutting down...")
    db.close()

app = App("main.py", lifespan=lifespan)
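The lifespan hook is a standard `asynccontextmanager`, so its semantics can be shown with the stdlib alone. In this sketch, `serve_once` plays the role of the ASGI server (enter the context before serving, exit on shutdown), and the yielded dict stands in for the state the app can access; the `events` list just records the order of execution.

```python
import asyncio
from contextlib import asynccontextmanager

events = []

@asynccontextmanager
async def lifespan(app):
    # Startup: runs before the "server" starts handling requests.
    events.append("startup")
    # The yielded mapping becomes the state available while serving.
    yield {"model": "fake-model"}  # stand-in for load_ml_model()
    # Shutdown: runs after serving stops.
    events.append("shutdown")

async def serve_once():
    # What an ASGI server does conceptually: startup, serve, shutdown.
    async with lifespan(None) as state:
        events.append(f"serving with {state['model']}")

asyncio.run(serve_once())
```

Because shutdown code sits after the `yield`, cleanup like `db.close()` in the example above runs even as the server exits, without separate startup/shutdown callbacks.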
## Why We're Migrating to Starlette/ASGI
The migration to Starlette/ASGI will unlock numerous frequently-requested features:
- Custom HTTP Endpoints
- Security Headers & Middleware
- Framework Integration & ASGI Support
- Server Configuration & Lifecycle
- SEO & Metadata
## We Need Your Feedback
Since we plan to eventually make Starlette the default (and only) server backend, we want to catch any compatibility issues early. Please help us by:
### 1. Try it out

Enable Starlette mode in your existing apps:

```bash
streamlit run your_app.py --server.useStarlette=true
```
### 2. Report any issues
If you encounter problems, please comment below with:
- What happened — Error messages, unexpected behavior
- Your environment — OS, Python version, Streamlit version
- How to reproduce — Minimal code example if possible
- Comparison — Does the issue occur with the default Tornado server?
### 3. Share your use cases
We're especially interested in:
- ⚡ Performance comparisons (response times, memory usage, concurrent users)
- 🔌 Integration scenarios (FastAPI, Django, other ASGI frameworks)
- 🛡️ Security/middleware requirements
- 🚀 Deployment configurations (uvicorn, gunicorn, hypercorn, cloud platforms)
## Known Limitations
- This is experimental — APIs may change before the final release
- TBD
Thank you for helping us build a better Streamlit! Your early feedback is invaluable in ensuring a smooth transition for the entire community. 🙏
Please comment below with your experiences, questions, or concerns.
