A Retrieval-Augmented Generation (RAG) system designed to answer questions about course materials using semantic search and AI-powered responses.
This full-stack web application lets users query course materials and receive intelligent, context-aware responses. It uses ChromaDB for vector storage and Anthropic's Claude for AI generation, and provides a web interface for interaction.
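At a high level, the query flow is: embed the user's question, retrieve the most similar course chunks from the vector store, and hand those chunks to the model as context. A minimal, self-contained sketch of that flow (toy word-overlap scoring stands in for ChromaDB's vector search, and the prompt is printed instead of sent to Claude; all names here are illustrative, not this project's actual code):

```python
import re

# Minimal RAG sketch: retrieve the most relevant chunks, then build the
# prompt a real system would send to the LLM.

def embed(text: str) -> set[str]:
    """Toy 'embedding': the set of lowercase words in the text."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(question: str, chunks: list[str], k: int = 2) -> list[str]:
    """Rank chunks by word overlap with the question (stand-in for semantic search)."""
    q = embed(question)
    return sorted(chunks, key=lambda c: len(q & embed(c)), reverse=True)[:k]

def build_prompt(question: str, chunks: list[str]) -> str:
    """Assemble the retrieved context plus the question into a single prompt."""
    context = "\n".join(retrieve(question, chunks))
    return f"Context:\n{context}\n\nQuestion: {question}"

chunks = [
    "Lesson 1 covers vector embeddings and semantic search.",
    "Lesson 2 covers retrieval-augmented generation.",
    "The syllabus lists office hours on Fridays.",
]
print(build_prompt("Which lesson covers semantic search?", chunks))
```

The real application replaces the word-overlap scoring with ChromaDB's vector similarity search and sends the assembled prompt to the configured AI backend.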
- Python 3.13 or higher
- uv (Python package manager)
- One of the following AI backends:
  - Anthropic API key (for Claude AI) - Paid, best quality
  - Groq API key (for Llama models) - Free, fast
  - Ollama (for local models) - Free, private
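Because several backends are supported, startup code typically picks one based on what is configured. A hypothetical sketch of such selection logic (the `AI_PROVIDER` variable and the fallback order are assumptions for illustration, not this project's actual configuration; see MULTI_BACKEND_GUIDE.md for the real setup):

```python
import os

def pick_backend() -> str:
    """Choose an AI backend from the environment.

    Hypothetical logic: honor an explicit AI_PROVIDER override,
    otherwise fall back based on which API keys are set.
    """
    provider = os.environ.get("AI_PROVIDER", "").lower()
    if provider in {"anthropic", "groq", "ollama"}:
        return provider
    if os.environ.get("ANTHROPIC_API_KEY"):
        return "anthropic"  # paid, best quality
    if os.environ.get("GROQ_API_KEY"):
        return "groq"       # free hosted Llama models
    return "ollama"         # local models, no key needed
```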
💡 New to this project? Check out:
- `OLLAMA_STARTUP_GUIDE.md` - Complete guide for using local Ollama models (recommended for beginners)
- `MULTI_BACKEND_GUIDE.md` - Comparison and setup for all AI backends
- `QUICK_START.md` - Quick start guide with Groq (free)
- Install dependencies:

  ```bash
  # Install uv (if not already installed)
  curl -LsSf https://astral.sh/uv/install.sh | sh

  # Install Python dependencies
  uv sync
  ```
- Set up environment variables:

  Create a `.env` file in the root directory:

  ```bash
  ANTHROPIC_API_KEY=your_anthropic_api_key_here
  ```
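If you use Groq or Ollama instead of Claude, the required variable differs; the exact names are documented in MULTI_BACKEND_GUIDE.md. A sketch of possible `.env` contents (variable names other than `ANTHROPIC_API_KEY` are assumptions):

```
# Pick ONE backend. Names below other than ANTHROPIC_API_KEY are illustrative.
ANTHROPIC_API_KEY=your_anthropic_api_key_here   # Claude (paid)
# GROQ_API_KEY=your_groq_api_key_here           # Groq (free)
# Ollama usually needs no API key; it runs locally.
```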
Use the provided shell script:

```bash
chmod +x run.sh
./run.sh
```

Or start the backend manually:

```bash
cd backend
uv run uvicorn app:app --reload --port 8000
```

The application will be available at:
- Web Interface: http://localhost:8000
- API Documentation: http://localhost:8000/docs