Knuckles-Team/langfuse-agent

Langfuse Agent - A2A | AG-UI | MCP


Version: 0.1.7

Overview

Langfuse Agent MCP Server + A2A Agent

An agent for interacting with the Langfuse Observability API.

This repository is actively maintained; contributions are welcome!

MCP

Using as an MCP Server

The MCP Server can be run in two modes: stdio (for local testing) or http (for networked access).

Environment Variables

  • LANGFUSE_URL: The base URL of the Langfuse instance to connect to.
  • LANGFUSE_TOKEN: The API token used to authenticate with Langfuse.
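
Because the server needs both variables at startup, it can help to fail fast when either is missing. A minimal sketch of such a check (the load_langfuse_env helper is hypothetical, not part of the package):

```python
import os

def load_langfuse_env() -> dict:
    """Read the required Langfuse settings, raising early if any are missing."""
    settings = {}
    for name in ("LANGFUSE_URL", "LANGFUSE_TOKEN"):
        value = os.environ.get(name)
        if not value:
            raise RuntimeError(f"{name} is not set; export it before starting the server")
        settings[name] = value
    return settings
```

Running a check like this before spawning the server turns a vague downstream connection error into an immediate, named failure.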

Run in stdio mode (default):

export LANGFUSE_URL="http://localhost:8080"
export LANGFUSE_TOKEN="your_token"
langfuse-mcp --transport "stdio"

Run in HTTP mode:

export LANGFUSE_URL="http://localhost:8080"
export LANGFUSE_TOKEN="your_token"
langfuse-mcp --transport "http" --host "0.0.0.0" --port "8000"
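
In HTTP mode, MCP clients speak JSON-RPC 2.0 to the server, starting with an initialize handshake. As a sketch of what that first message looks like (the protocol version and client info values here are illustrative assumptions, not verified against this server):

```python
import json

def build_initialize_request(request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 `initialize` request, the first message an MCP client sends."""
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # one published MCP protocol revision; newer ones exist
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.0.1"},
        },
    }
    return json.dumps(payload)
```

In practice an MCP client library handles this handshake for you; the sketch is only meant to show what is on the wire.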

A2A Agent

Run A2A Server

export LANGFUSE_URL="http://localhost:8080"
export LANGFUSE_TOKEN="your_token"
langfuse-agent --provider openai --model-id gpt-4o --api-key sk-...
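
A2A agents advertise their capabilities through an agent card served at a well-known path. Assuming this server follows the standard A2A discovery convention, the card URL can be derived from the base URL like so (the host and port are assumptions matching the defaults above):

```python
from urllib.parse import urljoin

def agent_card_url(base_url: str) -> str:
    """Return the conventional A2A agent-card discovery URL for a server base URL."""
    return urljoin(base_url.rstrip("/") + "/", ".well-known/agent.json")
```

Fetching that URL should return a JSON document describing the agent's name, skills, and endpoints, which is how other A2A clients discover it.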

Docker

Build

docker build -t langfuse-agent .

Run MCP Server

docker run -d \
  --name langfuse-agent \
  -p 8000:8000 \
  -e TRANSPORT=http \
  -e LANGFUSE_URL="http://your-service:8080" \
  -e LANGFUSE_TOKEN="your_token" \
  knucklessg1/langfuse-agent:latest

Deploy with Docker Compose

services:
  langfuse-agent:
    image: knucklessg1/langfuse-agent:latest
    environment:
      - HOST=0.0.0.0
      - PORT=8000
      - TRANSPORT=http
      - LANGFUSE_URL=http://your-service:8080
      - LANGFUSE_TOKEN=your_token
    ports:
      - 8000:8000

Configure mcp.json for AI Integration (e.g. Claude Desktop)

{
  "mcpServers": {
    "langfuse": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "langfuse-agent",
        "langfuse-mcp"
      ],
      "env": {
        "LANGFUSE_URL": "http://your-service:8080",
        "LANGFUSE_TOKEN": "your_token"
      }
    }
  }
}
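
A malformed mcp.json tends to fail silently in the client, so a quick structural check before restarting it can save a debugging round-trip. A minimal sketch (the required keys mirror the snippet above; this is not an official validator):

```python
import json

REQUIRED_SERVER_KEYS = {"command", "args"}

def check_mcp_config(text: str) -> list:
    """Return a list of problems found in an mcp.json document (empty means it looks sane)."""
    try:
        config = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    servers = config.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        return ["missing or empty 'mcpServers' object"]
    problems = []
    for name, server in servers.items():
        missing = REQUIRED_SERVER_KEYS - set(server)
        if missing:
            problems.append(f"server '{name}' is missing keys: {sorted(missing)}")
    return problems
```

Pointing this at the config file before restarting the client catches the two most common mistakes: broken JSON and a server entry with no command.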

Install Python Package

python -m pip install langfuse-agent

or, with uv:

uv pip install langfuse-agent


About

Langfuse Agent loads Langfuse MCP and Pydantic Graph
