A music industry management platform for artists, managers, and collaborators. Handles artist profiles, project/work management, rights registration, contract analysis, royalty calculations, and collaboration workflows.
- Conda (Miniconda or Anaconda)
- Node.js 18+ (installed via Conda)
- Python 3.11+ (installed via Conda)
- Poetry (Python dependency management)
- go-task (task runner — optional but recommended)
```bash
# Create and activate conda environment
conda create -n msanii-ai nodejs python=3.11 -c conda-forge
conda activate msanii-ai

# Install go-task (Taskfile runner)
brew install go-task
```

Note: Do NOT install Homebrew's `task` package — that's Taskwarrior (a todo app), not the Taskfile runner. Use `go-task`.
```bash
# Frontend
npm install

# Backend
cd src/backend
poetry install
cd ../..
```

Or with go-task:

```bash
task install
```

```bash
cp .env.example .env
```

Edit `.env` and fill in:
- Supabase (required):
  - `VITE_SUPABASE_URL` — Project URL from Supabase Dashboard → Settings → API
  - `VITE_SUPABASE_ANON_KEY` — Anon/public key (same page)
  - `VITE_SUPABASE_SECRET_KEY` — Service role key (same page, used by backend for RLS bypass)
  - `DATABASE_PW` — Database password (Settings → Database)
- Backend: `VITE_BACKEND_API_URL` (default: `http://localhost:8000`)
- OpenAI: `OPENAI_API_KEY` (for Zoe contract analysis)
- Integrations (optional): Google Drive and Slack OAuth credentials — see `.env.example` for details
```bash
# Terminal 1: Frontend
npm run dev
# → http://localhost:8080

# Terminal 2: Backend
cd src/backend
poetry run uvicorn main:app --port 8000
```

| Command | Description |
|---|---|
| `npm run dev` | Dev server on http://localhost:8080 |
| `npm run build` | Production build (outputs to `dist/`) |
| `npm run lint` | ESLint |
| `npm run preview` | Preview production build |
| Command | Description |
|---|---|
| `poetry install` | Install Python dependencies |
| `poetry run uvicorn main:app --port 8000` | Local backend server |
| `poetry run pytest -v` | Run all tests |
| `poetry run pytest tests/test_X.py -v` | Run specific test file |
| `poetry run ruff check .` | Lint Python code |
| `poetry run ruff format .` | Auto-format Python code |
| Command | Description |
|---|---|
| `task install` | Install all dependencies (Poetry + npm) |
| `task test` | Run all backend tests |
| `task lint` | Run all linters (ruff + ESLint) |
| `task lint:backend` | Run ruff lint + format check |
| `task lint:frontend` | Run ESLint |
| `task format` | Auto-fix formatting (ruff + ESLint) |
| `task ci` | Full CI pipeline locally (lint + test + build) |
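The actual task definitions live in `Taskfile.yml`. As a purely hypothetical sketch (the real file may be organized differently), the `lint` aggregate could be wired like this:

```yaml
version: "3"

tasks:
  lint:
    deps: [lint:backend, lint:frontend]

  lint:backend:
    dir: src/backend
    cmds:
      - poetry run ruff check .
      - poetry run ruff format --check .

  lint:frontend:
    cmds:
      - npm run lint
```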
Uses Supabase Auth with Google OAuth. To set up:
- Go to Supabase Dashboard
- Get project credentials from Settings → API
- Add to `.env`:

```
VITE_SUPABASE_URL=your-project-url
VITE_SUPABASE_ANON_KEY=your-anon-key
VITE_SUPABASE_SECRET_KEY=your-service-role-key
```
Enables importing contracts/files from Drive into projects, and exporting documents (split sheets, royalty reports) back to Drive.
Setup:
- Create OAuth credentials at Google Cloud Console
- Set authorized redirect URI to: `{BACKEND_URL}/integrations/google-drive/callback`
- Add to `.env`: `GOOGLE_DRIVE_CLIENT_ID` and `GOOGLE_DRIVE_CLIENT_SECRET`

OAuth flow: User clicks "Connect" → redirected to Google consent screen → grants Drive access → redirected back to `/workspace?connected=google_drive` → token encrypted and stored in the `integration_connections` table.
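The consent-screen redirect above can be sketched as a URL builder. The parameter names follow Google's standard OAuth 2.0 authorization endpoint; the scope and the `build_drive_consent_url` helper are illustrative assumptions, not the project's actual code:

```python
from urllib.parse import urlencode

# Google's standard OAuth 2.0 authorization endpoint
GOOGLE_AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"

def build_drive_consent_url(client_id: str, backend_url: str, state: str) -> str:
    """Build the consent-screen URL the 'Connect' button redirects to.

    The redirect_uri must exactly match the URI authorized in the
    Google Cloud Console (see setup step above).
    """
    params = {
        "client_id": client_id,
        "redirect_uri": f"{backend_url}/integrations/google-drive/callback",
        "response_type": "code",
        "scope": "https://www.googleapis.com/auth/drive.file",  # assumed scope
        "access_type": "offline",  # request a refresh token
        "state": state,            # CSRF protection, verified on callback
    }
    return f"{GOOGLE_AUTH_ENDPOINT}?{urlencode(params)}"
```

The backend would answer the "Connect" request with a 302 redirect to the returned URL.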
Enables per-project Slack channel linking, rich Block Kit notifications (task updates, contract uploads, royalty calculations), and inbound @mention notifications.
Setup:
- Create a Slack app at api.slack.com/apps
- Add OAuth scopes: `channels:read`, `chat:write`, `commands`, `incoming-webhook`
- Enable Event Subscriptions → subscribe to the `app_mention` event → set the request URL to `{BACKEND_URL}/integrations/slack/webhook`
- Set authorized redirect URI to: `{BACKEND_URL}/integrations/slack/callback`
- Add to `.env`: `SLACK_CLIENT_ID` and `SLACK_CLIENT_SECRET`
OAuth flow: Same pattern as Drive — user clicks "Connect" → Slack consent → token stored. The app then sends notifications to linked channels and receives @mention webhooks.
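The Block Kit notifications mentioned above are JSON payloads posted to Slack's `chat.postMessage` API. A minimal sketch of building one (the `build_task_update_blocks` helper and its field values are illustrative, not the project's actual code):

```python
def build_task_update_blocks(project: str, task: str, status: str) -> dict:
    """Build a minimal Block Kit payload for a task-update notification.

    The returned dict is the message body that would be posted to the
    project's linked Slack channel via chat.postMessage.
    """
    return {
        # Plain-text fallback for clients that can't render Block Kit
        "text": f"Task update in {project}",
        "blocks": [
            {
                "type": "header",
                "text": {"type": "plain_text", "text": f"Task update: {project}"},
            },
            {
                "type": "section",
                "text": {"type": "mrkdwn", "text": f"*{task}* moved to *{status}*"},
            },
        ],
    }
```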
Both integrations require encryption keys for secure token storage:
```bash
# Generate encryption key (one-time)
python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"

# Generate OAuth state secret (one-time)
python -c "import secrets; print(secrets.token_urlsafe(32))"
```

Add both to `.env` as `INTEGRATION_ENCRYPTION_KEY` and `INTEGRATION_OAUTH_STATE_SECRET`.
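One way a state secret like `INTEGRATION_OAUTH_STATE_SECRET` can be used (this is a sketch of the general pattern, not necessarily this project's implementation): sign the random OAuth `state` value with an HMAC so the callback can reject forged or tampered values:

```python
import hashlib
import hmac
import secrets

def make_state(secret: str) -> str:
    """Create a random state value signed with the OAuth state secret."""
    nonce = secrets.token_urlsafe(16)
    sig = hmac.new(secret.encode(), nonce.encode(), hashlib.sha256).hexdigest()
    return f"{nonce}.{sig}"

def verify_state(secret: str, state: str) -> bool:
    """Verify a state value returned by the OAuth provider's callback."""
    try:
        nonce, sig = state.rsplit(".", 1)
    except ValueError:
        return False  # malformed state
    expected = hmac.new(secret.encode(), nonce.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```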
Tests use pytest with FastAPI's TestClient and mocked Supabase (no real database needed):
```bash
cd src/backend
poetry run pytest -v                              # All tests
poetry run pytest tests/test_integrations.py -v   # Integration tests only
poetry run pytest tests/test_boards.py -v         # Board tests only
```

The tests mock the Supabase client and OAuth tokens — they verify endpoint behavior (request/response shapes, status codes, error handling) without calling external APIs. For example, integration tests verify:

- Connection listing returns correct fields and omits encrypted tokens
- OAuth auth endpoints return redirect URLs
- Disconnect endpoints clean up properly
- Slack webhook handles URL verification challenges and `app_mention` events
- OneClick share validates required fields and returns correct responses
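Slack's URL verification handshake, one of the behaviors the tests cover, is simple: when Slack POSTs a `url_verification` event to the webhook, the endpoint must echo back the `challenge` value. A framework-free sketch of that dispatch (the real handler lives behind the FastAPI webhook route):

```python
def handle_slack_event(body: dict) -> dict:
    """Handle an incoming Slack Events API payload.

    Slack first sends a url_verification event when the request URL is
    configured; echoing the challenge confirms ownership of the endpoint.
    """
    if body.get("type") == "url_verification":
        return {"challenge": body["challenge"]}
    if body.get("event", {}).get("type") == "app_mention":
        # A real handler would create an in-app notification here (assumption).
        return {"ok": True}
    return {"ok": True}  # acknowledge other events so Slack doesn't retry
```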
```bash
npm run build   # Catches TypeScript errors, missing imports
```

```
oneclick/
├── src/
│   ├── backend/             # FastAPI server (Python, Poetry, Docker)
│   │   ├── main.py          # App entry, all routers mounted here
│   │   ├── boards/          # Kanban board management
│   │   ├── integrations/    # Google Drive, Slack, Notion, Monday.com
│   │   ├── oneclick/        # Royalty calculator + PDF share
│   │   ├── registry/        # Rights registry
│   │   ├── splitsheet/      # Split sheet generator
│   │   ├── settings/        # Workspace settings
│   │   ├── projects/        # Project management
│   │   ├── tests/           # pytest test suite
│   │   └── zoe_chatbot/     # Zoe AI contract chatbot
│   ├── components/          # React components
│   │   ├── ui/              # shadcn base components
│   │   ├── project/         # Project detail tabs + integration UIs
│   │   ├── workspace/       # Workspace tabs, integration hub, boards
│   │   ├── oneclick/        # OneClick calculation UI
│   │   ├── registry/        # Rights registry panels
│   │   ├── notes/           # BlockNote rich text editor
│   │   └── zoe/             # Zoe AI chat
│   ├── pages/               # Route pages (lazy-loaded)
│   ├── hooks/               # React Query hooks
│   ├── contexts/            # AuthContext
│   ├── integrations/        # Supabase client + types
│   ├── types/               # TypeScript type definitions
│   └── lib/                 # Utilities
├── supabase/migrations/     # Database migrations
├── Taskfile.yml             # Task runner config
└── .env.example             # Environment variable template
```
| Environment | Frontend | Backend | Trigger |
|---|---|---|---|
| Dev | Vercel (auto-deploy from `main`) | Cloud Run (`msanii-backend-dev`) | Push to `main` |
| Prod | Vercel (CLI deploy) | Cloud Run (`msanii-backend`) | Published tag release (`v*`) |
Both environments share the same Supabase database — data is user-scoped.
Push or merge to `main`:

```bash
git checkout main
git merge your-feature-branch
git push origin main
```

Create a tag release:

```bash
git tag v1.0.0
git push origin v1.0.0
```

Or create a release through GitHub: Releases → Draft a new release → Choose tag → Publish.
| Secret | Purpose |
|---|---|
| `GCP_PROJECT_ID` | GCP project for Cloud Run |
| `GCP_WORKLOAD_IDENTITY_PROVIDER` | GCP auth (WIF) |
| `GCP_SERVICE_ACCOUNT_EMAIL` | GCP auth (WIF) |
| `DEV_ALLOWED_ORIGINS` | Dev Vercel URL for CORS |
| `PROD_ALLOWED_ORIGINS` | Prod Vercel URL for CORS |
| `VERCEL_PROD_TOKEN` | Vercel API token for prod deploys |
| `VERCEL_ORG_ID` | Vercel org ID |
| `VERCEL_PROJECT_ID` | Vercel prod project ID |
- Create a feature branch from `main`
- Make your changes
- Run `task ci` (or manually: `task lint && task test && npm run build`)
- Submit a pull request to `main`
- Once merged, changes auto-deploy to dev
- When ready for prod, create a tag release
This project is private and proprietary.