Merged
1 change: 1 addition & 0 deletions .cursor/commands/pr-comments-review.md
Original file line number Diff line number Diff line change
@@ -0,0 +1 @@
Please review the PR comments with `gh pr status` and fix & resolve those issues that are valid and relevant. Leave those comments that are mostly bullshit unresolved. Report the result to me in detail. Do NOT automatically commit or stage the changes back to the PR!
5 changes: 4 additions & 1 deletion AGENTS.md
@@ -12,7 +12,7 @@ This file provides guidance to coding agents when working with code in this repo

#### Extra commands
These commands are usually already called by the user, but you can remind them to run them if they forgot.
- **Build packages**: `pnpm build:packages`
- **Build packages**: `pnpm build:packages` (you should never call this yourself)
- **Start dependencies**: `pnpm restart-deps` (resets & restarts Docker containers for DB, Inbucket, etc. Usually already called by the user)
- **Run development**: Already called by the user in the background. You don't need to do this. This will also watch for changes and rebuild packages, codegen, etc. Do NOT call build:packages, dev, codegen, or anything like that yourself, as the dev is already running it.
- **Run minimal dev**: `pnpm dev:basic` (only backend and dashboard for resource-limited systems)
@@ -93,6 +93,9 @@ To see all development ports, refer to the index.html of `apps/dev-launchpad/pub
- If there is an external browser tool connected, use it to test changes you make to the frontend when possible.
- Whenever you update an SDK implementation in `sdks/implementations`, make sure to update the specs accordingly in `sdks/specs` such that if you reimplemented the entire SDK from the specs again, you would get the same implementation. (For example, if the specs are not precise enough to describe a change you made, make the specs more precise.)
- When building internal tools for Stack Auth developers (eg. internal interfaces like the WAL info log etc.): Make the interfaces look very concise, assume the user is a pro-user. This only applies to internal tools that are used primarily by Stack Auth developers.
- The dev server already builds the packages in the background whenever you update a file. If you run into issues with typechecking or linting in a dependency after updating something in a package, just wait a few seconds, and then try again, and they will likely be resolved.
- When asked to review PR comments, you can use `gh pr status` to get the current pull request you're working on.
- NEVER EVER AUTOMATICALLY COMMIT OR STAGE ANY CHANGES — DON'T MODIFY GIT WITHOUT USER CONSENT!
- When building frontend or React code for the dashboard, refer to DESIGN-GUIDE.md.
- NEVER implement a hacky solution without EXPLICIT approval from the user. Always go the extra mile to make sure the solution is clean, maintainable, and robust.
- Fail early, fail loud. Fail fast with an error instead of silently continuing.
@@ -0,0 +1,123 @@
-- Migration to fix incorrectly formatted trusted domain entries in EnvironmentConfigOverride.
--
-- A previous migration sometimes generated entries like:
-- "domains.trustedDomains.<id>.<property1>": value1,
-- "domains.trustedDomains.<id>.<property2>": value2
--
-- Without the parent key:
-- "domains.trustedDomains.<id>": { ... }
--
-- This migration adds an empty object at the <id> level for any missing parent keys:
-- "domains.trustedDomains.<id>": {},
-- "domains.trustedDomains.<id>.<property1>": value1,
-- "domains.trustedDomains.<id>.<property2>": value2

-- Add temporary column to track processed rows (outside transaction so it's visible immediately)
-- SPLIT_STATEMENT_SENTINEL
-- SINGLE_STATEMENT_SENTINEL
-- RUN_OUTSIDE_TRANSACTION_SENTINEL
ALTER TABLE /* SCHEMA_NAME_SENTINEL */."EnvironmentConfigOverride" ADD COLUMN IF NOT EXISTS "temp_trusted_domains_checked" BOOLEAN DEFAULT FALSE;
-- SPLIT_STATEMENT_SENTINEL

-- Create index on the temporary column for efficient querying
-- SPLIT_STATEMENT_SENTINEL
-- SINGLE_STATEMENT_SENTINEL
-- RUN_OUTSIDE_TRANSACTION_SENTINEL
CREATE INDEX CONCURRENTLY IF NOT EXISTS "temp_eco_trusted_domains_checked_idx"
ON /* SCHEMA_NAME_SENTINEL */."EnvironmentConfigOverride" ("temp_trusted_domains_checked")
WHERE "temp_trusted_domains_checked" IS NOT TRUE;
-- SPLIT_STATEMENT_SENTINEL

-- Process rows in batches (outside transaction so each batch commits independently)
-- SPLIT_STATEMENT_SENTINEL
-- SINGLE_STATEMENT_SENTINEL
-- RUN_OUTSIDE_TRANSACTION_SENTINEL
-- CONDITIONALLY_REPEAT_MIGRATION_SENTINEL
WITH rows_to_check AS (
-- Get unchecked rows
SELECT "projectId", "branchId", "config"
FROM /* SCHEMA_NAME_SENTINEL */."EnvironmentConfigOverride"
WHERE "temp_trusted_domains_checked" IS NOT TRUE
-- Keep batch size small for consistent performance
LIMIT 1000
),
matching_keys AS (
-- Find all keys that look like "domains.trustedDomains.<id>.<property...>"
-- (4 or more dot-separated parts starting with domains.trustedDomains)
SELECT
rtc."projectId",
rtc."branchId",
key,
-- Extract the parent key: domains.trustedDomains.<id>
(string_to_array(key, '.'))[1] || '.' ||
(string_to_array(key, '.'))[2] || '.' ||
(string_to_array(key, '.'))[3] AS parent_key
FROM rows_to_check rtc,
jsonb_object_keys(rtc."config") AS key
WHERE key ~ '^domains\.trustedDomains\.[^.]+\..+'
-- Pattern matches: domains.trustedDomains.<id>.<anything>
-- e.g. "domains.trustedDomains.abc123.baseUrl"
),
missing_parents AS (
-- Find parent keys that don't exist in the config
SELECT DISTINCT
mk."projectId",
mk."branchId",
mk.parent_key
FROM matching_keys mk
JOIN rows_to_check rtc
ON rtc."projectId" = mk."projectId"
AND rtc."branchId" = mk."branchId"
WHERE NOT (rtc."config" ? mk.parent_key)
),
parents_to_add AS (
-- Aggregate all missing parent keys per row into a single jsonb object
SELECT
mp."projectId",
mp."branchId",
jsonb_object_agg(mp.parent_key, '{}'::jsonb) AS new_keys
FROM missing_parents mp
GROUP BY mp."projectId", mp."branchId"
),
updated_with_keys AS (
-- Update rows that need new parent keys
UPDATE /* SCHEMA_NAME_SENTINEL */."EnvironmentConfigOverride" eco
SET
"config" = eco."config" || pta.new_keys,
"updatedAt" = NOW(),
"temp_trusted_domains_checked" = TRUE
FROM parents_to_add pta
WHERE eco."projectId" = pta."projectId"
AND eco."branchId" = pta."branchId"
RETURNING eco."projectId", eco."branchId"
),
marked_as_checked AS (
-- Mark all checked rows (including ones that didn't need fixing)
UPDATE /* SCHEMA_NAME_SENTINEL */."EnvironmentConfigOverride" eco
SET "temp_trusted_domains_checked" = TRUE
FROM rows_to_check rtc
WHERE eco."projectId" = rtc."projectId"
AND eco."branchId" = rtc."branchId"
AND NOT EXISTS (
SELECT 1 FROM updated_with_keys uwk
WHERE uwk."projectId" = eco."projectId"
AND uwk."branchId" = eco."branchId"
)
RETURNING eco."projectId"
)
SELECT COUNT(*) > 0 AS should_repeat_migration
FROM rows_to_check;
-- SPLIT_STATEMENT_SENTINEL

-- Clean up: drop temporary index (outside transaction since CREATE was also outside)
-- SPLIT_STATEMENT_SENTINEL
-- SINGLE_STATEMENT_SENTINEL
-- RUN_OUTSIDE_TRANSACTION_SENTINEL
DROP INDEX IF EXISTS /* SCHEMA_NAME_SENTINEL */."temp_eco_trusted_domains_checked_idx";
-- SPLIT_STATEMENT_SENTINEL

-- Clean up: drop temporary column (outside transaction)
-- SPLIT_STATEMENT_SENTINEL
-- SINGLE_STATEMENT_SENTINEL
-- RUN_OUTSIDE_TRANSACTION_SENTINEL
ALTER TABLE /* SCHEMA_NAME_SENTINEL */."EnvironmentConfigOverride" DROP COLUMN IF EXISTS "temp_trusted_domains_checked";
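For reference, the batched SQL above amounts to the following per-row transformation, sketched here in TypeScript. The function and type names are illustrative only and do not exist in the codebase:

```typescript
// Illustrative sketch of the per-row fix the migration performs on a
// flat, dot-notation config override object.
type ConfigOverride = Record<string, unknown>;

function addMissingTrustedDomainParents(config: ConfigOverride): ConfigOverride {
  const result: ConfigOverride = { ...config };
  for (const key of Object.keys(config)) {
    // Match "domains.trustedDomains.<id>.<property...>" and capture the
    // parent key "domains.trustedDomains.<id>"
    const match = key.match(/^(domains\.trustedDomains\.[^.]+)\..+$/);
    if (match !== null && !(match[1] in result)) {
      // Insert the missing parent key as an empty object
      result[match[1]] = {};
    }
  }
  return result;
}
```

Rows whose parent keys already exist pass through unchanged, which mirrors the `WHERE NOT (rtc."config" ? mk.parent_key)` filter in the SQL.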
5 changes: 1 addition & 4 deletions apps/backend/scripts/db-migrations.ts
@@ -1,15 +1,14 @@
import { applyMigrations } from "@/auto-migrations";
import { MIGRATION_FILES_DIR, getMigrationFiles } from "@/auto-migrations/utils";
import { Prisma } from "@/generated/prisma/client";
import { getClickhouseAdminClient } from "@/lib/clickhouse";
import { globalPrismaClient, globalPrismaSchema, sqlQuoteIdent } from "@/prisma-client";
import { spawnSync } from "child_process";
import fs from "fs";
import path from "path";
import * as readline from "readline";
import { seed } from "../prisma/seed";
import { getEnvVariable } from "@stackframe/stack-shared/dist/utils/env";
import { runClickhouseMigrations } from "./clickhouse-migrations";
import { getClickhouseAdminClient } from "@/lib/clickhouse";

const getClickhouseClient = () => getClickhouseAdminClient();

@@ -81,7 +80,6 @@ const generateMigrationFile = async () => {
const folderName = `${timestampPrefix()}_${migrationName}`;
const migrationDir = path.join(MIGRATION_FILES_DIR, folderName);
const migrationSqlPath = path.join(migrationDir, 'migration.sql');
const diffUrl = getEnvVariable('STACK_DATABASE_CONNECTION_STRING');

console.log(`Generating migration ${folderName}...`);
const diffResult = spawnSync(
@@ -92,7 +90,6 @@
'migrate',
'diff',
'--from-config-datasource',
diffUrl,
'--to-schema',
'prisma/schema.prisma',
'--script',
@@ -0,0 +1,55 @@
import { resetBranchConfigOverrideKeys, resetEnvironmentConfigOverrideKeys } from "@/lib/config";
import { createSmartRouteHandler } from "@/route-handlers/smart-route-handler";
import { adaptSchema, adminAuthTypeSchema, yupArray, yupNumber, yupObject, yupString } from "@stackframe/stack-shared/dist/schema-fields";

const levelSchema = yupString().oneOf(["branch", "environment"]).defined();

const levelConfigs = {
branch: {
reset: (options: { projectId: string, branchId: string, keysToReset: string[] }) =>
resetBranchConfigOverrideKeys(options),
},
environment: {
reset: (options: { projectId: string, branchId: string, keysToReset: string[] }) =>
resetEnvironmentConfigOverrideKeys(options),
},
};

export const POST = createSmartRouteHandler({
metadata: {
hidden: true,
summary: 'Reset config override keys',
description: 'Remove specific keys (and their nested descendants) from the config override at a given level. Uses the same nested key logic as the override algorithm.',
tags: ['Config'],
},
request: yupObject({
auth: yupObject({
type: adminAuthTypeSchema,
tenancy: adaptSchema,
}).defined(),
params: yupObject({
level: levelSchema,
}).defined(),
body: yupObject({
keys: yupArray(yupString().defined()).defined(),
}).defined(),
}),
response: yupObject({
statusCode: yupNumber().oneOf([200]).defined(),
bodyType: yupString().oneOf(["success"]).defined(),
}),
handler: async (req) => {
const levelConfig = levelConfigs[req.params.level];

await levelConfig.reset({
projectId: req.auth.tenancy.project.id,
branchId: req.auth.tenancy.branchId,
keysToReset: req.body.keys,
});

return {
statusCode: 200 as const,
bodyType: "success" as const,
};
},
});
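The handler above delegates to `resetBranchConfigOverrideKeys` / `resetEnvironmentConfigOverrideKeys`, whose implementations are not part of this diff. The behavior described in the route metadata ("remove specific keys and their nested descendants") can be sketched roughly as follows; the helper name and the flat dotted-key representation are assumptions, not the actual implementation:

```typescript
// Assumed sketch of "remove keys and their nested descendants" on a
// flat, dot-notation config override.
function resetOverrideKeys(
  override: Record<string, unknown>,
  keysToReset: string[],
): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(override).filter(([key]) =>
      // Drop the key itself and any descendant like "<key>.<child>",
      // but keep unrelated siblings such as "<key>x"
      !keysToReset.some((k) => key === k || key.startsWith(k + "."))
    ),
  );
}
```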
@@ -1,10 +1,10 @@
import { getBranchConfigOverrideQuery, getEnvironmentConfigOverrideQuery, overrideBranchConfigOverride, overrideEnvironmentConfigOverride, setBranchConfigOverride, setBranchConfigOverrideSource, setEnvironmentConfigOverride } from "@/lib/config";
import { getBranchConfigOverrideQuery, getEnvironmentConfigOverrideQuery, overrideBranchConfigOverride, overrideEnvironmentConfigOverride, setBranchConfigOverride, setBranchConfigOverrideSource, setEnvironmentConfigOverride, validateBranchConfigOverride, validateEnvironmentConfigOverride } from "@/lib/config";
import { enqueueExternalDbSync } from "@/lib/external-db-sync-queue";
import { globalPrismaClient, rawQuery } from "@/prisma-client";
import { createSmartRouteHandler } from "@/route-handlers/smart-route-handler";
import { branchConfigSchema, environmentConfigSchema, getConfigOverrideErrors, migrateConfigOverride } from "@stackframe/stack-shared/dist/config/schema";
import { adaptSchema, adminAuthTypeSchema, branchConfigSourceSchema, yupNumber, yupObject, yupString } from "@stackframe/stack-shared/dist/schema-fields";
import { StatusError } from "@stackframe/stack-shared/dist/utils/errors";
import { StatusError, captureError } from "@stackframe/stack-shared/dist/utils/errors";
import * as yup from "yup";

type BranchConfigSourceApi = yup.InferType<typeof branchConfigSourceSchema>;
@@ -50,6 +50,11 @@ const levelConfigs = {
branchId: options.branchId,
branchConfigOverrideOverride: options.config,
}),
validate: (options: { projectId: string, branchId: string, config: any }) =>
validateBranchConfigOverride({
projectId: options.projectId,
branchConfigOverride: options.config,
}),
requiresSource: true,
},
environment: {
@@ -69,6 +74,12 @@
branchId: options.branchId,
environmentConfigOverrideOverride: options.config,
}),
validate: (options: { projectId: string, branchId: string, config: any }) =>
validateEnvironmentConfigOverride({
projectId: options.projectId,
branchId: options.branchId,
environmentConfigOverride: options.config,
}),
requiresSource: false,
},
};
@@ -141,6 +152,20 @@ async function parseAndValidateConfig(
return migratedConfig;
}

async function warnOnValidationFailure(
levelConfig: typeof levelConfigs["branch" | "environment"],
options: { projectId: string, branchId: string, config: any },
) {
try {
const validationResult = await levelConfig.validate(options);
if (validationResult.status === "error") {
captureError("config-override-validation-warning", `Config override validation warning for project ${options.projectId} (this may not be a logic error, but rather a client/implementation issue — e.g. dot notation into non-existent record entries): ${validationResult.error}`);
}
} catch (e) {
captureError("config-override-validation-check-failed", e);
}
}

export const PUT = createSmartRouteHandler({
metadata: {
hidden: true,
@@ -179,6 +204,12 @@ export const PUT = createSmartRouteHandler({
source: req.body.source as BranchConfigSourceApi,
});

await warnOnValidationFailure(levelConfig, {
projectId: req.auth.tenancy.project.id,
branchId: req.auth.tenancy.branchId,
config: parsedConfig,
});

if (req.params.level === "environment" && shouldEnqueueExternalDbSync(parsedConfig)) {
await enqueueExternalDbSync(req.auth.tenancy.id);
}
@@ -220,6 +251,12 @@ export const PATCH = createSmartRouteHandler({
config: parsedConfig,
});

await warnOnValidationFailure(levelConfig, {
projectId: req.auth.tenancy.project.id,
branchId: req.auth.tenancy.branchId,
config: parsedConfig,
});

if (req.params.level === "environment" && shouldEnqueueExternalDbSync(parsedConfig)) {
await enqueueExternalDbSync(req.auth.tenancy.id);
}
2 changes: 1 addition & 1 deletion apps/backend/src/auto-migrations/index.tsx
@@ -132,7 +132,7 @@ export async function applyMigrations(options: {
}

for (const statementRaw of migration.sql.split('SPLIT_STATEMENT_SENTINEL')) {
const statement = statementRaw.replace('/* SCHEMA_NAME_SENTINEL */', sqlQuoteIdentToString(options.schema));
const statement = statementRaw.replaceAll('/* SCHEMA_NAME_SENTINEL */', sqlQuoteIdentToString(options.schema));
const runOutside = statement.includes('RUN_OUTSIDE_TRANSACTION_SENTINEL');
const isSingleStatement = statement.includes('SINGLE_STATEMENT_SENTINEL');
const isConditionallyRepeatMigration = statement.includes('CONDITIONALLY_REPEAT_MIGRATION_SENTINEL');