Changes from 1 commit
99 commits
66af47d
Enhance documentation tools integration
mantrakp04 Mar 23, 2026
30d53e8
Merge branch 'dev' into dario-likes-mcps
mantrakp04 Mar 23, 2026
5078747
Enhance error handling and API response for documentation tools
mantrakp04 Mar 24, 2026
aaf49db
Merge branch 'dev' into dario-likes-mcps
mantrakp04 Mar 24, 2026
844e916
Refactor askStackAuth key to ask_stack_auth in API documentation
mantrakp04 Mar 24, 2026
274c742
fix: register private submodule gitlink in the index
mantrakp04 Mar 25, 2026
c7a3cca
Merge branch 'dev' into dario-likes-mcps
mantrakp04 Mar 25, 2026
ef2289f
Merge branch 'dev' into dario-likes-mcps
mantrakp04 Apr 3, 2026
d8065c4
Update environment configurations and remove internal secret validati…
mantrakp04 Apr 4, 2026
3b27eee
Merge branch 'dev' into dario-likes-mcps
mantrakp04 Apr 6, 2026
b82efa4
Merge branch 'dev' into dario-likes-mcps
mantrakp04 Apr 6, 2026
158498b
Merge branch 'dev' into dario-likes-mcps
mantrakp04 Apr 8, 2026
b22d4b0
Merge branch 'dev' into dario-likes-mcps
mantrakp04 Apr 9, 2026
95ca0a2
initial commit
aadesh18 Apr 10, 2026
fbab066
Merge remote-tracking branch 'origin/dario-likes-mcps' into llm-mcp-flow
aadesh18 Apr 10, 2026
73152a1
pnpm lock
aadesh18 Apr 10, 2026
e16040c
changed port
aadesh18 Apr 10, 2026
a07dbab
spacetime db ci change
aadesh18 Apr 10, 2026
ef77edc
ci fix
aadesh18 Apr 10, 2026
84dffa2
security fix
aadesh18 Apr 10, 2026
a0486e9
security fixes
aadesh18 Apr 11, 2026
8c596ec
Merge branch 'dev' into dario-likes-mcps
mantrakp04 Apr 12, 2026
ef6963d
Merge branch 'dev' into dario-likes-mcps
N2D4 Apr 12, 2026
1c69185
Merge branch 'dario-likes-mcps' into llm-mcp-flow
aadesh18 Apr 12, 2026
f794bd6
Merge remote-tracking branch 'origin/dev' into llm-mcp-flow
aadesh18 Apr 12, 2026
59a060a
merge error
aadesh18 Apr 13, 2026
0485c73
pr comment changes
aadesh18 Apr 13, 2026
97ee052
Merge branch 'dev' into llm-mcp-flow
aadesh18 Apr 13, 2026
411f775
bug fix
aadesh18 Apr 13, 2026
c514efd
Merge branch 'llm-mcp-flow' of https://github.com/stack-auth/stack-au…
aadesh18 Apr 13, 2026
516c424
Merge branch 'dev' into llm-mcp-flow
aadesh18 Apr 13, 2026
b0e3341
pr comments
aadesh18 Apr 13, 2026
a630be1
Merge branch 'llm-mcp-flow' of https://github.com/stack-auth/stack-au…
aadesh18 Apr 13, 2026
8c7bc54
tests failing
aadesh18 Apr 13, 2026
7a54be9
comment changes
aadesh18 Apr 13, 2026
bd3925d
Merge branch 'dev' into llm-mcp-flow
aadesh18 Apr 13, 2026
ca461d4
tests fix
aadesh18 Apr 13, 2026
224468c
Merge branch 'llm-mcp-flow' of https://github.com/stack-auth/stack-au…
aadesh18 Apr 13, 2026
042e616
tests fix
aadesh18 Apr 13, 2026
149d6d7
fixed the order
aadesh18 Apr 13, 2026
574cc4a
Merge branch 'dev' into llm-mcp-flow
aadesh18 Apr 13, 2026
3293845
Merge branch 'dev' into llm-mcp-flow
aadesh18 Apr 13, 2026
d8e99d6
Merge branch 'dev' into llm-mcp-flow
aadesh18 Apr 14, 2026
fa4c814
Merge branch 'dev' into llm-mcp-flow
aadesh18 Apr 14, 2026
a4c3306
pr changes
aadesh18 Apr 14, 2026
35739af
Merge branch 'dev' into llm-mcp-flow
aadesh18 Apr 14, 2026
15e5879
Merge remote-tracking branch 'origin/dev' into llm-mcp-flow
aadesh18 Apr 15, 2026
140ee7e
Merge branch 'dev' into llm-mcp-flow
aadesh18 Apr 15, 2026
afd84bc
minor fix
aadesh18 Apr 15, 2026
b0a329f
initial commit
aadesh18 Apr 15, 2026
c819537
proxy logging implemented
aadesh18 Apr 15, 2026
7a2332f
Merge remote-tracking branch 'origin/dev' into ai-analytics
aadesh18 Apr 15, 2026
83a37d1
pr message fixes
aadesh18 Apr 17, 2026
4fb5154
internal tool security update
aadesh18 Apr 19, 2026
30e3e5c
Merge branch 'dev' into ai-analytics
aadesh18 Apr 19, 2026
edd33b1
added e2e tests
aadesh18 Apr 20, 2026
a43eb11
bot comment
aadesh18 Apr 20, 2026
1ccef9c
Update seed function to preserve existing user metadata when updating…
aadesh18 Apr 20, 2026
4965534
refactor: replace callReducer with callReducerStrict for improved err…
aadesh18 Apr 20, 2026
9ba7b5e
clean up
aadesh18 Apr 20, 2026
ddde9c6
feat: implement timeout for SpacetimeDB HTTP calls to prevent hanging…
aadesh18 Apr 20, 2026
c329a46
fix: improve error handling for missing SpacetimeDB service token in …
aadesh18 Apr 20, 2026
26ce83f
fix: encode URI components in fetch requests to prevent errors with s…
aadesh18 Apr 20, 2026
f9386a8
bot fixes
aadesh18 Apr 20, 2026
dc5ab66
fix: add log token retrieval in getServiceToken function
aadesh18 Apr 20, 2026
2532632
fix: enhance error handling in isSpacetimedbReachable and update priv…
aadesh18 Apr 20, 2026
53a9f2c
fix: update footer separator in ConversationReplay component for impr…
aadesh18 Apr 20, 2026
0eff6b2
bug fix
aadesh18 Apr 20, 2026
170b4fe
fix: refactor MCP review authorization and improve logging mechanisms
aadesh18 Apr 21, 2026
a0bab5d
tests clean up
aadesh18 Apr 21, 2026
3654af5
Merge remote-tracking branch 'origin/dev' into ai-analytics
aadesh18 Apr 21, 2026
dbc7988
bug fix
aadesh18 Apr 21, 2026
d8b7499
Custom Dashboard Improvements (#1359)
aadesh18 Apr 24, 2026
331d208
Update backend environment variables and refactor AI query route imports
aadesh18 Apr 28, 2026
49d2c04
Enhance image attachment validation in AI query route
aadesh18 Apr 28, 2026
60c538b
Add context to system prompt in AI query route
aadesh18 Apr 28, 2026
0aea8ef
edited comment
aadesh18 Apr 28, 2026
39facf4
added comment
aadesh18 Apr 28, 2026
45ff5f2
aman comment changes
aadesh18 May 4, 2026
89d43b1
Merge remote-tracking branch 'origin/dev' into ai-analytics
aadesh18 May 4, 2026
971bad9
merge changes
aadesh18 May 4, 2026
16542f1
removed mcpCorrelationId
aadesh18 May 5, 2026
10e6cfc
Enhance qaId validation in MCP review routes to ensure it is a non-ne…
aadesh18 May 5, 2026
5552bd7
Refactor reviewer handling in MCP review route to improve readability…
aadesh18 May 5, 2026
2e78347
Implement update_qa_entry_with_publish reducer to streamline QA entry…
aadesh18 May 5, 2026
0cedc49
Refactor AI query logging to handle serialization errors gracefully a…
aadesh18 May 5, 2026
a087f6b
Refactor AI query logging to encapsulate error handling within async …
aadesh18 May 5, 2026
7e850db
Update tool call instructions in AI prompts to specify usage of patch…
aadesh18 May 5, 2026
7527bcf
Enhance occurrenceIndex validation in applyDashboardPatches to ensure…
aadesh18 May 5, 2026
cfcedeb
Refactor applyDashboardPatches to use 'draft' instead of 'running' fo…
aadesh18 May 5, 2026
9500e1e
sizing fix
aadesh18 May 5, 2026
9599254
Merge branch 'dev' into ai-analytics
aadesh18 May 6, 2026
c664a1d
Merge remote-tracking branch 'origin/dev' into ai-analytics
aadesh18 May 8, 2026
1389d37
lint fix
aadesh18 May 8, 2026
185d245
update reviewer authentication and API calls
aadesh18 May 11, 2026
29a6a88
Merge remote-tracking branch 'origin/dev' into ai-analytics
aadesh18 May 11, 2026
d63bb9c
test fix
aadesh18 May 12, 2026
6b8838e
Merge branch 'dev' into ai-analytics
aadesh18 May 12, 2026
26f0ff6
bot comments
aadesh18 May 13, 2026
Merge branch 'dev' into dario-likes-mcps
mantrakp04 committed Mar 23, 2026
commit 30d53e8cc83d5910d8058e447023b2f67083b2ad
4 changes: 2 additions & 2 deletions .gitmodules
@@ -1,4 +1,4 @@
-[submodule "packages/private"]
-	path = packages/private
+[submodule "backend-private-repo"]
+	path = apps/backend/src/private/implementation
 	url = https://github.com/stack-auth/private.git
 	branch = main
1 change: 1 addition & 0 deletions apps/backend/.gitignore
@@ -1,4 +1,5 @@
 src/generated
+src/private/implementation.generated.ts
 
 # See https://help.github.com/articles/ignoring-files/ for more about ignoring files.

6 changes: 4 additions & 2 deletions apps/backend/package.json
@@ -21,10 +21,12 @@
     "start": "next start --port ${NEXT_PUBLIC_STACK_PORT_PREFIX:-81}02",
     "codegen-prisma": "STACK_DATABASE_CONNECTION_STRING=\"${STACK_DATABASE_CONNECTION_STRING:-placeholder-database-connection-string}\" pnpm run prisma generate",
     "codegen-prisma:watch": "STACK_DATABASE_CONNECTION_STRING=\"${STACK_DATABASE_CONNECTION_STRING:-placeholder-database-connection-string}\" pnpm run prisma generate --watch",
+    "generate-private-sign-up-risk-engine": "pnpm run with-env tsx scripts/generate-private-sign-up-risk-engine.ts",
+    "generate-private-sign-up-risk-engine:watch": "chokidar 'src/private/src/sign-up-risk-engine.ts' -c 'pnpm run generate-private-sign-up-risk-engine'",
     "codegen-route-info": "pnpm run with-env tsx scripts/generate-route-info.ts",
     "codegen-route-info:watch": "pnpm run with-env tsx watch --clear-screen=false scripts/generate-route-info.ts",
-    "codegen": "pnpm run with-env pnpm run generate-migration-imports && pnpm run with-env bash -c 'if [ \"$STACK_ACCELERATE_ENABLED\" = \"true\" ]; then pnpm run prisma generate --no-engine; else pnpm run codegen-prisma; fi' && pnpm run codegen-docs && pnpm run codegen-route-info",
-    "codegen:watch": "concurrently -n \"prisma,docs,route-info,migration-imports\" -k \"pnpm run codegen-prisma:watch\" \"pnpm run codegen-docs:watch\" \"pnpm run codegen-route-info:watch\" \"pnpm run generate-migration-imports:watch\"",
+    "codegen": "pnpm run with-env pnpm run generate-migration-imports && pnpm run with-env bash -c 'if [ \"$STACK_ACCELERATE_ENABLED\" = \"true\" ]; then pnpm run prisma generate --no-engine; else pnpm run codegen-prisma; fi' && pnpm run generate-private-sign-up-risk-engine && pnpm run codegen-docs && pnpm run codegen-route-info",
+    "codegen:watch": "pnpm run generate-private-sign-up-risk-engine && concurrently -n \"prisma,private-risk-engine,docs,route-info,migration-imports\" -k \"pnpm run codegen-prisma:watch\" \"pnpm run generate-private-sign-up-risk-engine:watch\" \"pnpm run codegen-docs:watch\" \"pnpm run codegen-route-info:watch\" \"pnpm run generate-migration-imports:watch\"",
     "psql-inner": "psql $(echo $STACK_DATABASE_CONNECTION_STRING | sed 's/\\?.*$//')",
     "clickhouse": "pnpm run with-env clickhouse-client --host localhost --port ${NEXT_PUBLIC_STACK_PORT_PREFIX:-81}37 --user stackframe --password PASSWORD-PLACEHOLDER--9gKyMxJeMx",
     "psql": "pnpm run with-env:dev pnpm run psql-inner",
@@ -1,19 +1,18 @@
-ALTER TABLE "ProjectUser" ADD COLUMN "signUpRiskScoreBot" SMALLINT NOT NULL DEFAULT 0;
-ALTER TABLE "ProjectUser" ADD COLUMN "signUpRiskScoreFreeTrialAbuse" SMALLINT NOT NULL DEFAULT 0;
-
-ALTER TABLE "ProjectUser"
-  ADD CONSTRAINT "ProjectUser_risk_score_bot_range"
-  CHECK ("signUpRiskScoreBot" >= 0 AND "signUpRiskScoreBot" <= 100) NOT VALID;
-
-ALTER TABLE "ProjectUser"
-  ADD CONSTRAINT "ProjectUser_risk_score_free_trial_abuse_range"
-  CHECK ("signUpRiskScoreFreeTrialAbuse" >= 0 AND "signUpRiskScoreFreeTrialAbuse" <= 100) NOT VALID;
-
-ALTER TABLE "ProjectUser" ADD COLUMN "signUpCountryCode" TEXT;
+-- Add the sign-up metadata columns first.
+-- `signedUpAt` starts nullable so we can backfill existing rows before enforcing it.
+ALTER TABLE "ProjectUser"
+  ADD COLUMN "signUpRiskScoreBot" SMALLINT NOT NULL DEFAULT 0,
+  ADD COLUMN "signUpRiskScoreFreeTrialAbuse" SMALLINT NOT NULL DEFAULT 0,
+  ADD COLUMN "signUpCountryCode" TEXT,
+  ADD COLUMN "signedUpAt" TIMESTAMP(3),
+  ADD COLUMN "signUpIp" TEXT,
+  ADD COLUMN "signUpIpTrusted" BOOLEAN,
+  ADD COLUMN "signUpEmailNormalized" TEXT,
+  ADD COLUMN "signUpEmailBase" TEXT;
+
+-- Add the risk score bounds without validating existing rows yet.
+ALTER TABLE "ProjectUser"
+  ADD CONSTRAINT "ProjectUser_risk_score_bot_range"
+  CHECK ("signUpRiskScoreBot" >= 0 AND "signUpRiskScoreBot" <= 100) NOT VALID,
+  ADD CONSTRAINT "ProjectUser_risk_score_free_trial_abuse_range"
+  CHECK ("signUpRiskScoreFreeTrialAbuse" >= 0 AND "signUpRiskScoreFreeTrialAbuse" <= 100) NOT VALID;
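The migration adds both CHECK constraints as NOT VALID, which Postgres enforces for new writes immediately but only checks against pre-existing rows once VALIDATE CONSTRAINT runs (deferred here to a later migration step). A minimal in-memory sketch of that two-phase behavior — the `Table` class below is purely illustrative and not part of this codebase:

```typescript
// Illustrative model of Postgres's NOT VALID / VALIDATE CONSTRAINT flow:
// a NOT VALID check constrains new writes immediately, while existing
// rows are only scanned when VALIDATE CONSTRAINT runs later.
type Row = { signUpRiskScoreBot: number };

class Table {
  rows: Row[] = [];
  private checks: Array<(row: Row) => boolean> = [];

  // ADD CONSTRAINT ... NOT VALID: enforced for future writes only.
  addCheckNotValid(check: (row: Row) => boolean): void {
    this.checks.push(check);
  }

  insert(row: Row): void {
    if (!this.checks.every((check) => check(row))) {
      throw new Error("check constraint violated");
    }
    this.rows.push(row);
  }

  // VALIDATE CONSTRAINT: full scan of the existing rows.
  validate(): boolean {
    return this.rows.every((row) => this.checks.every((check) => check(row)));
  }
}

const table = new Table();
table.rows.push({ signUpRiskScoreBot: 250 }); // pre-existing out-of-range row

// Adding the constraint NOT VALID succeeds despite the bad row...
table.addCheckNotValid((r) => r.signUpRiskScoreBot >= 0 && r.signUpRiskScoreBot <= 100);

// ...but validation fails until the data is fixed.
console.log(table.validate()); // false
table.rows[0].signUpRiskScoreBot = 50;
console.log(table.validate()); // true
```

This mirrors why the DDL above can run quickly on a large `ProjectUser` table: the full-table scan is postponed to the later `VALIDATE CONSTRAINT` statements.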
@@ -1,16 +1,19 @@
+-- Backfill `signedUpAt` from `createdAt` in small batches so the migration stays
+-- safely under the transaction timeout on large tables.
 -- SINGLE_STATEMENT_SENTINEL
 -- CONDITIONALLY_REPEAT_MIGRATION_SENTINEL
 WITH to_update AS (
-SELECT "projectUserId", "tenancyId"
-FROM "ProjectUser"
-WHERE "signedUpAt" IS NULL
-LIMIT 10000
+  SELECT "projectUserId", "tenancyId"
+  FROM "ProjectUser"
+  WHERE "signedUpAt" IS NULL
+  LIMIT 10000
 ),
 updated AS (
-UPDATE "ProjectUser" pu
-SET "signedUpAt" = pu."createdAt"
-FROM to_update tu
-WHERE pu."tenancyId" = tu."tenancyId" AND pu."projectUserId" = tu."projectUserId"
-RETURNING 1
+  UPDATE "ProjectUser" pu
+  SET "signedUpAt" = pu."createdAt"
+  FROM to_update tu
+  WHERE pu."tenancyId" = tu."tenancyId"
+    AND pu."projectUserId" = tu."projectUserId"
+  RETURNING 1
 )
 SELECT COUNT(*) > 0 AS should_repeat_migration FROM updated;
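The CONDITIONALLY_REPEAT_MIGRATION_SENTINEL marker appears to tell the migration runner to re-execute this statement for as long as it reports `should_repeat_migration = true`, so the backfill touches at most 10,000 rows per pass. A sketch of that control flow against an in-memory stand-in for the table (the helper names here are hypothetical, not the actual Stack Auth migration framework):

```typescript
// Simulates the batched backfill: each pass fills `signedUpAt` from
// `createdAt` for at most `batchSize` rows and reports whether it did
// any work, mirroring `SELECT COUNT(*) > 0 AS should_repeat_migration`.
type UserRow = { createdAt: Date; signedUpAt: Date | null };

function backfillBatch(rows: UserRow[], batchSize: number): boolean {
  const toUpdate = rows.filter((r) => r.signedUpAt === null).slice(0, batchSize);
  for (const row of toUpdate) {
    row.signedUpAt = row.createdAt;
  }
  return toUpdate.length > 0; // should_repeat_migration
}

// The runner keeps re-running the statement until a pass touches no rows.
function runBackfill(rows: UserRow[], batchSize: number): number {
  let passes = 0;
  while (backfillBatch(rows, batchSize)) {
    passes++;
  }
  return passes;
}

const rows: UserRow[] = Array.from({ length: 25 }, () => ({
  createdAt: new Date("2026-01-01T00:00:00Z"),
  signedUpAt: null,
}));
console.log(runBackfill(rows, 10)); // 3 passes: 10 + 10 + 5 rows
console.log(rows.every((r) => r.signedUpAt !== null)); // true
```

Bounding each pass keeps every individual statement short, which is what lets the backfill run safely on tables far larger than a single transaction timeout would allow.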
@@ -8,10 +8,54 @@ export const preMigration = async (sql: Sql) => {
   const regularUserId = randomUUID();
   const anonUserId = randomUUID();
 
-  await sql`INSERT INTO "Project" ("id", "createdAt", "updatedAt", "displayName", "description", "isProductionMode") VALUES (${projectId}, NOW(), NOW(), 'Test', '', false)`;
-  await sql`INSERT INTO "Tenancy" ("id", "createdAt", "updatedAt", "projectId", "branchId", "hasNoOrganization") VALUES (${tenancyId}::uuid, NOW(), NOW(), ${projectId}, 'main', 'TRUE'::"BooleanTrue")`;
-  await sql`INSERT INTO "ProjectUser" ("projectUserId", "tenancyId", "mirroredProjectId", "mirroredBranchId", "createdAt", "updatedAt", "lastActiveAt") VALUES (${regularUserId}::uuid, ${tenancyId}::uuid, ${projectId}, 'main', NOW(), NOW(), NOW())`;
-  await sql`INSERT INTO "ProjectUser" ("projectUserId", "tenancyId", "mirroredProjectId", "mirroredBranchId", "createdAt", "updatedAt", "lastActiveAt", "isAnonymous") VALUES (${anonUserId}::uuid, ${tenancyId}::uuid, ${projectId}, 'main', NOW(), NOW(), NOW(), true)`;
+  await sql`
+    INSERT INTO "Project" ("id", "createdAt", "updatedAt", "displayName", "description", "isProductionMode")
+    VALUES (${projectId}, NOW(), NOW(), 'Test', '', false)
+  `;
+  await sql`
+    INSERT INTO "Tenancy" ("id", "createdAt", "updatedAt", "projectId", "branchId", "hasNoOrganization")
+    VALUES (${tenancyId}::uuid, NOW(), NOW(), ${projectId}, 'main', 'TRUE'::"BooleanTrue")
+  `;
+  await sql`
+    INSERT INTO "ProjectUser" (
+      "projectUserId",
+      "tenancyId",
+      "mirroredProjectId",
+      "mirroredBranchId",
+      "createdAt",
+      "updatedAt",
+      "lastActiveAt"
+    ) VALUES (
+      ${regularUserId}::uuid,
+      ${tenancyId}::uuid,
+      ${projectId},
+      'main',
+      NOW(),
+      NOW(),
+      NOW()
+    )
+  `;
+  await sql`
+    INSERT INTO "ProjectUser" (
+      "projectUserId",
+      "tenancyId",
+      "mirroredProjectId",
+      "mirroredBranchId",
+      "createdAt",
+      "updatedAt",
+      "lastActiveAt",
+      "isAnonymous"
+    ) VALUES (
+      ${anonUserId}::uuid,
+      ${tenancyId}::uuid,
+      ${projectId},
+      'main',
+      NOW(),
+      NOW(),
+      NOW(),
+      true
+    )
+  `;
 
   return { regularUserId, anonUserId };
 };
@@ -1,3 +1,4 @@
+-- Add the indexes needed for recent sign-up heuristics and sorting.
 -- SPLIT_STATEMENT_SENTINEL
 -- SINGLE_STATEMENT_SENTINEL
 -- RUN_OUTSIDE_TRANSACTION_SENTINEL
@@ -16,6 +17,7 @@ CREATE INDEX CONCURRENTLY IF NOT EXISTS "ProjectUser_signUpIp_recent_idx"
 CREATE INDEX CONCURRENTLY IF NOT EXISTS "ProjectUser_signUpEmailBase_recent_idx"
   ON "ProjectUser"("tenancyId", "isAnonymous", "signUpEmailBase", "signedUpAt");
 
+-- Validate the risk score bounds once every row has the new columns.
 -- SPLIT_STATEMENT_SENTINEL
 -- SINGLE_STATEMENT_SENTINEL
 -- RUN_OUTSIDE_TRANSACTION_SENTINEL
@@ -26,27 +28,8 @@ ALTER TABLE "ProjectUser" VALIDATE CONSTRAINT "ProjectUser_risk_score_bot_range"
 -- RUN_OUTSIDE_TRANSACTION_SENTINEL
 ALTER TABLE "ProjectUser" VALIDATE CONSTRAINT "ProjectUser_risk_score_free_trial_abuse_range";
 
--- SPLIT_STATEMENT_SENTINEL
--- SINGLE_STATEMENT_SENTINEL
--- RUN_OUTSIDE_TRANSACTION_SENTINEL
-CREATE OR REPLACE FUNCTION "set_project_user_signed_up_at_from_created_at"()
-RETURNS TRIGGER AS $$
-BEGIN
-  IF NEW."signedUpAt" IS NULL THEN
-    NEW."signedUpAt" := NEW."createdAt";
-  END IF;
-  RETURN NEW;
-END;
-$$ LANGUAGE plpgsql;
-
--- SPLIT_STATEMENT_SENTINEL
--- SINGLE_STATEMENT_SENTINEL
--- RUN_OUTSIDE_TRANSACTION_SENTINEL
-CREATE TRIGGER "ProjectUser_set_signedUpAt_from_createdAt"
-BEFORE INSERT ON "ProjectUser"
-FOR EACH ROW
-EXECUTE FUNCTION "set_project_user_signed_up_at_from_created_at"();
-
+-- Enforce `signedUpAt` after the backfill is complete. We intentionally require
+-- inserts to provide the value explicitly instead of hiding that behavior in a trigger.
 -- SPLIT_STATEMENT_SENTINEL
 -- SINGLE_STATEMENT_SENTINEL
 -- RUN_OUTSIDE_TRANSACTION_SENTINEL
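CREATE INDEX CONCURRENTLY and VALIDATE CONSTRAINT cannot run inside a transaction block, which is presumably what the SPLIT_STATEMENT_SENTINEL and RUN_OUTSIDE_TRANSACTION_SENTINEL comments signal to the migration runner. A sketch of how such a file could be parsed into flagged statements (assumed sentinel semantics; the real runner may differ):

```typescript
// Splits a migration file on SPLIT_STATEMENT_SENTINEL markers and records,
// per statement, whether it is flagged to run outside a transaction.
type MigrationStatement = { sql: string; outsideTransaction: boolean };

function parseMigration(source: string): MigrationStatement[] {
  return source
    .split("-- SPLIT_STATEMENT_SENTINEL")
    .map((chunk) => ({
      // Keep only the executable SQL; sentinel comments are stripped.
      // (A real parser would preserve legitimate SQL comments.)
      sql: chunk
        .split("\n")
        .filter((line) => !line.trim().startsWith("--"))
        .join("\n")
        .trim(),
      outsideTransaction: chunk.includes("-- RUN_OUTSIDE_TRANSACTION_SENTINEL"),
    }))
    .filter((stmt) => stmt.sql.length > 0);
}

const file = [
  "-- SINGLE_STATEMENT_SENTINEL",
  "-- RUN_OUTSIDE_TRANSACTION_SENTINEL",
  'CREATE INDEX CONCURRENTLY IF NOT EXISTS "idx" ON "t"("c");',
  "-- SPLIT_STATEMENT_SENTINEL",
  "-- RUN_OUTSIDE_TRANSACTION_SENTINEL",
  'ALTER TABLE "t" VALIDATE CONSTRAINT "chk";',
].join("\n");

const statements = parseMigration(file);
console.log(statements.length); // 2
console.log(statements.every((s) => s.outsideTransaction)); // true
```

Splitting the file this way lets the runner execute each flagged statement on its own connection, outside any wrapping transaction, while still keeping the whole migration in one file.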

This file was deleted.

This file was deleted.

@@ -0,0 +1,139 @@
import { randomUUID } from 'crypto';
import type { Sql } from 'postgres';
import { expect } from 'vitest';

export const postMigration = async (sql: Sql) => {
  const projectId = `test-${randomUUID()}`;
  const tenancyId = randomUUID();
  const userId = randomUUID();
  const explicitSignedUpAt = '2026-03-08 12:34:56.789';

  const triggers = await sql`
    SELECT tgname
    FROM pg_trigger
    WHERE tgrelid = '"ProjectUser"'::regclass
      AND tgname = 'ProjectUser_set_signedUpAt_from_createdAt'
      AND NOT tgisinternal
  `;
  expect(triggers).toHaveLength(0);

  const functions = await sql`
    SELECT proname
    FROM pg_proc
    WHERE proname = 'set_project_user_signed_up_at_from_created_at'
  `;
  expect(functions).toHaveLength(0);

  const constraints = await sql`
    SELECT conname, convalidated
    FROM pg_constraint
    WHERE conrelid = '"ProjectUser"'::regclass
      AND conname IN (
        'ProjectUser_risk_score_bot_range',
        'ProjectUser_risk_score_free_trial_abuse_range',
        'ProjectUser_signedUpAt_not_null'
      )
    ORDER BY conname
  `;
  expect(constraints).toHaveLength(3);
  for (const constraint of constraints) {
    expect(constraint.convalidated, `${constraint.conname} should be validated`).toBe(true);
  }

  const indexes = await sql`
    SELECT indexname, indexdef
    FROM pg_indexes
    WHERE schemaname = current_schema()
      AND tablename = 'ProjectUser'
      AND indexname IN (
        'ProjectUser_signedUpAt_asc',
        'ProjectUser_signUpIp_recent_idx',
        'ProjectUser_signUpEmailBase_recent_idx'
      )
    ORDER BY indexname
  `;
  expect(indexes.map((row) => row.indexname)).toEqual([
    'ProjectUser_signUpEmailBase_recent_idx',
    'ProjectUser_signUpIp_recent_idx',
    'ProjectUser_signedUpAt_asc',
  ]);

  const indexDefByName = Object.fromEntries(indexes.map((row) => [row.indexname, row.indexdef]));
  expect(indexDefByName['ProjectUser_signedUpAt_asc']).toContain('"tenancyId", "isAnonymous", "signedUpAt"');
  expect(indexDefByName['ProjectUser_signUpIp_recent_idx']).toContain('"tenancyId", "isAnonymous", "signUpIp", "signedUpAt"');
  expect(indexDefByName['ProjectUser_signUpEmailBase_recent_idx']).toContain('"tenancyId", "isAnonymous", "signUpEmailBase", "signedUpAt"');

  const colInfo = await sql`
    SELECT is_nullable, column_default
    FROM information_schema.columns
    WHERE table_name = 'ProjectUser'
      AND column_name = 'signedUpAt'
  `;
  expect(colInfo).toHaveLength(1);
  expect(colInfo[0].is_nullable).toBe('NO');
  expect(colInfo[0].column_default).toBe(null);

  await sql`
    INSERT INTO "Project" ("id", "createdAt", "updatedAt", "displayName", "description", "isProductionMode")
    VALUES (${projectId}, NOW(), NOW(), 'Test', '', false)
  `;
  await sql`
    INSERT INTO "Tenancy" ("id", "createdAt", "updatedAt", "projectId", "branchId", "hasNoOrganization")
    VALUES (${tenancyId}::uuid, NOW(), NOW(), ${projectId}, 'main', 'TRUE'::"BooleanTrue")
  `;

  await expect(sql`
    INSERT INTO "ProjectUser" (
      "projectUserId",
      "tenancyId",
      "mirroredProjectId",
      "mirroredBranchId",
      "createdAt",
      "updatedAt",
      "lastActiveAt"
    ) VALUES (
      ${userId}::uuid,
      ${tenancyId}::uuid,
      ${projectId},
      'main',
      NOW(),
      NOW(),
      NOW()
    )
  `).rejects.toThrow(/signedUpAt/);

  await sql`
    INSERT INTO "ProjectUser" (
      "projectUserId",
      "tenancyId",
      "mirroredProjectId",
      "mirroredBranchId",
      "createdAt",
      "updatedAt",
      "lastActiveAt",
      "signedUpAt"
    ) VALUES (
      ${userId}::uuid,
      ${tenancyId}::uuid,
      ${projectId},
      'main',
      NOW(),
      NOW(),
      NOW(),
      ${explicitSignedUpAt}::timestamp
    )
  `;

  const insertedRows = await sql`
    SELECT
      "signedUpAt",
      "createdAt",
      "signedUpAt" = ${explicitSignedUpAt}::timestamp AS "matchesExplicitSignedUpAt"
    FROM "ProjectUser"
    WHERE "projectUserId" = ${userId}::uuid
  `;
  expect(insertedRows).toHaveLength(1);
  expect(insertedRows[0].signedUpAt).not.toBeNull();
  expect(insertedRows[0].matchesExplicitSignedUpAt).toBe(true);
  expect(insertedRows[0].signedUpAt.toISOString()).not.toBe(insertedRows[0].createdAt.toISOString());
};