Generate TypeScript types for tables and views in a SQL database.
Includes comments from tables, views and columns for supported providers.
Highly configurable: choose your own naming and casing schemes, add types, extend base types, generate enums from table data and more.
Supports the following databases: MySQL, Microsoft SQL Server, SQLite and Postgres courtesy of knex.
CREATE TABLE [dbo].[Customers](
[CustomerID] [nchar](5) NOT NULL,
[CompanyName] [nvarchar](40) NOT NULL,
[ContactName] [nvarchar](30) NULL
)
CREATE TABLE [dbo].[Employees](
[EmployeeID] [int] IDENTITY(1,1) NOT NULL,
[Name] [nvarchar](20) NOT NULL,
[BirthDate] [datetime] NULL,
[Photo] [image] NULL
)
export interface CustomersEntity {
'CustomerID': string;
'CompanyName': string;
'ContactName'?: string | null;
}
export interface EmployeesEntity {
'EmployeeID'?: number;
'Name': string;
'BirthDate'?: Date | null;
'Photo'?: Buffer | null;
}
Install into your project using npm / yarn.
npm install @rmp135/sql-ts
Install your relevant SQL driver. Refer to the knex documentation to determine which driver you should install.
For example, npm install mysql2.
Create a configuration file, for example mysql.json. This will mirror connection details from knex.
The most basic MySQL setup is below, modify as appropriate. Additional options can be applied by referring to the Config.
{
"client": "mysql2",
"connection": {
"host": "localhost",
"user": "user",
"password": "password"
}
}
Run npx @rmp135/sql-ts with the path of the configuration file created above.
npx @rmp135/sql-ts -c ./mysql.json
The file will be exported with the filename Database.ts (or with the name specified by filename) in the current working directory.
!> Warning: if this file exists, it will be overwritten.
Alternatively, use as a node module, importing the Client and chaining together your operations.
?> The Node module is exported as an ES module package, which requires your consuming package to also be of type module.
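For example, the consuming project's package.json must declare the module type (this is a standard Node.js ESM requirement, not specific to sql-ts):

```json
{
  "type": "module"
}
```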
Import the Client and create a configuration (see config).
import { Client } from '@rmp135/sql-ts'
const config = { } // Config as before.
The most basic syntax is chaining fromConfig, fetchDatabase and toTypescript.
const definition = await Client
.fromConfig(config)
.fetchDatabase()
.toTypescript()
If you already have a knex connection, you can omit the knex-specific options from the config and pass the connection to fetchDatabase.
const definition = await Client
.fromConfig(config) // knex config omitted.
.fetchDatabase(knexDb) // Pre-existing knex connection.
.toTypescript()
To output the database as an object for processing before converting to TypeScript, use toObject and continue with fromObject.
const asObject = await Client
.fromConfig(config)
.fetchDatabase(knexDb)
.toObject()
// Process asObject here.
const definition = await Client
.fromObject(asObject, config)
.toTypescript()
To simplify processing, a series of mapping functions are available: mapSchema, mapSchemas, mapTable, mapTables, mapColumn and mapColumns. The plural versions operate on all of the specified objects; the singular versions operate on a specifically named object. See Object Name Format.
This is performed after the database definition has been fetched and the config options applied.
const definition = await Client
.fromConfig(config)
.fetchDatabase()
.mapSchema('public', (schema) => { // Maps to schema with name public.
schema.namespaceName = 'new_name' // Change the output namespace name.
return schema // Return the mapped schema.
})
.mapTables((table, schema) => ({...table, interfaceName: `${schema.name}${table.name}Table`})) // Change the interface name of all tables.
.mapColumn('public.users.name', (column, table, schema) => ({...column, optional: true })) // Change optionality of column named "name" in table "users" in schema "public".
.toTypescript()
?> Note: Execution of the fetching and mapping functions is deferred until either toTypescript or toObject is called.
The configuration extends the knex configuration with some additional properties. If you're passing a knex connection into fetchDatabase, the knex related information is not required.
Filter the tables to include only those specified. These must be in the format schema.table. See Object Name Format for schema naming conventions.
{
"client": "...",
"connection": {},
"tables": ["schema1.Table1", "schema2.Table2"]
}
Filter the tables to exclude those specified. These must be in the format schema.table. See Object Name Format for schema naming conventions.
Excluding a table takes precedence over including it. Specifying a table in both configuration options will exclude it.
{
"client": "...",
"connection": {},
"excludedTables": [
"schema1.knex_migrations",
"schema1.knex_migrations_lock",
"schema2.android_metadata"
]
}
Override the types on a per-column basis. This requires the full name of the column in the format schema.table.column. See Object Name Format for schema naming conventions.
{
"client": "...",
"connection": {},
"typeOverrides": {
"dbo.Table_1.ColumnName": "string",
"dbo.Table_1.Name": "number"
}
}
Adds additional types to the type resolution. The order in which types are resolved is typeOverrides (see above), this typeMap, then the global TypeMap file, before defaulting to any.
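The resolution order described above can be sketched as follows. This is an illustrative reimplementation under stated assumptions, not the actual sql-ts code, and the function name is hypothetical.

```typescript
// Maps a TypeScript type name to the database types it covers,
// e.g. { "string": ["nvarchar", "varchar"] }.
type TypeMap = Record<string, string[]>

// Illustrative sketch of the documented resolution order:
// typeOverrides, then the user typeMap, then the global TypeMap,
// falling back to "any".
function resolveType(
  fullColumnName: string, // e.g. "dbo.Table_1.ColumnName"
  dbType: string, // e.g. "varchar"
  typeOverrides: Record<string, string>,
  typeMap: TypeMap, // user-supplied map from the config
  globalTypeMap: TypeMap // built-in TypeMap file
): string {
  // 1. Per-column overrides win outright.
  if (typeOverrides[fullColumnName]) return typeOverrides[fullColumnName]
  // 2. The user typeMap, then 3. the global TypeMap.
  for (const map of [typeMap, globalTypeMap]) {
    for (const [tsType, dbTypes] of Object.entries(map)) {
      if (dbTypes.includes(dbType)) return tsType
    }
  }
  // 4. Nothing matched: default to "any".
  return 'any'
}
```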
{
"client": "...",
"connection": {},
"typeMap": {
"number": ["decimal", "float"],
"string": ["nvarchar", "varchar"]
}
}
Specifies the filepath and name that the file should be saved as, relative to the current working directory. Defaults to "Database". The .ts extension is not required.
{
"client": "...",
"connection": {},
"filename": "DatabaseModels"
}
Specifies the format that the exported interface names will take. The token ${table} will be replaced with the table name.
Defaults to ${table}Entity.
{
"client": "...",
"connection": {},
"interfaceNameFormat": "${table}Model" // User becomes UserModel
}
Specifies the format that exported enum names will take. The token ${name} will be replaced with the enum name.
Defaults to ${name} (no change).
{
"client": "...",
"connection": {},
"enumNameFormat": "${name}Enum" // LogLevel becomes LogLevelEnum
}
Because enum keys cannot be numeric, they must be converted before populating the interface file. This option allows you to specify a custom format for numeric keys. Keys that are not numeric are not affected. The ${key} token will be replaced with the key name.
Defaults to _${key}.
{
"client": "...",
"connection": {},
"enumNumericKeyFormat": "_${key}" // "2" becomes "_2"
}
The following options concern modifying the casing of certain elements.
Valid case values are "pascal" for PascalCase, "camel" for camelCase, "lower" for lowercase and "upper" for UPPERCASE. If the value is empty, missing or invalid, no case conversion will be applied to the value. Some values will be modified to be TypeScript safe; others are wrapped in quotes.
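As a rough illustration of how the four values transform a name such as user_id (this is a sketch, not the sql-ts implementation; the exact word-splitting and TypeScript-safe rewriting rules are the library's own):

```typescript
// Illustrative casing conversion for the four documented values.
// Splits on underscores, hyphens and whitespace for pascal/camel.
function applyCasing(name: string, casing?: string): string {
  const words = name.split(/[_\s-]+/).filter(Boolean)
  const pascal = words
    .map(w => w[0].toUpperCase() + w.slice(1).toLowerCase())
    .join('')
  switch (casing) {
    case 'pascal':
      return pascal // "user_id" -> "UserId"
    case 'camel':
      return pascal[0].toLowerCase() + pascal.slice(1) // "user_id" -> "userId"
    case 'lower':
      return name.toLowerCase()
    case 'upper':
      return name.toUpperCase()
    default:
      return name // empty, missing or invalid: no conversion
  }
}
```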
Determines the casing for table names
{
"client": "...",
"connection": {},
"tableNameCasing": "pascal"
}
Determines the casing for column names.
{
"client": "...",
"connection": {},
"columnNameCasing": "camel"
}
Determines the casing for enum names. Any non-alphanumeric characters will be removed. If the enum name starts with numbers, those numbers will be removed.
{
"client": "...",
"connection": {},
"enumNameCasing": "lower"
}
Determines the casing for enum keys. Keys are wrapped in quotes to allow for any value; use ["index notation"] to reference keys with non-alphanumeric values.
{
"client": "...",
"connection": {},
"enumKeyCasing": "upper"
}
Uses the pluralize library to attempt to singularise table names. Defaults to false.
{
"client": "...",
"connection": {},
"singularTableNames": true
}
Specifies whether the table schema should be used as a namespace. The functionality differs between database providers. Defaults to false.
See Object Name Format for information on how schemas are read from different providers.
{
"client": "...",
"connection": {},
"schemaAsNamespace": true
}
Specifies which schemas to import. If MySQL is connected to without specifying a database, this can be used to import from multiple databases. Defaults to [] (all schemas).
!> The default schema on Postgres is public which is a reserved keyword in TypeScript. You may need to use the noImplicitUseStrict flag when transpiling.
See Object Name Format for information on how schemas are read from different providers.
{
"client": "...",
"connection": {},
"schemas": ["dbo", "schema1"]
}
Specifies additional properties to be assigned to the output TypeScript file. Key is in the format schema.table and the value is a list of raw strings.
{
"client": "...",
"connection": {},
"additionalProperties": {
"dbo.Table_1": [
"propertyOne: string",
"propertyTwo?: number | null"
]
}
}
Specifies the superclass that should be applied to the generated interface. Key is in the format schema.table and the value is the extension to apply. The following would generate export interface Table_1 extends Extension, AnotherExtension { }
{
"client": "...",
"connection": {},
"extends": {
"dbo.Table_1": "Extension, AnotherExtension"
}
}
Specifies the handlebars template to use when creating the output TypeScript file, relative to the current working directory. See src/template.handlebars for the default template.
See the below section on templating for more info on how to use the template.
{
"client": "...",
"connection": {},
"template": "./template.handlebars"
}
Determines the optionality of all generated properties. Available options are optional, required and dynamic. Defaults to dynamic.
- optional: The generated properties will be optional.
- required: The generated properties will be required.
- dynamic: The generated properties will match the optionality of the underlying column. Columns that are nullable, or that have default or generated values, will be optional.
This can be used to create specific "read" interfaces by removing all optionality from columns.
{
"client": "...",
"connection": {},
"globalOptionality": "dynamic"
}
Determines the optionality on a per-column basis.
Key is the fully qualified column name (see Object Name Format) and the value is the optionality (see Global Optionality).
This option will override the global optionality setting of the specified property.
Useful if the dynamic optionality detection is not working as intended.
{
"client": "...",
"connection": {},
"columnOptionality": {
"dbo.Table_1.Column_1": "optional",
"dbo.Table_2.Column_2": "required",
"dbo.Table_3.Column_3": "dynamic"
}
}
Determines the order that the columns are sorted when applied to the template. Available options are source and alphabetical. Defaults to alphabetical.
- source: The order the columns are fetched from the database.
- alphabetical: Alphabetical order based on column name.
{
"client": "...",
"connection": {},
"columnSortOrder": "alphabetical"
}
Defines tables to be used to generate enums.
Enums can be defined by specifying the fully qualified table name (see Object Name Format) as the key, and the key/value of the enum as the value.
Take the following example. This will select all records from the "dbo.LogLevel" table, generating an enum with the "Level" column as the key and the "ID" column as the value.
dbo.LogLevel
| ID | Level |
|---|---|
| 1 | Warning |
| 2 | Error |
| 3 | Info |
export enum LogLevel {
Warning = '1',
Error = '2',
Info = '3'
}
For further information, see Table Generated Enums.
{
"client": "...",
"connection": {},
"tableEnums": {
"dbo.LogLevel": {
"key": "Level",
"value": "ID"
}
}
}
A simple object that can be used in the template. Useful for defining arbitrary values at runtime.
The default template does not use this field; using it requires modifying the template to your needs. The entire config is passed into the template and can be found under the config.custom key.
{
"client": "...",
"connection": {},
"custom": {
"version": "1.2.3"
}
}
Enums are supported for Postgres and MySQL, but they work in different ways.
In Postgres, an ENUM type can be created and referenced by a column; it becomes a TypeScript enum and is referenced as the property type. This allows multiple properties to share the same enum.
In MySQL, enums become string literal unions. A column LogType of type ENUM('warning', 'verbose') becomes LogType: 'warning' | 'verbose'.
TypeScript enums can also be generated from data stored in a table and set as the type on another property, allowing for static enum data to be shared between database and TypeScript code as well as providing intellisense during development.
Take for example, a Log and LogLevel table. The Log table references LogLevel by foreign key and the LogLevel contains a static list of levels.
dbo.Log
| ID | Level | Message |
|---|---|---|
| 1 | 1 | Verbose log. |
| 2 | 3 | A warning log. |
| 3 | 3 | Another warning log. |
dbo.LogLevel
| ID | Level |
|---|---|
| 1 | Verbose |
| 2 | Info |
| 3 | Warning |
| 4 | Error |
We can extract the LogLevels as enums using tableEnums and set the Log.Level column type using typeOverrides.
{
"typeOverrides": {
"dbo.Log.Level": "LogLevel"
},
"tableEnums": {
"dbo.LogLevel": {
"key": "Level",
"value": "ID"
}
}
}
This creates the following interface file.
export interface LogEntity {
'ID'?: number;
'Level': LogLevel;
}
export enum LogLevel {
'Verbose' = 1,
'Info' = 2,
'Warning' = 3,
'Error' = 4,
}
This allows, for example, log.Level = LogLevel.Warning, which when persisted to the database will use the underlying ID as the value, keeping the referential constraint intact.
Comments provided on tables, views and columns are imported into the template.
By default these will be inserted into the generated file as comments on interfaces and properties.
Different database providers provide these comments in different ways.
| Provider | Source |
|---|---|
| MySQL | "Comments" field |
| SQL Server | Extended property with name "MS_Description" |
| Postgres | "Comment" field |
| SQLite | Not supported |
The default values of columns will be returned as part of the column definition. The value of this field will depend on the database provider but in all cases will return the raw value that the database stores defaults as.
For example, Postgres stores character varying defaults as "'default value'::character varying" so this is what will populate the field. It is up to the library consumer to parse this value.
Fields without a default will return null.
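Since parsing is left to the consumer, a minimal sketch of stripping the Postgres cast and quoting might look like the following. The function name and regular expressions are assumptions for illustration, not part of sql-ts, and they do not cover every form a Postgres default can take (e.g. function calls such as nextval are passed through unchanged).

```typescript
// Consumer-side sketch: strip a trailing "::type" cast and unwrap
// single quotes from a raw Postgres column default.
function parsePostgresDefault(raw: string | null): string | null {
  if (raw === null) return null // no default on the column
  // Remove a trailing cast such as "::character varying" or "::text[]".
  const withoutCast = raw.replace(/::[\w ]+(\[\])?$/, '')
  // Unwrap single quotes and unescape doubled quotes ('' -> ').
  const match = withoutCast.match(/^'(.*)'$/s)
  return match ? match[1].replace(/''/g, "'") : withoutCast
}
```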
Objects are typically referred to using the pattern {schema}.{table}.{column}, or {schema}.{table} if the column is not required, or simply {schema} if only the schema is required.
Table and column are fairly obvious but schemas are used differently depending on the database provider. The below table lists how the schema is read for each provider.
| Provider | Source |
|---|---|
| Postgres | The schema that the table belongs to. |
| SQL Server | The schema that the table belongs to. |
| MySQL | The database name. |
| SQLite | 'main' |
For example, column ID in table Customer in database Live in MySQL would be referred to as Live.Customer.ID.
The template.handlebars file controls the output of the generated .ts file. You can modify this file and use the template config option to specify a custom template file.
The inputs to this file are as follows.
{
"schemas": [{
"name": "dbo", // Original name of the schema.
"namespaceName": "dbo", // The computed schema name (default same as name).
"tables": [ // List of all non-filtered tables.
{
"name": "User", // The original database table name.
"schema": "dbo", // The schema the table belongs to.
"additionalProperties": { // Any additional properties that should be added.
"PropertyName": "number" // Property name and type.
},
"extends": "DBEntity", // The superclass, if any, that should be extended.
"interfaceName": "UserEntity", // The computed interface name.
"comment": "a table comment", // A table comment (see Comments below).
"columns": [ // List of columns in this table.
{
"name": "ID", // The original database column name.
"type": "int", // The original database type.
"propertyName": "ID", // The computed TypeScript property name.
"propertyType": "number", // The computed TypeScript type.
"nullable": false, // Whether the column is nullable.
"optional": true, // Whether the column is optional for insertion (has a default value).
"isEnum": false, // Whether the column is an enum type (currently only Postgres).
"isPrimaryKey": true, // Whether the column is a primary key.
"comment": "a comment", // A column comment (see Comments above).
"defaultValue": "'default'" // The raw default value of the column, or null (see Default Values above).
}
]
}
],
"enums": [ // List of enums, including table defined.
{
"name": "Severity", // The original database enum name.
"convertedName": "SeverityEnum", // The converted name of the enum.
"schema": "dbo", // The schema the enum belongs to.
"values": [
{
"originalKey": "very high", // The original database key that represents the enum.
"convertedKey": "veryhigh", // Converted value name.
"value": "Very High" // Value this enum represents.
}
]
}
]
}],
"config": { } // The user supplied configuration.
}
By default, sql-ts builds interfaces from an INSERT perspective: columns that are nullable, have defaults, or are generated are marked optional (? on the property) so they can be omitted from the object and the database will supply the value on insert.
However, this can cause issues when using the generated interface for passing around to other areas of an application that are requiring a concrete value.
To resolve this, you can use the globalOptionality, filename and interfaceNameFormat config options to create read models in a separate database models file.
{
"client": "...",
"connection": {},
"globalOptionality": "required",
"interfaceNameFormat": "${table}ReadEntity",
"filename": "ReadDatabase.ts"
}
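As an illustration, for a hypothetical Users table with an identity ID column (names assumed for this example), the default config and the read-model config above would yield interfaces along these lines:

```typescript
// Default config: INSERT perspective, defaulted columns are optional.
interface UserEntity {
  ID?: number // identity column, optional on insert
  Name: string
}

// Read-model config: every property is required.
interface UserReadEntity {
  ID: number
  Name: string
}

// Inserts can omit defaulted columns...
const newUser: UserEntity = { Name: 'Ada' }

// ...while code consuming fetched rows can rely on concrete values.
function describe(user: UserReadEntity): string {
  return `${user.ID}: ${user.Name}`
}
```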