This page describes how to execute SQL statements against databases on Cloud SQL instances using the Data API. With the Data API, you use the Cloud SQL Admin API and gcloud CLI to run SQL statements on any instance where you've enabled Data API access.
You can use the Data API with instances that use public IP addresses, private services access, or Private Service Connect. The Data API supports all types of SQL statements, including data manipulation language (DML), data definition language (DDL), and data query language (DQL). The Data API is well suited to small, quick administrative statements, such as creating database roles or users and making small schema updates.
Before you begin
Before you can execute SQL statements on an instance, do the following:
- Configure the instance for IAM database authentication.
- Add an IAM user or service account to the instance and grant the account the required roles or permissions to execute SQL statements.
Required roles or permissions
By default, user or service accounts with one of the following roles have the permission to execute SQL statements on a Cloud SQL instance (cloudsql.instances.executesql):
- Cloud SQL Admin (roles/cloudsql.admin)
- Cloud SQL Instance User (roles/cloudsql.instanceUser)
- Cloud SQL Studio User (roles/cloudsql.studioUser)
You can also define an IAM custom role for the user or service account that includes the cloudsql.instances.executesql permission, which is supported in IAM custom roles.
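As a sketch, a custom role that carries only this permission could be defined in a YAML file and created with gcloud iam roles create. The role title, description, and file name below are illustrative; only the permission name comes from this page.

```yaml
# Illustrative custom role definition (role.yaml); only the
# cloudsql.instances.executesql permission is prescribed by this page.
title: "SQL Statement Executor"
description: "Can run SQL statements through the Data API."
stage: "GA"
includedPermissions:
- cloudsql.instances.executesql
```

You could then create the role with, for example, gcloud iam roles create dataApiExecutor --project=PROJECT_ID --file=role.yaml and grant it to the user or service account.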
Enable or disable the Data API
To use the Data API, you must enable it for each instance. You can disable the Data API at any time.
Console
- In the Google Cloud console, go to the Cloud SQL Instances page.
- To open the Overview page of an instance, click the instance name.
- From the SQL navigation menu, select Connections.
- Click the Networking tab.
- Select the Allow Data API checkbox.
- Click Save.
gcloud
To enable Data API access on an instance, use the gcloud sql instances patch command with the --data-api-access=ALLOW_DATA_API flag:
gcloud sql instances patch INSTANCE_NAME --data-api-access=ALLOW_DATA_API
To disable Data API access, use the --data-api-access=DISALLOW_DATA_API flag:
gcloud sql instances patch INSTANCE_NAME --data-api-access=DISALLOW_DATA_API
Replace INSTANCE_NAME with the name of the instance on which to enable or disable the Data API.
Execute a SQL statement
You can execute SQL statements against databases on your Cloud SQL instance using either the gcloud CLI or the REST API.
gcloud
To execute a SQL statement against a database on an instance using the gcloud CLI, use the gcloud sql instances execute-sql command:
gcloud sql instances execute-sql INSTANCE_NAME \
  --database=DATABASE_NAME \
  --sql=SQL_STATEMENT \
  --partial_result_mode=PARTIAL_RESULT_MODE
Make the following replacements:
- INSTANCE_NAME: the name of the instance.
- DATABASE_NAME: the name of the database within the instance.
- SQL_STATEMENT: the SQL statement to execute. If the statement contains spaces or shell special characters, then it must be quoted.
- PARTIAL_RESULT_MODE: optional. Controls how the command responds when the result is incomplete. Can be ALLOW_PARTIAL_RESULT, FAIL_PARTIAL_RESULT, or PARTIAL_RESULT_MODE_UNSPECIFIED. See Modify truncation behavior.
You can also include the --project=PROJECT_ID flag if needed.
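When scripting this command, building the invocation as an argument list sidesteps the shell-quoting concerns mentioned above for SQL that contains spaces or special characters. The following Python sketch only constructs the argument list (the instance and database names are placeholders); you would pass it to subprocess.run to actually execute it.

```python
import shlex

def build_gcloud_argv(instance, database, sql, partial_result_mode=None):
    """Build the gcloud execute-sql invocation as an argv list.

    Passing an argv list to subprocess.run avoids manual shell quoting
    of the SQL statement.
    """
    argv = ["gcloud", "sql", "instances", "execute-sql", instance,
            f"--database={database}", f"--sql={sql}"]
    if partial_result_mode:
        argv.append(f"--partial_result_mode={partial_result_mode}")
    return argv

# Placeholder names; run with subprocess.run(argv) in a real script.
argv = build_gcloud_argv("my-instance", "petsdb",
                         "SELECT * FROM owners WHERE name = 'Ada';")
print(shlex.join(argv))  # shell-safe rendering of the same command
```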
Terraform
You can use the Data API with Terraform to provision in-database resources, such as databases, tables, extensions, users, and privilege grants, without manually connecting to the instance. To execute a SQL script with Terraform, use the google_sql_provision_script Terraform resource.
resource "google_sql_database_instance" "instance" {
  name             = "my-instance"
  database_version = "MYSQL_8_4"
  settings {
    tier            = "db-perf-optimized-N-2"
    data_api_access = "ALLOW_DATA_API" # This allows the use of Data API.
    database_flags {
      name  = "cloudsql_iam_authentication"
      value = "on"
    }
  }
}

/*
 * Create a database user for your account and grant roles so it has privilege
 * to access the database. Set the type to CLOUD_IAM_USER for a human
 * account or CLOUD_IAM_SERVICE_ACCOUNT for a service account.
 */
resource "google_sql_user" "iam_user" {
  name     = "account-used-to-apply-this-config@example.com"
  instance = google_sql_database_instance.instance.name
  type     = "CLOUD_IAM_USER"

  # Roles granted to the user. Smaller roles are preferred, if they exist.
  # This field doesn't support MySQL 5.6 and 5.7.
  database_roles = ["cloudsqlsuperuser"]
}

resource "google_sql_provision_script" "script" {
  # You can inline the script or import it from a file, for example
  # script = file("${path.module}/script.sql").
  # When modified, the whole script is executed again. It's recommended to
  # make the script idempotent with patterns like "create if not exists ..."
  # or "if not exists (select ...) then ... end if".
  script   = "CREATE DATABASE pets;"
  instance = google_sql_database_instance.instance.name

  # Some of your queries may require a database. You can create and use a
  # database in the script or explicitly create and reference a database,
  # for example database = google_sql_database.database.name.
  description = "sql script to create DBs"

  # The identity account used to apply your Terraform config must exist as an
  # IAM user or IAM service account in the instance. Terraform connects to the
  # instance via IAM database authentication to execute the script.
  depends_on = [google_sql_user.iam_user]
}
Apply the changes
To apply your Terraform configuration in a Google Cloud project, complete the steps in the following sections.
Prepare Cloud Shell
- Launch Cloud Shell.
- Set the default Google Cloud project where you want to apply your Terraform configurations. You only need to run this command once per project, and you can run it in any directory.
  export GOOGLE_CLOUD_PROJECT=PROJECT_ID
  Environment variables are overridden if you set explicit values in the Terraform configuration file.
Prepare the directory
Each Terraform configuration file must have its own directory (also called a root module).
- In Cloud Shell, create a directory and a new file within that directory. The filename must have the .tf extension, for example main.tf. In this tutorial, the file is referred to as main.tf.
  mkdir DIRECTORY && cd DIRECTORY && touch main.tf
- If you are following a tutorial, you can copy the sample code in each section or step. Copy the sample code into the newly created main.tf. Optionally, copy the code from GitHub. This is recommended when the Terraform snippet is part of an end-to-end solution.
- Review and modify the sample parameters to apply to your environment.
- Save your changes.
- Initialize Terraform. You only need to do this once per directory.
  terraform init
  Optionally, to use the latest Google provider version, include the -upgrade option:
  terraform init -upgrade
Apply the changes
- Review the configuration and verify that the resources that Terraform is going to create or update match your expectations:
  terraform plan
  Make corrections to the configuration as necessary.
- Apply the Terraform configuration by running the following command and entering yes at the prompt:
  terraform apply
  Wait until Terraform displays the "Apply complete!" message.
- Open your Google Cloud project to view the results. In the Google Cloud console, navigate to your resources in the UI to make sure that Terraform has created or updated them.
Delete the changes
Deleting a google_sql_provision_script resource won't delete the
in-database resources it created. To delete them, you can explicitly add statements in
the script such as drop ... if exists and then apply the changes.
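For example, a cleanup could be expressed as another google_sql_provision_script resource. This is a sketch, not a required pattern; the resource name is hypothetical, and the pets database and instance reference are carried over from the sample configuration above.

```hcl
resource "google_sql_provision_script" "cleanup" {
  # DROP ... IF EXISTS keeps the script idempotent and safe to re-apply.
  script      = "DROP DATABASE IF EXISTS pets;"
  instance    = google_sql_database_instance.instance.name
  description = "sql script to drop DBs"
}
```

After applying this change, the database is dropped by the script itself rather than by Terraform's resource deletion.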
REST
To execute a SQL statement against a database on an instance using the REST API, send a POST request to the executeSql endpoint:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_NAME/executeSql
The request body should contain the database name and the SQL statement:
{
  "database": "DATABASE_NAME",
  "sqlStatement": "SQL_STATEMENT",
  "partialResultMode": "PARTIAL_RESULT_MODE",
  "autoIamAuthn": true
}
Make the following replacements:
- PROJECT_ID: your project ID.
- INSTANCE_NAME: the name of the instance.
- DATABASE_NAME: the name of the database within the instance.
- SQL_STATEMENT: the SQL statement to execute.
- PARTIAL_RESULT_MODE: optional. Controls how the API responds when the result exceeds 10 MB. Can be FAIL_PARTIAL_RESULT or ALLOW_PARTIAL_RESULT. See Modify truncation behavior.
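The POST request above can be sketched in Python with only the standard library. The project, instance, and database names below are placeholders, and the access token is assumed to come from your own auth flow (for example, gcloud auth print-access-token); the sketch only builds the request, and sending it is left to urllib.request.urlopen.

```python
import json
import urllib.request

def build_execute_sql_request(project, instance, database, sql, token,
                              partial_result_mode="FAIL_PARTIAL_RESULT"):
    """Build (but don't send) the executeSql POST request."""
    url = (f"https://sqladmin.googleapis.com/sql/v1beta4/projects/{project}"
           f"/instances/{instance}/executeSql")
    body = {
        "database": database,
        "sqlStatement": sql,
        "partialResultMode": partial_result_mode,
        "autoIamAuthn": True,  # authenticate as the calling IAM principal
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Placeholder names; send with urllib.request.urlopen(req) in a real script.
req = build_execute_sql_request(
    "my-project", "my-instance", "petsdb", "SELECT 1;", "ACCESS_TOKEN")
print(req.full_url)
```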
Modify truncation behavior
You can control how large results are handled when executing SQL.
- Include the "partialResultMode" field in the request. This field accepts the following values:
  - FAIL_PARTIAL_RESULT: Throw an error if the result exceeds 10 MB or if only a partial result can be retrieved. Don't return the result.
  - ALLOW_PARTIAL_RESULT: Return a truncated result and set partial_result to true if the result exceeds 10 MB or if only a partial result can be retrieved due to an error. Don't throw an error.
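A caller using ALLOW_PARTIAL_RESULT should check the returned flag before trusting the rows. The sketch below assumes the response has been parsed into a dict with a partialResult field, per the description above; the exact response shape is an assumption, so adapt the key to what your client actually receives.

```python
# Sketch of handling a response when partialResultMode is ALLOW_PARTIAL_RESULT.
# The response dict shape (a "partialResult" flag) is an assumption based on
# the field this page describes.
def check_truncation(response: dict) -> list:
    """Return human-readable warnings if the result was truncated."""
    warnings = []
    if response.get("partialResult"):
        warnings.append(
            "Result exceeded the 10 MB response limit and was truncated; "
            "narrow the query (for example, add LIMIT) for complete rows.")
    return warnings

full = {"rows": [[1], [2]], "partialResult": False}
truncated = {"rows": [[1]], "partialResult": True}
print(check_truncation(full))       # []
print(check_truncation(truncated))
```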
Limitations
- The size limit for a response is 10 MB. Results exceeding this size are truncated if partialResultMode is set to ALLOW_PARTIAL_RESULT; otherwise, an error is thrown.
- Requests are limited to 0.5 MB.
- You can run SQL statements only on Cloud SQL for MySQL instances that are running.
- Cloud SQL doesn't support using the Data API with instances that are set up for external server replication.
- Requests taking longer than 30 seconds are canceled. Setting a higher statement timeout using SET SESSION MAX_EXECUTION_TIME isn't supported. For Cloud SQL for MySQL 5.6 and 5.7, long-running DDL statements that time out can cause orphaned files or tables that can't be safely rolled back. Be cautious with statements like ALTER TABLE on large tables.
- Cloud SQL limits the number of concurrent executeSql requests to 10 per instance for each user. If this limit is reached, subsequent requests fail with "At most 10 concurrent queries may be run on this instance. Try again later." or "Maximum concurrent reads 10 reached."
- Each response can contain a maximum of 10 database messages or warnings.
- If there is a statement syntax or execution error, then no result is returned.
- For Cloud SQL for MySQL, notices and warnings are available only for the last statement of a multi-statement execution.
- Statements that consume a large amount of memory can cause out-of-memory errors. For more information on avoiding these errors, see Best practices for managing memory usage. A database instance running with high memory utilization often causes performance issues, stalls, or even database downtime.
- The Data API can be temporarily blocked to preserve data integrity while certain maintenance operations are ongoing on the instance. If this happens, retry later.
- The Data API doesn't yet guarantee data residency. Requests fail with the error "not supported for instances in certain Assured Workloads control packages folders" for certain Assured Workloads projects and for projects with constraints/sql.restrictNoncompliantResourceCreation manually enforced.
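Because the concurrency limit and temporary maintenance blocks above are retryable conditions, a client can wrap its executeSql call in exponential backoff. The sketch below detects the limit by message substring, which is an assumption; adapt the detection to the actual error type your client library raises.

```python
import time

def execute_with_retry(run_sql, max_attempts=5, base_delay=1.0):
    """Retry a zero-arg executeSql callable on concurrency-limit errors.

    Detecting the limit via a "concurrent" substring in the error message
    is an assumption for illustration, not a documented contract.
    """
    for attempt in range(max_attempts):
        try:
            return run_sql()
        except RuntimeError as err:
            if "concurrent" not in str(err) or attempt == max_attempts - 1:
                raise
            # Exponential backoff: base_delay, 2x, 4x, ...
            time.sleep(base_delay * (2 ** attempt))

# Simulated caller that fails twice with the documented message, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError(
            "At most 10 concurrent queries may be run on this instance.")
    return "ok"

print(execute_with_retry(flaky, base_delay=0.01))  # "ok" after two retries
```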