Commit 896699c

include env variables export
Signed-off-by: Danny Chiao <danny@tecton.ai>
1 parent 875210b commit 896699c

File tree

1 file changed: +14 additions, −0 deletions


module_3/README.md

Lines changed: 14 additions & 0 deletions
@@ -72,6 +72,13 @@ Now we're using a test database in Snowflake.
 
 To get started, go ahead and register the feature repository
 ```bash
+# Note: first you need to export environment variables matching the above variables:
+# export SNOWFLAKE_DEPLOYMENT_URL="[YOUR DEPLOYMENT]"
+# export SNOWFLAKE_USER="[YOUR USER]"
+# export SNOWFLAKE_PASSWORD="[YOUR PASSWORD]"
+# export SNOWFLAKE_ROLE="[YOUR ROLE]"
+# export SNOWFLAKE_WAREHOUSE="[YOUR WAREHOUSE]"
+# export SNOWFLAKE_DATABASE="[YOUR DATABASE]"
 cd feature_repo; feast apply
 ```

@@ -114,6 +121,13 @@ We setup a standalone version of Airflow to set up the `PythonOperator` (Airflow
114121
The below script will copy the dbt DAGs over. In production, you'd want to use Airflow to sync with version controlled dbt DAGS (e.g. that are sync'd to S3).
115122

116123
```bash
124+
# First: export Snowflake related environment variables used above:
125+
# export SNOWFLAKE_DEPLOYMENT_URL="[YOUR DEPLOYMENT]
126+
# export SNOWFLAKE_USER="[YOUR USER]
127+
# export SNOWFLAKE_PASSWORD="[YOUR PASSWORD]
128+
# export SNOWFLAKE_ROLE="[YOUR ROLE]
129+
# export SNOWFLAKE_WAREHOUSE="[YOUR WAREHOUSE]
130+
# export SNOWFLAKE_DATABASE="[YOUR DATABASE]
117131
cd airflow_demo; sh setup_airflow.sh
118132
```
119133
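Since the exports are only comments in the README, the reader runs them by hand before `feast apply` or the Airflow setup script. A minimal sketch of one common way to do that, using a sourced env file (the file name `snowflake.env` and the placeholder values are illustrative, not part of this commit):

```shell
# Write the Snowflake settings to an env file (values here are placeholders;
# the quoted heredoc keeps them literal).
cat > snowflake.env <<'EOF'
export SNOWFLAKE_DEPLOYMENT_URL="[YOUR DEPLOYMENT]"
export SNOWFLAKE_USER="[YOUR USER]"
export SNOWFLAKE_PASSWORD="[YOUR PASSWORD]"
export SNOWFLAKE_ROLE="[YOUR ROLE]"
export SNOWFLAKE_WAREHOUSE="[YOUR WAREHOUSE]"
export SNOWFLAKE_DATABASE="[YOUR DATABASE]"
EOF

# Source it so the variables are visible to feast / the Airflow setup script
# launched from the same shell.
. ./snowflake.env
echo "SNOWFLAKE_USER is set to: $SNOWFLAKE_USER"
```

Sourcing (rather than executing) the file is what makes the exports land in the current shell's environment.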
