The key thing to note for now is that the registry has been swapped for a SQL-backed registry (Postgres) and the online store has been configured to be Redis. This is specifically for a single Redis node. If you want to use a Redis cluster, then you'd change this to something like:
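For example, a `feature_store.yaml` pointed at a Redis cluster might look like the following sketch (the host names are placeholders; check the Feast Redis online store docs for your version's full option set):

```yaml
online_store:
  type: redis
  redis_type: redis_cluster
  # Comma-separated list of cluster nodes (placeholder hosts)
  connection_string: "redis1:6379,redis2:6379,redis3:6379"
```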
Because we use `redis-py` under the hood, this means Feast also works well with hosted Redis instances like AWS Elasticache ([docs](https://docs.aws.amazon.com/AmazonElastiCache/latest/red-ug/ElastiCache-Getting-Started-Tutorials-Connecting.html)).
We then use Docker Compose to spin up the services we need.
- This also deploys a Feast push server (on port 6567) and a Feast feature server (on port 6566).
- These servers embed a `feature_store.yaml` file that enables them to connect to a remote registry. The Dockerfile mostly delegates to the `feast serve` CLI command, which instantiates a Feast Python server ([docs](https://docs.feast.dev/reference/feature-servers/python-feature-server)):
```dockerfile
# Needed to reach online store and registry within Docker network.
RUN sed -i 's/localhost:6379/redis:6379/g' feature_store.yaml
RUN sed -i 's/127.0.0.1:55001/registry:5432/g' feature_store.yaml

ENV FEAST_USAGE=False

CMD ["feast", "serve", "-h", "0.0.0.0"]
```
```
Creating broker ... done
Creating feast_feature_server ... done
Creating feast_push_server ... done
Creating kafka_events ... done
Creating registry ... done
Attaching to zookeeper, redis, broker, feast_push_server, feast_feature_server, kafka_events, registry
...
```
## Step 5: Why register streaming features in Feast?
Run the Jupyter notebook ([feature_repo/workshop.ipynb](feature_repo/module_1.ip))
### Scheduling materialization
To ensure fresh features, you'll want to schedule materialization jobs regularly. This can be as simple as having a cron job that calls `feast materialize-incremental`.
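As a sketch, such a cron job could look like the following crontab entry (the repo path and 30-minute schedule are placeholders; note that `%` must be escaped as `\%` in crontab):

```
# Materialize features up to the current time every 30 minutes
*/30 * * * * cd /path/to/feature_repo && feast materialize-incremental $(date -u +\%Y-\%m-\%dT\%H:\%M:\%S)
```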
Users may also be interested in integrating with Airflow, in which case you can build a custom Airflow image with the Feast SDK installed, and then use a `BashOperator` (with `feast materialize-incremental`) or `PythonOperator` (with `store.materialize_incremental(datetime.datetime.now())`).