module_1/README.md
In this module, we focus on building features for online serving, and keeping th…

- [Workshop](#workshop)
- [Step 1: Install Feast](#step-1-install-feast)
- [Step 2: Inspect the `feature_store.yaml`](#step-2-inspect-the-feature_storeyaml)
- [Step 3: Spin up Kafka + Redis + Feast services](#step-3-spin-up-kafka--redis--feast-services)
- [Step 4: Materialize batch features & ingest streaming features](#step-4-materialize-batch-features--ingest-streaming-features)
- [A note on Feast feature servers + push servers](#a-note-on-feast-feature-servers--push-servers)
- [Conclusion](#conclusion)
- [FAQ](#faq)
## Step 1: Install Feast

First, we install Feast with Spark and Redis support:

```bash
pip install "feast[spark,redis]"
```
## Step 2: Inspect the `feature_store.yaml`

```yaml
project: feast_demo_local
provider: local
registry:
  path: data/local_registry.db
  cache_ttl_seconds: 5
online_store:
  type: redis
  connection_string: localhost:6379
offline_store:
  type: file
```
The key thing to note for now is that the online store has been configured to be Redis. This configuration is specifically for a single Redis node; if you want to use a Redis cluster, you'd change it to something like:
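Based on Feast's Redis online store documentation, a cluster setup looks roughly like the fragment below (the hostnames are placeholders):

```yaml
online_store:
  type: redis
  redis_type: redis_cluster
  # Comma-separated list of cluster nodes (placeholder hosts).
  connection_string: "redis1:6379,redis2:6379"
```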
Because we use `redis-py` under the hood, Feast also works well with hosted Redis instances like AWS ElastiCache ([docs](https://docs.aws.amazon.com/AmazonElastiCache/latest/red-ug/ElastiCache-Getting-Started-Tutorials-Connecting.html)).
## Step 3: Spin up Kafka + Redis + Feast services

We then use Docker Compose to spin up a local Kafka cluster and automatically publish events to it.
- This leverages a script (in `kafka_demo/`) that creates a topic, reads from `feature_repo/data/driver_stats.parquet`, generates newer timestamps, and emits the events to the topic.
- This also deploys a Feast push server (on port 6567) + a Feast feature server (on port 6566). These servers embed a `feature_store.yaml` file that enables them to connect to a remote registry. The Dockerfile mostly delegates to the `feast serve` CLI command, which instantiates a Feast Python feature server ([docs](https://docs.feast.dev/reference/feature-servers/python-feature-server)):
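The timestamp-refreshing core of that event-generator script can be sketched as follows. This is a pure-stdlib stand-in, not the actual `kafka_demo/` code: the real script reads the parquet file and produces to Kafka, while here `sample` is an inline stand-in row and the Kafka producer is omitted.

```python
import json
from datetime import datetime, timezone

def refresh_events(rows):
    """Rewrite each row's event_timestamp to 'now' so the data looks fresh,
    then serialize rows to the JSON payloads a Kafka producer would emit."""
    now = datetime.now(timezone.utc).isoformat()
    return [json.dumps({**row, "event_timestamp": now}) for row in rows]

# In the real script, these rows come from feature_repo/data/driver_stats.parquet.
sample = [{"driver_id": 1001, "conv_rate": 0.25,
           "event_timestamp": "2021-01-01T00:00:00"}]
payloads = refresh_events(sample)
```

Each payload would then be sent to the demo topic with a Kafka producer, so consumers always see recent-looking driver stats.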
## Step 4: Materialize batch features & ingest streaming features

We'll switch gears into a Jupyter notebook. This will guide you through:
- Registering a `FeatureView` that has a single schema across both a batch source (`FileSource`) with aggregate features and a stream source (`PushSource`).
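As a preview of what the notebook registers, a feature-repo definition combining a `FileSource` and a `PushSource` under one schema looks roughly like the sketch below. This assumes a recent Feast release; the entity, field, and source names here are illustrative, not necessarily the ones the notebook uses.

```python
from datetime import timedelta
from feast import Entity, FeatureView, Field, FileSource, PushSource
from feast.types import Float32

driver = Entity(name="driver", join_keys=["driver_id"])

# Batch source backed by the workshop's parquet file.
driver_stats_batch = FileSource(
    name="driver_stats_source",
    path="data/driver_stats.parquet",
    timestamp_field="event_timestamp",
)

# The push source wraps the batch source, so both share a single schema.
driver_stats_push = PushSource(
    name="driver_stats_push_source",
    batch_source=driver_stats_batch,
)

driver_stats_fv = FeatureView(
    name="driver_hourly_stats",
    entities=[driver],
    ttl=timedelta(days=1),
    schema=[Field(name="conv_rate", dtype=Float32)],
    source=driver_stats_push,
)
```

Registering this with `feast apply` lets materialization serve the batch values while pushed events overwrite them in the online store.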