
Commit 9b7b2e1

Add reference to querying by RepoConfig

Signed-off-by: Danny Chiao <danny@tecton.ai>

1 parent: 40e4cc8

File tree: 2 files changed (+26 −1 lines changed)

module_0/README.md — 1 addition & 1 deletion

@@ -441,7 +441,7 @@ We don't need to do anything new here since data scientists will be doing many o
 There are two ways data scientists can use Feast:
 - Use Feast primarily as a way of pulling production ready features.
-  - See the `client/` folder for an example of how users can pull features by only having a `feature_store.yaml`
+  - See the `client/` or `client_no_yaml` folders for examples of how users can pull features by only having a `feature_store.yaml` or instantiating a `RepoConfig` object
   - This is **not recommended** since data scientists cannot register feature services to indicate they depend on certain features in production.
 - **[Recommended]** Have a local copy of the feature repository (e.g. `git clone`) and author / iterate / re-use features.
   - Data scientist can:
25 additions & 0 deletions

@@ -0,0 +1,25 @@
+from feast import FeatureStore, RepoConfig
+from feast.repo_config import RegistryConfig
+
+# TODO: replace with your bucket
+repo_config = RepoConfig(
+    registry=RegistryConfig(path="s3://feast-workshop-danny/registry.pb"),
+    project="feast_demo_aws",
+    provider="aws",
+    offline_store="file",  # Could also be the OfflineStoreConfig, e.g. FileOfflineStoreConfig
+    online_store="null",  # Could also be the OnlineStoreConfig, e.g. RedisOnlineStoreConfig
+)
+store = FeatureStore(config=repo_config)
+
+import pandas as pd
+
+# Get the latest feature values for unique entities
+entity_df = pd.DataFrame.from_dict({"driver_id": [1001, 1002, 1003, 1004, 1005]})
+entity_df["event_timestamp"] = pd.to_datetime("now", utc=True)
+training_df = store.get_historical_features(
+    entity_df=entity_df, features=store.get_feature_service("model_v2"),
+).to_df()
+
+# Make batch predictions
+# predictions = model.predict(training_df)
+print(training_df)
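For comparison, the `client/` approach mentioned in the README change relies on a checked-in `feature_store.yaml` instead of constructing a `RepoConfig` in code. A rough sketch of what an equivalent config file might look like, reusing the same bucket, project, and store settings as the Python snippet above (the exact schema can vary by Feast version, so treat this as an illustration rather than a verified config):

```yaml
# feature_store.yaml — assumed equivalent of the RepoConfig above
project: feast_demo_aws
provider: aws
registry: s3://feast-workshop-danny/registry.pb
offline_store:
  type: file
online_store:
  type: "null"
```

With a file like this in place, the client can be created with `FeatureStore(repo_path=".")` and no `RepoConfig` object at all, which is exactly the trade-off the README hunk describes.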

0 commit comments
