Feast (**Fea**ture **St**ore) is an open source feature store for machine learning. Feast is the fastest path to productionizing analytic data for model training and online inference on top of your existing infrastructure.
Note: The materialization engine is not constructed via the unified compute engine interface.
A batch materialization engine is a component of Feast that's responsible for moving data from the offline store into the online store.
A materialization engine abstracts over the specific technologies or frameworks used to materialize data. It allows users to use a pure local serialized approach (the default `LocalComputeEngine`), or to delegate materialization to separate components (e.g. AWS Lambda, as implemented by the `LambdaComputeEngine`).
If the built-in engines are not sufficient, you can create your own custom materialization engine. Please see [this guide](../../how-to-guides/customizing-feast/creating-a-custom-compute-engine.md) for more details.
Please see [feature_store.yaml](../../reference/feature-repository/feature-store-yaml.md#overview) for configuring engines.
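For instance, a minimal `feature_store.yaml` might select the engine via the `batch_engine` key. This is an illustrative sketch: the project, registry, and store values are placeholders, and the exact schema for your Feast version should be checked against the configuration reference linked above.

```yaml
project: my_project
registry: data/registry.db
provider: local
online_store:
  type: sqlite
offline_store:
  type: file
# Select the engine used for materialization; "local" is the default.
batch_engine: local
```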
**docs/getting-started/genai.md** (48 additions, 1 deletion)

The transformation workflow typically involves:
3. **Chunking**: Split documents into smaller, semantically meaningful chunks
4. **Embedding Generation**: Convert text chunks into vector embeddings
5. **Storage**: Store embeddings and metadata in Feast's feature store
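As an illustrative sketch of the chunking, embedding, and storage steps above, the pipeline below chunks a document, produces stand-in embeddings, and assembles rows for storage. The `chunk`, `embed`, and `to_rows` helpers are hypothetical: a real pipeline would use an embedding model and write the rows through Feast's APIs.

```python
def chunk(text: str, size: int = 200) -> list[str]:
    # Naive fixed-size chunking by word count.
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(chunks: list[str]) -> list[list[float]]:
    # Stand-in embedding: vowel counts per chunk (a real pipeline uses a model).
    return [[float(c.count(v)) for v in "aeiou"] for c in chunks]

def to_rows(chunks: list[str], vectors: list[list[float]]) -> list[dict]:
    # Rows of embeddings plus metadata, ready to store alongside the raw text.
    return [{"chunk_id": i, "text": c, "embedding": e}
            for i, (c, e) in enumerate(zip(chunks, vectors))]
```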
### Feature Transformation for LLMs
Feast supports transformations that can be used to:
This integration enables:
- Efficiently materializing features to vector databases
- Scaling RAG applications to enterprise-level document repositories
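At serving time, RAG retrieval reduces to a nearest-neighbor search over the stored embeddings. The pure-Python cosine-similarity sketch below illustrates what a vector database does at scale; the names and toy vectors are illustrative only.

```python
import math

def cosine(u: list[float], v: list[float]) -> float:
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def top_k(query: list[float], store: list[tuple], k: int = 2) -> list:
    # store: list of (chunk_id, embedding); return ids of the k nearest chunks.
    ranked = sorted(store, key=lambda item: cosine(query, item[1]), reverse=True)
    return [chunk_id for chunk_id, _ in ranked[:k]]
```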
## Model Context Protocol (MCP) Support
Feast supports the Model Context Protocol (MCP), which enables AI agents and applications to interact with your feature store through standardized MCP interfaces. This allows seamless integration with LLMs and AI agents for GenAI applications.
### Key Benefits of MCP Support
* **Standardized AI Integration**: Enable AI agents to discover and use features dynamically without hardcoded definitions
* **Easy Setup**: Add MCP support with a simple configuration change and `pip install feast[mcp]`
* **Agent-Friendly APIs**: Expose feature store capabilities through MCP tools that AI agents can understand and use
* **Production Ready**: Built on top of Feast's proven feature serving infrastructure
### Getting Started with MCP
1. **Install MCP support**:

   ```bash
   pip install feast[mcp]
   ```
2. **Configure your feature store** to use MCP:

   ```yaml
   feature_server:
     type: mcp
     enabled: true
     mcp_enabled: true
     mcp_server_name: "feast-feature-store"
     mcp_server_version: "1.0.0"
   ```
### How It Works
The MCP integration uses the `fastapi_mcp` library to automatically transform your Feast feature server's FastAPI endpoints into MCP-compatible tools. When you enable MCP support:
1. **Automatic Discovery**: The integration scans your FastAPI application and discovers all available endpoints
2. **Tool Generation**: Each endpoint becomes an MCP tool with auto-generated schemas and descriptions
3. **Dynamic Access**: AI agents can discover and call these tools dynamically without hardcoded definitions
4. **Standard Protocol**: Uses the Model Context Protocol for standardized AI-to-API communication
### Available MCP Tools
The fastapi_mcp integration automatically exposes your Feast feature server's FastAPI endpoints as MCP tools. This means AI assistants can:
* **Call `/get-online-features`** to retrieve features from the feature store
* **Use `/health`** to check server status
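As a sketch, whether invoked directly or through an MCP tool, `/get-online-features` takes a JSON body naming feature references and entity keys. The feature and entity names below are hypothetical, and the feature server address is an assumption to check against your deployment.

```python
import json

# Hypothetical feature references and entity rows for the request body.
payload = {
    "features": ["driver_stats:conv_rate", "driver_stats:acc_rate"],
    "entities": {"driver_id": [1001, 1002]},
}
body = json.dumps(payload).encode("utf-8")
# POST `body` to the feature server (e.g. http://localhost:6566/get-online-features)
# using urllib.request or any HTTP client, with a running `feast serve` process.
```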
For a complete example, see the [MCP Feature Store Example](../../examples/mcp_feature_store/).
**docs/how-to-guides/customizing-feast/creating-a-custom-compute-engine.md** (16 additions, 10 deletions)
# Adding a custom compute engine
### Overview
Feast batch materialization operations (`materialize` and `materialize-incremental`), as well as `get_historical_features`, are executed through a `ComputeEngine`.
Custom batch compute engines allow Feast users to extend Feast to customize the materialization and `get_historical_features` process. Examples include:
* Setting up custom materialization-specific infrastructure during `feast apply` (e.g. setting up Spark clusters or Lambda Functions)
* Tearing down custom materialization-specific infrastructure during `feast teardown` (e.g. tearing down Spark clusters, or deleting Lambda Functions)
Feast comes with built-in compute engines, e.g., `LocalComputeEngine`, and an experimental `LambdaComputeEngine`. However, users can develop their own compute engines by creating a class that implements the contract in the [ComputeEngine class](https://github.com/feast-dev/feast/blob/85514edbb181df083e6a0d24672c00f0624dcaa3/sdk/python/feast/infra/compute_engines/base.py#L19).
### Guide
The fastest way to add custom logic to Feast is to implement the `ComputeEngine` interface. The guide that follows builds a custom engine whose operations print text to the console. It is up to you as a developer to add your custom code to the engine methods, but the guide below will provide the necessary scaffolding to get you started.
#### Step 1: Define an Engine class
The first step is to define a custom compute engine class. We've created the `MyCustomEngine` below. This python file can be placed in your `feature_repo` directory if you're following the Quickstart guide.
```python
from typing import List, Sequence, Union
from feast.entity import Entity
from feast.feature_view import FeatureView
from feast.batch_feature_view import BatchFeatureView
from feast.stream_feature_view import StreamFeatureView
from feast.infra.common.retrieval_task import HistoricalRetrievalTask
from feast.infra.compute_engines.local.job import LocalMaterializationJob
from feast.infra.compute_engines.base import ComputeEngine
from feast.infra.common.materialization_job import MaterializationTask
from feast.infra.offline_stores.offline_store import OfflineStore, RetrievalJob
from feast.infra.online_stores.online_store import OnlineStore
from feast.repo_config import RepoConfig
class MyCustomEngine(ComputeEngine):
    def __init__(
        self,
        *,
        # ... remainder of the signature and the overridden methods are
        # elided in this diff view ...
```
Notice how in the above engine we have only overridden two of the methods on the `LocalComputeEngine`, namely `update` and `materialize`. These two methods are convenient to replace if you are planning to launch custom batch jobs.
If you want the compute engine to execute `get_historical_features`, you will need to implement the `get_historical_features` method as well.
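Once defined, the custom engine is referenced from `feature_store.yaml`. The module path below is a placeholder, and the exact key and format accepted by your Feast version should be checked against the configuration reference:

```yaml
# feature_repo/feature_store.yaml (illustrative)
project: my_project
provider: local
batch_engine:
  type: feature_repo.engine.MyCustomEngine
```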