diff --git a/docs/SUMMARY.md b/docs/SUMMARY.md
index 27a84d31213..f5ffcd3478e 100644
--- a/docs/SUMMARY.md
+++ b/docs/SUMMARY.md
@@ -163,6 +163,7 @@
* [\[Alpha\] Vector Database](reference/alpha-vector-database.md)
* [\[Alpha\] Data quality monitoring](reference/dqm.md)
* [\[Alpha\] Streaming feature computation with Denormalized](reference/denormalized.md)
+* [OpenLineage Integration](reference/openlineage.md)
* [Feast CLI reference](reference/feast-cli-commands.md)
* [Python API reference](http://rtd.feast.dev)
* [Usage](reference/usage.md)
diff --git a/docs/reference/openlineage.md b/docs/reference/openlineage.md
new file mode 100644
index 00000000000..01837c9936a
--- /dev/null
+++ b/docs/reference/openlineage.md
@@ -0,0 +1,218 @@
+# OpenLineage Integration
+
+Feast provides **native integration** with [OpenLineage](https://openlineage.io/), enabling automatic data lineage tracking for ML feature engineering workflows.
+
+## Overview
+
+When enabled, the integration **automatically** emits OpenLineage events for:
+
+- **Registry changes** - Events when feature views, feature services, and entities are applied
+- **Feature materialization** - START, COMPLETE, and FAIL events when features are materialized
+
+**No code changes required** - just enable OpenLineage in your `feature_store.yaml`!
+
+## Installation
+
+OpenLineage is an optional dependency. Install it with:
+
+```bash
+pip install openlineage-python
+```
+
+Or install Feast with the OpenLineage extra:
+
+```bash
+pip install feast[openlineage]
+```
+
+## Configuration
+
+Add the `openlineage` section to your `feature_store.yaml`:
+
+```yaml
+project: my_project
+registry: data/registry.db
+provider: local
+online_store:
+ type: sqlite
+ path: data/online_store.db
+
+openlineage:
+ enabled: true
+ transport_type: http
+ transport_url: http://localhost:5000
+ transport_endpoint: api/v1/lineage
+ namespace: feast
+ emit_on_apply: true
+ emit_on_materialize: true
+```
+
+Once configured, `feast apply` and feature materialization automatically emit lineage events.
+
+### Environment Variables
+
+You can also configure via environment variables:
+
+```bash
+export FEAST_OPENLINEAGE_ENABLED=true
+export FEAST_OPENLINEAGE_TRANSPORT_TYPE=http
+export FEAST_OPENLINEAGE_URL=http://localhost:5000
+export FEAST_OPENLINEAGE_ENDPOINT=api/v1/lineage
+export FEAST_OPENLINEAGE_NAMESPACE=feast
+```
+
+## Usage
+
+Once configured, lineage is tracked automatically:
+
+```python
+from feast import FeatureStore
+from datetime import datetime, timedelta
+
+# Create FeatureStore - OpenLineage is initialized automatically if configured
+fs = FeatureStore(repo_path="feature_repo")
+
+# Apply operations emit lineage events automatically
+fs.apply([driver_entity, driver_hourly_stats_view])
+
+# Materialize emits START, COMPLETE/FAIL events automatically
+fs.materialize(
+ start_date=datetime.now() - timedelta(days=1),
+ end_date=datetime.now()
+)
+```
+
+## Configuration Options
+
+| Option | Default | Description |
+|--------|---------|-------------|
+| `enabled` | `false` | Enable/disable OpenLineage integration |
+| `transport_type` | `http` | Transport type: `http`, `file`, `kafka` |
+| `transport_url` | - | URL for HTTP transport (required) |
+| `transport_endpoint` | `api/v1/lineage` | API endpoint for HTTP transport |
+| `api_key` | - | Optional API key for authentication |
+| `namespace` | `feast` | Namespace for lineage events (uses project name if set to "feast") |
+| `producer` | `feast` | Producer identifier |
+| `emit_on_apply` | `true` | Emit events on `feast apply` |
+| `emit_on_materialize` | `true` | Emit events on materialization |
+
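The options above can be mirrored as a plain settings object. This sketch (the class name is illustrative, not Feast's internal config class) shows the effective defaults when a key is omitted from `feature_store.yaml`:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class OpenLineageOptions:
    """Illustrative mirror of the options table; names follow the YAML keys."""

    enabled: bool = False
    transport_type: str = "http"
    transport_url: Optional[str] = None  # required for the http transport
    transport_endpoint: str = "api/v1/lineage"
    api_key: Optional[str] = None
    namespace: str = "feast"
    producer: str = "feast"
    emit_on_apply: bool = True
    emit_on_materialize: bool = True
    additional_config: Dict[str, str] = field(default_factory=dict)


# Defaults match the table: integration stays off until explicitly enabled.
opts = OpenLineageOptions()
```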
+## Lineage Graph Structure
+
+When you run `feast apply`, Feast creates a lineage graph that matches the Feast UI:
+
+```
+DataSources ──┐
+ ├──→ feast_feature_views_{project} ──→ FeatureViews
+Entities ─────┘ │
+ │
+ ▼
+ feature_service_{name} ──→ FeatureService
+```
+
+**Jobs created:**
+- `feast_feature_views_{project}`: Shows DataSources + Entities → FeatureViews
+- `feature_service_{name}`: Shows specific FeatureViews → FeatureService (one per service)
+
+**Datasets include:**
+- Schema with feature names, types, descriptions, and tags
+- Feast-specific facets with metadata (TTL, entities, owner, etc.)
+- Documentation facets with descriptions
+
+## Transport Types
+
+### HTTP Transport (Recommended for Production)
+
+```yaml
+openlineage:
+ enabled: true
+ transport_type: http
+ transport_url: http://marquez:5000
+ transport_endpoint: api/v1/lineage
+ api_key: your-api-key # Optional
+```
+
+### File Transport
+
+```yaml
+openlineage:
+ enabled: true
+ transport_type: file
+ additional_config:
+ log_file_path: openlineage_events.json
+```
+
+### Kafka Transport
+
+```yaml
+openlineage:
+ enabled: true
+ transport_type: kafka
+ additional_config:
+ bootstrap_servers: localhost:9092
+ topic: openlineage.events
+```
+
+## Custom Feast Facets
+
+The integration includes custom Feast-specific facets in lineage events:
+
+### FeastFeatureViewFacet
+
+Captures metadata about feature views:
+- `name`: Feature view name
+- `ttl_seconds`: Time-to-live in seconds
+- `entities`: List of entity names
+- `features`: List of feature names
+- `online_enabled` / `offline_enabled`: Store configuration
+- `description`: Feature view description
+- `tags`: Key-value tags
+
+### FeastFeatureServiceFacet
+
+Captures metadata about feature services:
+- `name`: Feature service name
+- `feature_views`: List of feature view names
+- `feature_count`: Total number of features
+- `description`: Feature service description
+- `tags`: Key-value tags
+
+### FeastMaterializationFacet
+
+Captures materialization run metadata:
+- `feature_views`: Feature views being materialized
+- `start_date` / `end_date`: Materialization window
+- `rows_written`: Number of rows written
+
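A minimal sketch of how such a facet might look as a plain data object. The field names come from the lists above, but the constructor shape and defaults shown here are assumptions, not Feast's actual facet classes (which also carry OpenLineage schema-URL metadata):

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class FeastFeatureViewFacet:
    """Illustrative shape only; field names follow the list above."""

    name: str
    ttl_seconds: Optional[int] = None
    entities: List[str] = field(default_factory=list)
    features: List[str] = field(default_factory=list)
    online_enabled: bool = True
    offline_enabled: bool = True
    description: str = ""
    tags: Dict[str, str] = field(default_factory=dict)


facet = FeastFeatureViewFacet(
    name="driver_hourly_stats",
    ttl_seconds=86400,  # a 1-day TTL expressed in seconds
    entities=["driver_id"],
    features=["conv_rate", "acc_rate", "avg_daily_trips"],
)
```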
+## Lineage Visualization
+
+Use [Marquez](https://marquezproject.ai/) to visualize your Feast lineage:
+
+```bash
+# Start Marquez
+docker run -p 5000:5000 -p 3000:3000 marquezproject/marquez
+
+# Configure Feast to emit to Marquez (in feature_store.yaml)
+# openlineage:
+# enabled: true
+# transport_type: http
+# transport_url: http://localhost:5000
+```
+
+Then access the Marquez UI at http://localhost:3000 to see your feature lineage.
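
If you prefer the API to the UI, Marquez also serves lineage over HTTP. A small sketch of building such a query — the `dataset:<namespace>:<name>` nodeId format and the `/api/v1/lineage` endpoint are Marquez conventions, so verify them against your Marquez version:

```python
from urllib.parse import quote, urljoin


def marquez_lineage_url(base_url: str, namespace: str, dataset: str) -> str:
    """Build a Marquez lineage-query URL for a dataset node."""
    node_id = f"dataset:{namespace}:{dataset}"
    return urljoin(base_url, f"/api/v1/lineage?nodeId={quote(node_id)}")


url = marquez_lineage_url(
    "http://localhost:5000", "openlineage_demo", "driver_hourly_stats"
)
# Fetch with e.g. urllib.request.urlopen(url) against a running Marquez.
```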
+
+## Namespace Behavior
+
+- If `namespace` is set to `"feast"` (default): Uses project name as namespace (e.g., `my_project`)
+- If `namespace` is set to a custom value: Uses `{namespace}/{project}` (e.g., `custom/my_project`)
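
That rule is small enough to sketch directly (the helper name is illustrative):

```python
def resolve_namespace(configured: str, project: str) -> str:
    """Namespace rule described above: 'feast' means 'use the project name'."""
    if configured == "feast":  # default sentinel value
        return project
    return f"{configured}/{project}"  # custom value prefixes the project


print(resolve_namespace("feast", "my_project"))   # my_project
print(resolve_namespace("custom", "my_project"))  # custom/my_project
```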
+
+## Feast to OpenLineage Mapping
+
+| Feast Concept | OpenLineage Concept |
+|---------------|---------------------|
+| DataSource | InputDataset |
+| FeatureView | OutputDataset (of feature views job) / InputDataset (of feature service job) |
+| Feature | Schema field |
+| Entity | InputDataset |
+| FeatureService | OutputDataset |
+| Materialization | RunEvent (START/COMPLETE/FAIL) |
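
Put together, one applied project might surface as a run event shaped roughly like this. Dataset and job names follow the conventions documented above, but the exact payload Feast emits may carry additional facets:

```python
# Rough shape of an OpenLineage run event for the feature-views job;
# the names below match the demo project, not a fixed Feast output.
import uuid
from datetime import datetime, timezone

run_event = {
    "eventType": "COMPLETE",
    "eventTime": datetime.now(timezone.utc).isoformat(),
    "producer": "feast",
    "run": {"runId": str(uuid.uuid4())},
    "job": {"namespace": "my_project", "name": "feast_feature_views_my_project"},
    # DataSources and Entities map to InputDatasets:
    "inputs": [
        {"namespace": "my_project", "name": "driver_stats_source"},
        {"namespace": "my_project", "name": "driver_id"},
    ],
    # FeatureViews map to OutputDatasets of this job:
    "outputs": [
        {"namespace": "my_project", "name": "driver_hourly_stats"},
    ],
}
```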
diff --git a/examples/openlineage-integration/README.md b/examples/openlineage-integration/README.md
new file mode 100644
index 00000000000..43dc0d29da8
--- /dev/null
+++ b/examples/openlineage-integration/README.md
@@ -0,0 +1,58 @@
+# Feast OpenLineage Integration Example
+
+This example demonstrates Feast's **native OpenLineage integration** for automatic data lineage tracking.
+
+For full documentation, see the [OpenLineage Reference](../../docs/reference/openlineage.md).
+
+## Prerequisites
+
+```bash
+pip install feast[openlineage]
+```
+
+## Running the Demo
+
+1. Start Marquez:
+```bash
+docker run -p 5000:5000 -p 3000:3000 marquezproject/marquez
+```
+
+2. Run the demo:
+```bash
+python openlineage_demo.py --url http://localhost:5000
+```
+
+3. View lineage at http://localhost:3000
+
+## What the Demo Shows
+
+The demo creates a sample feature repository and demonstrates:
+
+- **Entity**: `driver_id`
+- **DataSource**: `driver_stats_source` (Parquet file)
+- **FeatureView**: `driver_hourly_stats` with features such as conversion rate, acceptance rate, and average daily trips
+- **FeatureService**: `driver_stats_service` aggregating features
+
+When you run the demo, it will:
+1. Create the feature store with OpenLineage enabled
+2. Apply the features (emits lineage events)
+3. Materialize features (emits START/COMPLETE events)
+4. Retrieve features (demonstrates online feature retrieval)
+
+## Lineage Graph
+
+After running the demo, you'll see this lineage in Marquez:
+
+```
+driver_stats_source ──┐
+ ├──→ feast_feature_views_openlineage_demo ──→ driver_hourly_stats
+driver_id ────────────┘ │
+ ▼
+ feature_service_driver_stats_service ──→ driver_stats_service
+```
+
+## Learn More
+
+- [Feast OpenLineage Reference](../../docs/reference/openlineage.md)
+- [OpenLineage Documentation](https://openlineage.io/docs)
+- [Marquez Project](https://marquezproject.ai)
diff --git a/examples/openlineage-integration/openlineage_demo.py b/examples/openlineage-integration/openlineage_demo.py
new file mode 100644
index 00000000000..5bcb40fb902
--- /dev/null
+++ b/examples/openlineage-integration/openlineage_demo.py
@@ -0,0 +1,211 @@
+#!/usr/bin/env python
+# Copyright 2026 The Feast Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+Feast OpenLineage Native Integration Demo
+
+This demo shows how Feast's native OpenLineage integration works.
+When OpenLineage is enabled in feature_store.yaml, lineage events
+are emitted automatically - no code changes required!
+
+Usage:
+ python openlineage_demo.py --url http://localhost:5000
+"""
+
+import argparse
+import tempfile
+from datetime import datetime, timedelta
+from pathlib import Path
+
+import pandas as pd
+
+from feast import Entity, FeatureService, FeatureStore, FeatureView, Field, FileSource
+from feast.types import Float32, Int64
+
+
+def create_feature_store_yaml(url: str) -> str:
+ """Create a feature_store.yaml with OpenLineage configuration."""
+ return f"""project: openlineage_demo
+registry: data/registry.db
+provider: local
+online_store:
+ type: sqlite
+ path: data/online_store.db
+
+openlineage:
+ enabled: true
+ transport_type: http
+ transport_url: {url}
+ transport_endpoint: api/v1/lineage
+ namespace: feast
+ emit_on_apply: true
+ emit_on_materialize: true
+"""
+
+
+def run_demo(url: str):
+ """Run the OpenLineage integration demo."""
+ print("Feast OpenLineage Native Integration Demo")
+
+ # Create temporary directory for the demo
+ with tempfile.TemporaryDirectory() as tmpdir:
+ repo_path = Path(tmpdir)
+ data_dir = repo_path / "data"
+ data_dir.mkdir(exist_ok=True)
+
+ # Create feature_store.yaml with OpenLineage configuration
+ feature_store_yaml = create_feature_store_yaml(url)
+ (repo_path / "feature_store.yaml").write_text(feature_store_yaml)
+
+ print(f"Created demo repository at: {repo_path}")
+        print("feature_store.yaml:")
+ print("-" * 50)
+ print(feature_store_yaml)
+ print("-" * 50)
+
+        try:
+            import openlineage.client  # noqa: F401  (availability check only)
+        except ImportError:
+            raise SystemExit(
+                "OpenLineage client not installed. "
+                "Install it with: pip install openlineage-python"
+            )
+
+ fs = FeatureStore(repo_path=str(repo_path))
+ driver_stats_df = pd.DataFrame(
+ {
+ "driver_id": [1001, 1002, 1003, 1001, 1002, 1003],
+ "conv_rate": [0.85, 0.72, 0.91, 0.87, 0.75, 0.89],
+ "acc_rate": [0.95, 0.88, 0.97, 0.94, 0.90, 0.96],
+ "avg_daily_trips": [12, 8, 15, 14, 9, 16],
+ "event_timestamp": pd.to_datetime(
+ [
+ "2024-01-01 10:00:00",
+ "2024-01-01 10:00:00",
+ "2024-01-01 10:00:00",
+ "2024-01-02 10:00:00",
+ "2024-01-02 10:00:00",
+ "2024-01-02 10:00:00",
+ ],
+ utc=True,
+ ),
+ "created": pd.to_datetime(["2024-01-01"] * 6, utc=True),
+ }
+ )
+
+ parquet_path = data_dir / "driver_stats.parquet"
+ driver_stats_df.to_parquet(str(parquet_path))
+
+ driver = Entity(name="driver_id", description="Driver identifier")
+
+ driver_stats_source = FileSource(
+ name="driver_stats_source",
+ path=str(parquet_path),
+ timestamp_field="event_timestamp",
+ created_timestamp_column="created",
+ description="Driver statistics from data warehouse",
+ )
+
+ driver_hourly_stats_view = FeatureView(
+ name="driver_hourly_stats",
+ entities=[driver],
+ ttl=timedelta(days=1),
+ schema=[
+ Field(name="conv_rate", dtype=Float32, description="Conversion rate"),
+ Field(name="acc_rate", dtype=Float32, description="Acceptance rate"),
+ Field(
+ name="avg_daily_trips", dtype=Int64, description="Average daily trips"
+ ),
+ ],
+ source=driver_stats_source,
+ description="Hourly driver performance statistics",
+ tags={"team": "driver_performance", "priority": "high"},
+ )
+
+ driver_stats_service = FeatureService(
+ name="driver_stats_service",
+ features=[driver_hourly_stats_view],
+ description="Driver statistics for real-time scoring",
+ tags={"use_case": "scoring"},
+ )
+
+ try:
+ fs.apply(
+ [driver, driver_stats_source, driver_hourly_stats_view, driver_stats_service]
+ )
+ print("Applied entities, feature views, and feature services")
+ print("OpenLineage events emitted automatically:")
+ print(" - feast_feature_views_openlineage_demo (DataSources → FeatureViews)")
+ print(" - feature_service_driver_stats_service (FeatureViews → FeatureService)")
+ except Exception as e:
+ print(f"Apply failed: {e}")
+
+ start_date = datetime(
+ 2024, 1, 1, tzinfo=driver_stats_df["event_timestamp"].dt.tz
+ )
+ end_date = datetime(2024, 1, 3, tzinfo=driver_stats_df["event_timestamp"].dt.tz)
+ fs.materialize(start_date=start_date, end_date=end_date)
+
+ # Retrieve online features (no OpenLineage events emitted for retrieval)
+ online_features = fs.get_online_features(
+ features=["driver_hourly_stats:conv_rate", "driver_hourly_stats:acc_rate"],
+ entity_rows=[{"driver_id": 1001}, {"driver_id": 1002}],
+ )
+ print(f"Retrieved online features: {online_features.to_dict()}")
+ print(
+ """
+The native OpenLineage integration works automatically when configured.
+
+Lineage Graph Created:
+ DataSources + Entities → feast_feature_views_{project} → FeatureViews
+ FeatureViews → feature_service_{name} → FeatureServices
+
+Key benefits:
+ - No code changes required
+ - Just add 'openlineage' section to feature_store.yaml
+ - All operations emit lineage events automatically
+ - Feature metadata (tags, descriptions) included in lineage
+ - Non-blocking and fail-safe (won't break Feast operations)
+
+View your lineage at: http://localhost:3000
+"""
+ )
+
+
+def main():
+ parser = argparse.ArgumentParser(
+ description="Feast OpenLineage Native Integration Demo",
+ formatter_class=argparse.RawDescriptionHelpFormatter,
+ epilog="""
+Example:
+ # Start Marquez first:
+ docker run -p 5000:5000 -p 3000:3000 marquezproject/marquez
+
+ # Run the demo:
+ python openlineage_demo.py --url http://localhost:5000
+
+ # View lineage at http://localhost:3000
+""",
+ )
+ parser.add_argument(
+ "--url",
+ required=True,
+ help="OpenLineage HTTP URL (e.g., http://localhost:5000)",
+ )
+
+ args = parser.parse_args()
+
+ run_demo(args.url)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/infra/website/docs/blog/feast-openlineage-integration.md b/infra/website/docs/blog/feast-openlineage-integration.md
new file mode 100644
index 00000000000..3c8daa960b7
--- /dev/null
+++ b/infra/website/docs/blog/feast-openlineage-integration.md
@@ -0,0 +1,196 @@
+---
+title: Tracking Feature Lineage with OpenLineage
+description: Feast now supports native OpenLineage integration for automatic data lineage tracking of your ML features - no code changes required.
+date: 2026-01-26
+authors: ["Nikhil Kathole", "Francisco Javier Arceo"]
+---
+
+
+

+
+
+# Tracking Feature Lineage with OpenLineage 🔗
+
+# Feast and OpenLineage
+
+Understanding where your ML features come from and how they flow through your system is critical for debugging, compliance, and governance. We are excited to announce that Feast now supports native integration with [OpenLineage](https://openlineage.io/), the open standard for data lineage collection and analysis.
+
+With this integration, Feast automatically tracks and emits lineage events whenever you apply feature definitions or materialize features—**no code changes required**. Simply enable OpenLineage in your `feature_store.yaml`, and Feast handles the rest.
+
+# Why Data Lineage Matters for Feature Stores
+
+Feature stores manage the lifecycle of ML features, from raw data sources to model inference. As ML systems grow in complexity, teams often struggle to answer fundamental questions:
+
+- *Where does this feature's data come from?*
+- *Which models depend on this feature view?*
+- *What downstream impact will changing this data source have?*
+- *How do I audit the data flow for compliance?*
+
+OpenLineage solves these challenges by providing a standardized way to capture and visualize data lineage. By integrating OpenLineage into Feast, ML teams gain automatic visibility into their feature engineering pipelines without manual instrumentation.
+
+# How It Works
+
+The integration automatically emits OpenLineage events for two key operations:
+
+## Registry Changes (`feast apply`)
+
+When you run `feast apply`, Feast creates a lineage graph that mirrors what you see in the Feast UI:
+
+```
+DataSources ──┐
+ ├──→ feast_feature_views_{project} ──→ FeatureViews
+Entities ─────┘ │
+ │
+ ▼
+ feature_service_{name} ──→ FeatureService
+```
+
+This creates two types of jobs:
+- **`feast_feature_views_{project}`**: Shows how DataSources and Entities flow into FeatureViews
+- **`feature_service_{name}`**: Shows which FeatureViews compose each FeatureService
+
+## Feature Materialization (`feast materialize`)
+
+When materializing features, Feast emits START, COMPLETE, and FAIL events, allowing you to track:
+- Which feature views were materialized
+- The time window of materialization
+- Success or failure status
+- Duration and row counts
+
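The surrounding lifecycle is the standard OpenLineage run-state pattern: emit START, run the work, then emit COMPLETE or FAIL. A conceptual sketch — the emitter interface here is illustrative, not Feast's internal API:

```python
import uuid


class LineageEmitter:
    """Toy emitter that records (run_id, state) pairs instead of sending HTTP."""

    def __init__(self):
        self.events = []

    def emit(self, run_id, state):
        self.events.append((run_id, state))


def materialize_with_lineage(emitter, do_materialize):
    run_id = str(uuid.uuid4())
    emitter.emit(run_id, "START")
    try:
        do_materialize()
        emitter.emit(run_id, "COMPLETE")
    except Exception:
        emitter.emit(run_id, "FAIL")  # record the failure, then re-raise
        raise


emitter = LineageEmitter()
materialize_with_lineage(emitter, lambda: None)
states = [state for _, state in emitter.events]
print(states)  # ['START', 'COMPLETE']
```
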
+# Getting Started
+
+## Step 1: Install OpenLineage
+
+```bash
+pip install feast[openlineage]
+```
+
+## Step 2: Configure Your Feature Store
+
+Add the `openlineage` section to your `feature_store.yaml`:
+
+```yaml
+project: my_fraud_detection
+registry: data/registry.db
+provider: local
+online_store:
+ type: sqlite
+ path: data/online_store.db
+
+openlineage:
+ enabled: true
+ transport_type: http
+ transport_url: http://localhost:5000
+ namespace: feast
+```
+
+## Step 3: Start Marquez (Optional)
+
+[Marquez](https://marquezproject.ai/) is the reference implementation for OpenLineage and provides a beautiful UI for exploring lineage:
+
+```bash
+docker run -p 5000:5000 -p 3000:3000 marquezproject/marquez
+```
+
+## Step 4: Apply Your Features
+
+```python
+from feast import FeatureStore
+
+fs = FeatureStore(repo_path="feature_repo")
+
+# This automatically emits lineage events!
+fs.apply([
+ driver_entity,
+ driver_stats_source,
+ driver_hourly_stats_view,
+ driver_stats_service
+])
+```
+
+Visit http://localhost:3000 to see your lineage graph in Marquez!
+
+# Rich Metadata Tracking
+
+The integration doesn't just track relationships—it captures comprehensive metadata about your Feast objects:
+
+**Feature Views**
+- Feature names, types, and descriptions
+- TTL (time-to-live) configuration
+- Associated entities
+- Custom tags
+- Online/offline store enablement
+
+**Feature Services**
+- Constituent feature views
+- Total feature count
+- Service-level descriptions and tags
+
+**Data Sources**
+- Source type (File, BigQuery, Snowflake, etc.)
+- Connection URIs
+- Timestamp fields
+- Field mappings
+
+All this metadata is attached as OpenLineage facets, making it queryable and explorable in any OpenLineage-compatible tool.
+
+# Try It Out: Complete Working Example
+
+We've included a complete working example in the Feast repository that demonstrates the OpenLineage integration end-to-end. The example creates a driver statistics feature store and shows how lineage events are automatically emitted.
+
+**Run the example:**
+
+```bash
+# Start Marquez first
+docker run -p 5000:5000 -p 3000:3000 marquezproject/marquez
+
+# Clone and run the example
+cd feast/examples/openlineage-integration
+python openlineage_demo.py --url http://localhost:5000
+
+# View lineage at http://localhost:3000
+```
+
+The example demonstrates:
+- Creating entities, data sources, feature views, and feature services
+- Automatic lineage emission on `feast apply`
+- Materialization tracking with START/COMPLETE events
+- Feature retrieval (no lineage events for retrieval operations)
+
+In Marquez, you'll see the complete lineage graph:
+- `driver_stats_source` (DataSource) → `driver_hourly_stats` (FeatureView)
+- `driver_id` (Entity) → `driver_hourly_stats` (FeatureView)
+- `driver_hourly_stats` (FeatureView) → `driver_stats_service` (FeatureService)
+
+
+

+
+
+Check out the [full example code](https://github.com/feast-dev/feast/tree/master/examples/openlineage-integration) for complete details including feature definitions with descriptions and tags.
+
+# Benefits for ML Teams
+
+**Debugging Made Easy**
+When a model's predictions degrade, trace back through the lineage to identify which data source or feature transformation changed.
+
+**Impact Analysis**
+Before modifying a data source, understand all downstream feature views and services that will be affected.
+
+**Compliance & Audit**
+Maintain a complete audit trail of data flow for regulatory requirements like GDPR, CCPA, or SOC2.
+
+**Documentation**
+Auto-generated lineage serves as living documentation that stays in sync with your actual feature store configuration.
+
+**Cross-Team Collaboration**
+Data engineers, ML engineers, and data scientists can all view the same lineage graph to understand the feature store structure.
+
+# How Can I Get Started?
+
+This integration is available now in the latest version of Feast. To get started:
+
+1. Check out the [OpenLineage Integration documentation](https://docs.feast.dev/reference/openlineage)
+2. Try the [example in the Feast repository](https://github.com/feast-dev/feast/tree/master/examples/openlineage-integration)
+3. Join the [Feast Slack](https://slack.feast.dev) to share feedback and ask questions
+
+We're excited to see how teams use OpenLineage integration to improve their ML operations and welcome feedback from the community!
diff --git a/infra/website/public/images/blog/openlineage1.png b/infra/website/public/images/blog/openlineage1.png
new file mode 100644
index 00000000000..50119ab14c3
Binary files /dev/null and b/infra/website/public/images/blog/openlineage1.png differ
diff --git a/infra/website/public/images/blog/openlineage2.png b/infra/website/public/images/blog/openlineage2.png
new file mode 100644
index 00000000000..f5a43c0f411
Binary files /dev/null and b/infra/website/public/images/blog/openlineage2.png differ
diff --git a/pyproject.toml b/pyproject.toml
index e8052558eb6..e04fbb320a8 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -104,6 +104,7 @@ milvus = [
mssql = ["ibis-framework[mssql]>=10.0.0"]
mysql = ["pymysql", "types-PyMySQL"]
opentelemetry = ["prometheus_client", "psutil"]
+openlineage = ["openlineage-python>=1.40.0"]
spark = ["pyspark>=4.0.0"]
trino = ["trino>=0.305.0,<0.400.0", "regex"]
postgres = ["psycopg[binary,pool]==3.2.5"]
@@ -179,7 +180,7 @@ ci = [
"types-setuptools",
"types-tabulate",
"virtualenv<20.24.2",
- "feast[aws, azure, cassandra, clickhouse, couchbase, delta, docling, duckdb, elasticsearch, faiss, gcp, ge, go, grpcio, hazelcast, hbase, ibis, ikv, image, k8s, mcp, milvus, mssql, mysql, opentelemetry, spark, trino, postgres, pytorch, qdrant, rag, ray, redis, singlestore, snowflake, sqlite_vec]"
+ "feast[aws, azure, cassandra, clickhouse, couchbase, delta, docling, duckdb, elasticsearch, faiss, gcp, ge, go, grpcio, hazelcast, hbase, ibis, ikv, image, k8s, mcp, milvus, mssql, mysql, openlineage, opentelemetry, spark, trino, postgres, pytorch, qdrant, rag, ray, redis, singlestore, snowflake, sqlite_vec]"
]
nlp = ["feast[docling, image, milvus, pytorch, rag]"]
dev = ["feast[ci]"]
diff --git a/sdk/python/feast/feature_store.py b/sdk/python/feast/feature_store.py
index e663a9eac29..db635d53d04 100644
--- a/sdk/python/feast/feature_store.py
+++ b/sdk/python/feast/feature_store.py
@@ -111,12 +111,14 @@ class FeatureStore:
repo_path: The path to the feature repo.
_registry: The registry for the feature store.
_provider: The provider for the feature store.
+ _openlineage_emitter: Optional OpenLineage emitter for lineage tracking.
"""
config: RepoConfig
repo_path: Path
_registry: BaseRegistry
_provider: Provider
+ _openlineage_emitter: Optional[Any] = None
def __init__(
self,
@@ -182,6 +184,30 @@ def __init__(
self._provider = get_provider(self.config)
+ # Initialize OpenLineage emitter if configured
+ self._openlineage_emitter = self._init_openlineage_emitter()
+
+ def _init_openlineage_emitter(self) -> Optional[Any]:
+ """Initialize OpenLineage emitter if configured and enabled."""
+ try:
+ if (
+ hasattr(self.config, "openlineage")
+ and self.config.openlineage is not None
+ and self.config.openlineage.enabled
+ ):
+ from feast.openlineage import FeastOpenLineageEmitter
+
+ ol_config = self.config.openlineage.to_openlineage_config()
+ emitter = FeastOpenLineageEmitter(ol_config)
+ if emitter.is_enabled:
+ return emitter
+ except ImportError:
+ # OpenLineage not installed, silently skip
+ pass
+ except Exception as e:
+ warnings.warn(f"Failed to initialize OpenLineage emitter: {e}")
+ return None
+
def __repr__(self) -> str:
return (
f"FeatureStore(\n"
@@ -860,6 +886,23 @@ def _apply_diffs(
if progress_ctx:
progress_ctx.cleanup()
+ # Emit OpenLineage events for applied objects
+ self._emit_openlineage_apply_diffs(registry_diff)
+
+ def _emit_openlineage_apply_diffs(self, registry_diff: RegistryDiff):
+ """Emit OpenLineage events for objects applied via diffs."""
+ if self._openlineage_emitter is None:
+ return
+
+ # Collect all objects that were added or updated
+ objects: List[Any] = []
+ for feast_object_diff in registry_diff.feast_object_diffs:
+ if feast_object_diff.new_feast_object is not None:
+ objects.append(feast_object_diff.new_feast_object)
+
+ if objects:
+ self._emit_openlineage_apply(objects)
+
def apply(
self,
objects: Union[
@@ -1134,6 +1177,18 @@ def apply(
if self.config.registry.cache_mode == "sync":
self.refresh_registry()
+ # Emit OpenLineage events for applied objects
+ self._emit_openlineage_apply(objects)
+
+ def _emit_openlineage_apply(self, objects: List[Any]):
+ """Emit OpenLineage events for applied objects."""
+ if self._openlineage_emitter is None:
+ return
+ try:
+ self._openlineage_emitter.emit_apply(objects, self.project)
+ except Exception as e:
+ warnings.warn(f"Failed to emit OpenLineage apply events: {e}")
+
def teardown(self):
"""Tears down all local and cloud resources for the feature store."""
tables: List[FeatureView] = []
@@ -1543,81 +1598,97 @@ def materialize_incremental(
len(feature_views_to_materialize),
self.config.online_store.type,
)
- # TODO paging large loads
- for feature_view in feature_views_to_materialize:
- if isinstance(feature_view, OnDemandFeatureView):
- if feature_view.write_to_online_store:
- source_fvs = {
- self._get_feature_view(p.name)
- for p in feature_view.source_feature_view_projections.values()
- }
- max_ttl = timedelta(0)
- for fv in source_fvs:
- if fv.ttl and fv.ttl > max_ttl:
- max_ttl = fv.ttl
-
- if max_ttl.total_seconds() > 0:
- odfv_start_date = end_date - max_ttl
+
+ # Emit OpenLineage START event for incremental materialization
+ ol_run_id = self._emit_openlineage_materialize_start(
+ feature_views_to_materialize, None, end_date
+ )
+
+ try:
+ # TODO paging large loads
+ for feature_view in feature_views_to_materialize:
+ if isinstance(feature_view, OnDemandFeatureView):
+ if feature_view.write_to_online_store:
+ source_fvs = {
+ self._get_feature_view(p.name)
+ for p in feature_view.source_feature_view_projections.values()
+ }
+ max_ttl = timedelta(0)
+ for fv in source_fvs:
+ if fv.ttl and fv.ttl > max_ttl:
+ max_ttl = fv.ttl
+
+ if max_ttl.total_seconds() > 0:
+ odfv_start_date = end_date - max_ttl
+ else:
+ odfv_start_date = end_date - timedelta(weeks=52)
+
+ print(
+ f"{Style.BRIGHT + Fore.GREEN}{feature_view.name}{Style.RESET_ALL}:"
+ )
+ self._materialize_odfv(
+ feature_view,
+ odfv_start_date,
+ end_date,
+ full_feature_names=full_feature_names,
+ )
+ continue
+
+ start_date = feature_view.most_recent_end_time
+ if start_date is None:
+ if feature_view.ttl is None:
+ raise Exception(
+ f"No start time found for feature view {feature_view.name}. materialize_incremental() requires"
+ f" either a ttl to be set or for materialize() to have been run at least once."
+ )
+ elif feature_view.ttl.total_seconds() > 0:
+ start_date = _utc_now() - feature_view.ttl
else:
- odfv_start_date = end_date - timedelta(weeks=52)
+ # TODO(felixwang9817): Find the earliest timestamp for this specific feature
+ # view from the offline store, and set the start date to that timestamp.
+ print(
+ f"Since the ttl is 0 for feature view {Style.BRIGHT + Fore.GREEN}{feature_view.name}{Style.RESET_ALL}, "
+ "the start date will be set to 1 year before the current time."
+ )
+ start_date = _utc_now() - timedelta(weeks=52)
+ provider = self._get_provider()
+ print(
+ f"{Style.BRIGHT + Fore.GREEN}{feature_view.name}{Style.RESET_ALL}"
+ f" from {Style.BRIGHT + Fore.GREEN}{utils.make_tzaware(start_date.replace(microsecond=0))}{Style.RESET_ALL}"
+ f" to {Style.BRIGHT + Fore.GREEN}{utils.make_tzaware(end_date.replace(microsecond=0))}{Style.RESET_ALL}:"
+ )
- print(
- f"{Style.BRIGHT + Fore.GREEN}{feature_view.name}{Style.RESET_ALL}:"
- )
- self._materialize_odfv(
+ def tqdm_builder(length):
+ return tqdm(total=length, ncols=100)
+
+ start_date = utils.make_tzaware(start_date)
+ end_date = utils.make_tzaware(end_date) or _utc_now()
+
+ provider.materialize_single_feature_view(
+ config=self.config,
+ feature_view=feature_view,
+ start_date=start_date,
+ end_date=end_date,
+ registry=self._registry,
+ project=self.project,
+ tqdm_builder=tqdm_builder,
+ )
+ if not isinstance(feature_view, OnDemandFeatureView):
+ self._registry.apply_materialization(
feature_view,
- odfv_start_date,
+ self.project,
+ start_date,
end_date,
- full_feature_names=full_feature_names,
)
- continue
- start_date = feature_view.most_recent_end_time
- if start_date is None:
- if feature_view.ttl is None:
- raise Exception(
- f"No start time found for feature view {feature_view.name}. materialize_incremental() requires"
- f" either a ttl to be set or for materialize() to have been run at least once."
- )
- elif feature_view.ttl.total_seconds() > 0:
- start_date = _utc_now() - feature_view.ttl
- else:
- # TODO(felixwang9817): Find the earliest timestamp for this specific feature
- # view from the offline store, and set the start date to that timestamp.
- print(
- f"Since the ttl is 0 for feature view {Style.BRIGHT + Fore.GREEN}{feature_view.name}{Style.RESET_ALL}, "
- "the start date will be set to 1 year before the current time."
- )
- start_date = _utc_now() - timedelta(weeks=52)
- provider = self._get_provider()
- print(
- f"{Style.BRIGHT + Fore.GREEN}{feature_view.name}{Style.RESET_ALL}"
- f" from {Style.BRIGHT + Fore.GREEN}{utils.make_tzaware(start_date.replace(microsecond=0))}{Style.RESET_ALL}"
- f" to {Style.BRIGHT + Fore.GREEN}{utils.make_tzaware(end_date.replace(microsecond=0))}{Style.RESET_ALL}:"
+ # Emit OpenLineage COMPLETE event
+ self._emit_openlineage_materialize_complete(
+ ol_run_id, feature_views_to_materialize
)
-
- def tqdm_builder(length):
- return tqdm(total=length, ncols=100)
-
- start_date = utils.make_tzaware(start_date)
- end_date = utils.make_tzaware(end_date) or _utc_now()
-
- provider.materialize_single_feature_view(
- config=self.config,
- feature_view=feature_view,
- start_date=start_date,
- end_date=end_date,
- registry=self._registry,
- project=self.project,
- tqdm_builder=tqdm_builder,
- )
- if not isinstance(feature_view, OnDemandFeatureView):
- self._registry.apply_materialization(
- feature_view,
- self.project,
- start_date,
- end_date,
- )
+ except Exception as e:
+ # Emit OpenLineage FAIL event
+ self._emit_openlineage_materialize_fail(ol_run_id, str(e))
+ raise
def materialize(
self,
@@ -1670,46 +1741,114 @@ def materialize(
len(feature_views_to_materialize),
self.config.online_store.type,
)
- # TODO paging large loads
- for feature_view in feature_views_to_materialize:
- if isinstance(feature_view, OnDemandFeatureView):
- if feature_view.write_to_online_store:
- print(
- f"{Style.BRIGHT + Fore.GREEN}{feature_view.name}{Style.RESET_ALL}:"
- )
- self._materialize_odfv(
- feature_view,
- start_date,
- end_date,
- full_feature_names=full_feature_names,
- )
- continue
- provider = self._get_provider()
- print(f"{Style.BRIGHT + Fore.GREEN}{feature_view.name}{Style.RESET_ALL}:")
- def tqdm_builder(length):
- return tqdm(total=length, ncols=100)
+ # Emit OpenLineage START event
+ ol_run_id = self._emit_openlineage_materialize_start(
+ feature_views_to_materialize, start_date, end_date
+ )
- start_date = utils.make_tzaware(start_date)
- end_date = utils.make_tzaware(end_date)
+ try:
+ # TODO paging large loads
+ for feature_view in feature_views_to_materialize:
+ if isinstance(feature_view, OnDemandFeatureView):
+ if feature_view.write_to_online_store:
+ print(
+ f"{Style.BRIGHT + Fore.GREEN}{feature_view.name}{Style.RESET_ALL}:"
+ )
+ self._materialize_odfv(
+ feature_view,
+ start_date,
+ end_date,
+ full_feature_names=full_feature_names,
+ )
+ continue
+ provider = self._get_provider()
+ print(
+ f"{Style.BRIGHT + Fore.GREEN}{feature_view.name}{Style.RESET_ALL}:"
+ )
- provider.materialize_single_feature_view(
- config=self.config,
- feature_view=feature_view,
- start_date=start_date,
- end_date=end_date,
- registry=self._registry,
- project=self.project,
- tqdm_builder=tqdm_builder,
- disable_event_timestamp=disable_event_timestamp,
+ def tqdm_builder(length):
+ return tqdm(total=length, ncols=100)
+
+ start_date = utils.make_tzaware(start_date)
+ end_date = utils.make_tzaware(end_date)
+
+ provider.materialize_single_feature_view(
+ config=self.config,
+ feature_view=feature_view,
+ start_date=start_date,
+ end_date=end_date,
+ registry=self._registry,
+ project=self.project,
+ tqdm_builder=tqdm_builder,
+ disable_event_timestamp=disable_event_timestamp,
+ )
+
+ self._registry.apply_materialization(
+ feature_view,
+ self.project,
+ start_date,
+ end_date,
+ )
+
+ # Emit OpenLineage COMPLETE event
+ self._emit_openlineage_materialize_complete(
+ ol_run_id, feature_views_to_materialize
)
+ except Exception as e:
+ # Emit OpenLineage FAIL event
+ self._emit_openlineage_materialize_fail(ol_run_id, str(e))
+ raise
- self._registry.apply_materialization(
- feature_view,
- self.project,
- start_date,
- end_date,
+ def _emit_openlineage_materialize_start(
+ self,
+ feature_views: List[Any],
+ start_date: Optional[datetime],
+ end_date: datetime,
+ ) -> Optional[str]:
+ """Emit OpenLineage START event for materialization."""
+ if self._openlineage_emitter is None:
+ return None
+ try:
+ run_id, success = self._openlineage_emitter.emit_materialize_start(
+ feature_views, start_date, end_date, self.project
)
+ # Return run_id only if START was successfully emitted
+ # This prevents orphaned COMPLETE/FAIL events
+ return run_id if run_id and success else None
+ except Exception as e:
+ warnings.warn(f"Failed to emit OpenLineage materialize start event: {e}")
+ return None
+
+ def _emit_openlineage_materialize_complete(
+ self,
+ run_id: Optional[str],
+ feature_views: List[Any],
+ ):
+ """Emit OpenLineage COMPLETE event for materialization."""
+ if self._openlineage_emitter is None or not run_id:
+ return
+ try:
+ self._openlineage_emitter.emit_materialize_complete(
+ run_id, feature_views, self.project
+ )
+ except Exception as e:
+ warnings.warn(f"Failed to emit OpenLineage materialize complete event: {e}")
+
+ def _emit_openlineage_materialize_fail(
+ self,
+ run_id: Optional[str],
+ error_message: str,
+ ):
+ """Emit OpenLineage FAIL event for materialization."""
+ if self._openlineage_emitter is None or not run_id:
+ return
+ try:
+ self._openlineage_emitter.emit_materialize_fail(
+ run_id, self.project, error_message
+ )
+ except Exception as e:
+ warnings.warn(f"Failed to emit OpenLineage materialize fail event: {e}")
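The three helpers above implement a guard worth calling out: a run ID is handed back only when the START event was actually emitted, so a lineage collector never receives a COMPLETE or FAIL event for a run it has no START for. A standalone sketch of that pattern (the `FakeEmitter` class and helper names here are illustrative, not part of Feast):

```python
import uuid
import warnings
from typing import Optional


class FakeEmitter:
    """Stand-in for FeastOpenLineageEmitter; records emitted events."""

    def __init__(self, start_succeeds: bool = True):
        self.start_succeeds = start_succeeds
        self.events: list = []

    def emit_materialize_start(self):
        run_id = str(uuid.uuid4())
        if self.start_succeeds:
            self.events.append(("START", run_id))
        return run_id, self.start_succeeds


def emit_start(emitter: Optional[FakeEmitter]) -> Optional[str]:
    if emitter is None:
        return None
    try:
        run_id, success = emitter.emit_materialize_start()
        # Only hand back a run_id if START was really emitted.
        return run_id if run_id and success else None
    except Exception as e:
        warnings.warn(f"Failed to emit START: {e}")
        return None


def emit_complete(emitter: Optional[FakeEmitter], run_id: Optional[str]) -> None:
    if emitter is None or not run_id:
        return  # START never made it out; stay silent.
    emitter.events.append(("COMPLETE", run_id))


ok = FakeEmitter(start_succeeds=True)
rid = emit_start(ok)
emit_complete(ok, rid)

broken = FakeEmitter(start_succeeds=False)
rid2 = emit_start(broken)    # None: START was not emitted
emit_complete(broken, rid2)  # no-op, no orphaned COMPLETE
```

The same guard also keeps materialization itself unaffected by lineage failures: every emit path swallows its own exceptions and warns instead of raising.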
def _fvs_for_push_source_or_raise(
self, push_source_name: str, allow_cache: bool
@@ -2178,7 +2317,7 @@ def get_online_features(
"""
provider = self._get_provider()
- return provider.get_online_features(
+ response = provider.get_online_features(
config=self.config,
features=features,
entity_rows=entity_rows,
@@ -2187,6 +2326,8 @@ def get_online_features(
full_feature_names=full_feature_names,
)
+ return response
+
async def get_online_features_async(
self,
features: Union[List[str], FeatureService],
diff --git a/sdk/python/feast/openlineage/__init__.py b/sdk/python/feast/openlineage/__init__.py
new file mode 100644
index 00000000000..a8328417475
--- /dev/null
+++ b/sdk/python/feast/openlineage/__init__.py
@@ -0,0 +1,85 @@
+# Copyright 2026 The Feast Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+OpenLineage integration for Feast Feature Store.
+
+This module provides **native integration** between Feast and OpenLineage for
+automatic data lineage tracking. When enabled in feature_store.yaml, lineage
+events are emitted automatically for:
+
+- Feature store registry changes (apply operations)
+- Feature materialization (batch and streaming) - START, COMPLETE, FAIL events
+
+Lineage Graph Structure:
+    feast apply creates a lineage graph matching the Feast UI:
+
+ DataSources + Entities → feast_feature_views_{project} → FeatureViews
+ FeatureViews → feature_service_{name} → FeatureServices
+
+ Each dataset includes:
+ - Schema with feature names, types, descriptions, and tags
+ - Feast-specific facets with metadata (TTL, entities, owner, etc.)
+
+Usage:
+ Simply configure OpenLineage in your feature_store.yaml:
+
+ ```yaml
+ project: my_project
+ # ... other config ...
+
+ openlineage:
+ enabled: true
+ transport_type: http
+ transport_url: http://localhost:5000
+ transport_endpoint: api/v1/lineage
+ namespace: my_namespace # Optional: defaults to project name
+ ```
+
+    Then use Feast normally; lineage events are emitted automatically!
+
+ ```python
+ from feast import FeatureStore
+
+ fs = FeatureStore(repo_path="feature_repo")
+ fs.apply([entity, feature_view, feature_service]) # Emits lineage
+ fs.materialize(start, end) # Emits START/COMPLETE/FAIL events
+ ```
+"""
+
+from feast.openlineage.client import FeastOpenLineageClient
+from feast.openlineage.config import OpenLineageConfig
+from feast.openlineage.emitter import FeastOpenLineageEmitter
+from feast.openlineage.facets import (
+ FeastDataSourceFacet,
+ FeastEntityFacet,
+ FeastFeatureServiceFacet,
+ FeastFeatureViewFacet,
+ FeastMaterializationFacet,
+ FeastProjectFacet,
+)
+
+__all__ = [
+ # Main classes (used internally by native integration)
+ "FeastOpenLineageClient",
+ "FeastOpenLineageEmitter",
+ "OpenLineageConfig",
+ # Facets (custom Feast metadata in lineage events)
+ "FeastFeatureViewFacet",
+ "FeastFeatureServiceFacet",
+ "FeastDataSourceFacet",
+ "FeastEntityFacet",
+ "FeastMaterializationFacet",
+ "FeastProjectFacet",
+]
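The module docstring above fixes the naming conventions for the lineage graph. A minimal restatement of those conventions (these helper functions are hypothetical, added only to make the naming scheme concrete):

```python
# Hypothetical helpers, not part of the Feast API: they spell out the
# dataset/job names described in the module docstring.

def feature_views_dataset_name(project: str) -> str:
    # Grouping dataset that feature views hang off in the lineage graph.
    return f"feast_feature_views_{project}"


def feature_service_job_name(service_name: str) -> str:
    # Job node connecting feature views to a feature service.
    return f"feature_service_{service_name}"
```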
diff --git a/sdk/python/feast/openlineage/client.py b/sdk/python/feast/openlineage/client.py
new file mode 100644
index 00000000000..927f998ac3e
--- /dev/null
+++ b/sdk/python/feast/openlineage/client.py
@@ -0,0 +1,305 @@
+# Copyright 2026 The Feast Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+Feast OpenLineage Client.
+
+This module provides a wrapper around the OpenLineage client that is
+specifically designed for Feast Feature Store operations.
+"""
+
+import logging
+from typing import TYPE_CHECKING, Any, Dict, List, Optional
+
+if TYPE_CHECKING:
+ from feast import FeatureStore
+
+from feast.openlineage.config import OpenLineageConfig
+
+try:
+ from openlineage.client import OpenLineageClient
+ from openlineage.client.event_v2 import (
+ DatasetEvent,
+ Job,
+ JobEvent,
+ Run,
+ RunEvent,
+ RunState,
+ set_producer,
+ )
+
+ OPENLINEAGE_AVAILABLE = True
+except ImportError:
+ OPENLINEAGE_AVAILABLE = False
+ OpenLineageClient = None # type: ignore[misc,assignment]
+
+logger = logging.getLogger(__name__)
+
+
+class FeastOpenLineageClient:
+ """
+ OpenLineage client wrapper for Feast Feature Store.
+
+ This client provides convenient methods for emitting OpenLineage events
+ from Feast operations like materialization, feature retrieval, and
+ registry changes.
+
+ Example:
+ from feast.openlineage import FeastOpenLineageClient, OpenLineageConfig
+
+ config = OpenLineageConfig(
+ transport_type="http",
+ transport_url="http://localhost:5000",
+ )
+ client = FeastOpenLineageClient(config)
+
+ # Emit lineage for a feature store
+ client.emit_registry_lineage(feature_store.registry)
+ """
+
+ def __init__(
+ self,
+ config: Optional[OpenLineageConfig] = None,
+ feature_store: Optional["FeatureStore"] = None,
+ ):
+ """
+ Initialize the Feast OpenLineage client.
+
+ Args:
+ config: OpenLineage configuration. If not provided, will try to
+ load from environment variables.
+ feature_store: Optional FeatureStore instance for context.
+ """
+ if not OPENLINEAGE_AVAILABLE:
+ logger.warning(
+ "OpenLineage is not installed. Lineage events will not be emitted. "
+ "Install with: pip install openlineage-python"
+ )
+ self._client = None
+ self._config = config or OpenLineageConfig(enabled=False)
+ self._feature_store = feature_store
+ return
+
+ self._config = config or OpenLineageConfig.from_env()
+ self._feature_store = feature_store
+
+ if not self._config.enabled:
+ logger.info("OpenLineage integration is disabled")
+ self._client = None
+ return
+
+ # Set producer
+ set_producer(self._config.producer)
+
+ # Initialize the OpenLineage client
+ try:
+ transport_config = self._config.get_transport_config()
+ self._client = OpenLineageClient(config={"transport": transport_config})
+ logger.info(
+ f"OpenLineage client initialized with {self._config.transport_type} transport"
+ )
+ except Exception as e:
+ logger.error(f"Failed to initialize OpenLineage client: {e}")
+ self._client = None
+
+ @property
+ def is_enabled(self) -> bool:
+ """Check if the OpenLineage client is enabled and available."""
+ return self._client is not None and self._config.enabled
+
+ @property
+ def config(self) -> OpenLineageConfig:
+ """Get the OpenLineage configuration."""
+ return self._config
+
+ @property
+ def namespace(self) -> str:
+ """Get the default namespace."""
+ return self._config.namespace
+
+ def emit(self, event: Any) -> bool:
+ """
+ Emit an OpenLineage event.
+
+ Args:
+ event: OpenLineage event (RunEvent, DatasetEvent, or JobEvent)
+
+ Returns:
+ True if the event was emitted successfully, False otherwise
+ """
+ if not self.is_enabled or self._client is None:
+ logger.debug("OpenLineage is disabled, skipping event emission")
+ return False
+
+ try:
+ self._client.emit(event)
+ return True
+ except Exception as e:
+ logger.error(f"Failed to emit OpenLineage event: {e}")
+ return False
+
+ def emit_run_event(
+ self,
+ job_name: str,
+ run_id: str,
+ event_type: "RunState",
+ inputs: Optional[List[Any]] = None,
+ outputs: Optional[List[Any]] = None,
+ job_facets: Optional[Dict[str, Any]] = None,
+ run_facets: Optional[Dict[str, Any]] = None,
+ namespace: Optional[str] = None,
+ ) -> bool:
+ """
+ Emit a RunEvent for a Feast operation.
+
+ Args:
+ job_name: Name of the job
+ run_id: Unique run identifier (UUID)
+ event_type: Type of event (START, COMPLETE, FAIL, etc.)
+ inputs: List of input datasets
+ outputs: List of output datasets
+ job_facets: Additional job facets
+ run_facets: Additional run facets
+ namespace: Optional namespace for the job (defaults to client namespace)
+
+ Returns:
+ True if successful, False otherwise
+ """
+ if not self.is_enabled:
+ return False
+
+ from datetime import datetime, timezone
+
+ try:
+ event = RunEvent(
+ eventTime=datetime.now(timezone.utc).isoformat(),
+ eventType=event_type,
+ run=Run(runId=run_id, facets=run_facets or {}),
+ job=Job(
+ namespace=namespace or self.namespace,
+ name=job_name,
+ facets=job_facets or {},
+ ),
+ inputs=inputs or [],
+ outputs=outputs or [],
+ )
+ return self.emit(event)
+ except Exception as e:
+ logger.error(f"Failed to create RunEvent: {e}")
+ return False
+
+ def emit_dataset_event(
+ self,
+ dataset_name: str,
+ namespace: Optional[str] = None,
+ facets: Optional[Dict[str, Any]] = None,
+ ) -> bool:
+ """
+ Emit a DatasetEvent for a Feast dataset (data source, feature view).
+
+ Args:
+ dataset_name: Name of the dataset
+ namespace: Optional namespace (defaults to client namespace)
+ facets: Dataset facets
+
+ Returns:
+ True if successful, False otherwise
+ """
+ if not self.is_enabled:
+ return False
+
+ from datetime import datetime, timezone
+
+ from openlineage.client.event_v2 import StaticDataset
+
+ try:
+ event = DatasetEvent(
+ eventTime=datetime.now(timezone.utc).isoformat(),
+ dataset=StaticDataset(
+ namespace=namespace or self.namespace,
+ name=dataset_name,
+ facets=facets or {},
+ ),
+ )
+ return self.emit(event)
+ except Exception as e:
+ logger.error(f"Failed to create DatasetEvent: {e}")
+ return False
+
+ def emit_job_event(
+ self,
+ job_name: str,
+ inputs: Optional[List[Any]] = None,
+ outputs: Optional[List[Any]] = None,
+ job_facets: Optional[Dict[str, Any]] = None,
+ ) -> bool:
+ """
+ Emit a JobEvent for a Feast job definition.
+
+ Args:
+ job_name: Name of the job
+ inputs: List of input datasets
+ outputs: List of output datasets
+ job_facets: Job facets
+
+ Returns:
+ True if successful, False otherwise
+ """
+ if not self.is_enabled:
+ return False
+
+ from datetime import datetime, timezone
+
+ try:
+ event = JobEvent(
+ eventTime=datetime.now(timezone.utc).isoformat(),
+ job=Job(
+ namespace=self.namespace,
+ name=job_name,
+ facets=job_facets or {},
+ ),
+ inputs=inputs or [],
+ outputs=outputs or [],
+ )
+ return self.emit(event)
+ except Exception as e:
+ logger.error(f"Failed to create JobEvent: {e}")
+ return False
+
+ def close(self, timeout: float = 5.0) -> bool:
+ """
+ Close the OpenLineage client and flush any pending events.
+
+ Args:
+ timeout: Maximum time to wait for pending events
+
+ Returns:
+ True if closed successfully, False otherwise
+ """
+ if self._client is not None:
+ try:
+ return self._client.close(timeout)
+ except Exception as e:
+ logger.error(f"Error closing OpenLineage client: {e}")
+ return False
+ return True
+
+ def __enter__(self):
+ """Context manager entry."""
+ return self
+
+ def __exit__(self, exc_type, exc_val, exc_tb):
+ """Context manager exit."""
+ self.close()
+ return False
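`client.py` treats `openlineage-python` as a soft dependency: the import is attempted once at module load, and every public method degrades to a no-op when the package is absent or the integration is disabled. A self-contained sketch of that pattern, using a deliberately nonexistent module name to simulate the missing dependency (the class and module names here are illustrative, not Feast's):

```python
import logging

logger = logging.getLogger(__name__)

try:
    import nonexistent_lineage_sdk  # stands in for openlineage.client
    SDK_AVAILABLE = True
except ImportError:
    SDK_AVAILABLE = False


class GracefulClient:
    """Degrades to a no-op when the lineage SDK is unavailable."""

    def __init__(self, enabled: bool = True):
        # Without the SDK there is nothing to construct; stay disabled.
        self._client = object() if (SDK_AVAILABLE and enabled) else None

    @property
    def is_enabled(self) -> bool:
        return self._client is not None

    def emit(self, event) -> bool:
        if not self.is_enabled:
            logger.debug("lineage disabled, skipping event")
            return False
        return True


c = GracefulClient(enabled=True)
```

This is why enabling `openlineage` in `feature_store.yaml` without installing the extra merely logs a warning instead of breaking Feast operations.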
diff --git a/sdk/python/feast/openlineage/config.py b/sdk/python/feast/openlineage/config.py
new file mode 100644
index 00000000000..4d8b7684179
--- /dev/null
+++ b/sdk/python/feast/openlineage/config.py
@@ -0,0 +1,164 @@
+# Copyright 2026 The Feast Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+Configuration classes for Feast OpenLineage integration.
+"""
+
+import os
+from dataclasses import dataclass, field
+from typing import Any, Dict, Optional
+
+
+@dataclass
+class OpenLineageConfig:
+ """
+ Configuration for OpenLineage integration.
+
+ Attributes:
+ enabled: Whether OpenLineage integration is enabled
+ transport_type: Type of transport (http, console, file, kafka)
+ transport_url: URL for HTTP transport
+ transport_endpoint: API endpoint for HTTP transport
+ api_key: Optional API key for authentication
+ namespace: Default namespace for Feast jobs and datasets
+ producer: Producer identifier for OpenLineage events
+ emit_on_apply: Emit lineage events when feast apply is called
+ emit_on_materialize: Emit lineage events during materialization
+ additional_config: Additional transport-specific configuration
+ """
+
+ enabled: bool = True
+ transport_type: str = "console"
+ transport_url: Optional[str] = None
+ transport_endpoint: str = "api/v1/lineage"
+ api_key: Optional[str] = None
+ namespace: str = "feast"
+ producer: str = "feast"
+ emit_on_apply: bool = True
+ emit_on_materialize: bool = True
+ additional_config: Dict[str, Any] = field(default_factory=dict)
+
+ @classmethod
+ def from_dict(cls, config_dict: Dict[str, Any]) -> "OpenLineageConfig":
+ """
+ Create OpenLineageConfig from a dictionary.
+
+ Args:
+ config_dict: Dictionary containing configuration values
+
+ Returns:
+ OpenLineageConfig instance
+ """
+ return cls(
+ enabled=config_dict.get("enabled", True),
+ transport_type=config_dict.get("transport_type", "console"),
+ transport_url=config_dict.get("transport_url"),
+ transport_endpoint=config_dict.get("transport_endpoint", "api/v1/lineage"),
+ api_key=config_dict.get("api_key"),
+ namespace=config_dict.get("namespace", "feast"),
+ producer=config_dict.get("producer", "feast"),
+ emit_on_apply=config_dict.get("emit_on_apply", True),
+ emit_on_materialize=config_dict.get("emit_on_materialize", True),
+ additional_config=config_dict.get("additional_config", {}),
+ )
+
+ @classmethod
+ def from_env(cls) -> "OpenLineageConfig":
+ """
+ Create OpenLineageConfig from environment variables.
+
+ Environment variables:
+ FEAST_OPENLINEAGE_ENABLED: Enable/disable OpenLineage (default: true)
+ FEAST_OPENLINEAGE_TRANSPORT_TYPE: Transport type (default: console)
+ FEAST_OPENLINEAGE_URL: HTTP transport URL
+ FEAST_OPENLINEAGE_ENDPOINT: API endpoint (default: api/v1/lineage)
+ FEAST_OPENLINEAGE_API_KEY: API key for authentication
+ FEAST_OPENLINEAGE_NAMESPACE: Default namespace (default: feast)
+ FEAST_OPENLINEAGE_PRODUCER: Producer identifier
+
+ Returns:
+ OpenLineageConfig instance
+ """
+ return cls(
+ enabled=os.getenv("FEAST_OPENLINEAGE_ENABLED", "true").lower() == "true",
+ transport_type=os.getenv("FEAST_OPENLINEAGE_TRANSPORT_TYPE", "console"),
+ transport_url=os.getenv("FEAST_OPENLINEAGE_URL"),
+ transport_endpoint=os.getenv(
+ "FEAST_OPENLINEAGE_ENDPOINT", "api/v1/lineage"
+ ),
+ api_key=os.getenv("FEAST_OPENLINEAGE_API_KEY"),
+ namespace=os.getenv("FEAST_OPENLINEAGE_NAMESPACE", "feast"),
+ producer=os.getenv("FEAST_OPENLINEAGE_PRODUCER", "feast"),
+ emit_on_apply=os.getenv("FEAST_OPENLINEAGE_EMIT_ON_APPLY", "true").lower()
+ == "true",
+ emit_on_materialize=os.getenv(
+ "FEAST_OPENLINEAGE_EMIT_ON_MATERIALIZE", "true"
+ ).lower()
+ == "true",
+ )
+
+ def to_dict(self) -> Dict[str, Any]:
+ """
+ Convert configuration to dictionary.
+
+ Returns:
+ Dictionary representation of the configuration
+ """
+ return {
+ "enabled": self.enabled,
+ "transport_type": self.transport_type,
+ "transport_url": self.transport_url,
+ "transport_endpoint": self.transport_endpoint,
+ "api_key": self.api_key,
+ "namespace": self.namespace,
+ "producer": self.producer,
+ "emit_on_apply": self.emit_on_apply,
+ "emit_on_materialize": self.emit_on_materialize,
+ "additional_config": self.additional_config,
+ }
+
+ def get_transport_config(self) -> Dict[str, Any]:
+ """
+ Get transport-specific configuration for OpenLineage client.
+
+ Returns:
+ Dictionary with transport configuration
+ """
+ config: Dict[str, Any] = {"type": self.transport_type}
+
+ if self.transport_type == "http":
+ if not self.transport_url:
+ raise ValueError("transport_url is required for HTTP transport")
+ config["url"] = self.transport_url
+ config["endpoint"] = self.transport_endpoint
+ if self.api_key:
+ config["auth"] = {
+ "type": "api_key",
+ "apiKey": self.api_key,
+ }
+ elif self.transport_type == "file":
+ config["log_file_path"] = self.additional_config.get(
+ "log_file_path", "openlineage_events.json"
+ )
+ elif self.transport_type == "kafka":
+ config["bootstrap_servers"] = self.additional_config.get(
+ "bootstrap_servers"
+ )
+ config["topic"] = self.additional_config.get("topic", "openlineage.events")
+
+ # Merge additional config
+ config.update(self.additional_config)
+
+ return config
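To make the HTTP branch of `get_transport_config` above concrete, here is a standalone restatement of just that branch (the function name is hypothetical; the dict shape mirrors what the method hands to `OpenLineageClient`):

```python
from typing import Any, Dict, Optional


def http_transport_config(
    url: Optional[str],
    endpoint: str = "api/v1/lineage",
    api_key: Optional[str] = None,
) -> Dict[str, Any]:
    # transport_url is mandatory for HTTP, matching the validation above.
    if not url:
        raise ValueError("transport_url is required for HTTP transport")
    config: Dict[str, Any] = {"type": "http", "url": url, "endpoint": endpoint}
    if api_key:
        # API keys are wrapped in an auth sub-dict.
        config["auth"] = {"type": "api_key", "apiKey": api_key}
    return config


cfg = http_transport_config("http://localhost:5000", api_key="secret")
```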
diff --git a/sdk/python/feast/openlineage/emitter.py b/sdk/python/feast/openlineage/emitter.py
new file mode 100644
index 00000000000..1f63e39210e
--- /dev/null
+++ b/sdk/python/feast/openlineage/emitter.py
@@ -0,0 +1,985 @@
+# Copyright 2026 The Feast Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+Feast OpenLineage Emitter.
+
+This module provides high-level functions for emitting OpenLineage events
+from Feast operations like materialization, feature retrieval, and registry changes.
+"""
+
+import logging
+import uuid
+from datetime import datetime
+from typing import TYPE_CHECKING, Any, Dict, List, Optional, Tuple, Union
+
+if TYPE_CHECKING:
+ from feast import FeatureService, FeatureView
+ from feast.infra.registry.base_registry import BaseRegistry
+ from feast.on_demand_feature_view import OnDemandFeatureView
+ from feast.stream_feature_view import StreamFeatureView
+
+from feast.openlineage.client import FeastOpenLineageClient
+from feast.openlineage.config import OpenLineageConfig
+
+try:
+ from openlineage.client.event_v2 import (
+ InputDataset,
+ OutputDataset,
+ RunState,
+ )
+
+ OPENLINEAGE_AVAILABLE = True
+except ImportError:
+ OPENLINEAGE_AVAILABLE = False
+
+logger = logging.getLogger(__name__)
+
+
+class FeastOpenLineageEmitter:
+ """
+ High-level emitter for Feast OpenLineage events.
+
+ This class provides methods for emitting lineage events for various
+ Feast operations including:
+ - Registry apply (feature view, feature service definitions)
+ - Materialization (batch and incremental)
+ - Feature retrieval (online and historical)
+ - Data source relationships
+
+ Example:
+ from feast import FeatureStore
+ from feast.openlineage import FeastOpenLineageEmitter, OpenLineageConfig
+
+ config = OpenLineageConfig(transport_type="http", transport_url="http://localhost:5000")
+ emitter = FeastOpenLineageEmitter(config)
+
+ fs = FeatureStore(repo_path="feature_repo")
+
+ # Emit lineage for registry
+ emitter.emit_registry_lineage(fs.registry, fs.project)
+ """
+
+ def __init__(
+ self,
+ config: Optional[OpenLineageConfig] = None,
+ client: Optional[FeastOpenLineageClient] = None,
+ ):
+ """
+ Initialize the Feast OpenLineage Emitter.
+
+ Args:
+ config: OpenLineage configuration
+ client: Optional pre-configured FeastOpenLineageClient
+ """
+ self._config = config or OpenLineageConfig.from_env()
+ self._client = client or FeastOpenLineageClient(self._config)
+
+ @property
+ def is_enabled(self) -> bool:
+ """Check if the emitter is enabled."""
+ return self._client.is_enabled
+
+ @property
+ def namespace(self) -> str:
+ """Get the default namespace."""
+ return self._config.namespace
+
+ def _get_namespace(self, project: str) -> str:
+ """
+ Get the OpenLineage namespace for a project.
+
+ By default, uses the Feast project name as the namespace.
+ If an explicit namespace is configured (not the default "feast"),
+ it will be used as a prefix: {namespace}/{project}
+
+ Args:
+ project: Feast project name
+
+ Returns:
+ OpenLineage namespace string
+ """
+ # If namespace is default "feast", just use project name
+ if self._config.namespace == "feast":
+ return project
+ # If custom namespace is configured, use it as prefix
+ return f"{self._config.namespace}/{project}"
+
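The namespace rule in `_get_namespace` above is easy to restate outside the class (the function name here is illustrative): the default `"feast"` namespace collapses to the project name, while any custom namespace becomes a `{namespace}/{project}` prefix.

```python
def resolve_namespace(configured: str, project: str) -> str:
    # Default namespace: use the Feast project name directly.
    if configured == "feast":
        return project
    # Custom namespace: prefix it onto the project name.
    return f"{configured}/{project}"
```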
+ def emit_registry_lineage(
+ self,
+ registry: "BaseRegistry",
+ project: str,
+ allow_cache: bool = True,
+ ) -> List[bool]:
+ """
+ Emit lineage events for all objects in a Feast registry.
+
+ This method emits JobEvents for feature views and feature services,
+ and DatasetEvents for data sources and entities.
+
+ Args:
+ registry: Feast registry
+ project: Project name
+ allow_cache: Whether to use cached registry data
+
+ Returns:
+ List of success/failure indicators for each event
+ """
+ if not self.is_enabled:
+ return []
+
+ from feast.feature_view import FeatureView
+ from feast.on_demand_feature_view import OnDemandFeatureView
+ from feast.stream_feature_view import StreamFeatureView
+
+ results = []
+
+ # Get all feature views at once (includes FeatureView, StreamFeatureView, OnDemandFeatureView)
+ all_feature_views: list = []
+ try:
+ all_feature_views = registry.list_all_feature_views(
+ project=project, allow_cache=allow_cache
+ )
+ except Exception as e:
+ logger.error(f"Error listing all feature views: {e}")
+
+ # Emit lineage events for each feature view type
+ for fv in all_feature_views:
+ try:
+ if isinstance(fv, OnDemandFeatureView):
+ result = self.emit_on_demand_feature_view_lineage(fv, project)
+ elif isinstance(fv, StreamFeatureView):
+ result = self.emit_stream_feature_view_lineage(fv, project)
+ elif isinstance(fv, FeatureView):
+ result = self.emit_feature_view_lineage(fv, project)
+ else:
+ continue
+ results.append(result)
+ except Exception as e:
+ logger.error(f"Error emitting lineage for feature view {fv.name}: {e}")
+
+ # Emit events for feature services
+ try:
+ feature_services = registry.list_feature_services(
+ project=project, allow_cache=allow_cache
+ )
+ for fs in feature_services:
+ result = self.emit_feature_service_lineage(
+ fs, all_feature_views, project
+ )
+ results.append(result)
+ except Exception as e:
+ logger.error(f"Error emitting feature service lineage: {e}")
+
+ logger.info(
+ f"Emitted {sum(results)}/{len(results)} lineage events for registry"
+ )
+ return results
+
+ def emit_feature_view_lineage(
+ self,
+ feature_view: "FeatureView",
+ project: str,
+ ) -> bool:
+ """
+ Emit lineage for a feature view definition.
+
+ Args:
+ feature_view: The feature view
+ project: Project name
+
+ Returns:
+ True if successful, False otherwise
+ """
+ if not self.is_enabled:
+ return False
+
+ from feast.openlineage.mappers import feature_view_to_job
+
+ try:
+ namespace = self._get_namespace(project)
+ job, inputs, outputs = feature_view_to_job(
+ feature_view,
+ namespace=namespace,
+ )
+
+ # Emit a RunEvent with COMPLETE state to create lineage connection
+ result = self._client.emit_run_event(
+ job_name=job.name,
+ run_id=str(uuid.uuid4()),
+ event_type=RunState.COMPLETE,
+ inputs=inputs,
+ outputs=outputs,
+ job_facets=job.facets,
+ namespace=namespace,
+ )
+ return result
+ except Exception as e:
+ logger.error(
+ f"Error emitting feature view lineage for {feature_view.name}: {e}"
+ )
+ return False
+
+ def emit_stream_feature_view_lineage(
+ self,
+ stream_feature_view: "StreamFeatureView",
+ project: str,
+ ) -> bool:
+ """
+ Emit lineage for a stream feature view definition.
+
+ Args:
+ stream_feature_view: The stream feature view
+ project: Project name
+
+ Returns:
+ True if successful, False otherwise
+ """
+ if not self.is_enabled:
+ return False
+
+ from feast.openlineage.mappers import feature_view_to_job
+
+ try:
+ namespace = self._get_namespace(project)
+ # StreamFeatureView inherits from FeatureView
+ job, inputs, outputs = feature_view_to_job(
+ stream_feature_view,
+ namespace=namespace,
+ )
+
+ # Emit a RunEvent with COMPLETE state to create lineage connection
+ return self._client.emit_run_event(
+ job_name=f"stream_{job.name}",
+ run_id=str(uuid.uuid4()),
+ event_type=RunState.COMPLETE,
+ inputs=inputs,
+ outputs=outputs,
+ job_facets=job.facets,
+ namespace=namespace,
+ )
+ except Exception as e:
+ logger.error(
+ f"Error emitting stream feature view lineage for {stream_feature_view.name}: {e}"
+ )
+ return False
+
+ def emit_on_demand_feature_view_lineage(
+ self,
+ odfv: "OnDemandFeatureView",
+ project: str,
+ ) -> bool:
+ """
+ Emit lineage for an on-demand feature view definition.
+
+ Args:
+ odfv: The on-demand feature view
+ project: Project name
+
+ Returns:
+ True if successful, False otherwise
+ """
+ if not self.is_enabled:
+ return False
+
+ from feast.openlineage.facets import FeastFeatureViewFacet
+ from feast.openlineage.mappers import feast_field_to_schema_field
+
+ try:
+ from openlineage.client.facet_v2 import schema_dataset
+
+ namespace = self._get_namespace(project)
+
+ # Build inputs from sources
+ inputs = []
+            for fv_proj in odfv.source_feature_view_projections.values():
+                inputs.append(
+                    InputDataset(
+                        namespace=namespace,
+                        name=fv_proj.name,
+                    )
+                )
+
+            for source_name in odfv.source_request_sources:
+                inputs.append(
+                    InputDataset(
+                        namespace=namespace,
+                        name=f"request_source_{source_name}",
+                    )
+                )
+
+ # Build output
+ output_facets = {}
+ if odfv.features:
+ output_facets["schema"] = schema_dataset.SchemaDatasetFacet(
+ fields=[feast_field_to_schema_field(f) for f in odfv.features]
+ )
+
+ outputs = [
+ OutputDataset(
+ namespace=namespace,
+ name=odfv.name,
+ facets=output_facets, # type: ignore[arg-type]
+ )
+ ]
+
+ # Build job facets
+ job_facets = {
+ "feast_featureView": FeastFeatureViewFacet(
+ name=odfv.name,
+ ttl_seconds=0,
+ entities=[],
+ features=[f.name for f in odfv.features] if odfv.features else [],
+ online_enabled=True,
+ offline_enabled=True,
+ mode="ON_DEMAND",
+ description=odfv.description if odfv.description else "",
+ owner=odfv.owner if hasattr(odfv, "owner") and odfv.owner else "",
+ tags=odfv.tags if odfv.tags else {},
+ )
+ }
+
+ # Emit a RunEvent with COMPLETE state to create lineage connection
+ return self._client.emit_run_event(
+ job_name=f"on_demand_feature_view_{odfv.name}",
+ run_id=str(uuid.uuid4()),
+ event_type=RunState.COMPLETE,
+ inputs=inputs,
+ outputs=outputs,
+ job_facets=job_facets,
+ namespace=namespace,
+ )
+ except Exception as e:
+ logger.error(
+ f"Error emitting on-demand feature view lineage for {odfv.name}: {e}"
+ )
+ return False
+
+ def emit_feature_service_lineage(
+ self,
+ feature_service: "FeatureService",
+ feature_views: List[
+ Union["FeatureView", "OnDemandFeatureView", "StreamFeatureView"]
+ ],
+ project: str,
+ ) -> bool:
+ """
+ Emit lineage for a feature service definition.
+
+ Args:
+ feature_service: The feature service
+ feature_views: List of all available feature views (FeatureView,
+ OnDemandFeatureView, StreamFeatureView)
+ project: Project name
+
+ Returns:
+ True if successful, False otherwise
+ """
+ if not self.is_enabled:
+ return False
+
+ from feast.openlineage.mappers import feature_service_to_job
+
+ try:
+ # Find the feature views referenced by this service
+ namespace = self._get_namespace(project)
+ fv_names = {proj.name for proj in feature_service.feature_view_projections}
+ referenced_fvs = [fv for fv in feature_views if fv.name in fv_names]
+
+ job, inputs, outputs = feature_service_to_job(
+ feature_service,
+ referenced_fvs,
+ namespace=namespace,
+ )
+
+ # Emit a RunEvent with COMPLETE state to create lineage connection
+ return self._client.emit_run_event(
+ job_name=job.name,
+ run_id=str(uuid.uuid4()),
+ event_type=RunState.COMPLETE,
+ inputs=inputs,
+ outputs=outputs,
+ job_facets=job.facets,
+ namespace=namespace,
+ )
+ except Exception as e:
+ logger.error(
+ f"Error emitting feature service lineage for {feature_service.name}: {e}"
+ )
+ return False
+
+ def emit_materialize_start(
+ self,
+ feature_views: List["FeatureView"],
+ start_date: Optional[datetime],
+ end_date: datetime,
+ project: str,
+ run_id: Optional[str] = None,
+ ) -> Tuple[str, bool]:
+ """
+ Emit a START event for a materialization run.
+
+ Args:
+ feature_views: Feature views being materialized
+ start_date: Start of materialization window (None for incremental)
+ end_date: End of materialization window
+ project: Project name
+ run_id: Optional run ID (will be generated if not provided)
+
+ Returns:
+ Tuple of (run_id, success)
+ """
+ if not self.is_enabled or not self._config.emit_on_materialize:
+ return "", False
+
+ from feast.openlineage.facets import FeastMaterializationFacet
+ from feast.openlineage.mappers import (
+ data_source_to_dataset,
+ online_store_to_dataset,
+ )
+
+ run_id = run_id or str(uuid.uuid4())
+
+ try:
+ namespace = self._get_namespace(project)
+
+ # Build inputs (data sources) - include both batch and stream sources
+ inputs = []
+ seen_sources = set() # Track source names to avoid duplicates
+
+ for fv in feature_views:
+ # Add batch source
+ if hasattr(fv, "batch_source") and fv.batch_source:
+ source_name = getattr(fv.batch_source, "name", None)
+ if source_name and source_name not in seen_sources:
+ seen_sources.add(source_name)
+ inputs.append(
+ data_source_to_dataset(
+ fv.batch_source,
+ namespace=namespace,
+ as_input=True,
+ )
+ )
+
+ # Add stream source (e.g., PushSource)
+ if hasattr(fv, "stream_source") and fv.stream_source:
+ source_name = getattr(fv.stream_source, "name", None)
+ if source_name and source_name not in seen_sources:
+ seen_sources.add(source_name)
+ inputs.append(
+ data_source_to_dataset(
+ fv.stream_source,
+ namespace=namespace,
+ as_input=True,
+ )
+ )
+
+ # Add entities as inputs (use direct name for consistency with emit_apply)
+ if hasattr(fv, "entities") and fv.entities:
+ for entity_name in fv.entities:
+ if entity_name and entity_name != "__dummy":
+ if entity_name not in seen_sources:
+ seen_sources.add(entity_name)
+ inputs.append(
+ InputDataset(
+ namespace=namespace,
+ name=entity_name,
+ )
+ )
+
+ # Build outputs (online store entries)
+ outputs = [
+ online_store_to_dataset(
+ store_type="online_store",
+ feature_view_name=fv.name,
+ namespace=namespace,
+ )
+ for fv in feature_views
+ ]
+
+ # Build run facets
+ run_facets = {
+ "feast_materialization": FeastMaterializationFacet(
+ feature_views=[fv.name for fv in feature_views],
+ start_date=start_date.isoformat() if start_date else None,
+ end_date=end_date.isoformat() if end_date else None,
+ project=project,
+ )
+ }
+
+ success = self._client.emit_run_event(
+ job_name=f"materialize_{project}",
+ run_id=run_id,
+ event_type=RunState.START,
+ inputs=inputs,
+ outputs=outputs,
+ run_facets=run_facets,
+ namespace=namespace,
+ )
+
+ return run_id, success
+ except Exception as e:
+ logger.error(f"Error emitting materialize start event: {e}")
+ return run_id, False
+
+ def emit_materialize_complete(
+ self,
+ run_id: str,
+ feature_views: List["FeatureView"],
+ project: str,
+ rows_written: Optional[int] = None,
+ ) -> bool:
+ """
+ Emit a COMPLETE event for a materialization run.
+
+ Args:
+ run_id: Run ID from the start event
+ feature_views: Feature views that were materialized
+ project: Project name
+ rows_written: Optional count of rows written
+
+ Returns:
+ True if successful, False otherwise
+ """
+ if not self.is_enabled or not self._config.emit_on_materialize:
+ return False
+
+ from feast.openlineage.facets import FeastMaterializationFacet
+ from feast.openlineage.mappers import online_store_to_dataset
+
+ try:
+ namespace = self._get_namespace(project)
+
+ outputs = [
+ online_store_to_dataset(
+ store_type="online_store",
+ feature_view_name=fv.name,
+ namespace=namespace,
+ )
+ for fv in feature_views
+ ]
+
+ run_facets = {
+ "feast_materialization": FeastMaterializationFacet(
+ feature_views=[fv.name for fv in feature_views],
+ project=project,
+ rows_written=rows_written,
+ )
+ }
+
+ return self._client.emit_run_event(
+ job_name=f"materialize_{project}",
+ run_id=run_id,
+ event_type=RunState.COMPLETE,
+ outputs=outputs,
+ run_facets=run_facets,
+ namespace=namespace,
+ )
+ except Exception as e:
+ logger.error(f"Error emitting materialize complete event: {e}")
+ return False
+
+ def emit_materialize_fail(
+ self,
+ run_id: str,
+ project: str,
+ error_message: Optional[str] = None,
+ ) -> bool:
+ """
+ Emit a FAIL event for a materialization run.
+
+ Args:
+ run_id: Run ID from the start event
+ project: Project name
+ error_message: Optional error message
+
+ Returns:
+ True if successful, False otherwise
+ """
+ if not self.is_enabled or not self._config.emit_on_materialize:
+ return False
+
+ try:
+ from openlineage.client.facet_v2 import error_message_run
+
+ namespace = self._get_namespace(project)
+ run_facets = {}
+ if error_message:
+ run_facets["errorMessage"] = error_message_run.ErrorMessageRunFacet(
+ message=error_message,
+ programmingLanguage="python",
+ )
+
+ return self._client.emit_run_event(
+ job_name=f"materialize_{project}",
+ run_id=run_id,
+ event_type=RunState.FAIL,
+ run_facets=run_facets,
+ namespace=namespace,
+ )
+ except Exception as e:
+ logger.error(f"Error emitting materialize fail event: {e}")
+ return False
+
+ def emit_apply(
+ self,
+ objects: List[Any],
+ project: str,
+ ) -> List[bool]:
+ """
+ Emit lineage for a feast apply operation.
+
+        Creates one job for the feature views plus one job per feature service,
+        matching the Feast UI lineage model:
+        1. feast_feature_views_{project}: DataSources + Entities → FeatureViews
+        2. feature_service_{name}: FeatureViews → that FeatureService
+
+ This creates a lineage graph matching Feast UI:
+ DataSource ──→ FeatureView ──→ FeatureService
+ ↑
+ Entity
+
+ Args:
+ objects: List of Feast objects being applied
+ project: Project name
+
+ Returns:
+ List of success/failure indicators
+ """
+ if not self.is_enabled or not self._config.emit_on_apply:
+ return []
+
+ from feast import Entity, FeatureService
+ from feast.data_source import DataSource
+ from feast.feature_view import FeatureView
+ from feast.on_demand_feature_view import OnDemandFeatureView
+ from feast.openlineage.facets import FeastProjectFacet
+ from feast.openlineage.mappers import (
+ data_source_to_dataset,
+ entity_to_dataset,
+ feast_field_to_schema_field,
+ )
+ from feast.stream_feature_view import StreamFeatureView
+
+ try:
+ from openlineage.client.facet_v2 import schema_dataset
+
+ namespace = self._get_namespace(project)
+ results = []
+
+ # Categorize objects
+ data_sources: List[DataSource] = []
+ entities: List[Entity] = []
+ feature_views: List[Union[FeatureView, OnDemandFeatureView]] = []
+ on_demand_feature_views: List[OnDemandFeatureView] = []
+ feature_services: List[FeatureService] = []
+
+ for obj in objects:
+ if isinstance(obj, StreamFeatureView):
+ feature_views.append(obj)
+ elif isinstance(obj, OnDemandFeatureView):
+ on_demand_feature_views.append(obj)
+ elif isinstance(obj, FeatureView):
+ feature_views.append(obj)
+ elif isinstance(obj, FeatureService):
+ feature_services.append(obj)
+ elif isinstance(obj, DataSource):
+ data_sources.append(obj)
+ elif isinstance(obj, Entity):
+ if obj.name != "__dummy":
+ entities.append(obj)
+
+ # ============================================================
+ # Job 1: DataSources + Entities → FeatureViews
+ # This matches: DataSource → FeatureView and Entity → FeatureView
+ # ============================================================
+ if feature_views or on_demand_feature_views:
+ fv_inputs = []
+ seen_inputs: set = set()
+
+ # Add explicit data sources
+ for ds in data_sources:
+ if ds.name and ds.name not in seen_inputs:
+ seen_inputs.add(ds.name)
+ fv_inputs.append(
+ data_source_to_dataset(
+ ds, namespace=namespace, as_input=True
+ )
+ )
+
+ # Add entities (using direct name to match Feast UI)
+ for entity in entities:
+ if entity.name not in seen_inputs:
+ seen_inputs.add(entity.name)
+ fv_inputs.append(entity_to_dataset(entity, namespace=namespace))
+
+ # Also add data sources from feature views
+ for fv in feature_views:
+ if hasattr(fv, "batch_source") and fv.batch_source:
+ source_name = getattr(fv.batch_source, "name", None)
+ if source_name and source_name not in seen_inputs:
+ seen_inputs.add(source_name)
+ fv_inputs.append(
+ data_source_to_dataset(
+ fv.batch_source, namespace=namespace, as_input=True
+ )
+ )
+ if hasattr(fv, "stream_source") and fv.stream_source:
+ source_name = getattr(fv.stream_source, "name", None)
+ if source_name and source_name not in seen_inputs:
+ seen_inputs.add(source_name)
+ fv_inputs.append(
+ data_source_to_dataset(
+ fv.stream_source, namespace=namespace, as_input=True
+ )
+ )
+
+ # Build FeatureView outputs
+ from openlineage.client.facet_v2 import documentation_dataset
+
+ from feast.openlineage.facets import FeastFeatureViewFacet
+
+ fv_outputs = []
+ for fv in feature_views:
+ output_facets: Dict[str, Any] = {}
+
+ # Add schema with features (includes tags in description)
+ if fv.features:
+ output_facets["schema"] = schema_dataset.SchemaDatasetFacet(
+ fields=[feast_field_to_schema_field(f) for f in fv.features]
+ )
+
+ # Add documentation facet with description
+ if hasattr(fv, "description") and fv.description:
+ output_facets["documentation"] = (
+ documentation_dataset.DocumentationDatasetFacet(
+ description=fv.description
+ )
+ )
+
+ # Add Feast-specific facet with full metadata
+ ttl_seconds = 0
+ if hasattr(fv, "ttl") and fv.ttl:
+ ttl_seconds = int(fv.ttl.total_seconds())
+
+ output_facets["feast_featureView"] = FeastFeatureViewFacet(
+ name=fv.name,
+ ttl_seconds=ttl_seconds,
+ entities=list(fv.entities)
+ if hasattr(fv, "entities") and fv.entities
+ else [],
+ features=[f.name for f in fv.features] if fv.features else [],
+ online_enabled=fv.online if hasattr(fv, "online") else True,
+ description=fv.description
+ if hasattr(fv, "description")
+ else "",
+ owner=fv.owner if hasattr(fv, "owner") else "",
+ tags=fv.tags if hasattr(fv, "tags") else {},
+ )
+
+ fv_outputs.append(
+ OutputDataset(
+ namespace=namespace,
+ name=fv.name,
+ facets=output_facets,
+ )
+ )
+
+ for odfv in on_demand_feature_views:
+ output_facets = {}
+
+ # Add schema with features (includes tags in description)
+ if odfv.features:
+ output_facets["schema"] = schema_dataset.SchemaDatasetFacet(
+ fields=[
+ feast_field_to_schema_field(f) for f in odfv.features
+ ]
+ )
+
+ # Add documentation facet with description
+ if hasattr(odfv, "description") and odfv.description:
+ output_facets["documentation"] = (
+ documentation_dataset.DocumentationDatasetFacet(
+ description=odfv.description
+ )
+ )
+
+ # Add Feast-specific facet with full metadata
+ output_facets["feast_featureView"] = FeastFeatureViewFacet(
+ name=odfv.name,
+ ttl_seconds=0,
+ entities=list(odfv.entities)
+ if hasattr(odfv, "entities") and odfv.entities
+ else [],
+ features=[f.name for f in odfv.features]
+ if odfv.features
+ else [],
+ online_enabled=True,
+ description=odfv.description
+ if hasattr(odfv, "description")
+ else "",
+ owner=odfv.owner if hasattr(odfv, "owner") else "",
+ tags=odfv.tags if hasattr(odfv, "tags") else {},
+ )
+
+ fv_outputs.append(
+ OutputDataset(
+ namespace=namespace,
+ name=odfv.name,
+ facets=output_facets,
+ )
+ )
+
+ # Emit Job 1: Feature Views job
+ job_facets = {
+ "feast_project": FeastProjectFacet(
+ project_name=project,
+ )
+ }
+
+ result1 = self._client.emit_run_event(
+ job_name=f"feast_feature_views_{project}",
+ run_id=str(uuid.uuid4()),
+ event_type=RunState.COMPLETE,
+ inputs=fv_inputs,
+ outputs=fv_outputs,
+ job_facets=job_facets,
+ namespace=namespace,
+ )
+ results.append(result1)
+
+ if result1:
+ logger.info(
+ f"✓ Emitted feature views lineage for '{project}' "
+ f"({len(fv_inputs)} inputs → {len(fv_outputs)} outputs)"
+ )
+
+ # ============================================================
+ # Jobs for FeatureServices: One job per FeatureService
+ # Each job shows: FeatureViews (that are part of this FS) → FeatureService
+ # This matches Feast UI where links are only shown for actual membership
+ # ============================================================
+ for fs in feature_services:
+ fs_inputs = []
+ all_fs_features = [] # Collect all features for schema
+ fv_names_in_fs = [] # Track feature view names
+
+ # Only include FeatureViews that are actually part of this FeatureService
+ for proj in fs.feature_view_projections:
+ fv_name = proj.name
+ if fv_name:
+ fv_names_in_fs.append(fv_name)
+ # Find the feature view to get schema
+ input_facets: Dict[str, Any] = {}
+
+ # Use projection features if specified, otherwise use all from FV
+ proj_features = proj.features if proj.features else []
+
+ for fv in feature_views + on_demand_feature_views:
+ if fv.name == fv_name:
+ # Use projection features if available, else all FV features
+ features_to_use = (
+ proj_features
+ if proj_features
+ else (fv.features if fv.features else [])
+ )
+ if features_to_use:
+ input_facets["schema"] = (
+ schema_dataset.SchemaDatasetFacet(
+ fields=[
+ feast_field_to_schema_field(f)
+ for f in features_to_use
+ ]
+ )
+ )
+ # Collect features for FS output schema
+ all_fs_features.extend(features_to_use)
+ break
+
+ fs_inputs.append(
+ InputDataset(
+ namespace=namespace,
+ name=fv_name,
+ facets=input_facets,
+ )
+ )
+
+ # Build FeatureService output with schema and metadata
+ fs_output_facets: Dict[str, Any] = {}
+
+ # Add schema with all features from constituent feature views
+ if all_fs_features:
+ fs_output_facets["schema"] = schema_dataset.SchemaDatasetFacet(
+ fields=[feast_field_to_schema_field(f) for f in all_fs_features]
+ )
+
+ # Add documentation with feature view list
+ if fv_names_in_fs:
+ from openlineage.client.facet_v2 import documentation_dataset
+
+ fs_output_facets["documentation"] = (
+ documentation_dataset.DocumentationDatasetFacet(
+ description=(
+ f"Feature Service '{fs.name}' aggregates features from: "
+ f"{', '.join(fv_names_in_fs)}. "
+ f"Total features: {len(all_fs_features)}."
+ )
+ )
+ )
+
+ # Add Feast-specific facet with detailed metadata
+ from feast.openlineage.facets import FeastFeatureServiceFacet
+
+ fs_output_facets["feast_featureService"] = FeastFeatureServiceFacet(
+ name=fs.name,
+ feature_views=fv_names_in_fs,
+ feature_count=len(all_fs_features),
+ description=fs.description if fs.description else "",
+ owner=fs.owner if fs.owner else "",
+ tags=fs.tags if fs.tags else {},
+ logging_enabled=getattr(fs, "logging", None) is not None,
+ )
+
+ fs_output = OutputDataset(
+ namespace=namespace,
+ name=fs.name,
+ facets=fs_output_facets,
+ )
+
+ # Emit a job for this specific FeatureService
+ job_facets = {
+ "feast_project": FeastProjectFacet(
+ project_name=project,
+ )
+ }
+
+ result = self._client.emit_run_event(
+ job_name=f"feature_service_{fs.name}", # Prefix to avoid conflict with dataset
+ run_id=str(uuid.uuid4()),
+ event_type=RunState.COMPLETE,
+ inputs=fs_inputs,
+ outputs=[fs_output],
+ job_facets=job_facets,
+ namespace=namespace,
+ )
+ results.append(result)
+
+ return results
+
+ except Exception as e:
+ logger.error(f"Error emitting project lineage for {project}: {e}")
+ return [False]
+
+ def close(self):
+ """Close the underlying client."""
+ self._client.close()
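For orientation, the three materialization hooks above form a start/complete-or-fail lifecycle keyed by a shared `run_id`. The sketch below exercises that flow against an in-memory stand-in (the `StubEmitter` class and the `demo` project name are assumptions for illustration, not part of this change):

```python
import uuid
from datetime import datetime, timezone


class StubEmitter:
    """Stand-in for the emitter above: same method shapes, but the
    transport is replaced by an in-memory event list."""

    def __init__(self):
        self.events = []

    def emit_materialize_start(self, feature_views, start_date, end_date,
                               project, run_id=None):
        run_id = run_id or str(uuid.uuid4())
        self.events.append(("START", run_id))
        return run_id, True

    def emit_materialize_complete(self, run_id, feature_views, project,
                                  rows_written=None):
        self.events.append(("COMPLETE", run_id))
        return True

    def emit_materialize_fail(self, run_id, project, error_message=None):
        self.events.append(("FAIL", run_id))
        return True


emitter = StubEmitter()
run_id, ok = emitter.emit_materialize_start(
    [], None, datetime.now(timezone.utc), "demo"
)
try:
    rows_written = 42  # ... materialization work would happen here ...
    emitter.emit_materialize_complete(run_id, [], "demo", rows_written=rows_written)
except Exception as exc:
    emitter.emit_materialize_fail(run_id, "demo", error_message=str(exc))

print([kind for kind, _ in emitter.events])  # START then COMPLETE on success
```

The real integration pairs the events the same way: the `run_id` returned from the START event is threaded into the later COMPLETE or FAIL event so the backend can correlate them into one run.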
diff --git a/sdk/python/feast/openlineage/facets.py b/sdk/python/feast/openlineage/facets.py
new file mode 100644
index 00000000000..d350b74f0df
--- /dev/null
+++ b/sdk/python/feast/openlineage/facets.py
@@ -0,0 +1,281 @@
+# Copyright 2026 The Feast Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+Custom OpenLineage facets for Feast Feature Store.
+
+These facets extend the standard OpenLineage facets to capture Feast-specific
+metadata about feature views, feature services, data sources, and entities.
+"""
+
+from typing import Dict, List, Optional
+
+import attr
+
+try:
+ from openlineage.client.generated.base import DatasetFacet, JobFacet, RunFacet
+ from openlineage.client.utils import RedactMixin
+
+ OPENLINEAGE_AVAILABLE = True
+except ImportError:
+ # Provide stub classes when OpenLineage is not installed
+ OPENLINEAGE_AVAILABLE = False
+
+ class RedactMixin: # type: ignore[no-redef]
+ pass
+
+ @attr.define
+ class JobFacet: # type: ignore[no-redef]
+ _producer: str = attr.field(default="")
+ _schemaURL: str = attr.field(default="")
+
+ def __attrs_post_init__(self):
+ pass
+
+ @attr.define
+ class DatasetFacet: # type: ignore[no-redef]
+ _producer: str = attr.field(default="")
+ _schemaURL: str = attr.field(default="")
+        _deleted: Optional[bool] = attr.field(default=None)
+
+ def __attrs_post_init__(self):
+ pass
+
+ @attr.define
+ class RunFacet: # type: ignore[no-redef]
+ _producer: str = attr.field(default="")
+ _schemaURL: str = attr.field(default="")
+
+ def __attrs_post_init__(self):
+ pass
+
+
+# Schema URL base for Feast facets
+FEAST_FACET_SCHEMA_BASE = "https://feast.dev/spec/facets/1-0-0"
+
+
+@attr.define(kw_only=True)
+class FeastFeatureViewFacet(JobFacet):
+ """
+ Custom facet for Feast Feature View metadata.
+
+ This facet captures Feast-specific metadata about feature views including
+ TTL, entities, online/offline status, and transformation mode.
+
+ Attributes:
+ name: Feature view name
+ ttl_seconds: Time-to-live in seconds (0 means no TTL)
+ entities: List of entity names associated with the feature view
+ features: List of feature names in the feature view
+ online_enabled: Whether online retrieval is enabled
+ offline_enabled: Whether offline retrieval is enabled
+ mode: Transformation mode (PYTHON, PANDAS, RAY, SPARK, SQL, etc.)
+ description: Human-readable description
+ owner: Owner of the feature view
+ tags: Key-value tags
+ """
+
+ name: str = attr.field()
+ ttl_seconds: int = attr.field(default=0)
+ entities: List[str] = attr.field(factory=list)
+ features: List[str] = attr.field(factory=list)
+ online_enabled: bool = attr.field(default=True)
+ offline_enabled: bool = attr.field(default=False)
+ mode: Optional[str] = attr.field(default=None)
+ description: str = attr.field(default="")
+ owner: str = attr.field(default="")
+ tags: Dict[str, str] = attr.field(factory=dict)
+
+ @staticmethod
+ def _get_schema() -> str:
+ return f"{FEAST_FACET_SCHEMA_BASE}/FeastFeatureViewFacet.json"
+
+
+@attr.define(kw_only=True)
+class FeastFeatureServiceFacet(JobFacet):
+ """
+ Custom facet for Feast Feature Service metadata.
+
+ This facet captures metadata about feature services which aggregate
+ multiple feature views for serving.
+
+ Attributes:
+ name: Feature service name
+ feature_views: List of feature view names included in the service
+ feature_count: Total number of features in the service
+ description: Human-readable description
+ owner: Owner of the feature service
+ tags: Key-value tags
+ logging_enabled: Whether feature logging is enabled
+ """
+
+ name: str = attr.field()
+ feature_views: List[str] = attr.field(factory=list)
+ feature_count: int = attr.field(default=0)
+ description: str = attr.field(default="")
+ owner: str = attr.field(default="")
+ tags: Dict[str, str] = attr.field(factory=dict)
+ logging_enabled: bool = attr.field(default=False)
+
+ @staticmethod
+ def _get_schema() -> str:
+ return f"{FEAST_FACET_SCHEMA_BASE}/FeastFeatureServiceFacet.json"
+
+
+@attr.define(kw_only=True)
+class FeastDataSourceFacet(DatasetFacet):
+ """
+ Custom facet for Feast Data Source metadata.
+
+ This facet captures metadata about data sources including their type,
+ configuration, and field mappings.
+
+ Attributes:
+ name: Data source name
+ source_type: Type of data source (file, bigquery, snowflake, etc.)
+ timestamp_field: Name of the timestamp field
+ created_timestamp_field: Name of the created timestamp field
+ field_mapping: Mapping from source fields to feature names
+ description: Human-readable description
+ tags: Key-value tags
+ """
+
+ name: str = attr.field()
+ source_type: str = attr.field()
+ timestamp_field: Optional[str] = attr.field(default=None)
+ created_timestamp_field: Optional[str] = attr.field(default=None)
+ field_mapping: Dict[str, str] = attr.field(factory=dict)
+ description: str = attr.field(default="")
+ tags: Dict[str, str] = attr.field(factory=dict)
+
+ @staticmethod
+ def _get_schema() -> str:
+ return f"{FEAST_FACET_SCHEMA_BASE}/FeastDataSourceFacet.json"
+
+
+@attr.define(kw_only=True)
+class FeastEntityFacet(DatasetFacet):
+ """
+ Custom facet for Feast Entity metadata.
+
+ This facet captures metadata about entities which define the keys
+ for feature lookups.
+
+ Attributes:
+ name: Entity name
+ join_keys: List of join key column names
+ value_type: Data type of the entity
+ description: Human-readable description
+ owner: Owner of the entity
+ tags: Key-value tags
+ """
+
+ name: str = attr.field()
+ join_keys: List[str] = attr.field(factory=list)
+ value_type: str = attr.field(default="STRING")
+ description: str = attr.field(default="")
+ owner: str = attr.field(default="")
+ tags: Dict[str, str] = attr.field(factory=dict)
+
+ @staticmethod
+ def _get_schema() -> str:
+ return f"{FEAST_FACET_SCHEMA_BASE}/FeastEntityFacet.json"
+
+
+@attr.define(kw_only=True)
+class FeastMaterializationFacet(RunFacet):
+ """
+ Custom facet for Feast Materialization run metadata.
+
+ This facet captures information about feature materialization runs
+ including the time range, feature views being materialized, and statistics.
+
+ Attributes:
+ feature_views: List of feature view names being materialized
+ start_date: Start date of the materialization window
+ end_date: End date of the materialization window
+ project: Feast project name
+ rows_written: Number of rows written (if available)
+ online_store_type: Type of online store being written to
+ offline_store_type: Type of offline store being read from
+ """
+
+ feature_views: List[str] = attr.field(factory=list)
+ start_date: Optional[str] = attr.field(default=None)
+ end_date: Optional[str] = attr.field(default=None)
+ project: str = attr.field(default="")
+ rows_written: Optional[int] = attr.field(default=None)
+ online_store_type: str = attr.field(default="")
+ offline_store_type: str = attr.field(default="")
+
+ @staticmethod
+ def _get_schema() -> str:
+ return f"{FEAST_FACET_SCHEMA_BASE}/FeastMaterializationFacet.json"
+
+
+@attr.define(kw_only=True)
+class FeastRetrievalFacet(RunFacet):
+ """
+ Custom facet for Feast Feature Retrieval run metadata.
+
+ This facet captures information about feature retrieval operations
+ including whether it's online or historical, the feature service used,
+ and retrieval statistics.
+
+ Attributes:
+ retrieval_type: Type of retrieval (online, historical)
+ feature_service: Name of the feature service used (if any)
+ feature_views: List of feature view names queried
+ features: List of feature names retrieved
+ entity_count: Number of entities queried
+ full_feature_names: Whether full feature names were used
+ """
+
+ retrieval_type: str = attr.field() # "online" or "historical"
+ feature_service: Optional[str] = attr.field(default=None)
+ feature_views: List[str] = attr.field(factory=list)
+ features: List[str] = attr.field(factory=list)
+ entity_count: Optional[int] = attr.field(default=None)
+ full_feature_names: bool = attr.field(default=False)
+
+ @staticmethod
+ def _get_schema() -> str:
+ return f"{FEAST_FACET_SCHEMA_BASE}/FeastRetrievalFacet.json"
+
+
+@attr.define(kw_only=True)
+class FeastProjectFacet(JobFacet):
+ """
+ Custom facet for Feast Project metadata.
+
+ This facet captures information about the Feast project context
+ for lineage events.
+
+ Attributes:
+ project_name: Name of the Feast project
+ provider: Infrastructure provider (local, gcp, aws, etc.)
+ online_store_type: Type of online store
+ offline_store_type: Type of offline store
+ registry_type: Type of registry (file, sql, etc.)
+ """
+
+ project_name: str = attr.field()
+ provider: str = attr.field(default="local")
+ online_store_type: str = attr.field(default="")
+ offline_store_type: str = attr.field(default="")
+ registry_type: str = attr.field(default="file")
+
+ @staticmethod
+ def _get_schema() -> str:
+ return f"{FEAST_FACET_SCHEMA_BASE}/FeastProjectFacet.json"
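As a reference for how the attrs-based facets above serialize before emission, this dependency-free sketch mirrors `FeastMaterializationFacet`'s fields using stdlib `dataclasses` (substituted for `attrs` only so the sketch runs without extra packages; the class name is illustrative):

```python
from dataclasses import asdict, dataclass, field
from typing import List, Optional


@dataclass
class MaterializationFacetSketch:
    # Field names mirror FeastMaterializationFacet in facets.py.
    feature_views: List[str] = field(default_factory=list)
    start_date: Optional[str] = None
    end_date: Optional[str] = None
    project: str = ""
    rows_written: Optional[int] = None


facet = MaterializationFacetSketch(
    feature_views=["driver_hourly_stats"],
    project="demo",
    rows_written=100,
)
payload = asdict(facet)  # plain dict, ready to be embedded in a RunEvent
print(payload["feature_views"], payload["rows_written"])
```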
diff --git a/sdk/python/feast/openlineage/mappers.py b/sdk/python/feast/openlineage/mappers.py
new file mode 100644
index 00000000000..9e6fa8557a5
--- /dev/null
+++ b/sdk/python/feast/openlineage/mappers.py
@@ -0,0 +1,543 @@
+# Copyright 2026 The Feast Authors
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""
+Mapping utilities for converting Feast objects to OpenLineage objects.
+
+This module provides functions to map Feast entities like FeatureViews,
+FeatureServices, DataSources, and Entities to their OpenLineage equivalents.
+"""
+
+from typing import TYPE_CHECKING, Any, Dict, List, Tuple, Union
+
+if TYPE_CHECKING:
+ from feast import Entity, FeatureService, FeatureView
+ from feast.data_source import DataSource
+ from feast.field import Field
+ from feast.on_demand_feature_view import OnDemandFeatureView
+ from feast.stream_feature_view import StreamFeatureView
+
+try:
+ from openlineage.client.event_v2 import (
+ InputDataset,
+ Job,
+ OutputDataset,
+ )
+ from openlineage.client.facet_v2 import (
+ datasource_dataset,
+ documentation_dataset,
+ schema_dataset,
+ )
+
+ OPENLINEAGE_AVAILABLE = True
+except ImportError:
+ OPENLINEAGE_AVAILABLE = False
+
+
+def _check_openlineage_available():
+ """Check if OpenLineage is available and raise if not."""
+ if not OPENLINEAGE_AVAILABLE:
+ raise ImportError(
+ "OpenLineage is not installed. Please install it with: "
+ "pip install openlineage-python"
+ )
+
+
+def feast_field_to_schema_field(
+ field: "Field",
+) -> "schema_dataset.SchemaDatasetFacetFields":
+ """
+ Convert a Feast Field to an OpenLineage schema field.
+
+ Args:
+ field: Feast Field object
+
+ Returns:
+ OpenLineage SchemaDatasetFacetFields object
+ """
+ _check_openlineage_available()
+
+ # Build description with tags
+ description_parts = []
+
+ # Add description if present
+ if hasattr(field, "description") and field.description:
+ description_parts.append(field.description)
+
+ # Add tags if present
+ if hasattr(field, "tags") and field.tags:
+ tags_str = ", ".join(f"{k}={v}" for k, v in field.tags.items())
+ description_parts.append(f"[Tags: {tags_str}]")
+
+ description = " ".join(description_parts) if description_parts else None
+
+ return schema_dataset.SchemaDatasetFacetFields(
+ name=field.name,
+ type=str(field.dtype) if field.dtype else None,
+ description=description,
+ )
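The description assembly in `feast_field_to_schema_field` folds field tags into the description string; it can be sketched standalone (the helper name below is illustrative, not part of this change):

```python
def build_field_description(description, tags):
    # Mirrors the convention above: description first, then tags rendered
    # as "[Tags: k=v, ...]", joined with a space; None when both are empty.
    parts = []
    if description:
        parts.append(description)
    if tags:
        tags_str = ", ".join(f"{k}={v}" for k, v in tags.items())
        parts.append(f"[Tags: {tags_str}]")
    return " ".join(parts) if parts else None


print(build_field_description("Average driver rating", {"team": "rides"}))
# -> Average driver rating [Tags: team=rides]
```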
+
+
+def data_source_to_dataset(
+ data_source: "DataSource",
+ namespace: str = "feast",
+ as_input: bool = True,
+) -> Any:
+ """
+ Convert a Feast DataSource to an OpenLineage Dataset.
+
+ Args:
+ data_source: Feast DataSource object
+ namespace: OpenLineage namespace
+ as_input: Whether to create an InputDataset (True) or OutputDataset (False)
+
+ Returns:
+ OpenLineage InputDataset or OutputDataset object
+ """
+ _check_openlineage_available()
+
+ from feast.openlineage.facets import FeastDataSourceFacet
+
+ # Determine source type and name
+ source_type = type(data_source).__name__
+ source_name = data_source.name if data_source.name else f"unnamed_{source_type}"
+
+ # Build namespace based on source type
+ dataset_namespace = _get_data_source_namespace(data_source, namespace)
+
+ # Build facets
+ facets: Dict[str, Any] = {}
+
+ # Add datasource facet
+ facets["dataSource"] = datasource_dataset.DatasourceDatasetFacet(
+ name=source_name,
+ uri=_get_data_source_uri(data_source),
+ )
+
+ # Add Feast-specific facet
+ facets["feast_dataSource"] = FeastDataSourceFacet(
+ name=source_name,
+ source_type=source_type,
+ timestamp_field=data_source.timestamp_field
+ if hasattr(data_source, "timestamp_field")
+ else None,
+ created_timestamp_field=data_source.created_timestamp_column
+ if hasattr(data_source, "created_timestamp_column")
+ else None,
+ field_mapping=data_source.field_mapping
+ if hasattr(data_source, "field_mapping")
+ else {},
+ description=data_source.description
+ if hasattr(data_source, "description")
+ else "",
+ tags=data_source.tags if hasattr(data_source, "tags") else {},
+ )
+
+ # Add documentation if available
+ if hasattr(data_source, "description") and data_source.description:
+ facets["documentation"] = documentation_dataset.DocumentationDatasetFacet(
+ description=data_source.description
+ )
+
+ if as_input:
+ return InputDataset(
+ namespace=dataset_namespace,
+ name=source_name,
+ facets=facets,
+ )
+ else:
+ return OutputDataset(
+ namespace=dataset_namespace,
+ name=source_name,
+ facets=facets,
+ )
+
+
+def _get_data_source_namespace(
+ data_source: "DataSource", default_namespace: str
+) -> str:
+ """
+ Get the OpenLineage namespace for a data source.
+
+ Uses the same namespace as other Feast objects to ensure proper
+ lineage connections in the graph.
+
+ Args:
+ data_source: Feast DataSource
+ default_namespace: Default namespace to use
+
+ Returns:
+ Namespace string
+ """
+ # Use consistent namespace to ensure lineage graph connects properly
+ return default_namespace
+
+
+def _get_data_source_uri(data_source: "DataSource") -> str:
+ """
+ Get the URI for a data source.
+
+ Args:
+ data_source: Feast DataSource
+
+ Returns:
+ URI string representing the data source location
+ """
+ if hasattr(data_source, "path") and data_source.path:
+ return data_source.path
+ elif hasattr(data_source, "table") and data_source.table:
+ return f"table://{data_source.table}"
+    elif hasattr(data_source, "query") and data_source.query:
+        # hash() is randomized per process for strings, so the same query would
+        # map to a different URI on every run; use a stable digest instead.
+        import hashlib
+
+        digest = hashlib.sha256(data_source.query.encode("utf-8")).hexdigest()[:16]
+        return f"query://{digest}"
+ else:
+ return f"feast://{data_source.name if data_source.name else 'unnamed'}"
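The fallback order above (path, then table, then query, then a `feast://` placeholder) can be sketched standalone. A sha256 digest stands in for Python's `hash()` here because string hashing is randomized per process; the helper name and keyword interface are assumptions for illustration:

```python
import hashlib


def source_uri(path=None, table=None, query=None, name=None):
    # Same precedence as _get_data_source_uri: path, then table, then a
    # digest of the query text, then a feast:// placeholder.
    if path:
        return path
    if table:
        return f"table://{table}"
    if query:
        return f"query://{hashlib.sha256(query.encode('utf-8')).hexdigest()[:16]}"
    return f"feast://{name or 'unnamed'}"


print(source_uri(table="project.dataset.driver_stats"))
# -> table://project.dataset.driver_stats
```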
+
+
+def feature_view_to_job(
+ feature_view: "FeatureView",
+ namespace: str = "feast",
+ include_schema: bool = True,
+) -> Tuple["Job", List["InputDataset"], List["OutputDataset"]]:
+ """
+ Convert a Feast FeatureView to an OpenLineage Job with inputs/outputs.
+
+ A FeatureView represents a transformation from data sources to features,
+ so it maps to an OpenLineage Job with:
+ - Inputs: The batch and stream sources
+ - Outputs: The feature view itself (as a logical dataset)
+
+ Args:
+ feature_view: Feast FeatureView object
+ namespace: OpenLineage namespace
+ include_schema: Whether to include schema information
+
+ Returns:
+ Tuple of (Job, list of InputDatasets, list of OutputDatasets)
+ """
+ _check_openlineage_available()
+
+ from feast.openlineage.facets import FeastFeatureViewFacet
+
+ # Create job facets
+ job_facets: Dict[str, Any] = {}
+
+ # Add Feast-specific facet
+ ttl_seconds = 0
+ if feature_view.ttl:
+ ttl_seconds = int(feature_view.ttl.total_seconds())
+
+ job_facets["feast_featureView"] = FeastFeatureViewFacet(
+ name=feature_view.name,
+ ttl_seconds=ttl_seconds,
+ entities=feature_view.entities if feature_view.entities else [],
+ features=[f.name for f in feature_view.features]
+ if feature_view.features
+ else [],
+ online_enabled=feature_view.online,
+ offline_enabled=getattr(feature_view, "offline", False),
+ mode=str(feature_view.mode)
+ if hasattr(feature_view, "mode") and feature_view.mode
+ else None,
+ description=feature_view.description if feature_view.description else "",
+ owner=feature_view.owner if feature_view.owner else "",
+ tags=feature_view.tags if feature_view.tags else {},
+ )
+
+ # Add documentation (this facet attaches to a Job, so use the job
+ # variant rather than DocumentationDatasetFacet)
+ if feature_view.description:
+ from openlineage.client.facet_v2 import documentation_job
+
+ job_facets["documentation"] = documentation_job.DocumentationJobFacet(
+ description=feature_view.description
+ )
+
+ # Create job
+ job = Job(
+ namespace=namespace,
+ name=f"feature_view_{feature_view.name}",
+ facets=job_facets,
+ )
+
+ # Create input datasets from sources
+ inputs: List[InputDataset] = []
+
+ # Add data sources as inputs
+ if hasattr(feature_view, "batch_source") and feature_view.batch_source:
+ inputs.append(
+ data_source_to_dataset(
+ feature_view.batch_source, namespace=namespace, as_input=True
+ )
+ )
+
+ if hasattr(feature_view, "stream_source") and feature_view.stream_source:
+ inputs.append(
+ data_source_to_dataset(
+ feature_view.stream_source, namespace=namespace, as_input=True
+ )
+ )
+
+ # Add entities as inputs (they appear as nodes in lineage)
+ if feature_view.entities:
+ for entity_name in feature_view.entities:
+ if entity_name and entity_name != "__dummy":
+ inputs.append(
+ InputDataset(
+ namespace=namespace,
+ name=entity_name,
+ )
+ )
+
+ # Create output dataset (the feature view itself as a logical dataset)
+ output_facets: Dict[str, Any] = {}
+
+ if include_schema and feature_view.features:
+ output_facets["schema"] = schema_dataset.SchemaDatasetFacet(
+ fields=[feast_field_to_schema_field(f) for f in feature_view.features]
+ )
+
+ outputs = [
+ OutputDataset(
+ namespace=namespace,
+ name=feature_view.name,
+ facets=output_facets,
+ )
+ ]
+
+ return job, inputs, outputs
+
+
+def feature_service_to_job(
+ feature_service: "FeatureService",
+ feature_views: List[
+ Union["FeatureView", "OnDemandFeatureView", "StreamFeatureView"]
+ ],
+ namespace: str = "feast",
+) -> Tuple["Job", List["InputDataset"], List["OutputDataset"]]:
+ """
+ Convert a Feast FeatureService to an OpenLineage Job with inputs/outputs.
+
+ A FeatureService aggregates multiple feature views, so it maps to an
+ OpenLineage Job with:
+ - Inputs: The feature views it consumes
+ - Outputs: The aggregated feature set
+
+ Args:
+ feature_service: Feast FeatureService object
+ feature_views: List of all available feature views (FeatureView,
+ OnDemandFeatureView, StreamFeatureView)
+ namespace: OpenLineage namespace
+
+ Returns:
+ Tuple of (Job, list of InputDatasets, list of OutputDatasets)
+ """
+ _check_openlineage_available()
+
+ from feast.openlineage.facets import FeastFeatureServiceFacet
+
+ # Create job facets
+ job_facets: Dict[str, Any] = {}
+
+ # Get feature view names
+ fv_names = [proj.name for proj in feature_service.feature_view_projections]
+
+ # Build a lookup map for feature views by name
+ fv_by_name = {fv.name: fv for fv in feature_views}
+
+ # Count total features
+ # When proj.features is empty/None, it means "all features from that feature view"
+ # In that case, look up the actual feature view to get the real count
+ total_features = 0
+ for proj in feature_service.feature_view_projections:
+ if proj.features:
+ total_features += len(proj.features)
+ elif proj.name in fv_by_name:
+ fv = fv_by_name[proj.name]
+ if hasattr(fv, "features") and fv.features:
+ total_features += len(fv.features)
+
+ # Add Feast-specific facet
+ job_facets["feast_featureService"] = FeastFeatureServiceFacet(
+ name=feature_service.name,
+ feature_views=fv_names,
+ feature_count=total_features,
+ description=feature_service.description if feature_service.description else "",
+ owner=feature_service.owner if feature_service.owner else "",
+ tags=feature_service.tags if feature_service.tags else {},
+ logging_enabled=getattr(feature_service, "logging", None) is not None,
+ )
+
+ # Add documentation (this facet attaches to a Job, so use the job
+ # variant rather than DocumentationDatasetFacet)
+ if feature_service.description:
+ from openlineage.client.facet_v2 import documentation_job
+
+ job_facets["documentation"] = documentation_job.DocumentationJobFacet(
+ description=feature_service.description
+ )
+
+ # Create job
+ job = Job(
+ namespace=namespace,
+ name=f"feature_service_{feature_service.name}",
+ facets=job_facets,
+ )
+
+ # Create input datasets from the feature views this service references.
+ # The feature_views argument may contain every view in the registry, so
+ # filter to the names in the service's projections instead of taking all.
+ inputs: List[InputDataset] = []
+ all_features = []
+
+ for fv_name in fv_names:
+ fv = fv_by_name.get(fv_name)
+ if fv is None:
+ continue
+ input_facets: Dict[str, Any] = {}
+ if fv.features:
+ input_facets["schema"] = schema_dataset.SchemaDatasetFacet(
+ fields=[feast_field_to_schema_field(f) for f in fv.features]
+ )
+ all_features.extend(fv.features)
+
+ inputs.append(
+ InputDataset(
+ namespace=namespace,
+ name=fv.name,
+ facets=input_facets,
+ )
+ )
+
+ # Create output dataset (the feature service as a logical aggregation)
+ output_facets: Dict[str, Any] = {}
+ if all_features:
+ output_facets["schema"] = schema_dataset.SchemaDatasetFacet(
+ fields=[feast_field_to_schema_field(f) for f in all_features]
+ )
+
+ outputs = [
+ OutputDataset(
+ namespace=namespace,
+ name=feature_service.name,
+ facets=output_facets,
+ )
+ ]
+
+ return job, inputs, outputs
+
+
+def entity_to_dataset(
+ entity: "Entity",
+ namespace: str = "feast",
+) -> "InputDataset":
+ """
+ Convert a Feast Entity to an OpenLineage InputDataset.
+
+ Entities define the keys for feature lookups and can be represented
+ as datasets with schema information.
+
+ Args:
+ entity: Feast Entity object
+ namespace: OpenLineage namespace
+
+ Returns:
+ OpenLineage InputDataset object
+ """
+ _check_openlineage_available()
+
+ from feast.openlineage.facets import FeastEntityFacet
+
+ facets: Dict[str, Any] = {}
+
+ # Add entity facet
+ facets["feast_entity"] = FeastEntityFacet(
+ name=entity.name,
+ join_keys=[entity.join_key] if entity.join_key else [],
+ value_type=str(entity.value_type) if entity.value_type else "STRING",
+ description=entity.description if entity.description else "",
+ owner=entity.owner if hasattr(entity, "owner") and entity.owner else "",
+ tags=entity.tags if entity.tags else {},
+ )
+
+ # Add schema for join keys
+ if entity.join_key:
+ facets["schema"] = schema_dataset.SchemaDatasetFacet(
+ fields=[
+ schema_dataset.SchemaDatasetFacetFields(
+ name=entity.join_key,
+ type=str(entity.value_type) if entity.value_type else "STRING",
+ )
+ ]
+ )
+
+ # Add documentation
+ if entity.description:
+ facets["documentation"] = documentation_dataset.DocumentationDatasetFacet(
+ description=entity.description
+ )
+
+ return InputDataset(
+ namespace=namespace,
+ name=entity.name,
+ facets=facets,
+ )
+
+
+def online_store_to_dataset(
+ store_type: str,
+ feature_view_name: str,
+ namespace: str = "feast",
+) -> "OutputDataset":
+ """
+ Create an OpenLineage OutputDataset for an online store.
+
+ Args:
+ store_type: Type of online store (redis, sqlite, dynamodb, etc.)
+ feature_view_name: Name of the feature view being stored
+ namespace: OpenLineage namespace
+
+ Returns:
+ OpenLineage OutputDataset object
+ """
+ _check_openlineage_available()
+
+ return OutputDataset(
+ namespace=namespace,
+ name=f"online_store_{feature_view_name}",
+ facets={
+ "dataSource": datasource_dataset.DatasourceDatasetFacet(
+ name=f"{store_type}_online_store",
+ uri=f"{store_type}://feast/{feature_view_name}",
+ )
+ },
+ )
+
+
+def offline_store_to_dataset(
+ store_type: str,
+ feature_view_name: str,
+ namespace: str = "feast",
+) -> "InputDataset":
+ """
+ Create an OpenLineage InputDataset for an offline store.
+
+ Args:
+ store_type: Type of offline store (file, bigquery, snowflake, etc.)
+ feature_view_name: Name of the feature view being read
+ namespace: OpenLineage namespace
+
+ Returns:
+ OpenLineage InputDataset object
+ """
+ _check_openlineage_available()
+
+ return InputDataset(
+ namespace=f"{namespace}/offline_store/{store_type}",
+ name=feature_view_name,
+ facets={
+ "dataSource": datasource_dataset.DatasourceDatasetFacet(
+ name=f"{store_type}_offline_store",
+ uri=f"{store_type}://feast/{feature_view_name}",
+ )
+ },
+ )
diff --git a/sdk/python/feast/repo_config.py b/sdk/python/feast/repo_config.py
index 318ca324cd6..72cd46ba0ab 100644
--- a/sdk/python/feast/repo_config.py
+++ b/sdk/python/feast/repo_config.py
@@ -191,6 +191,65 @@ class MaterializationConfig(BaseModel):
If false, feature retrieval jobs will pull all feature values within the specified time range. """
+class OpenLineageConfig(FeastBaseModel):
+ """Configuration for OpenLineage integration.
+
+ This enables automatic data lineage tracking for Feast operations like
+ materialization, feature retrieval, and registry changes.
+
+ Example configuration in feature_store.yaml:
+ openlineage:
+ enabled: true
+ transport_type: http
+ transport_url: http://localhost:5000
+ transport_endpoint: api/v1/lineage
+ namespace: feast
+ """
+
+ enabled: StrictBool = False
+ """ bool: Whether OpenLineage integration is enabled. Defaults to False. """
+
+ transport_type: StrictStr = "console"
+ """ str: Type of transport (http, console, file, kafka). Defaults to console. """
+
+ transport_url: Optional[StrictStr] = None
+ """ str: URL for HTTP transport. Required when transport_type is 'http'. """
+
+ transport_endpoint: StrictStr = "api/v1/lineage"
+ """ str: API endpoint for HTTP transport. Defaults to 'api/v1/lineage'. """
+
+ api_key: Optional[StrictStr] = None
+ """ str: Optional API key for authentication with the lineage server. """
+
+ namespace: StrictStr = "feast"
+ """ str: Default namespace for Feast jobs and datasets. """
+
+ producer: StrictStr = "feast"
+ """ str: Producer identifier for OpenLineage events. """
+
+ emit_on_apply: StrictBool = True
+ """ bool: Emit lineage events when 'feast apply' is called. """
+
+ emit_on_materialize: StrictBool = True
+ """ bool: Emit lineage events during materialization. """
+
+ def to_openlineage_config(self):
+ """Convert to feast.openlineage.OpenLineageConfig."""
+ from feast.openlineage.config import OpenLineageConfig as OLConfig
+
+ return OLConfig(
+ enabled=self.enabled,
+ transport_type=self.transport_type,
+ transport_url=self.transport_url,
+ transport_endpoint=self.transport_endpoint,
+ api_key=self.api_key,
+ namespace=self.namespace,
+ producer=self.producer,
+ emit_on_apply=self.emit_on_apply,
+ emit_on_materialize=self.emit_on_materialize,
+ )
+
+
class RepoConfig(FeastBaseModel):
"""Repo config. Typically loaded from `feature_store.yaml`"""
@@ -253,6 +312,9 @@ class RepoConfig(FeastBaseModel):
)
""" MaterializationConfig: Configuration options for feature materialization behavior. """
+ openlineage_config: Optional[OpenLineageConfig] = Field(None, alias="openlineage")
+ """ OpenLineageConfig: Configuration for OpenLineage data lineage integration (optional). """
+
def __init__(self, **data: Any):
super().__init__(**data)
@@ -288,6 +350,11 @@ def __init__(self, **data: Any):
self.feature_server["type"]
)(**self.feature_server)
+ # Cache slot for the parsed OpenLineage configuration (see the
+ # `openlineage` property). The "openlineage" alias on the field already
+ # populates self.openlineage_config, so no manual reassignment is needed
+ # (assigning data["openlineage"] back would overwrite the parsed model
+ # with the raw dict).
+ self._openlineage: Optional[OpenLineageConfig] = None
+
if self.entity_key_serialization_version < 3:
warnings.warn(
"The serialization version below 3 are deprecated. "
@@ -391,6 +458,16 @@ def batch_engine(self):
return self._batch_engine
+ @property
+ def openlineage(self) -> Optional[OpenLineageConfig]:
+ """Get the OpenLineage configuration."""
+ if not self._openlineage:
+ if isinstance(self.openlineage_config, Dict):
+ self._openlineage = OpenLineageConfig(**self.openlineage_config)
+ elif self.openlineage_config:
+ self._openlineage = self.openlineage_config
+ return self._openlineage
+
@model_validator(mode="before")
def _validate_auth_config(cls, values: Any) -> Any:
from feast.permissions.auth_model import AuthConfig
diff --git a/sdk/python/requirements/py3.10-ci-requirements.txt b/sdk/python/requirements/py3.10-ci-requirements.txt
index a88b29f085b..c86a8fcf0ea 100644
--- a/sdk/python/requirements/py3.10-ci-requirements.txt
+++ b/sdk/python/requirements/py3.10-ci-requirements.txt
@@ -249,6 +249,7 @@ attrs==25.4.0 \
# aiohttp
# jsonlines
# jsonschema
+ # openlineage-python
# referencing
azure-core==1.38.0 \
--hash=sha256:8194d2682245a3e4e3151a667c686464c3786fed7918b394d035bdcd61bb5993 \
@@ -306,6 +307,7 @@ build==1.4.0 \
--hash=sha256:f1b91b925aa322be454f8330c6fb48b465da993d1e7e7e6fa35027ec49f3c936
# via
# feast (setup.py)
+ # openlineage-python
# pip-tools
# singlestoredb
cassandra-driver==3.29.3 \
@@ -929,16 +931,16 @@ docling==2.27.0 \
--hash=sha256:1288ed75b27e33bf94daff34faffc6d11b7d7ccc13e3df84fb24adad3991f72d \
--hash=sha256:faba35662612a2c687a3a463e501d95f645316436084af92a0442ce162429a3d
# via feast (setup.py)
-docling-core[chunking]==2.60.1 \
- --hash=sha256:45390e50cb4d83a70e2384c70a46e6e64acb15e69674d9d2c67315155f252aef \
- --hash=sha256:64bd71dee243bd11b25f216fec219e046a130b851b8e1d0c0dd362a4aac0e994
+docling-core[chunking]==2.60.2 \
+ --hash=sha256:63aee783f06240455c12c30e9af383b80d7ade80c896f81d68a4aff6cde2e2a1 \
+ --hash=sha256:7a99e1671e796e39d0c735b7ae3833766a97ad287e15d434dfa417917e3b0e6d
# via
# docling
# docling-ibm-models
# docling-parse
-docling-ibm-models==3.10.3 \
- --hash=sha256:6be756e45df155a367087b93e0e5f2d65905e7e81a5f57c1d3ae57096631655a \
- --hash=sha256:e034d1398c99059998da18e38ef80af8a5d975f04de17f6e93efa075fb29cac4
+docling-ibm-models==3.11.0 \
+ --hash=sha256:454401563a8e79cb33b718bc559d9bacca8a0183583e48f8e616c9184c1f5eb1 \
+ --hash=sha256:68f7961069d643bfdab21b1c9ef24a979db293496f4c2283d95b1025a9ac5347
# via docling
docling-parse==4.7.3 \
--hash=sha256:1790e7e4ae202d67875c1c48fd6f8ef5c51d10b0c23157e4989b8673f2f31308 \
@@ -1526,9 +1528,9 @@ grpcio-tools==1.62.3 \
--hash=sha256:f4b1615adf67bd8bb71f3464146a6f9949972d06d21a4f5e87e73f6464d97f57 \
--hash=sha256:f6831fdec2b853c9daa3358535c55eed3694325889aa714070528cf8f92d7d6d
# via feast (setup.py)
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
@@ -1732,6 +1734,7 @@ httpx[http2]==0.27.2 \
# fastapi-mcp
# jupyterlab
# mcp
+ # openlineage-python
# python-keycloak
# qdrant-client
httpx-sse==0.4.3 \
@@ -1928,9 +1931,9 @@ jupyter-server-terminals==0.5.4 \
--hash=sha256:55be353fc74a80bc7f3b20e6be50a55a61cd525626f578dcb66a5708e2007d14 \
--hash=sha256:bbda128ed41d0be9020349f9f1f2a4ab9952a73ed5f5ac9f1419794761fb87f5
# via jupyter-server
-jupyterlab==4.5.2 \
- --hash=sha256:76466ebcfdb7a9bb7e2fbd6459c0e2c032ccf75be673634a84bee4b3e6b13ab6 \
- --hash=sha256:c80a6b9f6dace96a566d590c65ee2785f61e7cd4aac5b4d453dcc7d0d5e069b7
+jupyterlab==4.5.3 \
+ --hash=sha256:4a159f71067cb38e4a82e86a42de8e7e926f384d7f2291964f282282096d27e8 \
+ --hash=sha256:63c9f3a48de72ba00df766ad6eed416394f5bb883829f11eeff0872302520ba7
# via notebook
jupyterlab-pygments==0.3.0 \
--hash=sha256:721aca4d9029252b11cfa9d185e5b5af4d54772bb8072f9b7036f4170054d35d \
@@ -2889,6 +2892,9 @@ opencv-python-headless==4.13.0.90 \
--hash=sha256:eba38bc255d0b7d1969c5bcc90a060ca2b61a3403b613872c750bfa5dfe9e03b \
--hash=sha256:f46b17ea0aa7e4124ca6ad71143f89233ae9557f61d2326bcdb34329a1ddf9bd
# via easyocr
+openlineage-python==1.43.0 \
+ --hash=sha256:595dc641f696d0a1c021440a9ff8155f4e2776452cf118112a09b12cf4038827
+ # via feast (setup.py)
openpyxl==3.1.5 \
--hash=sha256:5282c12b107bffeef825f4617dc029afaf41d0ea60823bbb665ef3079dc79de2 \
--hash=sha256:cf0e3cf56142039133628b5acffe8ef0c12bc902d2aadd3e0fe5878dc08d1050
@@ -3011,6 +3017,7 @@ packaging==26.0 \
# lazy-loader
# marshmallow
# nbconvert
+ # openlineage-python
# pandas-gbq
# pytest
# ray
@@ -4268,6 +4275,7 @@ python-dateutil==2.9.0 \
# jupyter-client
# kubernetes
# moto
+ # openlineage-python
# pandas
# trino
python-docx==1.2.0 \
@@ -4391,6 +4399,7 @@ pyyaml==6.0.3 \
# huggingface-hub
# jupyter-events
# kubernetes
+ # openlineage-python
# pre-commit
# ray
# responses
@@ -4681,6 +4690,7 @@ requests==2.32.5 \
# kubernetes
# moto
# msal
+ # openlineage-python
# python-keycloak
# ray
# requests-oauthlib
@@ -5700,66 +5710,75 @@ transformers==4.57.6 \
# feast (setup.py)
# docling-core
# docling-ibm-models
-tree-sitter==0.24.0 \
- --hash=sha256:01ea01a7003b88b92f7f875da6ba9d5d741e0c84bb1bd92c503c0eecd0ee6409 \
- --hash=sha256:033506c1bc2ba7bd559b23a6bdbeaf1127cee3c68a094b82396718596dfe98bc \
- --hash=sha256:098a81df9f89cf254d92c1cd0660a838593f85d7505b28249216661d87adde4a \
- --hash=sha256:0b26bf9e958da6eb7e74a081aab9d9c7d05f9baeaa830dbb67481898fd16f1f5 \
- --hash=sha256:0d4a6416ed421c4210f0ca405a4834d5ccfbb8ad6692d4d74f7773ef68f92071 \
- --hash=sha256:14beeff5f11e223c37be7d5d119819880601a80d0399abe8c738ae2288804afc \
- --hash=sha256:23641bd25dcd4bb0b6fa91b8fb3f46cc9f1c9f475efe4d536d3f1f688d1b84c8 \
- --hash=sha256:24a8dd03b0d6b8812425f3b84d2f4763322684e38baf74e5bb766128b5633dc7 \
- --hash=sha256:26a5b130f70d5925d67b47db314da209063664585a2fd36fa69e0717738efaf4 \
- --hash=sha256:2a84ff87a2f2a008867a1064aba510ab3bd608e3e0cd6e8fef0379efee266c73 \
- --hash=sha256:3b1f3cbd9700e1fba0be2e7d801527e37c49fc02dc140714669144ef6ab58dce \
- --hash=sha256:464fa5b2cac63608915a9de8a6efd67a4da1929e603ea86abaeae2cb1fe89921 \
- --hash=sha256:4ddb113e6b8b3e3b199695b1492a47d87d06c538e63050823d90ef13cac585fd \
- --hash=sha256:57277a12fbcefb1c8b206186068d456c600dbfbc3fd6c76968ee22614c5cd5ad \
- --hash=sha256:5fc5c3c26d83c9d0ecb4fc4304fba35f034b7761d35286b936c1db1217558b4e \
- --hash=sha256:772e1bd8c0931c866b848d0369b32218ac97c24b04790ec4b0e409901945dd8e \
- --hash=sha256:7d5d9537507e1c8c5fa9935b34f320bfec4114d675e028f3ad94f11cf9db37b9 \
- --hash=sha256:a7c9c89666dea2ce2b2bf98e75f429d2876c569fab966afefdcd71974c6d8538 \
- --hash=sha256:abd95af65ca2f4f7eca356343391ed669e764f37748b5352946f00f7fc78e734 \
- --hash=sha256:c012e4c345c57a95d92ab5a890c637aaa51ab3b7ff25ed7069834b1087361c95 \
- --hash=sha256:d25fa22766d63f73716c6fec1a31ee5cf904aa429484256bd5fdf5259051ed74 \
- --hash=sha256:de0fb7c18c6068cacff46250c0a0473e8fc74d673e3e86555f131c2c1346fb13 \
- --hash=sha256:e0992d483677e71d5c5d37f30dfb2e3afec2f932a9c53eec4fca13869b788c6c \
- --hash=sha256:f3f00feff1fc47a8e4863561b8da8f5e023d382dd31ed3e43cd11d4cae445445 \
- --hash=sha256:f3f08a2ca9f600b3758792ba2406971665ffbad810847398d180c48cee174ee2 \
- --hash=sha256:f58bb4956917715ec4d5a28681829a8dad5c342cafd4aea269f9132a83ca9b34 \
- --hash=sha256:f733a83d8355fc95561582b66bbea92ffd365c5d7a665bc9ebd25e049c2b2abb \
- --hash=sha256:f9691be48d98c49ef8f498460278884c666b44129222ed6217477dffad5d4831 \
- --hash=sha256:f9e8b1605ab60ed43803100f067eed71b0b0e6c1fb9860a262727dbfbbb74751
+tree-sitter==0.25.2 \
+ --hash=sha256:0628671f0de69bb279558ef6b640bcfc97864fe0026d840f872728a86cd6b6cd \
+ --hash=sha256:0c8b6682cac77e37cfe5cf7ec388844957f48b7bd8d6321d0ca2d852994e10d5 \
+ --hash=sha256:1799609636c0193e16c38f366bda5af15b1ce476df79ddaae7dd274df9e44266 \
+ --hash=sha256:20b570690f87f1da424cd690e51cc56728d21d63f4abd4b326d382a30353acc7 \
+ --hash=sha256:260586381b23be33b6191a07cea3d44ecbd6c01aa4c6b027a0439145fcbc3358 \
+ --hash=sha256:3e65ae456ad0d210ee71a89ee112ac7e72e6c2e5aac1b95846ecc7afa68a194c \
+ --hash=sha256:44488e0e78146f87baaa009736886516779253d6d6bac3ef636ede72bc6a8234 \
+ --hash=sha256:463c032bd02052d934daa5f45d183e0521ceb783c2548501cf034b0beba92c9b \
+ --hash=sha256:4973b718fcadfb04e59e746abfbb0288694159c6aeecd2add59320c03368c721 \
+ --hash=sha256:49ee3c348caa459244ec437ccc7ff3831f35977d143f65311572b8ba0a5f265f \
+ --hash=sha256:56ac6602c7d09c2c507c55e58dc7026b8988e0475bd0002f8a386cce5e8e8adc \
+ --hash=sha256:65d3c931013ea798b502782acab986bbf47ba2c452610ab0776cf4a8ef150fc0 \
+ --hash=sha256:6d0302550bbe4620a5dc7649517c4409d74ef18558276ce758419cf09e578897 \
+ --hash=sha256:72a510931c3c25f134aac2daf4eb4feca99ffe37a35896d7150e50ac3eee06c7 \
+ --hash=sha256:7712335855b2307a21ae86efe949c76be36c6068d76df34faa27ce9ee40ff444 \
+ --hash=sha256:7d2ee1acbacebe50ba0f85fff1bc05e65d877958f00880f49f9b2af38dce1af0 \
+ --hash=sha256:a0ec41b895da717bc218a42a3a7a0bfcfe9a213d7afaa4255353901e0e21f696 \
+ --hash=sha256:a925364eb7fbb9cdce55a9868f7525a1905af512a559303bd54ef468fd88cb37 \
+ --hash=sha256:b3d11a3a3ac89bb8a2543d75597f905a9926f9c806f40fcca8242922d1cc6ad5 \
+ --hash=sha256:b3f63a1796886249bd22c559a5944d64d05d43f2be72961624278eff0dcc5cb8 \
+ --hash=sha256:b43a9e4c89d4d0839de27cd4d6902d33396de700e9ff4c5ab7631f277a85ead9 \
+ --hash=sha256:b878e296e63661c8e124177cc3084b041ba3f5936b43076d57c487822426f614 \
+ --hash=sha256:b8ca72d841215b6573ed0655b3a5cd1133f9b69a6fa561aecad40dca9029d75b \
+ --hash=sha256:b8d4429954a3beb3e844e2872610d2a4800ba4eb42bb1990c6a4b1949b18459f \
+ --hash=sha256:bd88fbb0f6c3a0f28f0a68d72df88e9755cf5215bae146f5a1bdc8362b772053 \
+ --hash=sha256:bda059af9d621918efb813b22fb06b3fe00c3e94079c6143fcb2c565eb44cb87 \
+ --hash=sha256:c0c0ab5f94938a23fe81928a21cc0fac44143133ccc4eb7eeb1b92f84748331c \
+ --hash=sha256:c2f8e7d6b2f8489d4a9885e3adcaef4bc5ff0a275acd990f120e29c4ab3395c5 \
+ --hash=sha256:cc0351cfe5022cec5a77645f647f92a936b38850346ed3f6d6babfbeeeca4d26 \
+ --hash=sha256:d77605e0d353ba3fe5627e5490f0fbfe44141bafa4478d88ef7954a61a848dae \
+ --hash=sha256:dd12d80d91d4114ca097626eb82714618dcdfacd6a5e0955216c6485c350ef99 \
+ --hash=sha256:ddabfff809ffc983fc9963455ba1cecc90295803e06e140a4c83e94c1fa3d960 \
+ --hash=sha256:eac4e8e4c7060c75f395feec46421eb61212cb73998dbe004b7384724f3682ab \
+ --hash=sha256:f5ddcd3e291a749b62521f71fc953f66f5fd9743973fd6dd962b092773569601 \
+ --hash=sha256:fbb1706407c0e451c4f8cc016fec27d72d4b211fdd3173320b1ada7a6c74c3ac \
+ --hash=sha256:fe43c158555da46723b28b52e058ad444195afd1db3ca7720c59a254544e9c20
# via docling-core
-tree-sitter-c==0.23.4 \
- --hash=sha256:013403e74765d74e523f380f9df8f3d99e9fe94132a3fc0c8b29cba538a7b2bf \
- --hash=sha256:2c92c0571b36b6da06f8882f34151dc11e67a493e9101cc0026a16da27709c05 \
- --hash=sha256:5e42a3519825ca59c91b2b7aec08dd3c89e02690c7b315d54a1e1743f9be3f15 \
- --hash=sha256:9215c7888dd019038f162ea5646178f6e129cd2b49fc506d14becf5e426121d7 \
- --hash=sha256:98c285a23bf4fb6fb34140d6ea0f0d25d0a93e0d93692f9dffe3db6d1fe08534 \
- --hash=sha256:a4d7bdeaca8f1da72352a945853f56aa5d34e7bc22569ec5bda5d7c1a04e5b0f \
- --hash=sha256:c15c7588c3d95872328019073a8d5eaf7c2691b4d4ef0393a0168399b2ad2356 \
- --hash=sha256:edd36e12cc79b8b5bbc81fc336ff7d2577d0fe16afd18163c9aff7ae3ff69e15
+tree-sitter-c==0.24.1 \
+ --hash=sha256:290bff0f9c79c966496ebae45042f77543e6e4aea725f40587a8611d566231a8 \
+ --hash=sha256:789781afcb710df34144f7e2a20cd80e325114b9119e3956c6bd1dd2d365df98 \
+ --hash=sha256:7d2d0cda0b8dda428c81440c1e94367f9f13548eedca3f49768bde66b1422ad6 \
+ --hash=sha256:942bcd7cbecd810dcf7ca6f8f834391ebf0771a89479646d891ba4ca2fdfdc88 \
+ --hash=sha256:9a74cfd7a11ca5a961fafd4d751892ee65acae667d2818968a6f079397d8d28c \
+ --hash=sha256:9c06ac26a1efdcc8b26a8a6970fbc6997c4071857359e5837d4c42892d45fe1e \
+ --hash=sha256:a6a807705a3978911dc7ee26a7ad36dcfacb6adfc13c190d496660ec9bd66707 \
+ --hash=sha256:d46bbda06f838c2dcb91daf767813671fd366b49ad84ff37db702129267b46e1
# via docling-core
-tree-sitter-javascript==0.23.1 \
- --hash=sha256:041fa22b34250ea6eb313d33104d5303f79504cb259d374d691e38bbdc49145b \
- --hash=sha256:056dc04fb6b24293f8c5fec43c14e7e16ba2075b3009c643abf8c85edc4c7c3c \
- --hash=sha256:5a6bc1055b061c5055ec58f39ee9b2e9efb8e6e0ae970838af74da0afb811f0a \
- --hash=sha256:6ca583dad4bd79d3053c310b9f7208cd597fd85f9947e4ab2294658bb5c11e35 \
- --hash=sha256:94100e491a6a247aa4d14caf61230c171b6376c863039b6d9cd71255c2d815ec \
- --hash=sha256:a11ca1c0f736da42967586b568dff8a465ee148a986c15ebdc9382806e0ce871 \
- --hash=sha256:b2059ce8b150162cda05a457ca3920450adbf915119c04b8c67b5241cd7fcfed \
- --hash=sha256:eb28130cd2fb30d702d614cbf61ef44d1c7f6869e7d864a9cc17111e370be8f7
+tree-sitter-javascript==0.25.0 \
+ --hash=sha256:199d09985190852e0912da2b8d26c932159be314bc04952cf917ed0e4c633e6b \
+ --hash=sha256:1b852d3aee8a36186dbcc32c798b11b4869f9b5041743b63b65c2ef793db7a54 \
+ --hash=sha256:329b5414874f0588a98f1c291f1b28138286617aa907746ffe55adfdcf963f38 \
+ --hash=sha256:622a69d677aa7f6ee2931d8c77c981a33f0ebb6d275aa9d43d3397c879a9bb0b \
+ --hash=sha256:8264a996b8845cfce06965152a013b5d9cbb7d199bc3503e12b5682e62bb1de1 \
+ --hash=sha256:9dc04ba91fc8583344e57c1f1ed5b2c97ecaaf47480011b92fbeab8dda96db75 \
+ --hash=sha256:b70f887fb269d6e58c349d683f59fa647140c410cfe2bee44a883b20ec92e3dc \
+ --hash=sha256:dfcf789064c58dc13c0a4edb550acacfc6f0f280577f1e7a00de3e89fc7f8ddc \
+ --hash=sha256:e5ed840f5bd4a3f0272e441d19429b26eedc257abe5574c8546da6b556865e3c
# via docling-core
-tree-sitter-python==0.23.6 \
- --hash=sha256:28fbec8f74eeb2b30292d97715e60fac9ccf8a8091ce19b9d93e9b580ed280fb \
- --hash=sha256:29dacdc0cd2f64e55e61d96c6906533ebb2791972bec988450c46cce60092f5d \
- --hash=sha256:354bfa0a2f9217431764a631516f85173e9711af2c13dbd796a8815acfe505d9 \
- --hash=sha256:680b710051b144fedf61c95197db0094f2245e82551bf7f0c501356333571f7a \
- --hash=sha256:71334371bd73d5fe080aed39fbff49ed8efb9506edebe16795b0c7567ed6a272 \
- --hash=sha256:7e048733c36f564b379831689006801feb267d8194f9e793fbb395ef1723335d \
- --hash=sha256:8a9dcef55507b6567207e8ee0a6b053d0688019b47ff7f26edc1764b7f4dc0a4 \
- --hash=sha256:a24027248399fb41594b696f929f9956828ae7cc85596d9f775e6c239cd0c2be
+tree-sitter-python==0.25.0 \
+ --hash=sha256:0fbf6a3774ad7e89ee891851204c2e2c47e12b63a5edbe2e9156997731c128bb \
+ --hash=sha256:14a79a47ddef72f987d5a2c122d148a812169d7484ff5c75a3db9609d419f361 \
+ --hash=sha256:480c21dbd995b7fe44813e741d71fed10ba695e7caab627fb034e3828469d762 \
+ --hash=sha256:71959832fc5d9642e52c11f2f7d79ae520b461e63334927e93ca46cd61cd9683 \
+ --hash=sha256:86f118e5eecad616ecdb81d171a36dde9bef5a0b21ed71ea9c3e390813c3baf5 \
+ --hash=sha256:9bcde33f18792de54ee579b00e1b4fe186b7926825444766f849bf7181793a76 \
+ --hash=sha256:b13e090f725f5b9c86aa455a268553c65cadf325471ad5b65cd29cac8a1a68ac \
+ --hash=sha256:be71650ca2b93b6e9649e5d65c6811aad87a7614c8c1003246b303f6b150f61b \
+ --hash=sha256:e6d5b5799628cc0f24691ab2a172a8e676f668fe90dc60468bee14084a35c16d
# via docling-core
tree-sitter-typescript==0.23.2 \
--hash=sha256:05db58f70b95ef0ea126db5560f3775692f609589ed6f8dd0af84b7f19f1cbb7 \
@@ -5804,9 +5823,9 @@ types-pyopenssl==24.1.0.20240722 \
--hash=sha256:47913b4678a01d879f503a12044468221ed8576263c1540dcb0484ca21b08c39 \
--hash=sha256:6a7a5d2ec042537934cfb4c9d4deb0e16c4c6250b09358df1f083682fe6fda54
# via types-redis
-types-python-dateutil==2.9.0.20251115 \
- --hash=sha256:8a47f2c3920f52a994056b8786309b43143faa5a64d4cbb2722d6addabdf1a58 \
- --hash=sha256:9cf9c1c582019753b8639a081deefd7e044b9fa36bd8217f565c6c4e36ee0624
+types-python-dateutil==2.9.0.20260124 \
+ --hash=sha256:7d2db9f860820c30e5b8152bfe78dbdf795f7d1c6176057424e8b3fdd1f581af \
+ --hash=sha256:f802977ae08bf2260142e7ca1ab9d4403772a254409f7bbdf652229997124951
# via feast (setup.py)
types-pytz==2025.2.0.20251108 \
--hash=sha256:0f1c9792cab4eb0e46c52f8845c8f77cf1e313cb3d68bf826aa867fe4717d91c \
@@ -5824,9 +5843,9 @@ types-requests==2.30.0.0 \
--hash=sha256:c6cf08e120ca9f0dc4fa4e32c3f953c3fba222bcc1db6b97695bce8da1ba9864 \
--hash=sha256:dec781054324a70ba64430ae9e62e7e9c8e4618c185a5cb3f87a6738251b5a31
# via feast (setup.py)
-types-setuptools==80.9.0.20251223 \
- --hash=sha256:1b36db79d724c2287d83dc052cf887b47c0da6a2fff044378be0b019545f56e6 \
- --hash=sha256:d3411059ae2f5f03985217d86ac6084efea2c9e9cacd5f0869ef950f308169b2
+types-setuptools==80.10.0.20260124 \
+ --hash=sha256:1b86d9f0368858663276a0cbe5fe5a9722caf94b5acde8aba0399a6e90680f20 \
+ --hash=sha256:efed7e044f01adb9c2806c7a8e1b6aa3656b8e382379b53d5f26ee3db24d4c01
# via
# feast (setup.py)
# types-cffi
@@ -6187,9 +6206,9 @@ watchfiles==1.1.1 \
--hash=sha256:f8979280bdafff686ba5e4d8f97840f929a87ed9cdf133cbbd42f7766774d2aa \
--hash=sha256:f9a2ae5c91cecc9edd47e041a930490c31c3afb1f5e6d71de3dc671bfaca02bf
# via uvicorn
-wcwidth==0.3.1 \
- --hash=sha256:5aedb626a9c0d941b990cfebda848d538d45c9493a3384d080aff809143bd3be \
- --hash=sha256:b2d355df3ec5d51bfc973a22fb4ea9a03b12fdcbf00d0abd22a2c78b12ccc177
+wcwidth==0.3.2 \
+ --hash=sha256:817abc6a89e47242a349b5d100cbd244301690d6d8d2ec6335f26fe6640a6315 \
+ --hash=sha256:d469b3059dab6b1077def5923ed0a8bf5738bd4a1a87f686d5e2de455354c4ad
# via prompt-toolkit
webcolors==25.10.0 \
--hash=sha256:032c727334856fc0b968f63daa252a1ac93d33db2f5267756623c210e57a4f1d \
diff --git a/sdk/python/requirements/py3.10-minimal-requirements.txt b/sdk/python/requirements/py3.10-minimal-requirements.txt
index f6f56b0b337..51b2b91df08 100644
--- a/sdk/python/requirements/py3.10-minimal-requirements.txt
+++ b/sdk/python/requirements/py3.10-minimal-requirements.txt
@@ -883,9 +883,9 @@ grpcio-status==1.62.3 \
--hash=sha256:289bdd7b2459794a12cf95dc0cb727bd4a1742c37bd823f760236c937e53a485 \
--hash=sha256:f9049b762ba8de6b1086789d8315846e094edac2c50beaf462338b301a8fd4b8
# via google-api-core
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
diff --git a/sdk/python/requirements/py3.10-minimal-sdist-requirements-build.txt b/sdk/python/requirements/py3.10-minimal-sdist-requirements-build.txt
index a7bd01d1d7d..3a21d336cd6 100644
--- a/sdk/python/requirements/py3.10-minimal-sdist-requirements-build.txt
+++ b/sdk/python/requirements/py3.10-minimal-sdist-requirements-build.txt
@@ -792,9 +792,9 @@ types-psutil==7.0.0.20250218 \
--hash=sha256:1447a30c282aafefcf8941ece854e1100eee7b0296a9d9be9977292f0269b121 \
--hash=sha256:1e642cdafe837b240295b23b1cbd4691d80b08a07d29932143cbbae30eb0db9c
# via mypy
-types-setuptools==80.9.0.20251223 \
- --hash=sha256:1b36db79d724c2287d83dc052cf887b47c0da6a2fff044378be0b019545f56e6 \
- --hash=sha256:d3411059ae2f5f03985217d86ac6084efea2c9e9cacd5f0869ef950f308169b2
+types-setuptools==80.10.0.20260124 \
+ --hash=sha256:1b86d9f0368858663276a0cbe5fe5a9722caf94b5acde8aba0399a6e90680f20 \
+ --hash=sha256:efed7e044f01adb9c2806c7a8e1b6aa3656b8e382379b53d5f26ee3db24d4c01
# via mypy
typing-extensions==4.15.0 \
--hash=sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466 \
diff --git a/sdk/python/requirements/py3.10-minimal-sdist-requirements.txt b/sdk/python/requirements/py3.10-minimal-sdist-requirements.txt
index 21b9d0070b3..f7d14118214 100644
--- a/sdk/python/requirements/py3.10-minimal-sdist-requirements.txt
+++ b/sdk/python/requirements/py3.10-minimal-sdist-requirements.txt
@@ -892,55 +892,60 @@ googleapis-common-protos[grpc]==1.72.0 \
# google-api-core
# grpc-google-iam-v1
# grpcio-status
-greenlet==3.3.0 \
- --hash=sha256:047ab3df20ede6a57c35c14bf5200fcf04039d50f908270d3f9a7a82064f543b \
- --hash=sha256:087ea5e004437321508a8d6f20efc4cfec5e3c30118e1417ea96ed1d93950527 \
- --hash=sha256:0a5d554d0712ba1de0a6c94c640f7aeba3f85b3a6e1f2899c11c2c0428da9365 \
- --hash=sha256:2662433acbca297c9153a4023fe2161c8dcfdcc91f10433171cf7e7d94ba2221 \
- --hash=sha256:286d093f95ec98fdd92fcb955003b8a3d054b4e2cab3e2707a5039e7b50520fd \
- --hash=sha256:2d9ad37fc657b1102ec880e637cccf20191581f75c64087a549e66c57e1ceb53 \
- --hash=sha256:2de5a0b09eab81fc6a382791b995b1ccf2b172a9fec934747a7a23d2ff291794 \
- --hash=sha256:30a6e28487a790417d036088b3bcb3f3ac7d8babaa7d0139edbaddebf3af9492 \
- --hash=sha256:349345b770dc88f81506c6861d22a6ccd422207829d2c854ae2af8025af303e3 \
- --hash=sha256:39b28e339fc3c348427560494e28d8a6f3561c8d2bcf7d706e1c624ed8d822b9 \
- --hash=sha256:3a898b1e9c5f7307ebbde4102908e6cbfcb9ea16284a3abe15cab996bee8b9b3 \
- --hash=sha256:3c6e9b9c1527a78520357de498b0e709fb9e2f49c3a513afd5a249007261911b \
- --hash=sha256:4243050a88ba61842186cb9e63c7dfa677ec146160b0efd73b855a3d9c7fcf32 \
- --hash=sha256:4449a736606bd30f27f8e1ff4678ee193bc47f6ca810d705981cfffd6ce0d8c5 \
- --hash=sha256:5375d2e23184629112ca1ea89a53389dddbffcf417dad40125713d88eb5f96e8 \
- --hash=sha256:5773edda4dc00e173820722711d043799d3adb4f01731f40619e07ea2750b955 \
- --hash=sha256:60c2ef0f578afb3c8d92ea07ad327f9a062547137afe91f38408f08aacab667f \
- --hash=sha256:670d0f94cd302d81796e37299bcd04b95d62403883b24225c6b5271466612f45 \
- --hash=sha256:6c10513330af5b8ae16f023e8ddbfb486ab355d04467c4679c5cfe4659975dd9 \
- --hash=sha256:6cb3a8ec3db4a3b0eb8a3c25436c2d49e3505821802074969db017b87bc6a948 \
- --hash=sha256:6f8496d434d5cb2dce025773ba5597f71f5410ae499d5dd9533e0653258cdb3d \
- --hash=sha256:73631cd5cccbcfe63e3f9492aaa664d278fda0ce5c3d43aeda8e77317e38efbd \
- --hash=sha256:73f51dd0e0bdb596fb0417e475fa3c5e32d4c83638296e560086b8d7da7c4170 \
- --hash=sha256:7652ee180d16d447a683c04e4c5f6441bae7ba7b17ffd9f6b3aff4605e9e6f71 \
- --hash=sha256:7d2d9fd66bfadf230b385fdc90426fcd6eb64db54b40c495b72ac0feb5766c54 \
- --hash=sha256:7dee147740789a4632cace364816046e43310b59ff8fb79833ab043aefa72fd5 \
- --hash=sha256:83cd0e36932e0e7f36a64b732a6f60c2fc2df28c351bae79fbaf4f8092fe7614 \
- --hash=sha256:87e63ccfa13c0a0f6234ed0add552af24cc67dd886731f2261e46e241608bee3 \
- --hash=sha256:9ee1942ea19550094033c35d25d20726e4f1c40d59545815e1128ac58d416d38 \
- --hash=sha256:9f515a47d02da4d30caaa85b69474cec77b7929b2e936ff7fb853d42f4bf8808 \
- --hash=sha256:a1e41a81c7e2825822f4e068c48cb2196002362619e2d70b148f20a831c00739 \
- --hash=sha256:a687205fb22794e838f947e2194c0566d3812966b41c78709554aa883183fb62 \
- --hash=sha256:a7a34b13d43a6b78abf828a6d0e87d3385680eaf830cd60d20d52f249faabf39 \
- --hash=sha256:a82bb225a4e9e4d653dd2fb7b8b2d36e4fb25bc0165422a11e48b88e9e6f78fb \
- --hash=sha256:ab97cf74045343f6c60a39913fa59710e4bd26a536ce7ab2397adf8b27e67c39 \
- --hash=sha256:ac0549373982b36d5fd5d30beb8a7a33ee541ff98d2b502714a09f1169f31b55 \
- --hash=sha256:b01548f6e0b9e9784a2c99c5651e5dc89ffcbe870bc5fb2e5ef864e9cc6b5dcb \
- --hash=sha256:b299a0cb979f5d7197442dccc3aee67fce53500cd88951b7e6c35575701c980b \
- --hash=sha256:b3c374782c2935cc63b2a27ba8708471de4ad1abaa862ffdb1ef45a643ddbb7d \
- --hash=sha256:b49e7ed51876b459bd645d83db257f0180e345d3f768a35a85437a24d5a49082 \
- --hash=sha256:b96dc7eef78fd404e022e165ec55327f935b9b52ff355b067eb4a0267fc1cffb \
- --hash=sha256:c024b1e5696626890038e34f76140ed1daf858e37496d33f2af57f06189e70d7 \
- --hash=sha256:d198d2d977460358c3b3a4dc844f875d1adb33817f0613f663a656f463764ccc \
- --hash=sha256:d6ed6f85fae6cdfdb9ce04c9bf7a08d666cfcfb914e7d006f44f840b46741931 \
- --hash=sha256:d9125050fcf24554e69c4cacb086b87b3b55dc395a8b3ebe6487b045b2614388 \
- --hash=sha256:dcd2bdbd444ff340e8d6bdf54d2f206ccddbb3ccfdcd3c25bf4afaa7b8f0cf45 \
- --hash=sha256:e29f3018580e8412d6aaf5641bb7745d38c85228dacf51a73bd4e26ddf2a6a8e \
- --hash=sha256:e8e18ed6995e9e2c0b4ed264d2cf89260ab3ac7e13555b8032b25a74c6d18655
+greenlet==3.3.1 \
+ --hash=sha256:02925a0bfffc41e542c70aa14c7eda3593e4d7e274bfcccca1827e6c0875902e \
+ --hash=sha256:04bee4775f40ecefcdaa9d115ab44736cd4b9c5fba733575bfe9379419582e13 \
+ --hash=sha256:070472cd156f0656f86f92e954591644e158fd65aa415ffbe2d44ca77656a8f5 \
+ --hash=sha256:09f51496a0bfbaa9d74d36a52d2580d1ef5ed4fdfcff0a73730abfbbbe1403dd \
+ --hash=sha256:1108b61b06b5224656121c3c8ee8876161c491cbe74e5c519e0634c837cf93d5 \
+ --hash=sha256:12184c61e5d64268a160226fb4818af4df02cfead8379d7f8b99a56c3a54ff3e \
+ --hash=sha256:14194f5f4305800ff329cbf02c5fcc88f01886cadd29941b807668a45f0d2336 \
+ --hash=sha256:20fedaadd422fa02695f82093f9a98bad3dab5fcda793c658b945fcde2ab27ba \
+ --hash=sha256:27289986f4e5b0edec7b5a91063c109f0276abb09a7e9bdab08437525977c946 \
+ --hash=sha256:2f080e028001c5273e0b42690eaf359aeef9cb1389da0f171ea51a5dc3c7608d \
+ --hash=sha256:301860987846c24cb8964bdec0e31a96ad4a2a801b41b4ef40963c1b44f33451 \
+ --hash=sha256:32e4ca9777c5addcbf42ff3915d99030d8e00173a56f80001fb3875998fe410b \
+ --hash=sha256:33a956fe78bbbda82bfc95e128d61129b32d66bcf0a20a1f0c08aa4839ffa951 \
+ --hash=sha256:34a729e2e4e4ffe9ae2408d5ecaf12f944853f40ad724929b7585bca808a9d6f \
+ --hash=sha256:39eda9ba259cc9801da05351eaa8576e9aa83eb9411e8f0c299e05d712a210f2 \
+ --hash=sha256:3a300354f27dd86bae5fbf7002e6dd2b3255cd372e9242c933faf5e859b703fe \
+ --hash=sha256:3e0f3878ca3a3ff63ab4ea478585942b53df66ddde327b59ecb191b19dbbd62d \
+ --hash=sha256:3e63252943c921b90abb035ebe9de832c436401d9c45f262d80e2d06cc659242 \
+ --hash=sha256:41848f3230b58c08bb43dee542e74a2a2e34d3c59dc3076cec9151aeeedcae98 \
+ --hash=sha256:49f4ad195d45f4a66a0eb9c1ba4832bb380570d361912fa3554746830d332149 \
+ --hash=sha256:4b065d3284be43728dd280f6f9a13990b56470b81be20375a207cdc814a983f2 \
+ --hash=sha256:4b9721549a95db96689458a1e0ae32412ca18776ed004463df3a9299c1b257ab \
+ --hash=sha256:50e1457f4fed12a50e427988a07f0f9df53cf0ee8da23fab16e6732c2ec909d4 \
+ --hash=sha256:59913f1e5ada20fde795ba906916aea25d442abcc0593fba7e26c92b7ad76249 \
+ --hash=sha256:5fd23b9bc6d37b563211c6abbb1b3cab27db385a4449af5c32e932f93017080c \
+ --hash=sha256:6423481193bbbe871313de5fd06a082f2649e7ce6e08015d2a76c1e9186ca5b3 \
+ --hash=sha256:65be2f026ca6a176f88fb935ee23c18333ccea97048076aef4db1ef5bc0713ac \
+ --hash=sha256:67ea3fc73c8cd92f42467a72b75e8f05ed51a0e9b1d15398c913416f2dafd49f \
+ --hash=sha256:71c767cf281a80d02b6c1bdc41c9468e1f5a494fb11bc8688c360524e273d7b1 \
+ --hash=sha256:76e39058e68eb125de10c92524573924e827927df5d3891fbc97bd55764a8774 \
+ --hash=sha256:7932f5f57609b6a3b82cc11877709aa7a98e3308983ed93552a1c377069b20c8 \
+ --hash=sha256:7a3ae05b3d225b4155bda56b072ceb09d05e974bc74be6c3fc15463cf69f33fd \
+ --hash=sha256:7ab327905cabb0622adca5971e488064e35115430cec2c35a50fd36e72a315b3 \
+ --hash=sha256:7b2fe4150a0cf59f847a67db8c155ac36aed89080a6a639e9f16df5d6c6096f1 \
+ --hash=sha256:7e806ca53acf6d15a888405880766ec84721aa4181261cd11a457dfe9a7a4975 \
+ --hash=sha256:80aa4d79eb5564f2e0a6144fcc744b5a37c56c4a92d60920720e99210d88db0f \
+ --hash=sha256:92497c78adf3ac703b57f1e3813c2d874f27f71a178f9ea5887855da413cd6d2 \
+ --hash=sha256:96aff77af063b607f2489473484e39a0bbae730f2ea90c9e5606c9b73c44174a \
+ --hash=sha256:aec9ab04e82918e623415947921dea15851b152b822661cce3f8e4393c3df683 \
+ --hash=sha256:b066e8b50e28b503f604fa538adc764a638b38cf8e81e025011d26e8a627fa79 \
+ --hash=sha256:b31c05dd84ef6871dd47120386aed35323c944d86c3d91a17c4b8d23df62f15b \
+ --hash=sha256:bd59acd8529b372775cd0fcbc5f420ae20681c5b045ce25bd453ed8455ab99b5 \
+ --hash=sha256:bfb2d1763d777de5ee495c85309460f6fd8146e50ec9d0ae0183dbf6f0a829d1 \
+ --hash=sha256:c620051669fd04ac6b60ebc70478210119c56e2d5d5df848baec4312e260e4ca \
+ --hash=sha256:c9f9d5e7a9310b7a2f416dd13d2e3fd8b42d803968ea580b7c0f322ccb389b97 \
+ --hash=sha256:cb0feb07fe6e6a74615ee62a880007d976cf739b6669cce95daa7373d4fc69c5 \
+ --hash=sha256:cc98b9c4e4870fa983436afa999d4eb16b12872fab7071423d5262fa7120d57a \
+ --hash=sha256:d842c94b9155f1c9b3058036c24ffb8ff78b428414a19792b2380be9cecf4f36 \
+ --hash=sha256:da19609432f353fed186cc1b85e9440db93d489f198b4bdf42ae19cc9d9ac9b4 \
+ --hash=sha256:e0093bd1a06d899892427217f0ff2a3c8f306182b8c754336d32e2d587c131b4 \
+ --hash=sha256:e2e7e882f83149f0a71ac822ebf156d902e7a5d22c9045e3e0d1daf59cee2cc9 \
+ --hash=sha256:e84b51cbebf9ae573b5fbd15df88887815e3253fc000a7d0ff95170e8f7e9729 \
+ --hash=sha256:ed6b402bc74d6557a705e197d47f9063733091ed6357b3de33619d8a8d93ac53
# via feast (setup.py)
grpc-google-iam-v1==0.14.3 \
--hash=sha256:7a7f697e017a067206a3dfef44e4c634a34d3dee135fe7d7a4613fe3e59217e6 \
@@ -1025,9 +1030,9 @@ grpcio-status==1.62.3 \
--hash=sha256:289bdd7b2459794a12cf95dc0cb727bd4a1742c37bd823f760236c937e53a485 \
--hash=sha256:f9049b762ba8de6b1086789d8315846e094edac2c50beaf462338b301a8fd4b8
# via google-api-core
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
diff --git a/sdk/python/requirements/py3.10-requirements.txt b/sdk/python/requirements/py3.10-requirements.txt
index 28017085cf3..fe0ad957340 100644
--- a/sdk/python/requirements/py3.10-requirements.txt
+++ b/sdk/python/requirements/py3.10-requirements.txt
@@ -178,9 +178,9 @@ fsspec==2026.1.0 \
--hash=sha256:cb76aa913c2285a3b49bdd5fc55b1d7c708d7208126b60f2eb8194fe1b4cbdcc \
--hash=sha256:e987cb0496a0d81bba3a9d1cee62922fb395e7d4c3b575e57f547953334fe07b
# via dask
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
diff --git a/sdk/python/requirements/py3.11-ci-requirements.txt b/sdk/python/requirements/py3.11-ci-requirements.txt
index 477ecd5b7f2..e1bc6b2771b 100644
--- a/sdk/python/requirements/py3.11-ci-requirements.txt
+++ b/sdk/python/requirements/py3.11-ci-requirements.txt
@@ -253,6 +253,7 @@ attrs==25.4.0 \
# aiohttp
# jsonlines
# jsonschema
+ # openlineage-python
# referencing
azure-core==1.38.0 \
--hash=sha256:8194d2682245a3e4e3151a667c686464c3786fed7918b394d035bdcd61bb5993 \
@@ -375,6 +376,7 @@ build==1.4.0 \
--hash=sha256:f1b91b925aa322be454f8330c6fb48b465da993d1e7e7e6fa35027ec49f3c936
# via
# feast (setup.py)
+ # openlineage-python
# pip-tools
# singlestoredb
cassandra-driver==3.29.3 \
@@ -1009,16 +1011,16 @@ docling==2.27.0 \
--hash=sha256:1288ed75b27e33bf94daff34faffc6d11b7d7ccc13e3df84fb24adad3991f72d \
--hash=sha256:faba35662612a2c687a3a463e501d95f645316436084af92a0442ce162429a3d
# via feast (setup.py)
-docling-core[chunking]==2.60.1 \
- --hash=sha256:45390e50cb4d83a70e2384c70a46e6e64acb15e69674d9d2c67315155f252aef \
- --hash=sha256:64bd71dee243bd11b25f216fec219e046a130b851b8e1d0c0dd362a4aac0e994
+docling-core[chunking]==2.60.2 \
+ --hash=sha256:63aee783f06240455c12c30e9af383b80d7ade80c896f81d68a4aff6cde2e2a1 \
+ --hash=sha256:7a99e1671e796e39d0c735b7ae3833766a97ad287e15d434dfa417917e3b0e6d
# via
# docling
# docling-ibm-models
# docling-parse
-docling-ibm-models==3.10.3 \
- --hash=sha256:6be756e45df155a367087b93e0e5f2d65905e7e81a5f57c1d3ae57096631655a \
- --hash=sha256:e034d1398c99059998da18e38ef80af8a5d975f04de17f6e93efa075fb29cac4
+docling-ibm-models==3.11.0 \
+ --hash=sha256:454401563a8e79cb33b718bc559d9bacca8a0183583e48f8e616c9184c1f5eb1 \
+ --hash=sha256:68f7961069d643bfdab21b1c9ef24a979db293496f4c2283d95b1025a9ac5347
# via docling
docling-parse==4.7.3 \
--hash=sha256:1790e7e4ae202d67875c1c48fd6f8ef5c51d10b0c23157e4989b8673f2f31308 \
@@ -1604,9 +1606,9 @@ grpcio-tools==1.62.3 \
--hash=sha256:f4b1615adf67bd8bb71f3464146a6f9949972d06d21a4f5e87e73f6464d97f57 \
--hash=sha256:f6831fdec2b853c9daa3358535c55eed3694325889aa714070528cf8f92d7d6d
# via feast (setup.py)
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
@@ -1810,6 +1812,7 @@ httpx[http2]==0.27.2 \
# fastapi-mcp
# jupyterlab
# mcp
+ # openlineage-python
# python-keycloak
# qdrant-client
httpx-sse==0.4.3 \
@@ -2016,9 +2019,9 @@ jupyter-server-terminals==0.5.4 \
--hash=sha256:55be353fc74a80bc7f3b20e6be50a55a61cd525626f578dcb66a5708e2007d14 \
--hash=sha256:bbda128ed41d0be9020349f9f1f2a4ab9952a73ed5f5ac9f1419794761fb87f5
# via jupyter-server
-jupyterlab==4.5.2 \
- --hash=sha256:76466ebcfdb7a9bb7e2fbd6459c0e2c032ccf75be673634a84bee4b3e6b13ab6 \
- --hash=sha256:c80a6b9f6dace96a566d590c65ee2785f61e7cd4aac5b4d453dcc7d0d5e069b7
+jupyterlab==4.5.3 \
+ --hash=sha256:4a159f71067cb38e4a82e86a42de8e7e926f384d7f2291964f282282096d27e8 \
+ --hash=sha256:63c9f3a48de72ba00df766ad6eed416394f5bb883829f11eeff0872302520ba7
# via notebook
jupyterlab-pygments==0.3.0 \
--hash=sha256:721aca4d9029252b11cfa9d185e5b5af4d54772bb8072f9b7036f4170054d35d \
@@ -3005,6 +3008,9 @@ opencv-python-headless==4.13.0.90 \
--hash=sha256:eba38bc255d0b7d1969c5bcc90a060ca2b61a3403b613872c750bfa5dfe9e03b \
--hash=sha256:f46b17ea0aa7e4124ca6ad71143f89233ae9557f61d2326bcdb34329a1ddf9bd
# via easyocr
+openlineage-python==1.43.0 \
+ --hash=sha256:595dc641f696d0a1c021440a9ff8155f4e2776452cf118112a09b12cf4038827
+ # via feast (setup.py)
openpyxl==3.1.5 \
--hash=sha256:5282c12b107bffeef825f4617dc029afaf41d0ea60823bbb665ef3079dc79de2 \
--hash=sha256:cf0e3cf56142039133628b5acffe8ef0c12bc902d2aadd3e0fe5878dc08d1050
@@ -3156,6 +3162,7 @@ packaging==26.0 \
# lazy-loader
# marshmallow
# nbconvert
+ # openlineage-python
# pandas-gbq
# pytest
# ray
@@ -4464,6 +4471,7 @@ python-dateutil==2.9.0 \
# jupyter-client
# kubernetes
# moto
+ # openlineage-python
# pandas
# trino
python-docx==1.2.0 \
@@ -4587,6 +4595,7 @@ pyyaml==6.0.3 \
# huggingface-hub
# jupyter-events
# kubernetes
+ # openlineage-python
# openshift-client
# pre-commit
# ray
@@ -4878,6 +4887,7 @@ requests==2.32.5 \
# kubernetes
# moto
# msal
+ # openlineage-python
# python-keycloak
# ray
# requests-oauthlib
@@ -5945,66 +5955,75 @@ transformers==4.57.6 \
# feast (setup.py)
# docling-core
# docling-ibm-models
-tree-sitter==0.24.0 \
- --hash=sha256:01ea01a7003b88b92f7f875da6ba9d5d741e0c84bb1bd92c503c0eecd0ee6409 \
- --hash=sha256:033506c1bc2ba7bd559b23a6bdbeaf1127cee3c68a094b82396718596dfe98bc \
- --hash=sha256:098a81df9f89cf254d92c1cd0660a838593f85d7505b28249216661d87adde4a \
- --hash=sha256:0b26bf9e958da6eb7e74a081aab9d9c7d05f9baeaa830dbb67481898fd16f1f5 \
- --hash=sha256:0d4a6416ed421c4210f0ca405a4834d5ccfbb8ad6692d4d74f7773ef68f92071 \
- --hash=sha256:14beeff5f11e223c37be7d5d119819880601a80d0399abe8c738ae2288804afc \
- --hash=sha256:23641bd25dcd4bb0b6fa91b8fb3f46cc9f1c9f475efe4d536d3f1f688d1b84c8 \
- --hash=sha256:24a8dd03b0d6b8812425f3b84d2f4763322684e38baf74e5bb766128b5633dc7 \
- --hash=sha256:26a5b130f70d5925d67b47db314da209063664585a2fd36fa69e0717738efaf4 \
- --hash=sha256:2a84ff87a2f2a008867a1064aba510ab3bd608e3e0cd6e8fef0379efee266c73 \
- --hash=sha256:3b1f3cbd9700e1fba0be2e7d801527e37c49fc02dc140714669144ef6ab58dce \
- --hash=sha256:464fa5b2cac63608915a9de8a6efd67a4da1929e603ea86abaeae2cb1fe89921 \
- --hash=sha256:4ddb113e6b8b3e3b199695b1492a47d87d06c538e63050823d90ef13cac585fd \
- --hash=sha256:57277a12fbcefb1c8b206186068d456c600dbfbc3fd6c76968ee22614c5cd5ad \
- --hash=sha256:5fc5c3c26d83c9d0ecb4fc4304fba35f034b7761d35286b936c1db1217558b4e \
- --hash=sha256:772e1bd8c0931c866b848d0369b32218ac97c24b04790ec4b0e409901945dd8e \
- --hash=sha256:7d5d9537507e1c8c5fa9935b34f320bfec4114d675e028f3ad94f11cf9db37b9 \
- --hash=sha256:a7c9c89666dea2ce2b2bf98e75f429d2876c569fab966afefdcd71974c6d8538 \
- --hash=sha256:abd95af65ca2f4f7eca356343391ed669e764f37748b5352946f00f7fc78e734 \
- --hash=sha256:c012e4c345c57a95d92ab5a890c637aaa51ab3b7ff25ed7069834b1087361c95 \
- --hash=sha256:d25fa22766d63f73716c6fec1a31ee5cf904aa429484256bd5fdf5259051ed74 \
- --hash=sha256:de0fb7c18c6068cacff46250c0a0473e8fc74d673e3e86555f131c2c1346fb13 \
- --hash=sha256:e0992d483677e71d5c5d37f30dfb2e3afec2f932a9c53eec4fca13869b788c6c \
- --hash=sha256:f3f00feff1fc47a8e4863561b8da8f5e023d382dd31ed3e43cd11d4cae445445 \
- --hash=sha256:f3f08a2ca9f600b3758792ba2406971665ffbad810847398d180c48cee174ee2 \
- --hash=sha256:f58bb4956917715ec4d5a28681829a8dad5c342cafd4aea269f9132a83ca9b34 \
- --hash=sha256:f733a83d8355fc95561582b66bbea92ffd365c5d7a665bc9ebd25e049c2b2abb \
- --hash=sha256:f9691be48d98c49ef8f498460278884c666b44129222ed6217477dffad5d4831 \
- --hash=sha256:f9e8b1605ab60ed43803100f067eed71b0b0e6c1fb9860a262727dbfbbb74751
+tree-sitter==0.25.2 \
+ --hash=sha256:0628671f0de69bb279558ef6b640bcfc97864fe0026d840f872728a86cd6b6cd \
+ --hash=sha256:0c8b6682cac77e37cfe5cf7ec388844957f48b7bd8d6321d0ca2d852994e10d5 \
+ --hash=sha256:1799609636c0193e16c38f366bda5af15b1ce476df79ddaae7dd274df9e44266 \
+ --hash=sha256:20b570690f87f1da424cd690e51cc56728d21d63f4abd4b326d382a30353acc7 \
+ --hash=sha256:260586381b23be33b6191a07cea3d44ecbd6c01aa4c6b027a0439145fcbc3358 \
+ --hash=sha256:3e65ae456ad0d210ee71a89ee112ac7e72e6c2e5aac1b95846ecc7afa68a194c \
+ --hash=sha256:44488e0e78146f87baaa009736886516779253d6d6bac3ef636ede72bc6a8234 \
+ --hash=sha256:463c032bd02052d934daa5f45d183e0521ceb783c2548501cf034b0beba92c9b \
+ --hash=sha256:4973b718fcadfb04e59e746abfbb0288694159c6aeecd2add59320c03368c721 \
+ --hash=sha256:49ee3c348caa459244ec437ccc7ff3831f35977d143f65311572b8ba0a5f265f \
+ --hash=sha256:56ac6602c7d09c2c507c55e58dc7026b8988e0475bd0002f8a386cce5e8e8adc \
+ --hash=sha256:65d3c931013ea798b502782acab986bbf47ba2c452610ab0776cf4a8ef150fc0 \
+ --hash=sha256:6d0302550bbe4620a5dc7649517c4409d74ef18558276ce758419cf09e578897 \
+ --hash=sha256:72a510931c3c25f134aac2daf4eb4feca99ffe37a35896d7150e50ac3eee06c7 \
+ --hash=sha256:7712335855b2307a21ae86efe949c76be36c6068d76df34faa27ce9ee40ff444 \
+ --hash=sha256:7d2ee1acbacebe50ba0f85fff1bc05e65d877958f00880f49f9b2af38dce1af0 \
+ --hash=sha256:a0ec41b895da717bc218a42a3a7a0bfcfe9a213d7afaa4255353901e0e21f696 \
+ --hash=sha256:a925364eb7fbb9cdce55a9868f7525a1905af512a559303bd54ef468fd88cb37 \
+ --hash=sha256:b3d11a3a3ac89bb8a2543d75597f905a9926f9c806f40fcca8242922d1cc6ad5 \
+ --hash=sha256:b3f63a1796886249bd22c559a5944d64d05d43f2be72961624278eff0dcc5cb8 \
+ --hash=sha256:b43a9e4c89d4d0839de27cd4d6902d33396de700e9ff4c5ab7631f277a85ead9 \
+ --hash=sha256:b878e296e63661c8e124177cc3084b041ba3f5936b43076d57c487822426f614 \
+ --hash=sha256:b8ca72d841215b6573ed0655b3a5cd1133f9b69a6fa561aecad40dca9029d75b \
+ --hash=sha256:b8d4429954a3beb3e844e2872610d2a4800ba4eb42bb1990c6a4b1949b18459f \
+ --hash=sha256:bd88fbb0f6c3a0f28f0a68d72df88e9755cf5215bae146f5a1bdc8362b772053 \
+ --hash=sha256:bda059af9d621918efb813b22fb06b3fe00c3e94079c6143fcb2c565eb44cb87 \
+ --hash=sha256:c0c0ab5f94938a23fe81928a21cc0fac44143133ccc4eb7eeb1b92f84748331c \
+ --hash=sha256:c2f8e7d6b2f8489d4a9885e3adcaef4bc5ff0a275acd990f120e29c4ab3395c5 \
+ --hash=sha256:cc0351cfe5022cec5a77645f647f92a936b38850346ed3f6d6babfbeeeca4d26 \
+ --hash=sha256:d77605e0d353ba3fe5627e5490f0fbfe44141bafa4478d88ef7954a61a848dae \
+ --hash=sha256:dd12d80d91d4114ca097626eb82714618dcdfacd6a5e0955216c6485c350ef99 \
+ --hash=sha256:ddabfff809ffc983fc9963455ba1cecc90295803e06e140a4c83e94c1fa3d960 \
+ --hash=sha256:eac4e8e4c7060c75f395feec46421eb61212cb73998dbe004b7384724f3682ab \
+ --hash=sha256:f5ddcd3e291a749b62521f71fc953f66f5fd9743973fd6dd962b092773569601 \
+ --hash=sha256:fbb1706407c0e451c4f8cc016fec27d72d4b211fdd3173320b1ada7a6c74c3ac \
+ --hash=sha256:fe43c158555da46723b28b52e058ad444195afd1db3ca7720c59a254544e9c20
# via docling-core
-tree-sitter-c==0.23.4 \
- --hash=sha256:013403e74765d74e523f380f9df8f3d99e9fe94132a3fc0c8b29cba538a7b2bf \
- --hash=sha256:2c92c0571b36b6da06f8882f34151dc11e67a493e9101cc0026a16da27709c05 \
- --hash=sha256:5e42a3519825ca59c91b2b7aec08dd3c89e02690c7b315d54a1e1743f9be3f15 \
- --hash=sha256:9215c7888dd019038f162ea5646178f6e129cd2b49fc506d14becf5e426121d7 \
- --hash=sha256:98c285a23bf4fb6fb34140d6ea0f0d25d0a93e0d93692f9dffe3db6d1fe08534 \
- --hash=sha256:a4d7bdeaca8f1da72352a945853f56aa5d34e7bc22569ec5bda5d7c1a04e5b0f \
- --hash=sha256:c15c7588c3d95872328019073a8d5eaf7c2691b4d4ef0393a0168399b2ad2356 \
- --hash=sha256:edd36e12cc79b8b5bbc81fc336ff7d2577d0fe16afd18163c9aff7ae3ff69e15
+tree-sitter-c==0.24.1 \
+ --hash=sha256:290bff0f9c79c966496ebae45042f77543e6e4aea725f40587a8611d566231a8 \
+ --hash=sha256:789781afcb710df34144f7e2a20cd80e325114b9119e3956c6bd1dd2d365df98 \
+ --hash=sha256:7d2d0cda0b8dda428c81440c1e94367f9f13548eedca3f49768bde66b1422ad6 \
+ --hash=sha256:942bcd7cbecd810dcf7ca6f8f834391ebf0771a89479646d891ba4ca2fdfdc88 \
+ --hash=sha256:9a74cfd7a11ca5a961fafd4d751892ee65acae667d2818968a6f079397d8d28c \
+ --hash=sha256:9c06ac26a1efdcc8b26a8a6970fbc6997c4071857359e5837d4c42892d45fe1e \
+ --hash=sha256:a6a807705a3978911dc7ee26a7ad36dcfacb6adfc13c190d496660ec9bd66707 \
+ --hash=sha256:d46bbda06f838c2dcb91daf767813671fd366b49ad84ff37db702129267b46e1
# via docling-core
-tree-sitter-javascript==0.23.1 \
- --hash=sha256:041fa22b34250ea6eb313d33104d5303f79504cb259d374d691e38bbdc49145b \
- --hash=sha256:056dc04fb6b24293f8c5fec43c14e7e16ba2075b3009c643abf8c85edc4c7c3c \
- --hash=sha256:5a6bc1055b061c5055ec58f39ee9b2e9efb8e6e0ae970838af74da0afb811f0a \
- --hash=sha256:6ca583dad4bd79d3053c310b9f7208cd597fd85f9947e4ab2294658bb5c11e35 \
- --hash=sha256:94100e491a6a247aa4d14caf61230c171b6376c863039b6d9cd71255c2d815ec \
- --hash=sha256:a11ca1c0f736da42967586b568dff8a465ee148a986c15ebdc9382806e0ce871 \
- --hash=sha256:b2059ce8b150162cda05a457ca3920450adbf915119c04b8c67b5241cd7fcfed \
- --hash=sha256:eb28130cd2fb30d702d614cbf61ef44d1c7f6869e7d864a9cc17111e370be8f7
+tree-sitter-javascript==0.25.0 \
+ --hash=sha256:199d09985190852e0912da2b8d26c932159be314bc04952cf917ed0e4c633e6b \
+ --hash=sha256:1b852d3aee8a36186dbcc32c798b11b4869f9b5041743b63b65c2ef793db7a54 \
+ --hash=sha256:329b5414874f0588a98f1c291f1b28138286617aa907746ffe55adfdcf963f38 \
+ --hash=sha256:622a69d677aa7f6ee2931d8c77c981a33f0ebb6d275aa9d43d3397c879a9bb0b \
+ --hash=sha256:8264a996b8845cfce06965152a013b5d9cbb7d199bc3503e12b5682e62bb1de1 \
+ --hash=sha256:9dc04ba91fc8583344e57c1f1ed5b2c97ecaaf47480011b92fbeab8dda96db75 \
+ --hash=sha256:b70f887fb269d6e58c349d683f59fa647140c410cfe2bee44a883b20ec92e3dc \
+ --hash=sha256:dfcf789064c58dc13c0a4edb550acacfc6f0f280577f1e7a00de3e89fc7f8ddc \
+ --hash=sha256:e5ed840f5bd4a3f0272e441d19429b26eedc257abe5574c8546da6b556865e3c
# via docling-core
-tree-sitter-python==0.23.6 \
- --hash=sha256:28fbec8f74eeb2b30292d97715e60fac9ccf8a8091ce19b9d93e9b580ed280fb \
- --hash=sha256:29dacdc0cd2f64e55e61d96c6906533ebb2791972bec988450c46cce60092f5d \
- --hash=sha256:354bfa0a2f9217431764a631516f85173e9711af2c13dbd796a8815acfe505d9 \
- --hash=sha256:680b710051b144fedf61c95197db0094f2245e82551bf7f0c501356333571f7a \
- --hash=sha256:71334371bd73d5fe080aed39fbff49ed8efb9506edebe16795b0c7567ed6a272 \
- --hash=sha256:7e048733c36f564b379831689006801feb267d8194f9e793fbb395ef1723335d \
- --hash=sha256:8a9dcef55507b6567207e8ee0a6b053d0688019b47ff7f26edc1764b7f4dc0a4 \
- --hash=sha256:a24027248399fb41594b696f929f9956828ae7cc85596d9f775e6c239cd0c2be
+tree-sitter-python==0.25.0 \
+ --hash=sha256:0fbf6a3774ad7e89ee891851204c2e2c47e12b63a5edbe2e9156997731c128bb \
+ --hash=sha256:14a79a47ddef72f987d5a2c122d148a812169d7484ff5c75a3db9609d419f361 \
+ --hash=sha256:480c21dbd995b7fe44813e741d71fed10ba695e7caab627fb034e3828469d762 \
+ --hash=sha256:71959832fc5d9642e52c11f2f7d79ae520b461e63334927e93ca46cd61cd9683 \
+ --hash=sha256:86f118e5eecad616ecdb81d171a36dde9bef5a0b21ed71ea9c3e390813c3baf5 \
+ --hash=sha256:9bcde33f18792de54ee579b00e1b4fe186b7926825444766f849bf7181793a76 \
+ --hash=sha256:b13e090f725f5b9c86aa455a268553c65cadf325471ad5b65cd29cac8a1a68ac \
+ --hash=sha256:be71650ca2b93b6e9649e5d65c6811aad87a7614c8c1003246b303f6b150f61b \
+ --hash=sha256:e6d5b5799628cc0f24691ab2a172a8e676f668fe90dc60468bee14084a35c16d
# via docling-core
tree-sitter-typescript==0.23.2 \
--hash=sha256:05db58f70b95ef0ea126db5560f3775692f609589ed6f8dd0af84b7f19f1cbb7 \
@@ -6049,9 +6068,9 @@ types-pyopenssl==24.1.0.20240722 \
--hash=sha256:47913b4678a01d879f503a12044468221ed8576263c1540dcb0484ca21b08c39 \
--hash=sha256:6a7a5d2ec042537934cfb4c9d4deb0e16c4c6250b09358df1f083682fe6fda54
# via types-redis
-types-python-dateutil==2.9.0.20251115 \
- --hash=sha256:8a47f2c3920f52a994056b8786309b43143faa5a64d4cbb2722d6addabdf1a58 \
- --hash=sha256:9cf9c1c582019753b8639a081deefd7e044b9fa36bd8217f565c6c4e36ee0624
+types-python-dateutil==2.9.0.20260124 \
+ --hash=sha256:7d2db9f860820c30e5b8152bfe78dbdf795f7d1c6176057424e8b3fdd1f581af \
+ --hash=sha256:f802977ae08bf2260142e7ca1ab9d4403772a254409f7bbdf652229997124951
# via feast (setup.py)
types-pytz==2025.2.0.20251108 \
--hash=sha256:0f1c9792cab4eb0e46c52f8845c8f77cf1e313cb3d68bf826aa867fe4717d91c \
@@ -6069,9 +6088,9 @@ types-requests==2.30.0.0 \
--hash=sha256:c6cf08e120ca9f0dc4fa4e32c3f953c3fba222bcc1db6b97695bce8da1ba9864 \
--hash=sha256:dec781054324a70ba64430ae9e62e7e9c8e4618c185a5cb3f87a6738251b5a31
# via feast (setup.py)
-types-setuptools==80.9.0.20251223 \
- --hash=sha256:1b36db79d724c2287d83dc052cf887b47c0da6a2fff044378be0b019545f56e6 \
- --hash=sha256:d3411059ae2f5f03985217d86ac6084efea2c9e9cacd5f0869ef950f308169b2
+types-setuptools==80.10.0.20260124 \
+ --hash=sha256:1b86d9f0368858663276a0cbe5fe5a9722caf94b5acde8aba0399a6e90680f20 \
+ --hash=sha256:efed7e044f01adb9c2806c7a8e1b6aa3656b8e382379b53d5f26ee3db24d4c01
# via
# feast (setup.py)
# types-cffi
@@ -6430,9 +6449,9 @@ watchfiles==1.1.1 \
--hash=sha256:f8979280bdafff686ba5e4d8f97840f929a87ed9cdf133cbbd42f7766774d2aa \
--hash=sha256:f9a2ae5c91cecc9edd47e041a930490c31c3afb1f5e6d71de3dc671bfaca02bf
# via uvicorn
-wcwidth==0.3.1 \
- --hash=sha256:5aedb626a9c0d941b990cfebda848d538d45c9493a3384d080aff809143bd3be \
- --hash=sha256:b2d355df3ec5d51bfc973a22fb4ea9a03b12fdcbf00d0abd22a2c78b12ccc177
+wcwidth==0.3.2 \
+ --hash=sha256:817abc6a89e47242a349b5d100cbd244301690d6d8d2ec6335f26fe6640a6315 \
+ --hash=sha256:d469b3059dab6b1077def5923ed0a8bf5738bd4a1a87f686d5e2de455354c4ad
# via prompt-toolkit
webcolors==25.10.0 \
--hash=sha256:032c727334856fc0b968f63daa252a1ac93d33db2f5267756623c210e57a4f1d \
diff --git a/sdk/python/requirements/py3.11-minimal-requirements.txt b/sdk/python/requirements/py3.11-minimal-requirements.txt
index 004190e35eb..6356d6700ec 100644
--- a/sdk/python/requirements/py3.11-minimal-requirements.txt
+++ b/sdk/python/requirements/py3.11-minimal-requirements.txt
@@ -877,9 +877,9 @@ grpcio-status==1.62.3 \
--hash=sha256:289bdd7b2459794a12cf95dc0cb727bd4a1742c37bd823f760236c937e53a485 \
--hash=sha256:f9049b762ba8de6b1086789d8315846e094edac2c50beaf462338b301a8fd4b8
# via google-api-core
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
diff --git a/sdk/python/requirements/py3.11-minimal-sdist-requirements-build.txt b/sdk/python/requirements/py3.11-minimal-sdist-requirements-build.txt
index a29b6fe361f..fc9e8124ac5 100644
--- a/sdk/python/requirements/py3.11-minimal-sdist-requirements-build.txt
+++ b/sdk/python/requirements/py3.11-minimal-sdist-requirements-build.txt
@@ -784,9 +784,9 @@ types-psutil==7.0.0.20250218 \
--hash=sha256:1447a30c282aafefcf8941ece854e1100eee7b0296a9d9be9977292f0269b121 \
--hash=sha256:1e642cdafe837b240295b23b1cbd4691d80b08a07d29932143cbbae30eb0db9c
# via mypy
-types-setuptools==80.9.0.20251223 \
- --hash=sha256:1b36db79d724c2287d83dc052cf887b47c0da6a2fff044378be0b019545f56e6 \
- --hash=sha256:d3411059ae2f5f03985217d86ac6084efea2c9e9cacd5f0869ef950f308169b2
+types-setuptools==80.10.0.20260124 \
+ --hash=sha256:1b86d9f0368858663276a0cbe5fe5a9722caf94b5acde8aba0399a6e90680f20 \
+ --hash=sha256:efed7e044f01adb9c2806c7a8e1b6aa3656b8e382379b53d5f26ee3db24d4c01
# via mypy
typing-extensions==4.15.0 \
--hash=sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466 \
diff --git a/sdk/python/requirements/py3.11-minimal-sdist-requirements.txt b/sdk/python/requirements/py3.11-minimal-sdist-requirements.txt
index ef8ed408eaf..b49420587e5 100644
--- a/sdk/python/requirements/py3.11-minimal-sdist-requirements.txt
+++ b/sdk/python/requirements/py3.11-minimal-sdist-requirements.txt
@@ -884,55 +884,60 @@ googleapis-common-protos[grpc]==1.72.0 \
# google-api-core
# grpc-google-iam-v1
# grpcio-status
-greenlet==3.3.0 \
- --hash=sha256:047ab3df20ede6a57c35c14bf5200fcf04039d50f908270d3f9a7a82064f543b \
- --hash=sha256:087ea5e004437321508a8d6f20efc4cfec5e3c30118e1417ea96ed1d93950527 \
- --hash=sha256:0a5d554d0712ba1de0a6c94c640f7aeba3f85b3a6e1f2899c11c2c0428da9365 \
- --hash=sha256:2662433acbca297c9153a4023fe2161c8dcfdcc91f10433171cf7e7d94ba2221 \
- --hash=sha256:286d093f95ec98fdd92fcb955003b8a3d054b4e2cab3e2707a5039e7b50520fd \
- --hash=sha256:2d9ad37fc657b1102ec880e637cccf20191581f75c64087a549e66c57e1ceb53 \
- --hash=sha256:2de5a0b09eab81fc6a382791b995b1ccf2b172a9fec934747a7a23d2ff291794 \
- --hash=sha256:30a6e28487a790417d036088b3bcb3f3ac7d8babaa7d0139edbaddebf3af9492 \
- --hash=sha256:349345b770dc88f81506c6861d22a6ccd422207829d2c854ae2af8025af303e3 \
- --hash=sha256:39b28e339fc3c348427560494e28d8a6f3561c8d2bcf7d706e1c624ed8d822b9 \
- --hash=sha256:3a898b1e9c5f7307ebbde4102908e6cbfcb9ea16284a3abe15cab996bee8b9b3 \
- --hash=sha256:3c6e9b9c1527a78520357de498b0e709fb9e2f49c3a513afd5a249007261911b \
- --hash=sha256:4243050a88ba61842186cb9e63c7dfa677ec146160b0efd73b855a3d9c7fcf32 \
- --hash=sha256:4449a736606bd30f27f8e1ff4678ee193bc47f6ca810d705981cfffd6ce0d8c5 \
- --hash=sha256:5375d2e23184629112ca1ea89a53389dddbffcf417dad40125713d88eb5f96e8 \
- --hash=sha256:5773edda4dc00e173820722711d043799d3adb4f01731f40619e07ea2750b955 \
- --hash=sha256:60c2ef0f578afb3c8d92ea07ad327f9a062547137afe91f38408f08aacab667f \
- --hash=sha256:670d0f94cd302d81796e37299bcd04b95d62403883b24225c6b5271466612f45 \
- --hash=sha256:6c10513330af5b8ae16f023e8ddbfb486ab355d04467c4679c5cfe4659975dd9 \
- --hash=sha256:6cb3a8ec3db4a3b0eb8a3c25436c2d49e3505821802074969db017b87bc6a948 \
- --hash=sha256:6f8496d434d5cb2dce025773ba5597f71f5410ae499d5dd9533e0653258cdb3d \
- --hash=sha256:73631cd5cccbcfe63e3f9492aaa664d278fda0ce5c3d43aeda8e77317e38efbd \
- --hash=sha256:73f51dd0e0bdb596fb0417e475fa3c5e32d4c83638296e560086b8d7da7c4170 \
- --hash=sha256:7652ee180d16d447a683c04e4c5f6441bae7ba7b17ffd9f6b3aff4605e9e6f71 \
- --hash=sha256:7d2d9fd66bfadf230b385fdc90426fcd6eb64db54b40c495b72ac0feb5766c54 \
- --hash=sha256:7dee147740789a4632cace364816046e43310b59ff8fb79833ab043aefa72fd5 \
- --hash=sha256:83cd0e36932e0e7f36a64b732a6f60c2fc2df28c351bae79fbaf4f8092fe7614 \
- --hash=sha256:87e63ccfa13c0a0f6234ed0add552af24cc67dd886731f2261e46e241608bee3 \
- --hash=sha256:9ee1942ea19550094033c35d25d20726e4f1c40d59545815e1128ac58d416d38 \
- --hash=sha256:9f515a47d02da4d30caaa85b69474cec77b7929b2e936ff7fb853d42f4bf8808 \
- --hash=sha256:a1e41a81c7e2825822f4e068c48cb2196002362619e2d70b148f20a831c00739 \
- --hash=sha256:a687205fb22794e838f947e2194c0566d3812966b41c78709554aa883183fb62 \
- --hash=sha256:a7a34b13d43a6b78abf828a6d0e87d3385680eaf830cd60d20d52f249faabf39 \
- --hash=sha256:a82bb225a4e9e4d653dd2fb7b8b2d36e4fb25bc0165422a11e48b88e9e6f78fb \
- --hash=sha256:ab97cf74045343f6c60a39913fa59710e4bd26a536ce7ab2397adf8b27e67c39 \
- --hash=sha256:ac0549373982b36d5fd5d30beb8a7a33ee541ff98d2b502714a09f1169f31b55 \
- --hash=sha256:b01548f6e0b9e9784a2c99c5651e5dc89ffcbe870bc5fb2e5ef864e9cc6b5dcb \
- --hash=sha256:b299a0cb979f5d7197442dccc3aee67fce53500cd88951b7e6c35575701c980b \
- --hash=sha256:b3c374782c2935cc63b2a27ba8708471de4ad1abaa862ffdb1ef45a643ddbb7d \
- --hash=sha256:b49e7ed51876b459bd645d83db257f0180e345d3f768a35a85437a24d5a49082 \
- --hash=sha256:b96dc7eef78fd404e022e165ec55327f935b9b52ff355b067eb4a0267fc1cffb \
- --hash=sha256:c024b1e5696626890038e34f76140ed1daf858e37496d33f2af57f06189e70d7 \
- --hash=sha256:d198d2d977460358c3b3a4dc844f875d1adb33817f0613f663a656f463764ccc \
- --hash=sha256:d6ed6f85fae6cdfdb9ce04c9bf7a08d666cfcfb914e7d006f44f840b46741931 \
- --hash=sha256:d9125050fcf24554e69c4cacb086b87b3b55dc395a8b3ebe6487b045b2614388 \
- --hash=sha256:dcd2bdbd444ff340e8d6bdf54d2f206ccddbb3ccfdcd3c25bf4afaa7b8f0cf45 \
- --hash=sha256:e29f3018580e8412d6aaf5641bb7745d38c85228dacf51a73bd4e26ddf2a6a8e \
- --hash=sha256:e8e18ed6995e9e2c0b4ed264d2cf89260ab3ac7e13555b8032b25a74c6d18655
+greenlet==3.3.1 \
+ --hash=sha256:02925a0bfffc41e542c70aa14c7eda3593e4d7e274bfcccca1827e6c0875902e \
+ --hash=sha256:04bee4775f40ecefcdaa9d115ab44736cd4b9c5fba733575bfe9379419582e13 \
+ --hash=sha256:070472cd156f0656f86f92e954591644e158fd65aa415ffbe2d44ca77656a8f5 \
+ --hash=sha256:09f51496a0bfbaa9d74d36a52d2580d1ef5ed4fdfcff0a73730abfbbbe1403dd \
+ --hash=sha256:1108b61b06b5224656121c3c8ee8876161c491cbe74e5c519e0634c837cf93d5 \
+ --hash=sha256:12184c61e5d64268a160226fb4818af4df02cfead8379d7f8b99a56c3a54ff3e \
+ --hash=sha256:14194f5f4305800ff329cbf02c5fcc88f01886cadd29941b807668a45f0d2336 \
+ --hash=sha256:20fedaadd422fa02695f82093f9a98bad3dab5fcda793c658b945fcde2ab27ba \
+ --hash=sha256:27289986f4e5b0edec7b5a91063c109f0276abb09a7e9bdab08437525977c946 \
+ --hash=sha256:2f080e028001c5273e0b42690eaf359aeef9cb1389da0f171ea51a5dc3c7608d \
+ --hash=sha256:301860987846c24cb8964bdec0e31a96ad4a2a801b41b4ef40963c1b44f33451 \
+ --hash=sha256:32e4ca9777c5addcbf42ff3915d99030d8e00173a56f80001fb3875998fe410b \
+ --hash=sha256:33a956fe78bbbda82bfc95e128d61129b32d66bcf0a20a1f0c08aa4839ffa951 \
+ --hash=sha256:34a729e2e4e4ffe9ae2408d5ecaf12f944853f40ad724929b7585bca808a9d6f \
+ --hash=sha256:39eda9ba259cc9801da05351eaa8576e9aa83eb9411e8f0c299e05d712a210f2 \
+ --hash=sha256:3a300354f27dd86bae5fbf7002e6dd2b3255cd372e9242c933faf5e859b703fe \
+ --hash=sha256:3e0f3878ca3a3ff63ab4ea478585942b53df66ddde327b59ecb191b19dbbd62d \
+ --hash=sha256:3e63252943c921b90abb035ebe9de832c436401d9c45f262d80e2d06cc659242 \
+ --hash=sha256:41848f3230b58c08bb43dee542e74a2a2e34d3c59dc3076cec9151aeeedcae98 \
+ --hash=sha256:49f4ad195d45f4a66a0eb9c1ba4832bb380570d361912fa3554746830d332149 \
+ --hash=sha256:4b065d3284be43728dd280f6f9a13990b56470b81be20375a207cdc814a983f2 \
+ --hash=sha256:4b9721549a95db96689458a1e0ae32412ca18776ed004463df3a9299c1b257ab \
+ --hash=sha256:50e1457f4fed12a50e427988a07f0f9df53cf0ee8da23fab16e6732c2ec909d4 \
+ --hash=sha256:59913f1e5ada20fde795ba906916aea25d442abcc0593fba7e26c92b7ad76249 \
+ --hash=sha256:5fd23b9bc6d37b563211c6abbb1b3cab27db385a4449af5c32e932f93017080c \
+ --hash=sha256:6423481193bbbe871313de5fd06a082f2649e7ce6e08015d2a76c1e9186ca5b3 \
+ --hash=sha256:65be2f026ca6a176f88fb935ee23c18333ccea97048076aef4db1ef5bc0713ac \
+ --hash=sha256:67ea3fc73c8cd92f42467a72b75e8f05ed51a0e9b1d15398c913416f2dafd49f \
+ --hash=sha256:71c767cf281a80d02b6c1bdc41c9468e1f5a494fb11bc8688c360524e273d7b1 \
+ --hash=sha256:76e39058e68eb125de10c92524573924e827927df5d3891fbc97bd55764a8774 \
+ --hash=sha256:7932f5f57609b6a3b82cc11877709aa7a98e3308983ed93552a1c377069b20c8 \
+ --hash=sha256:7a3ae05b3d225b4155bda56b072ceb09d05e974bc74be6c3fc15463cf69f33fd \
+ --hash=sha256:7ab327905cabb0622adca5971e488064e35115430cec2c35a50fd36e72a315b3 \
+ --hash=sha256:7b2fe4150a0cf59f847a67db8c155ac36aed89080a6a639e9f16df5d6c6096f1 \
+ --hash=sha256:7e806ca53acf6d15a888405880766ec84721aa4181261cd11a457dfe9a7a4975 \
+ --hash=sha256:80aa4d79eb5564f2e0a6144fcc744b5a37c56c4a92d60920720e99210d88db0f \
+ --hash=sha256:92497c78adf3ac703b57f1e3813c2d874f27f71a178f9ea5887855da413cd6d2 \
+ --hash=sha256:96aff77af063b607f2489473484e39a0bbae730f2ea90c9e5606c9b73c44174a \
+ --hash=sha256:aec9ab04e82918e623415947921dea15851b152b822661cce3f8e4393c3df683 \
+ --hash=sha256:b066e8b50e28b503f604fa538adc764a638b38cf8e81e025011d26e8a627fa79 \
+ --hash=sha256:b31c05dd84ef6871dd47120386aed35323c944d86c3d91a17c4b8d23df62f15b \
+ --hash=sha256:bd59acd8529b372775cd0fcbc5f420ae20681c5b045ce25bd453ed8455ab99b5 \
+ --hash=sha256:bfb2d1763d777de5ee495c85309460f6fd8146e50ec9d0ae0183dbf6f0a829d1 \
+ --hash=sha256:c620051669fd04ac6b60ebc70478210119c56e2d5d5df848baec4312e260e4ca \
+ --hash=sha256:c9f9d5e7a9310b7a2f416dd13d2e3fd8b42d803968ea580b7c0f322ccb389b97 \
+ --hash=sha256:cb0feb07fe6e6a74615ee62a880007d976cf739b6669cce95daa7373d4fc69c5 \
+ --hash=sha256:cc98b9c4e4870fa983436afa999d4eb16b12872fab7071423d5262fa7120d57a \
+ --hash=sha256:d842c94b9155f1c9b3058036c24ffb8ff78b428414a19792b2380be9cecf4f36 \
+ --hash=sha256:da19609432f353fed186cc1b85e9440db93d489f198b4bdf42ae19cc9d9ac9b4 \
+ --hash=sha256:e0093bd1a06d899892427217f0ff2a3c8f306182b8c754336d32e2d587c131b4 \
+ --hash=sha256:e2e7e882f83149f0a71ac822ebf156d902e7a5d22c9045e3e0d1daf59cee2cc9 \
+ --hash=sha256:e84b51cbebf9ae573b5fbd15df88887815e3253fc000a7d0ff95170e8f7e9729 \
+ --hash=sha256:ed6b402bc74d6557a705e197d47f9063733091ed6357b3de33619d8a8d93ac53
# via feast (setup.py)
grpc-google-iam-v1==0.14.3 \
--hash=sha256:7a7f697e017a067206a3dfef44e4c634a34d3dee135fe7d7a4613fe3e59217e6 \
@@ -1017,9 +1022,9 @@ grpcio-status==1.62.3 \
--hash=sha256:289bdd7b2459794a12cf95dc0cb727bd4a1742c37bd823f760236c937e53a485 \
--hash=sha256:f9049b762ba8de6b1086789d8315846e094edac2c50beaf462338b301a8fd4b8
# via google-api-core
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
diff --git a/sdk/python/requirements/py3.11-requirements.txt b/sdk/python/requirements/py3.11-requirements.txt
index 40f3ec865e8..735025162ad 100644
--- a/sdk/python/requirements/py3.11-requirements.txt
+++ b/sdk/python/requirements/py3.11-requirements.txt
@@ -174,9 +174,9 @@ fsspec==2026.1.0 \
--hash=sha256:cb76aa913c2285a3b49bdd5fc55b1d7c708d7208126b60f2eb8194fe1b4cbdcc \
--hash=sha256:e987cb0496a0d81bba3a9d1cee62922fb395e7d4c3b575e57f547953334fe07b
# via dask
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
diff --git a/sdk/python/requirements/py3.12-ci-requirements.txt b/sdk/python/requirements/py3.12-ci-requirements.txt
index 974ba0d7fb9..f99befde48e 100644
--- a/sdk/python/requirements/py3.12-ci-requirements.txt
+++ b/sdk/python/requirements/py3.12-ci-requirements.txt
@@ -249,6 +249,7 @@ attrs==25.4.0 \
# aiohttp
# jsonlines
# jsonschema
+ # openlineage-python
# referencing
azure-core==1.38.0 \
--hash=sha256:8194d2682245a3e4e3151a667c686464c3786fed7918b394d035bdcd61bb5993 \
@@ -371,6 +372,7 @@ build==1.4.0 \
--hash=sha256:f1b91b925aa322be454f8330c6fb48b465da993d1e7e7e6fa35027ec49f3c936
# via
# feast (setup.py)
+ # openlineage-python
# pip-tools
# singlestoredb
cassandra-driver==3.29.3 \
@@ -1005,16 +1007,16 @@ docling==2.27.0 \
--hash=sha256:1288ed75b27e33bf94daff34faffc6d11b7d7ccc13e3df84fb24adad3991f72d \
--hash=sha256:faba35662612a2c687a3a463e501d95f645316436084af92a0442ce162429a3d
# via feast (setup.py)
-docling-core[chunking]==2.60.1 \
- --hash=sha256:45390e50cb4d83a70e2384c70a46e6e64acb15e69674d9d2c67315155f252aef \
- --hash=sha256:64bd71dee243bd11b25f216fec219e046a130b851b8e1d0c0dd362a4aac0e994
+docling-core[chunking]==2.60.2 \
+ --hash=sha256:63aee783f06240455c12c30e9af383b80d7ade80c896f81d68a4aff6cde2e2a1 \
+ --hash=sha256:7a99e1671e796e39d0c735b7ae3833766a97ad287e15d434dfa417917e3b0e6d
# via
# docling
# docling-ibm-models
# docling-parse
-docling-ibm-models==3.10.3 \
- --hash=sha256:6be756e45df155a367087b93e0e5f2d65905e7e81a5f57c1d3ae57096631655a \
- --hash=sha256:e034d1398c99059998da18e38ef80af8a5d975f04de17f6e93efa075fb29cac4
+docling-ibm-models==3.11.0 \
+ --hash=sha256:454401563a8e79cb33b718bc559d9bacca8a0183583e48f8e616c9184c1f5eb1 \
+ --hash=sha256:68f7961069d643bfdab21b1c9ef24a979db293496f4c2283d95b1025a9ac5347
# via docling
docling-parse==4.7.3 \
--hash=sha256:1790e7e4ae202d67875c1c48fd6f8ef5c51d10b0c23157e4989b8673f2f31308 \
@@ -1600,9 +1602,9 @@ grpcio-tools==1.62.3 \
--hash=sha256:f4b1615adf67bd8bb71f3464146a6f9949972d06d21a4f5e87e73f6464d97f57 \
--hash=sha256:f6831fdec2b853c9daa3358535c55eed3694325889aa714070528cf8f92d7d6d
# via feast (setup.py)
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
@@ -1806,6 +1808,7 @@ httpx[http2]==0.27.2 \
# fastapi-mcp
# jupyterlab
# mcp
+ # openlineage-python
# python-keycloak
# qdrant-client
httpx-sse==0.4.3 \
@@ -2010,9 +2013,9 @@ jupyter-server-terminals==0.5.4 \
--hash=sha256:55be353fc74a80bc7f3b20e6be50a55a61cd525626f578dcb66a5708e2007d14 \
--hash=sha256:bbda128ed41d0be9020349f9f1f2a4ab9952a73ed5f5ac9f1419794761fb87f5
# via jupyter-server
-jupyterlab==4.5.2 \
- --hash=sha256:76466ebcfdb7a9bb7e2fbd6459c0e2c032ccf75be673634a84bee4b3e6b13ab6 \
- --hash=sha256:c80a6b9f6dace96a566d590c65ee2785f61e7cd4aac5b4d453dcc7d0d5e069b7
+jupyterlab==4.5.3 \
+ --hash=sha256:4a159f71067cb38e4a82e86a42de8e7e926f384d7f2291964f282282096d27e8 \
+ --hash=sha256:63c9f3a48de72ba00df766ad6eed416394f5bb883829f11eeff0872302520ba7
# via notebook
jupyterlab-pygments==0.3.0 \
--hash=sha256:721aca4d9029252b11cfa9d185e5b5af4d54772bb8072f9b7036f4170054d35d \
@@ -2999,6 +3002,9 @@ opencv-python-headless==4.13.0.90 \
--hash=sha256:eba38bc255d0b7d1969c5bcc90a060ca2b61a3403b613872c750bfa5dfe9e03b \
--hash=sha256:f46b17ea0aa7e4124ca6ad71143f89233ae9557f61d2326bcdb34329a1ddf9bd
# via easyocr
+openlineage-python==1.43.0 \
+ --hash=sha256:595dc641f696d0a1c021440a9ff8155f4e2776452cf118112a09b12cf4038827
+ # via feast (setup.py)
openpyxl==3.1.5 \
--hash=sha256:5282c12b107bffeef825f4617dc029afaf41d0ea60823bbb665ef3079dc79de2 \
--hash=sha256:cf0e3cf56142039133628b5acffe8ef0c12bc902d2aadd3e0fe5878dc08d1050
@@ -3146,6 +3152,7 @@ packaging==26.0 \
# lazy-loader
# marshmallow
# nbconvert
+ # openlineage-python
# pandas-gbq
# pytest
# ray
@@ -4454,6 +4461,7 @@ python-dateutil==2.9.0 \
# jupyter-client
# kubernetes
# moto
+ # openlineage-python
# pandas
# trino
python-docx==1.2.0 \
@@ -4577,6 +4585,7 @@ pyyaml==6.0.3 \
# huggingface-hub
# jupyter-events
# kubernetes
+ # openlineage-python
# openshift-client
# pre-commit
# ray
@@ -4868,6 +4877,7 @@ requests==2.32.5 \
# kubernetes
# moto
# msal
+ # openlineage-python
# python-keycloak
# ray
# requests-oauthlib
@@ -5934,66 +5944,75 @@ transformers==4.57.6 \
# feast (setup.py)
# docling-core
# docling-ibm-models
-tree-sitter==0.24.0 \
- --hash=sha256:01ea01a7003b88b92f7f875da6ba9d5d741e0c84bb1bd92c503c0eecd0ee6409 \
- --hash=sha256:033506c1bc2ba7bd559b23a6bdbeaf1127cee3c68a094b82396718596dfe98bc \
- --hash=sha256:098a81df9f89cf254d92c1cd0660a838593f85d7505b28249216661d87adde4a \
- --hash=sha256:0b26bf9e958da6eb7e74a081aab9d9c7d05f9baeaa830dbb67481898fd16f1f5 \
- --hash=sha256:0d4a6416ed421c4210f0ca405a4834d5ccfbb8ad6692d4d74f7773ef68f92071 \
- --hash=sha256:14beeff5f11e223c37be7d5d119819880601a80d0399abe8c738ae2288804afc \
- --hash=sha256:23641bd25dcd4bb0b6fa91b8fb3f46cc9f1c9f475efe4d536d3f1f688d1b84c8 \
- --hash=sha256:24a8dd03b0d6b8812425f3b84d2f4763322684e38baf74e5bb766128b5633dc7 \
- --hash=sha256:26a5b130f70d5925d67b47db314da209063664585a2fd36fa69e0717738efaf4 \
- --hash=sha256:2a84ff87a2f2a008867a1064aba510ab3bd608e3e0cd6e8fef0379efee266c73 \
- --hash=sha256:3b1f3cbd9700e1fba0be2e7d801527e37c49fc02dc140714669144ef6ab58dce \
- --hash=sha256:464fa5b2cac63608915a9de8a6efd67a4da1929e603ea86abaeae2cb1fe89921 \
- --hash=sha256:4ddb113e6b8b3e3b199695b1492a47d87d06c538e63050823d90ef13cac585fd \
- --hash=sha256:57277a12fbcefb1c8b206186068d456c600dbfbc3fd6c76968ee22614c5cd5ad \
- --hash=sha256:5fc5c3c26d83c9d0ecb4fc4304fba35f034b7761d35286b936c1db1217558b4e \
- --hash=sha256:772e1bd8c0931c866b848d0369b32218ac97c24b04790ec4b0e409901945dd8e \
- --hash=sha256:7d5d9537507e1c8c5fa9935b34f320bfec4114d675e028f3ad94f11cf9db37b9 \
- --hash=sha256:a7c9c89666dea2ce2b2bf98e75f429d2876c569fab966afefdcd71974c6d8538 \
- --hash=sha256:abd95af65ca2f4f7eca356343391ed669e764f37748b5352946f00f7fc78e734 \
- --hash=sha256:c012e4c345c57a95d92ab5a890c637aaa51ab3b7ff25ed7069834b1087361c95 \
- --hash=sha256:d25fa22766d63f73716c6fec1a31ee5cf904aa429484256bd5fdf5259051ed74 \
- --hash=sha256:de0fb7c18c6068cacff46250c0a0473e8fc74d673e3e86555f131c2c1346fb13 \
- --hash=sha256:e0992d483677e71d5c5d37f30dfb2e3afec2f932a9c53eec4fca13869b788c6c \
- --hash=sha256:f3f00feff1fc47a8e4863561b8da8f5e023d382dd31ed3e43cd11d4cae445445 \
- --hash=sha256:f3f08a2ca9f600b3758792ba2406971665ffbad810847398d180c48cee174ee2 \
- --hash=sha256:f58bb4956917715ec4d5a28681829a8dad5c342cafd4aea269f9132a83ca9b34 \
- --hash=sha256:f733a83d8355fc95561582b66bbea92ffd365c5d7a665bc9ebd25e049c2b2abb \
- --hash=sha256:f9691be48d98c49ef8f498460278884c666b44129222ed6217477dffad5d4831 \
- --hash=sha256:f9e8b1605ab60ed43803100f067eed71b0b0e6c1fb9860a262727dbfbbb74751
+tree-sitter==0.25.2 \
+ --hash=sha256:0628671f0de69bb279558ef6b640bcfc97864fe0026d840f872728a86cd6b6cd \
+ --hash=sha256:0c8b6682cac77e37cfe5cf7ec388844957f48b7bd8d6321d0ca2d852994e10d5 \
+ --hash=sha256:1799609636c0193e16c38f366bda5af15b1ce476df79ddaae7dd274df9e44266 \
+ --hash=sha256:20b570690f87f1da424cd690e51cc56728d21d63f4abd4b326d382a30353acc7 \
+ --hash=sha256:260586381b23be33b6191a07cea3d44ecbd6c01aa4c6b027a0439145fcbc3358 \
+ --hash=sha256:3e65ae456ad0d210ee71a89ee112ac7e72e6c2e5aac1b95846ecc7afa68a194c \
+ --hash=sha256:44488e0e78146f87baaa009736886516779253d6d6bac3ef636ede72bc6a8234 \
+ --hash=sha256:463c032bd02052d934daa5f45d183e0521ceb783c2548501cf034b0beba92c9b \
+ --hash=sha256:4973b718fcadfb04e59e746abfbb0288694159c6aeecd2add59320c03368c721 \
+ --hash=sha256:49ee3c348caa459244ec437ccc7ff3831f35977d143f65311572b8ba0a5f265f \
+ --hash=sha256:56ac6602c7d09c2c507c55e58dc7026b8988e0475bd0002f8a386cce5e8e8adc \
+ --hash=sha256:65d3c931013ea798b502782acab986bbf47ba2c452610ab0776cf4a8ef150fc0 \
+ --hash=sha256:6d0302550bbe4620a5dc7649517c4409d74ef18558276ce758419cf09e578897 \
+ --hash=sha256:72a510931c3c25f134aac2daf4eb4feca99ffe37a35896d7150e50ac3eee06c7 \
+ --hash=sha256:7712335855b2307a21ae86efe949c76be36c6068d76df34faa27ce9ee40ff444 \
+ --hash=sha256:7d2ee1acbacebe50ba0f85fff1bc05e65d877958f00880f49f9b2af38dce1af0 \
+ --hash=sha256:a0ec41b895da717bc218a42a3a7a0bfcfe9a213d7afaa4255353901e0e21f696 \
+ --hash=sha256:a925364eb7fbb9cdce55a9868f7525a1905af512a559303bd54ef468fd88cb37 \
+ --hash=sha256:b3d11a3a3ac89bb8a2543d75597f905a9926f9c806f40fcca8242922d1cc6ad5 \
+ --hash=sha256:b3f63a1796886249bd22c559a5944d64d05d43f2be72961624278eff0dcc5cb8 \
+ --hash=sha256:b43a9e4c89d4d0839de27cd4d6902d33396de700e9ff4c5ab7631f277a85ead9 \
+ --hash=sha256:b878e296e63661c8e124177cc3084b041ba3f5936b43076d57c487822426f614 \
+ --hash=sha256:b8ca72d841215b6573ed0655b3a5cd1133f9b69a6fa561aecad40dca9029d75b \
+ --hash=sha256:b8d4429954a3beb3e844e2872610d2a4800ba4eb42bb1990c6a4b1949b18459f \
+ --hash=sha256:bd88fbb0f6c3a0f28f0a68d72df88e9755cf5215bae146f5a1bdc8362b772053 \
+ --hash=sha256:bda059af9d621918efb813b22fb06b3fe00c3e94079c6143fcb2c565eb44cb87 \
+ --hash=sha256:c0c0ab5f94938a23fe81928a21cc0fac44143133ccc4eb7eeb1b92f84748331c \
+ --hash=sha256:c2f8e7d6b2f8489d4a9885e3adcaef4bc5ff0a275acd990f120e29c4ab3395c5 \
+ --hash=sha256:cc0351cfe5022cec5a77645f647f92a936b38850346ed3f6d6babfbeeeca4d26 \
+ --hash=sha256:d77605e0d353ba3fe5627e5490f0fbfe44141bafa4478d88ef7954a61a848dae \
+ --hash=sha256:dd12d80d91d4114ca097626eb82714618dcdfacd6a5e0955216c6485c350ef99 \
+ --hash=sha256:ddabfff809ffc983fc9963455ba1cecc90295803e06e140a4c83e94c1fa3d960 \
+ --hash=sha256:eac4e8e4c7060c75f395feec46421eb61212cb73998dbe004b7384724f3682ab \
+ --hash=sha256:f5ddcd3e291a749b62521f71fc953f66f5fd9743973fd6dd962b092773569601 \
+ --hash=sha256:fbb1706407c0e451c4f8cc016fec27d72d4b211fdd3173320b1ada7a6c74c3ac \
+ --hash=sha256:fe43c158555da46723b28b52e058ad444195afd1db3ca7720c59a254544e9c20
# via docling-core
-tree-sitter-c==0.23.4 \
- --hash=sha256:013403e74765d74e523f380f9df8f3d99e9fe94132a3fc0c8b29cba538a7b2bf \
- --hash=sha256:2c92c0571b36b6da06f8882f34151dc11e67a493e9101cc0026a16da27709c05 \
- --hash=sha256:5e42a3519825ca59c91b2b7aec08dd3c89e02690c7b315d54a1e1743f9be3f15 \
- --hash=sha256:9215c7888dd019038f162ea5646178f6e129cd2b49fc506d14becf5e426121d7 \
- --hash=sha256:98c285a23bf4fb6fb34140d6ea0f0d25d0a93e0d93692f9dffe3db6d1fe08534 \
- --hash=sha256:a4d7bdeaca8f1da72352a945853f56aa5d34e7bc22569ec5bda5d7c1a04e5b0f \
- --hash=sha256:c15c7588c3d95872328019073a8d5eaf7c2691b4d4ef0393a0168399b2ad2356 \
- --hash=sha256:edd36e12cc79b8b5bbc81fc336ff7d2577d0fe16afd18163c9aff7ae3ff69e15
+tree-sitter-c==0.24.1 \
+ --hash=sha256:290bff0f9c79c966496ebae45042f77543e6e4aea725f40587a8611d566231a8 \
+ --hash=sha256:789781afcb710df34144f7e2a20cd80e325114b9119e3956c6bd1dd2d365df98 \
+ --hash=sha256:7d2d0cda0b8dda428c81440c1e94367f9f13548eedca3f49768bde66b1422ad6 \
+ --hash=sha256:942bcd7cbecd810dcf7ca6f8f834391ebf0771a89479646d891ba4ca2fdfdc88 \
+ --hash=sha256:9a74cfd7a11ca5a961fafd4d751892ee65acae667d2818968a6f079397d8d28c \
+ --hash=sha256:9c06ac26a1efdcc8b26a8a6970fbc6997c4071857359e5837d4c42892d45fe1e \
+ --hash=sha256:a6a807705a3978911dc7ee26a7ad36dcfacb6adfc13c190d496660ec9bd66707 \
+ --hash=sha256:d46bbda06f838c2dcb91daf767813671fd366b49ad84ff37db702129267b46e1
# via docling-core
-tree-sitter-javascript==0.23.1 \
- --hash=sha256:041fa22b34250ea6eb313d33104d5303f79504cb259d374d691e38bbdc49145b \
- --hash=sha256:056dc04fb6b24293f8c5fec43c14e7e16ba2075b3009c643abf8c85edc4c7c3c \
- --hash=sha256:5a6bc1055b061c5055ec58f39ee9b2e9efb8e6e0ae970838af74da0afb811f0a \
- --hash=sha256:6ca583dad4bd79d3053c310b9f7208cd597fd85f9947e4ab2294658bb5c11e35 \
- --hash=sha256:94100e491a6a247aa4d14caf61230c171b6376c863039b6d9cd71255c2d815ec \
- --hash=sha256:a11ca1c0f736da42967586b568dff8a465ee148a986c15ebdc9382806e0ce871 \
- --hash=sha256:b2059ce8b150162cda05a457ca3920450adbf915119c04b8c67b5241cd7fcfed \
- --hash=sha256:eb28130cd2fb30d702d614cbf61ef44d1c7f6869e7d864a9cc17111e370be8f7
+tree-sitter-javascript==0.25.0 \
+ --hash=sha256:199d09985190852e0912da2b8d26c932159be314bc04952cf917ed0e4c633e6b \
+ --hash=sha256:1b852d3aee8a36186dbcc32c798b11b4869f9b5041743b63b65c2ef793db7a54 \
+ --hash=sha256:329b5414874f0588a98f1c291f1b28138286617aa907746ffe55adfdcf963f38 \
+ --hash=sha256:622a69d677aa7f6ee2931d8c77c981a33f0ebb6d275aa9d43d3397c879a9bb0b \
+ --hash=sha256:8264a996b8845cfce06965152a013b5d9cbb7d199bc3503e12b5682e62bb1de1 \
+ --hash=sha256:9dc04ba91fc8583344e57c1f1ed5b2c97ecaaf47480011b92fbeab8dda96db75 \
+ --hash=sha256:b70f887fb269d6e58c349d683f59fa647140c410cfe2bee44a883b20ec92e3dc \
+ --hash=sha256:dfcf789064c58dc13c0a4edb550acacfc6f0f280577f1e7a00de3e89fc7f8ddc \
+ --hash=sha256:e5ed840f5bd4a3f0272e441d19429b26eedc257abe5574c8546da6b556865e3c
# via docling-core
-tree-sitter-python==0.23.6 \
- --hash=sha256:28fbec8f74eeb2b30292d97715e60fac9ccf8a8091ce19b9d93e9b580ed280fb \
- --hash=sha256:29dacdc0cd2f64e55e61d96c6906533ebb2791972bec988450c46cce60092f5d \
- --hash=sha256:354bfa0a2f9217431764a631516f85173e9711af2c13dbd796a8815acfe505d9 \
- --hash=sha256:680b710051b144fedf61c95197db0094f2245e82551bf7f0c501356333571f7a \
- --hash=sha256:71334371bd73d5fe080aed39fbff49ed8efb9506edebe16795b0c7567ed6a272 \
- --hash=sha256:7e048733c36f564b379831689006801feb267d8194f9e793fbb395ef1723335d \
- --hash=sha256:8a9dcef55507b6567207e8ee0a6b053d0688019b47ff7f26edc1764b7f4dc0a4 \
- --hash=sha256:a24027248399fb41594b696f929f9956828ae7cc85596d9f775e6c239cd0c2be
+tree-sitter-python==0.25.0 \
+ --hash=sha256:0fbf6a3774ad7e89ee891851204c2e2c47e12b63a5edbe2e9156997731c128bb \
+ --hash=sha256:14a79a47ddef72f987d5a2c122d148a812169d7484ff5c75a3db9609d419f361 \
+ --hash=sha256:480c21dbd995b7fe44813e741d71fed10ba695e7caab627fb034e3828469d762 \
+ --hash=sha256:71959832fc5d9642e52c11f2f7d79ae520b461e63334927e93ca46cd61cd9683 \
+ --hash=sha256:86f118e5eecad616ecdb81d171a36dde9bef5a0b21ed71ea9c3e390813c3baf5 \
+ --hash=sha256:9bcde33f18792de54ee579b00e1b4fe186b7926825444766f849bf7181793a76 \
+ --hash=sha256:b13e090f725f5b9c86aa455a268553c65cadf325471ad5b65cd29cac8a1a68ac \
+ --hash=sha256:be71650ca2b93b6e9649e5d65c6811aad87a7614c8c1003246b303f6b150f61b \
+ --hash=sha256:e6d5b5799628cc0f24691ab2a172a8e676f668fe90dc60468bee14084a35c16d
# via docling-core
tree-sitter-typescript==0.23.2 \
--hash=sha256:05db58f70b95ef0ea126db5560f3775692f609589ed6f8dd0af84b7f19f1cbb7 \
@@ -6038,9 +6057,9 @@ types-pyopenssl==24.1.0.20240722 \
--hash=sha256:47913b4678a01d879f503a12044468221ed8576263c1540dcb0484ca21b08c39 \
--hash=sha256:6a7a5d2ec042537934cfb4c9d4deb0e16c4c6250b09358df1f083682fe6fda54
# via types-redis
-types-python-dateutil==2.9.0.20251115 \
- --hash=sha256:8a47f2c3920f52a994056b8786309b43143faa5a64d4cbb2722d6addabdf1a58 \
- --hash=sha256:9cf9c1c582019753b8639a081deefd7e044b9fa36bd8217f565c6c4e36ee0624
+types-python-dateutil==2.9.0.20260124 \
+ --hash=sha256:7d2db9f860820c30e5b8152bfe78dbdf795f7d1c6176057424e8b3fdd1f581af \
+ --hash=sha256:f802977ae08bf2260142e7ca1ab9d4403772a254409f7bbdf652229997124951
# via feast (setup.py)
types-pytz==2025.2.0.20251108 \
--hash=sha256:0f1c9792cab4eb0e46c52f8845c8f77cf1e313cb3d68bf826aa867fe4717d91c \
@@ -6058,9 +6077,9 @@ types-requests==2.30.0.0 \
--hash=sha256:c6cf08e120ca9f0dc4fa4e32c3f953c3fba222bcc1db6b97695bce8da1ba9864 \
--hash=sha256:dec781054324a70ba64430ae9e62e7e9c8e4618c185a5cb3f87a6738251b5a31
# via feast (setup.py)
-types-setuptools==80.9.0.20251223 \
- --hash=sha256:1b36db79d724c2287d83dc052cf887b47c0da6a2fff044378be0b019545f56e6 \
- --hash=sha256:d3411059ae2f5f03985217d86ac6084efea2c9e9cacd5f0869ef950f308169b2
+types-setuptools==80.10.0.20260124 \
+ --hash=sha256:1b86d9f0368858663276a0cbe5fe5a9722caf94b5acde8aba0399a6e90680f20 \
+ --hash=sha256:efed7e044f01adb9c2806c7a8e1b6aa3656b8e382379b53d5f26ee3db24d4c01
# via
# feast (setup.py)
# types-cffi
@@ -6418,9 +6437,9 @@ watchfiles==1.1.1 \
--hash=sha256:f8979280bdafff686ba5e4d8f97840f929a87ed9cdf133cbbd42f7766774d2aa \
--hash=sha256:f9a2ae5c91cecc9edd47e041a930490c31c3afb1f5e6d71de3dc671bfaca02bf
# via uvicorn
-wcwidth==0.3.1 \
- --hash=sha256:5aedb626a9c0d941b990cfebda848d538d45c9493a3384d080aff809143bd3be \
- --hash=sha256:b2d355df3ec5d51bfc973a22fb4ea9a03b12fdcbf00d0abd22a2c78b12ccc177
+wcwidth==0.3.2 \
+ --hash=sha256:817abc6a89e47242a349b5d100cbd244301690d6d8d2ec6335f26fe6640a6315 \
+ --hash=sha256:d469b3059dab6b1077def5923ed0a8bf5738bd4a1a87f686d5e2de455354c4ad
# via prompt-toolkit
webcolors==25.10.0 \
--hash=sha256:032c727334856fc0b968f63daa252a1ac93d33db2f5267756623c210e57a4f1d \
diff --git a/sdk/python/requirements/py3.12-minimal-requirements.txt b/sdk/python/requirements/py3.12-minimal-requirements.txt
index 7ed88615af3..e102e17b40d 100644
--- a/sdk/python/requirements/py3.12-minimal-requirements.txt
+++ b/sdk/python/requirements/py3.12-minimal-requirements.txt
@@ -873,9 +873,9 @@ grpcio-status==1.62.3 \
--hash=sha256:289bdd7b2459794a12cf95dc0cb727bd4a1742c37bd823f760236c937e53a485 \
--hash=sha256:f9049b762ba8de6b1086789d8315846e094edac2c50beaf462338b301a8fd4b8
# via google-api-core
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
diff --git a/sdk/python/requirements/py3.12-minimal-sdist-requirements-build.txt b/sdk/python/requirements/py3.12-minimal-sdist-requirements-build.txt
index 85b2ae1669b..84ab8c9f4c8 100644
--- a/sdk/python/requirements/py3.12-minimal-sdist-requirements-build.txt
+++ b/sdk/python/requirements/py3.12-minimal-sdist-requirements-build.txt
@@ -775,9 +775,9 @@ types-psutil==7.0.0.20250218 \
--hash=sha256:1447a30c282aafefcf8941ece854e1100eee7b0296a9d9be9977292f0269b121 \
--hash=sha256:1e642cdafe837b240295b23b1cbd4691d80b08a07d29932143cbbae30eb0db9c
# via mypy
-types-setuptools==80.9.0.20251223 \
- --hash=sha256:1b36db79d724c2287d83dc052cf887b47c0da6a2fff044378be0b019545f56e6 \
- --hash=sha256:d3411059ae2f5f03985217d86ac6084efea2c9e9cacd5f0869ef950f308169b2
+types-setuptools==80.10.0.20260124 \
+ --hash=sha256:1b86d9f0368858663276a0cbe5fe5a9722caf94b5acde8aba0399a6e90680f20 \
+ --hash=sha256:efed7e044f01adb9c2806c7a8e1b6aa3656b8e382379b53d5f26ee3db24d4c01
# via mypy
typing-extensions==4.15.0 \
--hash=sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466 \
diff --git a/sdk/python/requirements/py3.12-minimal-sdist-requirements.txt b/sdk/python/requirements/py3.12-minimal-sdist-requirements.txt
index f2aff007867..b6da023e3ff 100644
--- a/sdk/python/requirements/py3.12-minimal-sdist-requirements.txt
+++ b/sdk/python/requirements/py3.12-minimal-sdist-requirements.txt
@@ -880,55 +880,60 @@ googleapis-common-protos[grpc]==1.72.0 \
# google-api-core
# grpc-google-iam-v1
# grpcio-status
-greenlet==3.3.0 \
- --hash=sha256:047ab3df20ede6a57c35c14bf5200fcf04039d50f908270d3f9a7a82064f543b \
- --hash=sha256:087ea5e004437321508a8d6f20efc4cfec5e3c30118e1417ea96ed1d93950527 \
- --hash=sha256:0a5d554d0712ba1de0a6c94c640f7aeba3f85b3a6e1f2899c11c2c0428da9365 \
- --hash=sha256:2662433acbca297c9153a4023fe2161c8dcfdcc91f10433171cf7e7d94ba2221 \
- --hash=sha256:286d093f95ec98fdd92fcb955003b8a3d054b4e2cab3e2707a5039e7b50520fd \
- --hash=sha256:2d9ad37fc657b1102ec880e637cccf20191581f75c64087a549e66c57e1ceb53 \
- --hash=sha256:2de5a0b09eab81fc6a382791b995b1ccf2b172a9fec934747a7a23d2ff291794 \
- --hash=sha256:30a6e28487a790417d036088b3bcb3f3ac7d8babaa7d0139edbaddebf3af9492 \
- --hash=sha256:349345b770dc88f81506c6861d22a6ccd422207829d2c854ae2af8025af303e3 \
- --hash=sha256:39b28e339fc3c348427560494e28d8a6f3561c8d2bcf7d706e1c624ed8d822b9 \
- --hash=sha256:3a898b1e9c5f7307ebbde4102908e6cbfcb9ea16284a3abe15cab996bee8b9b3 \
- --hash=sha256:3c6e9b9c1527a78520357de498b0e709fb9e2f49c3a513afd5a249007261911b \
- --hash=sha256:4243050a88ba61842186cb9e63c7dfa677ec146160b0efd73b855a3d9c7fcf32 \
- --hash=sha256:4449a736606bd30f27f8e1ff4678ee193bc47f6ca810d705981cfffd6ce0d8c5 \
- --hash=sha256:5375d2e23184629112ca1ea89a53389dddbffcf417dad40125713d88eb5f96e8 \
- --hash=sha256:5773edda4dc00e173820722711d043799d3adb4f01731f40619e07ea2750b955 \
- --hash=sha256:60c2ef0f578afb3c8d92ea07ad327f9a062547137afe91f38408f08aacab667f \
- --hash=sha256:670d0f94cd302d81796e37299bcd04b95d62403883b24225c6b5271466612f45 \
- --hash=sha256:6c10513330af5b8ae16f023e8ddbfb486ab355d04467c4679c5cfe4659975dd9 \
- --hash=sha256:6cb3a8ec3db4a3b0eb8a3c25436c2d49e3505821802074969db017b87bc6a948 \
- --hash=sha256:6f8496d434d5cb2dce025773ba5597f71f5410ae499d5dd9533e0653258cdb3d \
- --hash=sha256:73631cd5cccbcfe63e3f9492aaa664d278fda0ce5c3d43aeda8e77317e38efbd \
- --hash=sha256:73f51dd0e0bdb596fb0417e475fa3c5e32d4c83638296e560086b8d7da7c4170 \
- --hash=sha256:7652ee180d16d447a683c04e4c5f6441bae7ba7b17ffd9f6b3aff4605e9e6f71 \
- --hash=sha256:7d2d9fd66bfadf230b385fdc90426fcd6eb64db54b40c495b72ac0feb5766c54 \
- --hash=sha256:7dee147740789a4632cace364816046e43310b59ff8fb79833ab043aefa72fd5 \
- --hash=sha256:83cd0e36932e0e7f36a64b732a6f60c2fc2df28c351bae79fbaf4f8092fe7614 \
- --hash=sha256:87e63ccfa13c0a0f6234ed0add552af24cc67dd886731f2261e46e241608bee3 \
- --hash=sha256:9ee1942ea19550094033c35d25d20726e4f1c40d59545815e1128ac58d416d38 \
- --hash=sha256:9f515a47d02da4d30caaa85b69474cec77b7929b2e936ff7fb853d42f4bf8808 \
- --hash=sha256:a1e41a81c7e2825822f4e068c48cb2196002362619e2d70b148f20a831c00739 \
- --hash=sha256:a687205fb22794e838f947e2194c0566d3812966b41c78709554aa883183fb62 \
- --hash=sha256:a7a34b13d43a6b78abf828a6d0e87d3385680eaf830cd60d20d52f249faabf39 \
- --hash=sha256:a82bb225a4e9e4d653dd2fb7b8b2d36e4fb25bc0165422a11e48b88e9e6f78fb \
- --hash=sha256:ab97cf74045343f6c60a39913fa59710e4bd26a536ce7ab2397adf8b27e67c39 \
- --hash=sha256:ac0549373982b36d5fd5d30beb8a7a33ee541ff98d2b502714a09f1169f31b55 \
- --hash=sha256:b01548f6e0b9e9784a2c99c5651e5dc89ffcbe870bc5fb2e5ef864e9cc6b5dcb \
- --hash=sha256:b299a0cb979f5d7197442dccc3aee67fce53500cd88951b7e6c35575701c980b \
- --hash=sha256:b3c374782c2935cc63b2a27ba8708471de4ad1abaa862ffdb1ef45a643ddbb7d \
- --hash=sha256:b49e7ed51876b459bd645d83db257f0180e345d3f768a35a85437a24d5a49082 \
- --hash=sha256:b96dc7eef78fd404e022e165ec55327f935b9b52ff355b067eb4a0267fc1cffb \
- --hash=sha256:c024b1e5696626890038e34f76140ed1daf858e37496d33f2af57f06189e70d7 \
- --hash=sha256:d198d2d977460358c3b3a4dc844f875d1adb33817f0613f663a656f463764ccc \
- --hash=sha256:d6ed6f85fae6cdfdb9ce04c9bf7a08d666cfcfb914e7d006f44f840b46741931 \
- --hash=sha256:d9125050fcf24554e69c4cacb086b87b3b55dc395a8b3ebe6487b045b2614388 \
- --hash=sha256:dcd2bdbd444ff340e8d6bdf54d2f206ccddbb3ccfdcd3c25bf4afaa7b8f0cf45 \
- --hash=sha256:e29f3018580e8412d6aaf5641bb7745d38c85228dacf51a73bd4e26ddf2a6a8e \
- --hash=sha256:e8e18ed6995e9e2c0b4ed264d2cf89260ab3ac7e13555b8032b25a74c6d18655
+greenlet==3.3.1 \
+ --hash=sha256:02925a0bfffc41e542c70aa14c7eda3593e4d7e274bfcccca1827e6c0875902e \
+ --hash=sha256:04bee4775f40ecefcdaa9d115ab44736cd4b9c5fba733575bfe9379419582e13 \
+ --hash=sha256:070472cd156f0656f86f92e954591644e158fd65aa415ffbe2d44ca77656a8f5 \
+ --hash=sha256:09f51496a0bfbaa9d74d36a52d2580d1ef5ed4fdfcff0a73730abfbbbe1403dd \
+ --hash=sha256:1108b61b06b5224656121c3c8ee8876161c491cbe74e5c519e0634c837cf93d5 \
+ --hash=sha256:12184c61e5d64268a160226fb4818af4df02cfead8379d7f8b99a56c3a54ff3e \
+ --hash=sha256:14194f5f4305800ff329cbf02c5fcc88f01886cadd29941b807668a45f0d2336 \
+ --hash=sha256:20fedaadd422fa02695f82093f9a98bad3dab5fcda793c658b945fcde2ab27ba \
+ --hash=sha256:27289986f4e5b0edec7b5a91063c109f0276abb09a7e9bdab08437525977c946 \
+ --hash=sha256:2f080e028001c5273e0b42690eaf359aeef9cb1389da0f171ea51a5dc3c7608d \
+ --hash=sha256:301860987846c24cb8964bdec0e31a96ad4a2a801b41b4ef40963c1b44f33451 \
+ --hash=sha256:32e4ca9777c5addcbf42ff3915d99030d8e00173a56f80001fb3875998fe410b \
+ --hash=sha256:33a956fe78bbbda82bfc95e128d61129b32d66bcf0a20a1f0c08aa4839ffa951 \
+ --hash=sha256:34a729e2e4e4ffe9ae2408d5ecaf12f944853f40ad724929b7585bca808a9d6f \
+ --hash=sha256:39eda9ba259cc9801da05351eaa8576e9aa83eb9411e8f0c299e05d712a210f2 \
+ --hash=sha256:3a300354f27dd86bae5fbf7002e6dd2b3255cd372e9242c933faf5e859b703fe \
+ --hash=sha256:3e0f3878ca3a3ff63ab4ea478585942b53df66ddde327b59ecb191b19dbbd62d \
+ --hash=sha256:3e63252943c921b90abb035ebe9de832c436401d9c45f262d80e2d06cc659242 \
+ --hash=sha256:41848f3230b58c08bb43dee542e74a2a2e34d3c59dc3076cec9151aeeedcae98 \
+ --hash=sha256:49f4ad195d45f4a66a0eb9c1ba4832bb380570d361912fa3554746830d332149 \
+ --hash=sha256:4b065d3284be43728dd280f6f9a13990b56470b81be20375a207cdc814a983f2 \
+ --hash=sha256:4b9721549a95db96689458a1e0ae32412ca18776ed004463df3a9299c1b257ab \
+ --hash=sha256:50e1457f4fed12a50e427988a07f0f9df53cf0ee8da23fab16e6732c2ec909d4 \
+ --hash=sha256:59913f1e5ada20fde795ba906916aea25d442abcc0593fba7e26c92b7ad76249 \
+ --hash=sha256:5fd23b9bc6d37b563211c6abbb1b3cab27db385a4449af5c32e932f93017080c \
+ --hash=sha256:6423481193bbbe871313de5fd06a082f2649e7ce6e08015d2a76c1e9186ca5b3 \
+ --hash=sha256:65be2f026ca6a176f88fb935ee23c18333ccea97048076aef4db1ef5bc0713ac \
+ --hash=sha256:67ea3fc73c8cd92f42467a72b75e8f05ed51a0e9b1d15398c913416f2dafd49f \
+ --hash=sha256:71c767cf281a80d02b6c1bdc41c9468e1f5a494fb11bc8688c360524e273d7b1 \
+ --hash=sha256:76e39058e68eb125de10c92524573924e827927df5d3891fbc97bd55764a8774 \
+ --hash=sha256:7932f5f57609b6a3b82cc11877709aa7a98e3308983ed93552a1c377069b20c8 \
+ --hash=sha256:7a3ae05b3d225b4155bda56b072ceb09d05e974bc74be6c3fc15463cf69f33fd \
+ --hash=sha256:7ab327905cabb0622adca5971e488064e35115430cec2c35a50fd36e72a315b3 \
+ --hash=sha256:7b2fe4150a0cf59f847a67db8c155ac36aed89080a6a639e9f16df5d6c6096f1 \
+ --hash=sha256:7e806ca53acf6d15a888405880766ec84721aa4181261cd11a457dfe9a7a4975 \
+ --hash=sha256:80aa4d79eb5564f2e0a6144fcc744b5a37c56c4a92d60920720e99210d88db0f \
+ --hash=sha256:92497c78adf3ac703b57f1e3813c2d874f27f71a178f9ea5887855da413cd6d2 \
+ --hash=sha256:96aff77af063b607f2489473484e39a0bbae730f2ea90c9e5606c9b73c44174a \
+ --hash=sha256:aec9ab04e82918e623415947921dea15851b152b822661cce3f8e4393c3df683 \
+ --hash=sha256:b066e8b50e28b503f604fa538adc764a638b38cf8e81e025011d26e8a627fa79 \
+ --hash=sha256:b31c05dd84ef6871dd47120386aed35323c944d86c3d91a17c4b8d23df62f15b \
+ --hash=sha256:bd59acd8529b372775cd0fcbc5f420ae20681c5b045ce25bd453ed8455ab99b5 \
+ --hash=sha256:bfb2d1763d777de5ee495c85309460f6fd8146e50ec9d0ae0183dbf6f0a829d1 \
+ --hash=sha256:c620051669fd04ac6b60ebc70478210119c56e2d5d5df848baec4312e260e4ca \
+ --hash=sha256:c9f9d5e7a9310b7a2f416dd13d2e3fd8b42d803968ea580b7c0f322ccb389b97 \
+ --hash=sha256:cb0feb07fe6e6a74615ee62a880007d976cf739b6669cce95daa7373d4fc69c5 \
+ --hash=sha256:cc98b9c4e4870fa983436afa999d4eb16b12872fab7071423d5262fa7120d57a \
+ --hash=sha256:d842c94b9155f1c9b3058036c24ffb8ff78b428414a19792b2380be9cecf4f36 \
+ --hash=sha256:da19609432f353fed186cc1b85e9440db93d489f198b4bdf42ae19cc9d9ac9b4 \
+ --hash=sha256:e0093bd1a06d899892427217f0ff2a3c8f306182b8c754336d32e2d587c131b4 \
+ --hash=sha256:e2e7e882f83149f0a71ac822ebf156d902e7a5d22c9045e3e0d1daf59cee2cc9 \
+ --hash=sha256:e84b51cbebf9ae573b5fbd15df88887815e3253fc000a7d0ff95170e8f7e9729 \
+ --hash=sha256:ed6b402bc74d6557a705e197d47f9063733091ed6357b3de33619d8a8d93ac53
# via feast (setup.py)
grpc-google-iam-v1==0.14.3 \
--hash=sha256:7a7f697e017a067206a3dfef44e4c634a34d3dee135fe7d7a4613fe3e59217e6 \
@@ -1013,9 +1018,9 @@ grpcio-status==1.62.3 \
--hash=sha256:289bdd7b2459794a12cf95dc0cb727bd4a1742c37bd823f760236c937e53a485 \
--hash=sha256:f9049b762ba8de6b1086789d8315846e094edac2c50beaf462338b301a8fd4b8
# via google-api-core
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
diff --git a/sdk/python/requirements/py3.12-requirements.txt b/sdk/python/requirements/py3.12-requirements.txt
index 7519a055f00..74cb562a296 100644
--- a/sdk/python/requirements/py3.12-requirements.txt
+++ b/sdk/python/requirements/py3.12-requirements.txt
@@ -174,9 +174,9 @@ fsspec==2026.1.0 \
--hash=sha256:cb76aa913c2285a3b49bdd5fc55b1d7c708d7208126b60f2eb8194fe1b4cbdcc \
--hash=sha256:e987cb0496a0d81bba3a9d1cee62922fb395e7d4c3b575e57f547953334fe07b
# via dask
-gunicorn==24.0.0 \
- --hash=sha256:30401647ed4f162a3f7e5b8b3ed77e6e88d9a4ea5599f1ff31f7f54a7610339c \
- --hash=sha256:7f6c916308740942586f49f14cf3743bdfc4130c29520c42706f42fa5a2dacdb
+gunicorn==24.1.1 \
+ --hash=sha256:757f6b621fc4f7581a90600b2cd9df593461f06a41d7259cb9b94499dc4095a8 \
+ --hash=sha256:f006d110e5cb3102859b4f5cd48335dbd9cc28d0d27cd24ddbdafa6c60929408
# via
# feast (setup.py)
# uvicorn-worker
diff --git a/sdk/python/tests/doctest/test_all.py b/sdk/python/tests/doctest/test_all.py
index 8a85a72ab45..bfe7b549032 100644
--- a/sdk/python/tests/doctest/test_all.py
+++ b/sdk/python/tests/doctest/test_all.py
@@ -79,7 +79,11 @@ def test_docstrings():
full_name = package.__name__ + "." + name
try:
# https://github.com/feast-dev/feast/issues/5088
- if "ikv" not in full_name and "milvus" not in full_name:
+ if (
+ "ikv" not in full_name
+ and "milvus" not in full_name
+ and "openlineage" not in full_name
+ ):
temp_module = importlib.import_module(full_name)
if is_pkg:
next_packages.append(temp_module)
diff --git a/setup.py b/setup.py
index 9d1700c820e..88b88d8b221 100644
--- a/setup.py
+++ b/setup.py
@@ -104,6 +104,8 @@
OPENTELEMETRY = ["prometheus_client", "psutil"]
+OPENLINEAGE_REQUIRED = ["openlineage-python>=1.40.0"]
+
MYSQL_REQUIRED = ["pymysql", "types-PyMySQL"]
HBASE_REQUIRED = [
@@ -262,6 +264,7 @@
+ SINGLESTORE_REQUIRED
+ COUCHBASE_REQUIRED
+ OPENTELEMETRY
+ + OPENLINEAGE_REQUIRED
+ FAISS_REQUIRED
+ QDRANT_REQUIRED
+ MILVUS_REQUIRED
@@ -365,6 +368,7 @@
"singlestore": SINGLESTORE_REQUIRED,
"couchbase": COUCHBASE_REQUIRED,
"opentelemetry": OPENTELEMETRY,
+ "openlineage": OPENLINEAGE_REQUIRED,
"faiss": FAISS_REQUIRED,
"qdrant": QDRANT_REQUIRED,
"go": GO_REQUIRED,