Commit 16696b8

feat: Consolidate Python packaging - remove setup.py/setup.cfg, standardize on pyproject.toml and uv

Signed-off-by: ntkathole <nikhilkathole2683@gmail.com>
1 parent c636cd4

20 files changed: 54 additions & 472 deletions

.github/workflows/linter.yml
Lines changed: 1 addition & 1 deletion

@@ -26,7 +26,7 @@ jobs:
         uses: actions/cache@v4
         with:
           path: sdk/python/.mypy_cache
-          key: mypy-${{ runner.os }}-py${{ env.PYTHON }}-${{ hashFiles('pyproject.toml', 'setup.py', 'sdk/python/pyproject.toml', 'sdk/python/requirements/*.txt') }}
+          key: mypy-${{ runner.os }}-py${{ env.PYTHON }}-${{ hashFiles('pyproject.toml', 'sdk/python/pyproject.toml', 'sdk/python/requirements/*.txt') }}
           restore-keys: |
             mypy-${{ runner.os }}-py${{ env.PYTHON }}-
             mypy-${{ runner.os }}-
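This hunk drops `setup.py` from the set of manifests hashed into the mypy cache key, so the cache is invalidated only when one of the remaining files changes. As a rough illustration of how a content-hash cache key behaves (this helper is a sketch, not GitHub's `hashFiles` implementation, and the file names are throwaways):

```python
import hashlib
import tempfile
from pathlib import Path

def combined_hash(paths):
    """Hash the contents of several files into one hex digest, so a
    change to any single file produces a different cache key."""
    digest = hashlib.sha256()
    for p in sorted(paths):
        digest.update(Path(p).read_bytes())
    return digest.hexdigest()

# Demo with temporary files standing in for pyproject.toml and a lock file.
tmp = Path(tempfile.mkdtemp())
(tmp / "a.toml").write_text("x = 1")
(tmp / "b.txt").write_text("pkg==1.0")
key = f"mypy-linux-py3.11-{combined_hash(tmp.glob('*'))}"
print(key)
```

Editing any hashed file changes the digest, which is exactly why removing `setup.py` from the list matters: stale references to a deleted file would otherwise break the key expression.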

Makefile
Lines changed: 5 additions & 13 deletions

@@ -69,7 +69,7 @@ precommit-check: format-python lint-python ## Run all precommit checks
 
 # Install precommit hooks with correct stages
 install-precommit: ## Install precommit hooks (runs on commit, not push)
-	pip install pre-commit
+	uv pip install pre-commit
 	pre-commit install --hook-type pre-commit
 	@echo "✅ Precommit hooks installed (will run on commit, not push)"
 
@@ -126,31 +126,23 @@ install-hadoop-dependencies-ci: ## Install Hadoop dependencies
 		tar -xzf $$HOME/hadoop-3.4.2.tar.gz -C $$HOME; \
 		mv $$HOME/hadoop-3.4.2 $$HOME/hadoop; \
 	fi
-install-python-ci-dependencies: ## Install Python CI dependencies in system environment using piptools
-	python -m piptools sync sdk/python/requirements/py$(PYTHON_VERSION)-ci-requirements.txt
-	pip install --no-deps -e .
-
-# Currently used in test-end-to-end.sh
-install-python: ## Install Python requirements and develop package (setup.py develop)
-	python -m piptools sync sdk/python/requirements/py$(PYTHON_VERSION)-requirements.txt
-	python setup.py develop
 
 lock-python-dependencies-all: ## Recompile and lock all Python dependency sets for all supported versions
 	# Remove all existing requirements because we noticed the lock file is not always updated correctly.
 	# Removing and running the command again ensures that the lock file is always up to date.
 	rm -rf sdk/python/requirements/* 2>/dev/null || true
 	$(foreach ver,$(PYTHON_VERSIONS),\
 		pixi run --environment $(call get_env_name,$(ver)) --manifest-path infra/scripts/pixi/pixi.toml \
-			"uv pip compile -p $(ver) --no-strip-extras setup.py --extra ci \
+			"uv pip compile -p $(ver) --no-strip-extras pyproject.toml --extra ci \
			--generate-hashes --output-file sdk/python/requirements/py$(ver)-ci-requirements.txt" && \
 		pixi run --environment $(call get_env_name,$(ver)) --manifest-path infra/scripts/pixi/pixi.toml \
-			"uv pip compile -p $(ver) --no-strip-extras setup.py \
+			"uv pip compile -p $(ver) --no-strip-extras pyproject.toml \
			--generate-hashes --output-file sdk/python/requirements/py$(ver)-requirements.txt" && \
 		pixi run --environment $(call get_env_name,$(ver)) --manifest-path infra/scripts/pixi/pixi.toml \
-			"uv pip compile -p $(ver) --no-strip-extras setup.py --extra minimal \
+			"uv pip compile -p $(ver) --no-strip-extras pyproject.toml --extra minimal \
			--generate-hashes --output-file sdk/python/requirements/py$(ver)-minimal-requirements.txt" && \
 		pixi run --environment $(call get_env_name,$(ver)) --manifest-path infra/scripts/pixi/pixi.toml \
-			"uv pip compile -p $(ver) --no-strip-extras setup.py --extra minimal-sdist-build \
+			"uv pip compile -p $(ver) --no-strip-extras pyproject.toml --extra minimal-sdist-build \
			--no-emit-package milvus-lite \
			--generate-hashes --output-file sdk/python/requirements/py$(ver)-minimal-sdist-requirements.txt" && \
 		pixi run --environment $(call get_env_name,$(ver)) --manifest-path infra/scripts/pixi/pixi.toml \
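The `$(foreach ...)` loop above runs one `uv pip compile` invocation per supported Python version and per extra, each inside a pixi environment; the only change in this commit is that the compile input is now `pyproject.toml` instead of `setup.py`. A plain-shell sketch of what one pass of the loop expands to (the version list is illustrative, and the commands are only printed here, nothing is compiled):

```shell
# Print the compile command the Make loop would run for each version.
for ver in 3.10 3.11 3.12; do
  echo "uv pip compile -p $ver --no-strip-extras pyproject.toml --extra ci --generate-hashes --output-file sdk/python/requirements/py$ver-ci-requirements.txt"
done
```

`--generate-hashes` pins each locked package to its artifact hashes, which is why the lock files must be regenerated whenever an extra's dependency list changes.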

docs/how-to-guides/customizing-feast/adding-a-new-offline-store.md
Lines changed: 2 additions & 2 deletions

@@ -440,10 +440,10 @@ test-python-universal-spark:
 
 ### 7. Dependencies
 
-Add any dependencies for your offline store to our `sdk/python/setup.py` under a new `<OFFLINE_STORE>__REQUIRED` list with the packages and add it to the setup script so that if your offline store is needed, users can install the necessary python packages. These packages should be defined as extras so that they are not installed by users by default. You will need to regenerate our requirements files:
+Add any dependencies for your offline store to `pyproject.toml` under `[project.optional-dependencies]` as a new extra (e.g. `<offline_store> = ["package1>=1.0", "package2"]`). These packages should be defined as extras so that they are not installed by users by default. You will need to regenerate our requirements lock files:
 
 ```
-make lock-python-ci-dependencies-all
+make lock-python-dependencies-all
 ```
 
 ### 8. Add Documentation

docs/how-to-guides/customizing-feast/adding-support-for-a-new-online-store.md
Lines changed: 3 additions & 4 deletions

@@ -384,13 +384,12 @@ test-python-universal-cassandra:
 
 ### 5. Add Dependencies
 
-Add any dependencies for your online store to our `sdk/python/setup.py` under a new `<ONLINE_STORE>_REQUIRED` list with the packages and add it to the setup script so that if your online store is needed, users can install the necessary python packages. These packages should be defined as extras so that they are not installed by users by default.
+Add any dependencies for your online store to `pyproject.toml` under `[project.optional-dependencies]` as a new extra (e.g. `<online_store> = ["package1>=1.0", "package2"]`). These packages should be defined as extras so that they are not installed by users by default.
 
-* You will need to regenerate our requirements files. To do this, create separate pyenv environments for python 3.8, 3.9, and 3.10. In each environment, run the following commands:
+* You will need to regenerate our requirements lock files:
 
 ```
-export PYTHON=<version>
-make lock-python-ci-dependencies
+make lock-python-dependencies-all
 ```
 
 ### 6. Add Documentation

docs/project/development-guide.md
Lines changed: 8 additions & 9 deletions

@@ -91,13 +91,12 @@ See [Creating a pull request from a fork](https://docs.github.com/en/github/coll
 
 ### Pre-commit Hooks
 Setup [`pre-commit`](https://pre-commit.com/) to automatically lint and format the codebase on commit:
-1. Ensure that you have Python (3.7 and above) with `pip`, installed.
-2. Install `pre-commit` with `pip` & install pre-push hooks
+1. Ensure that you have Python (3.10 and above) with [uv](https://docs.astral.sh/uv/) installed.
+2. Install `pre-commit` and hooks:
 ```sh
-pip install pre-commit
-pre-commit install --hook-type pre-commit --hook-type pre-push
+make install-precommit
 ```
-3. On push, the pre-commit hook will run. This runs `make format` and `make lint`.
+3. On commit, the pre-commit hook will run. This runs `make format` and `make lint`.
 
 ### Signing off commits
 > :warning: Warning: using the default integrations with IDEs like VSCode or IntelliJ will not sign commits.

@@ -351,18 +350,18 @@ You can run `make test-python-integration-container` to run tests against the co
 
 ### Contrib integration tests
 #### (Contrib) Running tests for Spark offline store
-You can run `make test-python-universal-spark` to run all tests against the Spark offline store. (Note: you'll have to run `pip install -e ".[dev]"` first).
+You can run `make test-python-universal-spark` to run all tests against the Spark offline store. (Note: you'll have to run `make install-python-dependencies-dev` first).
 
 Not all tests are passing yet
 
 #### (Contrib) Running tests for Trino offline store
-You can run `make test-python-universal-trino` to run all tests against the Trino offline store. (Note: you'll have to run `pip install -e ".[dev]"` first)
+You can run `make test-python-universal-trino` to run all tests against the Trino offline store. (Note: you'll have to run `make install-python-dependencies-dev` first)
 
 #### (Contrib) Running tests for Postgres offline store
-You can run `test-python-universal-postgres-offline` to run all tests against the Postgres offline store. (Note: you'll have to run `pip install -e ".[dev]"` first)
+You can run `test-python-universal-postgres-offline` to run all tests against the Postgres offline store. (Note: you'll have to run `make install-python-dependencies-dev` first)
 
 #### (Contrib) Running tests for Postgres online store
-You can run `test-python-universal-postgres-online` to run all tests against the Postgres offline store. (Note: you'll have to run `pip install -e ".[dev]"` first)
+You can run `test-python-universal-postgres-online` to run all tests against the Postgres offline store. (Note: you'll have to run `make install-python-dependencies-dev` first)
 
 #### (Contrib) Running tests for HBase online store
 TODO

environment-setup.md
Lines changed: 8 additions & 15 deletions

@@ -1,22 +1,15 @@
-1. install anaconda, install docker
-2. create an environment for feast, selecting python 3.9. Activate the environment:
+1. Install Docker and [uv](https://docs.astral.sh/uv/getting-started/installation/)
+2. Create a virtual environment and activate it:
 ```bash
-conda create --name feast python=3.9
-conda activate feast
+uv venv --python 3.11
+source .venv/bin/activate
 ```
-3. install dependencies:
+3. Install dependencies:
 ```bash
-pip install pip-tools
-brew install mysql
-brew install xz protobuf openssl zlib
-pip install cryptography -U
-conda install protobuf
-conda install pymssql
-pip install -e ".[dev]"
-make install-python-ci-dependencies PYTHON=3.9
+make install-python-dependencies-dev
 ```
-4. start the docker daemon
-5. run unit tests:
+4. Start the Docker daemon
+5. Run unit tests:
 ```bash
 make test-python-unit
 ```

infra/scripts/test-end-to-end.sh
Lines changed: 2 additions & 2 deletions

@@ -7,8 +7,8 @@ infra/scripts/download-maven-cache.sh --archive-uri ${MAVEN_CACHE} --output-dir
 apt-get update && apt-get install -y redis-server postgresql libpq-dev
 
 make build-java-no-tests REVISION=develop
-python -m pip install --upgrade pip setuptools wheel pip-tools
-make install-python
+pip install uv
+make install-python-dependencies-dev
 python -m pip install -qr tests/requirements.txt
 
 su -p postgres -c "PATH=$PATH HOME=/tmp pytest -v tests/e2e/ --feast-version develop"

java/serving/src/test/resources/docker-compose/feast10/Dockerfile
Lines changed: 2 additions & 2 deletions

@@ -1,14 +1,14 @@
 FROM python:3.11
 
+COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv
+
 WORKDIR /app
 COPY sdk/python /mnt/feast/sdk/python
 COPY protos /mnt/feast/protos
-COPY setup.py /mnt/feast/setup.py
 COPY pyproject.toml /mnt/feast/pyproject.toml
 COPY README.md /mnt/feast/README.md
 COPY Makefile /mnt/feast/Makefile
 ENV SETUPTOOLS_SCM_PRETEND_VERSION=0.1.0
-RUN pip install uv
 RUN cd /mnt/feast && uv pip install --system .[grpcio,redis]
 COPY java/serving/src/test/resources/docker-compose/feast10/ .
 EXPOSE 8080

pyproject.toml
Lines changed: 1 addition & 0 deletions

@@ -230,6 +230,7 @@ requires = [
     "sphinx!=4.0.0",
     "wheel>=0.46.2",
 ]
+build-backend = "setuptools.build_meta"
 
 [tool.setuptools]
 packages = {find = {where = ["sdk/python"], exclude = ["java", "infra", "sdk/python/tests", "ui"]}}

sdk/python/docs/conf.py
Lines changed: 3 additions & 4 deletions

@@ -19,8 +19,6 @@
 import os
 import sys
 
-import sphinx_rtd_theme
-
 sys.path.insert(0, os.path.abspath("../../feast"))
 sys.path.insert(0, os.path.abspath("../.."))

@@ -32,7 +30,6 @@
 # example where the Python protos did not build, which subsequently broke
 # the RTD build. In order to fix this, we manually compile the protos.
 import subprocess
-
 from pathlib import Path
 
 # cwd will be feast/sdk/python/docs/source

@@ -42,7 +39,9 @@
 os.chdir(cwd.parent.parent.parent.parent)
 
 # Compile Python protos
-result = subprocess.run(["python", "setup.py", "build_python_protos", "--inplace"], capture_output=True)
+result = subprocess.run(
+    ["python", "infra/scripts/generate_protos.py"], capture_output=True
+)
 stdout = result.stdout.decode("utf-8")
 stderr = result.stderr.decode("utf-8")
 print(f"Apply stdout:\n{stdout}")
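The rewritten `conf.py` replaces the `setup.py build_python_protos` command with a standalone script, keeping the same capture-and-decode pattern. A self-contained sketch of that pattern, run against a trivial stand-in command instead of the proto-generation script so it works anywhere:

```python
import subprocess
import sys

# Same subprocess.run(..., capture_output=True) pattern as the docs build.
result = subprocess.run(
    [sys.executable, "-c", "print('protos ok')"], capture_output=True
)
stdout = result.stdout.decode("utf-8")
stderr = result.stderr.decode("utf-8")
print(f"Apply stdout:\n{stdout}")
```

Checking `result.returncode` after the call is worthwhile in a docs build, since `capture_output=True` silently swallows errors that would otherwise surface on the console.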
