fix: Remove hard-coded integration test setup for AWS & GCP (feast-dev#2970)
* Fix
Signed-off-by: Kevin Zhang <kzhang@tecton.ai>
* Fix lint
Signed-off-by: Kevin Zhang <kzhang@tecton.ai>
* Fix
Signed-off-by: Kevin Zhang <kzhang@tecton.ai>
* see if fix works
Signed-off-by: Kevin Zhang <kzhang@tecton.ai>
* Fix lint
Signed-off-by: Kevin Zhang <kzhang@tecton.ai>
* Fix
Signed-off-by: Kevin Zhang <kzhang@tecton.ai>
* Fix
Signed-off-by: Kevin Zhang <kzhang@tecton.ai>
* Fix lint
Signed-off-by: Kevin Zhang <kzhang@tecton.ai>
CONTRIBUTING.md: 53 additions & 6 deletions
@@ -139,7 +139,7 @@ There are two sets of tests you can run:
 #### Local integration tests
 For this approach of running tests, you'll need to have docker set up locally: [Get Docker](https://docs.docker.com/get-docker/)
-It leverages a file based offline store to test against emulated versions of Datastore, DynamoDB, and Redis, using ephemeral containers.
+It leverages a file based offline store to test against emulated versions of Datastore, DynamoDB, and Redis, using ephemeral containers.

 These tests create new temporary tables / datasets locally only, and they are cleaned up when the containers are torn down.
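Since the local suite depends on Docker being reachable, a quick preflight check can save a confusing failure later. A minimal sketch (this helper is not part of the Feast repo, just an illustration):

```shell
# Hedged sketch: check whether the Docker daemon is reachable before
# running the local integration tests. Sets a status variable rather
# than exiting, so it is safe to source.
if docker info > /dev/null 2>&1; then
  docker_status="running"
else
  docker_status="not reachable"
fi
echo "docker is ${docker_status}"
```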
@@ -161,17 +161,48 @@ To test across clouds, on top of setting up Redis, you also need GCP / AWS / Sno
 gcloud auth login
 gcloud auth application-default login
 ```
-3. Export `GCLOUD_PROJECT=[your project]` to your .zshrc
+   - When you run `gcloud auth application-default login`, you should see some output of the form:
+     ```
+     Credentials saved to file: [$HOME/.config/gcloud/application_default_credentials.json]
+     ```
+   - You should run `export GOOGLE_APPLICATION_CREDENTIALS="$HOME/.config/gcloud/application_default_credentials.json"` to add the application credentials to your .zshrc or .bashrc.
+3. Add `export GCLOUD_PROJECT=[your project]` to your .zshrc or .bashrc.
+4. Running `gcloud config list` should give you something like this:
+   ```sh
+   $ gcloud config list
+   [core]
+   account = [your email]
+   disable_usage_reporting = True
+   project = [your project]
+
+   Your active configuration is: [default]
+   ```
+5. Export GCP-specific environment variables. Namely,
 Then run `make test-python-integration`. Note that for Snowflake / GCP / AWS, this will create new temporary tables / datasets.
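The credential and project settings from the steps above can be sketched as shell-profile lines. A minimal sketch — `my-gcp-project` is a placeholder, not a value from this guide:

```shell
# Hedged sketch of the .zshrc / .bashrc additions described above.
# "my-gcp-project" is a placeholder; substitute your own GCP project id.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/.config/gcloud/application_default_credentials.json"
export GCLOUD_PROJECT="my-gcp-project"
```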
+#### Running specific provider tests or running your test against specific online or offline stores
+
+1. If you don't need to have your test run against all of the providers (`gcp`, `aws`, and `snowflake`) or don't need to run against all of the online stores, you can tag your test with the specific providers or stores that you need (`@pytest.mark.universal_offline_stores` or `@pytest.mark.universal_online_stores` with the `only` parameter). The `only` parameter selects the specific offline providers and online stores that your test will run against. Example:
+
+   ```python
+   # Only parametrizes this test with the sqlite online store
+   @pytest.mark.universal_online_stores(only=["sqlite"])
+   def test_example_online_retrieval(environment):  # illustrative test name
+       ...
+   ```
+
+2. You can also filter tests to run by using pytest's CLI filtering. Instead of using the make commands to test Feast, you can filter tests by name with the `-k` parameter. The parametrized integration tests are all uniquely identified by their provider and online store, so the `-k` option can select only the tests that you need to run. For example, to run only Redshift-related tests, you can use the following command:
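A hedged sketch of such an invocation — the test path and the exact keyword expression are assumptions about the repo layout, not taken from this guide:

```shell
# Hedged sketch: pytest's -k expression matches against test ids, which
# include the parametrized provider / online-store names, so "Redshift"
# selects only the Redshift-parametrized tests.
# The path below is an assumption about the Feast repo layout.
pytest sdk/python/tests --integration -k "Redshift"
```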