Commit b181bde

fix: Fixing the return order of elements when calculating the min and max entity-DF event timestamps in the Spark offline store.
Signed-off-by: Lev Pickovsky <lev.pickovsky@ironsrc.com>
Parent: 00ed65a

1 file changed

Lines changed: 1 addition & 1 deletion

sdk/python/feast/infra/offline_stores/contrib/spark_offline_store/spark.py

Previously the tuple was built as (max, min); callers expect (min, max), so the two aggregations are swapped:

@@ -324,8 +324,8 @@ def _get_entity_df_event_timestamp_range(
         df = spark_session.sql(entity_df).select(entity_df_event_timestamp_col)
         # TODO(kzhang132): need utc conversion here.
         entity_df_event_timestamp_range = (
-            df.agg({entity_df_event_timestamp_col: "max"}).collect()[0][0],
             df.agg({entity_df_event_timestamp_col: "min"}).collect()[0][0],
+            df.agg({entity_df_event_timestamp_col: "max"}).collect()[0][0],
         )
     else:
         raise InvalidEntityType(type(entity_df))
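The order matters because downstream code unpacks this tuple as (start, end) of the entity-DataFrame event-time range. A minimal sketch of the behavior, using plain Python min/max on a hypothetical list of timestamps in place of the Spark df.agg(...) calls:

```python
from datetime import datetime

# Hypothetical stand-in for the entity DataFrame's event timestamp column;
# the real code computes these bounds with df.agg({col: "min"/"max"}).
timestamps = [
    datetime(2024, 1, 3),
    datetime(2024, 1, 1),
    datetime(2024, 1, 2),
]

# Before this commit the tuple was (max, min); code that unpacks it as
# (start, end) would see an inverted, effectively empty time range.
# The fix returns (min, max).
entity_df_event_timestamp_range = (min(timestamps), max(timestamps))

start, end = entity_df_event_timestamp_range
in_range = [t for t in timestamps if start <= t <= end]
print(len(in_range))  # with the corrected order, every timestamp is in range
```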
