Is your feature request related to a problem? Please describe.
We should allow Feature Views to return matrices/tensors natively, e.g. torch.Tensor.
At the moment, for some features we require the client to serialize the output into a matrix before running inference. Feast should support executing these transformations and serializing the data into matrices for both online and offline retrieval.
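For context, the client-side serialization mentioned above typically looks something like the sketch below: stacking the columnar dict that `OnlineResponse.to_dict()` returns into a single matrix. This is a minimal illustration, not Feast code; NumPy stands in for torch, the response dict is mocked, and the feature names are hypothetical.

```python
import numpy as np

def features_to_matrix(response: dict, feature_names: list[str]) -> np.ndarray:
    """Stack columnar feature lists (the shape of OnlineResponse.to_dict())
    into a (num_rows, num_features) matrix, one column per feature."""
    columns = [np.asarray(response[name], dtype=np.float32) for name in feature_names]
    return np.stack(columns, axis=1)

# Mocked response dict in the columnar shape to_dict() produces (names are illustrative).
response = {
    "driver_id": [1001, 1002],
    "conv_rate": [0.5, 0.7],
    "acc_rate": [0.9, 0.4],
}
matrix = features_to_matrix(response, ["conv_rate", "acc_rate"])
print(matrix.shape)  # (2, 2)
```

Every team ends up re-writing some variant of this glue (dtype coercion, column ordering, missing-value handling), which is exactly the brittle logic this proposal would move into Feast.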
Describe the solution you'd like
features: torch.Tensor = store.get_online_features()
Describe alternatives you've considered
The alternative is to not support this, i.e. the current state, which leaves users to write their own brittle logic to handle the various complexities (dtypes, column ordering, missing values).
Additional context
@HaoXuAI @tokoko I know we discussed sklearn pipelines in the past and I thought I'd share my thoughts.