Generally, in the inference scenario, the input data is a protobuf message that contains the value of each feature required by the model, plus a tensor indicating the number of items in the sample. The user-side tensor has shape [D_user], and the item-side tensor has shape [N, D_item].
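The shapes above imply that the user-side features are shared by all N items in a sample, so they can be sent once and expanded server-side. A minimal NumPy sketch of this idea (the dimensions and feature names here are illustrative, not from the actual service):

```python
import numpy as np

# Illustrative sizes: D_user user-side features shared by N candidate items,
# each item carrying D_item item-side features.
D_user, D_item, N = 8, 16, 100

user = np.arange(D_user, dtype=np.float32)              # [D_user], sent once
items = np.ones((N, D_item), dtype=np.float32)          # [N, D_item]

# Server side: broadcast (tile) the shared user features to the item count,
# recovering the uncompressed per-item input of width D_user + D_item.
user_tiled = np.broadcast_to(user, (N, D_user))         # [N, D_user], view, no copy
batch = np.concatenate([user_tiled, items], axis=1)     # [N, D_user + D_item]

print(batch.shape)  # (100, 24)
```

Because the user tensor is transmitted once instead of N times, the request payload shrinks by roughly a factor of N on the user-side features, which is the compression the next section measures.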
## Performance
Compressing user-side features reduces the end-to-end latency of inference. In one cloud online-serving case, the performance results are as follows: