Added support for factors-only initialization in Tucker decomposition #596
sebulo wants to merge 1 commit into tensorly:main from
Conversation
JeanKossaifi left a comment
Looks good overall, thanks @sebulo, I left some comments in the code. Generally, it's good to group each big change in its own PR.
```python
    factors : list of factors
    """
    # Validate rank against tensor shape
    rank = validate_tucker_rank(tensor.shape, rank)
```
Is there a reason to add this? We already validate inside the main decomposition function, which would be called less frequently than this one. I think we can assume it is already validated here - and document that in the docstring.
You are right. It is already validated earlier.
```python
    ]

    elif isinstance(init, (tuple, list)):  # either a tuple (core, factors) or a list of factors
        if isinstance(init, tuple) and len(init) == 2:
```
We might want to use Iterable here instead of tuple to keep it consistent and more robust; same below, instead of list.
Also, I think we may want to verify that init[1] is also Iterable. It's verbose, but init may be a weird 2-tuple from a partial-tucker.
Thanks for the comments! I addressed these concerns in the newest commit: use Iterable, validate factors in tuple init.
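The Iterable-based dispatch discussed above can be sketched roughly as follows. This is a hedged illustration, not tensorly's actual code: the helper name `classify_init` is hypothetical, and telling a `(core, factors)` 2-tuple apart from a plain 2-element sequence of factors is inherently ambiguous, which is exactly why the reviewer suggests also checking that `init[1]` is itself an Iterable.

```python
from collections.abc import Iterable

# Hypothetical helper illustrating the reviewer's suggestion: accept any
# Iterable rather than only tuple/list, and verify that the second element
# of a (core, factors) pair is itself an Iterable of factor matrices.
# Strings ("svd", "random") are assumed to be handled elsewhere.
def classify_init(init):
    """Return 'core_factors' for a (core, factors) pair,
    'factors_only' for a plain iterable of factor matrices."""
    if isinstance(init, tuple) and len(init) == 2 and isinstance(init[1], Iterable):
        return "core_factors"
    if isinstance(init, Iterable) and not isinstance(init, str):
        return "factors_only"
    raise ValueError(f"Unsupported init: {init!r}")
```

With this shape of check, a generator-backed sequence of factors would also be accepted, which is the robustness the review comment is after.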
```python
    """
    if fixed_factors:
        try:
            if isinstance(init, list):  # Handle factors-only init
```
Same, let's use Iterable here
```python
    if fixed_factors:
        try:
            if isinstance(init, list):  # Handle factors-only init
                init = initialize_tucker(tensor, rank, list(range(tensor.ndim)), random_state, init=init)
```
No need to create an intermediate init variable; we can unpack directly into (core, factors).
```python
    res = T.dot(matrix_or_vector, unfold(tensor, mode))
    unfolded = unfold(tensor, mode)
    if tensor.dtype == torch.complex32 and not handle_complex_half_by_upcasting:
```
Let's leave this for a separate PR; we need to think more about how to incorporate it cleanly. Also, we should be using tl dtypes, not torch dtypes directly, since the latter will break on other backends.
I agree with @JeanKossaifi, the support of complex operations in Tensorly, while interesting, should be handled separately.
Also, I believe some backends like PyTorch have built-in support for matrix multiplication with complex numbers, which may be faster than splitting the real and imaginary parts manually, e.g. see here.
Thanks for the useful addition @sebulo! Can I also ask if you can add a small unit test for the initialization with only factor matrices? Something like checking that the core computed is indeed the optimal one, and that Tucker runs when called with this initialization (on very small examples).
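A hedged sketch of what such a test could check, written in pure NumPy so it stands alone (the `unfold` and `multi_mode_dot` helpers here are minimal stand-ins for tensorly's, not the library's actual implementations). The key property: for factors with orthonormal columns, the least-squares-optimal core is the multilinear projection of the tensor onto the factor subspaces, and that projection never increases the Frobenius norm.

```python
import numpy as np

def unfold(tensor, mode):
    # Mode-n unfolding: mode axis first, remaining axes flattened.
    return np.reshape(np.moveaxis(tensor, mode, 0), (tensor.shape[mode], -1))

def multi_mode_dot(tensor, matrices, transpose=False):
    # Successive mode-n products; each factor contracts its own mode.
    res = tensor
    for mode, m in enumerate(matrices):
        m = m.T if transpose else m
        res = np.moveaxis(np.tensordot(res, m, axes=(mode, 1)), -1, mode)
    return res

def test_factors_only_init_core():
    rng = np.random.RandomState(0)
    tensor = rng.random_sample((4, 5, 6))
    rank = (2, 3, 3)
    # Orthonormal factors from truncated SVDs of the mode-n unfoldings.
    factors = [
        np.linalg.svd(unfold(tensor, mode), full_matrices=False)[0][:, :r]
        for mode, r in enumerate(rank)
    ]
    # Optimal core for orthonormal factors: project tensor onto the subspaces.
    core = multi_mode_dot(tensor, factors, transpose=True)
    reconstruction = multi_mode_dot(core, factors)
    # An orthogonal projection never increases the Frobenius norm.
    assert np.linalg.norm(reconstruction) <= np.linalg.norm(tensor) + 1e-8
    return core.shape
```

A second test, omitted here because it depends on the PR's new API, would simply call `tucker(tensor, rank=rank, init=factors)` and check it runs and returns a core of the expected shape.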
@sebulo, it would be great if you could give this another pass whenever you have time!
Sorry for the long delay. I gave this another pass and pushed a cleaned-up update focused only on factors-only Tucker initialization. The change allows init in Tucker decomposition to be provided as only the factor matrices, without an explicit core; in that case, the core is recomputed from the input tensor and the provided factors. I also removed the unrelated mode_dot complex change from this PR and added the requested tests.
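The core-recomputation step described above can be sketched as follows. This is a hedged pure-NumPy illustration of the idea, not tensorly's code: `init_from_factors` is a hypothetical helper name, and `multi_mode_dot` here is a minimal stand-in for `tensorly.tenalg.multi_mode_dot`.

```python
import numpy as np

def multi_mode_dot(tensor, matrices, transpose=False):
    # Successive mode-n products; each factor contracts its own mode.
    res = tensor
    for mode, m in enumerate(matrices):
        m = m.T if transpose else m
        res = np.moveaxis(np.tensordot(res, m, axes=(mode, 1)), -1, mode)
    return res

def init_from_factors(tensor, factors):
    """Hypothetical helper: given factors U_n of shape (I_n, r_n), return
    (core, factors) with core = tensor x_1 U_1^T x_2 U_2^T ... x_N U_N^T,
    which is the optimal core when the U_n have orthonormal columns."""
    core = multi_mode_dot(tensor, factors, transpose=True)
    return core, factors
```

With factors built from identity columns, the recomputed core is just the leading sub-block of the tensor, which makes the projection easy to sanity-check by hand.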
Motivation

The `initialize_tucker` function currently requires `init` to be `"svd"`, `"random"`, or a `(core, factors)` tuple. For warm-starting Tucker decomposition, users may only have factors without a core. This PR lets `init` accept a list of factors and recomputes the core internally.

Changes

In `initialize_tucker`:
- accept `init` as a list of factors under the `elif isinstance(init, (tuple, list)):` branch
- recompute the core from the tensor and factors via `multi_mode_dot`

In `mode_dot`: