
[pt2] Add indices dtype check to embedding meta registration#179754

Closed
XAheli wants to merge 1 commit into pytorch:main from XAheli:fix/compile-nn-float32

Conversation

@XAheli
Contributor

@XAheli XAheli commented Apr 8, 2026

Fixes #178042

The aten.embedding meta function was missing the indices dtype check that exists in C++ (checkScalarTypes in Embedding.cpp). During compile, FakeTensor tracing passes the invalid op through without error, and then AOTAutograd's DCE removes the dead node — so the C++ check is never reached.

Added torch._check for indices dtype in the meta function so the error fires during tracing, before DCE runs.

Test: test_embedding_float_indices_error in test/nn/test_embedding.py — covers eager, aot_eager, inductor
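For illustration, the rule the new `torch._check` enforces can be sketched in plain Python, outside of PyTorch (the function and constant names here are hypothetical, not PyTorch API; the error message mirrors the one produced by `checkScalarTypes` in `Embedding.cpp`):

```python
# Illustrative sketch of the indices-dtype guard added to the
# aten.embedding meta registration. In the real PR this is a
# torch._check(...) call; here we mimic only the accept/reject logic.
ALLOWED_INDEX_DTYPES = ("long", "int32")  # Long, Int in the C++ check


def check_indices_dtype(dtype_name: str) -> None:
    # Raise during tracing if the indices dtype is not integral,
    # so the error fires before AOTAutograd's DCE can drop the node.
    if dtype_name not in ALLOWED_INDEX_DTYPES:
        raise RuntimeError(
            "Expected tensor for argument #1 'indices' to have one of the "
            f"following scalar types: Long, Int; but got {dtype_name} instead"
        )
```

Integer indices pass silently; a float dtype raises the same error eager mode already produced.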

Co-authored-by: Claude

cc @bdhirsh @penguinwu @bobrenjc93 @aorenste

@pytorch-bot

pytorch-bot Bot commented Apr 8, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/179754

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit 06aabc7 with merge base 42e4e00:

BROKEN TRUNK - The following job failed but was already failing on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot

pytorch-bot Bot commented Apr 8, 2026

This PR needs a release notes: label

If your changes are user facing and intended to be a part of release notes, please use a label starting with release notes:.

If not, please add the topic: not user facing label.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "topic: not user facing"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

@XAheli
Contributor Author

XAheli commented Apr 8, 2026

@pytorchbot label "module: correctness (silent)"

@pytorch-bot pytorch-bot Bot added the module: correctness (silent) issue that returns an incorrect result silently label Apr 8, 2026
@XAheli
Contributor Author

XAheli commented Apr 8, 2026

@pytorchbot label "module: pt2-dispatcher"

@pytorch-bot pytorch-bot Bot added the module: pt2-dispatcher PT2 dispatcher-related issues (e.g., aotdispatch, functionalization, faketensor, custom-op, label Apr 8, 2026
@XAheli
Contributor Author

XAheli commented Apr 8, 2026

@pytorchbot label "topic: fuzzer"

@XAheli
Contributor Author

XAheli commented Apr 8, 2026

@pytorchbot label "topic: not user facing"

@pytorch-bot pytorch-bot Bot added the topic: not user facing topic category label Apr 8, 2026
@arkadip-maitra arkadip-maitra requested a review from isuruf April 13, 2026 15:25
@XAheli
Contributor Author

XAheli commented Apr 15, 2026

@pytorchbot merge

@pytorch-bot pytorch-bot Bot added the ciflow/trunk Trigger trunk jobs on your pull request label Apr 15, 2026
@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

meta-codesync Bot pushed a commit to pytorch/executorch that referenced this pull request Apr 19, 2026
Summary:
## Context

PyTorch PR pytorch/pytorch#179754 (fixing pytorch/pytorch#178042) added a dtype validation check to the `aten.embedding` meta registration in `torch/_meta_registrations.py`:

```python
torch._check(
    indices.dtype in (torch.long, torch.int32),
    lambda: (
        "Expected tensor for argument #1 'indices' to have one of the following "
        f"scalar types: Long, Int; but got {indices.dtype} instead"
    ),
)
```

This aligns the meta function with the C++ implementation (`checkScalarTypes` in `Embedding.cpp`), which already enforced integer indices. Previously, no meta registration existed for `aten.embedding`, so FakeTensor tracing during `torch.export`/`torch.compile` silently accepted float indices, and AOTAutograd's DCE could remove the dead node before the C++ check ever fired.

## Problem

`test_batched_export_with_backprop` in `test_static_attention.py` creates example token inputs using `torch.zeros()` without specifying a dtype:

```python
# Before (defaults to torch.float32)
torch.zeros(batch_size, input_len)
torch.zeros(1, input_len)
```

During `torch.export.export()`, these float32 tensors flow into `self.tok_embeddings(tokens)` (an `nn.Embedding` layer in `llama_transformer.py`), which dispatches to `aten.embedding`. The new meta function dtype check rejects float32 indices, causing the export to fail.

Note that the actual backprop loop already uses integer indices correctly via `torch.randint(config.vocab_size, (batch_size, input_len))` — only the export-tracing example inputs were wrong.

## Fix

Add explicit `dtype=torch.long` to both `torch.zeros` calls used as token example inputs:

```python
# After
torch.zeros(batch_size, input_len, dtype=torch.long)
torch.zeros(1, input_len, dtype=torch.long)
```

Differential Revision: D101547370

Labels

ciflow/trunk Trigger trunk jobs on your pull request Merged module: correctness (silent) issue that returns an incorrect result silently module: pt2-dispatcher PT2 dispatcher-related issues (e.g., aotdispatch, functionalization, faketensor, custom-op, open source topic: fuzzer topic: not user facing topic category

Projects

None yet

Development

Successfully merging this pull request may close these issues.

torch.compile silently accepts nn.Embedding with float tensor indices where eager raises TypeError

4 participants