Add 3D (NCW) channels-last dim_order support to Neutron runtime (#19341)
JakeStevens wants to merge 1 commit into
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/19341
Note: Links to docs will display an error until the docs builds have been completed.
❗ 1 Active SEV: there is 1 currently active SEV. If your PR is affected, please view it below.
❌ 2 New Failures, 5 Unrelated Failures as of commit 1ef67ca with merge base e4f5b38:
- NEW FAILURES: the following jobs have failed.
- FLAKY: the following job failed, but likely due to flakiness present on trunk.
- BROKEN TRUNK: the following jobs failed, but the failures were present on the merge base. 👉 Rebase onto the `viable/strict` branch to avoid these failures.

This comment was automatically generated by Dr. CI and updates every 15 minutes.
@JakeStevens has exported this pull request. If you are a Meta employee, you can view the originating Diff in D102862305.
This PR needs a
jirioc left a comment
The implementation looks good. Note that sometimes the transpositions are done by Neutron itself instead of the backend (scheduled AoT), so you need to make sure that part is also compatible with 3D.
LGTM. But perhaps a test with a simple 3D channels-last dim order model would be useful.
Are there examples in the repo of cpp tests that use the backend cpp file?
Force-pushed 1a79957 to e3687e0
…rch#19341) Summary: Although 3-dim dim order is not supported by PyTorch, it actually *is* properly supported within the NXP backend's implementation. The only issue is in the runtime, which forces the check to four dimensions. This PR relaxes that check to also support 3D. Differential Revision: D102862305
You can use a test like this:

```python
class Conv1DModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv1d(2, 2, 3)

    def forward(self, x):
        return self.conv(x)


def test_3d_channels_last_dim_order__conv():
    model = Conv1DModel().eval().to(memory_format=torch.channels_last)

    # Use the `channels_last` dim order.
    input_spec = [ModelInputSpec((1, 2, 5), dim_order=torch.channels_last)]

    comparator = NumericalStatsOutputComparator(max_mse_error=8e-3)
    lower_run_compare(
        model,
        input_spec,
        dataset_creator=RandomDatasetCreator(),
        output_comparator=comparator,
        dlg_model_verifier=BaseGraphVerifier(1, []),
        reference_model=ReferenceModel.QUANTIZED_EDGE_PYTHON,
    )
```
Force-pushed e3687e0 to 5a6542f
Force-pushed 5a6542f to 9b3077a
Force-pushed 9b3077a to 9cac07b
Force-pushed 9cac07b to 1ef67ca
```python
def test_3d_channels_last_dim_order__conv(mocker):
    model = Conv1DModel().eval()

    input_spec = [ModelInputSpec((1, 2, 5), dim_order=torch.channels_last)]
```
According to your modifications to executorch_pipeline, the example input used to export this model will use the contiguous memory format. Also, you are not calling `model.to(memory_format=torch.channels_last)` here. If I understand correctly, this is because `torch` doesn't support the 3D channels-last memory format. But how will the `dim_order` entry in the nodes' `meta` attribute be set? I.e., how will ExecuTorch know to use the 3D channels-last dim_order?
To me, it looks like the test is just using the regular contiguous dim_order, and the specified `torch.channels_last` in the input spec is being silently ignored.
Is my assumption correct, or am I missing something?
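For context on what a dim_order entry encodes: it is the permutation of dimensions sorted by descending stride, i.e. the physical layout order, so it can be derived from a tensor's strides. The helper below is an illustrative sketch only (it is not the ExecuTorch or `torch` API; real code would use `torch.Tensor.dim_order()`):

```python
def dim_order_from_strides(strides):
    """Physical layout order: dimensions sorted by descending stride.

    Illustrative helper, not part of torch or ExecuTorch.
    Ties are broken by dimension index here.
    """
    return tuple(sorted(range(len(strides)), key=lambda d: (-strides[d], d)))

# 4D contiguous NCHW (1, 2, 5, 5): strides (50, 25, 5, 1)
print(dim_order_from_strides((50, 25, 5, 1)))  # (0, 1, 2, 3)
# 4D channels-last (NHWC layout): strides (50, 1, 10, 2)
print(dim_order_from_strides((50, 1, 10, 2)))  # (0, 2, 3, 1)
# 3D channels-last (NWC layout for an NCW tensor): strides (10, 1, 2)
print(dim_order_from_strides((10, 1, 2)))      # (0, 2, 1)
```

This is why the reviewer's question matters: if the exported input is contiguous, the derived dim_order would be the identity permutation, not the channels-last one.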
Summary:
Although 3-dim dim order is not supported by PyTorch, it actually *is* properly supported within the NXP backend's implementation.
The only issue is in the runtime, which forces the check to four dimensions.
This PR relaxes that check and adds support for 3D.
Differential Revision: D102862305
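The shape of the runtime change can be pictured as follows. This is a hedged Python sketch of the logic only (the actual Neutron runtime check is C++, and the function name here is illustrative): the rank gate is relaxed from exactly 4 to 3 or 4, while still requiring the channels-last permutation, i.e. the channel dimension moved to the end.

```python
def is_supported_channels_last_dim_order(dim_order):
    # Illustrative sketch, not the actual Neutron runtime code.
    # Previously the runtime rejected anything but rank 4; the relaxed
    # check also admits rank 3 (NCW stored as NWC).
    rank = len(dim_order)
    if rank not in (3, 4):
        return False
    # Channels-last permutation: batch, spatial dims, then channels.
    expected = (0, *range(2, rank), 1)
    return tuple(dim_order) == expected

print(is_supported_channels_last_dim_order((0, 2, 3, 1)))  # True  (4D NHWC)
print(is_supported_channels_last_dim_order((0, 2, 1)))     # True  (3D NWC)
print(is_supported_channels_last_dim_order((0, 1, 2)))     # False (contiguous)
```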