Add Half (float16) support to slim ScalarType enum (#18959)
kirklandsign merged 1 commit into pytorch:main
Conversation
🔗 Helpful Links: 🧪 see artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/18959
Note: links to docs will display an error until the docs builds have completed.
❌ 1 New Failure, 2 Unrelated Failures as of commit 50e9eff with merge base 7fdd306.
NEW FAILURE: the following job has failed.
BROKEN TRUNK: the following jobs failed but were present on the merge base. 👉 Rebase onto the `viable/strict` branch to avoid these failures.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@digantdesai has exported this pull request. If you are a Meta employee, you can view the originating Diff in D101218928.
Pull request overview
Adds float16 (Half) to the AOTI “slim” c10::ScalarType so CUDA sort shims can switch on ScalarType::Half directly without relying on hard-coded casts (avoiding -Werror=switch issues).
Changes:
- Add `ScalarType::Half = 5` plus `kHalf`, and extend the helpers (`elementSize`, `toString`, `isFloatingType`, `isValidScalarType`) to support Half.
- Update the CUDA sort shim to use `c10_slim::ScalarType::Half` in switch statements.
- Extend the slim ScalarType unit tests to cover Half (enum value, element size, constants, validity).
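The bullets above can be sketched as a minimal standalone version of the extended slim enum and its helpers. This is a hypothetical reconstruction, not the actual `backends/aoti/slim/c10/core/ScalarType.h`: the enumerator values mirror the test table later in this page (Half = 5, Float = 6), but the real header's full dtype list and signatures may differ.

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

namespace c10_slim {

// Hypothetical sketch of the slim ScalarType enum after this PR.
// Values mirror PyTorch's c10::ScalarType ordering (Half = 5).
enum class ScalarType : int8_t {
  Byte = 0,
  Char = 1,
  Short = 2,
  Int = 3,
  Long = 4,
  Half = 5,  // new in this PR: float16
  Float = 6,
};

constexpr ScalarType kHalf = ScalarType::Half;

// Element size in bytes; Half is 2 bytes like CUDA's __half.
inline std::size_t elementSize(ScalarType t) {
  switch (t) {
    case ScalarType::Byte:
    case ScalarType::Char:
      return 1;
    case ScalarType::Short:
    case ScalarType::Half:
      return 2;
    case ScalarType::Int:
    case ScalarType::Float:
      return 4;
    case ScalarType::Long:
      return 8;
  }
  return 0;  // unreachable for valid enumerators
}

inline bool isFloatingType(ScalarType t) {
  return t == ScalarType::Half || t == ScalarType::Float;
}

inline const char* toString(ScalarType t) {
  switch (t) {
    case ScalarType::Byte:  return "Byte";
    case ScalarType::Char:  return "Char";
    case ScalarType::Short: return "Short";
    case ScalarType::Int:   return "Int";
    case ScalarType::Long:  return "Long";
    case ScalarType::Half:  return "Half";
    case ScalarType::Float: return "Float";
  }
  return "Unknown";
}

}  // namespace c10_slim
```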
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| backends/cuda/runtime/shims/sort.cu | Switch dispatch now uses ScalarType::Half instead of a locally-cast placeholder. |
| backends/aoti/slim/c10/core/ScalarType.h | Introduces Half into the slim dtype enum and updates dtype utility helpers accordingly. |
| backends/aoti/slim/c10/core/test/test_scalar_type.cpp | Adds coverage ensuring Half is correctly represented and handled by helper APIs. |
```cpp
// PyTorch ScalarType::Half = 5, now defined in slim ScalarType enum.
using c10_slim::kHalf;
```
```cpp
case ScalarType::Long:
  return sizeof(int64_t);
case ScalarType::Half:
  return 2; // sizeof(__half) = 2 bytes
```
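The `-Werror=switch` motivation can be illustrated in isolation: a `switch` over an `enum class` that omits an enumerator triggers `-Wswitch` (an error under `-Werror`), and before this PR the shim had to compare against a hard-coded cast instead. The sketch below is hypothetical — the function and kernel names are invented, not the real `sort.cu` shim — but it shows the dispatch pattern the PR enables.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>

// Subset of the slim enum, values as described in this PR.
enum class ScalarType : int8_t { Int = 3, Long = 4, Half = 5, Float = 6 };

// Before: with Half missing from the enum, call sites compared against a
// hard-coded cast, which -Wswitch cannot check for exhaustiveness:
//   if (t == static_cast<ScalarType>(5)) { /* opaque magic number */ }
//
// After: with Half as a real enumerator, the dispatch is a plain switch and
// the compiler flags any enumerator left unhandled.
const char* dispatchSortKernel(ScalarType t) {
  switch (t) {
    case ScalarType::Int:
      return "sort<int32_t>";
    case ScalarType::Long:
      return "sort<int64_t>";
    case ScalarType::Half:
      return "sort<__half>";  // half-precision CUDA kernel in the real shim
    case ScalarType::Float:
      return "sort<float>";
  }
  return "unsupported";  // only reached for invalid enum values
}
```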
Force-pushed bff4a46 to 724e55a.
```cpp
TEST_F(IsValidScalarTypeTest, HalfIsValid) {
  EXPECT_TRUE(isValidScalarType(ScalarType::Half));
}
```
Force-pushed 724e55a to 50e9eff.
Pull request overview
This PR adds float16 (Half) support to the slim c10::ScalarType enum to eliminate CUDA shim build warnings/errors caused by missing enum coverage (notably in sort.cu switch statements).
Changes:
- Add `ScalarType::Half = 5` (and `kHalf`) to slim `ScalarType`, plus support in `elementSize()`, `toString()`, `isFloatingType()`, and `isValidScalarType()`.
- Update the CUDA sort shims to switch directly on `c10_slim::ScalarType::Half` rather than using a casted placeholder value.
- Extend the slim ScalarType unit tests to cover `Half` constants, sizes, and validity.
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| backends/cuda/runtime/shims/sort.cu | Removes the cast-based kHalf workaround and switches directly on ScalarType::Half. |
| backends/aoti/slim/c10/core/ScalarType.h | Introduces Half into the slim enum and updates helper utilities accordingly. |
| backends/aoti/slim/c10/core/test/test_scalar_type.cpp | Adds test coverage for Half (enum value, element size, constants, validity). |
```cpp
{ScalarType::Short, 2, 2, "Short", false, true, true, false},
{ScalarType::Int, 3, 4, "Int", false, true, true, false},
{ScalarType::Long, 4, 8, "Long", false, true, true, false},
{ScalarType::Half, 5, 2, "Half", true, false, false, false},
{ScalarType::Float, 6, 4, "Float", true, false, false, false},
```
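The visible columns of these fixture rows read as (dtype, enum value, element size, name, floating flag, …); the meaning of the remaining boolean columns is not shown in this excerpt. A table-driven consistency check in that style might look like the following sketch — the `Row` struct and field names are hypothetical, not the real test fixture:

```cpp
#include <cassert>

// Subset of the slim enum, values as listed in the fixture rows above.
enum class ScalarType {
  Byte = 0, Char = 1, Short = 2, Int = 3, Long = 4, Half = 5, Float = 6,
};

// Hypothetical mirror of the test table: only the columns whose meaning is
// clear from this excerpt (dtype, value, size, name, floating flag).
struct Row {
  ScalarType type;
  int value;
  int size;
  const char* name;
  bool floating;
};

constexpr Row kRows[] = {
    {ScalarType::Short, 2, 2, "Short", false},
    {ScalarType::Int, 3, 4, "Int", false},
    {ScalarType::Long, 4, 8, "Long", false},
    {ScalarType::Half, 5, 2, "Half", true},   // new row added by this PR
    {ScalarType::Float, 6, 4, "Float", true},
};

// Each row's enum value must match the enumerator's underlying value.
bool rowsConsistent() {
  for (const Row& r : kRows) {
    if (static_cast<int>(r.type) != r.value) return false;
  }
  return true;
}
```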
Summary:
The CUDA runtime shims for sort operations use Half (float16) dtype, but it was
not defined in the slim ScalarType enum, causing compiler warnings treated as
errors (-Werror=switch). This adds proper Half support to the slim ScalarType
enum so switch statements can use the enum value directly instead of casting
to the underlying type.
Reviewed By: kirklandsign
Differential Revision: D101218928