Insights: pytorch/xla
Overview
- 3 Merged pull requests
- 2 Open pull requests
- 0 Closed issues
- 1 New issue
3 Pull requests merged by 3 people
- adding tol for numeric test of checkpointing (#9404, merged Jun 25, 2025)
- Allgather coalescee: Check tuple shape only if return shape is tuple. (#9403, merged Jun 25, 2025)
- Prepare for pytorch tensor impl change in is_contiguous_custom (#9402, merged Jun 25, 2025)
2 Pull requests opened by 2 people
- EmbeddingDenseBackward: Remove `padding_idx` cast to `double` (#9406, opened Jun 25, 2025)
- Initial support for 3.12 (#9407, opened Jun 25, 2025)
1 Issue opened by 1 person
- Cannot mark sharding or print values of a SPMD tensor in a scanned function (#9405, opened Jun 25, 2025)
10 Unresolved conversations
Conversations sometimes continue on older items that are not yet closed. Below is a list of all Issues and Pull Requests with unresolved conversations.
- Error Handling: refactor `XlaCoordinator` to use status types. (#9386, commented on Jun 25, 2025 • 1 new comment)
- Create and Expose the `torch_xla::OpSharding` wrapper class instead of `xla::OpSharding` class (#9390, commented on Jun 24, 2025 • 0 new comments)
- TPU test flake: SIGSEGV in train_decoder_only_eager_spmd_data_parallel.py (#9046, commented on Jun 25, 2025 • 0 new comments)
- Unnecessary FP64 cast for `padding_idx` in `EmbeddingDenseBackward` (#9392, commented on Jun 25, 2025 • 0 new comments)
- Shardy support (#9348, commented on Jun 25, 2025 • 0 new comments)
- Enable lazy tensor loading for sharded tensors (#9341, commented on Jun 25, 2025 • 0 new comments)
- [RFC] Improved coverage for native distributed collective operations (#9315, commented on Jun 25, 2025 • 0 new comments)
- Support Python 3.12 Build (#8946, commented on Jun 25, 2025 • 0 new comments)
- Fix nested stableHLO composite regions (#9385, commented on Jun 25, 2025 • 0 new comments)
- Misc changes: default sharding + allow scalar tensor math (#9398, commented on Jun 25, 2025 • 0 new comments)