Merged
[LTX-2] Fix flash attention shard_map for sequence lengths not divisible by context mesh axis #363
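The failure mode the title describes is generic: JAX's `shard_map` requires each sharded axis to divide evenly across its mesh axis, so a sequence length that is not a multiple of the context-parallel axis size cannot be partitioned directly. A common workaround is to pad the sequence axis up to the next multiple before sharding and slice the padding off afterwards. The sketch below shows only that padding arithmetic in NumPy; the function name `pad_to_multiple` and the shapes are illustrative assumptions, not code from this PR.

```python
# Hypothetical sketch of the pad-before-shard workaround; not the PR's code.
import numpy as np

def pad_to_multiple(x, axis, multiple):
    """Zero-pad `x` along `axis` up to the next multiple of `multiple`.

    Returns the padded array and the number of padded positions, so the
    caller can slice the padding off after the sharded computation.
    """
    length = x.shape[axis]
    remainder = length % multiple
    if remainder == 0:
        return x, 0
    pad = multiple - remainder
    widths = [(0, 0)] * x.ndim
    widths[axis] = (0, pad)  # pad only at the end of the sequence axis
    return np.pad(x, widths), pad

# Example: a (batch, seq, head_dim) activation with seq=10 on a 4-way
# context mesh axis. 10 % 4 != 0, so sharding would fail without padding.
x = np.ones((2, 10, 8))
padded, pad = pad_to_multiple(x, axis=1, multiple=4)
print(padded.shape, pad)  # (2, 12, 8) 2
```

For attention specifically, the padded positions must also be masked out of the softmax so they do not contribute to the result; the actual fix in this commit may handle that differently (e.g. per-shard ragged lengths).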

Google CLA / cla/google succeeded Mar 31, 2026 in 6s

✅ All contributors are covered under a CLA with Google

See https://cla.developers.google.com/ for more info about Google's Contributor License Agreement (CLA).


Details

The following contributors were found for this pull request:

0be2831 Author: @mbohlool <m***y@google.com>

(Only the first commit for a unique contributor is listed.)