Merged

Fix block size handling for cross attention in TPU flash attention #301

Commit: 364be35
Google CLA / cla/google succeeded Dec 30, 2025 in 2s

✅ All contributors are covered under a CLA with Google

See https://cla.developers.google.com/ for more info about Google's Contributor License Agreement (CLA).


The following contributors were found for this pull request:

364be35 Author: @michelle-yooh <y**h@google.com>

(Only the first commit for a unique contributor is listed.)
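The PR title points at a common pitfall: in self attention the query and key/value sequences have the same length, so a single sequence length can be used to clamp both block sizes, but in cross attention the two lengths differ and each block size must be clamped against its own axis. The actual patch is not shown here; the following is a minimal sketch of that idea, with a hypothetical helper name (`clamp_block_sizes`) not taken from the PR.

```python
def clamp_block_sizes(block_q: int, block_kv: int,
                      q_seq_len: int, kv_seq_len: int) -> tuple[int, int]:
    """Clamp flash-attention block sizes to their respective sequence axes.

    Hypothetical illustration: the query block is bounded by the query
    length, and the KV block by the key/value length. Clamping both
    against q_seq_len would be wrong for cross attention, where
    kv_seq_len != q_seq_len.
    """
    return min(block_q, q_seq_len), min(block_kv, kv_seq_len)

# Self attention: both axes share one length.
print(clamp_block_sizes(512, 512, 128, 128))   # (128, 128)
# Cross attention: e.g. 1024 queries attending over 256 KV positions.
print(clamp_block_sizes(512, 512, 1024, 256))  # (512, 256)
```

The kernel can then tile the query axis with the first value and the key/value axis with the second, so short KV sequences in cross attention no longer inherit an oversized block.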