# Dao-AILab/flash-attention

Fast and memory-efficient exact attention

Last commit: Apr 15, 2026
| Stars  | 7d   | 30d | 90d |
|--------|------|-----|-----|
| 23,371 | +140 | –   | –   |