Dao-AILab/flash-attention
Fast and memory-efficient exact attention
last commit: Apr 15, 2026
| stars  | 7d   | 30d | 90d |
|--------|------|-----|-----|
| 23,371 | +140 | -   | -   |
## found in
Awesome Open Source AI/🧬 1. Core Frameworks & Libraries