Dao-AILab/flash-attention
Fast and memory-efficient exact attention
Last commit: Apr 3, 2026
Stars: 23,121
## Found in
Awesome Open Source AI/🧬 1. Core Frameworks & Libraries