
Changelog

flash-attention

Fast and memory-efficient exact attention

Dao-AILab/flash-attention
23k stars · 2.6k forks · Python · BSD-3-Clause

Last updated about 8 hours ago

Latest release: v2.8.0.post1