Changelog

flash-attention

Fast and memory-efficient exact attention

Dao-AILab/flash-attention
22k stars · 2.4k forks · Python · BSD-3-Clause

Last updated about 2 months ago