What's Changed
- sync upstream llama.cpp (b7179) by @ngxson in https://github.com/ngxson/wllama/pull/194
Full Changelog: https://github.com/ngxson/wllama/compare/2.3.6...2.3.7
Full Changelog: https://github.com/ngxson/wllama/compare/2.3.5...2.3.6
Firefox 142 now officially supports the wllama API for extensions 🚀
Link to release note: https://www.firefox.com/en-US/firef...
Small fix for KV cache management, which caused issues with hybrid and recurrent models
Full Changelog: https://github.com/ngxson/wllama/compare/2.3.3...2.3.4
With the latest sync from llama.cpp, new models are now supported, including Hugging Face SmolLM3 and LiquidAI LFM2