Changelog
A native macOS app that lets users chat with a local LLM, which can respond with information from files, folders, and websites on your Mac, without installing any other software. Powered by llama.cpp.
feat: Implement GraphRAG
feat: Encourage querying expert resources using tools
feat: Overall index progress
feat: Revamped chat attachments
feat: Paste to add attachments
fix: Reorder inference settings
fix: GLM 4.6 reasoning on OpenRouter
fix: Crash during indexing
fix: GraphRAG indexing indicator
fix: Improve table rendering performance
fix: Track token usage
fix: Fix resources indexing limbo state
fix: Fix toolbar text color
fix: Resume indexing
fix: Prompt field scrolling
fix: Fix cancelled requests
fix: Delete attachments for deleted conversations
fix: Increase extraction speed
fix: Improve attachments UI
fix: Reduce startup latency
fix: Stop worker model toggling vision
fix: Remove provider ranking param