1.0.0 Release Candidate 13
Features
- Local LLMs & remote VLMs with an OpenAI-compatible API
- File, folder, and website indexing and context
- Web search
- Function calling
- Deep Research
- Memory
- Canvas
- Image generation
- Extensions
- Diagrammer
- Slide Studio
- Inline Writing Assistant & Completions
- Detector
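Because the app exposes an OpenAI-compatible API, any OpenAI client can talk to it. A minimal sketch of the request shape (the base URL, port, and model name here are illustrative assumptions, not the app's actual defaults):

```python
import json

# Hypothetical endpoint; the real host and port depend on your local setup.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build a request body in the OpenAI /chat/completions format."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

# The same body works against any OpenAI-compatible server, local or remote.
print(json.dumps(build_chat_request("llama3", "Hello!")))
```

Point `BASE_URL` at whichever local or remote provider you have configured; only the model name changes.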
New Changes
- feat: Add Homebrew to PATH if installed
- feat: llama.cpp native function calling
- feat: Improved HTML conversation export
- feat: Allow adjusting completion threshold
- feat: Add Claude 4
- feat: Add Mistral's Magistral
- feat: Add MiniMax-M1
- feat: Prioritize provider throughput
- docs: Add Deep Research
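The Homebrew change above amounts to a small PATH check. A sketch of the idea (function name and the injectable `isdir` hook are illustrative, not the app's code):

```python
import os

# Homebrew's default bin directories on Apple Silicon and Intel Macs.
BREW_DIRS = ("/opt/homebrew/bin", "/usr/local/bin")

def add_homebrew_to_path(path: str, isdir=os.path.isdir) -> str:
    """Prepend Homebrew's bin directory to PATH if present and not already listed."""
    parts = path.split(os.pathsep) if path else []
    for d in BREW_DIRS:
        if isdir(d) and d not in parts:
            parts.insert(0, d)
    return os.pathsep.join(parts)
```

This lets tools installed via Homebrew (e.g. marp) be found even when the app is launched from Finder, where the shell profile's PATH is not inherited.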
Fixes and Performance Improvements
- fix: Create marp target directory before moving the binary
- fix: Streaming with high-throughput providers
- fix: UI render increment
- fix: Lower completions memory usage
- fix: Recognize models with :thinking suffix as reasoning capable
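The marp fix above guards against moving a binary into a directory that does not exist yet. A minimal sketch (function and path names are illustrative, not the app's code):

```python
import os
import shutil
import tempfile

def install_binary(src: str, target_dir: str) -> str:
    """Move a binary into target_dir, creating the directory first.

    shutil.move fails when the destination directory is missing,
    so the target directory is created up front.
    """
    os.makedirs(target_dir, exist_ok=True)
    dest = os.path.join(target_dir, os.path.basename(src))
    shutil.move(src, dest)
    return dest

# Demo: move a scratch file into a directory that doesn't exist yet.
work = tempfile.mkdtemp()
src = os.path.join(work, "marp-cli")
open(src, "w").close()
dest = install_binary(src, os.path.join(work, "bin"))
```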
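The :thinking-suffix fix can be pictured as a simple tag check on the model id; this sketch (helper name is illustrative) treats models such as `qwen3:thinking` as reasoning-capable:

```python
def is_reasoning_model(model_id: str) -> bool:
    """Return True for model ids tagged with a ':thinking' suffix."""
    base, sep, tag = model_id.rpartition(":")
    return sep == ":" and base != "" and tag == "thinking"

print(is_reasoning_model("qwen3:thinking"))  # → True
print(is_reasoning_model("qwen3"))           # → False
```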
Installation
Download the disk image (.dmg), mount it, then drag the app into the Applications folder.