v0.1.4
🎉 Another big release from the 🐈 nanobot community — thanks to all contributors, especially our 18 new ones!
This release adds MCP tool server support, real-time progress streaming so users actually see what the agent is doing, and a wave of new providers (Custom OpenAI-compatible, GitHub Copilot, OpenAI Codex, SiliconFlow). Channels got a lot of love too — Telegram now handles media uploads and long messages, Slack got proper mrkdwn and thread replies, Feishu supports rich text. We also added Docker Compose for one-command deployment, scoped sessions to the workspace, and switched to json_repair for bulletproof LLM response parsing. Less silence, more providers, better channels — that's the nanobot way.
Highlights
- MCP Support — Connect external tool servers via Model Context Protocol (#554)
- Progress Streaming — Agent shows what it's doing during multi-step tool execution (#802)
- New Providers — Custom OpenAI-compatible endpoints, GitHub Copilot, OpenAI Codex, SiliconFlow (#786, #720, #312, #151, #630)
- Channel Improvements — Telegram media uploads & message splitting, Slack thread replies & mrkdwn, Feishu rich text (#747, #694, #717, #784, #629, #593)