Local DeepSeek-r1 Power with Ollama!
Hey everyone,
We've just rolled out a new release packed with awesome updates:
- Browser-Use Upgrade: We're now fully compatible with the latest `browser-use` version, 0.1.29!
- Local Ollama Integration: Get ready for completely local and private AI with support for the incredible `deepseek-r1` model via Ollama!
Before You Dive In:
- Update Code: Don't forget to `git pull` to grab the latest code changes.
- Reinstall Dependencies: Run `pip install -r requirements.txt` to ensure all your dependencies are up to date.
Important Notes on deepseek-r1:
- Model Size Matters: We've found that `deepseek-r1:14b` and larger models work exceptionally well! Smaller models may not provide the best experience, so we recommend sticking with the larger options.
How to Get Started with Ollama and deepseek-r1:
- Install Ollama: Head over to ollama and download/install Ollama on your system.
- Run `deepseek-r1`: Open your terminal and run the command `ollama run deepseek-r1:14b` (or a larger model if you prefer).
- WebUI Setup: Launch the WebUI following the instructions. Here's a crucial step: uncheck "Use Vision" and set "Max Actions per Step" to 1.
- Enjoy! You're now all set to experience the power of local `deepseek-r1`. Have fun!
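Once `ollama run deepseek-r1:14b` is up, Ollama also serves a local HTTP API on port 11434, so you can query the model from your own scripts too. Here's a minimal stdlib-only sketch, assuming Ollama's standard `/api/generate` endpoint; the helper names (`build_payload`, `ask`) are just ours for illustration:

```python
import json
from urllib import request

# Ollama's default local endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send the prompt to the locally running Ollama server and return the reply."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server):
#   ask("deepseek-r1:14b", "Say hello in one short sentence.")
```

Everything stays on your machine: the request never leaves `localhost`, which is the whole point of the local setup.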
Happy Chinese New Year!