v0.14.0
What's Changed
- `ollama run --experimental` will now open a new Ollama CLI that includes an agent loop and the `bash` tool
- Anthropic API compatibility: support for the `/v1/messages` API
- A new `REQUIRES` command for the `Modelfile` allows declaring which version of Ollama is required for the model
- For older models, Ollama will avoid an integer underflow on low-VRAM systems during memory estimation
- More accurate VRAM measurements for AMD iGPUs
- Ollama's app will now highlight Swift source code
- An error is now returned when embeddings contain `NaN` or `-Inf`
- Ollama's Linux install bundles now use `zstd` compression
- New experimental support for image generation models, powered by MLX
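The `REQUIRES` command sits alongside the other `Modelfile` directives. A hedged sketch of how such a declaration might look; the exact argument syntax and the version string are assumptions, not confirmed by these notes:

```
FROM gemma3
# Declare the minimum Ollama version this model needs (version is illustrative)
REQUIRES 0.14.0
```

Older Ollama versions that do not satisfy the declared requirement can then refuse to load the model instead of failing in less obvious ways.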
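The Anthropic-compatible endpoint can be exercised with a standard Messages-style request. A minimal sketch using only the Python standard library, assuming Ollama is serving on its default port 11434 and that the model name (`gemma3` here) is illustrative, substitute any model you have pulled locally:

```python
import json
import urllib.request

# Anthropic Messages-style payload: model, max_tokens, and a list of
# role/content messages, posted to the new /v1/messages endpoint.
payload = {
    "model": "gemma3",  # illustrative; use any locally pulled model
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Why is the sky blue?"},
    ],
}

req = urllib.request.Request(
    "http://localhost:11434/v1/messages",  # default Ollama listen address
    data=json.dumps(payload).encode("utf-8"),
    headers={"content-type": "application/json"},
    method="POST",
)

# Uncomment to send against a running Ollama instance; an Anthropic-style
# response carries the generated text under content[0]["text"]:
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())
#     print(reply["content"][0]["text"])
```

Existing clients built against the Anthropic Messages API should only need the base URL pointed at the local Ollama server.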
New Contributors
- @Vallabh-1504 made their first contribution in https://github.com/ollama/ollama/pull/13550
- @majiayu000 made their first contribution in https://github.com/ollama/ollama/pull/13596
- @harrykiselev made their first contribution in https://github.com/ollama/ollama/pull/13615
Full Changelog: https://github.com/ollama/ollama/compare/v0.13.5...v0.14.0-rc2