## Ollama launch VS Code
Ollama can now launch models directly in VS Code:

```
ollama launch vscode # or: ollama launch code
```
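A minimal sketch of how this might be used from a terminal, assuming both the `ollama` and VS Code `code` CLIs are installed locally (the checks around the launch command are illustrative, not part of the feature):

```shell
#!/bin/sh
# Verify the required CLIs are on PATH before launching
# (assumes a local Ollama install and the VS Code "code" CLI).
command -v ollama >/dev/null || { echo "ollama not found"; exit 1; }
command -v code >/dev/null || { echo "VS Code CLI not found"; exit 1; }

# Launch VS Code connected to Ollama; "code" is accepted as an
# alias for "vscode", per the release notes.
ollama launch vscode
```

Which model is used and how it is selected depends on your Ollama configuration; consult the Ollama CLI help (`ollama launch --help`) for the options available in your version.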
## What's Changed
- docs: update claude code and openclaw for web search and non-interactive mode by @ParthSareen in https://github.com/ollama/ollama/pull/14922
- cmd/launch: skip redundant config writes when model unchanged by @hoyyeva in https://github.com/ollama/ollama/pull/14941
- mlxrunner: share KV cache across conversations with common prefixes by @jessegross in https://github.com/ollama/ollama/pull/14887
- app: fix desktop app stuck loading when OLLAMA_HOST is an unspecified bind address by @hoyyeva in https://github.com/ollama/ollama/pull/14885
- parsers: robust xml tool repair by @BruceMacD in https://github.com/ollama/ollama/pull/14961
- add ability to turn on debug request logging by @drifkin in https://github.com/ollama/ollama/pull/14106
- Fix mlxrunner subprocess deadlocks by @jessegross in https://github.com/ollama/ollama/pull/14919
- launch: skip openclaw gateway health check when no daemon install by @BruceMacD in https://github.com/ollama/ollama/pull/14984
- docs: nemoclaw integration by @BruceMacD in https://github.com/ollama/ollama/pull/14962
- mlx: update as of 3/23 by @dhiltgen in https://github.com/ollama/ollama/pull/14789
- docs: add Claude Code with Telegram guide by @ParthSareen in https://github.com/ollama/ollama/pull/15026
- mlxrunner: support partial match on pure transformer caches by @jessegross in https://github.com/ollama/ollama/pull/14985
- mlx: add mxfp4/mxfp8/nvfp4 importing by @pdevine in https://github.com/ollama/ollama/pull/15015
- integration: improve ability to test individual models by @dhiltgen in https://github.com/ollama/ollama/pull/14948
- ci: fix windows cgo compiler error by @dhiltgen in https://github.com/ollama/ollama/pull/15046
**Full Changelog**: https://github.com/ollama/ollama/compare/v0.18.2...v0.18.3