A Visual Studio Code (or Cursor, Windsurf, etc.) extension that provides a local UI for Ollama models.
This project is no longer actively maintained and may contain bugs. I recommend using Cline instead.
Install it from the VS Marketplace or Open VSX.
Node.js, Golang, VS Code (Cursor, Windsurf, and Trae don't work for some reason), make, and Ollama.
Set environment variables in the `.env` file, copying params from the `.env.example` file.
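The actual keys live in `.env.example`; as a purely hypothetical sketch, a `.env` pointing the extension at a local Ollama instance might look like this (variable name and value are my assumptions, not the project's documented config):

```dotenv
# Hypothetical example — copy the real keys from .env.example
OLLAMA_HOST=http://localhost:11434
```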
Make sure you have npm, VS Code (Cursor, Windsurf, and Trae don't work), and Ollama installed.
Before testing, make sure you have `code` installed in your `PATH`:
- Press `CMD + Shift + P` and search for `Shell Command: Install 'code' command in PATH`;
- Restart VS Code.
Now you are ready to test the extension:
- First, run `make` in Terminal. It will install the necessary dependencies;
- Go to `Run` -> `Start Debugging` (or just press `F5`);
- Wait until the new VS Code window (Extension Development Host) appears;
- Press `CMD + Shift + P` and search for `Andes`;
- Make sure Ollama is serving: run `ollama serve` in Terminal on port `11434`.
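If you want to double-check that Ollama is reachable before opening the extension, here is a minimal TypeScript sketch against Ollama's public `/api/tags` endpoint (it assumes Node 18+ for the built-in `fetch`; the port is Ollama's default, and the helper name is mine, not part of this project):

```typescript
// Quick connectivity check against a local Ollama server.
// GET /api/tags lists locally installed models; any 200 response
// means the server is up on the default port 11434.
async function checkOllama(baseUrl = "http://localhost:11434"): Promise<void> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with HTTP ${res.status}`);
  }
  const { models } = (await res.json()) as { models: { name: string }[] };
  console.log(`Ollama is up; ${models.length} model(s) installed:`);
  for (const m of models) console.log(` - ${m.name}`);
}

checkOllama().catch((err) => console.error("Ollama is not reachable:", err));
```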
- Manage your installed Ollama models locally;
- Chat with AI;
- Observe the reasoning process and decision-making of AI models.
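For a sense of how chatting with a local model works under the hood, here is a sketch of streaming a reply from Ollama's `/api/chat` endpoint. This is not Andes's actual code, just an illustration of the streaming protocol; the model name is an example, and Node 18+ is assumed for `fetch`:

```typescript
// Minimal streaming chat against Ollama's /api/chat endpoint.
// The response is newline-delimited JSON; each line carries a
// fragment of the assistant message until "done" is true.
async function chat(model: string, prompt: string): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  if (!res.ok || !res.body) throw new Error(`HTTP ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffered = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    // Each complete line is one JSON chunk; keep the partial tail.
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      process.stdout.write(chunk.message?.content ?? "");
      if (chunk.done) process.stdout.write("\n");
    }
  }
}

chat("llama3.2", "Why is the sky blue?").catch(console.error);
```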
Visit CHANGELOG.md.
Visit TODO.md.
I created Andes to provide a simple, authentication-free VS Code extension that works exclusively with Ollama models in VSC, Windsurf, Trae, Cursor, and other VSC forks. While other plugins like Cline & Continue support multiple AI providers and require authentication, I wanted a focused solution specifically for local Ollama models.
Sadly, it's written in TypeScript, but the API for Markdown-to-HTML formatting is in Golang. Initially I wanted to write this project in Golang, but it's simpler in TypeScript since I can request Ollama's endpoints directly from TypeScript.
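From the TypeScript side, the split looks roughly like this. The route, port, and payload shape below are purely illustrative assumptions; the real API lives in this project's Golang code and isn't documented in this README:

```typescript
// Hypothetical client for the Go Markdown-to-HTML service.
// Endpoint URL and request format are assumptions for illustration only.
async function renderMarkdown(markdown: string): Promise<string> {
  const res = await fetch("http://localhost:8080/render", {
    method: "POST",
    headers: { "Content-Type": "text/plain" },
    body: markdown,
  });
  if (!res.ok) throw new Error(`Formatter returned HTTP ${res.status}`);
  return res.text(); // HTML ready to inject into the webview
}
```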