Twinny

Free, open-source, locally running AI coding assistant for VS Code that connects to Ollama or other local LLM servers.



Details

Twinny is a free, fully open-source AI code completion and chat extension for VS Code, designed to run entirely on local infrastructure via Ollama, llama.cpp, Oobabooga, or any OpenAI-compatible endpoint. It provides fill-in-the-middle completions, chat, code explanation, refactoring, and unit test generation without sending any code to external services. Twinny is positioned as a local GitHub Copilot alternative for developers who prioritize privacy, offline use, or running open models, and it has a small but active open-source community.
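To make "connects to Ollama" and "fill-in-the-middle completions" concrete, here is a hedged sketch of what a request to a local backend looks like. It uses Ollama's documented `/api/generate` endpoint on its default port; the model name and the CodeLlama-style `<PRE>`/`<SUF>`/`<MID>` infill tokens are illustrative assumptions, not Twinny's internals (other models use different infill formats).

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: stock install on localhost)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a CodeLlama-style fill-in-the-middle prompt.

    The cursor sits between prefix and suffix; the model generates
    the <MID> portion that joins them.
    """
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

def build_payload(prefix: str, suffix: str,
                  model: str = "codellama:7b-code") -> dict:
    # stream=False requests a single JSON response instead of chunks
    return {
        "model": model,
        "prompt": build_fim_prompt(prefix, suffix),
        "stream": False,
    }

def complete(prefix: str, suffix: str) -> str:
    """POST to a locally running Ollama server and return the completion."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prefix, suffix)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Inspecting the request needs no server; complete() needs Ollama running.
    payload = build_payload("def add(a, b):\n    return ",
                            "\n\nprint(add(1, 2))")
    print(json.dumps(payload, indent=2))
```

Because the endpoint speaks plain HTTP and JSON, the same shape of request works against any OpenAI-compatible server by swapping the URL and payload fields, which is the property Twinny relies on to stay backend-agnostic.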

Tags

coding, ide, open-source, self-hosted, local