2026-04-15

Running AI coding tools 100% locally in 2026

Ollama + a 70B-class model + Aider or OpenCode will get you further than you'd think. Here's the exact setup, the gotchas, and when it falls apart.

Local AI coding used to be a joke — 7B models that confidently suggested code that didn't compile. That changed in late 2025 when 70B-class open-weight models caught up to 2024-era cloud models. Now the local stack is genuinely usable for everyday work, if you pick the right tool.

The hardware tier

You can run a Q4-quantized 70B model on a single RTX 4090 or an Apple Silicon Mac with 64GB+ of unified memory. For real throughput, you want two 3090s in parallel or a 128GB Mac Studio. Below a 4090, you're in 32B territory, which is a noticeable step down for complex refactors but fine for completions and quick edits.
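The sizing works out with simple arithmetic. A sketch of the back-of-envelope math (weights only; KV cache and runtime overhead add several GB on top, which is why 64GB of unified memory is comfortable while a 24GB card spills part of the model into system RAM):

```shell
# Rough weight-memory estimate: params (billions) * bits per weight / 8
# gives GB for the weights alone. KV cache and overhead come on top.
params_b=70
bits=4                                # Q4 quantization
weights_gb=$((params_b * bits / 8))   # 70 * 4 / 8 = 35
echo "Q4 70B weights: ~${weights_gb} GB"
```

Run the same numbers with `params_b=32` and you get ~16 GB, which is why 32B models are the ceiling for a single 24GB GPU without offloading.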

The software stack

Ollama is the easy on-ramp — install, pull a model, go. LM Studio is better if you want a GUI and model switching. vLLM is for serious throughput on a server.
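The Ollama on-ramp, concretely (the model tag below is just an example; substitute whatever 70B-class model you prefer):

```shell
# Install on Linux (macOS users can grab the app from ollama.com),
# then pull a model and chat with it in the terminal.
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3.3:70b
ollama run llama3.3:70b
# Ollama also serves an HTTP API on http://localhost:11434,
# which is the endpoint your editor tooling will talk to.
```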

On the editor side: Aider, OpenCode, Cline, and Continue all accept a local endpoint. Aider is the simplest to connect. OpenCode has the richest feature set.
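Wiring Aider to that local endpoint looks roughly like this (the `ollama_chat/` model prefix and the `OLLAMA_API_BASE` variable are how Aider's docs route requests to Ollama at the time of writing; the model tag is an example):

```shell
# Point Aider at the local Ollama server instead of a cloud provider.
export OLLAMA_API_BASE=http://localhost:11434
aider --model ollama_chat/llama3.3:70b
```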

What works well locally

  • Inline code completion — round-trip latency is actually lower than with cloud APIs
  • Small, contained edits — "fix this function", "add types here"
  • Shell command suggestions and natural-language search of your codebase
  • Privacy-sensitive work where no code should leave the laptop
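Ollama exposes an OpenAI-compatible endpoint, so a quick sanity check of the "small, contained edit" workflow is one curl away (model tag again an example):

```shell
# Ask the local model for a small, contained edit via the
# OpenAI-compatible chat endpoint that Ollama serves at /v1.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.3:70b",
    "messages": [
      {"role": "user",
       "content": "Add type hints to: def add(a, b): return a + b"}
    ]
  }'
```

Because the endpoint speaks the OpenAI wire format, anything built against that API can be repointed at localhost with a base-URL change.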

Where it falls apart

Long agent runs still favor frontier cloud models by a wide margin. If you kick off a 2-hour "migrate this SDK across 60 files" task, expect a local 70B to get it 60% right. Claude Sonnet 4.6 will get it 95% right. For contained work, local is more than fine.

The 2026 recommendation

Run local for daily completions and sensitive work. Fall back to a cloud model for big agent tasks. Aider's model-per-task config makes this switch painless. OpenCode's provider picker does the same.
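In practice the switch is a single flag. A sketch with Aider (model names illustrative; `sonnet` is a built-in Aider alias for the current Claude Sonnet, which needs an API key):

```shell
# Daily work: local model, nothing leaves the machine.
aider --model ollama_chat/llama3.3:70b

# Big agent task: fall back to a frontier cloud model.
aider --model sonnet
```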
