
/code

maximizing AI productivity for minimalist coders

TL;DR
AI coding tools exist on a spectrum: inline autocomplete (Copilot, Codeium) for flow, chat interfaces (gptel, ellama) for exploration, and agentic assistants (Claude Code, aidermacs) for autonomous tasks. Pick one from each category and master it.

Emacs AI Autocomplete

Section TL;DR
Easiest free: codeium.el (3 steps, unlimited). Best quality: copilot.el ($10/mo or free for students). Most flexible: minuet-ai with Gemini (free API).
Package      Cost     Deps       Setup    Best For
codeium.el   Free     libxml2    3 steps  Zero cost, just works
copilot.el   $10/mo*  Node 22+   4 steps  Best quality
minuet-ai    Varies   plz, dash  5 steps  Multi-provider

*Free tier: 2000 completions/mo. Students get Pro free via GitHub Education.

codeium.el (Recommended: Free)

100% free, unlimited completions, 70+ languages

Prerequisites

Emacs with libxml2. Check: M-: (libxml-available-p) should return t.

Step 1: Install

;; With straight.el
(use-package codeium
  :straight (:host github :repo "Exafunction/codeium.el")
  :init
  (add-to-list 'completion-at-point-functions #'codeium-completion-at-point)
  :config
  (setq use-dialog-box nil))

;; With quelpa
(use-package codeium
  :quelpa (codeium :fetcher github :repo "Exafunction/codeium.el")
  :init
  (add-to-list 'completion-at-point-functions #'codeium-completion-at-point))

Step 2: Install Language Server

M-x codeium-install

Step 3: Authenticate

M-x codeium-init
;; Browser opens -> sign in with Google/GitHub/email
;; Token auto-applies, or paste manually

Verify

M-x codeium-diagnose
Common issue: libxml2 missing
# Debian/Ubuntu
sudo apt install libxml2-dev
# Then reinstall Emacs or use emacs-gtk package

copilot.el (Best Quality)

Ghost text overlay, TAB to accept

Prerequisites

Node.js 22+ (for the language server) and a GitHub account with Copilot access.

Step 1: Install

;; Emacs 29+ with use-package + vc
(use-package copilot
  :vc (:url "https://github.com/copilot-emacs/copilot.el"
       :rev :newest :branch "main")
  :hook (prog-mode . copilot-mode)
  :bind (:map copilot-completion-map
         ("<tab>" . copilot-accept-completion)
         ("TAB" . copilot-accept-completion)
         ("M-n" . copilot-next-completion)
         ("M-p" . copilot-previous-completion)))

;; With straight.el
(use-package copilot
  :straight (:host github :repo "copilot-emacs/copilot.el")
  :hook (prog-mode . copilot-mode)
  :bind (:map copilot-completion-map
         ("<tab>" . copilot-accept-completion)))

Step 2: Install Server

M-x copilot-install-server

Step 3: Login

M-x copilot-login
;; Opens browser for GitHub OAuth

Step 4: Verify

M-x copilot-diagnose
;; Should show "Enabled" and your subscription status
Common issue: Node.js too old
# Check version (must be 22+)
node --version

# If using nvm:
nvm install 22 && nvm use 22

# If using system node (Debian/Ubuntu):
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash -
sudo apt install -y nodejs
Free access

Students: Free Pro via GitHub Student Developer Pack.
Everyone: Free tier gives 2000 completions + 50 chat/month.

minuet-ai.el (Multi-Provider)

GNU ELPA - OpenAI, Claude, Gemini, Codestral, Ollama

Prerequisites

Emacs with the plz and dash packages (installed automatically from GNU ELPA), plus an API key for your chosen provider or a local Ollama.

Step 1: Install

;; From GNU ELPA (simplest)
(use-package minuet
  :ensure t
  :bind (("M-y" . minuet-complete-with-minibuffer)
         ("M-i" . minuet-show-suggestion))
  :hook (prog-mode . minuet-auto-suggestion-mode))

Step 2: Get API Key

Gemini (FREE - recommended)
1. Create an API key in Google AI Studio (no credit card)
2. Export: export GEMINI_API_KEY="your-key"
Codestral (FREE - code-optimized)
1. In the Mistral console, open the "Codestral" tab and generate a key
2. Export: export CODESTRAL_API_KEY="your-key"
Ollama (FREE - local)
1. ollama pull qwen2.5-coder:3b
2. ollama serve
3. No API key needed

Step 3: Configure Provider

;; Gemini (free, fast)
(setq minuet-provider 'gemini)
(plist-put minuet-gemini-options :api-key "GEMINI_API_KEY")
(plist-put minuet-gemini-options :model "gemini-2.0-flash")

;; Codestral (free, code-optimized)
(setq minuet-provider 'codestral)
(plist-put minuet-codestral-options :api-key "CODESTRAL_API_KEY")

;; Ollama (local, private)
(setq minuet-provider 'openai-fim-compatible)
(plist-put minuet-openai-fim-compatible-options
           :end-point "http://localhost:11434/v1/completions")
(plist-put minuet-openai-fim-compatible-options :api-key "TERM")
(plist-put minuet-openai-fim-compatible-options :model "qwen2.5-coder:3b")
Provider comparison
Provider       Free?  Quality        Speed
Gemini Flash   Yes    Good           Fast
Codestral      Yes*   Best for code  Fast
DeepSeek       Yes    Excellent      Slow
Ollama local   Yes    Varies         Depends on HW

*Codestral requires billing info but has generous free tier
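
DeepSeek appears in the comparison but has no config above. A hedged sketch using minuet's OpenAI-compatible provider; the endpoint and model name follow DeepSeek's public docs, so verify them against current documentation before relying on this:

;; DeepSeek via minuet's OpenAI-compatible provider (sketch; verify
;; endpoint and model name against DeepSeek's current docs)
(setq minuet-provider 'openai-compatible)
(plist-put minuet-openai-compatible-options
           :end-point "https://api.deepseek.com/v1/chat/completions")
;; As elsewhere in minuet, a string here names the env var to read
(plist-put minuet-openai-compatible-options :api-key "DEEPSEEK_API_KEY")
(plist-put minuet-openai-compatible-options :model "deepseek-chat")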

Completion Frameworks: Corfu vs Company

TL;DR
New setup: use Corfu (modern, lighter). Existing Company config: keep Company; AI tools work with both.
Corfu (Modern)
+ Uses Emacs native CAPF
+ Lighter, faster
+ Future-proof
- Terminal needs Emacs 31+
Company (Classic)
+ More documentation
+ Works everywhere
+ Mature ecosystem
- Own backend API

Minimal Corfu + AI Setup

(use-package corfu
  :ensure t
  :custom
  (corfu-auto t)
  (corfu-auto-delay 0.2)
  (corfu-auto-prefix 2)
  :init
  (global-corfu-mode))

;; Add codeium (or other AI capf)
(add-to-list 'completion-at-point-functions #'codeium-completion-at-point)

Minimal Company + AI Setup

(use-package company
  :ensure t
  :hook (after-init . global-company-mode)
  :config
  (setq company-idle-delay 0.1
        company-minimum-prefix-length 1))

;; Codeium works via company-capf backend automatically
(add-to-list 'completion-at-point-functions #'codeium-completion-at-point)

Combining AI + LSP with Cape

(use-package cape
  :ensure t)

;; Merge codeium with LSP completions
(defun my/setup-ai-lsp ()
  (setq-local completion-at-point-functions
              (list (cape-capf-super
                     #'codeium-completion-at-point
                     #'eglot-completion-at-point))))
(add-hook 'eglot-managed-mode-hook #'my/setup-ai-lsp)
Which autocomplete to use?
Zero cost: codeium.el. Best quality: copilot.el. Provider flexibility or fully local: minuet-ai.

LLM Chat Interfaces

For exploration, explanation, and iterative conversation with AI.

gptel - The Swiss Army knife

Simple, extensible LLM client that works anywhere in Emacs: any buffer, shell, or the minibuffer. Supports OpenAI, Anthropic, Gemini, xAI, Ollama, and OpenAI-compatible backends. Tool use, MCP integration, multi-modal input (images, documents). (v0.9.9, 1100+ stars)

(use-package gptel
  :config
  (setq gptel-model 'claude-sonnet-4-20250514
        gptel-backend (gptel-make-anthropic "Claude"
                        :key 'gptel-api-key-from-auth-source)))
Key feature

gptel-rewrite lets you select code and ask for modifications inline.
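
Beyond interactive use, gptel also has a programmatic entry point. A minimal sketch using gptel-request, assuming a backend and key are already configured as above; the prompt text is illustrative:

;; One-shot programmatic request (sketch); the callback receives
;; the response string, or nil plus an info plist on error
(gptel-request
 "Explain what a completion-at-point function is, in one sentence."
 :callback (lambda (response info)
             (if response
                 (message "gptel: %s" response)
               (message "gptel error: %s" (plist-get info :status)))))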

ellama (GNU ELPA) - Ollama-focused

Tool for interacting with LLMs: translation, code review, summarization, grammar fixes. Built on the llm package. Great for local models via Ollama. (v1.8.6, 880+ stars)

(use-package ellama
  :init
  (setopt ellama-provider
          (make-llm-ollama
           :chat-model "qwen2.5-coder:7b")))
llm (GNU ELPA) - Foundation library

Low-level library abstracting LLM capabilities. Used by ellama and others. Supports multiple providers through a unified API. If you're building your own AI tools, start here.
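
To make "start here" concrete, a hedged sketch of the llm API against a local Ollama model; the model name is an assumption, swap in whatever you have pulled:

;; Minimal llm usage (sketch); assumes Ollama is running locally
;; and the named model has been pulled
(require 'llm)
(require 'llm-ollama)

(let ((provider (make-llm-ollama :chat-model "qwen2.5-coder:7b")))
  ;; llm-chat blocks until the model responds and returns a string;
  ;; llm-chat-async is the non-blocking variant
  (llm-chat provider
            (llm-make-chat-prompt "Write a haiku about Emacs")))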

Which chat interface to use?
gptel for an all-purpose client, ellama for local Ollama workflows, llm as a library when building your own tools.

Agentic AI Assistants

Autonomous tools that can edit files, run commands, and complete multi-step tasks.

emigo - Future of agentic development in Emacs

Emacs-native AI assistant that understands and interacts with your codebase. From the creators of EAF and lsp-bridge; spiritual successor to aidermacs. Uses tools to interact with the environment based on LLM reasoning. Connects to various providers via LiteLLM. (early stage)

Under active development

Expect breaking changes. Use for testing and feedback.

aidermacs - AI pair programming via Aider

Integrates Aider into Emacs for a Cursor-like experience. Native multiline prompts, Tramp support (SSH, Docker), session scratchpads. Works with Claude, DeepSeek, ChatGPT, and local models. (NonGNU ELPA)

(use-package aidermacs
  :config
  (setq aidermacs-backend 'vterm)  ; or 'comint
  (global-set-key (kbd "C-c a") 'aidermacs-transient-menu))
aider.el - Lightweight Aider UI

Minimal Emacs UI for Aider. AI-driven agile workflows (TDD, refactoring) and bootstrapping utilities for new files and projects. (v0.13.1)

Claude Code - Terminal-native, editor-agnostic

Anthropic's official CLI agent. Works in any terminal and integrates with any editor via the file system. Powerful tool use, web search, multi-file edits. Run via the claude command.
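
A couple of common invocations, assuming the CLI is installed; flags follow Anthropic's documentation:

# Interactive session in the current repo
claude

# One-shot, non-interactive "print" mode
claude -p "summarize the failing tests in this repo"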

aidermacs / aider.el
+ Mature, well-tested
+ Multiple LLM backends
- Requires Aider CLI
emigo
+ Pure Emacs-native
+ No external deps
- Early stage

Takopi

Telegram bridge for AI coding agents. Send tasks from anywhere, monitor progress remotely.

What it solves

Long-running agent tasks don't need you at the keyboard: dispatch work and monitor progress from your phone.

Supported Engines

Engine        Command    Notes
Claude Code   /claude    Anthropic's agent
Codex         /codex     OpenAI agent
OpenCode      /opencode  Open-source alternative
Pi            /pi        Lightweight agent

Workflow Modes

Assistant Mode: continuous chat, /new to reset. Best for solo work, natural flow.
Workspace Mode: forum topics bound to repos/branches. Best for teams, organized multi-repo work.
Handoff Mode: reply-to-continue with resume lines. Best for explicit control, terminal-centric use.

Quick Start

uv tool install -U takopi
takopi --onboard

# Register a project
takopi init my-project

# In Telegram:
/my-project fix the login bug
/claude refactor the auth module
Key Features
Parallel runs: across sessions
Voice notes: dictate via Telegram
Git worktrees: branch isolation
File transfer: send/receive files

Krons Infrastructure

The assistant ecosystem powering this guide. Skills, agents, and orchestration for Claude Code.

Skills + Agents for Claude Code

Configuration system packaging domain knowledge into reusable components. 20 skills, 7 agents, 4 commands, 5 hooks. Skills auto-activate based on file context - open a .rs file and Rust patterns inject automatically.

Skills (20)

Languages: Go, Python, Rust, TS, SQL
Domains: CLI, services, data, ops, trading
Workflows: commits, builds, shipping, refine

Agents (7)

@improve: code enhancement loops
@readme: doc synchronization
@learn: pattern extraction
@visual: SVG/UI rendering
@distill: knowledge synthesis
@research: deep investigation

Key Commands

Command  Function
/ship    Spec-driven outer loop: reads specs/, topologically sorts deps, delegates builds
/refine  7-step finalization: checkpoint, validate, improve, document, commit
/build   Inner loop: planner-worker-judge pattern with parallel execution
# Install
cd kronael/assistants/claude-template
claude "install"
# Compares with ~/.claude/ and prompts before overwriting
ship - Autonomous planner-worker-judge orchestrator

Processes design specs through a multi-stage pipeline. Parallel task execution with configurable workers; injects skills from ~/.claude/skills/ into worker prompts. Resumable with ship -c.

Pipeline Stages
1. Validator: evaluates design quality
2. Planner: decomposes into tasks (runs once)
3. Workers: parallel execution via Claude CLI
4. Judge: monitors completion, triggers refinement
5. Refiner: analyzes results via Codex CLI
6. Replanner: catches missed work
# Run ship on a design spec
ship design.md

# Resume interrupted run
ship -c

# Verbose mode
ship -v design.md
Container-based assistant deployment

Docker container that runs takopi with Vite dev server for web deployments. Auto-discovers projects from /workspace. Nginx proxy to krons.fiu.wtf.

krons.fiu.wtf
Web deployment target

Any app_name/index.html in the web directory becomes available at https://krons.fiu.wtf/app_name/. No restart needed - Vite hot reloads.

Architecture

specs/        kronael/ship        Claude Code
  |               |                   |
  v               v                   v
design.md --> Planner --> Workers --> File system
                |             |
                v             v
            Judge <----- Results
                |
                v
            Refiner --> Follow-up tasks
    
The key insight

Claude Code is capable out of the box, but it doesn't know your patterns. kronael/assistants encodes cumulative lessons into every session via auto-activating skills.

Learning Path

From zero to AI-augmented coding
1. Install one autocomplete tool
   Start with codeium.el (free) or copilot.el (paid). Just one.
2. Add gptel for chat
   Configure with your preferred LLM. Use for explanations and exploration.
3. Try an agentic tool
   aidermacs for Emacs integration, Claude Code for terminal-first workflow.
4. Set up remote access (optional)
   Takopi for mobile task dispatch. Useful for long-running tasks.

Don't install everything at once. Master each tool before adding the next. AI tools are force multipliers: they amplify both productivity and confusion.