Maximizing AI productivity for minimalist coders
| Package | Cost | Deps | Setup | Best For |
|---|---|---|---|---|
| codeium.el | Free | libxml2 | 3 steps | Zero cost, just works |
| copilot.el | $10/mo* | Node 22+ | 4 steps | Best quality |
| minuet-ai | Varies | plz, dash | 5 steps | Multi-provider |
*Free tier: 2000 completions/mo. Students get Pro free via GitHub Education.
Requires an Emacs built with libxml2. To check, evaluate M-: (libxml-available-p); it should return t.
;; With straight.el
(use-package codeium
  :straight (:host github :repo "Exafunction/codeium.el")
  :init
  (add-to-list 'completion-at-point-functions #'codeium-completion-at-point)
  :config
  (setq use-dialog-box nil))

;; With quelpa
(use-package codeium
  :quelpa (codeium :fetcher github :repo "Exafunction/codeium.el")
  :init
  (add-to-list 'completion-at-point-functions #'codeium-completion-at-point))
M-x codeium-install
M-x codeium-init
;; Browser opens -> sign in with Google/GitHub/email
;; Token auto-applies, or paste manually
M-x codeium-diagnose
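Once installed, a quick way to confirm the capf is wired up (assuming the :init snippet above ran) is to evaluate this in a code buffer:

```elisp
;; Non-nil means codeium is active as a completion-at-point function here
(memq #'codeium-completion-at-point completion-at-point-functions)
```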
# Debian/Ubuntu
sudo apt install libxml2-dev
# Then reinstall Emacs or use the emacs-gtk package
;; Emacs 29+ with use-package + vc
(use-package copilot
  :vc (:url "https://github.com/copilot-emacs/copilot.el"
       :rev :newest :branch "main")
  :hook (prog-mode . copilot-mode)
  :bind (:map copilot-completion-map
              ("<tab>" . copilot-accept-completion)
              ("TAB" . copilot-accept-completion)
              ("M-n" . copilot-next-completion)
              ("M-p" . copilot-previous-completion)))
;; With straight.el
(use-package copilot
  :straight (:host github :repo "copilot-emacs/copilot.el")
  :hook (prog-mode . copilot-mode)
  :bind (:map copilot-completion-map
              ("<tab>" . copilot-accept-completion)))
M-x copilot-install-server
M-x copilot-login ;; Opens browser for GitHub OAuth
M-x copilot-diagnose ;; Should show "Enabled" and your subscription status
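copilot.el exposes copilot-disable-predicates, a list of zero-argument functions that suppress suggestions when any returns non-nil. A sketch for keeping suggestions out of prose buffers (the text-mode check is an illustrative choice):

```elisp
;; Suppress Copilot suggestions in text-mode buffers and their derivatives
(add-to-list 'copilot-disable-predicates
             (lambda () (derived-mode-p 'text-mode)))
```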
# Check version (must be 22+)
node --version

# If using nvm:
nvm install 22 && nvm use 22

# If using system node (Debian/Ubuntu):
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash -
sudo apt install -y nodejs
Students: Free Pro via GitHub Student Developer Pack.
Everyone: Free tier gives 2000 completions + 50 chat/month.
Requires the plz (0.9+) and dash packages.

;; From GNU ELPA (simplest)
(use-package minuet
  :ensure t
  :bind (("M-y" . minuet-complete-with-minibuffer)
         ("M-i" . minuet-show-suggestion))
  :hook (prog-mode . minuet-auto-suggestion-mode))
export GEMINI_API_KEY="your-key"
export CODESTRAL_API_KEY="your-key"
ollama pull qwen2.5-coder:3b
ollama serve

;; Gemini (free, fast)
(setq minuet-provider 'gemini)
(plist-put minuet-gemini-options :api-key "GEMINI_API_KEY")
(plist-put minuet-gemini-options :model "gemini-2.0-flash")
;; Codestral (free, code-optimized)
(setq minuet-provider 'codestral)
(plist-put minuet-codestral-options :api-key "CODESTRAL_API_KEY")
;; Ollama (local, private)
(setq minuet-provider 'openai-fim-compatible)
(plist-put minuet-openai-fim-compatible-options
           :end-point "http://localhost:11434/v1/completions")
(plist-put minuet-openai-fim-compatible-options :api-key "TERM")
(plist-put minuet-openai-fim-compatible-options :model "qwen2.5-coder:3b")
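Besides naming an environment variable, minuet's :api-key also accepts a function, so the key can come from auth-source instead of the shell. A sketch, assuming a matching entry in ~/.authinfo (the host name here is illustrative):

```elisp
;; Read the Gemini key from auth-source rather than the environment
(plist-put minuet-gemini-options
           :api-key (lambda ()
                      (auth-source-pick-first-password
                       :host "generativelanguage.googleapis.com")))
```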
| Provider | Free? | Quality | Speed |
|---|---|---|---|
| Gemini Flash | Yes | Good | Fast |
| Codestral | Yes* | Best for code | Fast |
| DeepSeek | Yes | Excellent | Slow |
| Ollama local | Yes | Varies | Depends on HW |
*Codestral requires billing info but has a generous free tier.
(use-package corfu
  :ensure t
  :custom
  (corfu-auto t)
  (corfu-auto-delay 0.2)
  (corfu-auto-prefix 2)
  :init
  (global-corfu-mode))

;; Add codeium (or other AI capf)
(add-to-list 'completion-at-point-functions #'codeium-completion-at-point)
(use-package company
  :ensure t
  :hook (after-init . global-company-mode)
  :config
  (setq company-idle-delay 0.1
        company-minimum-prefix-length 1))
;; Codeium works via company-capf backend automatically
(add-to-list 'completion-at-point-functions #'codeium-completion-at-point)
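If AI completions don't surface, one blunt diagnostic is to make company consult only the capf backend. A sketch; note this drops company's other default backends:

```elisp
;; company-capf bridges completion-at-point-functions (codeium included)
(setq company-backends '(company-capf))
```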
(use-package cape
  :ensure t)

;; Merge codeium with LSP completions
(defun my/setup-ai-lsp ()
  (setq-local completion-at-point-functions
              (list (cape-capf-super
                     #'codeium-completion-at-point
                     #'eglot-completion-at-point))))
(add-hook 'eglot-managed-mode-hook #'my/setup-ai-lsp)
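When the merged capf gets noisy, a buffer-local toggle is handy. This helper is hypothetical (not part of cape or codeium), shown as one way to flip the AI capf off and on:

```elisp
(defun my/toggle-ai-capf ()
  "Toggle codeium's capf in the current buffer."
  (interactive)
  (if (memq #'codeium-completion-at-point completion-at-point-functions)
      (remove-hook 'completion-at-point-functions
                   #'codeium-completion-at-point t)
    (add-hook 'completion-at-point-functions
              #'codeium-completion-at-point nil t)))
```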
For exploration, explanation, and iterative conversation with AI.
Simple, extensible LLM client that works anywhere in Emacs: any buffer, shell, or the minibuffer. Supports OpenAI, Anthropic, Gemini, xAI, Ollama, and OpenAI-compatible backends. Tool use, MCP integration, multi-modal input (images, documents). v0.9.9, 1100+ stars.
(use-package gptel
  :config
  (setq gptel-model 'claude-sonnet-4-20250514
        gptel-backend (gptel-make-anthropic "Claude"
                        :key 'gptel-api-key-from-auth-source)))
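gptel also ships an Ollama backend constructor, so the same client can target a local model. A sketch, assuming the qwen2.5-coder model from the minuet section is already pulled:

```elisp
;; Point gptel at a local Ollama server instead of Anthropic
(setq gptel-backend (gptel-make-ollama "Ollama"
                      :host "localhost:11434"
                      :stream t
                      :models '(qwen2.5-coder:7b))
      gptel-model 'qwen2.5-coder:7b)
```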
gptel-rewrite lets you select code and ask for modifications inline.
Tool for interacting with LLMs. Translation, code review, summarization, grammar fixes.
Built on the llm package. Great for local models via Ollama.
v1.8.6, GNU ELPA, 880+ stars.
(use-package ellama
  :init
  (setopt ellama-provider
          (make-llm-ollama
           :chat-model "qwen2.5-coder:7b")))
Low-level library abstracting LLM capabilities. Used by ellama and others. Supports multiple providers with unified API. If building your own AI tools, start here.
Autonomous tools that can edit files, run commands, and complete multi-step tasks.
Emacs-native AI assistant that understands and interacts with your codebase. From the creators of EAF and lsp-bridge. Spiritual successor to aidermacs. Uses tools to interact with the environment based on LLM reasoning. Connects to various providers via LiteLLM. Early stage.
Expect breaking changes. Use for testing and feedback.
Integrates Aider into Emacs. Cursor-like experience. Native multiline prompts, Tramp support (SSH, Docker), session scratchpads. Works with Claude, DeepSeek, ChatGPT, local models. NonGNU ELPA
(use-package aidermacs
  :config
  (setq aidermacs-backend 'vterm) ; or 'comint
  (global-set-key (kbd "C-c a") 'aidermacs-transient-menu))
Minimal Emacs UI for Aider. AI-driven agile workflows (TDD, refactoring). Bootstrapping utilities for new files and projects. v0.13.1
Anthropic's official CLI agent. Works in any terminal, integrates with any editor via file system.
Powerful tool-use, web search, multi-file edits. Run via claude command.
Telegram bridge for AI coding agents. Send tasks from anywhere, monitor progress remotely.
| Engine | Command | Notes |
|---|---|---|
| Claude Code | /claude | Anthropic's agent |
| Codex | /codex | OpenAI agent |
| OpenCode | /opencode | Open-source alternative |
| Pi | /pi | Lightweight agent |
uv tool install -U takopi
takopi --onboard

# Register a project
takopi init my-project

# In Telegram:
/my-project fix the login bug
/claude refactor the auth module
The assistant ecosystem powering this guide. Skills, agents, and orchestration for Claude Code.
Configuration system packaging domain knowledge into reusable components.
20 skills, 7 agents, 4 commands, 5 hooks.
Skills auto-activate based on file context: open a .rs file and Rust patterns inject automatically.
| Command | Function |
|---|---|
| /ship | Spec-driven outer loop: reads specs/, topologically sorts deps, delegates builds |
| /refine | 7-step finalization: checkpoint, validate, improve, document, commit |
| /build | Inner loop: planner-worker-judge pattern with parallel execution |
# Install
cd kronael/assistants/claude-template
claude "install"
# Compares with ~/.claude/ and prompts before overwriting
Processes design specs through a multi-stage pipeline. Parallel task execution with configurable workers.
Injects skills from ~/.claude/skills/ into worker prompts. Resumable with ship -c.
# Run ship on a design spec
ship design.md

# Resume an interrupted run
ship -c

# Verbose mode
ship -v design.md
Docker container that runs takopi with Vite dev server for web deployments.
Auto-discovers projects from /workspace. Nginx proxy to krons.fiu.wtf.
Any app_name/index.html in the web directory becomes available at
https://krons.fiu.wtf/app_name/. No restart needed; Vite hot reloads.
specs/           kronael/ship         Claude Code
   |                  |                    |
   v                  v                    v
design.md  -->    Planner    -->      Workers   -->  File system
                      |                    |
                      v                    v
                    Judge   <-----     Results
                      |
                      v
                   Refiner  -->  Follow-up tasks
Claude Code is capable out of the box, but it doesn't know your patterns. kronael/assistants encodes cumulative lessons into every session via auto-activating skills.