rekipedia scans any repository into a portable SQLite knowledge store and gives every developer an LLM-powered tech lead they can ask anything. No hallucinations, no guessing — every answer is grounded in your actual source code and wiki.
- `reki export` — ready to share or import.
- `--embed-provider openai|ollama|azure|...` — any litellm-compatible embedding model for semantic RAG search.

Choose your preferred installation method:
```bash
# Zero install required — runs directly via npx
npx rekipedia init .
npx rekipedia scan .
npx rekipedia serve .
```
Requires Node.js 18+. The npm package wraps the Python CLI via uvx.
```bash
# Zero install required — runs directly via uvx
uvx rekipedia init .
uvx rekipedia scan .
uvx rekipedia serve .
```
Requires uv. Fastest cold-start with automatic venv isolation.
```bash
# Core install (scan + serve + ask)
pip install rekipedia

# With semantic RAG support (FAISS + numpy, ~100MB)
pip install "rekipedia[rag]"

# Or with uv tool (isolated, globally available)
uv tool install rekipedia
```
Requires Python 3.11+. Both rekipedia and reki commands are available after install.
```bash
# Add the tap (first time only)
brew tap unrealandychan/tap

# Install the Go single binary — no Python required
brew install rekipedia

# Verify
reki --version
```
The Homebrew formula installs a single statically-linked binary with no runtime dependencies. Recommended for macOS and Linux.
All commands work with both reki (short) and rekipedia (full).
| Command | Description |
|---|---|
| `reki init [REPO]` | Scaffold `.rekipedia/` with `config.yml` and update `.gitignore` |
| `reki scan [REPO]` | Full analysis — extract symbols, synthesise wiki pages, export JSON |
| `reki update [REPO]` | Incremental refresh — re-processes only changed files, keeps the rest |
| `reki ask [QUESTION]` | Interactive Q&A REPL — streaming answers grounded in your codebase |
| `reki serve [REPO]` | Start a local web UI to browse wiki pages and ask questions |
| `reki embed [REPO]` | Build or rebuild the FAISS semantic search index for hybrid RAG Q&A |
| `reki export [REPO]` | Bundle the wiki to a single file — `--format md\|zip\|json` |
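A typical first run chains these commands. This is a hedged sketch using only the commands and flags documented above; the question text is illustrative:

```bash
reki init .                      # scaffold .rekipedia/ and config.yml
reki scan .                      # full analysis: symbols, wiki pages, JSON export
reki embed .                     # build the FAISS index for hybrid RAG Q&A
reki ask "How does auth work?"   # interactive Q&A grounded in the scan
reki export . --format md        # bundle the wiki as a single Markdown file
```

After the initial scan, `reki update .` is enough to pick up incremental changes.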
Powered by litellm — plug in any model with a one-line change in .rekipedia/config.yml.
```yaml
# .rekipedia/config.yml
version: 1
llm:
  model: ollama/llama4   # any litellm string
  api_key: ""            # or REKIPEDIA_API_KEY
  base_url: ""           # for local endpoints
  temperature: 0.2
ignore:
  - .git
  - node_modules
  - __pycache__
languages:
  - python
  - typescript
```
| Provider | Model string |
|---|---|
| Ollama (local, free) | `ollama/llama4` |
| OpenAI | `gpt-4o` |
| Anthropic | `claude-opus-4-6` |
| Google Gemini | `gemini/gemini-2.0-pro` |
| Azure OpenAI | `azure/gpt-4o` |
| Any OpenAI-compatible | set `base_url` in config |
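For example, switching from local Ollama to OpenAI is a one-line change in `.rekipedia/config.yml` (a sketch using only the keys shown in the config above):

```yaml
llm:
  model: gpt-4o   # was: ollama/llama4
  api_key: ""     # leave empty to read REKIPEDIA_API_KEY from the environment
```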
Runtime overrides: `REKIPEDIA_API_KEY`, `REKIPEDIA_MODEL`, `OPENAI_API_KEY`.
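Environment variables let you swap models for a single session without touching the config file. A sketch using the variable names listed above (the key and question are placeholders):

```bash
export REKIPEDIA_MODEL="gpt-4o"
export REKIPEDIA_API_KEY="sk-..."   # placeholder — use your real key
reki ask "Where is rate limiting implemented?"
```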
Single statically-linked binary — no Python, no runtime. Latest release: v0.9.36
No servers, no cloud sync, no hallucinations. Just run reki scan . and ask anything.