Open source tools for software engineering teams using AI
Building Blocks
Each capability ships as a library, CLI, MCP server, and REST API from a single codebase.
Every block works standalone. When peers are present, they discover each other and compose — Biff speaks through Vox, PR/FAQ searches through Quarry.
Quarry
recall Unlock the knowledge trapped on your hard drive.
Local semantic search across PDFs, images, spreadsheets, source code, and 30+ formats. Finds what you mean, not just what you typed. Runs entirely offline — no API keys required.
Semantic search across 30+ file formats including scanned documents via OCR
Fully local — embedding model downloads once, everything stays on your machine
Sub-second results with LanceDB backend
Named databases keep work and personal content isolated
Works as CLI, MCP server, or macOS menu bar app
$ curl -fsSL https://raw.githubusercontent.com/punt-labs/quarry/996c44b/install.sh | sh
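The core idea behind semantic search — embed the query and every document, then rank by vector similarity rather than keyword match — can be sketched in a few lines. This is an illustration of the technique, not Quarry's actual API; the toy three-dimensional embeddings stand in for what a real local embedding model produces.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(query_vec, index, top_k=3):
    # index: list of (doc_id, embedding) pairs from a local model.
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:top_k]]

index = [
    ("invoice.pdf", [0.9, 0.1, 0.0]),
    ("notes.md",    [0.1, 0.8, 0.2]),
    ("photo.jpg",   [0.0, 0.2, 0.9]),
]
print(search([0.85, 0.15, 0.05], index, top_k=1))  # → ['invoice.pdf']
```

A real index stores these vectors in LanceDB rather than a Python list, which is what makes sub-second results over 30+ formats feasible.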
Biff
v0.15.1 beta
coordination Team communication for engineers who never leave the terminal.
Resurrects the BSD Unix communication vocabulary as MCP-native slash commands over a NATS relay. Humans and AI agents show up side by side — no separate app, no browser tab, no context switch.
CLI parity: every slash command also available as biff <command> with --json output
Humans and autonomous agents appear side by side in /who
/wall broadcasts with duration-based expiry — ambient awareness without inbox noise
/talk for real-time conversations with ≤2s notification latency
Cross-machine messaging via NATS relay
$ curl -fsSL https://raw.githubusercontent.com/punt-labs/biff/a7ac684/install.sh | sh
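Duration-based expiry is what keeps /wall broadcasts ambient: a post simply stops appearing after its duration elapses, with nothing to acknowledge or archive. A minimal sketch of that behavior, assuming a simple in-memory store (the class and method names here are illustrative, not Biff's implementation):

```python
import time

class Wall:
    """Broadcasts with duration-based expiry (illustrative, not Biff's API)."""

    def __init__(self):
        self._posts = []  # (expires_at, sender, text)

    def wall(self, sender, text, duration_s):
        self._posts.append((time.monotonic() + duration_s, sender, text))

    def active(self, now=None):
        now = time.monotonic() if now is None else now
        # Expired broadcasts drop out silently — no inbox, no unread count.
        self._posts = [p for p in self._posts if p[0] > now]
        return [(sender, text) for _, sender, text in self._posts]

w = Wall()
w.wall("alice", "deploy starting", duration_s=600)
w.wall("ci-bot", "build green", duration_s=0.01)
time.sleep(0.05)
print(w.active())  # → [('alice', 'deploy starting')]
```

In Biff itself the posts travel over the NATS relay so every machine sees the same wall; the expiry logic is the part sketched here.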
Vox
v1.2.1
speech Voice for your AI engineering assistant.
General-purpose text-to-speech engine with multi-provider support — ElevenLabs, OpenAI, and AWS Polly. Delivers spoken notifications when tasks finish, chimes when Claude needs input, and synthesizes arbitrary text. Opt-in only: no audio until you enable it.
Mic API — unmute/record/vibe/who MCP tools with uniform segment input
Five providers — ElevenLabs (recommended), OpenAI, AWS Polly, macOS say, Linux espeak-ng
Voice or chime — /mute switches to audio tones with no TTS API calls
CLI product commands: unmute, record, vibe, on/off, mute, version, status
$ curl -fsSL https://raw.githubusercontent.com/punt-labs/vox/342504c/install.sh | sh
Lux
v0.0.0 alpha
visuals A visual output surface for AI agents.
ImGui display server connected by Unix socket IPC — agents send JSON element trees via MCP tools, the display renders them at 60fps. 22 element kinds including interactive controls (sliders, checkboxes, combos, color pickers), data visualization (tables, plots, markdown), and layout nesting (windows, tabs, groups). Incremental updates patch elements by ID without replacing the scene. The visual counterpart to Vox.
22 element kinds — text, buttons, images, sliders, tables, plots, markdown, draw canvases, and more
Layout nesting — windows contain tab bars contain groups contain any element, arbitrarily deep
Incremental updates — patch individual elements by ID without replacing the scene
Interaction events — clicks, slider changes, menu selections queue as events the agent reads via recv
Render functions — agent-submitted Python code with AST safety scanning and consent dialog
Unix socket IPC — length-prefixed JSON frames, no HTTP overhead
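The wire format described above — length-prefixed JSON frames over a Unix socket — can be sketched as a pair of encode/decode functions. The 4-byte big-endian prefix is an assumption for illustration (the source says only "length-prefixed"), and the `op`/`id` field names are hypothetical, not Lux's actual schema.

```python
import json
import struct

def encode_frame(obj):
    # 4-byte big-endian length prefix, then UTF-8 JSON payload.
    # (Prefix width and byte order are assumptions for this sketch.)
    payload = json.dumps(obj).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_frame(buf):
    # Returns (decoded object, remaining bytes), or (None, buf) if the
    # buffer does not yet hold a complete frame.
    if len(buf) < 4:
        return None, buf
    (length,) = struct.unpack(">I", buf[:4])
    if len(buf) < 4 + length:
        return None, buf
    return json.loads(buf[4:4 + length]), buf[4 + length:]

frame = encode_frame({"op": "patch", "id": "slider-1", "value": 0.5})
obj, rest = decode_frame(frame)
print(obj["id"], rest == b"")  # → slider-1 True
```

Framing like this is why the socket needs no HTTP overhead: the reader knows exactly how many bytes the next JSON document occupies before parsing it.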
Coming Soon
Persona
v0.0.0 alpha
character Character, voice, and teaching philosophy for every domain tool.
Persona will extract the character layer from domain tools into a standalone building block. Each persona combines a name, personality, teaching philosophy, and voice hint — grounded in Mollick & Mollick's seven pedagogical roles. The pattern already works in LangLearn, where 28 named instructors (Profesor Garcia, Madame Moreau, Tanaka-sensei) teach through tutor, coach, and simulator roles. Persona would generalize this so Z Spec gets a formal methods mentor, PR/FAQ gets a product strategist, and Use Cases gets a requirements analyst.
Named characters with personality, teaching philosophy, and voice hint
Grounded in Mollick & Mollick’s seven pedagogical roles: Tutor, Coach, Simulator, Mentor, and more
Domain tools auto-select a persona — or the user overrides with their own
Writes state to a shared directory — Vox reads the voice hint, tools read the character
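The shared-directory handoff described above can be sketched as a small data structure written to disk: Vox reads only the voice hint, while domain tools read the character fields. The dataclass fields come from the description; the `persona.json` filename and layout are hypothetical.

```python
import json
import tempfile
from dataclasses import dataclass, asdict
from pathlib import Path

@dataclass
class Persona:
    # Fields named in the description; file layout below is an assumption.
    name: str
    personality: str
    teaching_philosophy: str
    voice_hint: str

def write_state(persona, shared_dir):
    # Domain tools read the character fields; Vox reads only voice_hint.
    path = Path(shared_dir) / "persona.json"
    path.write_text(json.dumps(asdict(persona), indent=2))
    return path

def read_voice_hint(shared_dir):
    data = json.loads((Path(shared_dir) / "persona.json").read_text())
    return data["voice_hint"]

with tempfile.TemporaryDirectory() as d:
    write_state(Persona("Profesor Garcia", "warm, patient",
                        "tutor", "es-ES male"), d)
    print(read_voice_hint(d))  # → es-ES male
```

Keeping the state in a plain file, rather than an API, is what lets each block stay standalone: a tool that finds no persona file simply uses its defaults.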
Coming Soon
Tally
v0.0.0 alpha
metering Know what your AI agents cost before the invoice arrives.
Tally will track token consumption, model usage, and spend across sessions and projects, exposed through the same universal access pattern as every other building block — library, CLI, MCP server, and REST.
Per-session and per-project token and cost tracking
Multi-model awareness — tracks different pricing across providers
Historical trends and spend alerts
Queryable via CLI, MCP, or REST
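Per-session cost tracking with multi-model pricing reduces to a small accumulator over a rate table. A sketch of the idea — the model names and per-million-token rates below are made up for illustration, not real provider pricing, and this is not Tally's actual API:

```python
# Hypothetical pricing table: USD per 1M tokens (not real rates).
PRICING = {
    "model-a": {"input": 3.00, "output": 15.00},
    "model-b": {"input": 0.25, "output": 1.25},
}

class Tally:
    def __init__(self):
        self.sessions = {}  # session name -> accumulated USD

    def record(self, session, model, input_tokens, output_tokens):
        # Each provider/model prices input and output tokens differently,
        # so cost must be computed per call, not per token count alone.
        rate = PRICING[model]
        cost = (input_tokens * rate["input"]
                + output_tokens * rate["output"]) / 1_000_000
        self.sessions[session] = self.sessions.get(session, 0.0) + cost

    def cost(self, session):
        return round(self.sessions[session], 6)

t = Tally()
t.record("refactor", "model-a", input_tokens=120_000, output_tokens=8_000)
t.record("refactor", "model-b", input_tokens=500_000, output_tokens=40_000)
print(t.cost("refactor"))  # → 0.655
```

Grouping by session and project on top of this is a matter of keying the accumulator, which is why the same data can back both CLI queries and spend alerts.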
Punt Kit
v0.3.0 alpha
standards Standards, scaffolding, and the universal access pattern every tool follows.
Every Punt Labs tool follows the same pattern: library, CLI, MCP server, and REST — built from a single codebase. Punt Kit defines this pattern along with standards for code quality, CI, and project structure. All other tools stand on this.
Defines the universal access pattern: library, CLI, MCP, and REST from one codebase
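The universal access pattern boils down to one rule: the library function is the single source of behavior, and every other surface is a thin wrapper around it. A minimal sketch under that assumption (the `greet` function and `demo` program are hypothetical, not part of Punt Kit):

```python
import argparse

def greet(name):
    # The library surface: a plain function any caller can import.
    return f"hello, {name}"

def main(argv=None):
    # The CLI surface wraps the same function. In a real tool, the MCP
    # server and REST handlers would wrap greet() the same way, so all
    # four surfaces share one implementation and one test suite.
    parser = argparse.ArgumentParser(prog="demo")
    parser.add_argument("name")
    args = parser.parse_args(argv)
    print(greet(args.name))

main(["world"])  # → hello, world
```

Because the wrappers hold no logic of their own, a bug fixed in the library is fixed in the CLI, MCP, and REST surfaces at once — the property the pattern exists to guarantee.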