Chat with LLMs, manage sessions, and run AI agents from your terminal
Interactive Chat
Chat with any LLM provider directly from your terminal with streaming responses and markdown rendering.
Autonomous Executor
Execute tasks from ROADMAP.md files autonomously. Human-in-the-loop approval, dry-run mode, and rollback support.
Provider Agnostic
Switch between OpenAI, Anthropic, Google, and more with a single command. Auto-detects configured providers.
Session Persistence
Auto-saves conversations with LLM-generated names. Resume, switch, or manage sessions seamlessly.
MCP Server Testing
Debug and test MCP servers. List tools, prompts, resources, and call them directly from the CLI.
Image Generation
Generate images from text prompts using any configured provider. List available models and providers.
Voice Mode
Speak your messages aloud and hear responses read back. Perfect for hands-free AI interactions.
Easy Setup
Interactive setup wizard guides you through API key configuration. Get started in under a minute.
The ai-infra CLI brings the full power of ai-infra to your terminal. Instead of context-switching between browser tabs and code editors, stay in your flow state. Chat with LLMs, execute autonomous tasks from roadmap files, test MCP servers, generate images, and manage your AI infrastructure—all from one unified command-line interface.
The executor reads tasks from a ROADMAP.md file and executes them autonomously using an AI agent.
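As an illustration, a minimal ROADMAP.md could be a plain task list, and a dry run would preview the agent's plan before anything is changed. Both the file format and the command/flag names below are assumptions for the sake of example, not confirmed syntax; check the CLI's help output for the real interface.

```markdown
# ROADMAP.md — hypothetical example task list
- [ ] Add input validation to the signup form
- [ ] Write unit tests for the session store
```

```bash
# Hypothetical invocation — subcommand and flags are illustrative
ai-infra execute ROADMAP.md --dry-run   # preview planned changes without applying them
```

Pairing dry-run mode with human-in-the-loop approval means the agent proposes a change set first and only applies it once you confirm.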
Debug and test MCP servers directly from the command line.
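A debugging session might look like the sketch below. The subcommand and argument names are assumptions chosen to match the features described above (listing tools, prompts, and resources, then calling one directly); consult the CLI's actual help text for the real syntax.

```bash
# Hypothetical MCP inspection workflow
ai-infra mcp list-tools                         # enumerate the tools a server exposes
ai-infra mcp list-prompts                       # enumerate available prompts
ai-infra mcp call my-tool --arg query="hello"   # invoke a tool directly and print its result
```

Calling a tool from the CLI lets you verify a server's schema and responses without wiring it into a full client first.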
Once inside the chat REPL, use these commands to control your session:
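The command names below are generic placeholders of the kind REPL chats commonly support, not ai-infra's documented commands; refer to the in-chat help for the actual list.

```
/help       show available commands            (placeholder name)
/sessions   list saved sessions                (placeholder name)
/new        start a fresh session              (placeholder name)
/exit       leave the chat REPL                (placeholder name)
```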
Get started with the ai-infra CLI in under a minute.
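A first run might look like this. The install channel and the `setup` subcommand name are assumptions for illustration; follow the project's own install instructions.

```bash
# Hypothetical first-run flow
npm install -g ai-infra   # assumed distribution channel — check the project's docs
ai-infra setup            # interactive wizard that walks through API key configuration
ai-infra chat             # start chatting once a provider is configured
```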