# LiteCLI

Universal CLI/TUI for MCP servers + Skill Compiler for AI agent efficiency. Free and open source.
## Overview

Free and open source. Source code: github.com/ahostbr/LiteCLI

LiteCLI does two things nobody else has built:
- **Universal CLI for MCP servers.** Connect any MCP server, get CLI commands instantly. The tool's JSON Schema IS the specification. Zero analysis, zero code generation. Add a backend, run commands.
- **Skill Compiler.** Read an AI agent skill, extract the API logic, generate a standalone CLI with encrypted credentials and local LLM fallback. Skills stop burning context re-reasoning about plumbing. They become one-liners.
## The Problem

**MCP servers have no CLI.** The only way to call MCP tools is through an MCP client (Claude Desktop, Cursor). There's no curl-like way to call them from a terminal, a script, or CI. LiteCLI fixes that.

**Skills burn context.** Every time an AI agent invokes a skill like `/gen-image`, it loads 500+ tokens of instructions, re-reads which API to call, re-figures out auth, makes the HTTP request, and parses the response. The deterministic parts (endpoint, auth, payload, response format) never change, yet the agent re-reasons about them on every invocation. LiteCLI's Skill Compiler extracts that logic into a standalone CLI. The skill becomes a one-liner.
## Features

### MCP CLI Engine

- Register any MCP server (HTTP or stdio) as a backend
- Auto-generate Click CLI commands from JSON Schema tool definitions
- Commands grouped by backend prefix: `litecli kuro sots-help`, `litecli lite vault-read`
- Every schema type maps to the right Click type (string, int, float, bool, Choice, JSON)
- Required parameters enforced automatically
- Dual output: Rich tables for humans, `--json` for scripts and AI agents
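The schema-to-CLI mapping can be sketched in a few lines. This is an illustrative helper, not LiteCLI's actual internals: each JSON Schema property becomes an option spec, `enum` becomes a Choice-style constraint, and `required` entries are enforced.

```python
# Hypothetical sketch of the JSON Schema -> CLI option mapping
# (function and field names are assumptions, not LiteCLI internals).
def option_specs(schema: dict) -> list[dict]:
    type_map = {"string": str, "integer": int, "number": float, "boolean": bool}
    required = set(schema.get("required", []))
    specs = []
    for name, prop in schema.get("properties", {}).items():
        specs.append({
            "flag": "--" + name.replace("_", "-"),
            # "enum" maps to a Choice-style option; unknown types fall back to str
            "type": "choice" if "enum" in prop else type_map.get(prop.get("type"), str),
            "choices": prop.get("enum"),
            "required": name in required,
        })
    return specs

# A tool schema like this...
schema = {
    "properties": {
        "query": {"type": "string"},
        "top_k": {"type": "integer"},
        "mode": {"enum": ["fast", "full"]},
    },
    "required": ["query"],
}
# ...yields one option spec per property
specs = option_specs(schema)
```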
### Skill Compiler Engine

- AI-assisted compilation via the `/compile-skill` Claude Code meta-skill
- Automatic regex-based compilation via `litecli compile` for simple skills
- Generated CLIs include direct HTTP calls (httpx), Click options, error handling
- `--local` flag on every compiled CLI routes through local LLM (LM Studio)
- Encrypted credential storage via OS keyring (Windows Credential Manager, macOS Keychain)
- Manage compiled CLIs: `litecli compiled list`, `litecli compiled remove`, `litecli compiled run`
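To give a feel for the shape of a generated CLI, here is an illustrative skeleton. The real generator emits Click + httpx code; argparse and stdlib are used here to keep the sketch dependency-free, and the endpoint URLs are placeholders, not the actual APIs.

```python
# Illustrative skeleton of a compiled CLI (names and URLs are assumptions).
import argparse
import os

def build_parser() -> argparse.ArgumentParser:
    p = argparse.ArgumentParser(description="genimage (sketch)")
    p.add_argument("--prompt", required=True)
    p.add_argument("--output", default="out.png")
    p.add_argument("--local", action="store_true",
                   help="route through a local LLM instead of the cloud API")
    return p

def endpoint(local: bool) -> str:
    # --local swaps the cloud endpoint for the local LLM server
    if local:
        return os.environ.get("LITECLI_LLM_URL", "http://localhost:1234")
    return "https://api.example.com/v1/images"  # placeholder cloud URL

if __name__ == "__main__":
    args = build_parser().parse_args()
    # ...POST {"prompt": args.prompt} to endpoint(args.local),
    # then write the returned image bytes to args.output
```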
### TUI Application
- 4-tab Textual interface: Dashboard, Tools, Execute, Admin
- Sidebar showing backend connection status
- Interactive tool execution with live results
- Keyboard shortcuts for rapid navigation
### Additional
- Freeze/thaw: snapshot tool catalog for instant startup without MCP connections
- SKILL.md generation: auto-document all tools for AI agent discovery
- 192 tests across 8 test files
## Installation

LiteCLI is installed automatically by the Lite Suite installer. To install manually:

```shell
cd C:/Apps/LiteCLI
uv sync
```

Verify:

```shell
litecli --version
# litecli v0.1.0
```
## Usage

### Connecting MCP Servers

```shell
# HTTP backend
litecli admin add \
  --id kuroryuu \
  --type http \
  --prefix kuro \
  --url http://127.0.0.1:8100/mcp \
  --description "Kuroryuu MCP Core"

# Stdio backend
litecli admin add \
  --id litemcp \
  --type stdio \
  --prefix lite \
  --command python \
  --args "-m,litemcp.server"

# Discover what's available
litecli admin discover
```
### Running Auto-Generated Commands

After registering a backend, its tools become CLI commands:

```shell
litecli kuro sots-rag --query "authentication patterns" --top-k 5
litecli kuro sots-run-pwsh --command "Get-Process"
litecli --json kuro sots-agents-health
```
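For scripts, the `--json` flag pairs naturally with a small wrapper. A minimal sketch (the helper name is an assumption), assuming `litecli` is on PATH and a backend is registered:

```python
# Sketch: call a LiteCLI command with --json and parse the result.
import json
import shlex
import subprocess

def run_tool(cmd: str) -> dict:
    """Run a CLI command and parse its stdout as JSON."""
    proc = subprocess.run(shlex.split(cmd), capture_output=True,
                          text=True, check=True)
    return json.loads(proc.stdout)

# e.g. health = run_tool("litecli --json kuro sots-agents-health")
```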
### Compiling Skills

**AI-assisted (recommended):** In Claude Code, invoke `/compile-skill`:

```shell
/compile-skill ~/.claude/skills/gen-image-or-video
```

Claude reads the skill, identifies APIs, generates the CLI, sets up credentials.

**Automatic (simple skills):**

```shell
litecli compile ~/.claude/skills/gen-image-or-video/SKILL.md
```
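To give a feel for what automatic compilation has to recover, here is a toy regex pass over skill text. The patterns are illustrative assumptions, not LiteCLI's actual heuristics:

```python
# Toy sketch: pull endpoint URLs and likely auth env vars from a SKILL.md.
import re

def extract_api_facts(skill_md: str) -> dict:
    urls = re.findall(r"https?://[^\s)\"'`]+", skill_md)
    env_keys = re.findall(r"\b[A-Z][A-Z0-9_]*(?:_API_KEY|_TOKEN)\b", skill_md)
    return {"urls": sorted(set(urls)), "env_keys": sorted(set(env_keys))}

sample = "POST https://api.example.com/v1/images with NOTION_API_KEY in the header"
facts = extract_api_facts(sample)
```

When a skill mixes several endpoints or documents auth only in prose, this kind of extraction becomes ambiguous, which is why complex skills are routed to the AI-assisted path.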
### Using Compiled CLIs

```shell
# Direct execution
python ~/.litecli/compiled/genimage.py --prompt "sunset" --output hero.png

# Through LiteCLI
litecli compiled run genimage -- --prompt "sunset" --output hero.png

# Local LLM mode (no cloud API)
python ~/.litecli/compiled/genimage.py --prompt "sunset" --output hero.png --local

# List all compiled CLIs
litecli compiled list
```
### Launching the TUI

```shell
litecli
# No subcommand = launches the Textual TUI
```
## Pre-Built Compilations

LiteCLI ships with compilations for 5 Lite Suite skills:
| CLI | API | What It Does |
|-----|-----|-------------|
| genimage | Google Gemini | Generate AI images with model/aspect/resolution control |
| genvideo | Google Veo | Generate AI videos with duration/resolution options |
| yt | yt-dlp | Fetch YouTube transcripts by URL |
| videolens | LM Studio / Claude | Extract video frames and caption with VLM |
| liteimage | LiteImage (local) | Generate images via local Stable Diffusion |
Build all with:

```shell
uv run python compilations/compile_all_skills.py
```
## OpenClaw CLI Pack

LiteCLI is also available as an open-source compiler with a pre-built pack of CLIs compiled from OpenClaw agent skills.

Run via npm:

```shell
npx litecli-pack weather --location "NYC"
npx litecli-pack notion --help
npx litecli-pack trello --help
```
9 pre-compiled commands:
| Command | Description | Auth |
|---------|-------------|------|
| weather | Weather forecasts via wttr.in / Open-Meteo | None |
| notion | Notion pages, databases, and blocks | NOTION_API_KEY |
| trello | Trello boards, lists, and cards | TRELLO_API_KEY + TRELLO_TOKEN |
| openai-image-gen | Batch image generation (OpenAI Images API) | OPENAI_API_KEY |
| openai-whisper | Local speech-to-text (Whisper CLI) | None |
| openai-whisper-api | Cloud speech-to-text (OpenAI API) | OPENAI_API_KEY |
| video-frames | Extract frames/clips from video (ffmpeg) | None |
| canvas | Display HTML on OpenClaw nodes | None |
| coding-agent | Delegate tasks to Codex/Claude Code | None |
Open source: github.com/ahostbr/LiteCLI

The compiler can also be installed standalone via PyPI:

```shell
pip install litecli-compiler
litecli compile path/to/SKILL.md
```
## Configuration

LiteCLI uses a `backends.json` file for MCP backend configuration:

```json
{
  "backends": {
    "kuroryuu": {
      "type": "http",
      "prefix": "kuro",
      "url": "http://127.0.0.1:8100/mcp",
      "timeout": 30,
      "description": "Kuroryuu MCP Core",
      "enabled": true
    }
  }
}
```

Override the config path with the `LITECLI_CONFIG` env var or the `--config` flag.
## Requirements

- Python 3.10 or later
- uv package manager
- For MCP backends: a running MCP server (HTTP or stdio)
- For `--local` mode: LM Studio running with a loaded model
- For credential storage: OS keyring support (built into Windows, macOS, most Linux desktops)
## Troubleshooting

**"No backends configured" on first run.**
Register at least one backend with `litecli admin add`. See the Usage section above.

**`litecli compile` says "too complex for automatic compilation."**
The skill has too many URLs or ambiguous API patterns for regex extraction. Use `/compile-skill` in Claude Code for AI-assisted compilation instead.

**Compiled CLI prompts for the API key every time.**
The OS keyring may not be available. Install keyring extras for your platform, or set the API key as an environment variable as a fallback.

**`--local` flag says "Local LLM not available."**
Start LM Studio and load a model. LiteCLI connects to http://169.254.83.107:1234 by default. Override with the `LITECLI_LLM_URL` env var.
