Test coverage: 43.635% 😒👍
clai
(/klaɪ/, like "cli" in "climate") is a command-line context feeder for any AI task.
- MCP client support - Add any MCP server you'd like by simply pasting their configuration.
- Vendor agnosticism - Use any functionality in Clai with most LLM vendors interchangeably.
- Conversations - Create, manage and continue conversations.
- Rate limit circumvention - Automatically summarize + recall complex tasks.
- Profiles - Pre-prompted profiles enabling customized workflows and agents.
- Unix-like - Clai follows the unix philosophy and works seamlessly with data piped in and out.
All of these features can be combined and tweaked, supporting a wide range of workflows. See the examples for additional info.
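For instance, the Unix-style piping pairs naturally with a plain query. The snippet below is a minimal sketch: it assumes the `query` subcommand shown in the examples, and `main.go` is just a placeholder file:

```sh
# Feed a file in as context and ask about it; the reply is printed to stdout
cat main.go | clai query "What does this program do?"
```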
| Vendor | Environment Variable | Models |
| --- | --- | --- |
| OpenAI | `OPENAI_API_KEY` | Text models, photo models |
| Anthropic | `ANTHROPIC_API_KEY` | Text models |
| Mistral | `MISTRAL_API_KEY` | Text models |
| Deepseek | `DEEPSEEK_API_KEY` | Text models |
| Novita AI | `NOVITA_API_KEY` | Text models, use prefix `novita:` |
| Ollama | N/A | Use format `ollama:` (defaults to llama3); server defaults to `localhost:11434` |
| Inception | `INCEPTION_API_KEY` | Text models |
| xAi | `XAI_API_KEY` | Text models |
| Gemini | `GEMINI_API_KEY` | Text models |
```sh
go install github.com/baalimago/clai@latest
```
You may also use the setup script:
```sh
curl -fsSL https://raw.githubusercontent.com/baalimago/clai/main/setup.sh | sh
```
Either look at `clai help` or the examples for how to use clai.
If you have time, you can also check out this blog post for a slightly more structured introduction to using Clai efficiently.
Install Glow for formatted markdown output when querying text responses.
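Glow is a standalone Go tool from Charmbracelet, so one way to get it is the same `go install` route used for clai (this is the standard Glow install path, not something specific to this project):

```sh
# Install Glow so markdown in text responses is rendered nicely
go install github.com/charmbracelet/glow@latest
```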