pi-mono: TypeScript Full-Stack AI Agent Toolkit
Why pi-mono Exists
The dominant AI agent tooling is Python-based: LangChain, LlamaIndex, Dify, CrewAI. But if you’re building products in the TypeScript / Node.js ecosystem and want AI capabilities, the options are comparatively limited.
Most teams end up with:
- Backend: Go / Node.js / Java
- Frontend: React / Flutter
- AI integration: either direct OpenAI API calls or a Python microservice layer
pi-mono tries to solve this. It’s a TypeScript monorepo offering a complete toolkit from low-level LLM API to high-level coding agent.
Architecture Overview
```
pi-mono/
├── packages/
│   ├── ai/           # Unified LLM API
│   ├── agent/        # Agent runtime + tool calling
│   ├── coding-agent/ # Interactive coding agent CLI
│   ├── tui/          # Terminal UI library
│   ├── web-ui/       # Web chat components
│   ├── mom/          # Slack bot
│   └── pods/         # vLLM deployment CLI
```

Core Components
1. @mariozechner/pi-ai — Unified Multi-Provider LLM API
```typescript
import { createAI } from "@mariozechner/pi-ai";

// One interface, switch providers
const ai = createAI({
  provider: "openai", // or "anthropic", "google"
  model: "gpt-4o",
  apiKey: process.env.OPENAI_API_KEY,
});

// Unified API regardless of provider
const response = await ai.complete("Hello, world");
```

Supported providers:
- OpenAI (GPT-4o, GPT-4o-mini, etc.)
- Anthropic (Claude 3.5 Sonnet, Opus, etc.)
- Google (Gemini 1.5, etc.)
Value: If your product needs to switch between models (cost optimization, regional compliance), a unified API layer reduces lock-in.
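As a concrete illustration, that switch can live in a small routing helper that picks a provider config per request. This is a sketch in plain TypeScript: the config shape mirrors the article's `createAI` example above, and the model choices and routing policy are assumptions for illustration, not recommendations.

```typescript
// Hypothetical routing policy: pick a provider config based on the
// task profile, then hand the result to createAI() as shown above.
type Profile = "cheap" | "quality";

interface ProviderConfig {
  provider: "openai" | "anthropic" | "google";
  model: string;
  apiKey?: string;
}

function pickProvider(profile: Profile): ProviderConfig {
  switch (profile) {
    case "cheap":
      // Small, inexpensive model for bulk/background work
      return {
        provider: "openai",
        model: "gpt-4o-mini",
        apiKey: process.env.OPENAI_API_KEY,
      };
    case "quality":
      // Larger model for user-facing answers
      return {
        provider: "anthropic",
        model: "claude-3-5-sonnet",
        apiKey: process.env.ANTHROPIC_API_KEY,
      };
  }
}

// const ai = createAI(pickProvider("cheap"));
```

Because the call sites only see the unified API, swapping the policy (or adding a region-based rule) touches one function instead of every integration point.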
2. @mariozechner/pi-agent-core — Agent Runtime
This is the core. pi-agent-core implements:
- Tool calling: gives agents the ability to invoke your functions
- State management: persists conversation state across turns
- Streaming: streams output token by token
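Of these, streaming is the easiest to picture as a plain async-generator loop. The sketch below is illustrative TypeScript only, not pi-agent-core's actual streaming API (which I have not verified); it just shows the consumption pattern such support enables.

```typescript
// Illustrative only: a token stream modeled as an AsyncIterable<string>.
async function* fakeTokenStream(text: string): AsyncGenerator<string> {
  for (const token of text.split(" ")) {
    yield token + " ";
  }
}

// Consume tokens as they arrive instead of waiting for the full reply.
async function collectStream(stream: AsyncIterable<string>): Promise<string> {
  let out = "";
  for await (const token of stream) {
    out += token; // e.g. write each token to a TUI or SSE response here
  }
  return out.trimEnd();
}
```

The same `for await` loop works whether tokens come from a fake generator or a real provider stream, which is what makes streaming composable with the TUI and web-ui packages.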
```typescript
import { readFileSync } from "node:fs";
import { createAgent } from "@mariozechner/pi-agent-core";

const agent = createAgent({
  ai: openAIInstance,
  tools: [
    {
      name: "read_file",
      description: "Read file contents",
      parameters: {
        path: "string",
      },
      execute: async ({ path }) => {
        return readFileSync(path, "utf-8");
      },
    },
  ],
});

// The agent decides on its own when to call tools
const result = await agent.run("Read package.json to check dependencies");
```

3. @mariozechner/pi-coding-agent — Interactive Coding Agent CLI
A developer tool. Install it and chat with AI directly in your terminal for code-related tasks.
```bash
# Install
npm install -g @mariozechner/pi-coding-agent

# Run
pi-agent

# Then chat
> add dark mode support to this React component
```

4. @mariozechner/pi-pods — vLLM Deployment CLI
For self-hosting open-source models, pi-pods manages vLLM deployments on GPU pods via CLI.
```bash
# Deploy a vLLM instance
pi-pods deploy --model mistralai/Mistral-7B-Instruct-v0.2

# Check status
pi-pods list
```

Use case: running open-source models on your own GPUs behind an API interface.
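Once a pod is up, it can be called like any hosted provider, since vLLM serves an OpenAI-compatible HTTP API. The helper below is a sketch: the `localhost:8000` URL is vLLM's default serve address (an assumption about your deployment), and the payload follows the OpenAI chat-completions format.

```typescript
// Build an OpenAI-compatible chat request for a self-hosted vLLM pod.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    url: "http://localhost:8000/v1/chat/completions", // vLLM's default serve address
    body: {
      model,      // must match the model the pod was deployed with
      messages,
      max_tokens: 256,
    },
  };
}

// Usage (network call commented out; requires a running pod):
// const req = buildChatRequest("mistralai/Mistral-7B-Instruct-v0.2",
//   [{ role: "user", content: "Hello" }]);
// const res = await fetch(req.url, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(req.body),
// });
```

This is also why a unified client layer like pi-ai pairs naturally with self-hosting: the same chat-completions shape works against OpenAI and against your own pod.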
Comparison with LangChain.js
| | LangChain.js | pi-mono |
|---|---|---|
| Scope | General-purpose framework | Full-stack toolkit |
| Package size | Large | Pick what you need |
| Provider coverage | Broad | Mainstream (OpenAI, Anthropic, Google) |
| Coding agent | No | Yes (pi-coding-agent) |
| TUI/Web UI | No | Yes |
| vLLM support | No | Yes (pi-pods) |
| Learning curve | Steep (many concepts) | Gentle (modular, independently usable packages) |
Practical Use Cases
Scenario 1: Add AI to Node.js Backend
An Express service that needs AI to analyze user input:
```typescript
import { createAI } from "@mariozechner/pi-ai";
import { createAgent } from "@mariozechner/pi-agent-core";

const ai = createAI({ provider: "anthropic", model: "claude-3-5-sonnet" });
const agent = createAgent({
  ai,
  tools: [yourTools],
});

app.post("/analyze", async (req, res) => {
  const result = await agent.run(req.body.userInput);
  res.json({ result });
});
```

Scenario 2: Internal Coding Agent
Set up a coding agent CLI for your team to manipulate codebases via natural language:
```bash
# Developer runs locally
pi-agent

> add unit tests for the payment module
> refactor this function to use async/await
> explain what this regex does
```

Limitations
- Not a visual platform like Dify/CrewAI: pi-mono is a codebase—you write TypeScript to integrate
- Provider coverage limited: Doesn’t cover all LLM providers (no Cohere, Azure OpenAI, etc.)
- Production validation: fewer production deployments to point to than LangChain
Installation
```bash
# Full install
npm install @mariozechner/pi-ai @mariozechner/pi-agent-core ...

# Or per-package
npm install @mariozechner/pi-ai
```

Requires Node.js 20+.
Conclusion
pi-mono’s value proposition: TypeScript-native + full-stack coverage + modular packages.
If you’re in the TypeScript ecosystem and want to add AI capabilities without a Python microservice, pi-mono is worth exploring. Its modular design lets you use just pi-ai for API unification, or go full-stack with a complete coding agent setup.