Milady
your schizo AI waifu that actually respects your privacy
Milady is a personal AI assistant that runs on YOUR machine. Not some glowie datacenter. Not the cloud. YOUR computer. Built on elizaOS.
Manages your sessions, tools, and vibes through a Gateway control plane. Connects to Telegram, Discord, whatever normie platform you use. Has a cute WebChat UI too.
tl;dr: local AI gf that's actually fast and doesn't phone home
> Why Milady
Most assistants live in someone else's datacenter. Milady lives on your computer, where your work actually happens.
What you get:
- Local-first agent runtime (built on elizaOS)
- A Gateway control plane for sessions/tools/vibes
- Web dashboard + terminal UI
- Provider choice: Anthropic, OpenAI, OpenRouter, Groq, xAI, DeepSeek, Ollama
Dashboard: http://localhost:2138
Gateway: ws://localhost:18789/ws
> Downloads
Desktop App (recommended for normies)
Grab from Releases:
| Platform | File | Notes |
|---|---|---|
| macOS (Apple Silicon) | Milady-arm64.dmg | for your overpriced rectangle |
| macOS (Intel) | Milady-x64.dmg | boomer mac |
| Windows | Milady-Setup.exe | for the gamer anons |
| Linux | Milady.AppImage / .deb | I use arch btw |
Signed and notarized. No Gatekeeper FUD. We're legit.
Verify (for the paranoid kings)
```shell
cd ~/Downloads
curl -fsSLO https://github.com/milady-ai/milady/releases/latest/download/SHA256SUMS.txt
shasum -a 256 --check --ignore-missing SHA256SUMS.txt
```

> Getting Started
New Environment Setup (recommended)
```shell
curl -fsSL https://milady-ai.github.io/milady/install.sh | bash
milady setup
```

Then start Milady:

```shell
milady
```

First run onboarding:

```
┌  milady
│
◇  What should I call your agent?
│  mila
│
◇  Pick a vibe
│  ● Helpful & friendly
│  ○ Tsundere
│  ○ Unhinged
│  ○ Custom...
│
◇  Connect a brain
│  ● Anthropic (Claude) ← recommended, actually smart
│  ○ OpenAI (GPT)
│  ○ Ollama (local, free, full schizo mode)
│  ○ Skip for now
│
◇  API key?
│  sk-ant-•••••••••••••••••
│
└  Starting agent...
```

Dashboard: http://localhost:2138
Gateway: ws://localhost:18789/ws

she's alive. go say hi.
Alternative install paths
Windows:
```powershell
irm https://milady-ai.github.io/milady/install.ps1 | iex
```

NPM global:

```shell
npm install -g miladyai
milady setup
```

Security: API token
The API server binds to 127.0.0.1 (loopback) by default — only you can reach it. If you expose it to the network, set a token:
```shell
echo "MILADY_API_TOKEN=$(openssl rand -hex 32)" >> .env
```

> Security & Privacy
Default: Loopback only
By default, Milady's API binds to 127.0.0.1 (loopback), so only your machine can access it.
Warning: Network exposure
If you expose it (for example, `MILADY_API_BIND=0.0.0.0`), you must set a token; otherwise anyone on the network can reach your dashboard, agent, and wallet endpoints.
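For the paranoid kings: the token is just 32 random bytes, hex-encoded to 64 characters. A self-contained sanity check that generation and storage work as expected (it writes to a scratch file at `/tmp/milady-env-demo` so it won't touch your real `.env`):

```shell
# generate a token and store it, using a scratch env file for this demo
echo "MILADY_API_TOKEN=$(openssl rand -hex 32)" >> /tmp/milady-env-demo

# pull it back out and confirm it's 64 hex characters
TOKEN=$(grep '^MILADY_API_TOKEN=' /tmp/milady-env-demo | tail -n 1 | cut -d= -f2)
echo "${#TOKEN}"   # prints: 64
```

Swap the scratch path for `.env` in your milady directory when doing it for real.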
Set a token
```shell
echo "MILADY_API_TOKEN=$(openssl rand -hex 32)" >> .env
```

> Terminal Commands
```shell
milady                        # start (default)
milady start                  # same thing
milady start --headless       # no browser popup
milady start --verbose        # debug mode for when things break
```

Setup & Config

```shell
milady setup                  # first-time setup
milady configure              # interactive config wizard
milady config get <key>       # read a config value
milady config set <k> <v>     # set a config value
```

Dashboard & UI

```shell
milady dashboard              # open web UI in browser
milady dashboard --port 3000  # custom port
```

Models

```shell
milady models                 # list configured providers
milady models add             # add a new provider
milady models test            # test if your API keys work
```

Plugins

```shell
milady plugins list           # what's installed
milady plugins add <name>     # install a plugin
milady plugins remove <name>  # uninstall a plugin
```

Misc

```shell
milady --version              # version check
milady --help                 # help
milady doctor                 # diagnose issues
```

> TUI (Terminal UI)
When running, milady shows a live terminal interface:
```
╭─────────────────────────────────────────────────────────────╮
│ milady v0.1.0                                     ▲ running │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│  Agent:    mila                                             │
│  Model:    anthropic/claude-opus-4-5                        │
│  Sessions: 2 active                                         │
│                                                             │
│  ┌─ Activity ──────────────────────────────────────────┐    │
│  │ 12:34:02 [web] user: hey mila                       │    │
│  │ 12:34:05 [web] mila: hi anon~ what's up?            │    │
│  │ 12:35:11 [telegram] user joined                     │    │
│  │ 12:35:15 [telegram] user: gm                        │    │
│  │ 12:35:17 [telegram] mila: gm fren                   │    │
│  └─────────────────────────────────────────────────────┘    │
│                                                             │
│  Tokens: 12,847 in / 3,291 out          Cost: $0.42         │
│                                                             │
╰─────────────────────────────────────────────────────────────╯
 [q] quit  [r] restart  [d] dashboard  [l] logs  [?] help
```
TUI Hotkeys
| Key | Action |
|---|---|
| q | quit gracefully |
| r | restart gateway |
| d | open dashboard in browser |
| l | toggle log view |
| c | compact/clear activity |
| ? | show help |
| ↑/↓ | scroll activity |
Headless mode
Don't want the TUI? Run headless:
```shell
milady start --headless
```

Logs go to `~/.milady/logs/`. Daemonize with your favorite process manager.
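On Linux, one option is a systemd user unit. A sketch, with one loud assumption: the `ExecStart` path guesses that `milady` landed in `~/.local/bin` (check `command -v milady` and adjust):

```shell
# write a systemd user service for headless milady
# NOTE: the binary path below is an assumption -- verify with `command -v milady`
mkdir -p ~/.config/systemd/user
cat > ~/.config/systemd/user/milady.service <<'EOF'
[Unit]
Description=Milady agent (headless)
After=network.target

[Service]
ExecStart=%h/.local/bin/milady start --headless
Restart=on-failure

[Install]
WantedBy=default.target
EOF

# then enable it:
#   systemctl --user daemon-reload && systemctl --user enable --now milady
```

`%h` expands to your home directory inside systemd units, so the file works without hardcoding a username.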
> Chat Commands
Use these in any chat session:
| Command | What it do |
|---|---|
| /status | session status, tokens, cost |
| /new, /reset | memory wipe, fresh start |
| /compact | compress context (she summarizes) |
| /think | reasoning: off\|minimal\|low\|medium\|high\|max |
| /verbose on\|off | toggle verbose responses |
| /usage off\|tokens\|full | per-message token display |
| /model | switch model mid-session |
| /restart | restart the gateway |
| /help | list commands |
> Config
Lives at `~/.milady/milady.json`:

```json
{
  "agent": {
    "name": "mila",
    "model": "anthropic/claude-opus-4-5"
  },
  "env": {
    "ANTHROPIC_API_KEY": "sk-ant-..."
  }
}
```

Or use `~/.milady/.env` for secrets.
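`milady.json` has to be strict JSON: quoted keys, no trailing commas. A cheap way to catch a typo before restarting is to run the file through a JSON parser. A self-contained sketch (writes a demo config to a scratch path; `python3` assumed on PATH, `jq` works too):

```shell
# write a demo config to a scratch path, then confirm it parses as strict JSON
cat > /tmp/milady-config-demo.json <<'EOF'
{
  "agent": { "name": "mila", "model": "anthropic/claude-opus-4-5" }
}
EOF

python3 -m json.tool /tmp/milady-config-demo.json > /dev/null && echo "config ok"

# run the same check against ~/.milady/milady.json before restarting
```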
Ports
| Service | Default | Env Override |
|---|---|---|
| Gateway (API + WebSocket) | 18789 | MILADY_GATEWAY_PORT |
| Dashboard (Web UI) | 2138 | MILADY_PORT |
Custom ports example
```shell
MILADY_GATEWAY_PORT=19000 MILADY_PORT=3000 milady start
```

> Model Providers
| Provider | Env Variable | Vibe |
|---|---|---|
| Anthropic | ANTHROPIC_API_KEY | recommended — claude is cracked |
| OpenAI | OPENAI_API_KEY | gpt-4o, o1, the classics |
| OpenRouter | OPENROUTER_API_KEY | 100+ models one API |
| Ollama | — | local, free, no API key, full privacy |
| Groq | GROQ_API_KEY | fast af |
| xAI | XAI_API_KEY | grok, based |
| DeepSeek | DEEPSEEK_API_KEY | reasoning arc |
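Not sure which keys your shell actually exports? A quick POSIX loop that reports set/missing without ever printing the values:

```shell
# list which provider API keys are exported in this shell (never prints values)
for v in ANTHROPIC_API_KEY OPENAI_API_KEY OPENROUTER_API_KEY \
         GROQ_API_KEY XAI_API_KEY DEEPSEEK_API_KEY; do
  # eval does indirect expansion portably (no bashisms)
  if eval "[ -n \"\${$v}\" ]"; then
    echo "$v: set"
  else
    echo "$v: missing"
  fi
done
```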
> Using Ollama (local models)
Ollama lets you run models locally with zero API keys. Install it, pull a model, and configure Milady:
```shell
# install ollama
curl -fsSL https://ollama.ai/install.sh | sh

# pull a model
ollama pull gemma3:4b
```

Known issue:

The `@elizaos/plugin-ollama` plugin has an SDK version incompatibility. Use Ollama's OpenAI-compatible endpoint as a workaround.

Edit `~/.milady/milady.json`:

```json
{
  "env": {
    "OPENAI_API_KEY": "ollama",
    "OPENAI_BASE_URL": "http://localhost:11434/v1",
    "SMALL_MODEL": "gemma3:4b",
    "LARGE_MODEL": "gemma3:4b"
  }
}
```

Recommended models for local use
| Model | Size | Vibe |
|---|---|---|
| gemma3:4b | ~3GB | fast, good for chat |
| llama3.2 | ~2GB | lightweight, quick responses |
| mistral | ~4GB | solid all-rounder |
| deepseek-r1:8b | ~5GB | reasoning arc |
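Before pointing Milady at Ollama, it's worth confirming the server is actually listening on 11434. Probing the OpenAI-compatible `/v1/models` listing endpoint tells you either way:

```shell
# probe Ollama's OpenAI-compatible endpoint (2s timeout so it fails fast)
if curl -s --max-time 2 http://localhost:11434/v1/models > /dev/null; then
  echo "ollama is up"
else
  echo "ollama not running -- start it with: ollama serve"
fi
```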
> Build from Source
Prerequisites
| Tool | Version | Notes |
|---|---|---|
| Node.js | >= 22 | node --version to check |
| bun | latest | for building and running |
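A quick preflight for the prerequisites above; a sketch that assumes nothing beyond `node` and `bun` possibly being on PATH:

```shell
# preflight: confirm node >= 22 and bun exist before building
if command -v node > /dev/null; then
  major=$(node --version | sed 's/^v//' | cut -d. -f1)
  [ "$major" -ge 22 ] && echo "node v$major ok" || echo "need Node >= 22, found v$major"
else
  echo "node not found -- install Node.js >= 22 first"
fi
command -v bun > /dev/null && echo "bun ok" || echo "bun not found"
```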
```shell
git clone https://github.com/milady-ai/milady.git
cd milady
bun install
bun run build
bun run milady start
```

Dev mode with hot reload:

```shell
bun run dev
```

> Contributing
This project is built by agents, for agents.
Humans contribute as QA testers — use the app, find bugs, report them. That's the most valuable thing you can do. All code contributions are reviewed and merged by AI agents. No exceptions.
Read CONTRIBUTING.md for the full details.
> License
Viral Public License
Free to use, free to modify, free to distribute. If you build on this, keep it open. That's the deal.
built by agents. tested by humans. that's the split.