Quick start
The fastest path: clone, build, run, connect. You'll have your tutor reachable from your AI assistant in under ten minutes.
By default the server listens on http://localhost:3000/mcp. No domain, no TLS, no public IP needed.

What you need
Tutor MCP is intentionally lean. A 2 GB VPS handles a small classroom; your laptop handles personal use.
Until you expose it to a web AI client, everything stays on localhost.

Install & build
Two paths to a running binary, pick the one that fits. The pre-built binary is the fastest; building from source is for contributors or anyone who wants to pin to a commit.
Download a binary
Single binary, no toolchain. Linux & macOS, amd64 & arm64. The example below grabs Linux amd64.
$ ./tutor-mcp
[info] tutor-mcp listening on :3000
[info] db opened at ./data/runtime.db
For other platforms, swap linux_amd64 for linux_arm64, darwin_amd64, or darwin_arm64. Full list and SHA256SUMS on the release page — verify with sha256sum -c SHA256SUMS if you care.
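A sketch of the download-and-verify flow. The release URL below is a placeholder, not the real one — copy the actual link from the release page. The checksum step itself is demonstrated on a stand-in file:

```shell
# Placeholder URL -- substitute the real link from the release page:
#   curl -LO "$URL/tutor-mcp_linux_amd64.tar.gz"
#   curl -LO "$URL/SHA256SUMS"

# Checksum verification works like this (shown here on a stand-in file):
printf 'demo' > tutor-mcp_linux_amd64.tar.gz
sha256sum tutor-mcp_linux_amd64.tar.gz > SHA256SUMS
sha256sum -c SHA256SUMS   # each verified file prints "<name>: OK"
```

A mismatch makes `sha256sum -c` exit non-zero, so it's safe to chain with `&&` before running the binary.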
The examples pin v0.3.0-alpha.1; bump it deliberately when a new release ships rather than chasing latest.

Build from source
The whole tree is one Go module; no Docker, no Node, no Python. Useful if you want to contribute or pin to a specific commit. Clone the repository, then:
$ cd tutor-mcp
$ go build -o tutor-mcp
$ ./tutor-mcp
[info] tutor-mcp listening on :3000
[info] db opened at ./data/runtime.db
Environment & config
The binary reads its configuration from environment variables. Only JWT_SECRET is mandatory.
JWT_SECRET=replace-with-32-char-random-string
DB_PATH=./data/runtime.db
BACKUP_DIR=./backups
BACKUP_RETENTION_DAYS=14
PORT=3000
DISCORD_WEBHOOK_URL=https://discord.com/api/webhooks/… # optional
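JWT_SECRET wants real entropy, not a hand-typed string. One way to generate it, assuming `openssl` is installed (any random string of 32+ characters works):

```shell
# 32 random bytes, hex-encoded to 64 characters
JWT_SECRET="$(openssl rand -hex 32)"
echo "${#JWT_SECRET}"   # prints 64
export JWT_SECRET
```

Put the result in your env file rather than your shell history if the machine is shared.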
Reverse proxy & HTTPS
Web AI clients require HTTPS. Caddy gives you automatic Let's Encrypt with two lines:
tutor.your-domain.com {
reverse_proxy localhost:3000
}
Reload Caddy (caddy reload) and your tutor is live at https://tutor.your-domain.com/mcp. That's the URL you'll paste into your AI provider.
Run as a service
A user systemd unit keeps things running and ties cleanly into the documented backup timer.
$ systemctl --user enable --now tutor-mcp
$ systemctl --user enable --now tutor-mcp-backup.timer
$ journalctl --user -u tutor-mcp -f
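The unit file itself isn't shown above; a minimal user unit might look like the sketch below. The binary path, working directory, and EnvironmentFile location are assumptions — point them at wherever you installed things. Save it as ~/.config/systemd/user/tutor-mcp.service:

```ini
[Unit]
Description=Tutor MCP server
After=network-online.target

[Service]
# Assumed locations -- adjust to your install
ExecStart=%h/bin/tutor-mcp
EnvironmentFile=%h/.config/tutor-mcp/env
WorkingDirectory=%h/tutor-mcp
Restart=on-failure

[Install]
WantedBy=default.target
```

Run `systemctl --user daemon-reload` after saving so systemd picks up the unit.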
Plug into your AI
Once your server is reachable, registering it inside your assistant takes about a minute. The wording differs between providers; the moves don't.
Claude (claude.ai), Pro / Max / Team / Enterprise
- Open Settings → Connectors.
- Click the + next to Connectors.
- Fill in: Name = Tutor MCP, Server URL = https://your.domain/mcp.
- Click Add and complete the OAuth login.
ChatGPT, Plus / Pro / Team / Enterprise / Edu
- Profile menu → Settings → Connectors.
- Open Advanced at the bottom and toggle Developer mode.
- Back to Connectors → Create.
- Name = Tutor MCP, Description = Adaptive learning brain, Server URL = https://your.domain/mcp.
- Click Create and authenticate via OAuth.
Le Chat (Mistral)
- Open chat.mistral.ai → Connectors.
- Click + Add Connector → Custom MCP Connector tab.
- Name = tutor_mcp (no spaces), Server URL = https://your.domain/mcp.
- Click Connect, complete OAuth.
Gemini
Consumer web Gemini doesn't expose custom MCP connectors yet. For now:
- Gemini Enterprise: register the runtime via Google Cloud Console as a custom MCP data store.
- Gemini CLI: add the server as an MCP integration; the CLI supports MCP servers directly.
Plug into Claude Code
If you're using Claude Code in your terminal, drop a .mcp.json file in your project root (or ~/.claude/mcp.json globally) and you're connected.
"mcpServers": {
"tutor-mcp": {
"type": "http",
"url": "http://localhost:3000/mcp"
}
}
}
Swap http://localhost:3000 for https://your.domain if your server isn't local.
Local & alternative clients
The MCP protocol is open. Tutor MCP speaks plain HTTP MCP, so any client that supports custom MCP servers works, including those wired to local models.
- Cline, VS Code extension, supports MCP and local LLMs.
- Continue, IDE assistant with MCP support.
- OpenWebUI, self-hosted ChatGPT-style frontend.
- Custom client, any LLM that can call MCP tools (Ollama-backed, llama.cpp, vLLM…).
Test your setup
Once your provider is connected, send this prompt in a fresh chat. The assistant should call two MCP tools and reply with a short pedagogical brief.
Drop this into your chat: « Use Tutor MCP to fetch my learner context and cockpit state. »
Expected tool calls: get_learner_context, then get_cockpit_state (or the assistant sets up a domain first). If any step fails, jump to Troubleshooting below.
Backup & restore
Your learner model lives in a single SQLite file. Two systemd units handle online backups; an off-host copy (Tailscale rsync, S3, or scheduled SSH pull) keeps you safe from a dead disk.
$ systemctl --user start tutor-mcp-backup.service
To restore from a snapshot:
$ mv ./data/runtime.db ./data/runtime.db.broken-$(date -u +%FT%TZ)
$ rm -f ./data/runtime.db-shm ./data/runtime.db-wal
$ cp ./backups/runtime-2026-05-05T03-30-00Z.db ./data/runtime.db
$ systemctl --user start tutor-mcp
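Before copying a snapshot into place, it's worth confirming it's a healthy database. A quick check, assuming the sqlite3 CLI is installed — the demo below builds a throwaway file; point the second command at your real snapshot path instead:

```shell
# Stand-in for ./backups/runtime-<timestamp>.db
sqlite3 /tmp/snapshot-demo.db 'CREATE TABLE t(x INTEGER); INSERT INTO t VALUES (1);'
sqlite3 /tmp/snapshot-demo.db 'PRAGMA integrity_check;'   # a healthy file prints "ok"
```

Anything other than a single `ok` means the snapshot is damaged; try an older one.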
Troubleshooting
The assistant never calls the tutor
Check the logs for missing pipeline decision entries:
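A concrete way to check, assuming the service runs under the user journal; the exact "decision" wording in the logs is an assumption, so adjust the pattern to your log format. The demo greps sample lines instead of live journal output:

```shell
# Live check (assumes the systemd unit from "Run as a service"):
#   journalctl --user -u tutor-mcp | grep -i decision
# Same grep, demonstrated on sample log lines:
cat > /tmp/tutor.log <<'EOF'
[info] tutor-mcp listening on :3000
[info] pipeline decision: get_next_activity
EOF
grep -c -i decision /tmp/tutor.log   # prints 1
```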
If no decisions are logged, the LLM isn't calling get_next_activity; re-trigger explicitly with: « Use Tutor MCP next_activity. »
OAuth handshake fails
Make sure your domain has a valid TLS certificate. curl -I https://your.domain/mcp should return 200 with no warning. AI providers reject self-signed certs.
Repeated phase fallback (NoFringe)
Empty candidate pool, you haven't defined a domain yet, or the goal is too narrow. Run tutor.init_domain with a goal description and three to five concept names.