Welcome to Metrx — The P&L for Your AI Workforce. This documentation covers everything you need to understand, integrate with, and deploy Metrx.
Metrx is a unified platform for tracking AI agent costs, outcomes, and ROI across your organization. It includes:
- Gateway (Cloudflare Worker) — High-performance LLM proxy that handles authentication, cost calculation, and event logging
- Web Dashboard (Next.js) — Real-time insights, team management, billing, and outcome tracking
- Background Workers (BullMQ) — Asynchronous event processing and data synchronization
- SDK (@metrxbot/sdk) — Easy integration for your applications
- Enterprise Stack — Supabase (database), Clerk (auth), Stripe (billing)
Quick Navigation
For API Integration
- API Reference — Complete endpoint documentation with examples
- Integration Guide — Step-by-step setup and usage
For Infrastructure
- Architecture — System design, data flows, and components
- Self-Hosting Guide — Run Metrx on your infrastructure
Additional Resources
- Changelog — Release notes and version history
Getting Started
1. Get an API Key
Sign up at app.metrxbot.com and generate an API key from your organization settings. Your API key grants access to the Gateway.
2. Choose Your Integration Method
Option A: Use the Gateway (Recommended)

The Gateway is a drop-in replacement for your existing LLM API calls, so you can proxy through Metrx with minimal code changes:

```bash
# Instead of calling OpenAI directly:
curl https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer sk_openai_xxxx" \
  ...

# Route through Metrx:
curl https://gateway.metrxbot.com/v1/chat/completions \
  -H "Authorization: Bearer al_xxxx" \
  -H "X-Provider-Key: sk_openai_xxxx" \
  ...
```

Option B: Use the SDK

For advanced use cases, install the SDK:

```bash
npm install @metrxbot/sdk
```

3. Track Costs and Outcomes
Once integrated, costs are automatically calculated and tracked. View real-time dashboards and set up webhooks for outcomes.
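To illustrate what "costs are automatically calculated" means in practice: a gateway derives the cost of each call from its token counts and the provider's per-token rates. The sketch below is illustrative only — the rate table and the `calculateCost` helper are hypothetical examples, not Metrx's actual pricing data or API.

```typescript
// Illustrative only: hypothetical example rates (USD per 1M tokens),
// not Metrx's actual pricing data. A real gateway looks these up per
// provider and model at request time.
const RATES: Record<string, { inputPerM: number; outputPerM: number }> = {
  "gpt-4o": { inputPerM: 2.5, outputPerM: 10.0 },
  "gpt-4o-mini": { inputPerM: 0.15, outputPerM: 0.6 },
};

// Compute the cost of a single LLM call (an "event") from its token usage.
function calculateCost(
  model: string,
  inputTokens: number,
  outputTokens: number
): number {
  const rate = RATES[model];
  if (!rate) throw new Error(`Unknown model: ${model}`);
  return (inputTokens * rate.inputPerM + outputTokens * rate.outputPerM) / 1_000_000;
}
```

Under these example rates, a `gpt-4o-mini` call with 1,000 input and 500 output tokens would cost (1000 × 0.15 + 500 × 0.6) / 1,000,000 = $0.00045.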
Pricing Tiers
| Tier | Monthly Calls | Price | Best For |
|---|---|---|---|
| Starter | 1,000 | Free | Evaluation & testing |
| Lite | 10,000 | $29/mo | Small teams |
| Pro | 100,000 | $99/mo | Growing teams |
| Business | 1,000,000 | $499/mo | Enterprise deployments |
| Enterprise | Unlimited | Custom | Custom SLAs & support |
All tiers include real-time cost tracking, outcome management, team management, and webhook support.
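As a quick sanity check against the table above, a small helper can pick the lowest tier that covers a given monthly call volume. The tier names, limits, and prices are taken directly from the table; the `tierFor` function itself is just an illustration, not part of any Metrx API.

```typescript
// Tiers from the pricing table: [name, monthly call limit, price in USD/mo].
// Infinity marks the unlimited Enterprise tier; NaN stands in for its
// custom pricing.
const TIERS: [string, number, number][] = [
  ["Starter", 1_000, 0],
  ["Lite", 10_000, 29],
  ["Pro", 100_000, 99],
  ["Business", 1_000_000, 499],
  ["Enterprise", Infinity, NaN],
];

// Return the name of the smallest tier whose call limit covers `calls`.
function tierFor(calls: number): string {
  const tier = TIERS.find(([, limit]) => calls <= limit);
  return tier ? tier[0] : "Enterprise";
}
```

For example, `tierFor(50_000)` returns `"Pro"`: 50,000 calls exceed the Lite limit of 10,000 but fit within Pro's 100,000.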
Core Concepts
API Key
Your authentication token for the Gateway. Treat it like a password. Never commit it to version control.
Organization
A workspace containing team members, agents, and billing settings.
Agent
A named LLM-powered system within your organization. Each agent tracks its own costs and outcomes.
Session
A conversation thread or workflow instance. Sessions group related LLM calls together.
Event
A single LLM call (chat completion, message, embedding). Events record input/output tokens, cost, latency, and status.
Outcome
A business result tied to one or more sessions. Examples: “successfully handled customer inquiry”, “correctly identified fraud”, “generated valid code”.
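One way to picture how these concepts fit together — an organization owns agents, an agent's calls are grouped into sessions of events, and outcomes reference sessions — is as a set of types. The field names below are illustrative shapes, not Metrx's actual API schema.

```typescript
// Illustrative shapes only; the real Metrx data model may differ.
interface Agent { id: string; organizationId: string; name: string }
interface Session { id: string; agentId: string }
interface LlmEvent {
  id: string;
  sessionId: string;
  inputTokens: number;
  outputTokens: number;
  costUsd: number;
  latencyMs: number;
  status: "ok" | "error";
}
interface Outcome { id: string; sessionIds: string[]; label: string }

// A session's total cost is the sum of its events' costs.
function sessionCost(events: LlmEvent[], sessionId: string): number {
  return events
    .filter((e) => e.sessionId === sessionId)
    .reduce((sum, e) => sum + e.costUsd, 0);
}
```

This hierarchy is why outcomes can be priced: an outcome points at sessions, sessions group events, and every event carries its own cost.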
Support
For help, visit:
- Documentation: docs.metrxbot.com
- API Status: status.metrxbot.com
- Email: support@metrxbot.com
- Slack: Community Slack
Security & Privacy
Metrx uses industry-standard security practices:
- API keys are hashed and cached in Cloudflare KV for sub-millisecond authentication
- Data is encrypted at rest in Supabase using AES-256
- Transport is TLS 1.3 only
- Row-Level Security (RLS) ensures teams only access their data
- Third-party providers (OpenAI, Anthropic, etc.) receive only the requests you proxy to them; Metrx shares no other data with them
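The "hashed and cached" point can be sketched as follows: the Gateway stores only a hash of each API key and authenticates by hashing the presented key and looking it up, so a leaked cache never reveals raw keys. This is a minimal sketch assuming SHA-256 and an in-memory map standing in for Cloudflare KV; Metrx's actual scheme may differ.

```typescript
import { createHash } from "node:crypto";

// Stand-in for Cloudflare KV: maps sha256(apiKey) -> organization id.
// Only hashes are stored, never the raw keys themselves.
const keyStore = new Map<string, string>();

function sha256Hex(value: string): string {
  return createHash("sha256").update(value).digest("hex");
}

// Called once when a key is issued from the dashboard.
function registerKey(apiKey: string, orgId: string): void {
  keyStore.set(sha256Hex(apiKey), orgId);
}

// Called on every Gateway request: hash the presented key and look it up.
// Returns the owning organization id, or undefined if the key is unknown.
function authenticate(apiKey: string): string | undefined {
  return keyStore.get(sha256Hex(apiKey));
}
```

A KV lookup keyed on the hash is what makes sub-millisecond authentication possible: no database round-trip is needed on the hot path.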
See SECURITY_REVIEW.md for detailed security information.
Ready to integrate? Start with the Integration Guide.