LiteLLM OSS
Ideal for organizations that want to give developers access to multiple LLM APIs, with cost tracking, guardrails, and logging across all LLM endpoints. No data or telemetry is sent to LiteLLM servers when you self-host.
OpenAI-Compatible
Virtual Keys
Prometheus Metrics
Spend Tracking (Request, Tag, Key, User, Team, Org)
Global Guardrails
Budgets & Rate Limits (Key, User, Team, Org, End-user)
Request/Response Logs
Fallbacks and Load Balancing
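
Because the proxy is OpenAI-compatible, any OpenAI-style client can talk to it by pointing at the proxy's base URL and authenticating with a virtual key instead of a provider key. The sketch below builds such a request with only the standard library; the proxy URL (`localhost:4000`), the virtual key value, and the model alias are placeholder assumptions, not values from this document.

```python
import json

# Placeholder values: a self-hosted proxy address and a virtual key
# minted by that proxy (NOT a real provider API key).
PROXY_BASE = "http://localhost:4000"
VIRTUAL_KEY = "sk-example-virtual-key"

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble an OpenAI-compatible /chat/completions request
    addressed to the proxy rather than a provider directly."""
    return {
        "url": f"{PROXY_BASE}/chat/completions",
        "headers": {
            # The virtual key is sent exactly like an OpenAI API key;
            # the proxy maps it to budgets, rate limits, and spend tracking.
            "Authorization": f"Bearer {VIRTUAL_KEY}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,  # a model alias the proxy routes to a backend
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_chat_request("gpt-4o", "Hello!")
print(req["url"])  # → http://localhost:4000/chat/completions
```

The same request shape works for any backend the proxy fronts, which is what lets spend tracking, guardrails, and fallbacks apply uniformly regardless of the underlying provider.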

