The Enterprise AI Gateway

🚅 LiteLLM

[Hero animation: a User's requests flow through LiteLLM to LLMs, MCP servers, and Agents, with rotating feature callouts: Cost Tracking, Batches API, Guardrails, Model Access, Budgets, LLM Observability, Rate Limiting, Prompt Management, S3 Logging, Pass-Through Endpoints.]

  • LLM Gateway

  • Budgets & Rate Limits

  • MCP Gateway

  • RBAC & Usage Tracking by Key, Team, Org

  • Agent Gateway

  • Logging to Datadog & OpenTelemetry

  • Built-in and 3rd Party Guardrails

  • OIDC, SSO, Custom Authentication
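In practice, clients reach all of this through one OpenAI-compatible endpoint on the gateway. A minimal sketch using only the standard library — the proxy URL and virtual key are placeholders for your own deployment:

```python
# Sketch: build an OpenAI-compatible /chat/completions request for a
# LiteLLM gateway. PROXY_URL and VIRTUAL_KEY are placeholder values.
import json
import urllib.request

PROXY_URL = "http://localhost:4000"  # assumed local deployment
VIRTUAL_KEY = "sk-example"           # placeholder virtual key

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Assemble a POST request the gateway can route to any provider."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{PROXY_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {VIRTUAL_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("gpt-4o", "Hello")
# urllib.request.urlopen(req) would send it against a running proxy.
```

Because the surface is the OpenAI schema, existing OpenAI SDK clients can also point their base URL at the gateway instead of hand-building requests.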

240M+ docker pulls

1B+ requests served

80% uptime

1,005+ contributors

Deploy in any environment

Self-hosted, on-prem, or in your cloud.

  • On-prem

  • Multi-Cloud

  • Kubernetes

  • Helm
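However it is deployed, the proxy is driven by a config.yaml listing the models it should route to. A minimal sketch — the model names and environment variable are illustrative:

```yaml
model_list:
  - model_name: gpt-4o                     # name clients request
    litellm_params:
      model: openai/gpt-4o                 # underlying provider/model
      api_key: os.environ/OPENAI_API_KEY   # read the key from the environment
```

The same file can be mounted into the Docker container or supplied through the Helm chart's values.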

Features

All available on LiteLLM Open-Source.

Delegate access

LiteLLM Enterprise lets you delegate administration of teams and organizations to designated admins, reducing the time you spend managing the proxy.
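As a sketch of what delegation looks like against the proxy's team-management API, the body below targets the /team/new endpoint; the field names follow LiteLLM's management API, and all values are placeholders:

```python
# Sketch: request body for LiteLLM's /team/new management endpoint.
# Field names follow the team-management API; values are placeholders.
import json

def new_team_payload(alias: str, admin_user_id: str, max_budget: float) -> str:
    """Serialize a team with a delegated admin and a spend cap."""
    return json.dumps({
        "team_alias": alias,
        "max_budget": max_budget,  # USD cap tracked by the proxy
        "members_with_roles": [
            {"role": "admin", "user_id": admin_user_id},
        ],
    })

payload = new_team_payload("ml-platform", "user-123", 500.0)
```

The delegated admin can then manage that team's members and keys without touching the rest of the proxy.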

Generate tokens on the fly

With OIDC/JWT Auth and 'group-based access', developers can generate temporary tokens to call models via LiteLLM, reducing the time spent managing keys and model access.
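A temporary token of this kind corresponds to a request against the proxy's /key/generate endpoint. A sketch of the body — the "duration", "team_id", and "models" fields follow LiteLLM's key-management API, and the values are placeholders:

```python
# Sketch: body for /key/generate to mint a short-lived, scoped virtual key.
# Field names follow LiteLLM's key-management API; values are placeholders.
import json

def temp_key_payload(team_id: str, models: list, duration: str = "1h") -> str:
    """Serialize a request for a temporary, model-scoped key."""
    return json.dumps({
        "team_id": team_id,    # attribute spend to this team
        "models": models,      # models this key may call
        "duration": duration,  # key expires after this window
    })

payload = temp_key_payload("team-abc", ["gpt-4o"])
```

Because the key expires on its own, there is nothing to revoke or rotate by hand afterwards.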

Monitor Proxy in Production

LiteLLM Enterprise provides Prometheus metrics, SLA-backed support, and additional alerting features to help you keep the proxy reliable in production.
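As a sketch of how the metrics side is wired up, Prometheus logging is enabled through a callback in the proxy's config.yaml; the callback name is taken from LiteLLM's docs, and the exact key (`callbacks` vs. `success_callback`) may vary by version:

```yaml
litellm_settings:
  # emit request and spend metrics on the proxy's /metrics endpoint
  callbacks: ["prometheus"]
```

A Prometheus server can then scrape the gateway like any other target.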
