LiteLLM Proxy Server - LLM Gateway
BerriAI
LLM Gateway to call 100+ LLM APIs using the OpenAI format, including Bedrock, VertexAI, and Azure OpenAI.
LiteLLM Proxy Server gives you a single, OpenAI-compatible interface for calling 100+ LLMs, with spend tracking and budgets per virtual key and per user.
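For example, once the proxy is deployed you can point the standard OpenAI Python SDK at it and authenticate with a virtual key. A minimal sketch, assuming a local deployment on port 4000 and placeholder key and model names:

    from openai import OpenAI

    client = OpenAI(
        api_key="sk-my-virtual-key",       # a LiteLLM virtual key (placeholder)
        base_url="http://localhost:4000",  # the proxy endpoint (assumed local deployment)
    )

    # The request uses the OpenAI chat format regardless of which provider
    # (Azure OpenAI, Bedrock, VertexAI, ...) actually serves the model.
    response = client.chat.completions.create(
        model="gpt-4o",  # any model name configured on the proxy (example)
        messages=[{"role": "user", "content": "Hello from the LiteLLM proxy"}],
    )
    print(response.choices[0].message.content)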
You can set budgets and rate limits per project, API key, and model on the proxy.
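A sketch of creating a budgeted, rate-limited virtual key through the proxy's key-management route; the URL, master key, and field names (models, max_budget, rpm_limit) are assumptions based on the LiteLLM documentation and should be verified against your deployment:

    import requests

    resp = requests.post(
        "http://localhost:4000/key/generate",               # assumed local proxy
        headers={"Authorization": "Bearer sk-master-key"},   # proxy master/admin key (placeholder)
        json={
            "models": ["gpt-4o", "bedrock-claude"],  # models this key may call (example aliases)
            "max_budget": 25.0,                      # spend cap in USD for this key (assumed field)
            "rpm_limit": 100,                        # requests per minute (assumed field)
        },
    )
    print(resp.json())  # returns the generated virtual key and its settings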
The proxy also translates inputs to each provider's completion, embedding, and image_generation endpoints, and applies retry/fallback logic across multiple LLM deployments.
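The same OpenAI-format client covers the other routes as well; for instance, an embeddings call that the proxy forwards to whichever provider backs the configured model alias (the alias and endpoint below are illustrative):

    from openai import OpenAI

    client = OpenAI(api_key="sk-my-virtual-key", base_url="http://localhost:4000")  # assumed local proxy

    embedding = client.embeddings.create(
        model="text-embedding-3-small",  # example alias mapped to a provider in the proxy config
        input="LiteLLM translates this to the underlying provider's embedding endpoint",
    )
    print(len(embedding.data[0].embedding))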