OpenAI Provider

The MultiRoute OpenAI wrapper extends the official openai Python library, routing requests through MultiRoute's LLM router with automatic failover.

Installation

Ensure you have the required dependencies:

pip install "multiroute[openai]"

Basic Usage

To enable MultiRoute routing, simply import OpenAI from multiroute.openai instead of the standard openai package.

import os
from multiroute.openai import OpenAI

# Set your keys
os.environ["MULTIROUTE_API_KEY"] = "your-multiroute-key"
os.environ["OPENAI_API_KEY"] = "your-openai-key"

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "How does MultiRoute work?"}]
)

print(response.choices[0].message.content)

Async Support

Full async support is available via AsyncOpenAI.

import asyncio
from multiroute.openai import AsyncOpenAI

async def main():
    client = AsyncOpenAI()
    
    response = await client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello!"}]
    )
    print(response.choices[0].message.content)

asyncio.run(main())

Fallback Behavior

On a connection error, timeout, or 5xx response from the router, the client automatically falls back to OpenAI's API using your OPENAI_API_KEY. If MULTIROUTE_API_KEY is not set, the client defaults to direct OpenAI usage and logs a warning.
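The failover pattern described above can be pictured as a small sketch. Note that `call_with_fallback`, `primary`, and `fallback` are illustrative names, not part of the MultiRoute API, and the real client additionally treats 5xx router responses as retryable:

```python
def call_with_fallback(primary, fallback,
                       retryable=(ConnectionError, TimeoutError)):
    # Try the primary (router) call first; if it raises a retryable
    # error, run the fallback (direct-to-OpenAI) call instead.
    # Non-retryable errors (e.g. invalid requests) propagate unchanged.
    try:
        return primary()
    except retryable:
        return fallback()
```

In the wrapper itself, primary corresponds to the request sent to the MultiRoute router and fallback to the same request sent directly to OpenAI with your OPENAI_API_KEY.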