```python
from openai import OpenAI
# Point the client to the TensorZero Gateway
client = OpenAI(base_url="http://localhost:3000/openai/v1", api_key="not-used")
response = client.chat.completions.create(
    # Call any model provider (or TensorZero function)
    model="tensorzero::model_name::anthropic::claude-sonnet-4-6",
    messages=[
        {
            "role": "user",
            "content": "Share a fun fact about TensorZero.",
        }
    ],
)
```
You can layer on additional features as needed (fallbacks, templates, A/B testing, etc.).
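For example, fallbacks live in the gateway's configuration rather than in application code, so the snippet above stays unchanged when you add them. A minimal sketch of a `tensorzero.toml` with a fallback provider, assuming this config shape and using illustrative model/provider names (check the TensorZero configuration reference for the exact fields):

```toml
# Hypothetical model named "my_model"; the gateway tries providers in routing order,
# falling back to the next one if a request fails.
[models.my_model]
routing = ["primary", "fallback"]

[models.my_model.providers.primary]
type = "anthropic"
model_name = "claude-sonnet-4-6"

[models.my_model.providers.fallback]
type = "openai"
model_name = "gpt-4o-mini"  # illustrative fallback model
```

With a config like this, the client would call `model="tensorzero::model_name::my_model"` and the gateway would handle provider selection and failover transparently.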