Quick Start
Prerequisites
Installation
Configuration
Update your OpenAI client initialization - just 2 lines.
Making Requests
Layer supports three ways to specify your gate:
- Using gateId (Recommended)
- Using model field
- Using header
Requires a @ts-ignore comment for TypeScript, but provides the clearest intent.
Streaming Example
Streaming works exactly like the standard OpenAI SDK.
What’s Supported
Streaming
Full support for stream: true with identical behavior to OpenAI
Tool/Function Calling
Works across all providers - Layer handles format conversion
Vision
Image inputs in messages fully supported
All Message Types
System, user, assistant, and tool messages all supported
Standard Parameters
temperature, max_tokens, top_p, and other OpenAI parameters
Usage & Cost Tracking
Standard response.usage with token counts, plus Layer’s cost field
Language Examples
Migration Guide
Follow these steps to migrate an existing OpenAI application:
1. Update Client Initialization
Find where you initialize the OpenAI client and update it.
2. Add Gate ID
Add gateId to your completion calls:
3. Update Environment Variables
.env
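A sketch of the change; the variable names below are assumptions, so match whatever your code actually reads:

```
# Before
OPENAI_API_KEY=sk-...

# After (hypothetical variable name)
LAYER_API_KEY=your-layer-api-key
```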
4. Test & Deploy
- Test in development
- Verify requests appear in Layer dashboard
- Check cost tracking is working
- Deploy to production
Easy Rollback: If anything breaks, revert the 2 changes (baseURL + apiKey) - takes 30 seconds.
Advanced Usage
Multiple Gates
Use different gates based on the task.
Environment-Specific Configuration
Error Handling
How It Works
1. You send: an OpenAI SDK request to https://api.uselayer.ai/v1/chat/completions
2. Layer receives: validates the gate and applies routing rules
3. Layer routes: sends the request to the model configured in your gate (GPT, Claude, Gemini, etc.)
4. Provider responds: returns the response in the provider’s format
5. Layer normalizes: converts it back to OpenAI format
6. You receive: a standard OpenAI response with added cost/metadata
FAQ
Do I need to change my code?
No. Just change baseURL and apiKey in the OpenAI client initialization.
Can I still use OpenAI directly?
Yes. Just revert baseURL and apiKey to OpenAI values.
Does streaming work?
Yes. stream: true works exactly like OpenAI.
Can I use Claude or Gemini?
Yes. Configure your gate to use any model; your code stays the same.
What about function calling?
Fully supported. Layer handles the conversion across all providers.
Do I need the Layer SDK?
No. This approach uses only the OpenAI SDK.
Does this work with Vercel AI SDK?
Yes. The Vercel AI SDK’s OpenAI adapter works with Layer’s OpenAI-compatible endpoint.
Comparison: OpenAI SDK vs Layer SDK
| Feature | OpenAI SDK + Layer | Layer SDK |
|---|---|---|
| Migration Effort | 2 lines of code | Full refactor |
| API Format | OpenAI format | Layer native format |
| Use Case | Drop-in replacement | New projects |
| Multi-modal | Chat only | Chat, image, video, audio |
| Streaming | Yes | Yes |
| Tool Calling | Yes | Yes |
| Cost Tracking | Yes | Yes |
| Admin Operations | No | Yes (via Admin SDK) |
- Existing OpenAI apps → Use OpenAI SDK + Layer (this guide)
- New projects → Use Layer SDK for full feature set