Layer AI provides an Anthropic-compatible API endpoint. You can use the official Anthropic SDK by pointing it to Layer’s base URL instead of Anthropic’s.

Quick Start

Prerequisites

  • Layer AI account (sign up)
  • Layer API key from your Dashboard
  • A configured gate (gate UUID)

Installation

npm install @anthropic-ai/sdk

Configuration

Update your Anthropic client initialization - just 2 lines:
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  baseURL: 'https://api.uselayer.ai',  // 1. Point at Layer instead of Anthropic
  apiKey: process.env.LAYER_API_KEY,   // 2. Use your Layer API key
});

Making Requests

Layer supports three ways to specify your gate. The examples in this guide use the simplest: passing a gateId field in the request body.

Streaming Example

Streaming works exactly like the standard Anthropic SDK:
const stream = await anthropic.messages.create({
  gateId: process.env.GATE_ID,
  max_tokens: 1024,
  messages: [
    { role: 'user', content: 'Tell me a joke.' }
  ],
  stream: true,
});

for await (const event of stream) {
  if (event.type === 'content_block_delta' && event.delta.type === 'text_delta') {
    process.stdout.write(event.delta.text);
  }
}
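If you want the full response as a single string, the delta-accumulation loop above can be factored into a small helper. This is an illustrative sketch, not part of the Layer or Anthropic APIs; it accepts any async iterable of stream events, so it can be unit-tested without a live call:

```typescript
// Minimal event shape for the fields the helper reads.
interface TextDeltaEvent {
  type: string;
  delta?: { type: string; text?: string };
}

// Accumulate all text_delta fragments from a stream into one string.
async function collectText(stream: AsyncIterable<TextDeltaEvent>): Promise<string> {
  let text = '';
  for await (const event of stream) {
    const delta = event.delta;
    if (event.type === 'content_block_delta' && delta?.type === 'text_delta') {
      text += delta.text ?? '';
    }
  }
  return text;
}
```

You would call it as `const full = await collectText(stream);` in place of the manual loop when you don't need to print tokens as they arrive.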

What’s Supported

  • Full support for stream: true with identical behavior to Anthropic
  • Works across all providers - Layer handles format conversion
  • Image inputs in messages fully supported
  • System prompts, user, and assistant messages all supported
  • temperature, max_tokens, top_p, and other Anthropic parameters
  • Standard response.usage with token counts, plus Layer’s cost field
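For example, the usage block can be summarized in one place. A minimal sketch: input_tokens and output_tokens are standard Anthropic usage fields, while the exact name and placement of Layer's cost field on usage is an assumption based on the description above.

```typescript
// Usage shape: standard Anthropic fields plus Layer's cost field (assumed
// to sit on `usage` and to be denominated in USD).
interface LayerUsage {
  input_tokens: number;
  output_tokens: number;
  cost?: number; // Layer-specific (assumption)
}

// Produce a one-line summary suitable for logging.
function summarizeUsage(usage: LayerUsage): string {
  const total = usage.input_tokens + usage.output_tokens;
  const cost = usage.cost !== undefined ? `$${usage.cost.toFixed(6)}` : 'n/a';
  return `${total} tokens (${usage.input_tokens} in / ${usage.output_tokens} out), cost ${cost}`;
}
```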

Language Examples

import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  baseURL: 'https://api.uselayer.ai',
  apiKey: process.env.LAYER_API_KEY,
});

const response = await anthropic.messages.create({
  gateId: process.env.GATE_ID,
  max_tokens: 1024,
  messages: [
    { role: 'user', content: 'Hello!' }
  ],
});

console.log(response.content[0].text);
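Note that response.content is an array of content blocks, and content[0].text assumes the first block is a text block. A defensive helper (illustrative, not part of either SDK) concatenates all text blocks and ignores the rest:

```typescript
// Minimal content-block shape for the fields the helper reads.
interface ContentBlock {
  type: string;
  text?: string;
}

// Join the text of every text block; skip tool_use and other block types.
function extractText(content: ContentBlock[]): string {
  return content
    .filter((block) => block.type === 'text' && typeof block.text === 'string')
    .map((block) => block.text)
    .join('');
}
```

Then `console.log(extractText(response.content));` works even when the response starts with a non-text block.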

Migration Guide

Follow these steps to migrate an existing Anthropic application:

1. Update Client Initialization

Find where you initialize the Anthropic client and update it:
// Before
const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

// After
const anthropic = new Anthropic({
  baseURL: 'https://api.uselayer.ai',
  apiKey: process.env.LAYER_API_KEY,
});

2. Add Gate ID

Add gateId to your message calls:
// Before
const response = await anthropic.messages.create({
  model: 'claude-3-5-sonnet-20241022',
  max_tokens: 1024,
  messages: [...],
});

// After
const response = await anthropic.messages.create({
  gateId: process.env.GATE_ID,  // ✓ Add this
  max_tokens: 1024,
  messages: [...],               // Everything else stays the same
});

3. Update Environment Variables

.env
LAYER_API_KEY=layer_your_key
GATE_ID=your-gate-uuid
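To catch a missing key before the first request fails, you can validate these variables at startup. A sketch: requireEnv is a hypothetical helper, not part of any SDK, and the variable names match the .env example above.

```typescript
// Throw a descriptive error if any expected variable is unset or empty.
function requireEnv(vars: Record<string, string | undefined>): Record<string, string> {
  const missing = Object.entries(vars)
    .filter(([, value]) => !value)
    .map(([name]) => name);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
  return vars as Record<string, string>;
}

// Usage (at app startup):
// const { LAYER_API_KEY, GATE_ID } = requireEnv({
//   LAYER_API_KEY: process.env.LAYER_API_KEY,
//   GATE_ID: process.env.GATE_ID,
// });
```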

4. Test & Deploy

  1. Test in development
  2. Verify requests appear in Layer dashboard
  3. Check cost tracking is working
  4. Deploy to production
Easy Rollback: If anything breaks, revert the 2 changes (baseURL + apiKey) - takes 30 seconds.

Advanced Usage

Multiple Gates

Use different gates based on the task:
const COMPLEX_TASK_GATE = 'uuid-for-claude-opus';
const SIMPLE_TASK_GATE = 'uuid-for-claude-haiku';

async function complexTask(prompt: string) {
  return await anthropic.messages.create({
    gateId: COMPLEX_TASK_GATE,
    max_tokens: 4096,
    messages: [{ role: 'user', content: prompt }],
  });
}

async function simpleTask(prompt: string) {
  return await anthropic.messages.create({
    gateId: SIMPLE_TASK_GATE,
    max_tokens: 1024,
    messages: [{ role: 'user', content: prompt }],
  });
}
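The two wrappers above can also be collapsed into a single routing helper. The gate UUIDs are placeholders and the length-based heuristic is purely illustrative; substitute whatever signal fits your workload.

```typescript
// Placeholder gate UUIDs - replace with your real gate IDs.
const GATES = {
  complex: 'uuid-for-claude-opus',
  simple: 'uuid-for-claude-haiku',
} as const;

// Naive heuristic: long prompts go to the stronger (pricier) gate.
function pickGate(prompt: string): string {
  return prompt.length > 500 ? GATES.complex : GATES.simple;
}
```

Then a single `anthropic.messages.create({ gateId: pickGate(prompt), ... })` replaces both task-specific functions.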

Environment-Specific Configuration

const baseURL = process.env.NODE_ENV === 'production'
  ? 'https://api.uselayer.ai'
  : 'http://localhost:3001';  // Local Layer API for dev

const anthropic = new Anthropic({ baseURL, apiKey: process.env.LAYER_API_KEY });
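Extracting the URL choice into a function makes it unit-testable. A sketch using the same URLs as above:

```typescript
// Resolve the Layer base URL from the runtime environment.
// Anything other than 'production' falls back to the local dev API.
function resolveBaseURL(env: string | undefined): string {
  return env === 'production'
    ? 'https://api.uselayer.ai'
    : 'http://localhost:3001';
}
```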

Error Handling

try {
  const response = await anthropic.messages.create({
    gateId: process.env.GATE_ID,
    max_tokens: 1024,
    messages: [{ role: 'user', content: 'Hello!' }],
  });

  console.log(response.content[0].text);
} catch (error) {
  if (error instanceof Anthropic.APIError) {
    if (error.status === 404) {
      console.error('Gate not found - check your GATE_ID');
    } else if (error.status === 429) {
      console.error('Rate limit exceeded');
    } else {
      console.error('Request failed:', error.status, error.message);
    }
  } else {
    throw error;
  }
}
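For transient failures such as 429s, a retry wrapper with exponential backoff is a common pattern. This sketch is illustrative, not a Layer feature; it retries any promise-returning function whose errors carry a numeric status, e.g. () => anthropic.messages.create({ ... }).

```typescript
// Retry fn on 429 / 5xx errors, doubling the delay after each attempt.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn();
    } catch (error: any) {
      const retryable = error?.status === 429 || error?.status >= 500;
      if (!retryable || attempt >= maxAttempts) throw error;
      // Exponential backoff: base, 2x base, 4x base, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** (attempt - 1)));
    }
  }
}
```

Non-retryable errors (like the 404 above) are rethrown immediately so a misconfigured GATE_ID fails fast.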

How It Works

  1. You send → Anthropic SDK request to https://api.uselayer.ai/anthropic/v1/messages
  2. Layer receives → Validates gate, applies routing rules
  3. Layer routes → Sends to the model configured in your gate (Claude, GPT, Gemini, etc.)
  4. Provider responds → Returns response in provider’s format
  5. Layer normalizes → Converts back to Anthropic format
  6. You receive → Standard Anthropic response with added cost/metadata
Your code doesn’t know the difference.

FAQ

Do I need to rewrite my code?
No. Just change baseURL and apiKey in the Anthropic client initialization.

Can I switch back to Anthropic later?
Yes. Just revert baseURL and apiKey to Anthropic values.

Is streaming supported?
Yes. stream: true works exactly like Anthropic.

Can I route to non-Claude models?
Yes. Configure your gate to use any model — your code stays the same.

Are image inputs supported?
Fully supported. Layer handles the conversion across all providers.

Do I need to install the Layer SDK?
No. This approach uses only the Anthropic SDK.

Comparison: Anthropic SDK vs Layer SDK

| Feature          | Anthropic SDK + Layer | Layer SDK                 |
|------------------|-----------------------|---------------------------|
| Migration Effort | 2 lines of code       | Full refactor             |
| API Format       | Anthropic format      | Layer native format       |
| Use Case         | Drop-in replacement   | New projects              |
| Multi-modal      | Chat only             | Chat, image, video, audio |
| Streaming        | ✓                     | ✓                         |
| Tool Use         | ✓                     | ✓                         |
| Cost Tracking    | ✓                     | ✓                         |
| Admin Operations | ✗                     | ✓ (via Admin SDK)         |
Recommendation:
  • Existing Anthropic apps → Use Anthropic SDK + Layer (this guide)
  • New projects → Use Layer SDK for full feature set

Next Steps