netlify-ai-gateway


Netlify AI Gateway

IMPORTANT: Only use models listed in the "Available Models" section below. AI Gateway does not support every model a provider offers. Using an unsupported model will cause runtime errors.
Netlify AI Gateway provides access to AI models from multiple providers without managing API keys directly. It is available on all Netlify sites.

How It Works


The AI Gateway acts as a proxy — you use standard provider SDKs (OpenAI, Anthropic, Google) but point them at Netlify's gateway URL instead of the provider's API. Netlify handles authentication, rate limiting, and monitoring.
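The proxying amounts to a change of origin: the request path and body stay in the provider's format, and only the host changes. A minimal sketch, using made-up URLs (these are not Netlify's actual gateway endpoints):

```typescript
// Illustrative only: a gateway proxy swaps the origin while keeping the
// provider's request shape. The fallback URL here is a made-up example.
const path = "/v1/chat/completions";
const direct = `https://api.openai.com${path}`;
const viaGateway = `${process.env.OPENAI_BASE_URL ?? "https://gateway.example.com"}${path}`;
```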

Setup


  1. Enable AI on your site in the Netlify UI
  2. The environment variable `OPENAI_BASE_URL` is set automatically by Netlify
  3. Install the provider SDK you want to use
No provider API keys are needed — Netlify's gateway handles authentication.

Using OpenAI SDK


```bash
npm install openai
```

```typescript
import OpenAI from "openai";

const openai = new OpenAI();
// OPENAI_BASE_URL is auto-configured; no API key or base URL needed

const completion = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello!" }],
});
```
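The completion's text lives at `choices[0].message.content`, which the SDK types as nullable. A small hypothetical helper (not part of the SDK) keeps null checks in one place:

```typescript
// Hypothetical helper: read the first choice's text, or a fallback.
type Choice = { message: { content: string | null } };

function firstText(choices: Choice[], fallback = ""): string {
  return choices[0]?.message.content ?? fallback;
}

// e.g. firstText(completion.choices) after the create() call above
```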

Using Anthropic SDK


```bash
npm install @anthropic-ai/sdk
```

```typescript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  baseURL: Netlify.env.get("ANTHROPIC_BASE_URL"),
});

const message = await client.messages.create({
  model: "claude-sonnet-4-5-20250929",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
});
```
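Anthropic responses return `content` as an array of typed blocks rather than a single string. A hypothetical helper (not part of the SDK) can flatten the text blocks:

```typescript
// Hypothetical helper: collect the text blocks from an Anthropic
// message's content array into a single string.
type ContentBlock = { type: string; text?: string };

function extractText(content: ContentBlock[]): string {
  return content
    .filter((block) => block.type === "text" && typeof block.text === "string")
    .map((block) => block.text)
    .join("");
}

// e.g. extractText(message.content) after the create() call above
```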

Using Google AI SDK


```bash
npm install @google/generative-ai
```

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

// The gateway handles authentication, so the SDK's API key argument
// is only a placeholder
const genAI = new GoogleGenerativeAI("placeholder");
// Configure the gateway base URL via environment variable

const model = genAI.getGenerativeModel({ model: "gemini-2.5-flash" });
const result = await model.generateContent("Hello!");
```

In a Netlify Function


```typescript
import type { Config, Context } from "@netlify/functions";
import OpenAI from "openai";

export default async (req: Request, context: Context) => {
  const { prompt } = await req.json();
  const openai = new OpenAI();

  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: prompt }],
  });

  return Response.json({
    response: completion.choices[0].message.content,
  });
};

export const config: Config = {
  path: "/api/ai",
  method: "POST",
};
```
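From the browser, the function can be called with a plain `fetch`. The helper below is a sketch assuming the `/api/ai` route and the `{ response }` JSON shape shown above:

```typescript
// Sketch of a client-side caller for the /api/ai function above.
async function askAI(prompt: string): Promise<string> {
  const res = await fetch("/api/ai", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok) {
    throw new Error(`AI request failed with status ${res.status}`);
  }
  const data = (await res.json()) as { response: string };
  return data.response;
}
```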

Environment Variables


| Variable | Provider | Set by |
| --- | --- | --- |
| `OPENAI_BASE_URL` | OpenAI | Netlify (automatic) |
| `ANTHROPIC_BASE_URL` | Anthropic | Netlify (automatic) |

These are configured automatically when AI is enabled on the site. No manual setup required.
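Because the variables only exist once AI is enabled, a defensive check before constructing a client yields a clearer error than a failed request. This guard is a hypothetical convenience, not part of any SDK:

```typescript
// Hypothetical guard: report whether the gateway variables are present.
function gatewayEnabled(env: Record<string, string | undefined>): boolean {
  return Boolean(env.OPENAI_BASE_URL || env.ANTHROPIC_BASE_URL);
}

if (!gatewayEnabled(process.env)) {
  console.warn("AI Gateway variables missing; is AI enabled on this site?");
}
```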

Local Development


With `@netlify/vite-plugin` or `netlify dev`, gateway environment variables are injected automatically. The AI Gateway is accessible during local development after the site has been deployed at least once.

Available Models


For the list of supported models, see https://docs.netlify.com/build/ai-gateway/overview/.