
How to Access GPT-5 and GPT-5.2 via the API: A Complete Developer Guide
Crazyrouter Team
January 23, 2026
OpenAI has released its most capable models to date: GPT-5, GPT-5.2, and the reasoning-focused o3-pro. This guide shows how to access these frontier models through Crazyrouter's unified API.
Supported OpenAI Models
Crazyrouter provides access to the full OpenAI model lineup:
| Model | Input ($/1M tokens) | Output ($/1M tokens) | Best For |
|---|---|---|---|
| gpt-5.2 | $1.75 | $14.00 | Latest flagship, complex tasks |
| gpt-5.2-pro | $3.50 | $28.00 | Enhanced reasoning |
| gpt-5 | $1.25 | $10.00 | General-purpose tasks |
| gpt-5-pro | $2.50 | $20.00 | Advanced analysis |
| gpt-5-mini | $0.25 | $2.00 | Cost-effective |
| gpt-5-nano | $0.05 | $0.40 | High-volume requests |
| o3-pro | $20.00 | $80.00 | Complex reasoning |
| o3-mini | $1.10 | $4.40 | Efficient reasoning |
| o4-mini | $1.10 | $4.40 | Latest reasoning model |
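To estimate what a single request will cost, multiply your token counts by the per-million prices in the table. A minimal sketch (the price dictionary below covers only a subset of the models listed above):

```python
# Per-1M-token prices (input, output) in USD, taken from the pricing table.
PRICES = {
    "gpt-5.2": (1.75, 14.00),
    "gpt-5-mini": (0.25, 2.00),
    "gpt-5-nano": (0.05, 0.40),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one request for the given token counts."""
    in_price, out_price = PRICES[model]
    return (input_tokens * in_price + output_tokens * out_price) / 1_000_000

# e.g. a 2,000-token prompt with a 500-token reply on gpt-5.2:
print(f"${estimate_cost('gpt-5.2', 2000, 500):.4f}")  # → $0.0105
```

Output tokens dominate the bill for most chat workloads, which is why dropping down to `gpt-5-mini` or `gpt-5-nano` for simple tasks pays off quickly.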
Quick Start
1. Get Your API Key
- Visit the Crazyrouter console
- Go to "Token Management"
- Click "Create Token"
- Copy your API key (it starts with `sk-`)
2. Send Your First Request
Using Python (Recommended)
```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-your-api-key",
    base_url="https://crazyrouter.com/v1",
    default_headers={
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
    }
)

response = client.chat.completions.create(
    model="gpt-5.2",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain quantum computing in simple terms."}
    ],
    temperature=0.7,
    max_tokens=1000
)

print(response.choices[0].message.content)
```
Using Node.js
```javascript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: 'sk-your-api-key',
  baseURL: 'https://crazyrouter.com/v1',
  defaultHeaders: {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36'
  }
});

async function main() {
  const response = await client.chat.completions.create({
    model: 'gpt-5.2',
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: 'Explain quantum computing in simple terms.' }
    ],
    temperature: 0.7
  });
  console.log(response.choices[0].message.content);
}

main();
```
Using curl
```bash
curl https://crazyrouter.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-your-api-key" \
  -H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36" \
  -d '{
    "model": "gpt-5.2",
    "messages": [{"role": "user", "content": "Hello, GPT-5.2!"}],
    "temperature": 0.7
  }'
```
Streaming Responses
For real-time output, enable streaming:
```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-your-api-key",
    base_url="https://crazyrouter.com/v1",
    default_headers={
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
    }
)

stream = client.chat.completions.create(
    model="gpt-5.2",
    messages=[{"role": "user", "content": "Write a short story about AI."}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```
Using Reasoning Models (o3-pro)
The o3-pro model excels at complex reasoning tasks:
```python
response = client.chat.completions.create(
    model="o3-pro",
    messages=[
        {"role": "user", "content": "Solve this step by step: If a train travels 120 miles in 2 hours, then stops for 30 minutes, then travels another 90 miles in 1.5 hours, what is the average speed for the entire journey including the stop?"}
    ]
)

print(response.choices[0].message.content)
```
GPT-5 Codex Models
For code-generation tasks, use the dedicated codex models:
```python
response = client.chat.completions.create(
    model="gpt-5-codex",
    messages=[
        {"role": "user", "content": "Write a Python function to implement binary search"}
    ]
)
```
Available codex variants: `gpt-5-codex`, `gpt-5-codex-high`, `gpt-5-codex-medium`, `gpt-5-codex-low`, `gpt-5.2-codex`
Best Practices
- Choose the right model: use gpt-5-nano for simple tasks and gpt-5.2 for complex ones
- Set an appropriate temperature: lower values (0.1–0.3) for factual tasks, higher values (0.7–1.0) for creative tasks
- Use streaming: it improves the user experience in chat-style applications
- Handle errors gracefully: implement retry logic for rate limits and similar failures
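For the last point, here is a minimal retry sketch with exponential backoff. The `RateLimitError` class below is a stand-in so the example is self-contained; with the official `openai` SDK you would catch `openai.RateLimitError` (raised on HTTP 429) instead:

```python
import time

class RateLimitError(Exception):
    """Stand-in for the 429 error your SDK raises (e.g. openai.RateLimitError)."""

def with_retry(call, max_retries=5, base_delay=1.0):
    """Invoke call(), retrying on rate-limit errors with exponential backoff."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the error to the caller
            time.sleep(base_delay * 2 ** attempt)  # waits 1s, 2s, 4s, ...
```

Wrap any request in it, e.g. `with_retry(lambda: client.chat.completions.create(...))`, so transient 429s are absorbed instead of crashing your application.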
Next Steps
- See Model Pricing for detailed costs
- Read the API Documentation for advanced features
- Join our Discord Community for support
If you have any questions, contact support@crazyrouter.com


