# OpenAI SDK Compatibility

BazaarLink is fully compatible with the OpenAI Python and Node.js SDKs. Only two changes are required:
**Before (using OpenAI directly)**

```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-..."
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[...]
)
```

**After (using BazaarLink)**

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://bazaarlink.ai/api/v1",
    api_key="sk-bl-YOUR_API_KEY"
)

response = client.chat.completions.create(
    model="openai/gpt-4o",  # add the provider/ prefix
    messages=[...]
)
```

That's it!
Everything else works exactly the same: streaming, tool calling, structured outputs, async. The only changes are the `base_url`, the `api_key`, and the provider prefix on the model ID.
## Python

### Installation

```bash
pip install openai
```

### Synchronous

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://bazaarlink.ai/api/v1",
    api_key="sk-bl-YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="openai/gpt-4.1",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```
### Asynchronous

```python
import asyncio
from openai import AsyncOpenAI

client = AsyncOpenAI(
    base_url="https://bazaarlink.ai/api/v1",
    api_key="sk-bl-YOUR_API_KEY",
)

async def main():
    response = await client.chat.completions.create(
        model="anthropic/claude-sonnet-4.6",
        messages=[{"role": "user", "content": "Hello async world!"}],
    )
    print(response.choices[0].message.content)
asyncio.run(main())
```

### Async streaming

```python
async def stream_chat():
    stream = await client.chat.completions.create(
        model="deepseek/deepseek-chat",
        messages=[{"role": "user", "content": "Tell me a story"}],
        stream=True,
    )
    async for chunk in stream:
        content = chunk.choices[0].delta.content
        if content:
            print(content, end="", flush=True)
asyncio.run(stream_chat())
```
## TypeScript / Node.js

### Installation

```bash
npm install openai
```

### Basic usage

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://bazaarlink.ai/api/v1",
  apiKey: process.env.BAZAARLINK_API_KEY,
});

async function chat(message: string): Promise<string> {
  const response = await client.chat.completions.create({
    model: "openai/gpt-4.1",
    messages: [{ role: "user", content: message }],
  });
  return response.choices[0].message.content ?? "";
}
```

### Streaming

```typescript
const stream = await client.chat.completions.create({
  model: "openai/gpt-4.1",
  messages: [{ role: "user", content: "Tell me a story" }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```

## LangChain

Use BazaarLink from LangChain by pointing the OpenAI-compatible integration at the BazaarLink base URL.

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://bazaarlink.ai/api/v1",
    api_key="sk-bl-YOUR_API_KEY",
    model="anthropic/claude-sonnet-4.6",
)

response = llm.invoke("Explain BazaarLink in one sentence.")
print(response.content)
```
## LlamaIndex

Use BazaarLink inside a LlamaIndex RAG pipeline. LlamaIndex supports OpenAI-compatible base URLs, so only two parameters change.

```python
from llama_index.llms.openai import OpenAI
from llama_index.core import Settings

Settings.llm = OpenAI(
    model="openai/gpt-4.1",
    api_base="https://bazaarlink.ai/api/v1",
    api_key="sk-bl-YOUR_API_KEY",
)

# Now use LlamaIndex normally — it calls BazaarLink
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader("./docs").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What is this documentation about?")
print(response)
```

## Vercel AI SDK

Integrate the Vercel AI SDK with BazaarLink in a Next.js app. Streaming, tool calling, and structured outputs are supported.

### Installation

```bash
npm install ai @ai-sdk/openai
```

### Basic usage

```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const bazaarlink = createOpenAI({
  baseURL: "https://bazaarlink.ai/api/v1",
  apiKey: "sk-bl-YOUR_API_KEY",
});

const { text } = await generateText({
  model: bazaarlink("openai/gpt-4.1"),
  prompt: "Write a haiku about programming",
});

console.log(text);
```