Client SDKs
The Poe API can be accessed through three primary methods: the native Poe Python library, the OpenAI-compatible REST API, or the Anthropic-compatible REST API. Choose the method that best fits your use case.
Quick Comparison
| Feature | Poe Python Library | OpenAI-Compatible API | Anthropic-Compatible API |
|---|---|---|---|
| Installation | pip install fastapi-poe | Any HTTP client or OpenAI SDK | Any HTTP client or Anthropic SDK |
| Language Support | Python only | Any language | Any language |
| Supported Bots | All bots on Poe | All bots on Poe | ⚠️ Claude models only |
| Custom Parameters | ✅ Full support | ❌ Not supported | ❌ Not supported |
| File Upload | ✅ Native support | Limited | Via Anthropic format |
| Streaming | ✅ Async/sync | ✅ Via standard streaming | ✅ Via SSE |
| Error Handling | ✅ Enhanced | Standard HTTP errors | Anthropic error format |
| Best For | New Python projects, custom parameters | OpenAI migrations, multi-language projects | Anthropic migrations, Claude-only use cases |
Recommendation
For new Python projects: Use the Poe Python library (fastapi-poe) for the best experience, native error handling, and full feature support including custom parameters.
For OpenAI migration or non-Python projects: Use the OpenAI-compatible API for seamless integration with existing tools and access to all bots.
For Anthropic migration (Claude only): Use the Anthropic-compatible API if you're migrating from Anthropic and only need Claude models.
Option 1: Poe Python Library (Recommended)
The official Poe Python library provides the most feature-complete way to interact with Poe bots and models.
Installation
```bash
pip install fastapi-poe
```
Basic Usage
```python
import fastapi_poe as fp

api_key = "your_api_key"  # Get from https://poe.com/api_key
message = fp.ProtocolMessage(role="user", content="Hello world")

for partial in fp.get_bot_response_sync(
    messages=[message],
    bot_name="GPT-3.5-Turbo",
    api_key=api_key,
):
    print(partial)
```

```python
import asyncio
import fastapi_poe as fp

async def get_response():
    api_key = "your_api_key"  # Get from https://poe.com/api_key
    message = fp.ProtocolMessage(role="user", content="Hello world")
    async for partial in fp.get_bot_response(
        messages=[message],
        bot_name="GPT-3.5-Turbo",
        api_key=api_key,
    ):
        print(partial)

asyncio.run(get_response())
```

Key Features
Custom Parameters (Python Library Only)
The Poe Python library supports passing custom parameters that are not available via the OpenAI-compatible API:
```python
import fastapi_poe as fp

message = fp.ProtocolMessage(
    role="user",
    content="Explain quantum computing",
    parameters={"thinking_budget": 1000},
)

for partial in fp.get_bot_response_sync(
    messages=[message],
    bot_name="Claude-Sonnet-4.5",
    api_key=api_key,
):
    print(partial)
```

```python
import fastapi_poe as fp

message = fp.ProtocolMessage(
    role="user",
    content="Solve this complex problem: ...",
    parameters={"reasoning_effort": "high"},
)

for partial in fp.get_bot_response_sync(
    messages=[message],
    bot_name="GPT-5",
    api_key=api_key,
):
    print(partial)
```

```python
import fastapi_poe as fp

message = fp.ProtocolMessage(
    role="user",
    content="A cat in a hat",
    parameters={
        "aspect_ratio": "9:16",
        "style": "photorealistic",
    },
)

for partial in fp.get_bot_response_sync(
    messages=[message],
    bot_name="Imagen-4",
    api_key=api_key,
):
    print(partial)
```

Custom parameters like `thinking_budget`, `reasoning_effort`, and `aspect_ratio` cannot be passed via the OpenAI-compatible API. They must be passed through the `parameters` field in `ProtocolMessage` using the Poe Python library.
File Upload
```python
import fastapi_poe as fp

api_key = "your_api_key"

# Upload file
pdf_attachment = fp.upload_file_sync(
    open("document.pdf", "rb"),
    api_key=api_key,
)

# Send message with attachment
message = fp.ProtocolMessage(
    role="user",
    content="Summarize this document",
    attachments=[pdf_attachment],
)

for partial in fp.get_bot_response_sync(
    messages=[message],
    bot_name="GPT-3.5-Turbo",
    api_key=api_key,
):
    print(partial)
```

Available APIs
The Poe Python library can access:
- ✅ Chat Completions - `fp.get_bot_response()` / `fp.get_bot_response_sync()`
- ✅ File Upload - `fp.upload_file()` / `fp.upload_file_sync()`
- ✅ Custom Parameters - Via the `parameters` field
- ✅ All public bots and models
Learn More
For detailed documentation and advanced usage, see the External Application Guide.
Option 2: OpenAI-Compatible API
Use the standard OpenAI SDK or any HTTP client to access Poe models through an OpenAI-compatible interface.
Using the OpenAI SDK
```python
# pip install openai
import os
import openai

client = openai.OpenAI(
    api_key=os.getenv("POE_API_KEY"),  # https://poe.com/api_key
    base_url="https://api.poe.com/v1",
)

chat = client.chat.completions.create(
    model="Claude-Sonnet-4",
    messages=[{"role": "user", "content": "Top 3 things to do in NYC?"}],
)
print(chat.choices[0].message.content)
```

```javascript
// npm install openai
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.POE_API_KEY, // https://poe.com/api_key
  baseURL: "https://api.poe.com/v1",
});

const completion = await client.chat.completions.create({
  model: "Claude-Sonnet-4",
  messages: [
    {
      role: "user",
      content: "What are the top 3 things to do in New York?",
    },
  ],
});
console.log(completion.choices[0].message.content);
```

```bash
curl "https://api.poe.com/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $POE_API_KEY" \
  -d '{
    "model": "Claude-Sonnet-4",
    "messages": [
      {
        "role": "user",
        "content": "Write a one-sentence bedtime story."
      }
    ]
  }'
```

Using Raw HTTP Requests
You can also make direct HTTP requests to the Poe API using any HTTP client:
```python
import os
import requests

headers = {
    "Authorization": f"Bearer {os.environ['POE_API_KEY']}",
    "Content-Type": "application/json",
}
payload = {
    "model": "Claude-Sonnet-4",
    "messages": [
        {"role": "user", "content": "What are the top 3 things to do in NYC?"}
    ],
}

response = requests.post(
    "https://api.poe.com/v1/chat/completions",
    headers=headers,
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())
```

```javascript
const response = await fetch("https://api.poe.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${process.env.POE_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "Claude-Sonnet-4",
    messages: [
      { role: "user", content: "What are the top 3 things to do in NYC?" }
    ],
  }),
});
const result = await response.json();
console.log(result);
```

```go
package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "net/http"
    "os"
)

func main() {
    payload := map[string]any{
        "model": "Claude-Sonnet-4",
        "messages": []any{
            map[string]any{
                "role":    "user",
                "content": "What are the top 3 things to do in NYC?",
            },
        },
    }
    body, _ := json.Marshal(payload)

    req, _ := http.NewRequest("POST", "https://api.poe.com/v1/chat/completions", bytes.NewReader(body))
    req.Header.Set("Authorization", "Bearer "+os.Getenv("POE_API_KEY"))
    req.Header.Set("Content-Type", "application/json")

    res, _ := http.DefaultClient.Do(req)
    defer res.Body.Close()

    var result map[string]any
    json.NewDecoder(res.Body).Decode(&result)
    fmt.Println(result)
}
```

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json;

var http = new HttpClient();
http.DefaultRequestHeaders.Authorization =
    new System.Net.Http.Headers.AuthenticationHeaderValue(
        "Bearer",
        Environment.GetEnvironmentVariable("POE_API_KEY")
    );

var payload = new {
    model = "Claude-Sonnet-4",
    messages = new[] {
        new { role = "user", content = "What are the top 3 things to do in NYC?" }
    }
};

var response = await http.PostAsJsonAsync(
    "https://api.poe.com/v1/chat/completions",
    payload
);
var result = await response.Content.ReadFromJsonAsync<dynamic>();
Console.WriteLine(result);
```

Available APIs
The OpenAI-compatible API supports:
- ✅ Chat Completions - `/v1/chat/completions` (streaming and non-streaming; see the streaming sketch below)
- ✅ List Models - `/v1/models`
- ✅ Current Balance - `/usage/current_balance`
- ✅ Usage History - `/usage/points_history`
- ❌ Custom Parameters - Not supported (use the Poe Python library instead)
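Streaming uses the standard OpenAI streaming interface. A minimal sketch, assuming the endpoint emits standard OpenAI-style delta chunks:

```python
# pip install openai
import os
import openai

client = openai.OpenAI(
    api_key=os.getenv("POE_API_KEY"),
    base_url="https://api.poe.com/v1",
)

# Request a streamed completion; chunks follow the OpenAI delta format.
stream = client.chat.completions.create(
    model="Claude-Sonnet-4",
    messages=[{"role": "user", "content": "Top 3 things to do in NYC?"}],
    stream=True,
)

for chunk in stream:
    # Some chunks (e.g. the role-only first chunk) carry no text content.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```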
Key Limitations
The OpenAI-compatible API has some limitations compared to the native Poe Python library:
- No custom parameter support - Cannot pass `thinking_budget`, `reasoning_effort`, `aspect_ratio`, etc.
- Private bots not supported - Only public bots can be accessed
- App-Creator bot unavailable - Cannot be used via this API
- Best-effort parameter passing - Some model-specific parameters may not work as expected
- Media bots - Should be called with `stream=false` for best results (see the sketch below)
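As an illustration, a non-streaming call to an image bot might look like the following minimal sketch, reusing the Imagen-4 bot shown earlier with the OpenAI SDK (the exact content returned by media bots may vary):

```python
# pip install openai
import os
import openai

client = openai.OpenAI(
    api_key=os.getenv("POE_API_KEY"),
    base_url="https://api.poe.com/v1",
)

# Media bots work best without streaming, so pass stream=False (the default).
response = client.chat.completions.create(
    model="Imagen-4",
    messages=[{"role": "user", "content": "A cat in a hat"}],
    stream=False,
)

# The content format for media bots may vary; print it to inspect.
print(response.choices[0].message.content)
```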
For detailed compatibility information, see the OpenAI Compatible API Guide.
Option 3: Anthropic-Compatible API
Use the standard Anthropic SDK or any HTTP client to access Claude models through an Anthropic-compatible interface. This is ideal if you're migrating from Anthropic or prefer the Anthropic Messages API format.
Claude Models Only
The Anthropic-compatible API only supports official Claude models. You cannot use this endpoint to call custom bots or models from other providers like GPT, Gemini, or Llama. For access to all bots on Poe, use the OpenAI-compatible API or the Poe Python SDK.
Using the Anthropic SDK
```python
# pip install anthropic
import os
import anthropic

client = anthropic.Anthropic(
    api_key=os.getenv("POE_API_KEY"),  # https://poe.com/api_key
    base_url="https://api.poe.com",
)

message = client.messages.create(
    model="claude-sonnet-4",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Top 3 things to do in NYC?"}],
)
print(message.content[0].text)
```

```javascript
// npm install @anthropic-ai/sdk
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  apiKey: process.env.POE_API_KEY, // https://poe.com/api_key
  baseURL: "https://api.poe.com",
});

const message = await client.messages.create({
  model: "claude-sonnet-4",
  max_tokens: 1024,
  messages: [
    {
      role: "user",
      content: "What are the top 3 things to do in New York?",
    },
  ],
});
console.log(message.content[0].text);
```

```bash
curl "https://api.poe.com/v1/messages" \
  -H "Content-Type: application/json" \
  -H "x-api-key: $POE_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4",
    "max_tokens": 1024,
    "messages": [
      {
        "role": "user",
        "content": "Write a one-sentence bedtime story."
      }
    ]
  }'
```

Available APIs
The Anthropic-compatible API supports:
- ✅ Messages - `/v1/messages` (streaming and non-streaming)
- ✅ Tool Use - Function calling with Claude (see the sketch below)
- ✅ Vision - Image inputs
- ❌ Non-Claude bots - Only official Claude models are supported
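Tool use follows the standard Anthropic Messages format. A minimal sketch, assuming the `tools` parameter is proxied through unchanged; the `get_weather` tool here is purely hypothetical:

```python
# pip install anthropic
import os
import anthropic

client = anthropic.Anthropic(
    api_key=os.getenv("POE_API_KEY"),
    base_url="https://api.poe.com",
)

# Define a tool in the standard Anthropic tool format (illustrative only).
tools = [
    {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "input_schema": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

message = client.messages.create(
    model="claude-sonnet-4",
    max_tokens=1024,
    tools=tools,
    messages=[{"role": "user", "content": "What's the weather in NYC?"}],
)

# If Claude decides to call the tool, the response contains a tool_use block.
for block in message.content:
    if block.type == "tool_use":
        print(block.name, block.input)
```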
Key Differences
- Claude models only - Cannot access GPT, Gemini, Llama, or custom bots
- Anthropic error format - Errors follow Anthropic's API conventions
- Direct provider proxy - Requests are proxied to Anthropic with minimal transformation
For detailed information, see the Anthropic Compatible API Guide.
Authentication
All API methods require authentication using your Poe API key.
Get Your API Key
- Visit https://poe.com/api_key
- Copy your API key
- Store it securely (see best practices below)
Best Practices
Security Warning
Never expose your API key in client-side code, public repositories, or commit it to version control. Always keep it secure on your server or in environment variables.
Key Management:
- ✅ Store API keys in environment variables, not in code
- ✅ Use different API keys for development and production
- ✅ Rotate API keys periodically
- ✅ Revoke compromised keys immediately
- ❌ Never commit API keys to version control
- ❌ Never expose keys in client-side code
Environment Variable Setup
```bash
# Add to ~/.bashrc or ~/.zshrc
export POE_API_KEY="your_api_key_here"

# Or set for current session
export POE_API_KEY="your_api_key_here"
```

```powershell
# Set for current session
$env:POE_API_KEY = "your_api_key_here"

# Set permanently (user level)
[System.Environment]::SetEnvironmentVariable('POE_API_KEY', 'your_api_key_here', 'User')
```

```python
# Create .env file
# POE_API_KEY=your_api_key_here

# Load in Python
from dotenv import load_dotenv
import os

load_dotenv()
api_key = os.getenv("POE_API_KEY")
```

```javascript
// Create .env file
// POE_API_KEY=your_api_key_here

// Load in Node.js
import dotenv from 'dotenv';

dotenv.config();
const apiKey = process.env.POE_API_KEY;
```

Rate Limits and Usage
Rate Limits
- 500 requests per minute per API key
- Monitor usage with response headers
- Implement exponential backoff for retries (see the sketch below)
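A minimal retry sketch using the OpenAI SDK, assuming rate-limited requests surface as standard HTTP 429 errors, which the SDK raises as `openai.RateLimitError`:

```python
import os
import time
import openai

client = openai.OpenAI(
    api_key=os.getenv("POE_API_KEY"),
    base_url="https://api.poe.com/v1",
)

def create_with_backoff(max_retries: int = 5, **kwargs):
    """Retry rate-limited requests with exponential backoff."""
    for attempt in range(max_retries):
        try:
            return client.chat.completions.create(**kwargs)
        except openai.RateLimitError:
            # Wait 1s, 2s, 4s, ... before retrying.
            time.sleep(2 ** attempt)
    raise RuntimeError("Rate limit retries exhausted")

chat = create_with_backoff(
    model="Claude-Sonnet-4",
    messages=[{"role": "user", "content": "Hello"}],
)
print(chat.choices[0].message.content)
```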
Point-Based Billing
All API usage consumes points from your account:
- Check balance: Get Current Balance
- View history: Get Points History
- Purchase additional points: https://poe.com/api_key
Different models consume different amounts of points based on computational cost. See the List Models endpoint for pricing information.
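For example, the current balance endpoint listed above can be queried with any HTTP client. A minimal sketch, assuming the usage endpoints live on the same `api.poe.com` host and accept the same Bearer authentication; check the linked guides for the exact path and response fields:

```python
import os
import requests

headers = {"Authorization": f"Bearer {os.environ['POE_API_KEY']}"}

# Assumed full URL for the /usage/current_balance endpoint listed above;
# the response schema is not documented here, so just print the raw JSON.
response = requests.get(
    "https://api.poe.com/usage/current_balance",
    headers=headers,
    timeout=30,
)
response.raise_for_status()
print(response.json())
```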