Python SDK reference
`canopy-ai` is the official Python client. It requires Python 3.10+, uses `httpx` for transport, and passes `mypy --strict`. Install with `pip install canopy-ai`.
Constructor
```python
from canopy_ai import Canopy

canopy = Canopy(
    api_key="ak_live_…",   # required
    agent_id="agt_…",      # required for pay/preview/check/fetch/discover/ping/budget
    base_url=None,         # optional
    http_client=None,      # optional httpx.Client
)
```

| Argument | Type | Required | Description |
|---|---|---|---|
| `api_key` | `str` | Yes | Org API key |
| `agent_id` | `str \| None` | No* | Required for most methods |
| `base_url` | `str \| None` | No | Override base URL |
| `http_client` | `httpx.Client \| None` | No | Custom HTTP client (timeouts, proxies) |
Async client
For async frameworks (LangGraph, FastAPI, asyncio), use AsyncCanopy:
```python
from canopy_ai import AsyncCanopy

canopy = AsyncCanopy(
    api_key="ak_live_…",
    agent_id="agt_…",
)
result = await canopy.pay(to="0x...", amount_usd=0.10)
```

Same arguments. `http_client` accepts an `httpx.AsyncClient`. Every method becomes a coroutine.
pay()
```python
canopy.pay(
    *,
    to: str,
    amount_usd: float,
    idempotency_key: str | None = None,
    chain_id: int | None = None,
) -> PayResult
```

Return shapes (TypedDict, discriminated on `status`):
```python
# Allowed
{
    "status": "allowed",
    "tx_hash": str | None,
    "signature": str | None,
    "transaction_id": str | None,
    "cost_usd": float | None,
    "idempotent": bool,
    "dry_run": bool,
}

# Pending approval
{
    "status": "pending_approval",
    "approval_id": str,
    "transaction_id": str,
    "reason": str,
}

# Denied
{
    "status": "denied",
    "reason": str,
    "transaction_id": str,
}
```

preview()

Dry-run counterpart to `pay()`: runs the agent's policy and returns the same `PayResult` union without signing or settling.

```python
canopy.preview(*, to: str, amount_usd: float) -> PayResult
```

check()
URL-driven counterpart to preview(). Probes a paywalled URL, parses the 402 (x402 or MPP), runs the agent's policy in dry-run mode, and returns the parsed offer plus an allowed / pending_approval / denied verdict — without signing.
```python
canopy.check(url: str) -> CheckResult
```

`CheckResult` is a TypedDict union discriminated on `status`:

```python
{
    "status": "allowed",
    "rail": "x402" | "mpp",
    "chain_id": int,
    "amount_usd": float,
    "recipient": {"address": str, "slug": str | None, "name": str | None},
    "resource_url": str,
    "scheme": str | None,   # x402 only
    "network": str,         # "base", "tempo", "eip155:8453", …
    "realm": str | None,    # mpp only
    "cached": bool,         # True when served from the 60s probe cache
}
# Plus `reason` on pending/denied; `approval_threshold_usd` on pending.
```

The probe runs server-side (Canopy's egress, not yours) and is cached per `(org, url)` for 60 seconds. Async: `await async_canopy.check(url)`.
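A `check()` verdict can drive a simple preflight gate before you commit to a fetch. Below is a self-contained sketch over `CheckResult`-shaped dicts; the `summarize_check` helper and sample data are illustrative, not part of the SDK:

```python
def summarize_check(result: dict) -> str:
    """Turn a CheckResult-shaped dict into a one-line verdict for logs."""
    status = result["status"]
    if status == "allowed":
        who = result["recipient"].get("name") or result["recipient"]["address"]
        return f"allowed: ${result['amount_usd']:.2f} to {who} via {result['rail']}"
    if status == "pending_approval":
        return f"pending approval: {result['reason']}"
    return f"denied: {result['reason']}"

# Sample results shaped like the union above.
allowed = {
    "status": "allowed",
    "rail": "x402",
    "amount_usd": 0.05,
    "recipient": {"address": "0xabc", "slug": None, "name": "Example API"},
}
denied = {"status": "denied", "reason": "recipient not on allowlist"}

print(summarize_check(allowed))  # allowed: $0.05 to Example API via x402
print(summarize_check(denied))   # denied: recipient not on allowlist
```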
fetch()
```python
canopy.fetch(
    url: str,
    *,
    method: str = "GET",
    headers: dict | None = None,
    content: bytes | None = None,
    wait_for_approval: bool = True,
) -> httpx.Response
```

Auto-pays HTTP 402 responses; non-402 responses pass through unchanged.
get_approval_status() / wait_for_approval()
```python
canopy.get_approval_status(approval_id: str) -> ApprovalStatus

canopy.wait_for_approval(
    approval_id: str,
    *,
    timeout_ms: int = 300_000,
    poll_interval_ms: int = 2_000,
) -> ApprovalStatus
```

`ApprovalStatus` is a TypedDict with `status`, `decided_at`, `expires_at`, and `transaction_id`.
discover()
```python
canopy.discover(
    *,
    category: str | list[str] | None = None,
    query: str | None = None,
    limit: int = 20,
    include_blocked: bool = False,
    include_unverified: bool = False,
) -> list[DiscoveredService]
```

ping()

```python
canopy.ping() -> PingResult
```

Returns `{"ok": True, "agent": {...}, "org": {...}, "latency_ms": int}`.
budget()
```python
canopy.budget() -> BudgetSnapshot
```

Returns `{"agent_id", "cap_usd", "spent_usd", "remaining_usd", "period_hours", "period_resets_at"}`. `cap_usd` and `remaining_usd` are `None` when no policy is bound.
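A typical use is gating a spend on the snapshot before calling `pay()`. A self-contained sketch over a `BudgetSnapshot`-shaped dict; the `fits_budget` helper is illustrative, and it treats a `None` cap as "no policy bound, nothing to enforce client-side":

```python
def fits_budget(snapshot: dict, amount_usd: float) -> bool:
    """True when the spend fits the remaining budget (or no cap applies)."""
    remaining = snapshot["remaining_usd"]
    if remaining is None:  # no policy bound: no client-side cap to enforce
        return True
    return amount_usd <= remaining

snapshot = {"cap_usd": 10.0, "spent_usd": 9.95, "remaining_usd": 0.05}
print(fits_budget(snapshot, 0.05))  # True
print(fits_budget(snapshot, 0.10))  # False
```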
get_tools()
```python
canopy.get_tools() -> list[CanopyTool]
```

Returns the canonical tool list (`canopy_pay`, `canopy_discover_services`, `canopy_approve`, `canopy_deny`) as `[{name, description, parameters: JSONSchema, execute}]`. Use this as the framework-agnostic fallback. For OpenAI, Anthropic, LangChain, or the OpenAI Agents SDK, prefer the namespace methods or subpath imports below.
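In a framework-agnostic loop you typically look a tool up by name and invoke its `execute` with parsed arguments. A self-contained sketch over tool dicts shaped like the list above; the sample tool is a stub, not the real `canopy_pay`:

```python
import json

def run_tool(tools: list[dict], name: str, arguments: str) -> str:
    """Find a tool by name and invoke its execute callback with JSON args."""
    tool = next(t for t in tools if t["name"] == name)
    return tool["execute"](**json.loads(arguments))

# Stub tool shaped like a CanopyTool entry.
tools = [{
    "name": "canopy_pay",
    "description": "Pay a recipient",
    "parameters": {"type": "object", "properties": {"to": {"type": "string"}}},
    "execute": lambda to, amount_usd: json.dumps({"status": "allowed", "to": to}),
}]

print(run_tool(tools, "canopy_pay", '{"to": "0xabc", "amount_usd": 0.10}'))
```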
canopy.openai
OpenAI Chat Completions adapter. No peer dep.
```python
canopy.openai.tools() -> list[dict]
canopy.openai.dispatch(tool_calls) -> list[dict]
```

`tools()` returns `[{"type": "function", "function": {...}}]` ready for `chat.completions.create(tools=...)`. `dispatch()` consumes `completion.choices[0].message.tool_calls` (SDK objects or plain dicts) and returns `[{"role": "tool", "tool_call_id", "content"}]` messages for the next turn.
Skips non-Canopy tool calls; embeds errors as `{"error": "..."}` JSON. `pending_approval` outcomes preserve `recipient_name`, `amount_usd`, `expires_at`, `chat_approval_enabled`.
AsyncCanopy.openai.dispatch(...) is awaitable — same shape.
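The round trip can be exercised with plain dicts, since `dispatch()` accepts them as well as SDK objects. A self-contained sketch of the message shapes involved; the tool call payload is a sample, and this hand-rolls what a `dispatch()`-style adapter produces rather than calling the real one:

```python
import json

# A tool call as it appears in completion.choices[0].message.tool_calls.
tool_call = {
    "id": "call_1",
    "type": "function",
    "function": {"name": "canopy_pay", "arguments": '{"to": "0xabc", "amount_usd": 0.10}'},
}

# The tool message an adapter hands back for the next turn.
result = {"status": "allowed", "cost_usd": 0.10}
tool_message = {
    "role": "tool",
    "tool_call_id": tool_call["id"],
    "content": json.dumps(result),
}
print(tool_message["role"], tool_message["tool_call_id"])  # tool call_1
```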
canopy.anthropic
Anthropic Messages adapter. No peer dep.
```python
canopy.anthropic.tools() -> list[dict]
canopy.anthropic.dispatch(content) -> list[dict]
```

`tools()` returns `[{"name", "description", "input_schema"}]`. `dispatch()` consumes the assistant `reply.content` blocks and returns `[{"type": "tool_result", "tool_use_id", "content"}]` blocks to wrap in a user message.
AsyncCanopy.anthropic.dispatch(...) is awaitable.
For the Claude Agent SDK, use the Canopy MCP server instead (see Connect Claude Agent SDK). The Claude Agent SDK requires MCP tools to be allowed with names like `mcp__canopy__canopy_pay` or the wildcard `mcp__canopy__*`.
Subpath imports
Adapters that need framework-specific classes ship as subpath modules with optional peer deps. Install with the matching extra.
```python
from canopy_ai import Canopy
from canopy_ai.langchain import to_langchain_tools          # pip install 'canopy-ai[langchain]'
from canopy_ai.openai_agents import to_openai_agents_tools  # pip install 'canopy-ai[openai-agents]'

canopy = Canopy(api_key=..., agent_id=...)
lc_tools = to_langchain_tools(canopy)          # list[StructuredTool]
agents_tools = to_openai_agents_tools(canopy)  # list[FunctionTool]
```

Both helpers accept either `Canopy` or `AsyncCanopy` and bind the appropriate sync or async executor.
Errors
The error hierarchy is `CanopyError` → `CanopyConfigError`, `CanopyApiError`, `CanopyNetworkError`, `CanopyApprovalTimeoutError`. `pay()`, `preview()`, and `check()` never raise for `denied` or `pending_approval`; those come back as return values. See Errors for the full reference.