# Quick Start

Get observability for your AI agent fast. Keep your API key in environment variables and start tracing in minutes.

ZappyBee Beta is free while we collect feedback. Fair-use monthly limits apply to trace/step ingest to prevent abuse; if you exceed them, ingest endpoints return 429 with a reset time.
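If you hit a 429, wait until the reported reset before retrying. A capped exponential backoff works when no reset hint is available; the helper below is an illustrative sketch, not part of the SDK:

```typescript
// Sketch: pick a wait time (in ms) before retrying a 429 from the ingest API.
// Not part of the ZappyBee SDK; adapt to however you read the reset time.
function backoffMs(attempt: number, resetAfterSeconds?: number): number {
  // Honor the server's reset time when the 429 response includes one.
  if (resetAfterSeconds !== undefined) return resetAfterSeconds * 1000;
  // Otherwise back off exponentially: 1s, 2s, 4s, ... capped at 30s.
  return Math.min(1000 * 2 ** attempt, 30_000);
}
```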

## 1. Create a project

Sign up, create a project, and copy your API key from Settings.

## 2. Install the SDK

```bash
npm install zappybee
```

Or for Python:

```bash
pip install zappybee
```

## 3. Initialize ZappyBee

Recommended: store your API key in the ZAPPYBEE_API_KEY environment variable rather than hardcoding it.
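For example, in your shell profile or deployment environment (the key shown is the same placeholder used below):

```bash
export ZAPPYBEE_API_KEY="tc_live_..."
```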

```typescript
import { ZappyBee } from "zappybee";

ZappyBee.init({
  apiKey: process.env.ZAPPYBEE_API_KEY || "tc_live_...",
  // Optional. Defaults to http://localhost:3001
  baseUrl: process.env.ZAPPYBEE_BASE_URL || "https://tokencat-api-riuvb.ondigitalocean.app",
});
```

## 4. Auto-wrap or trace manually

Auto-wrap works for Anthropic and OpenAI. For Gemini, Grok, Mistral, Llama, DeepSeek, and anything else, use manual tracing.

Auto-wrap (Anthropic shown):

```typescript
import Anthropic from "@anthropic-ai/sdk";
import { ZappyBee } from "zappybee";

ZappyBee.init({
  apiKey: process.env.ZAPPYBEE_API_KEY || "tc_live_...",
  // Optional. Defaults to http://localhost:3001
  baseUrl: process.env.ZAPPYBEE_BASE_URL || "https://tokencat-api-riuvb.ondigitalocean.app",
});

const client = new Anthropic();
ZappyBee.wrap(client);

const msg = await client.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 512,
  messages: [{ role: "user", content: "Hello!" }],
});
```
Manual tracing (any provider):

```typescript
import { ZappyBee } from "zappybee";

ZappyBee.init({
  apiKey: process.env.ZAPPYBEE_API_KEY || "tc_live_...",
  // Optional. Defaults to http://localhost:3001
  baseUrl: process.env.ZAPPYBEE_BASE_URL || "https://tokencat-api-riuvb.ondigitalocean.app",
});

const trace = await ZappyBee.startTrace("my-agent");
const step = await trace.startStep("llm", "llm_call", { model: "gemini-2.5-pro" });

// ... call your provider here and capture its response as `result` ...
await step.end({
  status: "success",
  model: "gemini-2.5-pro",
  output: result,
  promptTokens: 123,
  completionTokens: 456,
});
await trace.end({ status: "success" });
```

## 5. Handle errors (so you can debug long agents)

For long-running agents, wrap each step and the whole trace in try/catch and always call end(), recording an error status on failure. This makes it obvious where the agent failed.

```typescript
const trace = await ZappyBee.startTrace("my-agent");

try {
  const step = await trace.startStep("llm-call", "llm_call", { model: "gpt-4o" });
  try {
    // ... your LLM call here; capture its response as `result` ...
    await step.end({ status: "success", model: "gpt-4o", output: result, promptTokens: 123, completionTokens: 456 });
  } catch (e) {
    await step.end({ status: "error", model: "gpt-4o", errorMessage: String(e) });
    throw e;
  }
  await trace.end({ status: "success" });
} catch (e) {
  await trace.end({ status: "error", errorMessage: String(e) });
  throw e;
}
```
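If you trace many calls, the nested try/catch can be factored into a small helper. This is a sketch built on the step API shown in this guide; the `Trace` and `Step` interfaces below are assumptions for illustration, not types exported by the SDK:

```typescript
// Minimal shapes matching the step API used in this guide (assumed, not
// the SDK's exported types).
interface Step {
  end(fields: Record<string, unknown>): Promise<void>;
}
interface Trace {
  startStep(name: string, type: string, meta: Record<string, unknown>): Promise<Step>;
}

// Run `fn` inside a step, ending it with success or error either way.
async function withStep<T>(
  trace: Trace,
  name: string,
  meta: Record<string, unknown>,
  fn: () => Promise<T>
): Promise<T> {
  const step = await trace.startStep(name, "llm_call", meta);
  try {
    const output = await fn();
    await step.end({ ...meta, status: "success", output });
    return output;
  } catch (e) {
    await step.end({ ...meta, status: "error", errorMessage: String(e) });
    throw e; // Re-throw so the surrounding trace-level handler sees it.
  }
}
```

Each call site then reduces to something like `await withStep(trace, "llm-call", { model: "gpt-4o" }, () => client.messages.create(...))`.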

## Support

Questions, feedback, or need higher beta limits? Email contact.support@zappybee.dev.