
Changelog

0.1.0 — 2026-04-24

Initial release. Published as bolder-ai on PyPI.

Included

  • beval.init() with env-driven config (BEVAL_API_KEY, BEVAL_API_URL, BEVAL_PROJECT_ID, BEVAL_DEFAULT_MODEL_ID, BEVAL_DEBUG).
  • beval.log() — fire-and-forget single-log ingest.
  • beval.wrap() — auto-instrument for OpenAI (chat.completions.create) and Anthropic (messages.create).
  • @beval.trace — sync and async function decorator logging as kind="agent".
  • Background thread worker with retry (408/429/5xx, exponential backoff).
  • Graceful atexit shutdown, explicit flush() / shutdown() APIs.
  • Redaction hook (redact= on init()).
  • VLM image ingest — bytes / base64 / data: / http(s) URLs auto-normalized.
  • Drop-on-overflow queue with configurable size.
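A minimal quickstart tying a few of these pieces together. This is a hedged sketch: the env var names come from the list above, but the exact signatures of `beval.init()` / `beval.log()` are not spelled out here, so the SDK calls are shown commented out. The redaction hook is a plain callable, which is the one part that can be exercised on its own.

```python
import os

# Keys whose values should never reach the ingest endpoint (illustrative set).
SENSITIVE_KEYS = {"api_key", "authorization", "ssn"}

def scrub(payload: dict) -> dict:
    """Redaction hook: replace values of sensitive keys before ingest."""
    return {
        k: ("[REDACTED]" if k.lower() in SENSITIVE_KEYS else v)
        for k, v in payload.items()
    }

# Config is env-driven; these variable names are the ones listed above.
os.environ.setdefault("BEVAL_API_KEY", "sk-example")
os.environ.setdefault("BEVAL_PROJECT_ID", "proj_123")

# import beval
# beval.init(redact=scrub)   # redact= hook from the Included list
# beval.log(...)             # fire-and-forget single-log ingest
# beval.flush()              # explicit flush before exit (atexit also runs)
```

Because the hook runs on every payload, keeping it a pure function over a dict (no I/O, no mutation) keeps the background worker fast and predictable.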

Known limitations

  • Streaming responses (stream=True) aren’t captured by the wrappers.
  • Async clients (AsyncOpenAI, AsyncAnthropic) aren’t wrapped.
  • Tool / function calling metadata is captured in extra for OpenAI only.
  • No batch ingest; one HTTP POST per log.
  • No nested trace / span support; @beval.trace produces flat logs.
  • Not fork-safe by default (see Reliability → Fork safety).

Roadmap

Up next (0.2)

  • Streaming support. Wrap stream iterators; capture output on stream end. Add ttft_ms as a first-class metric.
  • Async clients. AsyncOpenAI and AsyncAnthropic auto-wrapping.
  • Anthropic tool-use capture. Currently only OpenAI.
  • Batch ingest. Drain the queue in chunks once the gateway’s POST /api/v1/logs/ingest/batch lands. No SDK API change, just a dial.
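Batch ingest, when the gateway endpoint lands, amounts to draining the worker queue in fixed-size chunks and issuing one POST per chunk instead of one per log. A sketch of that drain loop (function and parameter names are hypothetical, not SDK API):

```python
from queue import Empty, Queue

def drain_in_chunks(q: Queue, chunk_size: int = 50) -> list[list]:
    """Drain a queue into chunks of at most chunk_size items each.

    Each returned chunk would back a single batch POST; order is preserved.
    """
    batches = []
    while True:
        chunk = []
        try:
            while len(chunk) < chunk_size:
                chunk.append(q.get_nowait())
        except Empty:
            pass  # queue exhausted; keep whatever partial chunk we have
        if not chunk:
            break
        batches.append(chunk)
    return batches
```

This is why no SDK API change is needed: the worker already owns the queue, so batching is purely an internal change to how it drains.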

Beyond

  • Traces and spans — beval.trace_context() context manager, nested @trace, parent/child tree in the dashboard.
  • LiteLLM CustomLogger — official callback integration.
  • LangChain BaseCallbackHandler — per-chain-step spans.
  • LlamaIndex callback manager — per-step spans.
  • Direct-to-S3 image upload — skip inline base64 for large VLM payloads.
  • Server-side SDK config — pull sampling / redaction rules from the gateway at init.
  • OpenTelemetry bridge — OTLP HTTP/JSON export mapping GenAI semconv to BEval logs and spans.

Semver policy

  • Patch (0.1.0 → 0.1.1): bug fixes, dependency bumps, non-breaking additions to extra field conventions.
  • Minor (0.1 → 0.2): new public APIs, new integrations, new optional config fields. Existing code keeps working.
  • Major (0.x → 1.0): breaking changes to public APIs. Will come with a migration guide.

Before 1.0, the SDK commits to:

  • Never renaming or removing fields on beval.log().
  • Never changing the default behavior of beval.init() without an env var / flag opt-out.
  • Never breaking the /api/v1/logs/ingest contract — any new required fields would be opt-in via a new endpoint.