Migration Guide

Langfuse Joined ClickHouse.
Your Observability Doesn’t
Have to Stop.

ClickHouse acquired Langfuse in January 2026 as part of their $400M Series D. Langfuse’s roadmap is now tied to database infrastructure. Curate-Me gives you observability with governance built in -- migrate with one base URL change.

Replace the Langfuse callback with a base URL swap. Keep your tracing. Add governance.


What the ClickHouse acquisition means

Langfuse had 2,000+ paying customers and 26M+ SDK installs per month. Now its product direction is part of ClickHouse’s $15B database infrastructure vision -- not independent AI governance.

Langfuse (Acquired by ClickHouse)

Part of ClickHouse’s $15B ecosystem

  • Product direction tied to ClickHouse corporate priorities
  • Observability-only -- no gateway proxy or policy enforcement
  • No managed runners or sandbox execution
  • Self-hosting requires maintaining your own infrastructure
  • Independence lost -- roadmap decisions made by database company

Curate-Me (Active Development)

Independent platform, weekly releases

  • Observability + governance in one platform
  • 5-step governance chain on every proxied request
  • Managed SaaS -- no infrastructure to maintain
  • BYOVM option for on-prem requirements
  • Independent company -- no acquisition risk

Feature mapping: Langfuse to Curate-Me

Langfuse was the strongest open-source LLM observability tool. Here is how each feature maps to Curate-Me -- plus the governance features Langfuse never built.

Langfuse Feature → Curate-Me Equivalent

Tracing → W3C Trace Context + distributed tracing
Every gateway request gets a W3C-compliant trace ID. Full request/response capture with latency breakdown and cost attribution.

Cost tracking → Real-time cost tracking with per-request attribution
Redis-backed real-time accumulator plus MongoDB audit log. Costs are tracked per-request, per-key, per-org, and per-fleet.

Prompt management → Governance policies + model allowlists
Instead of versioned prompts, enforce which models and providers each team can use. Governance policies control cost caps, rate limits, and personal data rules.

Evaluations → Model recommendation engine
Cost-aware model optimizer analyzes usage patterns and recommends cheaper models that maintain quality. Automated A/B testing support.

Datasets → Usage analytics + cost breakdowns
Full analytics dashboard with cost trends, model usage distribution, latency percentiles, and governance event logs.

Self-hosting → Managed platform (or BYOVM)
Fully managed SaaS. For teams that need on-prem, BYOVM (Bring Your Own VM) lets you run managed runners on your infrastructure.
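The W3C Trace Context IDs mentioned in the mapping above follow a fixed header format: `version-trace_id-parent_id-flags`. As a minimal sketch (generic to the W3C spec, not a Curate-Me API), generating a `traceparent` value looks like this:

```python
import secrets

def make_traceparent() -> str:
    """Build a W3C Trace Context `traceparent` header value.

    Format: version(2 hex)-trace_id(32 hex)-parent_id(16 hex)-flags(2 hex).
    """
    trace_id = secrets.token_hex(16)   # 16 random bytes -> 32 hex chars
    parent_id = secrets.token_hex(8)   # 8 random bytes -> 16 hex chars
    return f"00-{trace_id}-{parent_id}-01"  # version 00, sampled flag 01

print(make_traceparent())
```

Any service behind the gateway can propagate this header to correlate its own spans with the gateway's trace.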

Migrate in 4 steps

Replace the Langfuse callback handler with a base URL swap. Remove the dependency. Configure governance.

Sign up at dashboard.curate-me.ai

Create your account in 30 seconds. Free tier includes 10K requests per month. No credit card required.

Replace Langfuse callback handler with gateway base URL swap

Instead of wrapping your LLM client with a Langfuse callback, change OPENAI_BASE_URL to https://api.curate-me.ai/v1/openai and add the X-CM-API-Key header. The gateway captures everything Langfuse traced -- and enforces governance on top.
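At the HTTP level, the swap amounts to pointing the OpenAI-compatible endpoint at the gateway and attaching the extra header. A sketch with a raw request payload (the `CM_API_KEY` environment variable name here is illustrative, not part of either product):

```python
import os

# Gateway endpoint from the guide; replaces https://api.openai.com/v1
BASE_URL = "https://api.curate-me.ai/v1/openai"

headers = {
    "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
    "X-CM-API-Key": os.environ.get("CM_API_KEY", ""),  # gateway key
    "Content-Type": "application/json",
}
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}],
}
# With the `requests` package installed, the call itself would be:
# resp = requests.post(f"{BASE_URL}/chat/completions", headers=headers, json=payload)
```

The request body is unchanged from a direct OpenAI call; only the base URL and the extra header differ.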

Remove the langfuse package dependency

Uninstall langfuse and langfuse-langchain from your project. The gateway handles tracing, cost tracking, and observability without any client-side SDK.
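After `pip uninstall langfuse langfuse-langchain`, a quick sanity check confirms the removal took effect. This is a generic importability check, not a Curate-Me API:

```python
import importlib.util

def leftover_packages(names=("langfuse",)):
    """Return any of the given packages that are still importable."""
    return [n for n in names if importlib.util.find_spec(n) is not None]

print(leftover_packages())
```

An empty list means the dependency is gone and the gateway is the only tracing path left.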

Configure governance policies in the dashboard

Set daily budgets, model allowlists, personal data scanning rules, and rate limits. Default policies (100 RPM, $10/day budget, personal data scan enabled) are applied automatically on signup.
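The signup defaults above can be pictured as a single policy object. This sketch is illustrative only; the field names are assumptions, not the dashboard's actual schema:

```python
# Hypothetical representation of the default policy applied on signup.
default_policy = {
    "rate_limit_rpm": 100,          # 100 requests per minute
    "daily_budget_usd": 10.00,      # $10/day spend cap
    "personal_data_scan": True,     # enabled by default
    "model_allowlist": ["gpt-4o"],  # example entry; configured per team
}
```

Tightening any of these in the dashboard applies to every request the gateway proxies.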

LangChain integration: before and after

Before (Langfuse callback)
from langfuse.callback import CallbackHandler
from langchain_openai import ChatOpenAI

handler = CallbackHandler(
    public_key="pk-lf-xxx",
    secret_key="sk-lf-xxx",
    host="https://cloud.langfuse.com",
)

llm = ChatOpenAI(
    model="gpt-4o",
    callbacks=[handler],
)
res = llm.invoke("Hello")
Tracing only. No governance. Requires langfuse package.
After (Curate-Me gateway)
# No langfuse import needed
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    base_url="https://api.curate-me.ai/v1/openai",
    default_headers={"X-CM-API-Key": "cm_sk_xxx"},
)
res = llm.invoke("Hello")
Full governance chain. No client SDK. Zero dependencies.

Remove the callback handler, add two parameters, and drop the langfuse dependency.


5-Minute Migration

Start Free Migration

Replace the Langfuse callback with a base URL swap. Get cost enforcement, personal data scanning, model allowlists, human approvals, and managed runners -- all without a client-side SDK.

10K requests/month free · No credit card required · No client SDK required