
Langfuse Alternatives 2026
After the ClickHouse Acquisition

Langfuse was the leading open-source LLM observability platform with 2,000+ paying customers. ClickHouse acquired it in January 2026, alongside its $400M Series D at a $15B valuation. If you need an independent platform with observability and governance, here are the alternatives.

Curate-Me vs Portkey vs LiteLLM -- feature comparison, migration steps, and FAQ.


What Langfuse did well

Langfuse set the standard for open-source LLM observability. Any replacement should match these strengths -- and ideally go further.

Best-in-Class Tracing

Langfuse's distributed tracing for LLM pipelines was unmatched. Nested spans, token-level cost attribution, and rich metadata made debugging agent workflows straightforward.

Open-Source Foundation

With 26M+ SDK installs per month, Langfuse built the largest open-source LLM observability community. Self-hosting was a first-class option, giving teams full data control.

Prompt Management

Version-controlled prompts with A/B testing and rollback. Teams could iterate on prompts without redeploying code -- a workflow that many teams built around.

Evaluation Framework

Built-in scoring, human annotation, and automated evaluation pipelines. Langfuse made it easy to measure LLM output quality at scale.


Why Langfuse users are looking for alternatives

The ClickHouse acquisition changed the trajectory of Langfuse. These are the concerns driving teams to evaluate alternatives.

Product direction tied to database infrastructure

ClickHouse is a database company. Langfuse's roadmap will prioritize features that drive ClickHouse adoption (data ingestion, analytics queries, storage) rather than AI governance or agent execution capabilities.

Loss of independence

Langfuse was an independent company making product decisions purely for its users. Now it is part of a $15B enterprise with different strategic priorities. Feature requests go through ClickHouse's corporate planning process.

Observability alone is not enough in 2026

When Langfuse launched, observability was the gap. Now the gap is governance -- cost enforcement, personal data scanning, model access controls, human approvals. Tracing tells you what happened; governance controls what happens.

No managed execution environment

As AI agents evolve from simple API calls to complex workflows with browser automation, file operations, and multi-agent coordination, teams need managed runners with sandbox isolation. Langfuse never offered this.

Open-source future is uncertain

While the Langfuse repo remains open-source, corporate acquisitions often lead to license changes, feature gating, or eventual sunsetting of standalone products. Teams that built on Langfuse's open-source model face uncertainty.

Langfuse alternatives compared

A side-by-side comparison of the three main Langfuse alternatives: Curate-Me, Portkey, and LiteLLM.

| Feature | Curate-Me | Portkey | LiteLLM |
| --- | --- | --- | --- |
| LLM Observability / Tracing | Yes (Observer SDK) | Partial (logging) | No |
| AI Gateway Proxy | Yes (50+ providers) | Yes (200+ models) | Yes (100+ models) |
| Independent Platform | Yes | Yes | Yes (OSS) |
| Managed Service | Yes | Yes | No |
| Cost Tracking | Yes + budget enforcement | Partial (logging) | — |
| Budget Enforcement | Yes (per-org daily caps) | Partial (budget limits) | — |
| Personal Data Scanning | Yes | No | No |
| Prompt Management | No | Partial | No |
| Evaluations / Scoring | No | No | No |
| Human-in-the-Loop | Yes | No | No |
| Rate Limiting | Yes (per-org, per-key) | — | — |
| Managed Runners | Yes (OpenClaw-native) | No | No |
| Immutable Audit Trail | Yes | Partial | No |
| Time-Travel Debugging | Yes | No | No |
| EU AI Act Ready | Yes | No | No |
| Self-Hosted Option | Coming soon | — | Yes (primary model) |
| Free Tier | 1K req/day | 10K req/mo | Unlimited (self-hosted) |
| Pricing | $49/mo Starter | $49/mo | Free (OSS) / Enterprise |

Comparison based on publicly available documentation as of March 2026.


The three alternatives in detail

Curate-Me -- Best for observability + governance + runners

Independent platform, weekly releases

Curate-Me is the only alternative that combines observability with a full governance chain and managed OpenClaw runners. While it does not replicate Langfuse’s prompt management and evaluation features, it goes far beyond observability with cost enforcement, personal data scanning, model allowlists, human-in-the-loop approvals, sandbox execution, and EU AI Act compliance. Curate-Me can also run alongside your existing Langfuse SDK if you want to keep tracing while adding governance.

Portkey -- Best for gateway routing + basic observability

$18M funding, active development

Portkey offers an AI gateway with routing, load balancing, and observability features. It provides request logging and basic cost tracking similar to what Langfuse offered. However, it lacks Langfuse's depth in tracing (no nested spans), offers only limited prompt management, and has no evaluation framework, no personal data scanning, and no managed runners.

Curate-Me vs Portkey

LiteLLM -- Best for self-hosted, open-source proxy

Open-source, self-hosted

LiteLLM is an open-source Python proxy with a unified OpenAI-compatible interface to 100+ models. Like Langfuse, it is open-source and self-hostable. Unlike Langfuse, it does not offer tracing, prompt management, or evaluations. It is best for teams that want a free model routing layer and are comfortable managing their own infrastructure. LiteLLM is adding RBAC and A2A features but remains primarily a proxy.

Curate-Me vs LiteLLM

Add governance to your Langfuse stack in 5 minutes

Keep your Langfuse tracing. Curate-Me works as a complementary gateway proxy. Add governance without replacing your observability setup.

Sign up for free at curate-me.ai

Create your account in 30 seconds. No credit card required.

Get your API key from the dashboard

Your API key and gateway URL are shown immediately after signup.

Point your AI SDK at the Curate-Me gateway

Change OPENAI_BASE_URL to api.curate-me.ai/v1/openai. Your Langfuse tracing SDK continues to work alongside it -- the two are complementary.

That's it -- governance + tracing, together

Your existing Langfuse traces keep flowing. Now every request also passes through cost enforcement, personal data scanning, rate limiting, and human approvals.

Before (Langfuse only)

OPENAI_BASE_URL=https://api.openai.com/v1
# Langfuse SDK traces calls
# No governance, no cost caps

Tracing and observability only. No policy enforcement.

After (Curate-Me + Langfuse)

OPENAI_BASE_URL=https://api.curate-me.ai/v1/openai
X-CM-API-Key: cm_sk_xxx
# Langfuse SDK still traces calls

Full governance chain + your existing Langfuse tracing.
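The swap above can be sketched end to end in Python. This is a minimal illustration, not official SDK code: the gateway URL and the X-CM-API-Key header come from this page, while build_chat_request, the CM_API_KEY variable, and the placeholder key values are hypothetical names used only for the sketch.

```python
import json
import os

# Gateway endpoint from this page; governance is enforced at this proxy layer.
CURATE_ME_BASE = "https://api.curate-me.ai/v1/openai"

def build_chat_request(model, prompt):
    """Assemble (url, headers, body) for a chat completion sent via the gateway."""
    url = f"{CURATE_ME_BASE}/chat/completions"
    headers = {
        # Provider key passes through unchanged.
        "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', 'sk-placeholder')}",
        # Gateway key from the Curate-Me dashboard (placeholder shown).
        "X-CM-API-Key": os.environ.get("CM_API_KEY", "cm_sk_xxx"),
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": [{"role": "user", "content": prompt}]})
    return url, headers, body

url, headers, body = build_chat_request("gpt-4o-mini", "Hello")
print(url)  # the only change from a direct OpenAI call is this base URL
```

Any HTTP client (requests, httpx, or the OpenAI SDK's base_url parameter) can send this request; the Langfuse SDK keeps tracing in-process either way.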

Frequently asked questions

What happened to Langfuse?

Langfuse was acquired by ClickHouse in January 2026 as part of ClickHouse's $400M Series D at a $15B valuation. Langfuse had 2,000+ paying customers and 26M+ SDK installs per month, used by 19 of the Fortune 50. Its product direction is now tied to ClickHouse's database infrastructure vision.

Is Langfuse still open source?

The Langfuse open-source repository remains available as of March 2026. However, the team is now part of ClickHouse, and the product roadmap is driven by ClickHouse corporate priorities. The long-term future of the standalone open-source project depends on ClickHouse's strategic decisions.

What is the best Langfuse alternative in 2026?

The best Langfuse alternative depends on your needs. Curate-Me is the best choice if you need observability plus governance (cost enforcement, personal data scanning, human approvals) and managed runners. Portkey is a good option if you only need gateway routing and basic observability. LiteLLM is best if you want a free, self-hosted proxy.

Can I use Curate-Me alongside Langfuse?

Yes. Curate-Me works as a gateway proxy that sits between your app and LLM providers. If you want to keep your Langfuse tracing SDK while adding governance, you can run both: the Langfuse SDK traces calls from your application, while Curate-Me enforces governance policies at the proxy layer. They are complementary.
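The division of labor can be illustrated with a small, dependency-free sketch. The trace_call decorator below is a stand-in for what an in-process tracing SDK such as Langfuse does, and the gateway URL marks where Curate-Me's governance would apply; none of this is official SDK code.

```python
import functools
import time

# Governance layer: requests would go through this proxy URL (from this page).
GATEWAY_URL = "https://api.curate-me.ai/v1/openai"

def trace_call(fn):
    """Dependency-free stand-in for in-process tracing (the Langfuse SDK's role)."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.monotonic()
        result = fn(*args, **kwargs)
        # A real tracing SDK would ship this span to its backend.
        wrapper.last_trace = {"fn": fn.__name__, "seconds": time.monotonic() - start}
        return result
    return wrapper

@trace_call
def chat(prompt: str) -> str:
    # A real implementation would POST to f"{GATEWAY_URL}/chat/completions";
    # stubbed here so the sketch runs without network access.
    return f"echo: {prompt}"

print(chat("Hello"))  # -> echo: Hello (traced in-process, governed at the proxy)
```

The point of the sketch: tracing wraps the call inside your process, while governance sits at the URL the call travels through, so neither layer has to know about the other.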

How do I migrate from Langfuse to Curate-Me?

Migration takes about 5 minutes. Sign up at curate-me.ai, get your API key, then set OPENAI_BASE_URL to api.curate-me.ai/v1/openai and add the X-CM-API-Key header. If you were using Langfuse as a proxy, remove the Langfuse proxy configuration. If you were using the Langfuse SDK for tracing only, you can keep it running alongside Curate-Me.
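As a rough sketch of that answer in code (only the base URL and header name come from this page; the CM_API_KEY variable and placeholder value are hypothetical):

```python
import os

# Before the migration (direct provider access):
# os.environ["OPENAI_BASE_URL"] = "https://api.openai.com/v1"

# After: point the SDK at the Curate-Me gateway and keep the gateway key handy.
os.environ["OPENAI_BASE_URL"] = "https://api.curate-me.ai/v1/openai"
os.environ["CM_API_KEY"] = "cm_sk_xxx"  # placeholder; real key comes from the dashboard

# Clients that read OPENAI_BASE_URL (e.g. the OpenAI Python SDK) now route
# through the gateway; a Langfuse tracing SDK can keep running unchanged.
print(os.environ["OPENAI_BASE_URL"])
```

The X-CM-API-Key header itself is attached per request (for example via the OpenAI SDK's default_headers), as shown in the migration steps above.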

Does Curate-Me have prompt management like Langfuse?

Curate-Me focuses on governance and execution rather than prompt management. For prompt versioning and evaluation, you can use Curate-Me alongside dedicated prompt management tools. Curate-Me's governance chain adds cost enforcement, personal data scanning, model allowlists, and human approvals that Langfuse never offered.

Is there a free open-source Langfuse alternative?

LiteLLM is a free open-source LLM proxy that you can self-host. It provides model routing and basic logging but does not include tracing, evaluations, or governance features. Curate-Me offers a free tier with 1,000 requests per day with full governance chain access. For self-hosted tracing, you can still run the Langfuse open-source repo alongside Curate-Me.

Start in 5 Minutes

The Best Langfuse Alternative
Is Already Running

Swap one base URL. Get cost enforcement, personal data scanning, model allowlists, human approvals, managed runners, and an immutable audit trail -- all complementary to your existing Langfuse tracing.

1K requests/day free·No credit card required·Works alongside Langfuse tracing