
Helicone Alternatives 2026
After the Mintlify Acquisition

Helicone was acquired by Mintlify on March 3, 2026 and entered maintenance mode. If you depended on Helicone for AI gateway proxying, cost tracking, or request logging, here are the alternatives that are still actively developed.

Curate-Me vs Portkey vs LiteLLM -- feature comparison, migration steps, and FAQ.


What Helicone did well

Before evaluating alternatives, it is worth acknowledging what made Helicone a strong product. Any replacement should cover these capabilities at minimum.

Fast Rust-Based Proxy

Helicone's gateway was built in Rust, delivering low-latency request proxying. It sat between your app and OpenAI/Anthropic with minimal overhead.

Clean Request Logging

Every LLM request and response was logged with latency, token counts, and cost. The dashboard made it easy to search and filter request history.

Simple Cost Tracking

Automatic cost calculation per request based on model and token counts. Dashboards showed spend over time, by model, and by user.
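Per-request cost calculation of this kind is simple to reproduce. A minimal sketch, assuming hypothetical model names and per-token prices (real prices vary by model and change over time):

```python
# Per-request cost calculation, as gateways like Helicone compute it.
# Model names and prices are HYPOTHETICAL placeholders (USD per 1M tokens).
PRICES_PER_1M = {
    "example-model-small": {"input": 0.50, "output": 1.50},
    "example-model-large": {"input": 5.00, "output": 15.00},
}

def request_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Cost in USD for one request, based on model and token counts."""
    p = PRICES_PER_1M[model]
    return (prompt_tokens * p["input"] + completion_tokens * p["output"]) / 1_000_000

# 1,000 prompt tokens + 500 completion tokens on the small model:
# 1000 * 0.50/1M + 500 * 1.50/1M = 0.0005 + 0.00075 = 0.00125 USD
```

Summing these per-request costs by model or by user is all a spend dashboard needs.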

Prompt Caching

Helicone offered response caching to reduce costs on repeated identical prompts. A useful optimization for development and testing workflows.


Why Helicone users are looking for alternatives

The acquisition changed what Helicone is. These are the concerns driving teams to evaluate alternatives.

Maintenance mode means no new features

Mintlify's roadmap is documentation tools, not AI governance. Helicone will receive security patches but no new integrations, no new providers, and no feature development.

The AI landscape is evolving fast

New models, new providers, and new patterns (agent workflows, A2A communication, orchestration) require active development. A maintenance-mode proxy cannot keep up.

Observability alone is not enough

Helicone logged what happened. It never controlled what happened. Modern AI stacks need cost enforcement, personal data scanning, model access controls, and human approvals.
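To make "controlling what happened" concrete, here is a toy pre-flight check in the spirit of such governance chains. The regex, budget cap, and allowlist are illustrative only, not any vendor's actual rules:

```python
import re

# Illustrative governance policy -- values are placeholders.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
DAILY_BUDGET_USD = 10.00  # hypothetical per-org daily cap

def preflight(prompt: str, spent_today_usd: float, model: str,
              allowed_models: set) -> list:
    """Return policy violations; an empty list means the request may proceed."""
    violations = []
    if EMAIL_RE.search(prompt):
        violations.append("personal data: email address detected")
    if spent_today_usd >= DAILY_BUDGET_USD:
        violations.append("budget: daily cap reached")
    if model not in allowed_models:
        violations.append(f"model {model!r} not on allowlist")
    return violations
```

A pure observability tool logs the request either way; a governance gateway evaluates checks like these before the request ever reaches the provider.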

No managed execution environment

As AI agents move beyond API calls to browser automation, file operations, and tool use, teams need managed runners with sandbox isolation. Helicone never offered this.

Vendor lock-in risk

Depending on an acquired product means depending on the acquirer's priorities. Mintlify could sunset Helicone entirely or fold it into a documentation-specific product.

Helicone alternatives compared

A side-by-side comparison of the three main Helicone alternatives: Curate-Me, Portkey, and LiteLLM.

| Feature | Curate-Me | Portkey | LiteLLM |
|---|---|---|---|
| AI Gateway Proxy | Yes (50+ providers) | Yes (200+ models) | Yes (100+ models) |
| Active Development | Yes (weekly releases) | Yes | Yes (OSS community) |
| Managed Service | Yes | Yes | No |
| Cost Tracking | Yes + budget enforcement | Partial (logging) | — |
| Budget Enforcement | Yes (per-org daily caps) | Partial (budget limits) | — |
| Personal Data Scanning | Yes | No | No |
| Model Allowlists | Yes | No | No |
| Human-in-the-Loop | Yes | No | No |
| Rate Limiting | Yes (per-org, per-key) | — | — |
| Managed Runners | Yes (OpenClaw-native) | No | No |
| Immutable Audit Trail | Yes | Partial | No |
| Time-Travel Debugging | Yes | No | No |
| EU AI Act Ready | Yes | No | No |
| Self-Hosted Option | Coming soon | — | Yes (primary model) |
| Free Tier | 1K req/day | 10K req/mo | Unlimited (self-hosted) |
| Pricing | $49/mo Starter | $49/mo | Free (OSS) / Enterprise |

Comparison based on publicly available documentation as of March 2026. Dashes mark cells that could not be confirmed from public documentation.


The three alternatives in detail

Curate-Me -- Best for governance + managed runners

Independent platform, weekly releases

Curate-Me is the only alternative that combines AI gateway proxying with a full governance chain and managed OpenClaw runners. It covers everything Helicone did (request logging, cost tracking, caching) and adds cost enforcement, personal data scanning, model allowlists, human-in-the-loop approvals, sandbox execution, and EU AI Act compliance. Migrating from Helicone is a base URL and auth header swap.

Portkey -- Best for gateway routing only

$18M funding, active development

Portkey is a well-funded AI gateway with strong routing, load balancing, and fallback capabilities. It supports 200+ models and offers a free gateway tier. However, it lacks personal data scanning, human-in-the-loop approvals, managed runners, and compliance tooling. If you need only gateway proxying without governance, Portkey is a solid option.

Curate-Me vs Portkey

LiteLLM -- Best for self-hosted, no-frills proxy

Open-source, self-hosted

LiteLLM is an open-source Python proxy that provides a unified OpenAI-compatible interface to 100+ models. It is free and self-hosted, making it a good fit for teams that want full control over their infrastructure. It lacks a managed offering, governance features, personal data scanning, human approvals, and audit trails.

Curate-Me vs LiteLLM

Migrate from Helicone in 5 minutes

No SDK changes. No code refactoring. Just swap one base URL and one auth header.

Sign up for free at curate-me.ai

Create your account in 30 seconds. No credit card required.

Get your API key from the dashboard

Your API key and gateway URL are shown immediately after signup.

Replace your Helicone base URL

Change OPENAI_BASE_URL from https://oai.helicone.ai/v1 to https://api.curate-me.ai/v1/openai. Swap the Helicone-Auth header for X-CM-API-Key.

That's it -- governance from day one

Your existing SDK calls work unchanged. Now every request passes through cost enforcement, personal data scanning, and rate limiting.

Before (Helicone):

OPENAI_BASE_URL=https://oai.helicone.ai/v1
Helicone-Auth: Bearer sk_xxx

Logging and cost tracking. Maintenance mode.

After (Curate-Me):

OPENAI_BASE_URL=https://api.curate-me.ai/v1/openai
X-CM-API-Key: cm_sk_xxx

Full governance chain. Active development. 50+ providers.
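In Python with the OpenAI SDK, the swap above is a client-configuration change. A sketch of the post-migration settings (the gateway URL and X-CM-API-Key header come from the steps above; the key values are placeholders, and whether the provider key is still read from OPENAI_API_KEY is an assumption):

```python
import os

# Before (Helicone):
#   client = OpenAI(base_url="https://oai.helicone.ai/v1",
#                   default_headers={"Helicone-Auth": "Bearer sk_xxx"})

def curate_me_client_kwargs(cm_api_key: str) -> dict:
    """Keyword arguments for openai.OpenAI() after the migration.
    Only the base URL and auth header change; SDK calls stay the same."""
    return {
        "base_url": "https://api.curate-me.ai/v1/openai",
        # Assumed: provider key still supplied as before via OPENAI_API_KEY.
        "api_key": os.environ.get("OPENAI_API_KEY", "unused"),
        "default_headers": {"X-CM-API-Key": cm_api_key},
    }

# client = OpenAI(**curate_me_client_kwargs("cm_sk_xxx"))
# client.chat.completions.create(...)  # unchanged call sites
```

Every existing call site keeps working because the gateway speaks the OpenAI-compatible API at the new base URL.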

Frequently asked questions

What happened to Helicone?

Helicone was acquired by Mintlify on March 3, 2026. Mintlify is a documentation platform company. After the acquisition, Helicone entered maintenance mode -- the core team was reassigned to Mintlify products and only security patches are being released. No new features are planned for the standalone Helicone product.

Is Helicone still working?

Yes, Helicone is still operational as of March 2026. Existing users can continue using the service, but it is in maintenance mode. Security updates are being applied, but no new features, integrations, or improvements are being developed. Teams that depend on an actively maintained AI gateway should plan a migration.

What is the best Helicone alternative in 2026?

The best Helicone alternative depends on your needs. Curate-Me is the best choice if you need a full governance chain (cost enforcement, personal data scanning, human approvals) plus managed runners. Portkey is a good option if you only need gateway routing and load balancing. LiteLLM is best if you want a free, self-hosted proxy with no governance features.

How do I migrate from Helicone to Curate-Me?

Migration takes about 5 minutes. Sign up at curate-me.ai, get your API key, then replace your Helicone base URL (oai.helicone.ai) with your Curate-Me gateway URL (api.curate-me.ai/v1/openai). Swap the Helicone-Auth header for X-CM-API-Key. No SDK or code changes are needed.

Does Curate-Me support all the providers Helicone supported?

Yes. Curate-Me supports 50+ LLM providers including OpenAI, Anthropic, Google, DeepSeek, Groq, Mistral, xAI, and more. Helicone supported approximately 15 providers. Your existing provider integrations will work with Curate-Me with no changes beyond the base URL swap.

Is there a free Helicone alternative?

Yes. Curate-Me offers a free tier with 1,000 requests per day (approximately 30K per month) with full governance chain access. LiteLLM is free and open-source but requires self-hosting and does not include governance features. Portkey also has a free tier with 10K requests per month.

Start in 5 Minutes

The Best Helicone Alternative
Is Already Running

Swap one base URL. Get cost enforcement, personal data scanning, model allowlists, human approvals, managed runners, and an immutable audit trail -- all actively developed and supported.

1K requests/day free · No credit card required · One-line migration from Helicone