You don't need a gateway to track LLM cost
Portkey is an AI gateway — every LLM call routes through their servers so they can also handle routing, fallback, and caching. If all you need is cost visibility, LLMeter pulls the same numbers from a read-only API key. No proxy. No SDK. No latency hop.
No credit card required. Free plan never expires.
When does the switch make sense?
Portkey and LLMeter solve different halves of the same problem. Pick by what you actually use today.
You log in for cost numbers
If 90% of your Portkey dashboard time is spent looking at spend per model, customer, or environment — that's the half LLMeter handles natively, without a gateway.
You want less latency
A gateway adds one network hop to every LLM call. For latency-sensitive products (chat, agentic loops), removing the hop is a free p95 improvement.
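To see what one hop does to tail latency, here is a minimal sketch. The numbers are purely illustrative assumptions (a synthetic ~100-150 ms direct call and an assumed flat ~40 ms gateway hop), not measurements of Portkey:

```python
import statistics

def p95(samples_ms):
    # 95th percentile via statistics.quantiles (99 cut points; index 94)
    return statistics.quantiles(samples_ms, n=100)[94]

# Illustrative assumptions only: synthetic direct latencies around 100-150 ms,
# and a hypothetical constant 40 ms added by a gateway hop on every call.
direct = [100 + i % 50 for i in range(200)]
via_gateway = [ms + 40 for ms in direct]

print(f"p95 direct:      {p95(direct):.0f} ms")
print(f"p95 via gateway: {p95(via_gateway):.0f} ms")
```

A constant per-call hop shifts the whole latency distribution, so the p95 moves by the full hop cost; removing the hop gives that back with no code-level optimization.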
You'd rather not proxy prompts
LLMeter never sees a single prompt or completion. Read-only billing keys cannot read content, so there is no path for prompts or completions to reach LLMeter's servers.
Keep Portkey if you actively use routing, automatic fallback between providers, prompt caching, or guardrails — those have no LLMeter equivalent.
Set up in 3 steps
No SDK installation. No code deployment. Just a read-only key.
Sign up for LLMeter
Free account, no credit card. The free plan covers 1 provider forever.
Paste a read-only provider key
Generate a read-only key in OpenAI / Anthropic / DeepSeek / OpenRouter and paste it into LLMeter. No SDK to install, no virtual keys to mint.
Revert the Portkey base URL (if you only used cost tracking)
If routing/fallback was the reason you adopted Portkey, keep it — LLMeter is complementary. If cost tracking was the only feature, you can revert the proxy and reclaim the latency.
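For reference, here is a hedged sketch of what the revert looks like. Portkey integrations typically work by overriding the OpenAI SDK's `base_url` to point at the gateway; reverting just removes that override. The helper name `make_base_url` is illustrative, not part of any SDK:

```python
# Portkey's documented gateway base URL and the OpenAI SDK's built-in default.
PORTKEY_GATEWAY = "https://api.portkey.ai/v1"
OPENAI_DEFAULT = "https://api.openai.com/v1"

def make_base_url(use_gateway: bool) -> str:
    """Illustrative helper: pick the gateway or the provider default."""
    return PORTKEY_GATEWAY if use_gateway else OPENAI_DEFAULT

# Before: client = OpenAI(base_url=make_base_url(True), default_headers={...})
# After:  client = OpenAI()  # no base_url override; calls go straight to the provider
print(make_base_url(False))
```

If you set the gateway URL through an environment variable (e.g. `OPENAI_BASE_URL`) instead of in code, the revert is just unsetting it.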
Portkey vs LLMeter
Same goal (LLM cost visibility), two different architectures.
| Feature | Portkey | LLMeter |
|---|---|---|
| Architecture | AI gateway (proxy on every call) | Read-only API key (no proxy) |
| Setup method | SDK install + base URL swap | Paste a read-only key |
| Time to first dashboard | ~10 min (SDK config + virtual keys) | ~30 seconds |
| Code changes required | Yes (SDK + base URL) | None |
| Adds latency to LLM calls | Yes (one extra hop) | No (out-of-band) |
| Sees prompts/completions | Yes (proxies all traffic) | No (only billing data) |
| Routing / fallback / caching | Yes (primary feature) | No (out of scope) |
| Open source | Partial (open-source SDK, hosted gateway) | AGPL-3.0 (full stack) |
| Pricing entry | $0 (limited) → $99/mo Pro | Free for 1 provider → $19/mo Pro |
| Best fit | Teams that need routing + fallback | Teams that only need cost visibility |
Gateway vs read-only key
The architectural choice changes what you can ship — and what can break.
Portkey (gateway)
- Routing, fallback, retries, caching, guardrails
- Adds latency on every LLM call
- Gateway outage propagates to your product
- Third party sees prompts and completions
LLMeter (read-only key)
- Pulls usage/billing data directly from providers
- Zero latency impact (out-of-band)
- LLMeter downtime cannot break your LLM calls
- Never sees prompts or completions
Migration FAQ
When does it make sense to drop Portkey for LLMeter?
If the main reason you adopted Portkey was the dashboard for cost and usage, and you do not actively rely on routing, fallback, retries, prompt caching, or guardrails — LLMeter does the cost half without the gateway. Teams that need multi-provider routing or fallback should keep Portkey: those features have no equivalent in LLMeter.
Does LLMeter sit in front of my LLM calls like Portkey does?
No. LLMeter never proxies traffic. It uses read-only API keys to pull usage data from the provider directly (OpenAI usage API, Anthropic billing endpoints, etc.) on a polling schedule. Your application talks to OpenAI / Anthropic / DeepSeek the same way it always did — no extra hop, no extra latency, no extra failure mode.
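The out-of-band pattern can be sketched in a few lines. The endpoint path and query parameters below are assumptions for illustration (real provider usage APIs vary, and LLMeter's own implementation is not shown here); the point is that the poller only ever touches aggregate usage data:

```python
import time
import urllib.parse

# Assumed example endpoint; actual provider usage APIs and parameters differ.
PROVIDER_USAGE_API = "https://api.openai.com/v1/organization/usage/completions"

def usage_request_url(start_ts: int, end_ts: int) -> str:
    """Build a usage-API request URL. Only aggregate token/billing numbers
    come back from endpoints like this; never prompts or completions."""
    query = urllib.parse.urlencode({"start_time": start_ts, "end_time": end_ts})
    return f"{PROVIDER_USAGE_API}?{query}"

def poll_once(read_only_key: str) -> None:
    # A real poller would GET usage_request_url(...) with
    # "Authorization: Bearer <read_only_key>", parse the aggregates,
    # and store them. Your app's LLM traffic is never in the loop.
    ...

print(usage_request_url(0, int(time.time())))
```

Because polling happens on a schedule, entirely outside the request path, a poller outage delays a dashboard refresh rather than failing a user-facing call.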
Can LLMeter see my prompts or completions?
No. Read-only API keys grant access to usage and billing data only. LLMeter never sees the content of a single request. This is a hard architectural property — there is no opt-out because there is no path for prompts to reach LLMeter servers in the first place.
How does the cost data compare to what Portkey shows?
LLMeter pulls cost data straight from the provider, so it matches your invoice to the cent. Portkey computes cost from the request stream it proxies — the numbers usually match but can drift if the gateway misses requests during an incident. LLMeter's source of truth is the provider, not a proxy log.
I run on Portkey today. How risky is the switch?
Low risk because LLMeter is additive. You can run LLMeter alongside Portkey for a week, compare the dashboards, and only then decide whether to drop the gateway. Read-only keys do not interfere with any traffic Portkey is handling.
Is LLMeter free?
Yes — the Free plan covers 1 provider with 30-day retention, forever. The Pro plan ($19/mo) adds unlimited providers, budget alerts, anomaly detection, and 1-year retention. Team plans start at $49/mo with seats and shared budgets.
Cost visibility without the proxy hop.
Plug LLMeter in next to Portkey for a week, compare the numbers, and decide. Free forever for 1 provider.
No credit card required. Free plan never expires.