Token Counter

Paste your text to estimate how many tokens it contains and what it would cost across different AI models. Useful for budgeting API calls and checking context window limits.

[Live counters: Characters · Words · Estimated Tokens · Lines]

How We Estimate Tokens

Different models use different tokenisers (GPT uses cl100k_base, Claude uses its own, etc.), so exact counts vary. Our estimate uses the widely accepted approximation:

1 token ≈ 4 characters (English text)
1 token ≈ 0.75 words

For exact counts, use the provider's tokeniser: OpenAI Tokenizer or Anthropic Workbench.
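The heuristics above can be sketched in a few lines. This is only the approximation described here, not a real tokeniser; it averages the character-based and word-based estimates:

```python
# Rough token estimate using the heuristics above:
#   1 token ~ 4 characters, 1 token ~ 0.75 words (English text).
# Exact counts require the provider's own tokeniser.
def estimate_tokens(text: str) -> int:
    by_chars = len(text) / 4             # 1 token ≈ 4 characters
    by_words = len(text.split()) / 0.75  # 1 token ≈ 0.75 words
    # Average the two heuristics and round to a whole token count.
    return round((by_chars + by_words) / 2)

print(estimate_tokens("The quick brown fox jumps over the lazy dog."))
```

Averaging the two heuristics smooths out texts with unusually long or short words; either heuristic alone is also a common choice.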

Cost Estimates (as Input)

What this text would cost as input tokens across different models. Output tokens (the model's response) are typically 2-5x more expensive.

Model | Input $/1M | Cost for this text
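The per-row cost is simple arithmetic: tokens times the per-million rate. A minimal sketch, where the model names and prices are illustrative placeholders, not current provider pricing:

```python
# Hypothetical input prices in USD per 1M tokens -- placeholder numbers
# for illustration; check each provider's pricing page for real rates.
PRICES_PER_1M = {
    "model-a": 2.50,
    "model-b": 0.15,
}

def input_cost(tokens: int, price_per_1m: float) -> float:
    """Cost of sending `tokens` input tokens at the given per-million rate."""
    return tokens * price_per_1m / 1_000_000

for model, price in PRICES_PER_1M.items():
    print(f"{model}: ${input_cost(50_000, price):.4f} for 50k input tokens")
```

Remember that a full API call also incurs output-token charges, which (as noted above) are typically 2-5x the input rate.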

Context Window Check

Does your text fit within common context windows?
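The check itself is a comparison of the estimated token count against each window size. A sketch, assuming a few commonly cited window sizes (these vary by model and change over time):

```python
# Commonly cited context-window sizes in tokens -- approximate and
# model-dependent; treat these as illustrative assumptions.
CONTEXT_WINDOWS = {
    "4k": 4_096,
    "8k": 8_192,
    "32k": 32_768,
    "128k": 128_000,
    "200k": 200_000,
}

def fits(token_count: int) -> dict[str, bool]:
    """Which common context windows can hold this many tokens?"""
    return {name: token_count <= size for name, size in CONTEXT_WINDOWS.items()}

print(fits(50_000))
```

In practice, leave headroom below the limit: the window must hold the prompt *and* the model's response, so a text that exactly fills it leaves no room for output.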