
What Is a Token (in AI)?

Definition

In AI, a token is the fundamental unit of text that a language model processes — typically a word, subword, or character — and the total number of tokens determines input limits, output length, and API pricing.

How Token (in AI) Works

AI models don't read text the way humans do. Instead, they break text into tokens using a process called tokenization. Common words like 'the' are usually one token, but longer or less common words get split into multiple tokens (e.g., 'tokenization' might become 'token' + 'ization'). On average, one token is roughly 3/4 of an English word, so 100 tokens come to approximately 75 words. Tokens matter because they determine three critical things: how much text fits in the model's context window, how long the response can be, and how much you pay when using AI APIs. Understanding tokens helps users estimate costs and optimize their use of AI models.
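The splitting described above can be sketched with a toy greedy longest-match tokenizer. The vocabulary here is hypothetical and tiny; real models learn theirs (typically via byte-pair encoding) from large corpora, so actual splits will differ.

```python
# Toy greedy longest-match subword tokenizer: a minimal sketch of how
# tokenization splits words into vocabulary pieces. VOCAB is a made-up
# example vocabulary, not a real model's.
VOCAB = {"token", "ization", "the", "hello", "un", "believ", "able"}

def tokenize(word: str) -> list[str]:
    """Split a lowercase word into the longest known vocabulary pieces,
    falling back to single characters for unknown spans."""
    tokens, i = [], 0
    while i < len(word):
        # Try the longest possible match first, shrinking until one fits.
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in VOCAB or j == i + 1:
                tokens.append(piece)
                i = j
                break
    return tokens

print(tokenize("tokenization"))  # ['token', 'ization']
print(tokenize("the"))           # ['the']
```

This mirrors the 'tokenization' → 'token' + 'ization' example from the paragraph above: common words survive as one token, while rarer words fragment into subwords or, at worst, single characters.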

Real-World Examples

1. The sentence 'Hello, how are you?' being split into 6 tokens: 'Hello', ',', ' how', ' are', ' you', '?'

2. OpenAI billing GPT-4o mini API usage per token (around $0.15 per million input tokens at launch)

3. A developer using OpenAI's tiktoken library to count tokens before sending a request to stay within context limits

Token (in AI) on Vincony

Vincony provides 100 free credits per month, with transparent token-based pricing across all 400+ AI models accessible on the platform.

Try Vincony free →
