Last updated: April 2026
Tokens
The units of text that AI models process — roughly equivalent to word pieces.
In Plain English
Tokens are how AI models break down text for processing. A token might be a whole word, part of a word, or a punctuation mark. On average, one token is about 4 characters or ¾ of a word in English. AI pricing and context limits are measured in tokens. When you hit a "token limit," you've reached the maximum text the model can process at once.
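Because pricing and context limits are measured in tokens, it is often useful to estimate a token count before sending text to a model. A minimal sketch using the rule of thumb above (~4 characters per English token); the function name `estimate_tokens` is illustrative, and real counts vary by tokenizer:

```python
def estimate_tokens(text: str) -> int:
    # Rule of thumb: ~4 characters per token in English.
    # This is a rough planning estimate only; the exact count
    # depends on the specific model's tokenizer.
    return max(1, round(len(text) / 4))

print(estimate_tokens("Hello, how are you?"))  # 19 characters -> ~5 tokens
```

For exact counts, use the tokenizer library that matches your model rather than a character-based estimate.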
💡 Real-World Example
The sentence "Hello, how are you?" is typically 6 tokens: "Hello", ",", " how", " are", " you", "?"
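The split above can be mimicked with a deliberately simplified tokenizer. Real models use learned subword schemes such as byte-pair encoding (BPE), so this regex is only an illustration of the idea that words, word pieces, and punctuation become separate tokens, with leading spaces attached to words as many BPE vocabularies do:

```python
import re

def rough_tokens(text: str) -> list[str]:
    # Simplified illustration only: real tokenizers learn subword
    # merges from data. This splits text into word-like pieces
    # (keeping a leading space with each word) and punctuation marks.
    return re.findall(r" ?\w+|[^\w\s]", text)

print(rough_tokens("Hello, how are you?"))
# ['Hello', ',', ' how', ' are', ' you', '?']  -> 6 tokens, as above
```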
