☀️ AI Morning Minute: Token
The bite-sized building blocks of how AI 'reads' and 'writes' text.
What it means
AI doesn’t see “words”; it sees tokens. These are the chunks of text that AI systems actually process. One token is roughly four characters, or about three-quarters of an English word. The AI breaks your input into these chunks, processes them, and then generates new tokens one at a time as its output.
Why it matters
Language into Math: Computers can’t understand letters or feelings; they only understand numbers. Tokens turn our human language into a numerical code the AI can actually calculate.
The “Brain” Capacity: Since the AI has a finite amount of “computing power” for any single task, tokens act as a budget (often called the context window). The AI can only “think” about a certain number of tokens at once before it runs out of room.
Pricing & Limits: Because tokens represent the actual work the computer is doing, that’s how companies measure what to charge you and where to set the limit on how much text you can paste in.
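Because of the rough “four characters per token” rule above, you can ballpark how many tokens a piece of text will cost before you paste it in. Here is a minimal sketch of that estimate; it is only an approximation, not a real tokenizer, and the function name is just for illustration:

```python
# Rough token estimate using the ~4-characters-per-token rule of thumb.
# Real tokenizers vary by model; this is only a ballpark.
def estimate_tokens(text: str) -> int:
    return max(1, round(len(text) / 4))

print(estimate_tokens("AI is amazing!"))  # 14 characters, roughly 4 tokens
```

Estimates like this are handy for guessing whether a long document will fit within a model’s limit, or what an API call might cost.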
Simple example
Take the sentence: “AI is amazing!” The AI sees this as 4 tokens: AI | is | amazing | !
Common words are usually a single token. However, very long or rare words (like “extraordinarily”) are often chopped into two or three pieces to make them easier for the AI to handle.
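The splitting described above can be sketched with a toy tokenizer. Real systems learn their splits from data (with algorithms such as byte-pair encoding) rather than using fixed rules like these, so the code below is purely illustrative:

```python
import re

# Toy tokenizer for illustration only. Real tokenizers learn their
# vocabulary from data; the rules here are made up for this sketch.
def toy_tokenize(text: str) -> list[str]:
    tokens = []
    # Split into words and standalone punctuation marks.
    for piece in re.findall(r"\w+|[^\w\s]", text):
        if len(piece) > 7:
            # Chop long, rare words into smaller chunks, e.g.
            # "extraordinarily" becomes "extra", "ordin", "arily".
            tokens.extend(piece[i:i + 5] for i in range(0, len(piece), 5))
        else:
            tokens.append(piece)
    return tokens

print(toy_tokenize("AI is amazing!"))         # ['AI', 'is', 'amazing', '!']
print(toy_tokenize("extraordinarily"))        # ['extra', 'ordin', 'arily']
```

Note how the common words stay whole while the long word is broken into pieces, just as described above.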

