☀️ AI Morning Minute: Context Drift
The reason your chatbot gets worse the longer you talk
You’ve probably felt this. You’re 30 messages into a conversation with a chatbot and it starts ignoring things you told it earlier. A constraint you set disappears. An idea you rejected shows up again.
The answers are still polished, but something is off. The model isn’t broken. It’s drifting.
What it means
Context drift is the gradual loss of coherence that happens as an AI conversation gets longer. The model starts strong, remembering your instructions, building on earlier decisions, and staying on topic. But as the conversation grows, earlier information gets crowded out by newer messages. The model doesn’t forget in the way a person does. It still has access to the earlier text. But as the context window fills up, older tokens compete with newer ones for the model’s attention, and the older ones lose.
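A toy sketch can make the crowding-out concrete. Real models don't hard-truncate inside the window (attention dilutes instead), but many chat apps do trim the oldest turns once a token budget is exceeded, which produces the same symptom. Everything here is illustrative: the window size, the messages, and the whitespace "tokenizer" are all made up.

```python
# Toy model of a fixed context window: once the token budget is
# exceeded, the oldest messages fall out of what the model "sees".
# Window size and tokenizer (whitespace split) are simplifications.

def visible_context(messages, max_tokens=20):
    """Keep only the most recent messages that fit within max_tokens."""
    kept, used = [], 0
    for msg in reversed(messages):      # walk newest-first
        cost = len(msg.split())         # crude token count
        if used + cost > max_tokens:
            break                       # everything older is dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))         # restore chronological order

conversation = [
    "Constraint: keep the budget under $500",  # set early on...
    "Here are three vendor options",
    "Option two looks best, draft the email",
    "Actually add a comparison table first",
    "Also loop in the finance team on this",
]

# The earliest message, the constraint, is the first thing to drop out.
print(visible_context(conversation, max_tokens=20))
```

Notice that nothing "forgot" anything: the constraint is still sitting in `conversation`. It just no longer makes it into the slice the model actually attends to, which is exactly how a drifting chatbot can drop a rule you clearly stated.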
Why it matters
It’s the most common reason people think a chatbot “got dumb.” The model didn’t change. The conversation got too long. Customer-experience research suggests that after just one bad interaction, around 30% of users will switch brands, abandon a purchase, or share negative feedback. Drift erodes trust faster than most companies realize.
It’s different from hallucination. A chatbot that asks for your order number again after you already gave it is drifting. A chatbot that tells you your package was delivered by a unicorn is hallucinating. Both are wrong, but the causes and fixes are different.
The fix is counterintuitive: stop and restart. When a model starts reintroducing rejected ideas or dropping constraints you set earlier, the best move isn’t to argue with it for ten more turns. It’s to start a fresh conversation with a clean summary of where you are. Projects and custom instructions help because they reload your core context at the start of every new chat, so you don’t lose the foundation even when you reset the thread.
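The restart pattern can be sketched in a few lines. The function and field names below are illustrative, not any particular chat product's API; the idea is simply to distill the thread's state into a compact summary and seed a fresh conversation with it plus your standing instructions.

```python
# Sketch of the "stop and restart" fix: instead of arguing with a
# drifting thread, distill its state and seed a fresh conversation.
# All names here are illustrative; adapt to whatever API you use.

def distill_state(constraints, decisions, open_items):
    """Build a compact summary to carry into a new conversation."""
    lines = ["Context for this conversation:"]
    lines += [f"- Constraint: {c}" for c in constraints]
    lines += [f"- Decided: {d}" for d in decisions]
    lines += [f"- Still open: {o}" for o in open_items]
    return "\n".join(lines)

def fresh_thread(standing_instructions, summary):
    """Start a new message list: reloaded instructions, then the summary."""
    return [
        {"role": "system", "content": standing_instructions},
        {"role": "user", "content": summary},
    ]

summary = distill_state(
    constraints=["budget under $500", "launch by Friday"],
    decisions=["use vendor B", "skip the comparison table"],
    open_items=["draft the announcement email"],
)
thread = fresh_thread("You are helping plan a product launch.", summary)
print(thread[1]["content"])
```

The design choice worth noticing: the summary lists constraints and decisions explicitly rather than pasting the old transcript back in. A fresh thread starts with a small, dense context instead of a long, diluted one, which is the whole point of the reset.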
Simple example
You’re giving directions to a friend over the phone. For the first few turns, they’re sharp: left on Main, right on Oak, past the school. But the call goes on for 45 minutes with detours, backtracking, and side conversations about lunch. By the time you say “take the next left,” they’ve lost track of where they are and which direction they’re facing. They’re still listening. They still heard every word. But the conversation got too long for them to hold the whole route in their head. Context drift is that moment when the AI is still responding, but it’s lost the shape of what you were building together.

