☀️ AI Morning Minute: Jagged Frontier
Why AI can write your marketing strategy but can't count the letters in "strawberry"
Arguably, the most confusing thing about AI isn't what it can do. It's that the line between what it can and can't do makes no sense. Tasks that seem hard for humans are easy for AI. Tasks that seem easy for humans trip it up completely. That pattern has a name, and understanding it changes how you use these tools.
What it means:
The jagged frontier is a concept from a 2023 study by Ethan Mollick and researchers at Harvard and the Wharton School, run in collaboration with Boston Consulting Group. It describes the uneven boundary of AI capability: some tasks that look equally difficult to a human fall on completely opposite sides of what AI can handle. The frontier isn't a smooth line that moves forward evenly. It's spiky, full of peaks where AI is superhuman and valleys where it fails at things a child could do.
Why it matters:
In the BCG experiment, 758 consultants using GPT-4 finished 12.2% more tasks, worked 25.1% faster, and produced 40% higher quality results on tasks inside the frontier. But on a task outside the frontier, consultants using AI were 19 percentage points less likely to get the right answer than those working without it. Same people, same tool, opposite results.
You can’t predict which side of the frontier a task falls on by how hard it feels. AI can generate a competitive market analysis in seconds but struggles to count the words in a sentence. It can write a legal brief but can’t reliably tell you if “receive” is spelled correctly. The difficulty you perceive and the difficulty AI experiences are two different scales.
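What makes these valleys so strange is that they're trivial for ordinary software. A minimal Python sketch of the two counting tasks mentioned above (the function names here are just illustrative, not from any AI tool):

```python
# Valley tasks from the examples above: trivial for deterministic code,
# yet historically unreliable for language models, which process text
# as chunks rather than individual characters.

def count_letter(word: str, letter: str) -> int:
    """Count how many times a single letter appears in a word."""
    return word.lower().count(letter.lower())

def count_words(sentence: str) -> int:
    """Count whitespace-separated words in a sentence."""
    return len(sentence.split())

print(count_letter("strawberry", "r"))   # → 3
print(count_words("The difficulty you perceive and the difficulty "
                  "AI experiences are two different scales."))  # → 13
```

A dozen lines of 1970s-era programming beats a frontier model at this. That asymmetry, not any single failure, is the point of the jagged frontier.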
The frontier is moving, but it’s staying jagged. Every model update fills in some valleys and extends some peaks. Mollick has noted that the concept is becoming less useful as a cautionary tale because many of the old failure examples have been fixed. But new gaps keep appearing in different places. The shape shifts. The unevenness doesn’t go away.
Simple example:
You hire a new employee who aces every interview question about strategy, writes brilliant memos, and produces presentations faster than anyone on your team. Then you ask them to add up a column of ten numbers and they get it wrong three times in a row. You wouldn't fire them. But you'd stop handing them the calculator and start handing them the strategy decks.
That's what working with AI looks like right now: figure out where the peaks are, use them hard, and keep your hands on the wheel everywhere else.

