Six Months In: Is AI Really Writing 90% of Your Code?

It was only half a year ago that Dario Amodei, the CEO of Anthropic, looked straight at an interviewer and declared something that sounded less like a prediction and more like a prophecy: within 3 to 6 months, AI would be writing 90 percent of the code. Within 12 months, nearly all code might be generated by machines.
Think about that for a second. For decades, we’ve been told software is eating the world. Now, apparently, software itself is being eaten by its own creation. The engineers who built the tools may soon be replaced by the tools themselves. And Silicon Valley cheered. Because for them, it’s not about whether this shift is good for humanity—it’s about speed, efficiency, and, let’s be honest, shareholder returns.
But step back and ask the obvious question: is it even true? Six months in, is AI really writing 90 percent of your code? Or is this just another case of tech leaders overhyping their products to make Wall Street salivate?
The answer is complicated. Yes, AI is writing more code than ever before. GitHub Copilot, ChatGPT, Claude, Code Llama—all of them are spitting out working functions, database queries, entire apps. Developers are leaning on these tools in ways unimaginable just two years ago. Surveys from Stack Overflow and GitHub show a majority of engineers now use AI assistance daily. Some startups brag that they ship features in days, not weeks, because the AI does the boilerplate and glue code.
But 90 percent? That’s not what most developers are experiencing. Talk to engineers in the trenches and you’ll hear the same thing: AI can crank out scaffolding, but humans still wrestle with the logic, the architecture, the edge cases. AI is fast, but it hallucinates. It gets basic syntax right, but it invents APIs that don’t exist. It saves time, but it can also waste it.
Here’s the dirty secret: the 90 percent claim only looks true if you redefine what “code” means. If you mean raw lines of text in an editor, sure—AI can flood you with functions all day long. But code isn’t just text. It’s design, debugging, testing, integrating, refactoring, maintaining. The hard parts aren’t typing semicolons, they’re understanding tradeoffs. And AI isn’t anywhere close to replacing that.
So why do people like Amodei say things like this? Because it fits the narrative Silicon Valley wants to sell: that we’re on the brink of a post-human coding revolution, where AI turns every ambitious idea into an instant product. Investors love that story. Politicians lap it up. It makes the industry sound inevitable, unstoppable, godlike. And conveniently, it justifies funneling billions into AI firms at sky-high valuations.
But pause for a second and think about the implications. If AI really does write nearly all code in the next year, what happens to the global economy? Hundreds of thousands of developers, many of them in emerging markets like India or Nigeria, suddenly find themselves competing with a chatbot that works 24/7 and costs pennies on the dollar. Silicon Valley will say, “Don’t worry, they’ll find something else to do.” Really? What exactly? Because the last time automation hit hard—the offshoring wave of the 1990s—the “something else” for millions of Americans turned out to be part-time service jobs and a devastated middle class.
There’s also the security risk. If AI is generating the vast majority of our code, what guarantees do we have that it’s secure? These models don’t “understand” what they’re writing—they remix patterns. If one vulnerability leaks into the training data, it could replicate across millions of applications instantly. We already struggle with supply chain attacks. What happens when our supply chain is literally an AI that everyone uses?
And then there’s the cultural rot. Programming has long been a discipline that forced rigor, patience, and problem-solving. It taught generations of people how to think logically. If we outsource that to machines, do we hollow out one of the few intellectual meritocracies left in society? Do we create a generation of “prompt engineers” who know how to ask questions but not how to solve problems?
The irony here is thick. For decades, Silicon Valley told us to “learn to code.” Politicians, educators, influencers—they all repeated the mantra. It was supposed to be the skill that guaranteed relevance in a digital economy. And now, the same people who sold that gospel are telling us: actually, forget coding. The AI does that now. Just learn to prompt.
Of course, some will say this is progress. They’ll argue it frees humans to focus on creativity, design, leadership. That’s partly true. But who defines creativity? Who gets to lead? Historically, it’s the people who mastered the underlying tools. If AI holds the tools, then power concentrates even further in the hands of those who own the models. And let’s be clear: that’s not you. That’s Anthropic, OpenAI, Google, and a few others.
So here we are, six months after the prophecy. Is AI writing 90 percent of code? No. It’s writing a lot, maybe half in some workflows, but it’s not replacing the human mind. Yet the hype cycle barrels on. Because the real product isn’t the code—it’s the belief. The belief that AI is inevitable, that resistance is futile, that the only rational move is to get on board, buy the tokens, invest in the SPVs, and hope you’re not the one automated out of existence.
This is why it matters to keep a skeptical eye. Yes, AI will keep advancing. Yes, it will eat more and more of the coding process. But don’t be fooled into thinking it’s already done. Don’t let billionaires in Silicon Valley tell you your job is obsolete before it actually is. And don’t let them set the rules unchecked, while they sell inevitability and rake in billions.
Because here’s the truth: AI doesn’t replace intelligence. It replaces effort. It doesn’t solve problems; it repackages them. And if we hand over the keys too quickly, we won’t just lose jobs—we’ll lose even the capacity to understand the systems running our lives.
Six months in, the lesson is obvious. AI is powerful, but it’s not omnipotent. The real danger isn’t that it writes 90 percent of our code. It’s that we believe it does—and hand over 100 percent of our future to the people who own it.