2 Sources
[1]
Software Developers Say AI Is Rotting Their Brains
Tech company executives are confident that AI will completely transform the economy, and they point to the changes they see in-house as proof that the transformation is coming fast. At Meta, Google, Microsoft, and other companies, leadership says that AI generates a growing share of the overall code, making it cheaper and faster to produce. The implication is that if this AI is good enough for tech companies to use internally to improve efficiency and reduce headcount, it's only a matter of time until every other industry is similarly transformed.

Developers who are told to use AI whether they like it or not, however, tell a different story. On Reddit, Hacker News, and other places where people in software development talk to each other, more and more people are becoming disillusioned with the promise of code generated by large language models. Developers say not only that the AI output is often flawed, but that using AI to get the job done is often a more time-consuming, harder, and more frustrating experience, because they have to go through the output and fix its mistakes. More concerning, developers who use AI at work report that they feel they are de-skilling themselves and losing the ability to do their jobs as well as they used to.

"We're being told to use [AI] agents for broad changes across our codebase. There's no way to evaluate whether that much code is well-written or secure -- especially when hundreds of other programmers in the company are doing the same," a UX designer at a midsized tech company told me. 404 Media granted all the developers we talked to for this story anonymity because they signed non-disclosure agreements or because they fear retribution from their employers. "We're building a rat's nest of tech debt that will be impossible to untangle when these models become prohibitively expensive (any minute now...)."
Tech company executives love to brag about how much of the code at their companies is AI-generated. In April, Google said that three quarters of new code at the company was generated by AI. Last year, Microsoft CEO Satya Nadella said up to 30 percent of the company's code was generated by AI, and Microsoft CTO Kevin Scott said he expects 95 percent of all code at the company to be AI-generated by 2030. Meta's Mark Zuckerberg said last year that he expects AI to write most of the code improving AI within 12 to 18 months. Anthropic says 90 percent of the code written by most of its team is AI-generated. Tech companies have also been bragging about their "tokenmaxxing," or how much money they're spending on AI tools instead of human employees.

Predictably, the huge spike in productivity that these companies claim their own AI products have enabled hasn't resulted in more or better products, shorter work weeks, or better consumer experiences. Mostly, AI implementation in tech companies has been used to justify multiple massive rounds of layoffs. To name just a few recent examples where tech companies said they reduced headcount because of AI use: Meta said it would cut 10 percent of its workforce (around 8,000 people), Microsoft said it would offer voluntary retirement to 7 percent of its American workforce (around 125,000 people), and Snapchat said it would lay off 16 percent of its full-time staffers (about 1,000 people).

The developers I talked to contradicted the narrative about AI's utility in coding in many ways, but the most glaring issue with the story AI company executives are pitching is that the internal adoption of AI tools they point to isn't voluntary or organic. Developers say they are either explicitly ordered to use AI tools or heavily pressured to use them. "AI in some shape or form is all but explicitly mandated," a software engineer at a FAANG company that brags publicly about its internal AI adoption told me.
"Its usage is part of our performance review criteria and most (maybe all?) of us have been reorganized into AI focused 'pods.' We're absolutely flooded with AI tooling and it feels like the answer to every problem is 'use AI first.'"

"We've been told performance evaluations are tied to AI adoption," the UX designer told me. "This has led to most of my teammates using it performatively, even if most of us implicitly know that the output is flawed. The actual quality of output doesn't matter as much as our willingness to participate."

Another software engineer, at a financial technology company, told me that he was never forced to use LLMs, but that the companies where he worked changed in ways that encouraged their use. His previous employer didn't demand that developers use AI, but its use was encouraged, and developers were given access to Cursor, one of the leading AI-powered code editors. "It started as a 'who wants to try it' and I volunteered. Later it was slowly, due to costs, that we stopped renewing our JetBrains IDE and forced everyone to move to Cursor (though the editor itself doesn't force you to use AI)," he said. JetBrains makes integrated development environments widely used by software developers. "Adoption came mostly from inside the engineering team, with a single engineer manager trying to champion it and writing project based rules for Cursor to try to make the output better."

All the developers I talked to were excited to try using LLMs at work at first, or were at least curious about them. Their feelings about the tools, based on their personal experience, are now overwhelmingly negative. "There were almost no productivity gains using IDE-based AI tools. AI-generated code ended up with more bugs because I am working on distributed web apps, highly complex multi-system things, so giving the LLM context is very difficult," a software developer at a small web design firm told me.
"Another developer on a contract working with me at the moment generates massive amounts of code, leaving me with 1000+ lines of pull requests to review and it takes massive amounts of time to do this. This leads to me feeling more tired and burned out than I've ever felt in my entire life. The cognitive overhead of switching between prompting, coding, checking the LLM's output is a massive energy drain. It has not been a productivity booster at all, it feels like a speedrun towards severe mental exhaustion."

The developer in fintech I talked to also said that one major problem with LLMs is that they can generate more code than developers can properly vet or explain. "The sheer breadth of code makes it impossible to be critical enough and then you're either throwing it away or submitting it and feeling scared there might be really low quality stuff that if someone notices will make you embarrassed (and even more embarrassing to say: 'oh i don't know what that is, the AI did that')," he said. "Or worse, you ship it without someone noticing and that is really hit or miss."

"I have gotten stuck on bug fixes where, when I run out of Anthropic tokens in Claude Code, I couldn't work anymore. The current system I am working on started to become a monstrosity of complexity where I didn't even know what most of it does anymore, and when I had to fix a bug, it took longer than I would have taken in the past to debug," the software developer at the small web design firm told me.

The developers I talked to did find AI useful for some tasks. Several said it was good for experimentation, allowing them to quickly prototype an idea or implement something in a domain they're unfamiliar with. One developer said it was a good information interface: the AI helped him find where on the server a certain request is handled, summarize logs, or find documentation related to code changes.
The problem all the developers I talked to agreed on is that the more they relied on AI to code, the more the skills they've honed for years deteriorated. This is by now a well-studied phenomenon, sometimes referred to as "cognitive debt" or "cognitive atrophy": people who use AI to automate parts of their job lose the ability to do those tasks well, de-skilling themselves in the process.

"I had some issues where I forgot how to implement a Laravel API and it scared the shit out of me. I went to university for this, I've been a software engineer for many years now and it feels like I am back before I ever wrote a single line of code," the software developer at the small web design firm told me.

"It's making me dumber for sure," the fintech software developer told me. "It's like when we got cellphones and stopped remembering phone numbers, but it's grown to me mentally outsourcing 'thinking' in general. I feel my critical thinking and ability to sit and reason about a problem or a design has degraded because the all-knowing-dalai-llama is just a question away from giving me his take. And supposedly I tell myself ill just use it for inspiration but it ends up being my only thought. It gives you the illusion of productivity and expertise but at the end of the day you are more divorced from the output you submit than before."

"When I was using it for code generation, I found myself having a lot of trouble building and maintaining a mental model of the code I was working with," the software engineer at the FAANG company told me. "Another aspect is that I joined late last year and [the company's] codebase is massive. As a new hire, part of my job is to learn how to navigate the codebase and use the established conventions, but I think the AI push really hampered my ability to do that."
The developers I talked to agreed that LLMs will stick around and play some role in programming in the future, but they worried about how the industry will adapt to executives' current obsession with the technology, especially when it comes to fostering future generations of developers.

"Older programmers will be fine if there are any jobs left in a few years, but I worry for people early in their careers," the UX designer told me. "We are hiring junior programmers who rely on AI to complete the simplest tasks. They don't have the knowledge or experience to know when AI output is error-laden or inefficient."

"I wish I had a crystal ball for this one, but my gut feeling is that this method of building software will be unsustainable either economically or in terms of tech debt," the software engineer at the FAANG company said. "There's a pretty clear split on my team between people who love AI coding and those who just do it because it's what the company wants, and generally speaking I find that the people who are still [technically focused individual contributors] with their nose in code all the time are less likely to be big AI boosters. I think the tech and its outputs start to really break down the more you question them and those who are doing that day in and day out tend to have a worse opinion of the tech."

"I think there will be a 'reckoning' or 'awakening' from the industry notion that now everyone can code and that vibe coding is viable for a real production app and software companies are dead," the developer in fintech said. "I think we will grow to find the patterns and industry best practices that will balance the negatives of LLM development (hallucination, unstructured code) with better techniques to verify the output's correctness at scale, and the hype and techno optimism of AI will get to a saner middle ground."
[2]
Software Engineers Say They're Losing the Ability to Code Now That AI Does It for Them
Even for employees who've managed to hold onto their jobs, AI is having a very real impact at work. As corporations boast about how much they're investing in automation, the reality is that many employers are practically forcing the tech on their workers. Nowhere has that been more visible than among software engineers, with companies including Meta even introducing "leaderboards" showing in real time which employees are burning the most AI tokens.

But it turns out that this embrace of AI by programmers may come at a steep price. Many are speaking out, saying they now simply review AI-generated code instead of writing it themselves -- and writing code, it turns out, is an important part of keeping their technical skills sharp. Similar complaints are proliferating across social media, and 404 Media talked to more developers who confirmed the problem.

"I had some issues where I forgot how to implement a Laravel API and it scared the s**t out of me," one engineer, who requested to remain anonymous, told 404. "I went to university for this, I've been a software engineer for many years now and it feels like I am back before I ever wrote a single line of code."

Not all companies mandate the use of AI in the workplace, but the tools are seductive regardless. Another anonymous software engineer told 404 that while he wasn't outright forced to use LLMs, he and his colleagues were discreetly given access to Cursor for coding and found themselves using it regularly. Sure, he found, AI coding tools can save time and effort -- but if they undermine the core intellectual skills required to architect larger projects, the problem will soon come home to roost for employers.

"It's making me dumber for sure," the same developer told 404. "It's like when we got cellphones and stopped remembering phone numbers, but it's grown to me mentally outsourcing 'thinking' in general. I feel my critical thinking and ability to sit and reason about a problem or a design has degraded because the all-knowing-dalai-llama is just a question away from giving me his take."

A growing body of research supports the idea that our critical thinking skills are being sanded down by AI use. With everyone from veteran software coders to students morphing into mindless zombies, the apocalypse may come at the hands of AI, after all.
While tech executives boast that AI now generates as much as 75% of new code at companies like Google, software developers tell a different story. Engineers report they're losing the ability to code as mandatory AI adoption forces them to review flawed AI-generated code rather than write it themselves, leading to what they describe as cognitive atrophy and mounting technical debt.
Tech company leadership continues to champion AI in coding as a transformative force, with Google reporting that three-quarters of new code is now AI-generated [1]. Microsoft CEO Satya Nadella revealed that up to 30 percent of the company's code comes from AI, while CTO Kevin Scott expects 95 percent of all code to be AI-generated by 2030 [1]. Meta's Mark Zuckerberg predicted AI would write most code improving AI within 12-18 months, and Anthropic claims 90 percent of the code written by most of its team is AI-generated [1]. Yet this aggressive push from tech executives hasn't translated into better products or shorter work weeks; instead, it has justified massive layoffs, with Meta cutting 10 percent of its workforce, Microsoft offering voluntary retirement to 7 percent of its American employees, and Snapchat eliminating 16 percent of its full-time staff [1].

The reality on the ground contradicts the optimistic narrative. Software developers report that AI adoption isn't voluntary or organic: they're either explicitly ordered to use AI tools or face intense management pressure [1]. A software engineer at a FAANG company explained that AI usage is part of performance review criteria, with most employees reorganized into AI-focused "pods" where the answer to every problem is "use AI first" [1]. A UX designer at a midsized tech company revealed that performance evaluations are tied to AI adoption, leading to performative use of AI even when colleagues know the output is flawed [1]. Companies including Meta have introduced leaderboards showing which employees burn the most AI tokens in real time [2].
Source: Futurism
The most alarming consequence is that software developers report they're losing the ability to code as they shift from writing code to merely reviewing AI-generated code [2]. One engineer who studied computer science and worked for years as a professional told 404 Media: "I had some issues where I forgot how to implement a Laravel API and it scared the s**t out of me. I feel like I am back before I ever wrote a single line of code" [2]. Another developer described the cognitive impact bluntly: "It's making me dumber for sure. It's like when we got cellphones and stopped remembering phone numbers, but it's grown to me mentally outsourcing 'thinking' in general" [2]. This decline in technical abilities extends beyond simple recall: developers report that their critical thinking skills and their ability to reason about problems or designs have degraded [2].

Source: 404 Media
Despite claims of enhanced productivity, developers describe AI in coding as creating more work, not less. They report that AI-generated code is often flawed, making the process more time-consuming, harder, and more frustrating because they must review the output and fix its mistakes [1]. The UX designer warned about agents making broad changes across codebases: "There's no way to evaluate whether that much code is well-written or secure -- especially when hundreds of other programmers in the company are doing the same. We're building a rat's nest of tech debt that will be impossible to untangle" [1]. The designer noted that the actual quality of the output matters less than willingness to participate in the AI adoption mandate [1].

A growing body of research supports the disillusionment spreading across Reddit, Hacker News, and other developer communities [1][2]. The concern extends beyond individual skill atrophy to systemic risk: if experienced engineers can no longer architect larger projects because their intellectual capabilities have eroded, companies may face mounting technical debt and security vulnerabilities. While AI coding tools like Cursor offer short-term time savings, the long-term cost of de-skilling an entire generation of software developers remains unclear. As one engineer noted, the seductive nature of these tools means even developers not explicitly forced to use them find themselves relying on AI regularly [2]. The gap between executive enthusiasm and developer experience suggests the transformation tech leadership promises may look very different from what actually unfolds in practice.

Summarized by Navi