2 Sources
[1]
Students Are Learning Less and Getting Higher Grades Because of AI, Study Finds
The booming use of generative AI by students is leading to rising grade inflation at universities, according to a working paper published this week by the University of California, Berkeley. There are three ways students can use generative AI: augmentation, where the tools play a supporting role, assisting with things like research while the student completes the bulk of the work themselves; reinstatement, where new AI-based tasks are introduced; or displacement, where AI completely automates work the student would otherwise perform themselves, such as writing an essay. All three use cases can improve grades, but only augmentation and reinstatement also correlate with actual learning and skill-building. Some academic tasks, like unsupervised take-home assignments, essays, and other homework, are perfect opportunities for AI displacement, as opposed to proctored exams, oral presentations, or in-class discussions. As part of the study, UC Berkeley senior researcher Igor Chirikov analyzed over 500,000 student-course enrollments across 84 departments at a large Texas university from 2018 to 2025. He found that grade increases were mostly concentrated in courses "with higher shares of writing and coding tasks," where take-home assignments carried the most weight, concluding that students are using AI to cheat on some schoolwork and earn better grades. Overall, the researchers found that "AI-exposed courses" saw a 30 percent increase in "A" grades since ChatGPT hit the market. That's not particularly shocking; it's a generative AI use case as old as the dawn of ChatGPT. Plus, a student's GPA can be make-or-break for their future, determining acceptance into postgraduate academic programs and lucrative early-career job opportunities. So, in a world where most industries are leaning into AI, often at the expense of the young graduate job market, it makes sense that the average student would seek out an easy way to secure their future.
What is interesting is that, four years into the widespread presence of generative AI in our daily lives, the study shows that American universities have yet to catch up with its consequences. With more AI-enabled grade inflation, employers will have a tougher time identifying strong young graduate candidates, the study says. Even more importantly, this growing reliance on AI in academia is bound to create a workforce that is dependent on AI. "If AI displaces skill-building tasks during learning, students may graduate with weaker capabilities in precisely the domains where AI is strongest, reinforcing a feedback loop between AI in education and AI in production that could accelerate automation," Chirikov writes. In other words, an academic system that tolerates AI-enabled grade inflation would produce a workforce that cannot perform the core duties of its jobs, which in turn would drive even greater reliance on AI at work and more wholesale automation of jobs, on the road to the much-feared AI jobs armageddon that some experts claim is already underway in certain industries. Some universities are planning to take action against this grade inflation, though whether the planned measures will truly succeed is up for debate. At Princeton, where roughly 30% of seniors admitted in a recent survey to cheating, mostly via generative AI, faculty voted this week to overturn a 133-year-old honor code that allowed students to take in-person exams without a faculty member proctoring.
[2]
AI sends A grades into overdrive
Why it matters: Universities and colleges were already concerned about how many students are earning A's and B's, but now must worry that graduates are leaving AI-proficient rather than knowledgeable about their subjects of study.
The big picture: It isn't a case where A- students get a slight bump to an A or A+, says Igor Chirikov, a UC Berkeley professor who authored a study on AI and grade inflation.
* "We have a C student who is now an A student," Chirikov tells Axios, citing data from grades given between 2018 and 2025 at a Texas research university.
What they found: Since the release of ChatGPT in 2022, "excellent" grades rose by 30% in classes where AI is useful, such as English composition and coding.
* In classes where it's not, like sculpture and lab-based courses, grades remained flat.
Worth noting: Chirikov didn't name the university used in the study, but says that it's a "selective" school with over 50,000 students across all major academic disciplines.
* "I don't want to single out one university, just because I believe it's not specific to that particular university. It's something that's happening across the higher ed sector."
* He also says he chose the university because its grade-distribution data is publicly available.
Zoom in: Excellent grades have been on the rise since the early 2000s, but Chirikov found that classes that place more weight on homework assignments than on in-class exams see higher rates of grade inflation.
* That pattern suggests that unsupervised work is getting an AI-assisted boost, he says.
* Another issue with the grading scale is that faculty are sometimes incentivized to "grade more leniently," since student evaluations of their work are often tied to a professor being promoted, Chirikov says.
Zoom out: There is no "silver bullet" to stop AI from inflating GPAs, nor is it a new phenomenon for students to inflate their GPAs, he says.
* "There are many cases when students can select easier courses and get easier A's, and their GPA will be higher. And I think AI just exacerbates the existing trends," Chirikov says.
* Regardless, professors have already been getting crafty to crack down on AI-fueled cheating, like requiring handwritten or oral exams.
The bottom line: "We need to be creative and think of AI-integrated assignments, and that students can use [LLMs], but they should properly document that," Chirikov says.
* "That's not an easy process, but we definitely should invest in that more than we do right now."
A UC Berkeley study analyzing over 500,000 student enrollments reveals that AI sends A grades into overdrive, with excellent grades rising 30% in AI-exposed courses since ChatGPT's release. While students achieve higher GPAs through AI-assisted cheating, researchers warn this trend creates a less competent workforce unable to perform core job duties.
The widespread use of generative AI is fundamentally reshaping how students perform academically, according to a working paper from the University of California, Berkeley. UC Berkeley senior researcher Igor Chirikov analyzed over 500,000 student-course enrollments across 84 departments at a large Texas university from 2018 to 2025, uncovering a troubling pattern: students are achieving significantly higher grades while potentially acquiring fewer actual skills. The research identifies three distinct ways students deploy these tools: augmentation for research support, reinstatement of AI-based tasks, or complete displacement, where AI automates work like essay writing. Only the first two methods correlate with genuine learning, yet displacement remains widespread in unsupervised environments.
Source: Gizmodo
Since ChatGPT launched in 2022, AI-exposed courses experienced a 30 percent increase in "A" grades, with the most pronounced effects appearing in classes involving writing and coding tasks. "We have a C student who is now an A student," Chirikov told Axios, emphasizing this isn't merely marginal improvement. Meanwhile, courses less susceptible to AI assistance, such as sculpture and lab-based work, saw grades remain flat, providing clear evidence that AI-assisted cheating drives these academic gains. The pattern becomes especially pronounced in courses where take-home assignments carry significant weight compared to proctored exams, oral presentations, or in-class discussions.
Four years into the AI era, universities have yet to develop effective responses to grade inflation. Chirikov deliberately avoided naming the Texas research university in his UC Berkeley study, explaining, "I don't want to single out one university, just because I believe it's not specific to that particular university. It's something that's happening across the higher ed sector." The challenge extends beyond student behavior: faculty sometimes face incentives to grade more leniently, since student evaluations often influence promotions. At Princeton, where roughly 30% of seniors admitted in a recent survey to cheating, mostly via generative AI, faculty voted this week to overturn a 133-year-old honor code that allowed students to take in-person exams without faculty proctoring.
Source: Axios
The implications extend far beyond campus boundaries. "If AI displaces skill-building tasks during learning, students may graduate with weaker capabilities in precisely the domains where AI is strongest, reinforcing a feedback loop between AI in education and AI in production that could accelerate automation," Chirikov writes. This creates a paradox: graduates enter workplaces unable to perform core duties, driving increased workplace AI dependence and potentially accelerating job automation. Employers will struggle to identify strong graduates when inflated GPAs no longer signal actual competency, making hiring decisions increasingly difficult in a job market already strained by AI displacement.
Chirikov acknowledges there's no "silver bullet" to address AI and grade inflation at universities. "We need to be creative and think of AI-integrated assignments, and that students can use [LLMs], but they should properly document that," he told Axios. "That's not an easy process, but we definitely should invest in that more than we do right now." Some professors have already implemented creative countermeasures, requiring handwritten or oral exams to verify authentic student knowledge. The research suggests that while AI exacerbates existing grade inflation trends dating to the early 2000s, the speed and scale of the change demand immediate attention. What remains uncertain is whether academic institutions can adapt quickly enough to preserve educational integrity while preparing students for an AI-integrated professional world.