Curated by THEOUTPOST
On Wed, 26 Feb, 8:04 AM UTC
2 Sources
[1]
Surge in UK university students using AI to complete work
The number of UK undergraduate students using artificial intelligence to help them complete their studies has surged over the past 12 months, raising questions about how universities assess their work.

More than nine out of 10 students are now using AI in some form, compared with two-thirds a year ago, according to a survey published by the Higher Education Policy Institute think-tank on Wednesday.

Experts warned that the sheer speed of take-up of AI among undergraduates required universities to rapidly develop policies to give students clarity on acceptable uses of the technology.

Josh Freeman, policy manager at HEPI, said it was "almost unheard of" to see such rapid changes in student behaviour and that it would force a radical change of approach in the way universities assessed students.

"There are urgent lessons here for institutions. Every assessment must be reviewed in case it can be completed easily using AI. That will require bold retraining initiatives for staff in the power and potential of generative AI," he added.

The findings come a month after science secretary Peter Kyle provoked controversy by saying that it was acceptable for schoolchildren to use AI to complete their homework "with supervision and [when] used in the right way".

The findings on the use of generative AI tools were based on polling of a representative sample of 1,041 full-time undergraduate students in the UK by Savanta. It found that 88 per cent of students said they had used generative AI such as ChatGPT for assessments, up from 53 per cent in 2024, with students studying science subjects more likely to use the technology than their peers in the social sciences and humanities.

Only 29 per cent of humanities students felt that AI-generated content "would get a good grade in my subject", compared with 45 per cent of students studying for science, engineering or medical-related degrees.

The two biggest reasons students gave for using AI were "saving time" and "improving the quality of my work", with around half of students citing each.

The proportion of students who considered it acceptable to include AI text in assignments after editing had grown from 17 per cent to 25 per cent over the past year, but only 6 per cent thought using AI-generated content without editing was acceptable.

The report also identified what it called "persistent digital divides" in AI competency, with men more likely than women to be frequent users, alongside students from wealthier backgrounds. Nearly half of students said they had already used AI at school.

Although the proportion of students saying staff at universities were "well equipped" to support their use of AI had doubled over the year, from 18 per cent to 42 per cent, many students still said they lacked clarity about the rules for AI usage.

"It's still all very vague and up in the air if/when it can be used and why," one student said.

"They dance around the subject. It's not banned but not advised, it's academic misconduct if you use it but lecturers tell us they use it. Very mixed messages," added another.

Janice Kay, director of Higher Futures, a higher education consultancy, who wrote the foreword to the HEPI report, said that while it was a "positive sign overall" that students were learning to use AI, it also pointed to coming challenges.

"There is little evidence here that AI tools are being misused to cheat and play the system. [But] there are quite a lot of signs that will pose serious challenges for learners, teachers and institutions and these will need to be addressed as higher education transforms," she added.
[2]
UK universities warned to 'stress-test' assessments as 92% of students use AI
Survey of 1,000 students shows 'explosive increase' in use of generative AI in particular over the past 12 months

British universities have been warned to "stress-test" all assessments after new research revealed "almost all" undergraduates are using generative artificial intelligence (genAI) in their studies.

A survey of 1,000 students, both domestic and international, found there had been an "explosive increase" in the use of genAI over the past 12 months. Almost nine out of 10 (88%) in the 2025 poll said they used tools such as ChatGPT for their assessments, up from 53% last year.

The proportion using any AI tool surged from 66% in 2024 to 92% in 2025, meaning just 8% of students are not using AI, according to a report published by the Higher Education Policy Institute and Kortext, a digital e-textbook provider.

Josh Freeman, the report's author, said such dramatic changes in behaviour in just 12 months were almost unheard of, and warned: "Universities should take heed: generative AI is here to stay.

"There are urgent lessons here for institutions," Freeman said. "Every assessment must be reviewed in case it can be completed easily using AI. That will require bold retraining initiatives for staff in the power and potential of generative AI.

"Institutions will not solve any of these problems alone and should seek to share best practice with each other. Ultimately, AI tools should be harnessed to advance learning rather than inhibit it."

Students say they use genAI to explain concepts, summarise articles and suggest research ideas, but almost one in five (18%) admitted to including AI-generated text directly in their work.

"When asked why they use AI, students most often find it saves them time (51%) and improves the quality of their work (50%)," the report said. "The main factors putting them off using AI are the risk of being accused of academic misconduct and the fear of getting false or biased results."

One student told researchers: "I enjoy working with AI as it makes life easier when doing assignments; however, I do get scared I'll get caught."

Women are more worried about these factors than men, who show greater enthusiasm for AI, as do wealthier students and those on science, technology, engineering and maths (Stem) courses.

According to the report, half of students from the most privileged backgrounds used genAI to summarise articles, compared with 44% from the least privileged backgrounds. "The digital divide we identified in 2024 appears to have widened," the report concluded.

Students generally believe their universities have responded effectively to concerns over academic integrity, with 80% saying their institution's policy is "clear" and 76% believing their institution would spot the use of AI in assessments. Only a third (36%) of students have received training in AI skills from their university.

"They dance around the subject," said one student. "It's not banned but not advised, it's academic misconduct if you use it, but lecturers tell us they use it. Very mixed messages."

Dr Thomas Lancaster, a computer scientist at Imperial College London who researches academic integrity, said: "Students who aren't using generative AI tools are now a tiny minority.

"I know some students are resistant to AI, and I can understand the ethical concerns, but they're really putting themselves at quite a competitive disadvantage, both in education, and in showing themselves as ready for future careers."

A spokesperson for Universities UK said: "To effectively educate the workforce of tomorrow, universities must increasingly equip students to work in a world that will be shaped by AI, and it's clear progress is being made.

"But they need to balance this with the challenges posed by a rapidly developing technology. This survey shows that universities and students are alive to the potential risks posed by AI tools in the context of exams and assessment.

"All have codes of conduct that include severe penalties for students found to be submitting work that is not their own and they engage students from day one on the implications of cheating."
A dramatic increase in AI usage among UK university students for academic work has prompted calls for urgent policy changes and assessment reviews.
A recent survey by the Higher Education Policy Institute (HEPI) has revealed a dramatic surge in the use of artificial intelligence (AI) among UK undergraduate students. The study found that over 90% of students are now using AI in some form for their academic work, up from two-thirds just a year ago [1][2]. This rapid adoption of AI technology is forcing universities to reassess their policies and teaching methods.
The survey, which polled 1,041 full-time undergraduate students in the UK, showed that 88% of respondents had used generative AI tools like ChatGPT for assessments, a significant increase from 53% in the previous year [1]. Science students were more likely to embrace AI compared to their peers in the social sciences and humanities. Notably, 45% of students in science, engineering, or medical-related degrees believed AI-generated content could achieve good grades in their subjects, compared to only 29% of humanities students [1].
Students primarily cited "saving time" and "improving the quality of work" as their main reasons for using AI tools, with half of the respondents mentioning these factors [1][2]. However, the survey also highlighted concerns about academic integrity. While 25% of students considered it acceptable to include edited AI-generated text in assignments (up from 17% last year), only 6% believed using unedited AI content was appropriate [1].
The report identified persistent digital divides in AI competency. Male students and those from wealthier backgrounds were more likely to be frequent AI users [1][2]. This disparity raises concerns about equitable access to AI tools and their potential impact on academic performance across different demographic groups.
Universities are struggling to keep pace with the rapid adoption of AI. While the proportion of students saying staff were "well equipped" to support AI use doubled from 18% to 42% over the year, many still reported a lack of clarity regarding AI usage rules [1]. Josh Freeman, policy manager at HEPI, emphasized the urgent need for universities to review every assessment and retrain staff in the potential of generative AI [1][2].
The widespread use of AI is forcing a radical change in how universities assess students. Dr Thomas Lancaster from Imperial College London noted that students not using AI tools are now a tiny minority, potentially putting themselves at a competitive disadvantage [2]. Universities UK acknowledged the need to balance equipping students for an AI-shaped world with addressing the challenges posed by this rapidly developing technology [2].
As AI continues to transform higher education, universities face the challenge of harnessing these tools to advance learning rather than inhibit it. The report suggests that institutions should share best practices and develop clear policies to guide students on acceptable AI use [1][2]. This shift may lead to significant changes in curriculum design, assessment methods, and the overall approach to teaching and learning in higher education.