Curated by THEOUTPOST
On Fri, 2 May, 12:01 AM UTC
5 Sources
[1]
AI isn't replacing student writing - but it is reshaping it
I'm a writing professor who sees artificial intelligence as more of an opportunity for students than a threat. That sets me apart from some of my colleagues, who fear that AI is accelerating a glut of superficial content, impeding critical thinking and hindering creative expression. They worry that students are simply using it out of sheer laziness or, worse, to cheat.

Perhaps that's why so many students are afraid to admit that they use ChatGPT. In The New Yorker magazine, historian D. Graham Burnett recounts asking his undergraduate and graduate students at Princeton whether they'd ever used ChatGPT. No one raised their hand. "It's not that they're dishonest," he writes. "It's that they're paralyzed." Students seem to have internalized the belief that using AI for their coursework is somehow wrong.

Yet, whether my colleagues like it or not, most college students are using it. A February 2025 report from the Higher Education Policy Institute in the U.K. found that 92% of university students are using AI in some form. As early as August 2023 - a mere nine months after ChatGPT's public release - more than half of first-year students at Kennesaw State University, the public research institution where I teach, reported that they believed that AI is the future of writing.

It's clear that students aren't going to magically stop using AI. So I think it's important to point out some ways in which AI can actually be a useful tool that enhances, rather than hampers, the writing process.

Helping with the busywork

A February 2025 OpenAI report on ChatGPT use among college-aged users found that more than one-quarter of their ChatGPT conversations were education-related. The report also revealed that the top five uses for students were writing-centered: starting papers and projects (49%); summarizing long texts (48%); brainstorming creative projects (45%); exploring new topics (44%); and revising writing (44%).
These figures challenge the assumption that students use AI merely to cheat or write entire papers. Instead, they suggest students are leveraging AI to free up more time to engage in deeper processes and metacognitive behaviors - deliberately organizing ideas, honing arguments and refining style. If AI allows students to automate routine cognitive tasks - like information retrieval or ensuring that verb tenses are consistent - it doesn't mean they're thinking less. It means their thinking is changing.

Of course, students can misuse AI if they use the technology passively, reflexively accepting its outputs and ideas. And overreliance on ChatGPT can erode a student's unique voice or style. However, as long as students learn how to use AI intentionally, this shift can be seen as an opportunity, rather than a loss.

Clarifying the creative vision

It has also become clear that AI, when used responsibly, can augment human creativity. For example, science comedy writer Sarah Rose Siskind recently gave a talk to Harvard students about her creative process. She spoke about how she uses ChatGPT to brainstorm joke setups and explore various comedic scenarios, which allows her to focus on crafting punchlines and refining her comedic timing. Note how Siskind used AI in ways that didn't supplant the human touch. Instead of replacing her creativity, AI amplified it by providing structured and consistent feedback, giving her more time to polish her jokes.

Another example is the Rhetorical Prompting Method, which I developed alongside fellow Kennesaw State University researchers. Designed for university students and adult learners, it's a framework for conversing with an AI chatbot, one that emphasizes the importance of agency in guiding AI outputs. When writers use precise language to prompt, critical thinking to reflect, and intentional revision to sculpt inputs and outputs, they direct AI to help them generate content that aligns with their vision.
There's still a process

The Rhetorical Prompting Method mirrors best practices in process writing, which encourages writers to revisit, refine and revise their drafts. When using ChatGPT, though, it's all about thoughtfully revisiting and revising prompts and outputs.

For instance, say a student wants to create a compelling PSA for social media to encourage campus composting. She considers her audience. She prompts ChatGPT to draft a short, upbeat message in under 50 words that's geared to college students. Reading the first output, she notices it lacks urgency. So she revises the prompt to emphasize immediate impact. She also adds some additional specifics that are important to her message, such as the location of an information session. The final PSA reads: "Every scrap counts! Join campus composting today at the Commons. Your leftovers aren't trash - they're tomorrow's gardens. Help our university bloom brighter, one compost bin at a time."

The Rhetorical Prompting Method isn't groundbreaking; it's riffing on a process that's been tested in the writing studies discipline for decades. But I've found that it works by showing writers how to prompt intentionally. I know this because we asked users about their experiences. In an ongoing study, my colleagues and I polled 133 people who used the Rhetorical Prompting Method for their academic and professional writing:

- 92% reported that it helped them evaluate writing choices before and during their process.
- 75% said that they were able to maintain their authentic voice while using AI assistance.
- 89% responded that it helped them think critically about their writing.

The data suggests that learners take their writing seriously. Their responses reveal that they are thinking carefully about their writing styles and strategies. While this data is preliminary, we continue to gather responses in different courses, disciplines and learning environments.
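Viewed abstractly, the prompt-review-revise cycle just described is a feedback loop. The sketch below is a hypothetical illustration of that loop, not an official implementation of the Rhetorical Prompting Method: the helper names are invented, and `generate` stands in for a call to any chatbot.

```python
def build_prompt(task, audience, constraints):
    """Compose a precise prompt from the writer's rhetorical choices."""
    parts = [task, f"Audience: {audience}."]
    parts += [f"Constraint: {c}." for c in constraints]
    return " ".join(parts)

def revise_until_satisfied(generate, review, task, audience, constraints, max_rounds=3):
    """Prompt, reflect on the draft, and fold the writer's critique into the next prompt."""
    draft = ""
    for _ in range(max_rounds):
        prompt = build_prompt(task, audience, constraints)
        draft = generate(prompt)        # the chatbot produces a draft
        critique = review(draft)        # the writer reflects, e.g. "add urgency"
        if not critique:                # draft aligns with the writer's vision
            break
        constraints.append(critique)    # intentional revision of the prompt
    return draft
```

In the composting example, the student's first critique ("lacks urgency") and her added specifics (the information-session location) would each become new constraints in the next round of prompting.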
All of this is to say that, while there are divergent points of view over when and where it's appropriate to use AI, students are certainly using it. And being provided with a framework can help them think more deeply about their writing. AI, then, is not just a tool that's useful for trivial tasks. It can be an asset for creativity. If today's students - who are actively using AI to write, revise and explore ideas - see AI as a writing partner, I think it's a good idea for professors to start thinking about helping them learn the best ways to work with it.
[2]
Essay challenge: ChatGPT vs students
ChatGPT vs students: study reveals who writes better (and it's not the AI)

AI-generated essays don't yet live up to the efforts of real students, according to new research from the University of East Anglia (UK). A new study published today compared the work of 145 real students with essays generated by ChatGPT. While the AI essays were found to be impressively coherent and grammatically sound, they fell short in one crucial area - they lacked a personal touch.

As the line between human and machine writing continues to blur, the study underlines the importance of fostering critical literacy and ethical awareness in the digital age. It is hoped that the findings could help educators spot cheating in schools, colleges and universities worldwide by recognising machine-generated essays.

Prof Ken Hyland, from UEA's School of Education and Lifelong Learning, said: "Since its public release, ChatGPT has created considerable anxiety among teachers worried that students will use it to write their assignments.

"The fear is that ChatGPT and other AI writing tools potentially facilitate cheating and may weaken core literacy and critical thinking skills. This is especially the case as we don't yet have tools to reliably detect AI-created texts.

"In response to these concerns, we wanted to see how closely AI can mimic human essay writing, particularly focusing on how writers engage with readers."

The research team analysed 145 essays written by real university students and another 145 generated by ChatGPT. "We were particularly interested in looking at what we called 'engagement markers' like questions and personal commentary," said Prof Hyland. "We found that the essays written by real students consistently featured a rich array of engagement strategies, making them more interactive and persuasive.

"They were full of rhetorical questions, personal asides, and direct appeals to the reader - all techniques that enhance clarity, connection, and produce a strong argument.
"The ChatGPT essays on the other hand, while linguistically fluent were more impersonal. The AI essays mimicked academic writing conventions but they were unable to inject text with a personal touch or to demonstrate a clear stance. "They tended to avoid questions and limited personal commentary. Overall, they were less engaging, less persuasive, and there was no strong perspective on a topic. "This reflects the nature of its training data and statistical learning methods, which prioritise coherence over conversational nuance," he added. Despite its shortcomings, the study does not dismiss the role of AI in the classroom. Instead, the researchers say that tools like ChatGPT should be used as teaching aids rather than shortcuts. "When students come to school, college or university, we're not just teaching them how to write, we're teaching them how to think - and that's something no algorithm can replicate," added Prof Hyland. This study was led by UEA in collaboration with Prof Kevin Jiang of Jilin University, China. 'Does ChatGPT write like a student? Engagement markers in argumentative essays' is published in the journal Written Communication.
[3]
ChatGPT vs. students: Study reveals who writes better
AI-generated essays don't yet live up to the efforts of real students, according to new research from the University of East Anglia (UK). A new study published in Written Communication compared the work of 145 real students with essays generated by ChatGPT. The paper is titled "Does ChatGPT write like a student? Engagement markers in argumentative essays."

While the AI essays were found to be impressively coherent and grammatically sound, they fell short in one crucial area -- they lacked a personal touch. As the line between human and machine writing continues to blur, the study underlines the importance of fostering critical literacy and ethical awareness in the digital age. It is hoped that the findings could help educators spot cheating in schools, colleges and universities worldwide by recognizing machine-generated essays.

Prof Ken Hyland, from UEA's School of Education and Lifelong Learning, said, "Since its public release, ChatGPT has created considerable anxiety among teachers worried that students will use it to write their assignments.

"The fear is that ChatGPT and other AI writing tools potentially facilitate cheating and may weaken core literacy and critical thinking skills. This is especially the case as we don't yet have tools to reliably detect AI-created texts.

"In response to these concerns, we wanted to see how closely AI can mimic human essay writing, particularly focusing on how writers engage with readers."

The research team analyzed 145 essays written by real university students and another 145 generated by ChatGPT. "We were particularly interested in looking at what we called 'engagement markers' like questions and personal commentary," said Prof Hyland. "We found that the essays written by real students consistently featured a rich array of engagement strategies, making them more interactive and persuasive.

"They were full of rhetorical questions, personal asides, and direct appeals to the reader -- all techniques that enhance clarity, connection, and produce a strong argument.

"The ChatGPT essays, on the other hand, while linguistically fluent, were more impersonal. The AI essays mimicked academic writing conventions but they were unable to inject text with a personal touch or to demonstrate a clear stance.

"They tended to avoid questions and limited personal commentary. Overall, they were less engaging, less persuasive, and there was no strong perspective on a topic.

"This reflects the nature of its training data and statistical learning methods, which prioritize coherence over conversational nuance," he added.

Despite its shortcomings, the study does not dismiss the role of AI in the classroom. Instead, the researchers say that tools like ChatGPT should be used as teaching aids rather than shortcuts. "When students come to school, college or university, we're not just teaching them how to write, we're teaching them how to think -- and that's something no algorithm can replicate," added Prof Hyland.

This study was led by UEA in collaboration with Prof Kevin Jiang of Jilin University, China.
[4]
AI isn't replacing student writing, but it is reshaping it
I'm a writing professor who sees artificial intelligence as more of an opportunity for students than a threat. That sets me apart from some of my colleagues, who fear that AI is accelerating a glut of superficial content, impeding critical thinking and hindering creative expression. They worry that students are simply using it out of sheer laziness or, worse, to cheat.

Perhaps that's why so many students are afraid to admit that they use ChatGPT. In The New Yorker magazine, historian D. Graham Burnett recounts asking his undergraduate and graduate students at Princeton whether they'd ever used ChatGPT. No one raised their hand. "It's not that they're dishonest," he writes. "It's that they're paralyzed." Students seem to have internalized the belief that using AI for their coursework is somehow wrong.

Yet, whether my colleagues like it or not, most college students are using it. A February 2025 report from the Higher Education Policy Institute in the U.K. found that 92% of university students are using AI in some form. As early as August 2023 -- a mere nine months after ChatGPT's public release -- more than half of first-year students at Kennesaw State University, the public research institution where I teach, reported that they believed that AI is the future of writing.

It's clear that students aren't going to magically stop using AI. So I think it's important to point out some ways in which AI can actually be a useful tool that enhances, rather than hampers, the writing process.

Helping with the busywork

A February 2025 OpenAI report on ChatGPT use among college-aged users found that more than one-quarter of their ChatGPT conversations were education-related. The report also revealed that the top five uses for students were writing-centered: starting papers and projects (49%); summarizing long texts (48%); brainstorming creative projects (45%); exploring new topics (44%); and revising writing (44%).
These figures challenge the assumption that students use AI merely to cheat or write entire papers. Instead, they suggest students are leveraging AI to free up more time to engage in deeper processes and metacognitive behaviors -- deliberately organizing ideas, honing arguments and refining style. If AI allows students to automate routine cognitive tasks -- like information retrieval or ensuring that verb tenses are consistent -- it doesn't mean they're thinking less. It means their thinking is changing.

Of course, students can misuse AI if they use the technology passively, reflexively accepting its outputs and ideas. And overreliance on ChatGPT can erode a student's unique voice or style. However, as long as students learn how to use AI intentionally, this shift can be seen as an opportunity, rather than a loss.

Clarifying the creative vision

It has also become clear that AI, when used responsibly, can augment human creativity. For example, science comedy writer Sarah Rose Siskind recently gave a talk to Harvard students about her creative process. She spoke about how she uses ChatGPT to brainstorm joke setups and explore various comedic scenarios, which allows her to focus on crafting punchlines and refining her comedic timing. Note how Siskind used AI in ways that didn't supplant the human touch. Instead of replacing her creativity, AI amplified it by providing structured and consistent feedback, giving her more time to polish her jokes.

Another example is the Rhetorical Prompting Method, which I developed alongside fellow Kennesaw State University researchers. Designed for university students and adult learners, it's a framework for conversing with an AI chatbot, one that emphasizes the importance of agency in guiding AI outputs. When writers use precise language to prompt, critical thinking to reflect, and intentional revision to sculpt inputs and outputs, they direct AI to help them generate content that aligns with their vision.
There's still a process

The Rhetorical Prompting Method mirrors best practices in process writing, which encourages writers to revisit, refine and revise their drafts. When using ChatGPT, though, it's all about thoughtfully revisiting and revising prompts and outputs.

For instance, say a student wants to create a compelling PSA for social media to encourage campus composting. She considers her audience. She prompts ChatGPT to draft a short, upbeat message in under 50 words that's geared to college students. Reading the first output, she notices it lacks urgency. So she revises the prompt to emphasize immediate impact. She also adds some additional specifics that are important to her message, such as the location of an information session. The final PSA reads: "Every scrap counts! Join campus composting today at the Commons. Your leftovers aren't trash -- they're tomorrow's gardens. Help our university bloom brighter, one compost bin at a time."

The Rhetorical Prompting Method isn't groundbreaking; it's riffing on a process that's been tested in the writing studies discipline for decades. But I've found that it works by showing writers how to prompt intentionally. I know this because we asked users about their experiences. In an ongoing study, my colleagues and I polled 133 people who used the Rhetorical Prompting Method for their academic and professional writing:

- 92% reported that it helped them evaluate writing choices before and during their process.
- 75% said that they were able to maintain their authentic voice while using AI assistance.
- 89% responded that it helped them think critically about their writing.

The data suggests that learners take their writing seriously. Their responses reveal that they are thinking carefully about their writing styles and strategies. While this data is preliminary, we continue to gather responses in different courses, disciplines and learning environments.

All of this is to say that, while there are divergent points of view over when and where it's appropriate to use AI, students are certainly using it. And being provided with a framework can help them think more deeply about their writing. AI, then, is not just a tool that's useful for trivial tasks.
It can be an asset for creativity. If today's students -- who are actively using AI to write, revise and explore ideas -- see AI as a writing partner, I think it's a good idea for professors to start thinking about helping them learn the best ways to work with it.
[5]
Man vs. machine: Why students still write better than AI - Earth.com
A new study shows that essays drafted by ChatGPT read smoothly, yet feel distant. AI essays lack the subtle devices that pull readers in. This gap may help teachers separate genuine coursework from machine writing. The finding urges stronger critical literacy in an age of rapid text generation.

Researchers at the University of East Anglia, with a collaborator from Jilin University, worked on the comparison. They judged 145 student essays and the same number created by ChatGPT, scanning each for cues that invite a reader to engage.

The scientists hope the results will aid teachers and exam boards worldwide in flagging suspicious assignments. Software detectors still generate false positives, so human judgment remains vital. By learning the discourse habits most students use, markers can notice when a script drops the personal edge that usually colors undergraduate prose. This knowledge supports fair grading and helps preserve academic integrity in classrooms already flooded with generative tools.

Professor Ken Hyland of UEA's School of Education and Lifelong Learning voiced the worry driving the project. "Since its public release, ChatGPT has created considerable anxiety among teachers worried that students will use it to write their assignments," said Hyland, who warned of wider harm.

"The fear is that ChatGPT and other AI writing tools potentially facilitate cheating and may weaken core literacy and critical thinking skills. This is especially the case as we don't yet have tools to reliably detect AI-created texts."

The researchers searched for questions, personal asides, direct appeals, and other "engagement markers." "We were particularly interested in looking at what we called 'engagement markers' like questions and personal commentary," Hyland said. "We found that the essays written by real students consistently featured a rich array of engagement strategies, making them more interactive and persuasive."
Human writers sprinkled their work with moments that addressed the reader, building a sense of dialogue. "They were full of rhetorical questions, personal asides, and direct appeals to the reader - all techniques that enhance clarity, connection, and produce a strong argument."

"The ChatGPT essays, on the other hand, while linguistically fluent, were more impersonal. The AI essays mimicked academic writing conventions but they were unable to inject text with a personal touch or to demonstrate a clear stance," Hyland explained.

Without questions or vivid commentary, the machine output felt flat. "They tended to avoid questions and limited personal commentary. Overall, they were less engaging, less persuasive, and there was no strong perspective on a topic."

"This reflects the nature of its training data and statistical learning methods, which prioritize coherence over conversational nuance," noted Hyland.

Even so, the authors do not reject AI tools. They see them as potential tutors when used openly. "When students come to school, college or university, we're not just teaching them how to write, we're teaching them how to think - and that's something no algorithm can replicate," said Hyland.

The researchers urge teachers to design process-based tasks that require drafts and reflection - steps no chatbot can provide. Training students to spot engagement markers, they add, will sharpen both writing and detection skills.

The study lands as detection software races to catch up with generative models. Commercial tools still stumble on hybrid texts that mix human and machine sentences. For now, stylistic clues give teachers a slim but useful edge, though the target may shift as AI absorbs more conversational cues.

Coursework must remain proof of independent thought. If essays lose that role to AI, qualification systems wobble. By showing where AI still falters, the study offers data to shape new safeguards while keeping space for honest, imaginative student voices.
The authors place the project within a wider push for digital literacy. Students meet machine text in news feeds, search results, and chat apps long before they step into lecture halls. Teaching them to ask who wrote a sentence and why it was written is now central to education. Writing with a visible stance trains that habit. It also makes learners less willing to accept anonymous prose at face value.
Recent studies reveal that while AI-generated essays are coherent, they lack the personal touch and engagement strategies of student-written work. This highlights both the potential and limitations of AI in academic writing.
The integration of AI, particularly ChatGPT, into academic writing has sparked a heated debate in educational circles. While some educators view AI as a threat to traditional learning methods, others see it as an opportunity to enhance the writing process [1].
Despite concerns, AI usage among students is becoming increasingly prevalent. A February 2025 report from the Higher Education Policy Institute found that 92% of university students are using AI in some form [1]. This trend is further supported by an OpenAI report revealing that more than a quarter of ChatGPT conversations among college-aged users were education-related [4].
Contrary to fears of AI replacing student writing entirely, research suggests that students are using AI tools to augment their writing process. The top five uses for AI in writing include starting papers, summarizing texts, brainstorming ideas, exploring new topics, and revising written work [1].
A study by the University of East Anglia compared 145 essays written by students with an equal number generated by ChatGPT. The findings revealed a crucial difference: human-written essays consistently featured a rich array of engagement strategies, making them more interactive and persuasive [2].
Professor Ken Hyland, lead researcher, noted that while AI-generated essays were linguistically fluent, they lacked personal touch and clear stance. They tended to avoid rhetorical questions and limited personal commentary, resulting in less engaging and less persuasive content [3].
To address these challenges, researchers have developed the Rhetorical Prompting Method, a framework for interacting with AI chatbots that emphasizes user agency. This method encourages writers to use precise language, critical thinking, and intentional revision when working with AI, ensuring that the final content aligns with their vision [4].
The distinct characteristics of human-written essays may help educators identify AI-generated content, supporting fair grading and academic integrity. However, the study authors do not advocate for banning AI tools. Instead, they suggest using them as teaching aids to enhance the learning process [5].
As AI continues to evolve, the landscape of academic writing is likely to change. Educators are urged to design process-based tasks that require drafts and reflection - steps that AI cannot replicate. The focus is shifting towards teaching students not just how to write, but how to think critically in an age of rapid text generation [5].
© 2025 TheOutpost.AI All rights reserved