2 Sources
[1]
Universities could bolster democracy by fostering students' AI literacy
The fears are familiar: Artificial intelligence is going to eat our jobs, make our students weak and lazy and possibly destroy democracy for good measure. As AI has become more accessible to the public, it's become closely (and probably not unreasonably) associated with academic misconduct, especially plagiarism and other forms of cheating. For some time now, research has suggested that the futures of AI and post-secondary education will be deeply intertwined. What if, though, teaching students to use AI properly -- ethically, responsibly and critically -- could help make them better, more engaged citizens?

Fuelling debate

Since its public release in late 2022, ChatGPT, one of the most commonly used generative AI (GenAI) models in the world, has sparked furious academic debate. But the either/or argument that it will kill us or make us stronger is a false dichotomy. As a long-time post-secondary educator, public servant and current doctoral student examining education and civic literacy, I am interested in the potential for AI to help us build a healthier, more inclusive and more robust democracy by creating new ways to engage our critical thinking skills across disciplines.

I researched this article, in part, by using Scite.ai, a research tool to which I was introduced by Sarah Eaton, a member of my doctoral supervisory committee whose research focuses on academic ethics in higher education. Eaton has examined issues around student misconduct, and has also argued that the connection between civic and digital literacy, including the use of AI in post-secondary education, is strong and growing.

Universities and civic literacy

Civic literacy is about fostering students' potential to become active, engaged citizens in the pursuit of peaceful social change. Somewhere along the way, it seems, universities shied away from that part of their institutional role. Through western modernity, universities came to occupy roles as endowers of knowledge while building on more ancient expectations that education carried social obligations, often construed as a form of "noblesse oblige." Decolonial, democratic and educational criticism rightly underscores the importance of recognizing the varied forms of knowledge that exist throughout society and in learners' own lives, and how students and diverse disciplines collaborate to construct knowledge. Through this lens, as some scholars have argued, universities have become spaces to foster forms of civic literacy.

Educating for democracy

The role of colleges and universities in fostering civic literacy, sometimes known as educating for democracy, feeds their contribution to fostering democratic societies. Universities frequently point to this role proudly, speaking of it in broad, glowing terms without offering a lot of specifics. While universities and colleges often talk broadly about creating learning spaces conducive to democratic engagement and good citizenship, principles associated with democracy have tended to be concentrated in a relatively small number of academic disciplines, such as the humanities and the social and political sciences. The STEM disciplines don't always give them the same attention.

The need for digital and AI literacy across disciplines raises rich possibilities for fostering the teaching and learning of democratic or civic dispositions. This means cultivating students who become voting citizens, with the capacity to make informed political decisions about the leaders who represent them and to assess the validity of what those leaders present. The path to using AI to foster civic literacy requires the reinforcement of critical thinking, which encourages learners to challenge assumptions and cultivate independent thought.

Becoming critical, informed citizens

Many of us are familiar with the concerns about AI: it doesn't probe deeply; it can't assess credibility as a human might; it's typically working from dated information, having been trained on older, static data sets; it demonstrates bias and discrimination; and sometimes it can outright hallucinate, making up facts that have no basis in reality. There is also a void at the moment in institutional policies on the use or misuse of AI, and in how everyone understands them, which is understandable given how new the technology is.

This is where the connection between AI and civic literacy is especially strong: the same critical thinking skills we teach our students in literature, science or any other discipline can be applied when explaining AI policy or transparently examining AI outputs in classes, in relation to curricula and assignments. By teaching students to question outputs and assess their validity, accuracy and trustworthiness, we can help them enhance the very skills they'll ultimately need to become active, informed citizens. They might then stand a better chance of becoming more critical citizens, employing their skills to resolve disputes and to assess everything from the news they consume to the promises made by political leaders. This kind of teaching can also help develop the skills needed to combat political polarization and misinformation. True digital literacy includes not only determining in what contexts it could be appropriate to use AI, but also how to use AI-powered tools effectively.

Need for prudence

University educators have to be prudent in our approach, though. So-called "cognitive offloading" -- trusting machines to do our reasoning, thinking and memory work for us -- is a genuine risk. But this risk makes the argument for using AI to teach critical thinking even more compelling: human analysis of the output and its credibility is essential. In a presentation at the University of Calgary in March 2025, Eaton noted: "If anything, problems facing students, educators and citizens of the world may be even more complex in the future than they are today ... These next-generation citizens will be navigating and leading changes we have not yet even imagined."

What I am seeing in my research is that broadening the discussion to look at AI's potential to foster civic literacy -- as Eaton suggests -- may be crucial to the future of democracy.
[2]
Universities could bolster democracy by fostering students' AI literacy
The same article as source [1], republished from The Conversation under a Creative Commons license. Read the original article.
A new perspective on AI in education suggests that teaching students to use AI ethically and critically could enhance civic literacy and strengthen democratic engagement.
In an era where artificial intelligence (AI) is increasingly pervasive, universities are grappling with its impact on education and society. While concerns about AI's potential to facilitate academic misconduct and replace human skills are prevalent, a new perspective suggests that AI could be leveraged to strengthen democracy by enhancing students' civic literacy [1].
Traditionally, universities have been seen as bastions of knowledge dissemination. However, there's a growing recognition of their role in fostering civic engagement and democratic values. This shift aligns with the need to prepare students for a world where AI is ubiquitous. By teaching students to use AI ethically, responsibly, and critically, universities could help create more engaged and informed citizens [1].
The integration of AI literacy across disciplines presents rich possibilities for enhancing democratic dispositions. It goes beyond merely understanding how to use AI tools; it involves developing the capacity to critically evaluate AI outputs, question assumptions, and make informed decisions. These skills are crucial not only in academic settings but also in civic life, where citizens need to assess political information and make informed choices [2].
While AI offers numerous benefits, it also presents challenges. The risk of "cognitive offloading" – relying on machines for reasoning and memory – is a genuine concern. However, this risk underscores the importance of teaching critical thinking skills alongside AI literacy. By encouraging students to analyze AI outputs critically, universities can help develop more discerning and engaged citizens [1].
Currently, there's a lack of comprehensive institutional policies regarding the use of AI in education. This gap presents an opportunity for universities to develop guidelines that not only address the ethical use of AI but also incorporate it into curricula in ways that enhance critical thinking and civic engagement [2].
While principles of democracy have traditionally been concentrated in the humanities and social sciences, the pervasiveness of AI necessitates its integration across all disciplines, including STEM fields. This interdisciplinary approach to AI literacy can create a more holistic understanding of its impact on society and democracy [1].
By fostering AI literacy, universities have the potential to create a new generation of citizens who are not only technologically adept but also critically engaged with the democratic process. This approach could help combat issues like political polarization and misinformation, ultimately contributing to a healthier and more robust democracy.
Summarized by Navi