2 Sources
[1]
OpenAI Wants to Get College Kids Hooked on AI
AI chatbots like OpenAI's ChatGPT have been shown repeatedly to provide false information, hallucinate completely made-up sources and facts, and lead people astray with their confidently wrong answers to questions. For that reason, AI tools are viewed with skepticism by many educators. So, of course, OpenAI and its competitors are targeting colleges and pushing their services on students, concerns be damned.
According to the New York Times, OpenAI is in the midst of a major push to make ChatGPT a fixture on college campuses, replacing many aspects of the college experience with AI alternatives. According to the report, the company wants college students to have a "personalized AI account" as soon as they step on campus, same as how they receive a school email address. It envisions ChatGPT serving as everything from a personal tutor to a teacher's aide to a career assistant that helps students find work after graduation.
Some schools are already buying in, despite the educational world initially greeting AI with distrust and outright bans. Per the Times, schools like the University of Maryland, Duke University, and California State University have all signed up for OpenAI's premium service, ChatGPT Edu, and have started to integrate the chatbot into different parts of the educational experience. OpenAI is not alone in setting its sights on higher education, either. Elon Musk's xAI offered free access to its chatbot Grok to students during exam season, and Google is currently offering its Gemini AI suite to students for free through the end of the 2025-26 academic year. But that is outside of the actual infrastructure of higher education, which is where OpenAI is attempting to operate.
That universities are opting to embrace AI, after initially taking hardline positions against it over fears of cheating, is unfortunate. There is already a fair amount of evidence piling up that AI is not all that beneficial if your goal is to learn and retain accurate information. A study published earlier this year found that reliance on AI can erode critical thinking skills. Others have similarly found that people will "offload" the more difficult cognitive work and rely on AI as a shortcut. If the idea of university is to help students learn how to think, AI undermines it. And that's before you get into the misinformation of it all.
In an attempt to see how AI could serve in a focused education setting, researchers tried training different models on a patent law casebook to see how they performed when asked questions about the material. They all produced false information, hallucinated cases that did not exist, and made errors. The researchers reported that OpenAI's GPT model offered answers that were "unacceptable" and "harmful for learning" about a quarter of the time. That's not ideal.
Considering that OpenAI and other companies want to get their chatbots ingrained not just in the classroom, but in every aspect of student life, there are other harms to consider, too. Reliance on AI chatbots can have a negative impact on social skills. And the simple fact that universities are investing in AI means they aren't investing in areas that would create more human interactions. A student going to see a tutor, for example, creates a social interaction that requires using emotional intelligence and establishing trust and connection, ultimately adding to a sense of community and belonging. A chatbot just spits out an answer, which may or may not be correct.
[2]
Welcome to campus, here's your ChatGPT
OpenAI, the maker of ChatGPT, has a plan to overhaul college education -- by embedding its artificial intelligence tools in every facet of campus life. If the company's strategy succeeds, universities would give students AI assistants to help guide and tutor them from orientation day through graduation. Professors would provide customized AI study bots for each class. Career services would offer recruiter chatbots for students to practice job interviews. And undergrads could turn on a chatbot's voice mode to be quizzed aloud before a test. OpenAI dubs its sales pitch "AI-native universities."
"Our vision is that, over time, AI would become part of the core infrastructure of higher education," Leah Belsky, OpenAI's vice president of education, said in an interview. In the same way that colleges give students school email accounts, she said, soon "every student who comes to campus would have access to their personalized AI account."
To spread chatbots on campuses, OpenAI is selling premium AI services to universities for faculty and student use. It is also running marketing campaigns aimed at getting students who have never used chatbots to try ChatGPT.
Some universities, including the University of Maryland and California State University, are already working to make AI tools part of students' everyday experiences. In early June, Duke University began offering unlimited ChatGPT access to students, faculty and staff. The school also introduced a university platform, called DukeGPT, with AI tools developed by Duke.
OpenAI's campaign is part of an escalating AI arms race among tech giants to win over universities and students with their chatbots. The company is following in the footsteps of rivals like Google and Microsoft that have for years pushed to get their computers and software into schools, and court students as future customers. The competition is so heated that Sam Altman, OpenAI's CEO, and Elon Musk, who founded the rival xAI, posted dueling announcements on social media this spring offering free premium AI services for college students during exam period. Then Google upped the ante, announcing free student access to its premium chatbot service "through finals 2026."
OpenAI ignited the recent AI education trend. In late 2022, the company's rollout of ChatGPT, which can produce human-sounding essays and term papers, helped set off a wave of chatbot-fueled cheating. Generative AI tools like ChatGPT, which are trained on large databases of texts, also make stuff up, which can mislead students. Less than three years later, millions of college students regularly use AI chatbots as research, writing, computer programming and idea-generating aides.
Now OpenAI is capitalizing on ChatGPT's popularity to promote the company's AI services to universities as the new infrastructure for college education. OpenAI's service for universities, ChatGPT Edu, offers more features, including certain privacy protections, than the company's free chatbot. ChatGPT Edu also enables faculty and staff to create custom chatbots for university use. (OpenAI offers consumers premium versions of its chatbot for a monthly fee.)
OpenAI's push to AI-ify college education amounts to a national experiment on millions of students. The use of these chatbots in schools is so new that their potential long-term educational benefits, and possible side effects, are not yet established. A few early studies have found that outsourcing tasks like research and writing to chatbots can diminish skills like critical thinking. And some critics argue that colleges going all-in on chatbots are glossing over issues like societal risks, AI labor exploitation and environmental costs.
OpenAI's campus marketing effort comes as unemployment has increased among recent college graduates -- particularly in fields like software engineering, where AI is now automating some tasks previously done by humans. In hopes of boosting students' career prospects, some universities are racing to provide AI tools and training. California State University announced this year that it was making ChatGPT available to more than 460,000 students across its 23 campuses to help prepare them for "California's future AI-driven economy." Cal State said the effort would help make the school "the nation's first and largest AI-empowered university system."
Some universities say they are embracing the new AI tools in part because they want their schools to help guide, and develop guardrails for, the technologies. "You're worried about the ecological concerns. You're worried about misinformation and bias," Edmund Clark, the chief information officer of California State University, said at a recent education conference in San Diego. "Well, join in. Help us shape the future."
Last spring, OpenAI introduced ChatGPT Edu, its first product for universities, which offers access to the company's latest AI. Paying clients like universities also get more privacy: OpenAI says it does not use the information that students, faculty and administrators enter into ChatGPT Edu to train its AI. (The New York Times has sued OpenAI and its partner, Microsoft, over copyright infringement. Both companies have denied wrongdoing.)
Last fall, OpenAI hired Belsky to oversee its education efforts. An ed tech startup veteran, she previously worked at Coursera, which offers college and professional training courses. She is pursuing a two-pronged strategy: marketing OpenAI's premium services to universities for a fee while advertising free ChatGPT directly to students.
OpenAI also convened a panel of college students recently to help get their peers to start using the tech. Among those students are power users like Delphine Tai-Beauchamp, a computer science major at the University of California, Irvine. She has used the chatbot to explain complicated course concepts, as well as help explain coding errors and make charts diagramming the connections between ideas. "I wouldn't recommend students use AI to avoid the hard parts of learning," Tai-Beauchamp said. She did recommend students try AI as a study aid. "Ask it to explain something five different ways."
Belsky said these kinds of suggestions helped the company create its first billboard campaign aimed at college students. "Can you quiz me on the muscles of the leg?" asked one ChatGPT billboard, posted this spring in Chicago. "Give me a guide for mastering this Calc 101 syllabus," another said.
Belsky said OpenAI had also begun funding research into the educational effects of its chatbots. "The challenge is, how do you actually identify what are the use cases for AI in the university that are most impactful?" Belsky said during a December AI event at Cornell Tech in New York City. "And then how do you replicate those best practices across the ecosystem?"
Some faculty members have already built custom chatbots for their students by uploading course materials like their lecture notes, slides, videos and quizzes into ChatGPT. Jared DeForest, the chair of environmental and plant biology at Ohio University, created his own tutoring bot, called SoilSage, which can answer students' questions based on his published research papers and science knowledge. Limiting the chatbot to trusted information sources has improved its accuracy, he said. "The curated chatbot allows me to control the information in there to get the product that I want at the college level," DeForest said.
But even when trained on specific course materials, AI can make mistakes. In a new study -- "Can AI Hold Office Hours?" -- law school professors uploaded a patent law casebook into AI models from OpenAI, Google and Anthropic. Then they asked dozens of patent law questions based on the casebook and found that all three AI chatbots made "significant" legal errors that could be "harmful for learning."
"This is a good way to lead students astray," said Jonathan S. Masur, a professor at the University of Chicago Law School and a co-author of the study. "So I think that everyone needs to take a little bit of a deep breath and slow down."
OpenAI said the 250,000-word casebook used for the study was more than twice the length of text that its GPT-4o model can process at once. Anthropic said the study had limited usefulness because it did not compare the AI with human performance. Google said its model accuracy had improved since the study was conducted.
Belsky said a new "memory" feature, which retains and can refer to previous interactions with a user, would help ChatGPT tailor its responses to students over time and make the AI "more valuable as you grow and learn." Privacy experts warn that this kind of tracking feature raises concerns about long-term tech company surveillance.
In the same way that many students today convert their school-issued Gmail accounts into personal accounts when they graduate, Belsky envisions graduating students bringing their AI chatbots into their workplaces and using them for life. "It would be their gateway to learning -- and career life thereafter," Belsky said.
OpenAI is aggressively promoting the integration of AI tools, particularly ChatGPT, into various aspects of college life, from personalized tutoring to career assistance, despite ongoing concerns about AI's impact on education.
OpenAI, the company behind ChatGPT, is spearheading a significant initiative to integrate artificial intelligence into every aspect of college life. The company envisions a future where students receive personalized AI accounts upon entering campus, similar to how they currently receive school email addresses [1][2]. This ambitious plan aims to transform the educational landscape by embedding AI tools in various facets of the university experience, from orientation to graduation.
Despite initial skepticism and outright bans on AI tools in educational settings, several prominent institutions are now adopting OpenAI's premium service, ChatGPT Edu. The University of Maryland, Duke University, and California State University are among the early adopters integrating the chatbot into different aspects of their educational programs [1][2]. Duke University, for instance, has introduced DukeGPT, a university platform featuring AI tools developed in-house, while offering unlimited ChatGPT access to its students, faculty, and staff [2].
OpenAI's push into higher education is part of a larger trend, with tech giants like Google and Microsoft also vying for a share of the educational market. The competition has intensified to the point where company CEOs are making public offers of free premium AI services to college students during exam periods [2]. This race to capture the student market underscores the growing importance of AI in education and the tech industry's recognition of universities as crucial battlegrounds for future customers.
Proponents of AI integration in education argue that these tools can serve as personal tutors, teacher's aides, and career assistants, potentially enhancing the learning experience and preparing students for an AI-driven economy [1][2]. California State University, for example, is positioning itself as "the nation's first and largest AI-empowered university system" in an effort to boost students' career prospects [2].
However, the rapid adoption of AI in education has raised significant concerns:
Impact on Critical Thinking: Early studies suggest that outsourcing tasks like research and writing to AI chatbots may diminish critical thinking skills [2].
Misinformation and Accuracy: AI models, including OpenAI's GPT, have been shown to produce false information and hallucinate non-existent cases, particularly in specialized fields like patent law [1].
Social Skills and Community: There are worries that reliance on AI chatbots could negatively impact students' social skills and reduce human interactions that are crucial for building a sense of community and belonging on campus [1].
Ethical and Environmental Concerns: Critics argue that the widespread adoption of AI in education glosses over issues such as societal risks, labor exploitation in AI development, and the environmental costs of running large AI models [2].
As OpenAI and other tech companies continue their push to integrate AI into higher education, the long-term effects of this technological shift remain uncertain. While some universities are embracing AI tools to help shape their development and establish guardrails, others remain cautious. The coming years will likely see ongoing debates about the appropriate role of AI in education, balancing the potential benefits with the need to preserve essential human skills and interactions in the learning process.
Summarized by Navi