Professors develop custom AI teaching assistants to reshape how universities approach generative AI


Professors at Barnard College and George Washington University are creating custom AI teaching assistants to help students learn academic writing and research skills. Benjamin Breyer spent two years developing Althea, a chatbot trained on his lectures and student work, securing $30,000 in grants. These educators aim to teach critical and effective AI use while mitigating biases found in commercial AI tools like ChatGPT.

Professors Turn to Custom Solutions Amid AI Debate

The conversation around generative AI in universities has moved beyond simple bans and blanket acceptance. At Barnard College in Manhattan, Benjamin Breyer represents a growing number of educators who are charting a middle path, one that acknowledges both the promise and the pitfalls of the technology [1]. While Barnard's first-year writing program generally prohibits tools like ChatGPT, Claude, and Gemini over concerns about factual accuracy and perpetuated biases, Breyer has secured an exception to test whether AI teaching assistants can supplement student effort rather than replace it [2].

This shift matters because it signals a pragmatic approach to a technology that students will inevitably encounter throughout their academic and professional lives. Rather than ignoring generative AI or allowing uncontrolled use, these professors are building custom assistants that provide a controlled environment where AI literacy can be taught responsibly.

Building Althea: A Custom Chatbot for Academic Writing

Over two years, Breyer and Marko Krkeljas, a former Barnard software developer, invested thousands of hours developing Althea, named after the Grateful Dead song [1]. Breyer secured approximately $30,000 in grants and technical support to build the custom chatbot on a customizable OpenAI platform, now accessible at academicwritingtools.com [2].

Source: NYT

The motivation was clear: off-the-shelf commercial AI tools proved inadequate for teaching the nuanced academic skills students need to join scholarly conversation. To train Althea to mimic his own teaching style, Breyer fed it transcripts of his lectures, course materials, and samples of exemplary student work [1]. He configured the bot's persona as a "tutor at an elite liberal arts college" that prompts students to improve their answers while giving a "hard refusal" if asked to write content directly [2].

This approach to integrating AI as a supplementary learning tool demonstrates how educators can mitigate the biases of commercial AI by creating purpose-built alternatives. Breyer even instructed Althea to be less flattering, countering generative AI's typical eagerness to please [1].
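The article does not publish Althea's actual configuration, but behavioral rules like these are typically encoded in a system prompt. As a rough illustration only, a prompt of the kind Breyer describes (persona, a hard refusal on ghostwriting, dialed-down flattery) might be assembled like this; every name and phrase below is hypothetical, not the real Althea setup:

```python
# Hypothetical sketch of a system prompt encoding the behaviors the
# article describes for Althea: a tutor persona, a "hard refusal" when
# asked to write content directly, and reduced flattery. This is an
# illustration, not Breyer's actual configuration.

def build_tutor_prompt(persona: str, refusal_topics: list[str]) -> str:
    """Assemble a system prompt for a writing-tutor chatbot."""
    # One explicit refusal rule per prohibited request type.
    rules = "\n".join(
        f"- Refuse outright if asked to {topic}." for topic in refusal_topics
    )
    return (
        f"You are a {persona}.\n"
        "Prompt the student to improve their own answer; never supply it.\n"
        f"{rules}\n"
        "Avoid flattery: do not praise work that has not earned it."
    )

prompt = build_tutor_prompt(
    persona="tutor at an elite liberal arts college",
    refusal_topics=[
        "write an essay or paragraph for the student",
        "rewrite the student's draft wholesale",
    ],
)
print(prompt)
```

A string like this would then be passed as the system (or developer) message when calling a chat-completion API, so every student turn is answered under the same constraints.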

Testing Results and Faculty Reassurance

After three semesters of testing Althea in one writing section while maintaining an identical section that barred AI, Breyer has drawn conclusions that reassure anxious colleagues [2]. "This is no threat to us at present," he tells fellow professors. "AI may help with the expression of an idea and articulating that expression. But the idea itself, the thing that's hardest to teach, is still going to remain our domain" [1].

This perspective matters for educators worried about being replaced by technology. Breyer's 22 years of experience at Barnard College and Columbia University lend credibility to his assessment that human creativity and critical thinking remain irreplaceable, even as AI tools become more sophisticated.

A Growing Movement Across Institutions

Breyer isn't alone in this effort. Alexa Alice Joubin, an English professor and co-director of the Digital Humanities Institute at George Washington University, has created her own AI teaching assistant, codeveloped with computer science student Akhilesh Rangani [2]. Her bot helps students refine research questions and summarize readings, another example of educators taking control of AI initiatives rather than passively accepting commercial solutions [1].

While many instructors experiment with AI, it remains unusual for English professors to develop a custom chatbot specifically designed to address the problematic implications of for-profit models [2]. This hands-on approach allows for critical and effective AI use tailored to specific pedagogical goals.

Institutional Support and Evolving Standards

At most universities, including Barnard College, individual professors retain authority over whether and how to allow AI in their classrooms [1]. College administrators play a dual role, supporting faculty decision-making while enabling access to AI tools through computer systems. This decentralized approach creates space for experimentation but also generates inconsistency in student experiences.

Legacy academic organizations are adapting their guidance. In an October 2024 working paper, a task force of the Modern Language Association, which promotes humanities studies, leaned toward engagement, stating that first-year writing courses "have a special responsibility" to teach students how to use generative AI "critically and effectively in academic situations and across their literate lives" [1][2].

What Students and Educators Should Watch

The contrast between Breyer's approach and that of program director Wendy Schor-Haim, who runs screen-free classes and has never tried ChatGPT, illustrates the ongoing tension in higher education [1]. Schor-Haim argues that "students tend to use it in our classrooms to do the work that we are here to teach them how to do," adding that AI "is very, very bad at that work" [2].

As more professors develop custom assistants and share results, the short-term implication is likely increased experimentation across disciplines. Long-term, these AI initiatives could establish new standards for how universities integrate technology while preserving the essential role of human instruction in developing critical thinking and original ideas. Students should expect more structured opportunities to learn AI literacy within academic contexts, while educators will need to decide whether to build their own tools or adapt existing ones to serve pedagogical goals rather than simply administrative efficiency.

TheOutpost.ai


© 2026 Triveous Technologies Private Limited