Teaching in the Context of Generative AI
Generative AI Tools – What are they and what can they do?
Since OpenAI rolled out ChatGPT in late 2022, higher education (among many other industries) has had to adapt to the unprecedented capabilities of large language models (LLMs) and other generative AI tools. These new tools generate surprisingly effective, novel responses to human prompts in the form of writing, code, images, and even video. AI tools can generate audio summaries of texts for readers, create convincing video simulations of real people, and simulate at least B-level writing on many topics. The wide availability of these tools, combined with their impressive and consistently improving outputs, has created many questions regarding academic integrity, the viability of certain kinds of assessments, and the future of work itself.
How Do LLMs Function?
The most widely known of these tools, ChatGPT, is a large language model (LLM). Using significant processing power, large data sets, and human training, LLMs generate probable responses in fluent language to user prompts. LLM development is a complex process that combines deep learning, natural language processing, hardware acceleration, and programmed ethical rules (guardrails). LLMs are essentially sophisticated predictive text engines, predicting the most statistically likely next word in response to a user's prompt.
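To make the predictive-text idea concrete, below is a minimal, purely illustrative Python sketch of next-word selection. It is not how an actual LLM is implemented: the candidate words and their probabilities are invented for demonstration, whereas a real model computes them from billions of learned parameters and the full context of the prompt.

```python
import random

# A toy illustration of next-word prediction (not a real LLM).
# The candidate words and probabilities below are invented for
# demonstration; an actual model derives them from billions of
# learned parameters and the full context of the prompt.
next_word_probs = {
    "mat": 0.55,   # most statistically likely continuation
    "sofa": 0.25,
    "roof": 0.15,
    "moon": 0.05,
}

def predict_next_word(probs):
    """Sample a next word in proportion to its estimated probability."""
    words = list(probs.keys())
    weights = list(probs.values())
    return random.choices(words, weights=weights, k=1)[0]

prompt = "The cat sat on the"
print(prompt, predict_next_word(next_word_probs))
# Most runs print "The cat sat on the mat", but less likely words
# appear occasionally, which is one reason the same prompt can
# produce different responses from an LLM.
```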
Ethical Use and Data Privacy
There are several concerns to keep in mind before deciding to use an AI tool or requiring that your students use one, including privacy, equity of access, and false or biased outputs. In general, whatever you enter into commercial AI tools is not private and will likely become part of the tool's training data, so do not enter proprietary or protected data. Additionally, many tools may not be designed with accessibility for all users in mind. Finally, these tools may produce false or biased outputs because their training data is large and unvetted; based on this data, they produce outputs that are meant to be probable, but not necessarily true.
If you’re looking to use AI in the work of teaching and learning, it’s best to use a tool that preserves student privacy by not sending data back to the company that made the tool. UChicago offers a secure environment for exploring AI, including access to PhoenixAI, a private generative AI tool, as well as enterprise-level platforms like Microsoft Copilot, Google Gemini, and Google NotebookLM. You can compare the features of these tools on the University’s GenAI site and see information about a few such tools below.
PhoenixAI
PhoenixAI is UChicago’s Generative AI chat service, using LLMs from OpenAI within a custom web application for the University community. Data submitted to PhoenixAI is kept within the UChicago environment and not shared with third-party vendors or used to train AI models. This tool also allows you to build custom chatbots (Assistants) based on your own resources and share them with colleagues and classmates, if desired.
Microsoft Copilot
Microsoft Copilot is a generative AI-powered chat tool that is integrated with Microsoft’s Bing search engine to offer internet-connected AI answers and image generation. It is based on OpenAI’s GPT-4 and DALL-E 2 models. The University’s Microsoft 365 licensing agreement provides data and privacy protections that the public version does not offer. To get started, visit copilot.cloud.microsoft and log in with your UChicago account.
Google Gemini
Google Gemini is a powerful conversational AI that offers enterprise-grade data protection for users logged in with their CNet accounts at gemini.google.com. Tools built into Gemini include:
- Guided Learning: Explore a subject in depth with interactive guidance and knowledge checks
- Deep Research: Conduct in-depth, multi-step research on complex topics and synthesize findings in a comprehensive, cited report
- Image Creation: Create or edit images and photos using plain text descriptions
- Canvas: Create documents, code, and apps
NotebookLM
NotebookLM is a versatile AI tool from Google that grounds its responses in the specific documents or files you provide. Responses in NotebookLM include citations to your sources. You can also generate new content based on the files you upload or connect, such as audio overviews, FAQs, briefing docs, and more. The same privacy protections that apply to Google Gemini apply to NotebookLM: your data won’t be used to train, or reviewed to improve, AI models. Visit notebooklm.google to get started.
Ed Discussion Bots++
Ed Discussion’s Bots++ feature allows users to add an AI-powered chatbot to an Ed Discussion course site. This bot can be customized to your specific course materials and approach, helping streamline communication and assistance to your students. The bot currently uses the OpenAI GPT-4o model and runs in a secure environment; data is not used to train future models or shared with third-party vendors. A major benefit of this tool is that you can choose to hide the bot’s responses from students so you and your teaching staff can verify their accuracy before making them public. Review the KB article, Ed Discussion: Add an AI Chatbot with Bots++, for step-by-step instructions.
To Acquire Additional Tools
See the Generative AI at UChicago site for guidance on protecting University data, software procurement requirements, and more. You can also find a list of tools that have already been reviewed by the University for safety on the Generative AI Tools page.
Generative AI Weaknesses
Bias
The responses from these tools can reflect the biases in the data sets on which they’re trained. Information and data on the internet run the gamut from extremely reliable to extremely biased, depending on the source. Always think critically about the potential bias reflected in the responses from these tools.
ATS has produced several blog posts that you may find useful in exploring how AI bias can show up in both obvious and subtle ways in the learning process:
- Experiment with (and Scrutinize) AI Together
- This post breaks down how tools like Google AI summaries, which now appear automatically when a user searches, can subtly guide students’ thinking and oversimplify complex issues. Because interface design is used to shape user behavior, these summaries can mislead even savvy users.
- Demystifying Experiences to Mitigate AI
- Drawing on research on AI bias, this post unpacks the differences between six AI tools’ responses to the same seemingly simple question. The results demonstrate how you can draw attention to subtle distortions in the mirror that AI holds up to reality.
Veracity and “Hallucinations”
These tools are made to generate probable responses, not necessarily accurate ones. The fluency of the generated text is impressive, but it often outpaces its veracity, which can be misleading because the tools present their responses with apparent confidence. Educate students on investigating claims, facts, and even the sources the tools cite, which are sometimes fabricated.
For an example of what hallucinations might look like in a conversation with AI, you can check out our blog post series on hallucinations. Furthermore, a profile of a UChicago Urdu instructor’s exploration of AI shows how these tools can lead users astray in terms of cultural context and low-resource languages.
Inconsistent Math Abilities
When ChatGPT first gained wide attention, many writers and educators noted that the tool was severely limited when it came to math-related questions. While users still sometimes report inconsistent results, AI performance on math has improved in recent years. To learn more about the math milestones AI tools have met, and where they still make errors, check out “What’s Next for AI and Math” from MIT Technology Review. Whatever these tools’ reliability in math, it’s important to be critical of the results you receive and to think through which foundational skills students need to build, even if they can eventually rely on technology to complete those tasks.
For a look at how you can help students become more engaged with math using AI, check out ATS’ blog post on the work of UChicago math instructor Selma Yildirim, who created custom chatbots to help her students brush up on foundational skills with AI instead of simply getting answers.
Course Design and Student Communication
Artificial Intelligence and Education at the University of Chicago
President Alivisatos and Provost Baicker convened the Artificial Intelligence and Education Working Group at the University of Chicago on February 24, 2025, charging it with supporting faculty in navigating and developing the role of AI in teaching and pedagogy in undergraduate and graduate education, as well as its effects on the University’s educational mission. The report from the Artificial Intelligence and Education Working Group reflects its findings and recommendations, as delivered to the President and Provost in July 2025.
Clarifying and Articulating Learning Goals
An important first step in designing your course and its assessments (whether or not you use AI tools) is stating the learning goal you have in mind and how each specific activity serves that learning. As UChicago instructors noted in a Spring 2023 panel on AI, the more students understand the value of an activity, the more likely they are to engage with it as designed rather than look for shortcuts.
Designing Your Assignments
Authentic Assessments Without AI Tools
While it is tempting to seek tools that claim to detect AI-generated outputs, most experts in teaching and learning have come to consider that a losing game. A more productive approach may be to communicate openly with students and assess authentically to prioritize learning. Consider assessments that are more frequent and have lower stakes, or use scaffolding to break larger assignments into iterative parts, such as papers submitted in multiple stages or drafts. For more ideas, see our blog post on “AI-resistant assignments.”
Creative Activities with AI Tools
If you’re interested in using AI tools to promote learning in your classes, there are a number of ways you can do that. A few examples include:
- Generating examples and explanations of abstract concepts, with specific audiences and reading levels in mind
- Leveraging AI as a brainstorming tool
- Drafting quiz questions and code challenges
- Creating text for students to fact check, practicing disciplinary research skills
- Allowing students to revise and reorganize their drafts using an AI tool
Ethan Mollick’s Substack, One Useful Thing, from which some of these examples were borrowed, is a good further resource for instructors who are excited about AI as a teaching tool. Additionally, Cornell University’s CU Committee Report: Generative Artificial Intelligence for Education and Pedagogy has discipline-specific use cases in the appendices.
Policy for Use and Citation
Having an AI policy is an important part of an effective approach to AI in your classes. Whether you’re allowing the use of AI tools for certain assignments or banning them altogether, you and your students will benefit from a clear statement of what role AI should play in the class. The following resources, including two from the Chicago Center for Teaching and Learning (CCTL), may benefit you in crafting that language:
- CCTL’s Guidance for Syllabus Statements on the Use of AI Tools: a breakdown of the major considerations for instructors creating their syllabi
- CCTL’s Collection of UChicago AI Syllabus Statements: a repository of syllabus policies from UChicago instructors (requires UChicago login)
- UChicago’s Academic Honesty and Plagiarism Policy
- Guidance from the UChicago Provost’s Office on Academic Honesty
- ATS instructional designer Thomas Keith’s blog series on academic integrity
Additional Resources
UChicago Generative AI Website
Visit genai.uchicago.edu for up-to-date content, guidelines, and opportunities related to generative AI at UChicago.
Teaching in the Generative AI Landscape
Academic Technology Solutions, the Chicago Center for Teaching and Learning, and the Library have joined forces to create a Canvas course titled Teaching in the Generative AI Landscape to help faculty and instructors as they think through the teaching and learning implications of generative artificial intelligence. The course is asynchronous and allows self-enrollment.
Getting Started with AI: Guidance for Students
Academic Technology Solutions and the Library have collaborated to provide students with guidance on learning and AI, including information on tools that are available from the University, how to use those tools, and what it means to learn and work in a world where AI tools are available. Although this Canvas site, Getting Started with AI, is directed toward students, you may find it helpful to review its messaging on learning and even test out some of the prompts within, which give users an opportunity to assess for accuracy and short-circuited learning.
ATS’s Blog Series on AI-Discerning Students
Whether you want AI tools to be part of your course or not, students’ knowledge and attitudes about these tools have a huge impact on your learning environment. ATS’ ongoing blog post series on AI-discerning learners offers research and strategies you can use to help your students develop their sense of judgment about whether, when, why, and how they use AI.
Workshops
Academic Technology Solutions offers three workshops about generative AI tools, which are designed to cover a range of familiarity with and pedagogical orientations toward this technology and its role in teaching and learning. Interested instructors can review the options below and find the next available session on the ATS workshop schedule.
Introduction to UChicago-Supported AI Tools
Learn about AI tools available to the University of Chicago community—such as PhoenixAI and NotebookLM. Discover the features of these tools, including how to ground them on sources that you provide and how to create custom assistants that can be shared with others. Review effective prompting techniques while considering issues of transparency and ethical use.
Meet the Generative AI Moment with Authentic Learning
This one-hour workshop will provide both a high-level explanation of how generative AI tools work and insights from colleagues across disciplines at UChicago about how they’ve been approaching this change in the educational landscape. Attendees will gain context to make an informed decision about how to approach these tools and address the topic with their students. By providing examples of how they might design assignments and communicate their expectations in this new context, we hope to give attendees everything they need to feel confident that learning remains authentic even at a time when computer-generated text may approach the quality of human intellectual work.
Generative AI and Academic Integrity: Some Considerations
This workshop is geared toward instructors who are looking for ways to limit dishonest use of these tools, whether in academic writing, computer code assignments, or other fields. Attendees will consider various digital tools and pedagogical techniques that have been proposed to combat dishonest behavior with AI and evaluate the strengths and weaknesses of each. Lastly, attendees will have an opportunity to discuss with their fellow instructors how best to approach this difficult problem.
Using AI Tools to Promote Meaningful Learning
This workshop is intended for instructors who have decided to allow students to use these tools in a supervised manner rather than banning them. It will offer useful ideas, room to experiment, and a space to talk through your approach with colleagues. Attendees will consider the impact AI tools are having or stand to have on their teaching; articulate their vision for AI’s role in their class; and explore ways to integrate AI into meaningful learning activities.
Generative AI Discussion Facilitator’s Guide
For instructors who are already familiar with the challenges and opportunities of generative AI in education and would like to lead a conversation among colleagues about this issue in order to consider the approach they’ll take to the use of generative AI tools in their teaching and how they’ll communicate about it with their students, this guide includes suggestions about how to plan this conversation, some suggested discussion questions, a selection of readings to consider sharing with colleagues, and a template for a discussion handout. Readers are welcome to download and modify this discussion guide for their purposes.