How to Think About Generative AI in Your Teaching

Generative AI Tools – What are they and what can they do?

Since OpenAI released ChatGPT in late 2022, higher education (among many other industries) has had to adapt to the unprecedented capabilities of large language models (LLMs) and other generative AI tools. These tools generate surprisingly effective, novel responses to human prompts in the form of writing, code, images, and even video. ChatGPT, for example, has produced answers that satisfied the requirements of the Bar Exam and of an MBA exam at the Wharton School of Business. The wide availability of these tools, combined with their impressive and consistently improving outputs, has raised many questions about academic integrity, the viability of certain kinds of assessments, and the future of work itself.

How Do LLMs Function?

The most widely known of these tools, ChatGPT, is a large language model (LLM). Using significant processing power, large data sets, and human feedback during training, LLMs generate probable, fluently worded responses to user prompts. LLM development is a complex process that combines deep learning, natural language processing, hardware acceleration, and programmed ethical rules (guardrails).
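
The example below is a minimal, illustrative sketch (not part of the original page) of how an application might send a prompt to an LLM, here using OpenAI's Python SDK; the model name, prompt, and temperature setting are placeholders. Sending the same prompt twice with sampling enabled helps show that responses are generated probabilistically rather than retrieved from a fixed source.

  # Minimal sketch: prompting an LLM through OpenAI's Python SDK.
  # Assumes `pip install openai` and an OPENAI_API_KEY environment variable;
  # the model name and prompt are illustrative placeholders.
  from openai import OpenAI

  client = OpenAI()
  prompt = "Explain photosynthesis in two sentences for a first-year student."

  # The same prompt with a nonzero temperature typically yields differently
  # worded responses each time, because the model samples probable text
  # rather than looking up a single correct answer.
  for _ in range(2):
      response = client.chat.completions.create(
          model="gpt-4o-mini",
          messages=[{"role": "user", "content": prompt}],
          temperature=0.8,
      )
      print(response.choices[0].message.content, "\n---")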

Ethical Use and Data Privacy

There are a few concerns to keep in mind before deciding to use an AI tool or requiring that your students use one, including user privacy, equity of access, and outputs that contain falsehoods or biases. Do not input proprietary or protected data into any of these models via individual accounts; in general, whatever you enter into these tools is likely to become part of their training data. Some tools may also not be designed with accessibility for all users in mind. Finally, the combination of unvetted training data and the generation of probable (rather than verified) responses means that truth is not a priority in these outputs.

While there are many opportunities and concerns to consider on the instructional side of the University's work, there are also important questions to answer before procuring generative AI tools, many of which stem from existing University policies that ensure technology is used responsibly.

See UChicago IT Services’ Generative AI Guidance page for guidance on protecting university data, software procurement requirements, and more.

Generative AI Weaknesses

Bias

The responses from these tools can reflect the biases in the data sets on which they're trained. Information and data on the internet run the gamut from extremely reliable to extremely biased, depending on the source, so always think critically about the potential bias reflected in the responses these tools produce.

Veracity and “Hallucinations”

These tools are designed to generate probable responses, not necessarily accurate ones. The fluency of the generated text is impressive, but it often outpaces its veracity, which can be misleading because the tools present their responses with apparent confidence. Teach students to investigate claims, facts, and even the sources the tools cite, which are sometimes fabricated.

Inconsistent Math Abilities

When ChatGPT first gained widespread attention, many writers and educators noted that the tool was severely limited when it came to math-related questions. While some tools, particularly GPT-4, have improved in this regard, many users have reported inconsistent results, especially with basic mathematics. Testing as of Summer 2023 showed inconsistencies in college-level math even with a math-specific addition to GPT-4. It's important to be thoughtful about how your prompts are phrased when you're seeking assistance with math and, as always, to be critical of the results you receive. Ars Technica's article on math and AI is a good place to start thinking about this.
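
As one way to stay critical of numeric answers, the short sketch below (an illustration added here, not drawn from the article) checks a hypothetical arithmetic claim from a chatbot against a direct computation; the specific numbers are made up for the example.

  # Illustrative check of a model's arithmetic claim (all numbers are hypothetical).
  claimed_answer = 7_006_952          # figure copied from a chatbot's response
  correct_answer = 1_234 * 5_678      # the same product computed directly

  if claimed_answer == correct_answer:
      print("The model's arithmetic checks out.")
  else:
      print(f"Mismatch: the model said {claimed_answer}, but the product is {correct_answer}.")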

Course Design and Student Communication

Clarifying and Articulating Learning Goals

An important first step in designing your course and its assessments (whether or not you use AI tools) is stating the learning goal you have in mind and how each specific activity serves that learning. As UChicago instructors noted in a Spring 2023 panel on AI, the more students understand the value of an activity, the more likely they are to engage with it as designed rather than look for shortcuts.

Designing Your Assignments

Authentic Assessments Without AI Tools

While it is tempting to seek tools that claim to detect AI-generated outputs, most experts in teaching and learning have come to consider that a losing game. A more productive approach may be to communicate openly with students and assess authentically to prioritize learning. Consider assessments that are more frequent and have lower stakes, or use scaffolding to break larger assignments into iterative parts, such as papers submitted in multiple stages or drafts. For more ideas, see our blog post on “AI-resistant assignments.”

Creative Activities with AI Tools

If you’re interested in using AI tools to promote learning in your classes, there are a number of ways you can do that. A few examples include:

  • Generating examples and explanations of abstract concepts, with specific audiences and reading levels in mind
  • Using an AI tool as a brainstorming partner
  • Drafting quiz questions and code challenges (a brief sketch follows this list)
  • Creating text for students to fact-check, practicing disciplinary research skills
  • Allowing students to revise and reorganize their drafts using an AI tool
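
As a concrete illustration of the quiz-drafting idea above, the sketch below (hypothetical, not from the original page) asks a model for draft multiple-choice questions that an instructor would then review and correct; it again assumes OpenAI's Python SDK, and the topic and model name are placeholders.

  # Hypothetical sketch: drafting quiz questions for instructor review.
  # Assumes the OpenAI Python SDK and an API key; topic and model are placeholders,
  # and any generated questions should be checked for accuracy before use.
  from openai import OpenAI

  client = OpenAI()
  topic = "Mendelian inheritance"

  response = client.chat.completions.create(
      model="gpt-4o-mini",
      messages=[{
          "role": "user",
          "content": (
              f"Draft three multiple-choice questions on {topic} for an introductory "
              "undergraduate course. Label the correct answer and briefly explain "
              "why each distractor is wrong."
          ),
      }],
  )
  print(response.choices[0].message.content)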

Ethan Mollick’s Substack, One Useful Thing, from which some of these examples were borrowed, is a good further resource for instructors who are excited about AI as a teaching tool. Additionally, Cornell University’s CU Committee Report: Generative Artificial Intelligence for Education and Pedagogy has discipline-specific use cases in the appendices.

Policy for Use and Citation

While policy is only one part of an effective approach to AI in your classes, it’s nonetheless a very important one. Whether you’re allowing the use of AI tools for certain assignments or banning them altogether, you and your students will benefit from a clear statement of what role AI should play in the class. The following resources, including two from the Chicago Center for Teaching and Learning (CCTL), may help you craft that language:

Additional Resources

Workshops

Academic Technology Solutions offers three workshops on generative AI tools, designed for instructors with a range of familiarity with, and pedagogical orientations toward, this technology and its role in teaching and learning. Interested instructors can review the options below and find the next available session on the ATS workshop schedule.

Meet The Generative AI Moment With Authentic Learning

This workshop offers a high-level explanation of how these tools work, along with insights from colleagues across disciplines at UChicago about how they’ve been approaching this change in the educational landscape. Attendees will receive context to make an informed decision about how to approach these tools and address the topic with their students.

Generative AI And Academic Integrity: Some Considerations

This workshop is geared toward instructors who are looking for ways to limit dishonest use of these tools, whether in academic writing, computer code assignments, or other areas. Attendees will consider various digital tools and pedagogical techniques that have been proposed to combat dishonest behavior with AI and evaluate the strengths and weaknesses of each. Lastly, attendees will have an opportunity to discuss with their fellow instructors how best to approach this difficult problem.

Using AI Tools To Promote Meaningful Learning

This workshop is intended for instructors who have decided to allow students to use these tools in a supervised manner rather than banning them. It will offer useful ideas, room to experiment, and a space to talk through your approach with colleagues. Attendees will consider the impact AI tools are having or stand to have on their teaching; articulate their vision for AI’s role in their class; and explore ways to integrate AI into meaningful learning activities.

Generative AI Discussion Facilitator’s Guide

This guide is for instructors who are already familiar with the challenges and opportunities of generative AI in education and would like to lead a conversation among colleagues about the issue, with an eye toward deciding how they’ll approach generative AI tools in their teaching and how they’ll communicate about them with their students. It includes suggestions for planning the conversation, suggested discussion questions, a selection of readings to consider sharing with colleagues, and a template for a discussion handout. Readers are welcome to download and modify the guide for their own purposes.