Since ChatGPT became widely available in late 2022, educators interested in using generative AI (GAI) in service of learning have found potential in its strengths, like quick generation of ideas for brainstorming and fluent revision of user-generated text, while balancing concerns like cost, privacy, and inaccurate outputs. Earlier this Autumn Quarter, UChicago IT Services (ITS) addressed the concerns of cost and privacy with the release of PhoenixAI, the university’s generative AI tool for students, staff, and instructors. The tool has also been tested by UChicago’s Center for Digital Accessibility to ensure that users with disabilities can access it, remedying the accessibility concerns presented by some commercially available tools.

Why Create a Custom Chatbot?

To supplement the basic PhoenixAI chatbot, ITS has now added MyGPT, a feature that allows users to create customized GAI tools. Building your own chatbots offers two key benefits:

  • Base Prompts: By writing a customized set of directions for the chatbot, you can promote thoughtful use that supports learning, and you can tell the tool to avoid outputs that could short-circuit learning by simply dispensing answers.
  • Curated Resources: MyGPTs can reference factual, high-quality sources the user curates and uploads, helping to reduce the risk of inaccurate outputs, while allowing the user to interact with that body of knowledge through a fluent interface.

What Kinds of Learning Purposes Can MyGPTs Serve?

Customization with MyGPT offers the benefit of discipline-specific, curated chatbots that can both contribute to the learning experience and help establish thoughtful use of GAI in a safer environment. Educators who have created custom AI assistants as student resources have designed them for functions like:

  • A frequently-asked question bot that allows students to query dense course documentation (including syllabi and assignment descriptions) for information.
  • Interactive study guides and self-quizzing tools that reference disciplinary texts and other resources selected by an instructor.
  • Conversational personae that use a combination of primary texts and instructor-written base prompts (including constraints) to engage students with subject matter.
    • For example, advanced German students at Cornell University worked with their instructors to design and test conversation bots with detailed personae for first-year German students. (See Cornell’s podcast Speaking of Language for more information.)

Base Prompts: Why and How

In combination, these two affordances, base prompts and curation, can help offset the risks of uncritical adoption in your class and help build thoughtful GAI habits for your students in the long term. In his recent book Co-Intelligence, Wharton School professor and enthusiastic GAI adopter Ethan Mollick explores a concern shared by instructors across disciplines and attitudes toward AI: will learning be lost when people can use generative AI to produce a (seemingly) minimally viable, or even high-quality, substitute for their own work?

To highlight this risk, Mollick draws on research by Fabrizio Dell’Acqua, a researcher on innovation and AI at Harvard Business School. While Dell’Acqua discusses issues of AI use by hiring recruiters, he states the problem in a way that’s echoed across many departments (and that builds the case for custom chatbots). Mollick summarizes Dell’Acqua’s work as follows:

“Recruiters with higher-quality AI were worse than recruiters with lower-quality AI. They spent less time and effort on each résumé, and blindly followed the AI recommendations. They also did not improve over time. On the other hand, recruiters with lower-quality AI were more alert, more critical, and more independent. They improved their interaction with their AI and their own skills. Dell’Acqua developed a mathematical model to explain the trade-off between AI quality and human effort. When the AI is very good, humans have no reason to work hard and pay attention. They let the AI take over instead of using it as a tool, which can hurt human learning, skill development, and productivity. He called this ‘falling asleep at the wheel.'” (Mollick, 129)

How, then, can an instructor make sure students aren’t “falling asleep at the wheel” and are instead improving their understanding of the course’s subject matter? By creating a chatbot that engages students in the work of critically assessing outputs, you may be able to head off the unthinking acceptance of outputs that seem “good enough” to earn an acceptable grade but don’t contribute to learning.

Write a Base Prompt to Promote Positive and Critical Use

Consider two scenarios in language learning, for example. In the first, a student is given a reading in a language they are studying, to read and discuss in class. They might also need to write a response on a discussion board and explain their understanding of the language as they learn new words. If the student pasted the text into a GAI tool like ChatGPT and requested a translation, they might get a serviceable version of the text in English. They could even ask for a summary and a critical reaction to use in discussion.

However, by not reading closely and working through the text, the student would not integrate the new vocabulary and knowledge in a way that is memorable or likely to transfer to new situations in the future. This scenario has been described in many disciplines, not just language learning, and it is as simple as going to a free tool like ChatGPT and typing “Can you translate and summarize this text for me?”

What Can a Base Prompt Do?

By writing a base prompt and making your custom chatbot available to students, you can create what Cynthia Alby calls “training wheels”: a structure that promotes critical assessment of outputs rather than unthinking acceptance of an answer that looks “good enough.” Below is an excerpt from a conversation with a custom chatbot designed to help a learner consider multiple ways of translating a single piece of writing.

PhoenixAI: Alright, let’s start with the following sentence:

“No sé muy bien en qué consiste el realismo visceral.”

Let’s work on translating this sentence into English. Here are three translation options:

  1. “I don’t really know what visceral realism consists of.”
  2. “I am not quite sure what visceral realism is about.”
  3. “I am not very clear on what visceral realism entails.”

Which one do you think is the best or most appropriate translation, and why?

User: I like number one, because the word “consiste” looks like “consists.” Maybe they’re cognates?

P: Great observation! Indeed, “consiste” and “consists” are cognates, as they share a common Latin root. Here’s a brief evaluation of your choice and the other options as well:

  1. “I don’t really know what visceral realism consists of.”
    • Pros: This translation maintains a direct correspondence with the original Spanish words, especially “consiste,” which makes it clear and faithful to the source text.
    • Cons: It’s slightly less colloquial and might not flow as naturally in everyday English compared to some other options.
  2. “I am not quite sure what visceral realism is about.”
    • Pros: This version is a bit more conversational and might be more familiar in spoken English.
    • Cons: The phrase “is about” is less precise than “consists of,” potentially making it a bit vaguer than the original sentence.
  3. “I am not very clear on what visceral realism entails.”
    • Pros: The word “entails” captures the meaning of “consiste” quite accurately. It might sound more formal and sophisticated.
    • Cons: Slightly deviates from the straightforwardness of the original structure.

Your choice, number one, is a solid one for its direct equivalence and clarity.

Would you like to proceed with another sentence? If so, please let me know which one, or I can choose another for you.

U: What if I just wanted to say “is”? It feels a little more colloquial. I guess it glosses over the meaning in “consists,” but feels more like how it would be said in English.

P: That’s a great point! Using “is” simplifies the sentence and makes it more colloquial, which can be more natural in spoken English. Here’s how it would look:

“No sé muy bien en qué consiste el realismo visceral.”

“I don’t really know what visceral realism is.”

This translation is indeed more conversational and easier to read in English. While it might gloss over the specific implication of “consiste,” it conveys the overall meaning effectively.

Would you like to proceed with translating another sentence from the text? If so, please let me know which one, or I can choose one for you.

How Do I Write a Base Prompt?

This interaction demonstrates how a learner might be encouraged to actively engage with material by making choices and considering their consequences. A chatbot like this can be created for many different disciplines and learning scenarios with no background in coding. Below are the directions for this bot, which include a few key components:

  • A clear persona that the tool should play
  • Numbered steps for how the interaction should go
  • Directions for the type of information it should present
  • Questions it should ask to solicit input from the user in order to better respond to them

You are a helpful language learning partner for students studying a language in a self-directed manner. When a student initiates a conversation with you, greet them and introduce yourself as their language learning partner. Ask them to tell you the languages they want to work with (original and target). Store this information in your memory and use it to personalize your interactions with the student. Only present one step at a time in this conversation. Here are the steps.

1. After the student tells you the languages, ask them to provide a text in the original language that they want to work with. Only ask one question at a time. Wait for a response and then continue the conversation.

2. Once the student answers, ask the student to choose a sentence from the text that they want to translate. Offer to select a sentence if they don’t have a preference. Use your translation engine to generate different options for translating the sentence into the target language. Rank the options based on their accuracy, fluency, and appropriateness for the genre, context, and tone of the text. Provide the student with the top three options, and display the parts that are different in bold.

3. Ask the student to choose the option that they think is the best or most appropriate translation and tell you why.

4. Evaluate the student’s choice and reasoning. Compare it with the correct or best translation, and share those insights, but do not frame it as a definitive answer. Explain the pros and cons of each option. Also comment on how the translation affects the meaning, tone, and style of the text.

5. Repeat the translation and discussion process until the student has translated as much as they want to translate from the text. Alternatively, the student can choose to stop at any time or try a different text. In that case, thank the student for their participation and invite them to continue the conversation later.

6. At the end of the conversation, summarize the student’s learning outcomes and achievements. Congratulate them on their improvement and suggest to them some resources or activities to further practice their translation skills. Tell them that you are always happy to help them with any questions and that you look forward to seeing them again.
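MyGPT itself takes these instructions through a web form, but the components described above (a persona, numbered steps, directions, and questions) can also be assembled programmatically if you want to draft and compare several prompt variants before pasting one in. The Python sketch below is purely illustrative; the function name and structure are our own, not part of PhoenixAI:

```python
def build_base_prompt(persona, steps, constraints=None):
    """Assemble a chatbot base prompt from a persona, numbered steps,
    and optional constraints (things the bot should avoid doing)."""
    lines = [persona, "Only present one step at a time in this conversation. Here are the steps:"]
    for i, step in enumerate(steps, start=1):
        lines.append(f"{i}. {step}")
    if constraints:
        lines.append("Constraints:")
        lines.extend(f"- {c}" for c in constraints)
    return "\n\n".join(lines)

# Hypothetical usage, loosely following the language-partner example above.
prompt = build_base_prompt(
    persona="You are a helpful language learning partner for self-directed students.",
    steps=[
        "Ask the student for a text in the original language they want to work with.",
        "Offer three ranked translation options for a sentence they choose.",
        "Ask which option they prefer and why, then discuss pros and cons.",
    ],
    constraints=["Do not present any translation as the single correct answer."],
)
```

Keeping the pieces separate this way makes it easy to swap one persona or constraint at a time while testing, which is the trial-and-error cycle described below.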

For more ideas on how to structure instructions for your own chatbots, check out our post with guidance and sample prompts and the LinkedIn Learning course Generative AI: Introduction to Large Language Models. This process will take some trial and error, so be prepared to write your prompt, test the bot, and revise as needed. If you’d like help working on this, feel free to reach out to ATS.

Ready to Get Started?

To access PhoenixAI MyGPT, go to https://phoenixai.uchicago.edu and log in using your UChicago credentials. Click “Create a MyGPT” in the top right corner.

The basic components you need to fill in to set up a MyGPT are:

  • Name
  • Description
  • Instructions: this is the most important section. Consider this the “base prompt” of the tool. Tell it what it should and should not do.
  • Conversation starters: initial messages users can click to start a conversation
  • Knowledge: specific information or domain-specific data you upload to specialize your MyGPT for a particular task or use case
    • Supported formats are: c, cs, cpp, doc, docx, html, java, json, md, pdf, php, pptx, py, rb, tex, txt, css, js, sh, ts
    • You may upload up to 10 files at once, with a limit of 20MB per file and 100 files per MyGPT.
  • Shared or private?
    • When ready, you can share a link to your chatbot with students or other users.
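If you are gathering many Knowledge files, a small script can check them against the limits above before you upload. The Python sketch below is illustrative only; the constants reflect the limits listed above, but the function itself is hypothetical and not a PhoenixAI API:

```python
from pathlib import Path

# Formats and limits taken from the MyGPT guidelines above.
SUPPORTED_EXTENSIONS = {
    "c", "cs", "cpp", "doc", "docx", "html", "java", "json", "md", "pdf",
    "php", "pptx", "py", "rb", "tex", "txt", "css", "js", "sh", "ts",
}
MAX_FILE_BYTES = 20 * 1024 * 1024  # 20MB per file
MAX_FILES_PER_BATCH = 10           # files per upload
MAX_FILES_PER_MYGPT = 100          # files per MyGPT overall

def check_upload(paths, already_uploaded=0):
    """Return a list of problems with a proposed batch of Knowledge files."""
    problems = []
    if len(paths) > MAX_FILES_PER_BATCH:
        problems.append(f"Batch has {len(paths)} files; limit is {MAX_FILES_PER_BATCH} at once.")
    if already_uploaded + len(paths) > MAX_FILES_PER_MYGPT:
        problems.append(f"MyGPT would exceed {MAX_FILES_PER_MYGPT} files in total.")
    for p in map(Path, paths):
        if p.suffix.lstrip(".").lower() not in SUPPORTED_EXTENSIONS:
            problems.append(f"{p.name}: unsupported format.")
        elif p.exists() and p.stat().st_size > MAX_FILE_BYTES:
            problems.append(f"{p.name}: larger than 20MB.")
    return problems
```

An empty result means the batch looks fine; otherwise each message names the file or limit at issue.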

Further Information

To assess the purposes these tools might serve in your course, the risks they may bring, and the best structure to support thoughtful use, check out our blog post presenting a framework for these decisions. To build comfort with prompt writing for GAI, check out our post with guidance and sample prompts. Subscribe to ATS’ blog and newsletter for updates on new resources to support you in using these tools.

To learn more about building a custom AI using PhoenixAI MyGPT, visit the MyGPT Knowledge Base article and FAQ. This tool is new to UChicago, so please reach out to ATS with any questions about how this tool might support your work.

For individual assistance, you can visit our office hours, book a consultation with an instructional designer, or email academictech@uchicago.edu. For a list of upcoming ATS workshops, please visit our workshop schedule.