Not every course needs to integrate AI, but they all need to acknowledge the fact that AI is a part of our writing landscape. In my last post, I talked about the importance of designing AI-aware courses and offered some questions to examine learning goals and AI’s effects.
In this post, I drill down to some examples of AI-aware versions of common writing assignments: annotated bibliographies, reading responses, and reflections. By using pathways that acknowledge AI, we can achieve learning goals that support students’ writing, reading, and thinking development. And we can do it without resorting to unreliable AI detectors.
Stress-testing assignments with AI
Before using assignments with students, consider stress-testing them with AI. What vulnerabilities in your assignments does AI now expose? Some previously great assignments are now easy for AI to complete, which tempts students to shortcut their learning. The goal is to set up an assignment that, by default, leads students to make good choices about AI use.
To stress-test your assignment, first break down the tasks required to complete it. Then ask: What are the goals of each task in the assignment? Which steps are most vulnerable to AI? Test this by prompting AI to do one or more of the steps. Which parts of the assignment do you want students to do on their own, without AI? Which parts can or should they use AI for? You can see Tim Laquintano stress-test and redesign an assignment he’s used at Lafayette College in a resource he and I put together for the University of Pittsburgh Writing Institute.
Let’s look at a few common writing assignments and how we might approach their redesign by either resisting or embracing AI.
In the classroom
Below, for each assignment, I break the work into its component tasks and goals, then sketch an AI-aware redesign in two versions: one that resists AI and one that embraces it.
Annotated bibliographies
Writing classes have often used annotated bibliographies as a scaffolding step to help students prepare for research. However, AI is very good at this task. I tried the following prompt with both ChatGPT and Claude and received a decent draft of an annotated bibliography from each. Each tool provided different sources, so if I were a student wanting to shortcut this assignment with AI, I could combine the results, rewrite them lightly, and turn in my assignment. Try it yourself in an AI platform of your choice:
Please give me an annotated bibliography on the pros and cons of traffic calming measures on college campuses. I would like four high-quality sources, cited in APA format, with about five sentences explaining the argument of each source and how it contributes to a potential research paper on the topic.
How can we revise an annotated bibliography assignment to help students make good choices about using AI?
First: what is the goal of having students do an annotated bibliography? I might say: to practice finding and evaluating sources; to get familiar with campus library resources; to prepare for a project that draws on sources to make an argument; to discover academic approaches to personal interests; and so on. Now: how can we meet those same goals while acknowledging that AI does this task well?
Here’s a potential resisting AI approach to meeting the goals of a traditional annotated bibliography: Have students write an “I-Search” narrative about their research process, describing why they chose a topic, how they began their search, who they talked to, what threads they followed and why, what questions remain for them, and so on. The I-Search approach, introduced by Ken Macrorie long before AI, emphasizes personal discovery, student agency, and writing process over research products and argument. Those are moves students will benefit from now, too.

Stacey Waite at the University of Nebraska has students prepare for research by making a list of what they don’t know about a topic, what they want to know, and what they can’t know. You could ask students to begin that list in class, minimizing the distractions and temptations of AI. Elaine Minamide, an instructor at Palomar College, describes her approach and offers example student I-Search papers. She notes that the I-Search activity “encourages discovery as opposed to conclusion” and helps students find their own voice and style by narrating their experiences. ReadWriteThink offers a helpful strategy guide on I-Search that’s geared toward secondary education but easily adapted to higher ed.
An embracing AI approach: Have students go to a site like Elicit or SciSpace and ask their research question. These sites allow users to filter for “top quality” journals, provide abstracts, and more. Both sites connect to the Semantic Scholar database, which links to real published research papers, so the chances of hallucinated sources are minimal. By showing students how to evaluate sources by number of citations, quality of journal, and relevance of the abstract to their research question, you can help them learn good research skills while they pick up some skills in working with AI.
Reading responses
We often have students do low-stakes writing to prepare for larger, more complex assignments. One example is reading response posts, where students respond to a question about an assigned text on a discussion board in an LMS like Canvas. I’ve done this in both grad and undergrad classes. But, as one student recently told me, these discussion posts are now “AI central.” Ouch.
What’s the goal? Often the point of reading responses is to hold students accountable for doing their assigned reading, to have them interact with each other, and to motivate preparation for discussion in class, or occasionally to serve as a substitute for in-person discussion.
A resisting AI approach: You could eliminate this kind of assignment altogether and shift the work to in-class discussions. Or you could have students print out a reading and hand-write their questions and responses. Platforms like Perusall are good for doing this digitally. Jason Crider, at Texas A&M, offers a “trick” assignment that would work in this context; he calls it the “paranoid memorandum.” First, have students break into groups to work on a brief writing assignment. Each group is given an instruction to either use AI or not use AI in completing the assignment. Groups then share their work with the entire class and guess whether each group used AI. The trick is that all of the groups are assigned to not use AI. The discussion, Crider explains, can focus on what it means to be readers in a context where any text is potentially AI-generated. This approach will only work once, of course!
An embracing AI approach: You can have students interact with a text using AI before assigning it. First, give students a heads-up that they’ll be working with a particular text in class, but tell them not to read it yet. In class, give them 10 minutes to ask an AI of their choice anything they want to know about the text: a summary, which texts or debates it engages with, how it connects to concepts in class, etc. Claude, ChatGPT, or NotebookLM work well for this task. Then, run a discussion about what they learned about the text, and what further questions they have. Finally, you can assign students to read the text themselves prior to the next class. Primed with these questions and discussion, they’ll be more prepared to understand a complex text. My Pitt colleague Matt Burton came up with this idea, which worked great in our co-taught Writing Machines class.
Reflections
Some advice on discouraging students from using AI focuses on having students make personal connections in their writing. But some teachers report that students are even using AI for their reflections—offloading the discussion of their thoughts and writing development to AI. Unequivocally, I’d say: this is not an appropriate use of AI. How can we prevent this from happening in our classes?
A resisting AI approach: In class, have students annotate their own work or do pen-and-paper brainstorming: how did they feel when writing this assignment? What challenges did they encounter? What did they learn? Or, have students interview each other about their approach to the assignment. They can develop questions for each other or you can develop these questions as a group, guiding students through the metacognitive work of how to understand their own thinking. They can then interview each other in class, which incidentally means they (1) get to know each other, fostering class community; (2) learn from each other about different writing processes and ideas; and (3) have some accountability to each other, not just to you. (Imagine a student improvising their way through an interview with a peer when they’ve used AI to complete their assignment—awkward!)
An embracing AI approach: You could have students do a human vs. machine contest, where they try to make the most human reflection using AI, then test it out on each other. Which part was done by AI? How can they tell? What did they learn from this exercise and what didn’t they learn?
For all these approaches, especially those resisting AI, there’s the challenge of limited class time: you can’t have students do all the work in class! Consider the balance of where AI might be helpful and how students can learn to interact with it in ways that support their learning.
P.S. If you’d like to see an advanced example of stress-testing a common genre of writing, check out Arizona State University professor Andrew Maynard’s recent post, where he used OpenAI’s new Deep Research tool to write a PhD dissertation! (Spoiler: Andrew put in a lot of work, and the result was just OK and potentially passable.)
Thanks for reading AI & How We Teach Writing! Subscribe and come back to hear more ideas for AI-aware approaches to teaching writing.
I love this - so glad to find someone getting into the details of how AI impacts actual assignments and ways to flexibly approach each. The AI resist / embrace framework is also extremely useful when thinking about how to redesign more traditional assessments. The annotated bibliographies example was especially relevant for me as I embark on our traditional research paper with my HS class. Thanks for sharing. These are great resources!
Annette, this is another fantastic set of tips. I have had students use SCITE, which answers a research question and puts sources into a narrative. I have had them deconstruct the results from SCITE by looking up the original sources and asking: (1) are they, in fact, accurately depicted in SCITE, and (2) are they adequately put in dialogue with other sources? SCITE can't create a lit review but does a great job amassing sources that are "near" each other. Your post is thoughtful and gives great terms to start thinking about.