In December, I surveyed composition students at Pitt to get their take on AI in our courses. I keep thinking about this student’s response:
"I think students should not [learn about AI in their courses], however, in order for that to happen the structure of classes must change to not allow for AI to be used. Using AI is addictive so honestly if I can use it I will."
It’s that last sentence that gets me. This isn’t the only time I’ve seen a student talk about “addiction” to AI. I read this as a cry for help: help me to avoid self-sabotaging uses of AI, because without a course structure that discourages me, I’m likely to make bad choices.
So, how can we help students make smart choices about AI in their writing?
AI is changing writing, and the worst thing we can do now is pretend that’s not true. Writers encounter AI in increasingly sophisticated autocompletion, in app integrations, and in their word processing programs. At this point, corporate writing is 10–24% AI-assisted. Students see ads from Grammarly telling them they can “win” at writing and influencers peddling AI “homework help.” ChatGPT even got a Super Bowl ad this year.
With AI at our fingertips and in our faces, opting out of it during any form of digital composition is getting increasingly difficult. The default is AI, and the settings to turn it off, if they even exist, are well hidden. AI’s incursion into our writing environments and the efficiency rhetoric that accompanies it are almost completely out of our control, and our students are struggling to make good decisions about their use of AI. The student who writes that “using AI is addictive” is, I think, asking for a respite from the relentlessness of AI in their life.
What can a writing teacher or program do?
AI-aware curriculum
We may not be able to control the degree of AI’s encroachment in writing environments our students use, but we can control how we structure our own course’s engagement with AI.
A writing curriculum should address AI skills, and first-year composition (FYC) should promote some critical AI literacy, but not every writing class needs to integrate AI. However, all classes—even those outside of writing programs—should acknowledge that AI is now out there. That means revising writing assignments that no longer lead to desired learning outcomes. If AI completes an assignment too easily, some students may be tempted to use it to shortcut learning goals—even if they know it’s a bad choice.
How we integrate AI into our writing courses should follow from our curricular and course goals. What do we think students should learn in our courses? What should our undergraduate majors know and be able to do when they graduate? We must tune our courses to meet those goals, acknowledging AI as part of our shared writing landscape, and providing structure in which students can make informed choices. I think of this approach to curriculum and course design as AI aware.
For some courses, being AI aware means actively refusing AI. The Refusing GenAI in Writing Studies guide by Jennifer Sano-Franchini, Megan McIntyre, and Maggie Fernandes provides a framework for “individuals and/or groups [to] consciously and intentionally choose to refuse GenAI use.” They outline principles about linguistic homogenization, relationships between language and power, ideological bias in technologies, labor issues, and environmental impacts of AI as reasons to refuse AI, arguing that “refusal can be a principled and pragmatic response to the incursion of GenAI technologies in college writing courses.” What I like about this guide is that it acknowledges AI as shaping our writing environments, and then it provides resources to actively work against AI’s negative influence. Further resources for this approach are available in Refusing, Rejecting, and Rethinking Generative AI in Writing Studies and Higher Education, curated by Maggie Fernandes, Megan McIntyre, Kat Gray, and Cara Marta Messina.
Other courses might adopt an AI minimalism approach. Dan Cryer, who teaches at Johnson County Community College outside Kansas City, offers what he calls “critical minimal adoption” of AI. His approach asks, “What’s the least I can responsibly do with AI in my teaching?” He urges teachers to integrate AI “creatively and responsibly” and provides examples of teaching against AI as well as teaching with AI. For Cryer, we need to acknowledge AI and have conversations with students, but AI training is not our job. He argues—persuasively, I think—that businesses should handle any AI training, and writing teachers should instead focus on what we already do well: teach rhetorical judgment, critical reading, and writing.
Finally, some courses might embrace AI integration. These courses would actively encourage students to explore AI tools and applications. What’s great about this approach is that students can do original research on AI and share it with the class, and even faculty can learn from the work. My Writing Machines class at Pitt, co-developed with Matt Burton in our School of Computing and Information, asks students to explore co-creation with AI—and I’ve learned as much as they have each semester I’ve taught it. This course counts toward both our Public and Professional Writing major and our Digital Narrative and Interactive Design major at Pitt, both of which emphasize hands-on experience with digital composing.
Also embracing AI, Tim Laquintano at Lafayette College offers an advanced professional writing class, Writing and Artificial Intelligence, in which students experiment with prompts and settings in LLMs, build a bot representing their approach to the work of the class, and compete with AI to change the reading level of policy documents. The work of that class is less focused on product and more focused on students’ critical assessment of their own and AI’s capabilities.
To document embracing AI in general Composition classes, Pamela Baker at the University of Central Florida asks students to submit a detailed GenAI Disclosure Statement that links to their AI conversations so she can track their prompts and processes. Not all of her assignments allow for AI, but for any use of AI, students are required to submit this statement. She told me that this information from students is invaluable: “Having those inputs and outputs has allowed me to see how students are using it, which LLMs they're choosing, and trace their thinking around its use.” She’s seen students begin to use AI more critically because of this open approach—and even admit that using AI is less exciting once it’s not forbidden!
Whatever the approach, being AI aware means being deliberate about the integration of AI, and aligning that integration with our stated learning goals, so that we can help students make good choices.
In the classroom
You can work on your deliberate approach to AI integration by doing an AI-aware audit. First, look at the goals of your program, course, or assignment. Work with colleagues if you can—this kind of work is better done collaboratively! Break down your goals and outcomes into the smaller tasks used to achieve each, and ask: how does the presence of AI in our writing environments change the strategies and pathways we’ve used to get to this goal? Should we revise our goals? Revise our strategies? How can we achieve these goals using pathways that acknowledge AI?
Here are some questions for conducting an AI-aware audit of your program, course, or assignment:
What are the stated student learning outcomes?
Why are these the goals, and should these goals be reaffirmed or revised?
Where do students go after graduation, or this course, or this assignment?
What do you and your faculty colleagues find important and valuable about this program, course, or assignment? What do students appreciate? A survey of students in a particular course or major can help to get these answers, if your program doesn’t already know them.
Then ask some questions related to AI:
How does AI change the pathways to those student learning outcomes?
How can assignments be revised in light of AI’s capabilities?
What program-wide policies will encourage students to make responsible choices about AI use?
What AI skills do students need to learn?
What additional assignments may be needed to get students to reach these goals?
Are there additional classes that should be added to the curriculum to meet these needs?
MIT Sloan Teaching & Learning Technologies offers an excellent resource, 4 Steps to Designing an AI-Resilient Learning Experience, which echoes some of my advice and includes a toolkit of additional questions.
Share your examples of refusing, minimizing, or embracing AI in the comments!
Thanks for reading AI & How We Teach Writing! Subscribe and come back to hear more ideas for AI-aware approaches to teaching writing.