
19 Case Study: GenAI peer review

The scenario

In their writing class, students Noah, Hanan, George, and Shayla were tasked with peer reviewing one another’s final papers: five-page personal narratives on the theme “Tell me about a time you struggled.” Their professor, Dr. Samdi, emphasized the importance of this process, noting that it was worth 15% of their final grade and was designed to help them refine their work before submission. They were instructed to provide feedback on tone, content, and writing style. The group members worked separately over the weekend, leaving comments directly in each other’s Google Docs.

Shayla, however, ran into a problem. On Saturday night, while trying to complete the reviews, she felt overwhelmed. The papers were all very personal, and she worried her comments might be too critical or, conversely, not helpful enough. Unsure of how to approach the feedback criteria and feeling like she’d only be repeating what her peers had already said, she decided to seek an alternative solution. Shayla downloaded her group members’ papers, uploaded them to Microsoft Copilot, and prompted the GenAI to generate feedback. She specifically asked for suggestions on “improving the flow and clarity of the narrative” and “strengthening the emotional impact.” She then copied and pasted the GenAI’s suggestions, which included academic-sounding phrases like “The author could benefit from a more nuanced exploration of the subtext” and “Consider adjusting the narrative arc to enhance reader engagement,” into each of her group members’ documents.

When the group met in class on Monday, they pulled Shayla aside. “Your comments were… interesting,” Noah said, a puzzled look on his face. Hanan added, “Yeah, they sounded really formal and generic, almost like they were from a textbook.” When Shayla explained that Copilot had helped her because she didn’t understand the assignment, her group members were visibly upset. George sighed, “We spent hours on your paper, giving you specific suggestions. We expected the same from you.”

Considerations with Shayla’s actions

Google Gemini generated image of four students around a table in a classroom with whiteboards on the wall, depicting the students in this scenario. This is the second attempt; the first attempt included five students, and Google Gemini admitted that “The image generation tool sometimes takes creative liberties and can misinterpret the number of subjects requested in a prompt.”

Ethical Concerns: Using an AI tool to complete a peer review assignment could be seen as a form of scholastic dishonesty. It bypassed the learning objectives of the task: developing critical reading and analytical skills and providing structured feedback.

Privacy and Risks: The papers were all personal, meaning they could have contained private information about the group members. None of the group members consented to having their work uploaded to a GenAI tool. Shayla’s classmates had a reasonable expectation that their personal stories would be read only by their trusted peers and professor. Shayla should have asked before uploading content that wasn’t hers.

Fairness: By using a GenAI tool, Shayla didn’t put in the required effort to read and understand her group members’ papers. This suggests she didn’t trust her own ability to provide valuable feedback. It also created a sense of unfairness, as she received a benefit (detailed feedback on her paper) without contributing equally to the group.

License

GenAI+U: A Student Learning Experience Copyright © 2025 by University of Minnesota Libraries is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.