18 Case Study: Using AI in a class group project
The scenario
Amina, Dominic, and Joe were collaborating on an assignment for their public policy class on protected speech on college campuses.
As they started working, Joe uploaded information about past campus speech incidents into a GenAI tool he had access to because of his major. He had found this information in a large online collection of data and hoped the GenAI would help the group efficiently summarize the issue.
But Amina looked at him with concern and asked “Did you get permission to use all that information?” Joe paused, realizing he hadn’t thought about whether the information was protected by copyright, if it was private, or if he needed to ask permission to use it.
Meanwhile, Dominic used the GenAI to summarize a well-known legal scholar’s article on student speech. It was a great summary, but it was almost word for word the same as the original article’s abstract. He wasn’t sure what he should do.
As they reviewed the GenAI’s outputs, Amina noticed that the summarized examples of controversial speech generated by the GenAI involved students of color. Dominic pointed out that the GenAI missed key points when summarizing articles they uploaded.
Things got tense when Joe’s sections seemed to be better written, thanks to his access to premium tools – something Amina and Dominic didn’t have. They began to question the fairness of their collaboration and the outcome of their project.
The ethical concerns in this scenario include:
- Joe’s use of a dataset of campus speech incidents he accessed from a data repository. This raises ethical concerns about the repository’s Terms of Use, and about whether the data is copyrighted or contains private information. Joe should contact the repository owners to confirm that uploading the data to a GenAI tool is permitted.
- Dominic’s use of GenAI to summarize a legal argument produced essentially plagiarized text. This is a red flag for academic integrity if that summary is used in the class assignment without attribution.
- Amina’s observation of a troubling pattern – that most examples of controversial speech involved students of color – suggests there is likely bias in the GenAI’s training data.
- Dominic’s observation that the GenAI missed key points when summarizing articles raises concerns about the accuracy and completeness of its outputs.
- Joe’s use of a premium tool disadvantaged his groupmates, who didn’t have access to the same capabilities, raising questions of equity in their collaboration.