Artificial Intelligence poses ethical dilemmas in classrooms

Artificial Intelligence continues rapidly developing. Photo Credit: mikemacmarketing, Flickr

By Emma Loenicker

Artificial intelligence, a technology less than a century old, is most recently making headlines for ChatGPT, a language-processing application that generates human-like text. Although the application is flawed, particularly in its inability to produce coherent responses to some user prompts (sometimes deviating from its argument in order to continue generating text), certain professors, such as Wharton professor Christian Terwiesch, have expressed satisfaction with its “superbly explained” prose.

Since its release in November by the American research laboratory OpenAI, ChatGPT has been provoking anxiety about the growing capabilities of AI. Artificial intelligence is only getting smarter, and it is now fueling efforts to rework assessment methods and permanently alter the culture of learning. Is it an innovative and valuable learning tool, or a threat to the legitimacy of standard writing practices that encourage critical thinking, cultivation, academic honesty, and a dedication to the learning process? The verdict is split.

Access to ChatGPT has stirred up a debate among faculty at educational institutions regarding the ideal course of action in response to such applications. Professors are no longer only responsible for detecting plagiarism in students’ work, but also for detecting AI-generated writing. “A culture will have to emerge around this,” said Patrick O’Neil, professor of Politics and Government at the University of Puget Sound.

Online sites, some titled as unambiguously as ‘Evil House of Cheat,’ have long been causing trouble at educational institutions. AI is not the catalyst for the culture of cheating and plagiarism that permeates educational institutions today; it does, however, lower the barrier to entry: it’s free.

AI-assisted cheating is only a symptom of a flawed system that promotes higher education as a stepping stone to certain career goals. According to Jordan Carroll, visiting assistant professor of English at the University of Puget Sound, “Education is no longer promoted as a goal worth pursuing for its own sake.”

Since education has been institutionalized as a gateway to success, pursuing learning for its own sake has been devalued. Increasingly, students pursue higher education to achieve an ‘end goal’ defined as an impressive salary rather than a well-rounded skill set and an abundance of knowledge. “There are a lot of people who pursue higher education in a really instrumental way, who are very hostile to the idea that they have to seek out education in areas that they have no interest in. So if somebody can basically carry that load for them, for them it’s like ‘that’s great, I can do more coding,’” said Professor O’Neil. It’s imperative to recognize that AI will continue to exacerbate a culture of cutting corners.

Many institutions have taken steps to block ChatGPT on campus-wide internet networks. To follow suit, some educators have taken preventative measures by revising their syllabi to transition away from written assessment, employing browsers that monitor activity, and creating assignments that surpass AI’s capabilities. In another attempt to problem-solve, a 22-year-old Princeton University senior created an app to detect ChatGPT-generated writing.

Alternatively, some educators perceive a transition away from writing as entirely negative and a disservice to the broadening of students’ academic skill sets. In higher education, and even more so at smaller institutions, the process of learning to write academically is integral to learning and to forming professional relationships.

To these educators, AI signifies the need to revise the standards by which work is evaluated rather than the curriculum and assessment methods themselves. “The success of A.I. software may be less a function of the power of A.I. than about how poorly a faculty member is evaluating a student’s work. If you’re asking it to do things that are purely descriptive, it’s pretty good. But, that raises a broader question of whether professors should be asking and evaluating students on their ability to regurgitate basic material,” said Professor O’Neil.

Sam Kigar, assistant professor of Religion, Spirituality, and Society, recognizes the value of implementing AI into coursework for educational purposes. For a writing assignment that encourages students to think critically about the political consequences of language use, Professor Kigar is attempting to demonstrate the limits of AI. Uniquely human qualities allow people to be thinkers and writers, develop a voice, and alter language in a way that ChatGPT cannot.

“It isn’t animated, and it lacks the emotion that can be seen in a person’s writing. It doesn’t feel real, and at a small university such as this one, it’s been made clear to me that professors want to engage with my thoughts and ideas through my writing,” said Sonja Black, a student of Professor Kigar’s.

To be a true participant in a language implies an ethical position in the world, something that ChatGPT does not have. “To be a thinker and to have an ethical and creative capacity are elements of intelligence that it doesn’t really have,” said Kigar. There is an art to the kind of writing humans can produce. “Let ChatGPT become a proficient writer of technical texts, but let’s focus on us and how we can become free to cultivate ourselves.”

While ChatGPT has the ability to produce objectively adequate prose, its capabilities are limited. “There’s a way of viewing AI as a de-legitimization of what we do here, and that’s a scary thought for everyone, but I’d rather focus on how it sharpens the stakes of what we do as humans,” said Kigar.

Looking forward, the solution is twofold. If people can allow AI to exist in its own lane, they may focus on the human responsibility to cultivate and think with an ethical position in the world. “AI is about results, and not about process, about what it is, and not what it’s becoming. It is ethically flawed about things, and even if they iron that out, it’s not a process of becoming. The stakes really are about making the most of a chance to value the cultivation process and exchange of ideas over time. AI is teaching us what to care about,” Kigar said.