On Aug. 28, Yale announced that it would invest $150 million over the next five years to support artificial intelligence integration. Arizona State University is collaborating with OpenAI to engage its faculty and students with artificial intelligence.
They are just two of the universities that have committed to integrating AI. Conversations about whether to include AI, and how, are inevitable, as the technology has the potential to be both a tool and a threat. What professors and students have to say about it could shape the future of how AI is used in and beyond the classroom.
San Diego State University professors across disciplines view AI as inevitable, but are hesitant to fully embrace it in the classroom, particularly in writing-intensive departments such as Rhetoric and Writing and English.
“In writing specifically, I think it’s going to have to be a thing we talk about,” said Chelsea Kerford, a rhetoric and writing professor and assistant director of the SDSU Writing Center. “It’s going to be necessary to teach students how to use it.”
However, Kerford has not integrated AI into her classes. She does not allow her students to use AI, nor has she used it as a grading tool.
“I’m not doing that because one – I don’t know how, and two – I don’t trust it,” she said.
Kerford’s main concern is how students use AI, rather than the tool itself.
“A lot of students use it instead of writing things themselves,” she said.
Jessica Pressman is a professor in the English and Comparative Literature department and co-founder of the Digital Humanities Initiative at SDSU. She works in the field of electronic and digital literature and has published extensively on it, making her familiar with AI.
“As a scholar of new media, I think that it is one of my foundational goals in all my classes to teach my students to think critically about the tools they use,” Pressman said.
She is not opposed to allowing AI in her classes, but does not currently use it as a tool. However, she does teach about AI through secondary sources such as books published about, or in tandem with, AI.
Pressman stressed the importance of examining and discussing AI on a wider scale, beyond the world of academia.
“I’m hoping that it brings a more public view to the politics and real issues of new media which are less about cheating and more about, you know, hegemonic power centers and corporations that lack transparency,” she said.
Pressman explained that AI has been around for a long time but has only recently come into the spotlight as a major topic of discussion. She added that its recent popularity has raised ethical concerns that are often overshadowed by conversations regarding its applications.
Pressman referred to this ethical issue as “power and archive,” highlighting the inherent bias in the people and databases behind AI, which are often centered on Western knowledge.
“That’s what’s scary about it, is that it’s neutralizing certain types of knowledge and databases of knowledge,” she said.
Pressman said that it is especially important to think critically about the technology being used.
Professors and instructors are not the only ones taking part in these conversations. Students also play a key role, as they are more likely to embrace the technology.
“I don’t have moral qualms about using it in the first place,” said Anh-Thuat Nguyen, a first-year graduate student in the Rhetorical Writing Studies program.
Nguyen said he has used AI in some of his classes, but doesn’t actively engage with it.
“I see it as it’s really good at writing a high school paper; it’s very bad at writing a graduate paper,” Nguyen said. “It misses all the nuance of — everything.”
Roxi White, a student studying journalism and public relations, is concerned that AI removes the need for students to actively generate their own ideas and writing, and that this may harm students in the future.
“I’d rather present my own ideas than come up with whatever the internet tells me to,” White said. “I think a lot of people are not patient enough to even look up things. It will lead to more brain rot than we are already experiencing.”
While conversations about AI often focus on cheating and plagiarism, the institutions behind AI and its social and emotional consequences are frequently overlooked.
“For me, reading my student’s writing is so important to like everything I believe in as a teacher,” Kerford said.
If Kerford or other professors were to implement AI as a grading tool, it could affect the level of interaction with a student’s writing.
“I want that connection with them. I feel like students need that – they want to be seen,” Kerford said. “It [AI] can’t experience human emotion and you need that.”
Kerford acknowledged that there are a lot of professors who don’t want to talk about using AI because they only see it as a threat.
“I think that’s the wrong mentality,” she said about professors who think AI cannot be used for good. “There are ethical ways to use it.”