For decades, AI (artificial intelligence) in its current form was something scientists could only dream of. Since Alan Turing laid the earliest groundwork for machine intelligence less than a century ago, we've seen the rise of programs such as OpenAI's ChatGPT and Google Gemini that can generate just about anything in the blink of an eye. Until recently, these innovations seemed as far-fetched as flying cars or traveling at the speed of light.
However, one must worry about the implications of AI for our ability to think critically, and worry even more about the CSU's confidence, or lack thereof, in its students' ability to do so.
CSU Chancellor Mildred García announced a new initiative on Feb. 4 to further the implementation of AI programs throughout all 23 campuses within the CSU system. This initiative includes an 18-month contract with OpenAI that costs $17 million. “We are proud to announce this innovative, highly collaborative public-private initiative that will position the CSU as a global leader among higher education systems in the impactful, responsible and equitable adoption of artificial intelligence,” García said in a news release.
On Jan. 10, Gov. Gavin Newsom announced a proposed $375 million cut in the CSU’s state budget. Less than two weeks later, a notice on Jan. 22 was sent to all Sonoma State University faculty about a decision to cut 23 academic programs and several departments, in addition to laying off over 100 tenured, tenure-track and untenured faculty members.
As CSU universities scramble to figure out how to reallocate their funds by downsizing most programs while also raising tuition costs (which, by the way, are expected to rise by about 34% over the next five years), spending $17 million on an 18-month contract with OpenAI to provide all students and faculty with ChatGPT seems frivolous, to say the least.
Viewed from a fiscal perspective, this shift toward an "AI-powered public university system" is concerning on its own. From a logical perspective, the idea that universities should embrace AI and encourage its use is almost oxymoronic. There is an argument to be made for AI's role as a tool in quantitative fields such as medicine, mathematics and science, as it can process large amounts of data and accelerate research.
Yet this isn't the only way it is being used, and far too many students have casually admitted to using AI in subjects that require critical and original thought, such as writing and language arts. A 2024 survey found that 86% of students admitted to using AI in their studies. Another study pulled data from Turnitin, a plagiarism-detection program used to scan students' assignments, and found that, as of March 2024, over half of the papers analyzed had been written by AI in at least some capacity.
“I think for a lot of people, AI is going to be seen as a thing that will cut time, that will just do the work for them, that cuts out what people might think is busy work or managerial work, or don’t see writing as thinking,” SDSU Writing Center Director Dustin Edwards said.
“I like to think of writing as a kind of tool for thinking. I think AI has the potential to cut that out, and I worry about what the impact of that’s going to be for generations to come.”
Edwards is also concerned about the impacts that AI will have on how the university values Instructional Student Assistants (ISAs), whose job it is to help students understand their writing assignments beyond a grade.
It also doesn't help that SDSU's policy on AI use is rather vague, leaving the line between "ethical" and "unethical" AI use extremely blurry for students. This contradicts the fact that the Center for Student Rights and Responsibilities sends students caught in AI-related academic dishonesty to the Writing Center for guidance. If there is no explicit university-wide policy, what guidance can tutors give them?
“With a big kind of blanket adoption of ChatGPT.edu at the CSU level saying we’re an AI powered university, it’s going to quickly send mixed signals,” Edwards said. “If some professors say, ‘we’re not using [AI] for this,’ then some students might be like, ‘but you’re an AI powered university, right?’”
In a time of such immense uncertainty about the fate of the educational system, the implementation of AI in universities may seem like a non-issue. However, as students increasingly opt for AI-generated feedback over help from knowledgeable peers, and the number of students using on-campus learning facilities such as the Writing Center and other tutoring centers dwindles, it's becoming more apparent that college is less about learning to think deeply and more about getting a certain grade.
This, in and of itself, is a threat to what we value most about education. Students pay thousands of dollars a year to learn, to dedicate themselves to their studies and to the pursuit of higher knowledge. If not to learn to think critically by producing original work, then what's the point?