Connecticut Professors Fear Dependence, Cognitive Decline Over AI Use

(TNS) — Lloydia Anderson has listened to the common complaint among many students her age: a lack of time to get anything done.

With some students juggling jobs and others prioritizing certain classes to free up time, Anderson said she has heard other students at Central Connecticut State University talk about using ChatGPT to get assignments done.

“Some people use it for time management and some don’t really think an assignment is important,” she said, describing the apathy of some students to the AI chatbot’s increasing role in education.


Anderson said she deliberately does not use ChatGPT for fear that it would diminish her capacity to think. She is equally concerned that her future job could be replaced by AI. Anderson, a junior at CCSU, is studying sociology and philosophy and hopes to advance to a higher level of education.

AI is changing the landscape of higher education as professors revise curriculum and testing methods to try to ensure students are relying on their own thinking rather than AI. Several professors at CCSU who spoke with the Courant shared how they fear AI is causing a decline in cognitive ability and a dependence on technology to complete assignments. The concern is not just that students are finding a shortcut around doing the work; it’s that students will no longer be able to think critically or perform the simplest of tasks without technological assistance.

At the same time, AI education experts say AI is not going anywhere and that it is the responsibility of educators to ensure that students are taught to use it ethically and responsibly, finding ways to leverage it to promote learning rather than replace the brain’s cognitive thinking skills.

IDENTICAL ASSIGNMENTS, CITATION ERRORS

Teaching an online class over the summer that required students to connect course material to pop culture themes, associate professor of philosophy Audra King knew something was awry when she discovered that three of the students’ essays were essentially identical, using the same terminology and ideas.

King determined that the students, who did not know each other, used ChatGPT — a troubling trend that she is seeing more often these days.

“A lot of students, and I want to say faculty too, put a prompt in ChatGPT and have it spit back the answer,” said King. “It is making things harder. They already have a decreased attention span. They have lower critical thinking skills.”

And the authenticity of what ChatGPT produces in some instances also remains a question.

Brian Matzke, digital humanities librarian at the Elihu Burritt Library, said that every couple of weeks a student will come to the research desk looking for articles they have listed, none of which can be found because they don’t exist.

Ricardo Friaz, assistant professor of philosophy, said ChatGPT is “reproducing patterns of language that includes real citations and inaccurate citations.”

“It reproduces a lot of the biases that go into it,” Friaz said. “It takes from the corpus of knowledge and is reproducing what has been said.”

Vahid Behzadan is associate professor of computer science and data science at the University of New Haven and cofounder of the Connecticut AI Alliance, a consortium of 21 universities, industrial action groups and communities across the state. ChatGPT, Behzadan said, “doesn’t just string words together: It can follow a prompt with several coherent paragraphs, shifting smoothly across subjects and styles.”

“That means it demonstrates not only a strong command of language, but also a practical grasp of commonsense and specialized knowledge,” he said.

COGNITIVE DECLINE

Several studies have emerged showing a correlation between the use of ChatGPT and cognitive decline.

An MIT study released this past June included 54 participants who were assigned to three groups: an LLM (large language model) group that used ChatGPT, another that used the Google search engine and a third that used no tools at all. The participants were required to write an essay.

The study used electroencephalography (EEG) to record the participants’ brain activity, according to information on the study.

Results from the study measuring brain activity over four months found that “LLM users consistently underperformed at neural, linguistic, and behavioral levels,” according to the study.

“These results raise concerns about the long-term educational implications of LLM reliance and underscore the need for deeper inquiry into AI’s role in learning,” the study stated.

Another study from researchers at the University of Pennsylvania in 2024 found that “Turkish high school students who had access to ChatGPT while doing practice math problems did worse on a math test compared with students who didn’t have access to ChatGPT,” according to the Hechinger Report.

Behzadan said the studies are limited and research is still in early stages as use of chatbots evolves. In August, ChatGPT had 700 million weekly users, a number that has quadrupled in the past year.

He emphasized the importance of cognitive fitness, which offers long-term benefits including general health, well-being and quality of life.

Tomas Portillo, a junior at CCSU studying mathematics and philosophy, said he does not use AI in his studies. But many students, he said, rely so heavily on it that they would rather ask ChatGPT for help than a professor.

King has taught philosophy for 17 years and has seen how students approach assignments.

“I do see less students thinking abstractly,” she said, explaining that years ago students would come to class having read an assigned article and ask questions.

“I am seeing less engagement and feedback,” she said. “It is the humanity, the originality and the individuality that is completely lost and vanished when we rely on ChatGPT.”

Matzke said AI itself is not what he would call his primary concern.

“If the concept of AI technology were to emerge in a world that already valued critical thinking and humanities, it would be a great opportunity, but it is amplifying a lot of existing problems.”

OVERWORKED AND EXHAUSTED

King said students today are overworked and exhausted by juggling numerous priorities.

“They don’t have time and don’t have energy,” she said. “We are in a perfect storm of capitalism. There is also an attention span issue. TikTok and social media have made it easy to rely on outside sources.”

Matzke said that students are very outcome-oriented and that they are good at regurgitating facts and writing in a bullet-pointed style.

“But trying to identify the relationship between the concept and how X leads to Y is much more difficult,” he said.

Portillo said while he does not use AI in his studies, he can never seem to escape it because it’s pervasive in the algorithms on social media and in many facets of everyday life.

“I feel like people are more self-centered nowadays with all this access to technology,” he said. “It seems like social media is the medium between us that adds a layer of obfuscation.”

He also noted how AI has changed the way classes are structured. Several professors said they have changed their curriculum, requiring more in-class essays and quizzes.

“There are a lot more in-class assignments,” said Anderson.

MISUNDERSTANDING AI

Friaz, who is teaching a philosophy research and writing course designed to teach students to research and write in a world where AI claims to be better at those things than humans, said that students are often not aware that they are “offloading their thinking” by putting their essays into ChatGPT before handing them in.

“They don’t think they are using AI,” he said. “Writing is your thinking and they are not aware they are offloading their thinking. It makes it hard to talk about it.”

Friaz said the goal of his class is for “students to become strong researchers and writers by researching AI, experimenting with it and thinking critically about how it affects society.”

In today’s society, Friaz said, students feel pressure to spend time on what is most relevant to their jobs with the premise that “writing is not essential and they don’t have to do it and have to suffer through it.”

AI IS HERE TO STAY

When ChatGPT first became a mainstay at universities in 2022, many universities attempted to ban the use of AI, Behzadan said. But that backfired.

“Whether you are going to ban it or not, everyone is going to use it,” Behzadan said. “It is inevitable.”

Behzadan said universities began trying to develop guidelines for ethical use of AI, like citing AI when it’s used.

With AI not going away anytime soon, he said, students need to learn how to use it so they can adapt to changing skills and job requirements.

“AI is here to stay,” he said. “It has become more advanced and we need to adjust and evolve our curriculum to embrace AI while also enabling our students to make beneficial and ethical use of the AI technology.”

Friaz said the key is “teaching it as a tool to aid you rather than something that thinks for you.”

King said she worries about the future and the continuing emergence of AI.

“The more students are relying on technology to think for them, the less they are going to connect with and grow those emotional connections in the real world,” she said.

©2025 Hartford Courant. Distributed by Tribune Content Agency, LLC.




