HUNT VALLEY, Md. (TNND) — Chat, what should be the first sentence of my news article about the use of ChatGPT by students in school?
“The rise of ChatGPT in classrooms is forcing educators to rethink what academic honesty really means.”
Take it from the artificial intelligence chatbot itself – it can do the work for you.
It may not be perfect, accurate, authentic or creative, but when your brain is facing a blockage and you can’t get over the hump of starting a task or answering a question, OpenAI’s digital companion ChatGPT can help get you there instantly.
The prompt above didn’t generate just that one idea of a first sentence; that was just the “controversial/thought-provoking” output. ChatGPT also provided sentences that it considers “neutral/informative,” “trend-focused,” or “statistic-based (if you have data).”
And then it asked if it should sound more formal, more casual, or tailored to a particular audience.
The technology achieves more than search engines or Q&A bots can even fathom. It’s more like having a conversation with an expert in all things, who doesn’t care or judge what your motive is for using it.
And – everyone has unfettered access to it for free, including our nation’s students, on any device of their choosing.
One might say it’s like another brain that does the thinking for you from start to finish: brainstorming, organizing thoughts and outlining step by step how to complete the task at hand – and you can tailor the result at the end.
In more technical, scientific terms, relying on an AI chatbot to think for you is “cognitive offloading.” Users do less deep, reflective thinking and instead punch in prompts that get them where they’re going instantaneously, no brain power exhausted.
Like – Chat, explain World War II in one sentence.
“World War II was a global conflict from 1939 to 1945 involving most of the world’s nations, sparked by Nazi Germany’s expansion under Adolf Hitler, and resulting in massive destruction, the Holocaust, and a reshaping of international power.”
Or, even simpler – Chat, explain World War II to me like I’m 8 years old in one sentence.
“World War II was a big fight between many countries around the world, where some tried to take over others, but in the end, the good countries worked together and stopped the bad ones.”
As one can assume, it’s pretty easy to take a worksheet or assignment prompt from a teacher, copy and paste it into ChatGPT, and end up with completed work in one to three seconds. Yes, it would be plagiarism.
“I think for most students and for most teachers, we would all agree that asking it to write something and then copying and pasting that into a Word file and putting my name on it and turning that into you, that’s cheating,” says Eddie Watson, the Vice President for Digital Innovation at the American Association of Colleges and Universities. “But if it provides an outline, and then I write every word using that outline, is that cheating? Or, if I ask it for feedback before I turn it in, is that cheating?”
And that’s just in reference to maybe an English or writing course. What about math, science or social studies? Watson calls generative AI “the Grand Challenge” for higher education and K-12 – and says it can only be addressed by understanding what it can do.
AI can help students cognitively offload – which experts warn threatens their independent reasoning, skepticism, motivation, critical thinking and decision-making. (AI tools also filter content based on prior search and input history, so they can reinforce existing biases.)
But, AI can also help an overwhelmed student studying for three different exams prioritize what to study and when. It can answer a question for a student too nervous to raise their hand in class and ask the teacher.
It can listen to a lecture and take notes, so a student can stay engaged instead of frantically scribbling chicken scratch. It can save a student lots of time toiling over one problem or equation that’s stumping them by providing the answer – and showing its work and reasoning for how it got there.
AI can meet each student where they are individually – something one teacher with 200 students may not be able to do.
“If you’re in class and your teacher is explaining a concept and you don’t understand it, you could go to AI and say, ‘My professor talked about this in class today. Could you teach it to me but using basketball metaphors? Or using song lyrics that I’m familiar with?’” Watson says.
To demonstrate (and because it’s fun), here’s an example:
Chat, can you explain photosynthesis to me to the tune of “Twinkle, Twinkle, Little Star”?
“Photosynthesis, what a feat,
Plants make food from sun and heat.
Sunlight, water, CO₂,
Make the leaves turn green and new.
Glucose made to help them grow,
Oxygen they let it go.”
If it can come up with that in a split second, one can imagine the possibilities are endless.
But, is that really how students are using AI nowadays?
The data is ever-changing, but Harvard Business Review has been tracking the most common uses of AI in 2024 and 2025.
In 2024, the most common use was generating ideas. Other top uses included therapy/companionship, specific searches, editing text and exploring topics of interest.
This year, the leading use of AI is for therapy/companionship (aka, not education-related). The other leading uses are “organizing my life,” “finding purpose,” “generating code” and “enhanced learning” – which HBR describes as using ChatGPT to explain parts of courses that teachers might gloss over, which then gets added to notes.
Watson says most students don’t really want to cheat.
“I think the majority of students do want to learn what they’re being taught,” he says. “When they recognize that this is what cheating is and this is what cheating isn’t, most students will norm their behavior to fall within the fair practice of that particular assignment.”
Watson adds that generative AI, including ChatGPT, is “definitely going to be pervasive” – pervasive like Google searches and even calculators. But, unlike a calculator, Chat can’t really be hidden under a desk. And unlike Google, Chat is considering your input and the output you want.
He says it’s a matter of teaching students when and how to use AI properly – and why it shouldn’t be used every time a student is feeling too lazy to think about something for a second.
Because Watson says there’s no real way to keep students from using AI.
The only way, he says, is to have them do the work right in front of their teacher. Or, if it’s an online course, to implement proctoring software that tracks students’ eye movements, which does exist.
That might be feasible for a big exam or course final. If students know they have to retain the information and prove it in a classroom with no technology, they will probably do their best to learn.
And, Watson says teachers need to tell their students that applies in the real world, too.
“I’ve been telling students that you’re not likely to make $45,000 right out of college if all you can do is ask good questions and copy and paste whatever you produce,” he says. “If you don’t see your intellectual fingerprints all over whatever you’re eventually putting forward, then you’re probably not using AI well; you’re just having it do work for you.
“Research shows that if you’re transparent about your teaching choices, students will typically follow and say, ‘OK, I see why I have to do it this way because I’m trying to build these foundational skills.’”
He says he sees faculty members explicitly telling students: “You can use AI for your brainstorming phase,” or “Before you turn it in, ask for feedback on your grammar and spelling.”
Instructors’ approaches to AI will differ across fields, based entirely on what they’re trying to teach. If the lesson is spelling, no spell check allowed, but you can ask Chat for topics to write about. If it’s a scientific research paper, no copying and pasting allowed, but you can let Chat review your grammar.
But first, students and their educators need to know what AI can and can’t do.
It hallucinates facts; it reinforces biases; it can simply be plain wrong about something because it pulled the output from misinformation or propaganda. ChatGPT admits this right there on its home screen.
It’s a tall task to develop AI literacy courses when the technology is constantly evolving, merging and changing, not to mention attempts to regulate it.
But Watson says for younger kids, a very broad, general class about what AI is and what it can do can be beneficial (and quite enjoyable). Then, down the road, high school seniors could enroll in a class like “AI use for higher education.” Further down the road still: “AI use in the medical field” for med students and “AI use in law” for law students.
In fact, let’s ask Chat what AI literacy classes it thinks would be beneficial.
For elementary students, it suggests: “What is AI?,” “AI and Me” and “How Computers Learn.”
For middle schoolers: “How AI Works: Algorithms & Data,” “Bias in AI,” and “AI in Everyday Life.”
For high school and beyond: “AI & Society: Ethics and Impact,” “Intro to Machine Learning,” “Prompt Engineering & Generative AI” and “Build an AI App.”
Watson says AI doesn’t necessarily mean our nation’s kids are less intelligent or can’t think critically. AI marks a massive change in the way they learn, but they’re still learning.
He compares it to when the calculator came into existence. People said we’d lose our relationship to numbers and our math skills because we were no longer adding columns by hand.
But really, calculators opened the door for statistics courses: instead of spending all their time calculating the correct answer, students could focus on the implications of the results, and how to read and interpret them.
Watson maintains there are some things AI will never be able to do. Therefore, students still need to learn how to function in and contribute to society.
“It actually elevates what humanity can do, because it frees us up from more manual, tedious tasks,” Watson says. “The human element, the judgments, the interpersonal skills, all of the things we see as durable skills like being able to speak well and work well on teams, things that we would never see AI, at least currently, put in the middle of – those skills are likely to be more prized.
“We’ll be able to engage in more of that kind of work whenever we’re not having to spend as much time adding columns of numbers.”