Advancements in artificial intelligence have transformed the media we consume. These powerful programs can create realistic-looking images from a few words, hold entire conversations, and even write cited essays. While programs like ChatGPT can give us simple answers to our questions, they can also hinder our children’s learning when kids outsource their brainwork to a seemingly all-knowing robot. If you worry that your child is relying on technology a little too much, our Screen Safety Toolkit offers a resource guide so you can tighten up screen time supervision and management.
What is ChatGPT?
ChatGPT is a language processing tool powered by artificial intelligence (AI) that lets you hold human-like conversations.[1] It can answer your questions and complete a variety of tasks, from writing essays and emails to editing code. The software is free and open to the public, although a paid subscription version offers additional features.[1] Prominent technology figures like Elon Musk have commented on its power, with Musk stating, “ChatGPT is scary good. We are not far from dangerously strong AI.”[1]
ChatGPT gets its data from textbooks, websites, and various articles, which it uses to model its language so that it sounds more human.[2] It is trained on both biased and unbiased data and can reproduce information reliably, something many similar AI systems lack.[2]
When asked to write a sentence for this article, ChatGPT responded with, “ChatGPT is an artificial intelligence language model developed by OpenAI, based on the GPT-3 architecture. It is designed to generate natural language responses to a wide variety of prompts and questions. ChatGPT uses advanced machine learning algorithms to understand the nuances of language and generate context-sensitive responses that are often indistinguishable from those written by a human. It has a wide range of potential applications, from customer service and education to creative writing and more.”[3] How’s that for a definition?!
How are kids using ChatGPT to cheat?
ChatGPT’s ability to generate natural responses to a wide range of questions and prompts makes it a helpful educational and informational tool, but that same ability leaves it open to misuse. A quick Google search turns up dozens of articles on how to get ChatGPT to write your essay for you. A student at Cardiff University in Wales shared his experience turning in two papers, one written by himself and the other by ChatGPT.[4] The ChatGPT essay earned him the highest grade he had ever received on an essay in his undergraduate career.[4]
College professor and TikTok user Lilmaverick3 received an essay from one of her students that TurnItIn.com flagged as 100% written by AI, a sign that students have already started taking advantage of the AI’s ability to produce human-like writing.[5] The technology is still relatively new, but stories like this will likely keep coming.
Cheating robs children of the satisfaction of completing their own assignments and the learning experience that comes with research. It also offers a dishonest view of academic ability, which can quickly get out of hand when teachers ratchet up expectations in response.
What Parents and Educators Can Do to Prevent Cheating
Research various AI detectors and run your child’s papers through flagging software so you can see whether any part of the paper was plagiarized or generated by AI.
Some popular flagging software includes Writer’s AI Content Detector and Content at Scale’s AI Detector. For educators, we recommend having students turn in assignments through TurnItIn.com, which checks for plagiarism as well as how much of the assignment is AI-generated content.
Sit with your child while they work on assignments to offer support as they need it, and be there before they decide to turn to AI.
Manage smart devices during homework time using resources from our GKIS Screen Safety Toolkit.
Utilize ChatGPT in ways that make it an educational tool, like brainstorming ideas, creating to-do lists, and finding resources.
ChatGPT is a helpful educational tool when used properly. As a prompt-based language bot, it can spruce up a resume or cover letter based on a job description you provide, help outline a paper from an essay prompt, and even suggest recipes for weekly meals.
Promoting ChatGPT as an educational tool rather than a homework robot can prevent your child from developing an unhealthy dependence on AI software to do their work for them.
With the Covid-19 pandemic creating an unexpected need for online school, tech has become a permanent part of our children’s everyday curriculum. Teachers recognize that kids benefit from tech tools, and schools now offer individual devices for kids as young as elementary age. We’ve become reliant on screen devices at home too. Smart assistants, like devices that support Alexa, Google Assistant, and Siri, get us answers to questions in seconds. But is it helpful to outsource the brainwork we should be doing ourselves? What if you’re a kid who hasn’t yet mastered independent reasoning? Does relying on a screen device impair learning rather than support it? Are teachers aware of how much kids are outsourcing learning to their devices? If you worry that your child is using tech too much, our Screen Safety Toolkit offers a resource guide so you can tighten up screen time supervision and management.
What are smart assistants?
Smart assistants are tech devices, like phones, portable screens, watches, and speakers, that use software to perform verbally requested tasks.[1] Among the most popular are Alexa, Google Assistant, and Siri.
As of 2021, about 47% of smartphone users in the United States were Siri-equipped iPhone users.[2] In 2020, close to 70% of smart speaker users in the United States used Alexa-equipped Amazon Echos.[3] The prevalence of these two smart assistants has created an encyclopedia of knowledge accessible at the sound of our voices. In fact, Amazon promotes using Alexa for that very purpose.
Last year, Amazon ran an ad that showed a father asking Alexa what year Pompeii was destroyed. Once he receives the answer from Alexa, he shares it with his daughter, who is clearly doing homework at the dining table. She then asks him for the name of the volcano, and he turns back to the Alexa device. Although the ad is meant to show a fun interaction between father and daughter, it also demonstrates how smart devices can be used to cheat on assignments and get easy answers.
Are kids using smart devices to cheat?
Today’s students are smart, and they know how to use the technological resources they’re given. They also know how to use them to cheat and get away with it.
A college student I interviewed recalls a run-in with a classmate who had Siri turned on during a high school test. “We were in the middle of a test in my AP European history class, and suddenly the iPhone of the girl sitting next to me begins speaking. You can hear Siri say, ‘The War of 1812 was…’ before she abruptly turned it off. The teacher immediately turned to her and said, ‘Make sure all phones are turned off please.’ The student turned bright red, so I think she definitely learned her lesson.”
This instance of cheating occurred in 2017. Since then, opportunities for cheating with smart devices have become more common than ever. A recent high school graduate recalls taking tests during Covid-19. “The teachers would make us all have our cameras on, but they wouldn’t require us to be unmuted since it would be a distraction. Since they couldn’t hear us, anytime I would get stuck I would just ask the Alexa sitting in my room. Sometimes my friends and I would even Facetime each other and use my Alexa together whenever we felt confused. I honestly would never study for tests because I didn’t see a point when I could just get the information from Google so easily.”
For a student who is struggling, smart devices provide the perfect assistance for quick and easy answers to questions while simultaneously being practically untraceable. Other forms of cheating leave behind indicators or evidence, but smart assistants don’t.
It is a given that cheating is bad, but you may not know about the many downsides to cheating that go beyond academia. A recent study by a Harvard-Duke research team found that cheaters tend to engage in “self-deception,” meaning they view their high performance as a sign of high intelligence, which may not actually be true. They see a high score and, even though they cheated, believe they are smart enough to have earned it.[4] Another study found that when our kids cheat, they deprive themselves of the happiness that comes from independent accomplishment.[5] Smart devices are helpful tools, but when their assistance turns into dependence, kids put themselves at a real disadvantage.

What Parents Can Do to Prevent Cheating with Smart Assistants
Take away or manage smart devices during class or homework time using resources from our Screen Safety Toolkit.
Advocate for your kids and communicate your preferences about tech integration with teachers and school administrators.
Offer valuable information and support for your learning community by suggesting a screen safety webinar from Dr. Bennett, our Screen Safety Expert, at your school and church.
Set up tech-free learning challenges for the whole family like a family game or trivia night.
Encourage creativity, curiosity, confidence, and a love of learning by offering a variety of fun educational materials and outings.
Optimize health tech integration for the whole family with the parents-only and family coaching videos from our Screen Safety Essentials Course.
Thanks to CSUCI intern Katherine Carroll for researching kids cheating with smart assistant devices and co-authoring this article.
I’m the mom psychologist who will help you GetKidsInternetSafe.