Who (or what) makes the content your kids watch on YouTube? In some cases, it’s hard-working creators who strive to make quality videos for entertainment or education. In other cases, it’s a computer program designed to churn out videos that rack up views and big profit. With this in mind, it is up to parents to ensure that their kids have a safe and fun experience while online. For helpful and empowering tools to establish a safe screen home environment, check out our Screen Safety Essentials Course. Today’s GKIS article tells you what you need to know to make YouTube viewing safer for your kids.
Bots!
Bots are computer programs designed by people or other bots to carry out specific online tasks. Not all bots are bad. However, they can run without any oversight from an actual human being.
One application for bots is creating YouTube videos for kids. More specifically, these bots splice together video segments and post the results over and over to test how many views each combination gets. Once the tests are completed, the bot has created and run videos that ultimately make money for the programmer. Now that’s artificial intelligence!
Bot-Made Videos
Bot-made videos can look like normal kids’ videos, but they are typically a bit stranger. They often contain just enough story to string the randomly chosen segments together, but not enough for everything happening to make logical sense. There are just enough familiar elements to hold a child’s attention, but nothing educational or valuable to a child.
These videos distract kids long enough to get them to view ads and may even cause harm. After all, many times a human’s eyes have not viewed the video, and bots can’t discriminate a harmful video from a harmless one. At a glance, parents can’t discriminate either. Plus, most parents simply don’t take the time to preview thousands of videos their kids browse each day – especially from beginning to end.
Using Branded Characters to Bait Kids
One element that gets kids searching and watching is recognizable characters. Although branded characters are used without permission and are placed in a disjointed storyline, kids will select these videos and stay glued to them expecting entertainment. For example, in her book Screen Time in the Mean Time, Dr. Bennett describes an alarming video portraying the popular kids’ cartoon character Peppa Pig screaming while being tortured in a dentist’s chair. The beginning of the video looks like a regular Peppa Pig story. But near the middle, the story takes a confusing, terrible turn. Inappropriate video content may be shocking and even funny to older kids, but vulnerable young children don’t have the insight or sophisticated skill set to look away. This can feel like a violent ambush and result in confusion, shame, and trauma.
Auto-play
Kids don’t always view these videos because they searched out the characters. Sometimes the videos are offered to them automatically in their feed. Auto-play is a YouTube feature where a new video automatically starts after the one currently playing ends. Auto-play selects a video similar to the one you just watched, based on tags that content creators attach to their videos when they post them. If auto-play is left on too long, it can lead a viewer down a rabbit hole of similar but stranger and stranger videos until they fall into bot-generated content.
The Algorithm
Unfortunately, bot-made videos and other inappropriate content can slip onto YouTube relatively easily. The huge volume of content uploaded to YouTube every day means that having a human being review every video would be impossible. Instead, YouTube filters the content uploaded to its site another way: with a bot of its own.
YouTube’s algorithm is, in essence, a much more advanced form of a bot that can scan through every video as it’s uploaded and automatically flag anything that violates YouTube’s terms of service, or at least that’s what it’s supposed to do. Unfortunately, YouTube’s algorithm can’t detect every violation. It’s looking for the very specific things it was programmed to look for, and videos that don’t contain those specific violations slip by the filters. Many content creators have learned exactly what the algorithm is looking for, and some of them use that knowledge to slip inappropriate content past the censors.
YouTube’s algorithm is also responsible for other features on the site including auto-play. The algorithm is what decides what’s worth showing next after a video, and what isn’t. However, the algorithm is only capable of discerning what videos are similar to others based on the tags assigned to a video. If a bot learns to place all the relevant tags for child content on an automatically generated video, then the algorithm will suggest it as if it were normal child content.
What can you do about bot content?
There are a few things that you as a parent can do to protect your children from bot-generated content:
Check in on your kids when they’re watching YouTube
Checking in periodically lets you make sure the algorithm hasn’t drifted too far away from where your child started.
Get Help
Monitoring everything your child watches can be a daunting task, but GKIS is here to help. Our Social Media Readiness Course is designed to teach your tweens and teens how to spot red flags on social media sites and while they’re gaming.
Turn off auto-play
The auto-play feature can be disabled by clicking the auto-play toggle at the bottom of YouTube videos. The toggle appears as a small black-and-white play button, which is replaced by a black-and-white pause button while disabled. With this feature turned off, YouTube will no longer pick the next video your child watches and will instead wait for you to manually choose it.
Limit your child’s time on YouTube
The bot-generated content of YouTube is at the bottom of the algorithm’s list of choices. Children often end up being presented with bot-generated content after spending too much time watching videos on YouTube. Our Connected Family Course has screen management strategies and safe-screen home setup ideas to help you manage your child’s screen time.
If you do catch your kids being exposed to an inappropriate video, report it.
Videos reported to YouTube as inappropriate are reviewed by real people who can recognize the video for what it is. An offending video will be deleted permanently, and repeated violations can get the channel it comes from deleted entirely.
Thanks to CSUCI intern, Jason T. Stewart for researching bot-generated content and co-authoring this article.
I’m the mom psychologist who will help you GetKidsInternetSafe.
Onward to More Awesome Parenting,
Tracy S. Bennett, Ph.D.
Mom, Clinical Psychologist, CSUCI Adjunct Faculty
GetKidsInternetSafe.com
Works Cited
Robertson, Adi. “What Makes YouTube’s Surreal Kids’ Videos So Creepy.” The Verge, https://www.theverge.com/culture/2017/11/21/16685874/kids-youtube-video-elsagate-creepiness-psychology

Maheshwari, Sapna. “On YouTube Kids, Startling Videos Slip Past Filters.” The New York Times, https://www.nytimes.com/2017/11/04/business/media/youtube-kids-paw-patrol.html

Oremus, Will. “Even YouTube’s Service for Kids Is Being Abused. Can Anything Control the Massive Platforms That Now Shape Our Lives?” Slate, https://slate.com/technology/2017/11/those-disturbing-youtube-videos-for-kids-are-a-symptom-of-techs-scale-problem.html
Photo Credits
Photo By: Kaufdex (https://pixabay.com/photos/youtube-media-screen-mac-apple-2449144/)
Photo By: Gerd Altmann (https://pixabay.com/illustrations/binary-one-cyborg-cybernetics-1536624/)
Photo By: Gerd Altmann (https://pixabay.com/photos/hacker-attack-mask-internet-2883632/)
Photo By: Markus Trier (https://pixabay.com/photos/homeschooling-school-technology-5121262/)