


Do You Know What YouTube Is Showing Your Kids?

Who (or what) makes the content your kids watch on YouTube? In some cases, it’s hard-working creators who strive to make quality videos for entertainment or education. In other cases, it’s a computer program designed to efficiently produce videos for a lot of views and big profit. With this in mind, it is up to parents to ensure that their kids have a safe and fun experience while online. For helpful and empowering tools to establish a safe screen home environment, check out our Screen Safety Essentials Course. Today’s GKIS article tells you what you need to know to make YouTube viewing safer for your kids.

Bots!

Bots are computer programs designed by people or other bots to carry out specific online tasks. Not all bots are bad. However, they can run without any oversight from an actual human being.

One application for bots is creating YouTube videos for kids. More specifically, in this capacity bots combine video segments and post them over and over to test how many views they get. Once the tests are completed, the bot has created and run videos that ultimately make money for the programmer. Now that’s artificial intelligence!
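To make the idea concrete, here is a toy sketch in Python of how such a program might work. The segment names and view counts are made up for illustration; a real bot would pull view counts from actual analytics and would be far more elaborate.

```python
import random

# Hypothetical stock segments a bot might splice together.
SEGMENTS = ["nursery rhyme", "surprise egg", "cartoon chase", "color song", "toy unboxing"]

def make_video(rng):
    # Randomly string a few stock segments together; no human editor involved.
    return tuple(rng.sample(SEGMENTS, k=3))

def simulated_views(video, rng):
    # Stand-in for real view counts, which a real bot would read from analytics.
    return rng.randint(100, 100_000)

def run_bot(n_videos=50, seed=0):
    # Post many random combinations, then keep whichever "formulas" drew the most views.
    rng = random.Random(seed)
    results = [(simulated_views(v, rng), v) for v in (make_video(rng) for _ in range(n_videos))]
    results.sort(reverse=True)
    return results[:5]  # the winning recipes the bot will repeat, over and over

for views, video in run_bot():
    print(views, "->", " + ".join(video))
```

The loop is the whole business model: generate, post, measure, repeat what earns.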

Bot-Made Videos

Bot-made videos can look like normal kids' videos, but they are typically a bit stranger. They often contain just enough story to string the randomly chosen segments together, but not enough for everything that happens to make logical sense. There are just enough familiar elements to hold a child's attention, but nothing educational or valuable to a child.

These videos distract kids long enough to get them to view ads and may even cause harm. After all, many have never been seen by human eyes, and bots can't distinguish a harmful video from a harmless one. At a glance, parents can't either. Plus, most parents simply don't have time to preview the thousands of videos their kids browse each day, especially from beginning to end.

Using Branded Characters to Bait Kids

One element that gets kids searching and watching is recognizable characters. Although branded characters are used without permission and are dropped into a disjointed storyline, kids will select these videos and stay glued to them expecting entertainment. For example, in her book Screen Time in the Mean Time, Dr. Bennett describes an alarming video portraying the popular kids' cartoon character Peppa Pig screaming while being tortured in a dentist's chair. The beginning of the video looks like a regular Peppa Pig story, but near the middle, it takes a confusing, terrible turn. Inappropriate video content may be shocking or even funny to older kids, but vulnerable young children don't have the insight or the sophisticated skill set to look away. This can feel like a violent ambush and result in confusion, shame, and trauma.

Auto-play

Kids don't always view these videos because they searched for the characters. Sometimes a video is offered to them automatically in their feed. Auto-play is a YouTube feature that automatically starts a new video after the one currently playing ends. Auto-play selects a video similar to the one you just watched, based on the tags content creators attach to their videos when they post them. Left on long enough, auto-play can lead a viewer down a rabbit hole of similar but stranger and stranger videos until they land in bot-generated content.

The Algorithm

Unfortunately, bot-made videos and worse can slip onto YouTube relatively easily. The huge volume of content uploaded every day makes it impossible for human beings to review every video. Instead, YouTube filters the content uploaded to its site another way: with a bot of its own.

YouTube's algorithm is, in essence, a much more advanced bot that scans every video as it's uploaded and automatically flags anything that violates YouTube's terms of service. At least, that's what it's supposed to do. Unfortunately, the algorithm can't detect every violation; it looks only for the specific things it was programmed to find. Videos that don't contain those specific violations slip past the filters. Many content creators have learned exactly what the algorithm looks for, and some use that knowledge to slip inappropriate content past the censors.

YouTube's algorithm is also responsible for other features on the site, including auto-play. The algorithm decides what's worth showing next after a video ends and what isn't. However, it can only judge which videos are similar based on the tags assigned to them. If a bot learns to place all the relevant child-content tags on an automatically generated video, the algorithm will suggest it as if it were normal children's content.
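A rough sketch shows why copied tags are enough to get a bot video recommended. This is a deliberate simplification with invented tags and titles; YouTube's real ranking system weighs far more signals than tag overlap.

```python
def tag_overlap(a, b):
    # Jaccard similarity: shared tags divided by total distinct tags.
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Tags on the video a child just finished watching (hypothetical).
just_watched = {"peppa", "cartoon", "kids", "nursery rhymes"}

candidates = {
    "legit kids cartoon": {"peppa", "cartoon", "kids", "episode"},
    "cooking show": {"recipe", "kitchen", "dinner"},
    "bot-generated video": {"peppa", "cartoon", "kids", "nursery rhymes"},  # tags copied wholesale
}

# A tag-only recommender picks whichever candidate best matches the current video.
best = max(candidates, key=lambda name: tag_overlap(just_watched, candidates[name]))
print(best)  # prints "bot-generated video"
```

Because the bot copied the tags wholesale, its video scores a perfect match and beats the legitimate cartoon. Nothing in this scoring ever looks at what is actually in the video.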

What can you do about bot content?

There are a few things that you as a parent can do to protect your children from bot-generated content:

Check in on your kids when they’re watching YouTube

Check in every few videos so you can be sure the algorithm hasn't drifted too far from where it started.

Get Help

Monitoring everything your child watches can be a daunting task. GKIS is here to help. Our Social Media Readiness Course is designed to teach your tweens and teens how to spot red flags on social media sites and while they're gaming.

Turn off auto-play

The auto-play feature can be disabled by clicking the auto-play button at the bottom of YouTube videos. The button appears as a small black-and-white play button and is replaced by a black-and-white pause button while disabled. With this feature off, YouTube will no longer pick the next video your child watches and will instead wait for someone to choose one manually.

Limit your child’s time on YouTube

Bot-generated content sits at the bottom of the algorithm's list of choices, so children often end up seeing it after long stretches of watching videos on YouTube. Our Connected Family Course has screen management strategies and safe-screen home setup ideas to help you manage your child's screen time.

If you do catch your kids being exposed to an inappropriate video, report it.

Videos reported to YouTube as inappropriate are reviewed by real people who can recognize the video for what it is. An offending video will be permanently deleted, and repeat offenses can get the channel that posted it deleted entirely.

Thanks to CSUCI intern, Jason T. Stewart for researching bot-generated content and co-authoring this article.

I’m the mom psychologist who will help you GetKidsInternetSafe.

Onward to More Awesome Parenting,

Tracy S. Bennett, Ph.D.
Mom, Clinical Psychologist, CSUCI Adjunct Faculty
GetKidsInternetSafe.com

Works Cited

Robertson, Adi. "What makes YouTube's surreal kids' videos so creepy." The Verge, https://www.theverge.com/culture/2017/11/21/16685874/kids-youtube-video-elsagate-creepiness-psychology

Maheshwari, Sapna. "On YouTube Kids, Startling Videos Slip Past Filters." The New York Times, https://www.nytimes.com/2017/11/04/business/media/youtube-kids-paw-patrol.html

Oremus, Will. "Even YouTube's service for kids is being abused. Can anything control the massive platforms that now shape our lives?" Slate, https://slate.com/technology/2017/11/those-disturbing-youtube-videos-for-kids-are-a-symptom-of-techs-scale-problem.html

Photo Credits

Photo By: Kaufdex (https://pixabay.com/photos/youtube-media-screen-mac-apple-2449144/)

Photo By: Gerd Altmann (https://pixabay.com/illustrations/binary-one-cyborg-cybernetics-1536624/)

Photo By: Gerd Altmann (https://pixabay.com/photos/hacker-attack-mask-internet-2883632/)

Photo By: Markus Trier (https://pixabay.com/photos/homeschooling-school-technology-5121262/)

Is YouTube Still Targeting Your Kids?

In 2019, YouTube was fined 170 million dollars for illegally advertising to kids. In this article, we’ll cover how YouTube broke the law designed to offer protection for children online, what they did to fix it, and the gap that still puts kids at risk.

To help protect your kids from inappropriate content on the internet, check out our Screen Safety Essentials Course. This program offers weekly parent- and family-oriented coaching videos that will help you create a safer screen home environment and foster open communication, all while connecting and having fun as a family. Dr. Bennett's coaching helps parents make more informed decisions about internet safety and educates families so they can use good judgment when encountering risks online.

What is COPPA?

The Children's Online Privacy Protection Act (COPPA) requires websites to get parents' permission before collecting identifying data (like a kid's name or address), or cookies from the computer a child under 13 is using. A cookie is a small data packet a website sends to a computer, which the computer returns to the website on later visits. These data packets let websites track a user and record their actions on the site. Any company caught violating COPPA may be fined up to $42,530 per violation.
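In code terms, a cookie is just an ID the site hands your browser and then reads back on every visit. The sketch below is bare-bones and the page names are invented; real cookies also carry expiry dates, domains, and security flags, and real tracking spans many sites.

```python
import uuid

visit_log = {}  # what the site accumulates, keyed by cookie ID

def serve_page(page, cookie=None):
    # First visit: the site issues a cookie (a random ID) for the browser to store.
    if cookie is None:
        cookie = str(uuid.uuid4())
    # Every later visit: the browser sends the ID back, and the site logs the activity.
    visit_log.setdefault(cookie, []).append(page)
    return cookie

my_cookie = serve_page("cartoons")       # browser stores the returned ID
serve_page("toy reviews", my_cookie)     # ...and sends it back on the next visit
serve_page("nursery rhymes", my_cookie)

print(visit_log[my_cookie])  # prints ['cartoons', 'toy reviews', 'nursery rhymes']
```

That accumulating log, tied to one ID, is exactly the kind of child profile COPPA requires parental permission to build.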

COPPA applies to any website that is aimed at children or has an audience that can include children, such as:

  • PBS Kids
  • Sesame Street
  • Nickelodeon
  • Cartoon Network

How did YouTube break the law?

In 2015, YouTube created a secondary website and app called YouTube Kids dedicated to content for children ages 12 and under. YouTube makes the bulk of its revenue by selling ads and gathering customer data, which is valuable to marketers because it helps them better target advertisements. YouTube Kids gathered child customer data using cookies without parental permission. This was a violation of COPPA, and as a result, YouTube was fined 170 million dollars.

YouTube marketed itself to advertisers based on its popularity with children and made millions of dollars in the resulting revenue. This led to a surge of kid-oriented content creators making quick, easy-to-produce videos to capitalize on those advertisers. For example, toy unboxing videos became popular because they were easy to produce and generated a lot of views. These content creators also violated COPPA, because they profited from YouTube's violation.

What has YouTube changed?

The good news is that YouTube no longer collects your children's personal identifiers and no longer allows advertisements that attempt to collect them. YouTube, along with the FTC, has also cracked down on content creators who intentionally abused the ad revenue system by mass-producing content while YouTube was still collecting kids' data. Those channels were reported by YouTube and reviewed by the FTC, and channels found guilty were fined for their own COPPA violations.

YouTube also has guidelines limiting what can be advertised to children. For example, YouTube does not allow any food or beverage advertising aimed at children. It has also added content filters meant to catch kid-oriented content and ensure that advertisements that collect data can't show up on those videos.

But kids are still viewing inappropriate content

The bad news is that YouTube's advertisement system isn't perfect. YouTube may no longer be able to target advertisements at your child specifically, but advertisers can still reach children through videos marked as made for kids on the main site, or through the secondary YouTube Kids site. YouTube has extra guidelines for kid-oriented advertisements; however, it does not regulate video content the same way it regulates advertisements. For example, YouTube won't allow a thirty-second Kool-Aid ad aimed at kids, but Kool-Aid can make a channel and post videos that are essentially advertisements dressed up as entertainment for children. If you'd like to learn more about how advertising affects your children, GKIS has an article detailing just that linked here.

What does this mean for your child on YouTube?

YouTube has put better practices into place since the COPPA fine, but that doesn't mean its business model is any different. YouTube still makes the majority of its money from advertisements. The website may not be collecting your child's data, but their attention is still a commodity being sold. Content on YouTube can be fun and even educational for children, but you have to be careful about what content your kids are watching.

What can you do to protect your kids on YouTube?

Check what your kids are watching

If you check in on what your child is watching every few videos, you can be sure they haven't slipped into watching advertisements dressed up as videos.

Familiarize yourself with your child’s favorite creators

Check a couple of their videos and make sure the content is something you want your child to watch. This also lets you confirm the creator isn't advertising anything to your children in their videos.

GKIS How to Spot Marketing supplement

Here at GKIS, our How to Spot Marketing supplement will teach your kids about the strategies marketers use and help them identify when a video is really an advertisement in disguise.

GKIS Social Media Readiness Course

Dr. Bennett's Social Media Readiness Course teaches your kids how to be safe online and to recognize the risks found on social media sites and in gaming.

Thanks to CSUCI intern, Jason T. Stewart for researching YouTube’s COPPA fine and co-authoring this article.

I’m the mom psychologist who will help you GetKidsInternetSafe.

Onward to More Awesome Parenting,

Tracy S. Bennett, Ph.D.
Mom, Clinical Psychologist, CSUCI Adjunct Faculty
GetKidsInternetSafe.com

Works Cited

"Google and YouTube Will Pay Record $170 Million for Alleged Violations of Children's Privacy Law." FTC, https://www.ftc.gov/news-events/press-releases/2019/09/google-youtube-will-pay-record-170-million-alleged-violations

"What Are Cookies." Norton, https://us.norton.com/internetsecurity-privacy-what-are-cookies.html

Cobb, Stuart. "It's Coppa-cated: Protecting Children's Privacy in the Age of YouTube." Houston Law Review, https://houstonlawreview.org/article/22277-it-s-coppa-cated-protecting-children-s-privacy-in-the-age-of-youtube

"Advertising on YouTube Kids." Google, https://support.google.com/youtube/answer/6168681?hl=en

Photo Credits

Photo by Tymon Oziemblewski from Pixabay (https://pixabay.com/photos/youtube-laptop-notebook-online-1158693/)

Photo by Pradip Kumar Rout from Pixabay (https://pixabay.com/photos/cyber-law-legal-internet-gavel-3328371/)

Photo by allinonemovie from Pixabay (https://pixabay.com/illustrations/minecraft-video-game-blocks-block-1106253/)

Photo by Chuck Underwood from Pixabay (https://pixabay.com/photos/child-girl-young-caucasian-1073638/)

 

Do You Know Bloody Mary, Talking Angela, and Slender Man? Because You Should!

Generation after generation, kids are terrified by urban legends and fear challenges like Bloody Mary and the man who stalks lovers with his hook arm. These stories used to be shared person to person at slumber parties. Now, with screen media, kids are triggered by news stories and visit forums like Creepypasta and Reddit. Monsters not only exist for them in the nonvirtual world; they also stalk them online. What can parents do to minimize exposure to online content that may result in disabling fears?

This May, my daughter Morgan was a block away from the Isla Vista murders in Santa Barbara, California, and five minutes from being a potential target. She lost one friend that night, and another was shot in the leg. We were so grateful the next day to have our daughter in our arms, though aching for the parents who lost their children that night. Morgan's grief was heartbreaking; her anguish continued for days as she startled at gunshot noises on the television and ducked upon seeing a motorcycle rider simply watching passersby from the highway overpass. Despite our every effort to protect her, Morgan's sense of safety in the world is forever shaken.

And now we learn of the Slender Man stabbing, in which two 12-year-old friends plotted against and stabbed a third friend 19 times after a slumber party. The perpetrators are being charged with attempted murder, and their victim is fighting for her life. As a clinical psychologist, I am painfully aware that her physical wounds will heal long before her psychological wounds. For those who haven't read the headlines, Slender Man is an online character designed as a paranormal monster, "a creature that causes general unease and terror," as described by his creator in a podcast interview.

I cannot ethically comment on the psychological state of the perpetrators in these crimes. However, there is no question that parents should take note and ask what role gaming and internet activities played in the motivation of the Isla Vista killer and the Wisconsin children, all of whom plotted their murders over months. The girls reportedly thought their violent act could make them "proxies" of the fictional character Slender Man.

We are all very familiar with this kind of monster image, shared around campfires and at slumber parties in every generation. I vividly remember, when we were young, chanting into the mirror with my girlfriends in a dark room, "Bloody Mary, Bloody Mary, Bloody Mary," convinced that she would appear to us. Each of us insisted she knew the child who had actually seen Bloody Mary in the mirror. And despite our terror, we too hoped the legendary ghost would choose us for her appearance, kind of… To this day, Bloody Mary continues to haunt children, frightening some into fearing being alone, or in the dark, for days or months after the legend is shared. Some kids are even traumatized to the point of coming in for psychotherapy to relieve sleep deprivation and fears that have reached clinically impairing proportions.

Talking Angela is another recent urban legend that swept playgrounds several months ago. It concerns an internet app featuring a white kitten who, when spoken to, repeats the user's words in a kitty voice. The rumor suggested a pedophile had co-opted the app and could see his victim and speak to her while the app was used. During Friday morning coffee with my girlfriends, the 8-year-old daughter of a friend accompanied her mother because she was too scared and hysterical to go to school after hearing the story the night before. The young girl told me with wide eyes that she personally knew the child to whom the pedophile spoke, saying, "How old are you? I know you're not 30 years old, because I can see you." Her mother shared that her kids frantically insisted that all screens in the house be covered with a sheet before they went to sleep. Another mother in the group shared how her daughter ended up in their bed, terrified night after night, when she learned of Bloody Mary.

Some of you may say, "Come on. Being scared by the monsters of urban legend is a child's rite of passage. Of the millions of children who enjoy these scary campfire stories, few go on to develop phobias, and even fewer blur the line between fact and fiction to the point of action." To those I say you have a point, but is childhood a time when anyone should be scared to the point where they don't feel safe, and perhaps aren't safe, among their peers?

At the very least, we need to engage in cooperative parenting dialogue and decide for ourselves where risk truly lies. As the mother of a 10-year-old who asked me for the Talking Angela app hours after I heard the rumor, and then asked for a Slender Man app days before the Slender Man tragedy, I'm not particularly confident that he is fully internet safe. What can we do to avoid these exposures, and if they happen, how should we handle them?

4 GKIS Tips to avoid scary content

  • Use Google Safe Search and YouTube Kids to filter Internet content.
  • Insist that screen use only happens in community areas, no bedrooms, no bathrooms, no closed doors.
  • Use device parental controls to limit content by age ratings and pay attention to video game ratings.
  • Do not allow social media apps before middle school.

4 GKIS Tips to overcome the damage from scary content

  • Validate your children’s feelings with compassion, but keep it light with a sense of humor.
  • Give extra hugs and reassure your children that they are safe and secure. Share that you don't believe in monsters and why.
  • Tell stories about what scared you when you were little and how you overcame it.
  • Teach why Bloody Mary seems to appear even though nothing is really there (a neurological effect of staring into a mirror in a dim, stimulus-deprived environment).

Remember that little ones have a hard time separating fact from fiction without our help. Be careful not to tease; compassion is where it's at. A special thank you to GKISser Abby for emailing me and asking for directive advice. Because of you, I added more to this article about what parents can do. For more information about how sex and violence can affect behavior, check out my GKIS article Sex and Violence in Video Games Change the Brain: What GKIS Parents Need to Know.

I’m the mom psychologist who will help you GetKidsInternetSafe.

Onward to More Awesome Parenting,

Tracy S. Bennett, Ph.D.
Mom, Clinical Psychologist, CSUCI Adjunct Faculty
GetKidsInternetSafe.com

To learn more specific information about the Talking Angela app, including the concern that it is too easy to toggle between the child/adult option, read more at http://www.snopes.com/computer/internet/angela.asp. Snopes is an excellent resource anytime you are concerned about rumor versus fact.