Sexual content on a social media platform with millions of young children should never be allowed, but lines become blurred when TikTok users find subversive ways to share fetish content with others. Behind seemingly innocent videos lie adults seeking to arouse one another, uncaring that it may be happening right in front of your child. Protect your child from inappropriate online content and prevent digital injury with our GKIS Screen Safety Toolkit.
What is a “Fetish” or “Kink”?
A fetish refers to sexual arousal resulting from objects or a specific body part that is not typically seen as sexual.[1] Fetish objects or body parts could include feet, hair, food, or even balloons. For those with a fetish, sex may be less pleasurable or even impossible without the presence of the fetish object.
In contrast, kink refers to unconventional or bizarre sexual activity with self or others.[2] While fetishes can become sexual disorders, kinks typically do not progress to that level. More specifically, Fetishistic Disorder occurs when a person’s fetish escalates to the point of being persistent and distressing.[3] To meet the criteria for this disorder, an individual must experience sexual urges that meet the definition of a fetish, those urges must cause distress or impairment in functioning, and the fetishized objects must not be limited to clothing used in cross-dressing or to devices designed for genital stimulation.[4]
Fetishistic Disorders are typically seen in males and emerge during puberty.[4] Fetishistic Disorder can make it hard for individuals to develop intimate relationships and can cause sexual dysfunction.[4] Studies have also found that fetishism is often correlated with other mental health issues, substance abuse, criminal justice involvement, and an increased risk of sexually transmitted infections.[4]
How are TikTok Users Posting This Content?
Most social media websites have content filters and bots that flag any content that can be considered inappropriate or of a sexual nature. However, most of the fetish content on TikTok does not contain nudity, instead appealing to fetishes that utilize implied sexual behavior.
Popular TikTok user Lena Rae (@lenarae.lh), who has over 230k followers, has created a collection of videos titled “Is This Fetish Content?” that identifies this type of content and calls out the users who post it.
In a video with almost 18 million views, Lena Rae reacts to a seemingly innocent video from user “putinnu” (a not-so-thinly veiled attempt at sexual innuendo “put it in you”). The video shows a woman in a wedding dress shoving a glass vase into a multi-tiered cake and then proceeding to pour multi-colored, runny frosting inside the vase. Lena Rae points out the vase is a phallic shape. She also comments on the consistency of the frosting and how the person is spilling it as they pour. The woman in the video explains that the frosting is going to run all over the cake, a reference that Lena Rae says is to appeal to those with a “sploshing” fetish. Lena Rae points out that the actions in the video are purposely repetitive and aim to appeal to a fetishistic audience.[5]
While these videos may not be expressly sexual like other content on TikTok, the hidden fetish content videos are flooded with comments from adults who are taking pleasure in the content being suggested to them. This creates a dangerous combination of adults with sexual fetishes consuming content that is “safe” enough to also show up on the For You Page of young kids.
How Viewing This Content Can Affect Kids
Viewing sexual content at any age can be harmful to one’s mental health, but when viewed during a time of development, it can have lasting effects into adulthood.
Experts have found that young children who view pornographic content frequently become isolated, withdrawn, anxious, or depressed.[6] Consumption of online sexual content at a young age can lead to premature sexual experimentation as well as other high-risk behaviors, dating violence, cannabis abuse, or the development of harmful fetishes.[7]
What Parents Can Do
Open communication about sexual content can save your child from digital injury and stunted development. Some experts even recommend talking to your child as young as 9 years old about the difference between “good” and “bad” pictures.[8] Experts believe that in doing so, children will be better able to identify groomers or online predators and be less susceptible to them.
There are also various protective factors that can prevent early exposure to sexual content. Creating an environment where a child feels connected to their parents and family can help them feel more comfortable communicating about the content they consume. To help facilitate difficult conversations about online content, try out our free GKIS Connected Family Screen Agreement. Fostering healthy conversations and helping your child build a positive self-perception can lead kids to seek validation from family and peers rather than from online strangers.
Through their quickly consumable content, TikTok, Instagram, and Twitter have made it increasingly easy to connect with people who share your interests. While this helps people find others like them, it can become dangerous when those interests are morbid. All over social media, you can find fan accounts and fan edits for the world’s deadliest serial killers. By connecting true-crime fans all over the world, social media has fueled an emergence of teens with a cult-like obsession with killers like Dahmer, Bundy, and Manson. Today’s article covers social media’s obsession with serial killers and how you can keep your child safe from digital injury with our GKIS Screen Safety Toolkit.
Fandoms and The Rise of the “Stan”
A fandom is a group built around the shared interest or enjoyment of something in popular culture.[1] Since before the creation of the internet, people have gathered to meet and obsess over their common interests. The internet has simply made it easier to do so.
Fandoms provide a space for people to be themselves without judgment, which can lead to higher levels of self-esteem.[2] Those who are particularly obsessed are called “Stans.” A Stan is a cross between a stalker and a fan, someone whose fandom behavior reaches the point of excess.[3]
Typically online, you can expect to see fandoms for pop culture groups like movies, TV shows, and musicians. However, with the rise in true crime popularity, a new subgroup has formed of “Stans” with a particular interest in serial killers and their victims. These Stans continually post videos of serial killer interviews edited to music, create fan accounts, and even write serial killer self-insert fanfiction where they are the victims.[4]
Social Media’s Role in Obsession
Social media thrives on content that can get lots of views and produce lots of likes, meaning that the more scandalous and salacious content is, the more likely it is to do well. The notifications from social media likes and comments trigger the reward center of our brain, releasing dopamine and making us feel good all over.[5] When users post content that is related to their fandoms, they get a rush of dopamine and that connection between fandom content and happiness causes them to post more and interact with the content more.
Social media has also created a world where content is readily available for consumption, meaning that people can see posts specifically tailored to their interests 24/7. This allows people to go from fans to superfans, spending their waking moments scouring the internet for posts related to their fandom. One Quora user shared their experience as an obsessive fan: “I’ve been addicted to a fandom for 7 years, and I haven’t been able to stop thinking about it for that long. As per my personal experience, I got sucked into several fandoms due to over-engaging in social media. I over-identified with the idols and associated my own ego with that of their public image.”[6] This idolization of celebrities creates dangerous parasocial relationships, made even more dangerous when one’s idol is a serial killer. To learn more about parasocial relationships, check out the GKIS article, “The Dangers of Online Parasocial Celebrity Relationships”.
When Does Harmless Become Harmful?
It’s easy to brush off fandom behavior as nothing more than a phase one will grow out of. But when the obsession turns into something more, it can become dangerous. Cody Ackland was a 24-year-old who grew up obsessed with Ted Bundy, an interest that no one paid much attention to until he attacked and murdered 18-year-old Bobbi Anne McLeod. Just hours before attacking McLeod, Ackland had searched for “Ted Bundy dead victim’s bodies” and “Ukrainian serial killer bodies” on the internet.[7]
Teens have become more and more desensitized to serial killers and true crime content, going so far as to make fan accounts as part of a big internet joke. When 23-year-old Peter Manfredonia was on the run from the police following a double murder he committed, teens on TikTok and Instagram began making fan accounts and posting meme comments to the killer’s personal Instagram page.[8] While the people running these accounts chalk up their actions to being a big joke, there is a large community of people who genuinely run fan accounts for notable serial killers.
Reddit user IkariMonster shared screenshots of several Twitter accounts to a subreddit, stating, “These teenagers worship and treat serial killers and school shooters like e-boys.”[9] In the screenshots, you can see several fan accounts treating serial killers Ted Bundy and Jeffrey Dahmer and Columbine shooters Eric Harris and Dylan Klebold as though they were celebrities. In one post, a teenage girl shares a selfie next to her bedroom wall, which is covered in photos of Dahmer, with the caption, “I just thought I’d share cause I think my wall looks pretty [face with hearts emoji].” These accounts and posts are just one example of the cult-like obsessive content and behavior that exists across multiple social media platforms.
The victims of these killers were people with friends and family, and the pain they endured is absolutely horrendous. There is no reason that serial killers and mass murderers should be praised or celebrated for their actions. The creation of fan accounts and fandom content perpetuates harm against the families of the victims and serves as a constant reminder of the pain they suffered. GKIS does not endorse this behavior. We are mortified by it and think it is destructive for kids and teens to be so callous and to celebrate violence in this way.
What Parents Can Do
Installing management tools for social media can help you monitor your child’s internet behavior. If you would like help with this process, check out our GKIS Screen Safety Toolkit, made to empower parents with smart tech tools to filter, monitor, and manage online behavior.
Co-view the content your child interacts with; you can scroll together to choose what content they view and enjoy.
Make it known from the beginning what type of content is acceptable for your child to view. We can help facilitate this healthy conversation with our Connected Family Screen Agreement, which helps you work with your child to create a collaborative, living document.
Fast, entertaining content seems to be the only way to hold the attention of today’s children and teens. This becomes an issue when virality matters more than the content being put out or the audience receiving it. TikTok users have realized that they can quickly create a viral video by taking Reddit stories and resharing them over clips from popular video games like Minecraft, Roblox, and Subway Surfers. Today’s article covers why this content is dangerous and how you can keep your child safe from digital injury with our Screen Safety Toolkit.
The Popularity of Minecraft, Roblox, and Subway Surfers
Minecraft
Between 2016 and 2021, Minecraft’s user base rose from 40 million to 93 million, with a peak of 131 million users in 2020.[1] Surveys have found that up to 54% of boys and 46% of girls aged 3 to 12 play Minecraft.[2] The popularity of Minecraft is undeniable. Take a look down any aisle of children’s clothing and you will find item after item filled with references to the game. COVID-19 only increased the popularity of the online game, which is accessible on most gaming consoles, with users increasing by 14 million from 2020 to 2021.[2]
Roblox
Roblox is another popular children’s game that capitalized on the pandemic and has gained users over the past few years. Between 2020 and 2021, the platform added more than 146 million users to its servers. In 2016, Roblox had 30+ million users. Today it has more than 202 million monthly active users.[3]
While the average Minecraft user is 24 years old, only 14% of Roblox users are over 25. 67% of Roblox users are under 16, and 54.86% are under 13.[3] The popularity of Roblox is reflected on TikTok through the large number of Roblox fan accounts owned by young users. In fact, many audio trends on the platform are credited to edits by Roblox users.
Subway Surfers
Though it has a smaller fan base than Minecraft and Roblox, Subway Surfers is the most downloaded and most-played mobile game in the app store. Between 2021 and 2022, Subway Surfers more than doubled its downloads, reaching 5.43 million daily users.[4] These numbers are especially impressive given that, unlike Minecraft and Roblox, Subway Surfers is only available on mobile smartphones, not on any other gaming platform. While there is no available data on the average age of Subway Surfers players, the game is recommended for ages 9 and up.
All three of these games are different in their gameplay but similar in their popularity and appeal to younger audiences. Videos made with clips from these games are sure to catch the attention of a younger audience who is trying to view content related to their interests.
Harmful Reddit Content and How Creators Use Kids’ Games
Reddit claims on its website to be “a network of communities where people can dive into their interests, hobbies, and passions.”[5] Users post on so-called “subreddits” dedicated to topics from the general to much more specific niches.
Today there are over 2.8 million subreddits, and Reddit has 52 million daily users.[6] Of its monthly users, 79% are between the ages of 18 and 34, and almost 64% are men.[7] When children encounter posts from Reddit, they are often viewing material made for and by adult men, which can lead to some very inappropriate content.
A simple search for “Reddit stories” on TikTok will show you video after video of Reddit threads overlaid with a video of someone playing a kid’s game and read by a voiceover. The threads contain content from various subreddits like “/AmITheAsshole” but most contain inappropriate “NSFW” (Not Safe for Work) content. A scroll down the search shows video after video of content with titles like, “My mom keeps having sex with my boyfriend…” or “What do you tell your partner when you’re horny?” or “What did you not know about sex until you lost your virginity?” The last question is from a Subway Surfers video with 1.6 million likes and 13.5 million views from a user with similar content and over 55.4 thousand followers.
The creators of these videos use the popularity of these kids’ games to bring more viewers to their content and boost their virality. In doing so, they are promoting sex, adult situations, and overall unsafe content. Each of these videos is followed by hundreds or thousands of comments from young kids and teens adding their own answers to the question posed. A quick look at commenters’ profiles revealed that half were between the ages of 13 and 17, and each was commenting on their own sexual experiences to millions of viewers.
What Parents Can Do
Set up content preferences and block video keywords, or put the account on restricted mode in the settings and privacy section of TikTok.
View an account’s watch and comment history and see what content your child is consuming and interacting with.
For younger children, consider waiting before allowing them to have their own TikTok account.
When they do adopt a TikTok account, scroll together to choose what content they view and influence the algorithm that will offer content automatically.
As your child gets older, work with them to establish what they are doing and looking at on social media. Preparing them for the possibility of encountering adult content can protect them from stumbling onto it unprepared and possibly suffering a digital injury.
If you fear your child may be watching inappropriate Reddit content without your knowledge and permission, check out our GKIS Screen Safety Toolkit. This toolkit empowers parents with smart tech tools to filter, monitor, and manage online behavior.
To protect your child, prevent digital injury, and prepare them for social media use check out our GKIS Social Media Readiness Course.
Thanks to CSUCI intern Katherine Carroll for researching how TikTok users use Reddit content and kids’ games for views.
I’m the mom psychologist who will help you GetKidsInternetSafe.
How would you feel if you found out that your child is going to extreme and dangerous lengths to change their appearance? What if your child is putting themselves in potential harm to fit beauty standards set by beauty filters? Beauty filters can be a fun way to transform selfies, but they have failed to embrace the beauty of all skin tones, especially dark ones. This has led to the rise of colorism and extreme self-esteem issues. To help you recognize the dangers of social media on self-esteem, I interviewed Dr. Chavarria, CSUCI Assistant Professor of Sociology, to offer insight on how colorism affects minority communities and how to prevent it. If you are concerned for your child’s mental and physical well-being when they interact on social media, check out our Social Media Readiness Training for tweens and teens. Our guide prepares your children for safer screen use and prevents psychological illness with our expert emotional wellness tools. Today’s GKIS article shares the story of a young girl negatively affected by beauty filters and tips you can take to help protect your kids from colorism.
What are beauty filters?
Beauty filters are social media features that beautify users by smoothing away perceived imperfections and flaws, creating a modified version of the self. Specific modifications can be anything, but the most popular filters alter the size of facial features, change eye color, and add effects like makeup or long eyelashes.[1]
The Negative Effects of Filters
Low Self-Esteem
Although filters can be fun, they can also damage self-esteem. Research demonstrates that filter use can lower self-esteem because users are more likely to hyper-focus on the features they dislike. This can lead to frequently comparing one’s real looks with filtered looks, shifting one’s beauty “ideal” and highlighting (even fixating on) the failure to live up to that ideal. Not being able to achieve the same look without filters can make someone feel “less than,” or that they will always fall below beauty standards. For others, it may motivate them to find a way to change their appearance to better match the beauty standards set by social media, regardless of the risks those changes pose.[2]
The Rise of Colorism
It has been noted by many social media users that beautifying filters usually have a lightening or bleaching effect on the skin. In fact, according to skin color expert Ronald Hall, this effect is not an accident. He explains that it is a way to maintain and conform to historically Eurocentric beauty standards.
Beauty filters are promoting a rise in colorism. Colorism refers to prejudices or discrimination an individual may experience for having a darker skin tone. This phenomenon usually occurs among one’s own ethnic or racial group.[3,4]
A Young Teen Takes Drastic Measures to Change Appearance
Lise, a young teenager, shared her struggles with colorism. Her experience included being bullied for her darker skin tone. The bullying not only came from white girls at school but, to her surprise, also from those who looked similar to her in her same ethnic or racial group.
Seeing pictures of light-skinned women receive lots of likes and positive comments online also confirmed to Lise that she did not meet society’s standards of beauty, bringing her self-esteem down. To try to lighten her skin, Lise began to scrub her mom’s bleaching cream into her skin with a copper wire brush. Even without abrasion injuries, bleaching products can pose health risks.[4]
If you are concerned that your child is suffering from a digital injury like mood and anxiety disorders triggered by compare-and-despair, check out our GKIS Online Safety Red Flags For Parents. With this guide, you’ll learn the behavioral red flags to look out for that may signal your child is suffering from digital injury.
Colorism Affects Minority Communities on a Larger Scale
Colorism is an issue that not only affects self-esteem, but it has also been a problem for minority communities on a larger scale. Dr. Chavarria, CSUCI Assistant Professor of Sociology, explained in our interview that the emergence of colorism, particularly in the Latino society, has been a consequence of conquest and colonization of indigenous communities.
Colonizers constructed these ideas about indigenous communities so they would be perceived as inferior, uncivilized, lacking knowledge, and closer to evil. Whiteness or light skin, in contrast, has historically been constructed as better, good, and even closer to God.
This construction caused the devaluation of indigenous identity features such as brown skin, indigenous languages, and ethnic practices, leading to the destruction of indigenous communities. Many who managed to survive and succeed in the majority culture often did so by blending in and learning to assimilate. Ethnic roots were lost over generations, and minority communities lost a sense of pride in how they look. Dr. Chavarria reported that research has demonstrated that individuals who align with beauty standards often get more career opportunities and higher pay.
How to Help Stop Colorism
Start with Family
Colorism needs to be stopped. A first step is addressing how colorism starts within the family. Dr. Chavarria stated that, although colorism often starts with the family, grandparents and parents are often not even aware they are engaging in it. They too have been socialized to believe these ideas about their indigenous roots and characteristics. Therefore, educating family members about what colorism is and how it can cause generational trauma can be the first important step to change.
As a Chicana who has also experienced colorism within my community and family, I recognize that change can be hard. Sometimes I didn’t know how to tell my grandmother that the “advice” she gave me was conforming to Eurocentric standards and colorism, and that it did more damage than help. For example, when family members told me that I should find a light-skinned man with colored eyes so my future children can inherit those features, they seemed to be telling me that, as a brown girl, I did not possess “beautiful” features.
Follow Body-Positive Campaigns
Dr. Chavarria also highly recommends that social media users check out campaigns directed at making positive change. One campaign she tracks is Cultural Survival on Facebook, an international organization that engages with indigenous communities across the globe. They address important issues like colorism by protecting indigenous women and challenging Eurocentric notions of beauty.
Practice Self-Awareness
If you find yourself contributing to colorism with comments and negative self-appraisals, challenge yourself for positive change.
Speak Out
As you become more self-aware, speak out to friends and post positive pro-beauty messages that demonstrate that beauty comes in many shades and colors. We must consistently challenge historical ideas to break biases and end discrimination. It starts with us. Let’s get started!
Thanks to Dr. Chavarria for offering expert insight on colorism and how to prevent it. Thanks also to CSUCI intern Ashley Salazar for researching and co-authoring this article. Colorism is on the rise due to beauty filters on social media. Check out our GKIS courses to learn how to have easier dialogues with your children and protect them from digital injury.
I’m the mom psychologist who will help you GetKidsInternetSafe.
In 2019, YouTube was fined 170 million dollars for illegally collecting children’s data. In this article, we’ll cover how YouTube broke a law designed to protect children online, what the company did to fix it, and the gap that still puts kids at risk.
To help protect your kids from inappropriate content on the internet, check out our Screen Safety Essential Course. This program offers access to weekly parent and family-oriented coaching videos that will help you to create safer screen home environments and foster open communication all while connecting and having fun as a family. Dr. Bennett’s coaching helps parents make more informed decisions about internet safety and educates families so they can use good judgment when encountering risks online.
What is COPPA?
The Children’s Online Privacy Protection Act (COPPA) requires websites to get parents’ permission before collecting identifying data (like a kid’s name or address) or cookies from children under 13. A cookie is a small data packet sent from a website to a computer; the computer returns the packet to the website on later requests. These data packets are a way for websites to track a user and record their actions on the site. Any company caught violating COPPA may be fined up to $42,530 per violation.
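Conceptually, the cookie exchange COPPA regulates is simple. The sketch below uses Python’s standard http.cookies module to simulate it; the cookie name and value here are invented for illustration, not taken from any real site:

```python
from http.cookies import SimpleCookie

# Server side: tag the visitor with a tracking ID.
# "visitor_id" and "abc123" are hypothetical values for illustration.
server_cookie = SimpleCookie()
server_cookie["visitor_id"] = "abc123"
set_cookie_header = server_cookie["visitor_id"].OutputString()
print("Server sends:  Set-Cookie:", set_cookie_header)

# Browser side: on every later request, the browser echoes the cookie
# back, which is what lets the site recognize and track the same user.
browser_cookie = SimpleCookie()
browser_cookie.load(set_cookie_header)
print("Browser sends: Cookie:", browser_cookie["visitor_id"].OutputString())
```

Even an opaque ID like this lets a site build a record of a child’s viewing habits, which is why COPPA requires parental consent before it is set.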
COPPA applies to any website that is aimed at children or has an audience that can include children such as:
PBS Kids
Sesame Street
Nickelodeon
Cartoon Network
How did YouTube break the law?
In 2015, YouTube created a secondary website and app called YouTube Kids, dedicated to content for children ages 12 and under. YouTube makes the bulk of its revenue by selling ads and gathering customer data. Customer data is valuable to marketers because it helps them better target advertisements. YouTube gathered data from child viewers using cookies without parental permission. This was a violation of COPPA. As a result, YouTube received a fine of 170 million dollars.
YouTube marketed itself to advertisers based on its popularity with children and made millions of dollars in the resulting revenue. This led to a surge in kid-oriented content creators who made quick, easy-to-produce videos to capitalize on the profitability of these new advertisers. For example, toy unboxing videos became popular because they were easy to produce and generated a lot of views. These content creators could also be held liable under COPPA because they profited from the data collected on their child audiences.
What has YouTube changed?
The good news is that YouTube no longer collects your children’s personal identifiers and will not allow advertisements that attempt to collect them either. YouTube, along with the FTC, has also cracked down on content creators who intentionally abused the ad revenue system by mass-producing content while YouTube was still collecting kids’ data. Those channels were reported by YouTube and reviewed by the FTC, and channels found in violation were then fined for their own COPPA violations.
YouTube also has guidelines limiting what can be advertised to children. For example, YouTube does not allow any food or beverage advertising to children. YouTube has also added content filters meant to catch kid-oriented content and ensure that data-collecting advertisements can’t show up on those videos.
But kids are still viewing inappropriate content
The bad news is that YouTube’s advertisement system isn’t perfect. YouTube may no longer be able to target advertisements at your child specifically, but it can still target children through videos marked as made for kids on its main site or through its secondary site, YouTube Kids. YouTube has extra guidelines for kid-oriented advertisements. However, YouTube does not regulate video content the same way it regulates advertisements. For example, YouTube won’t allow a thirty-second ad about Kool-Aid on its platform if it’s aimed at kids, but Kool-Aid can make a channel and post videos that are essentially advertisements dressed up as entertainment for children. If you’d like to learn more about how advertising affects your children, GKIS has an article detailing just that linked here.
What does this mean for your child on YouTube?
YouTube has put better practices into place since the COPPA fine. That doesn’t mean its business model is any different. YouTube is still a website that makes the majority of its money from advertisements. The site may not be collecting your child’s data, but their attention is still a commodity being sold. Content on YouTube can be fun and even educational for children, but you have to be careful about what your kids are watching.
What can you do to protect your kids on YouTube?
Check what your kids are watching
If you check in on what your child is watching every few videos, you can be sure they haven’t slipped into watching advertisements dressed up as videos.
Familiarize yourself with your child’s favorite creators
Check a couple of their videos and make sure their content is something you want your child to watch. This will also let you be sure the creator isn’t advertising anything to your children in their videos.
GKIS How to Spot Marketing Supplement
Here at GKIS, our How to Spot Marketing supplement teaches your kids the strategies marketers use and helps them identify when a video is really an advertisement in disguise.
GKIS Social Media Readiness Course
Dr. Bennett’s Social Media Readiness Course teaches your kids how to be safe online and how to recognize the risks found on social media sites and in gaming.
Thanks to CSUCI intern Jason T. Stewart for researching YouTube’s COPPA fine and co-authoring this article.
I’m the mom psychologist who will help you GetKidsInternetSafe.
Persuasion plays a big role in our interactions with the world and the people around us. We persuade our friends to watch our favorite movie with us and are persuaded to elect a new president. Some manipulations are transparent. Others are carefully engineered psychological tricks designed to make easy money. With the rise of technology, these methods of manipulation and persuasion have become common online. From advertisements engineered to get you to buy products to sign-up flows that keep you in the dark about what you’re agreeing to, these techniques are known as dark patterns.
What are dark patterns?
Dark patterns are persuasive techniques companies use to trick people into buying things or signing up for services.
The term dark pattern was coined by Harry Brignull, a cognitive scientist.[1] He describes dark patterns as “a user interface that has been carefully crafted to trick users into doing things. They are not mistakes. They are carefully crafted with a solid understanding of human psychology, and they do not have the user’s interest in mind.”[2]
Why do companies use dark patterns?
The Internet is a business platform. Websites are designed to capture your attention. To stay competitive, companies must have offers that set them apart, like the end-cap items at the grocery store.[3]
Intentional product placement forces customers to view more expensive merchandise on their way to grab their everyday purchases. Websites use similar methods to make sure users see attractive products.
Dark patterns come in many different styles, all with the same intention of keeping the misleading strategy somewhat hidden. Sometimes, dark patterns can be illegal. Brignull says, “Many designers, and possibly even most, hate using dark patterns in their work, but they are forced to implement them by managers. These managers only care about one or two individual metrics, not the experience of the site or brand as a whole. So, a manager who is tasked with increasing the number of people who sign up for a company’s newsletter might order a website designer to use a dark pattern to capture email addresses, because it’s an easy short-term solution that doesn’t require any effort.”[4]
Types of Dark Patterns from Dr. Brignull’s website, darkpatterns.org
Bait & Switch
The bait and switch technique refers to advertising an item at a ‘too good to be true’ price that is not actually in stock. Once the customer’s attention is grabbed, the chances of their purchasing a higher-priced alternative go up.
Disguised Ad
Disguised ads are advertisements designed to appear like the content the user was searching for, so they’ll mistakenly click on them. They are typically presented to people during informational searches.
Forced Continuity
The forced continuity dark pattern is used when a company offers a free trial period. The company holds the customer responsible for unsubscribing from the free trial period, otherwise charging them for their subscription.
Obstruction
Obstruction refers to the strategy of making a particular task more difficult than it needs to be. The intent is to frustrate or confuse the customer so they give up before completing the task. A common obstruction is hiding the unsubscribe link or instructions, so the customer gives up and keeps on paying their monthly subscription cost.
Friend Spam
Friend spam refers to the devious act of asking for access to your social media friends list and then spamming those friends with ads. Users agree because they’re rushing through the signup process or are under the impression that their friends list will be used for a desirable outcome, like finding more friends. LinkedIn was sued for $13 million in 2015 for using this dark pattern.
Hidden Costs
Hidden costs typically appear in the last step of the checkout process, when the company adds unexpected charges, like delivery or shipping fees. Because the customer has already invested enthusiasm and time in the purchase, they are less likely to bail on the transaction at the end of the process.
Price Comparison Prevention
This dark pattern is in play when a retailer makes it hard for the user to compare the price of one item with another, preventing them from making an informed buying decision.
Privacy Zuckering
Privacy Zuckering occurs when the customer is tricked into publicly sharing more information about themselves than they intended. This dark pattern was named after Facebook CEO Mark Zuckerberg because, in the company’s early years, Facebook made it difficult for users to control their privacy settings and easy to overshare by mistake.
Today, privacy Zuckering works more deviously, using data brokers to collect personal information that is sold to other companies.[8] These practices were described in Facebook’s lengthy terms and conditions, but most users won’t read them because of the overly burdensome legalese.[9]
Roach Motel
This dark pattern technique makes it easy for you to get into a certain situation but hard for you to get out of it.
An example is when a customer wants to delete their profile or content on social media but is punished with unwanted consequences for doing so (like losing all of their photos or contacts as a penalty for deletion).
Sneak into the Basket
Sneak into the basket happens when the customer attempts to purchase something, but somewhere during their purchasing journey, an additional item appears in their basket.
Trick Question
A trick question appears to ask one thing but, if read carefully, asks something else entirely. A common example is a double negative like, “Uncheck this box if you do not want to stop receiving emails.”
Fear of Missing Out
This dark pattern technique is made to look like the item you’re thinking of purchasing is in high demand, pressuring you to make the decision quickly.[10]
Examples include a notice of how many other people are currently viewing the same item. The marketing technique of scarcity will alert you, “Only 3 left!” Hotel bookings, airline flights, and merchandise listings on Amazon use this persuasive tactic to trigger urgency and thus increase sales.
Nagging/Forced Action
A popup appears that requires action before you can move forward with your online task.
Sneaking
Sneaking refers to hiding, disguising, or delaying relevant information to force uninformed decisions.
Intentional Misdirection
Intentional misdirection is a persuasive technique that promises a free or inexpensive item, only to eventually inform you that the offer doesn’t apply to you after all and that you must purchase a different solution.
An example of this dark pattern was used by the company TurboTax. TurboTax offered people the option to file their taxes online for free. But once in the sales funnel, customers discovered that the free option only applied to people needing to file a simple W-2 form (a small minority of taxpayers). Once customers purchased the paid option that applied to them, they would discover extra $60 to $200 charges for any forms dealing with loans and mortgages.[5] What looked like a free service became an unexpected paid service.
Our GKIS favorite podcast, Reply All, covered dark patterns in episode #144. They reported that TurboTax’s second option, known as TurboTax Free File (also called Freedom Edition), was not advertised, and the link could not be found anywhere on the TurboTax website. Further, TurboTax Free File was only free for people whose adjusted gross annual income was $66,000 or less.[6] When the producers searched online for it, Google offered two options: an orange button asking if you qualify and a blue button saying “start for free.” Most people would choose the blue button offering free tax filing. But choosing the blue button takes you back to the regular TurboTax free edition, where only simple forms are free. TurboTax Free File was only accessible through a link from IRS.gov.[7]
How to Avoid Being Victimized by Dark Patterns
In Dr. Bennett’s book, Screen Time in the Mean Time, she writes, “Due to screen technology, this generation of children are more marketed to than any other children in history. Why? Because there’s BIG money in child and teen products. If you can’t see the product, you are the product.” GetKidsInternetSafe courses help parents and kids become educated consumers.
Thanks to Andrew Weissmann for his research and help with writing this article.
I’m the mom psychologist who will help you GetKidsInternetSafe.