Snapchat receives approximately 10,000 sextortion reports monthly, with internal documents revealing this is "likely a small fraction" of actual cases.
Introduction
With 74% of teenagers using Snapchat, it's one of the most influential platforms shaping young people's social experiences and mental health. But what many parents don't realize is that behind Snapchat's playful filters and disappearing messages lies a business model built on maximizing teen engagement—regardless of the consequences.
This article examines the documented risks of Snapchat based on internal company communications revealed through 600 lawsuits analyzed by researcher Jonathan Haidt, and evaluates whether Snapchat's parental controls can effectively mitigate these dangers.
How Snapchat Works
Core Features
Disappearing Messages
Snapchat's defining feature creates an illusion that content vanishes after viewing. In reality, recipients can screenshot or screen-record anything before it disappears. This false sense of security encourages risky sharing while making parental oversight nearly impossible.
Streaks
Users must exchange snaps daily to maintain their "streak." Breaking a streak causes visible loss of status, deliberately creating anxiety and daily usage pressure.
Stories
Content visible to friends for 24 hours, exploiting FOMO (fear of missing out).
Spotlight
Algorithm-curated viral content similar to TikTok, designed to maximize engagement.
Quick Add
A feature that suggests "friends of friends," facilitating connections with strangers.
Snap Map
Can share a user's precise location in real time with their connections.
Business Model
Snapchat's business model, like most social platforms, relies on advertising revenue. The more time users spend on the app, the more ads they see, and the more money Snapchat makes. This creates a fundamental conflict of interest: features that would genuinely protect children would inevitably reduce engagement metrics and hurt revenue.
In a 2017 internal email revealed through lawsuits, a Snapchat employee wrote about the streaks feature: "Wow, we should add more addicting features like this."
This wasn't an offhand remark—it was a deliberate acknowledgment of their engagement-maximizing strategy.
The Documented Risks
The Sextortion Crisis
Sextortion—when someone manipulates a child into sending compromising images, then blackmails them—has reached crisis levels on Snapchat. According to internal documents, Snapchat receives approximately 10,000 sextortion reports monthly.
A frustrated Snapchat employee wrote in an internal communication: "God I'm so pissed that we're over-run by this sextortion shit right now. We've twiddled our thumbs and wrung our hands all f***ing year."
More alarming still, another internal email admitted these reports "likely represent a small fraction of this abuse" since most victims never report their experiences due to shame and fear.
Law enforcement data shows that, in cases where the platform was identified, half of all recorded child sexual abuse image offenses occurred on Snapchat. Most sextortion victims are boys aged 14-17, and more than 20 documented suicides have been linked to these cases.
Addiction By Design
Snapchat's features aren't accidentally engaging—they're engineered for addiction:
- Streaks create artificial commitment and daily usage pressure
- Stories exploit FOMO (fear of missing out)
- Trophies and scoring gamify social interaction
- Time-limited content creates urgency to check regularly
This design pattern is no accident. When internal documents discussing streaks came to light through lawsuits, employees explicitly acknowledged the addictive nature of these features; the "more addicting features" email quoted earlier is one example.
The Mental Health Connection
Since 2007, youth suicide rates have more than tripled—timing that correlates directly with social media adoption. For people aged 10-24, suicide is now the second leading cause of death in the United States.
"Comparison is the thief of joy" – this principle explains why social media is so harmful. Snapchat's beauty filters and social scoring (via streaks and friend counts) create constant comparison loops. Teens measure themselves against manipulated, filtered versions of peers, creating a reality in which they can never measure up.
Studies show this continual social comparison is directly linked to depression, anxiety, and suicidal ideation—especially during key developmental stages when self-image is being formed.
Corporate Knowledge
The most disturbing aspect of this story is Snapchat's awareness of these problems. When faced with suggestions to better identify predator grooming, internal documents show Snapchat complained this would create "disproportionate admin costs."
They acknowledged adults were targeting children for "deeply pernicious and dangerous" conduct but didn't want to "strike fear" among young users—which would potentially reduce engagement.
In 600 lawsuits analyzed by researcher Jonathan Haidt, a pattern emerged: "Company insiders were aware of multiple widespread and serious harms, and in many cases did not act promptly."
Snapchat's Family Center: A Critical Analysis
In response to mounting concerns, Snapchat created "Family Center," a set of parental controls. But how effective are these tools at addressing the core risks?
What Parents Can See vs. What They Can't
Snapchat Family Center Visibility
Parents Can See
- Who your teen chatted with in the last 7 days
- Their friends list and new friend additions
- Group chat memberships
- Option to report accounts
Parents Cannot See
- Any message content
- Photos or videos shared
- When conversations happen
- Deleted conversations older than 7 days
- Any content exchanged in real time
The reality: Parental controls only show surface data. All critical content remains invisible to parents.
Content Restriction Reality
Family Center allows parents to toggle "Restrict Sensitive Content," but this control:
- Only applies to Stories and Spotlight
- Does not restrict direct messages
- Does not filter search results
- Does not block content shared by friends
- Does not apply to subscribed content
This means potentially harmful content reaches your teen through multiple pathways even with "restrictions" on.
My AI Control Reality
Parents can disable Snapchat's AI chatbot from responding to teens, but teens can still send messages to My AI—they simply won't get responses. If your concern is inappropriate conversations with AI, this control doesn't prevent the attempt—it only prevents the reply.
Location Tracking Limitations
Perhaps most concerning is that location sharing requires teen consent and can be revoked at any time. This is not a reliable monitoring solution as it depends entirely on your teen's ongoing cooperation.
The Fundamental Conflict
The reality is that Snapchat's business model fundamentally conflicts with robust parental controls. True protection would mean less time spent, which hurts their revenue.
The parental controls offer visibility, not safety. They show you the outline of your teen's activity, not the actual content.
Effective Protection Strategies
Given the significant gaps in Snapchat's parental controls, these additional measures are crucial:
1. Disable Camera Access at the Device Level
The single most effective protection against sextortion is disabling Snapchat's access to the camera in your teen's device settings (under app permissions on iOS and Android), so compromising images cannot be captured and sent through the app in the first place.
2. Implement Network-Level Filtering
These solutions can block Snapchat entirely:
- Circle Home Plus: Filters at the router level with time limits
- Gryphon: Advanced parental controls at the network level
- NextDNS: DNS-level filtering that blocks Snapchat connections
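If you go the DNS-filtering route, it is worth confirming that the block actually takes effect on your home network. The short Python sketch below is a minimal check, not an official tool from any of these vendors: it tries to resolve a couple of Snapchat domains and reports whether they are reachable. The domain list is illustrative and may need adjusting for your setup.

```python
# Minimal check that a DNS-level filter (NextDNS, Pi-hole, router rules, etc.)
# is blocking Snapchat on the current network.
# NOTE: the domain list is illustrative, not a complete inventory of Snapchat hosts.
import socket

SNAPCHAT_DOMAINS = [
    "snapchat.com",
    "www.snapchat.com",
    # Add any other Snapchat-related domains your filter lists (e.g. CDN hosts).
]

# Addresses commonly returned by filters that "sinkhole" blocked domains.
SINKHOLE_ADDRESSES = {"0.0.0.0", "127.0.0.1"}

def is_blocked(domain: str) -> bool:
    """Return True if the domain fails to resolve or resolves to a sinkhole address."""
    try:
        address = socket.gethostbyname(domain)
    except socket.gaierror:
        return True  # No DNS answer at all counts as blocked.
    return address in SINKHOLE_ADDRESSES

if __name__ == "__main__":
    for domain in SNAPCHAT_DOMAINS:
        status = "blocked" if is_blocked(domain) else "REACHABLE (filter not working)"
        print(f"{domain}: {status}")
```

Run it from a device connected to your home Wi-Fi; if any domain shows as reachable, the filter rules need adjusting. Keep in mind that DNS filtering only works while the device is on your network and not using cellular data or a VPN.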
3. Device-Level Controls
Both iOS and Android provide robust parental controls:
iOS Screen Time:
- Set app limits specifically for Snapchat
- Block app installation/deletion
- Restrict mature content
Android Family Link:
- Set time limits and bedtime schedules
- Approve/deny app downloads
- Lock device remotely
4. Create a Safety Agreement
Download our Technology Safety Agreement Template
Key elements should include:
- No communications with unknown people
- No sharing personal information or photos
- Regular reviews of friends list
- Clear consequences for violations
Addressing the Peer Pressure Reality
"But all their friends have it." This is perhaps the most powerful challenge parents face.
Let me ask you this: If all your child's friends were drinking alcohol or smoking cigarettes, would you provide those substances to keep your child from feeling left out?
Social media platforms like Snapchat are engineered to be addictive in the same way as cigarettes—but with unrestricted access to children. The designers deliberately exploit brain chemistry for profit.
The difference is we've recognized the dangers of cigarettes and alcohol for minors, creating age restrictions and social norms against their use. We need to approach addictive technology with the same protective mindset.
Practical Approaches:
- Connect with other parents to establish shared boundaries
- Highlight unique benefits of not being on the platform (privacy, mental wellbeing, etc.)
- Find alternative ways to maintain social connections
- Remind teens that true friends will connect outside of any specific platform
Warning Signs to Watch For
If your teen uses Snapchat, be alert for these concerning behaviors:
- Secretive behavior around phone
- Hiding screen when you approach
- Emotional changes after using app
- Unusual gift card purchases
- Late night device usage
- Withdrawal from family or other activities
- Anxiety when unable to access the app
Resources and Downloadable Guides
Family Internet Safety Agreement
A customizable agreement template to establish clear boundaries for Snapchat and other social media use.
Download
Online Predator Warning Signs
Learn to identify potential warning signs of sextortion, online predators, and other digital dangers.
Download
Safe Chat Conversation Starters
Helpful conversation prompts to discuss digital safety with your kids in a non-threatening way.
Download
Conclusion
Ask yourself: If you knew a product had 10,000 monthly reports of harm to children, with employees internally admitting the real number is much higher, would you let your child use it?
The choice is yours. If you do allow Snapchat, combine Family Center with the additional protections we've discussed.
But remember: the most effective protection remains the simplest—either delay access until your child is older, or if allowed, implement robust external controls that go far beyond what Snapchat's own tools offer.
You wouldn't give your child keys to a car without proper training and rules. The same principle applies to powerful social tools designed to maximize engagement regardless of consequences.
Sources:
- Haidt, J., & Rausch, Z. (2024). "Snapchat is harming children at an industrial scale." After Babel Substack
- New Mexico Attorney General's lawsuit against Snap, Inc. (2023)
- NSPCC analysis of Ofcom data (2023)
- Centers for Disease Control and Prevention. (2023). Web-based Injury Statistics Query and Reporting System (WISQARS)