Should Your Kid Have Snapchat? A Cybersecurity Expert's Guide for Parents

What a 30-year cybersecurity expert found in Snap's SEC filings, internal documents, and 63 wrongful death lawsuits

Your tween comes home and says everyone has Snapchat. This isn't new. It's what happens when kids hit the 10-to-12 range. The app reaches roughly 75% of teens, and at some point every parent faces this decision. Most of what's written about Snapchat covers the surface. This covers the business model, the internal documents, and what actually works.

Two dangers that matter

Two things threaten your kid on the internet more than anything else.

The first is addictive algorithms: any system designed to maximize how much of your attention it captures. Instagram, TikTok, YouTube, Snapchat. The business model requires studying your behavior so they can sell better-targeted ads. The more time you spend, the more they learn, the more they earn. Since 2007, when these systems landed in kids' backpacks, bedrooms, and pockets, youth suicide rates for ages 10-24 have tripled, according to the WHO mortality database. Suicide is now the second leading cause of death in that age group. The human frontal lobe doesn't finish developing until around age 25, which is exactly where the suicide spike drops off. That is not a coincidence.

[Chart: Youth suicide rates have more than tripled since 2007, correlating with social media adoption.]

Snapchat's beauty filters and social scoring (streaks, friend counts, Snap Score) create comparison loops that don't stop. Teens measure themselves against manipulated, filtered versions of their peers and come up short every time. Research links this kind of constant comparison to depression, anxiety, and suicidal ideation, especially in the years when kids are still figuring out who they are. Sextortion mostly affects boys. The comparison and body image problem mostly affects girls. Snapchat hits both.

The second is anonymous communication. Online chat is where predators hunt for children. Not in vans at the park: they use VPNs and work Roblox, Instagram, Snapchat, and Discord. Every major criminal network in the world now has dedicated resources for this because it's profitable. The Yahoo Boys, a Nigerian network that used to send "prince" emails, now run sextortion operations targeting teenage athletes on Snapchat and Instagram. Where the platform is identified in child sexual abuse cases, Snapchat accounts for 50%. The National Center for Missing and Exploited Children received 550,000 sextortion reports in 2024; 100,000 were AI-generated. In 2025, they're on track for a million.

AI is the accelerant underneath both. It makes algorithms smarter at predicting what holds your attention. It gives predators tools to generate fake images and automate grooming at scale. And Snapchat has built its own AI chatbot, called My AI, directly into the app and pointed it at 60 million teenagers. The people who build these AI systems have publicly said they don't understand how they work. In traditional software, a bug means you lose a document. In AI, a bug is in your brain. Kids should never be early adopters of technology that enters their brain.

The core problem

Snapchat has all three problems in one place: addictive algorithms, anonymous communication, and AI. No other platform combines all three at this scale.

Follow the money

Snapchat is not a chat company. It's an advertising company. 100% of its revenue comes from selling ads.

Their pitch deck to advertisers says they "reach 90% of 13 to 24-year-olds" with "$5 trillion in spending power." That's the product they're selling. Access to your kid.

When forced to disclose financial details to Congress, Snap revealed they make $437 million a year from minor users alone. Their trust and safety budget for all users, not just kids, is $135 million, and they cut it 18% last year. They have 163 full-time safety employees responsible for 60 million teenagers.

None of this age data appears in their SEC filings. Investors don't see it. Advertisers see the pitch deck. Congress sees it under subpoena. Parents see none of it.

The disappearing myth

The foundation Snapchat was built on is a lie. Messages don't disappear.

Snapchat stores unopened Snaps on their servers for up to 31 days. Metadata survives at least 30 days. Their own privacy policy says: "We cannot promise that deletion will take place by a specific time." That's the company's own words contradicting their core marketing claim.

In the first half of 2025, Snap received 25,449 government data requests affecting 41,007 user accounts and complied 81.6% of the time. If messages truly disappeared, law enforcement wouldn't bother requesting them at this scale. Forensic tools used by police can recover deleted Snapchat content directly from devices. Criminal convictions for sexual assault, murder, and drug trafficking have been secured using Snapchat evidence that supposedly vanished.

For years, Snapchat allowed third-party apps that let users save "disappearing" photos permanently. One of those apps was hacked, and the saved photos were leaked online.

A colleague of mine shared a story from her school district. A high school senior messaged a friend on Snapchat that she was going to "shoot up to the school" to pick someone up. Snapchat's automated keyword filter flagged it. The FBI arrived at her house at 3 a.m. and showed the family the message. The school expelled her even though the FBI classified the message as a non-threat. Messages didn't disappear. Snapchat flagged them automatically, shared them with law enforcement, and a good kid got expelled over a misread text.

That same system that monitors keywords and flags teenagers to the FBI? When police report drug dealers operating on the platform for a year straight, those accounts stay active.

Features, one by one

Every feature Snapchat offers ties back to one of those two dangers: capturing your attention or connecting you with people who shouldn't have access to your kid.

Streaks (High Risk)

Users have to exchange Snaps every single day to maintain a streak counter. Break it and you lose your number. Your friend loses theirs too. That's shame on both sides, by design. Leaked internal memos use the word "addiction" to describe this feature. One employee wrote: "Wow, we should add more addicting features like this." If you lose a long streak, Snapchat will sell it back to you for $5. It's a compulsion mechanic with a payment system attached.
Snap Maps (High Risk)

Real-time precise location, shared with every connection. A lot of teenagers keep it enabled constantly. It shows where they live, where they go to school, and where they are right now. On the same platform where sextortion and trafficking are documented at industrial scale, kids are broadcasting their home address to everyone on their friends list, including the strangers who got there through Quick Add.
Quick Add (High Risk)

This feature suggests "friends of friends," extending the contact network one hop at a time. The attorney general of New Mexico found through discovery that "regardless of whether a minor explicitly allowed Snapchat to access their contacts, they could still receive Quick Add suggestions from strangers," and that the algorithm "immediately begins recommending more adults to connect with." Two or three hops from a school friend, you're connected to people nobody in the family has ever met.
Spotlight (Medium Risk)

Snapchat's algorithmic short-form video feed, designed to compete with TikTok. The algorithm curates content to maximize engagement. This is the addictive-algorithm pillar in its purest form: an infinite feed of content selected by a system that learns what you can't stop watching.
Stories (Medium Risk)

Content posted to your profile for 24 hours, then it disappears from the feed (but not from Snap's servers). The 24-hour window creates urgency to check back. If you don't look today, you miss it. FOMO as a product feature.

My AI

Snapchat's AI chatbot is different from the other features. It's not a communication tool or a social mechanic. It's an AI system that analyzes everything your kid does on the platform.

My AI is powered by OpenAI (the makers of ChatGPT) and Google Gemini. When your kid talks to it, that conversation is processed through Snapchat's servers and through OpenAI or Google's infrastructure. Three companies with access to your child's thoughts.

Snapchat rolled My AI out to every user, including minors, with no opt-in. It was pinned above human friends in the chat feed. There was no way to remove it unless you paid for Snapchat+. App Store ratings crashed from 3.05 to 1.67 with 75% one-star reviews. About 20% of Snapchat's user base are children.

My AI collects full conversation transcripts and interaction patterns, and draws on all Snapchat activity (photos, messages, and location data) to build advertising profiles. Conversations are stored and used for model training. So much for disappearing.

When Aza Raskin of the Center for Humane Technology tested My AI while posing as a 13-year-old girl, the chatbot advised her on having sex with someone 18 years older. It suggested "setting the mood with candles and music." It gave her instructions for concealing bruises from Child Protective Services using green color correction. Snap's response: only 0.01% of responses are "non-conforming." When you have hundreds of millions of users, 0.01% is tens of thousands of conversations.

The Utah attorney general found My AI providing minors advice on hiding drugs and "setting the mood" for sex. The FTC referred a complaint to the Department of Justice in January 2025, voting 3-0, specifically targeting My AI's "risks and harms to young users."

Parents can disable My AI responses through Family Center, but the limitation is telling: your teen can still send messages to it. It just won't reply. The attempt isn't prevented. Only the response is.

Fentanyl

Most parents don't associate Snapchat with drug dealing. They should.

Alexander Neville was 14 when he bought what he thought was a prescription pill through Snapchat. It was laced with fentanyl. He died.

The dealer who sold to him had been reported to Snapchat repeatedly. His account stayed active for approximately a year after Alexander's death. During that time, it was linked to two more overdose deaths.

Alexander was not an edge case. 63 families have filed wrongful death lawsuits against Snap Inc for fentanyl-related deaths. The victims include Sammy Chapman (16), Cooper Davis (16), Devin Norring (15), and Elijah Ott (15). In every case, parents describe the same thing: an engaged family, protective measures in place, and a kid who purchased counterfeit pills through Snapchat within seconds.

The National Crime Prevention Council estimates that 80% of teen and young adult fentanyl deaths can be traced to social media contact. They call Snapchat a "digital open-air drug market." The DEA's Operation Last Mile found that 76% of 1,436 youth drug-buying investigations involved Snapchat. A UK peer-reviewed study found that 83% of teens who saw drug ads encountered them on Snapchat, the highest of any platform.

Snapchat's own internal documents from 2019 show the company understood that dealers preferred their platform specifically because of disappearing messages. They internally admitted it "takes under a minute to use Snapchat to be in a position to purchase illegal and harmful substances." The Utah attorney general found that 96% of abuse reports go unreviewed by Snapchat's Trust and Safety team.

The Cooper Davis and Devin Norring Act, named after two of the teenagers who died, is federal legislation aimed at holding platforms accountable for drug trafficking on their services.

Sextortion

Sextortion is when someone manipulates a child into sending compromising images and then blackmails them. On Snapchat, this is an industrial-scale operation.

According to Snapchat's own internal documents, the platform receives approximately 10,000 sextortion reports every month. Their internal communication describing that number also says it "likely represents a small fraction of this abuse" because most victims never report due to shame and fear. A frustrated employee wrote internally: "God I'm so pissed that we're over-run by this sextortion shit right now. We've twiddled our thumbs and wrung our hands all f***ing year."

Thorn and the National Center for Missing and Exploited Children published data in June 2024 showing Snapchat is the number one platform for online enticement of minors and the number one secondary destination for sextortion. The pattern works like this: a predator meets a kid on Roblox or Instagram, builds trust, then moves the conversation to Snapchat because the messages "disappear." Except they don't, and now the predator has leverage.

Where the platform is identified, 50% of all child sexual abuse image offenses occurred on Snapchat. 90% of victims are boys aged 14-17. The FBI documents approximately one suicide per month linked to sextortion cases.

The criminal networks running these operations are sophisticated. The Yahoo Boys, a Nigerian cybercrime network that used to run email scams, now target teenage athletes and popular kids on Snapchat and Instagram. They study the target's friends list, figure out what school they attend, establish a fake relationship, extract images, then threaten to send the images to everyone the victim knows unless they pay. From initial contact to suicide can be a matter of hours.

Snapchat underreports these cases to NCMEC, and the gap is staggering. Thorn's data shows Snap's ratio of Electronic Service Provider reports to public victim mentions is 4.1 to 1. Compare that to Facebook at 204.3 to 1 and Instagram at 15.9 to 1. Victims mention Snapchat in public reports almost as often as Instagram, but Snap files only a fraction of the reports. They're not looking for the problem because finding it creates liability.

Watch our detailed breakdown of Snapchat's sextortion crisis and what parents can do

What Snap knew

This isn't a case of a company that didn't know. Internal documents obtained through lawsuits and attorney general investigations reveal a company that knew exactly what was happening and calculated the cost of fixing it.

When employees suggested better tools to identify predator grooming, leadership complained it would create "disproportionate admin costs." They acknowledged adults were targeting children for "deeply pernicious and dangerous" conduct but didn't want to "strike fear" among young users, which would reduce engagement.

The company's director of security engineering, when told that CSAM scanning on Android devices was broken, responded: "That's fine, it's been broken for ten years." When questioned internally about age verification, the admission was plain: "I don't think we can say that we actually verify."

What else the documents revealed

  • A 2017 internal email about the streaks feature: "Wow, we should add more addicting features like this."
  • A user with 75 separate reports mentioning "nudes, minors, and extortion" remained active on the platform.
  • Over 90% of account-level reports were ignored by Trust and Safety.
  • Snapchat's CSAM detection database was rolled back, deleting matches.
  • Internal reports from 2019 confirmed the company knew drug dealers preferred their platform because of disappearing messages.

As of March 2026, Snap faces more than 10,000 individual lawsuits, enforcement actions from 41 state attorneys general, and approximately 800 school district claims. In January 2026, Snap settled the first bellwether trial (K.G.M. v. Meta et al.) approximately one week before jury selection was scheduled to begin, avoiding CEO testimony before a jury. Settlement terms are confidential. Federal courts have rejected Snap's Section 230 defense for product design liability, meaning the company can be sued not just as a communication channel but as a maker of a defectively designed product. Legal analysts compare the scope of this litigation to the tobacco and opioid industry cases.

Parental controls

Snapchat has parental controls. They're called Family Center. Almost nobody uses them.

CEO Evan Spiegel testified to the US Senate that of the 60 million teenagers using Snapchat daily, 400,000 are linked to a parent through Family Center. That's a 0.67% adoption rate. The Social Media Safety Organization calls it "a talking point, not a tool."

What Family Center shows parents: who your teen chatted with in the last 7 days, their friends list and new additions, and group chat memberships. What it doesn't show: any message content, any photos or videos, when conversations happen, or deleted conversations older than 7 days. The restrictions parents can toggle only apply to Stories and Spotlight. They do not restrict direct messages, do not filter search results, do not block content shared by friends, and do not apply to subscribed content. The actual danger zones are untouched.

Teens can remove parent access unilaterally without notification. Location sharing requires teen consent and can be revoked at any time. These are not oversights. They're design choices.

Snapchat has not implemented any of the structural safety changes that Meta and TikTok made under regulatory pressure: no default private accounts for minors, no DM restrictions from unknown adults, no content recommendation restrictions for minors, no parental time limits.

Why is adoption so low? Part of it is that parents are overwhelmed. Part of it is that the controls are hard to find and confusing to configure. But the deeper answer is structural.

A federal law called COPPA, the Children's Online Privacy Protection Act, passed in 1998, says that if a company has "actual knowledge" that a child is using its platform, it becomes legally liable for that child's safety. The liability can be enormous; Meta faces a potential $50 billion exposure under this law. So companies are incentivized to avoid knowing children are present. They don't build effective parental controls, because effective parental controls require acknowledging that children use the platform, and acknowledging that creates liability. So they build the minimum, bury it in settings, and point to it when Congress asks questions.

Watch our detailed walkthrough of Snapchat's parental controls and their limitations

What you can do

If your kid is going to use Snapchat, change these settings before they open the app. But for each one, have the conversation about why.

Disable the camera on the phone.

iPhone: Settings > Screen Time > Content & Privacy Restrictions > Allowed Apps > toggle off Camera.
Android: Settings > Digital Wellbeing > Parental Controls > select child's account > block Camera.

This is the single most effective protection against sextortion. Talk to your kid about why. Not to scare them, but because understanding the threat is a skill they'll carry into adulthood.

Enable Ghost Mode.

This hides their location on Snap Map. Ask them: would you walk around school with your home address on your shirt? That's what Snap Map does, except it updates in real time.

Set contacts to Friends Only. Turn off Quick Add.

The attorney general of New Mexico found that Quick Add was connecting minors with adult strangers regardless of their privacy settings. Ask your kid to count how many people on their friends list they've actually met in person.

Disable My AI.

The chatbot analyzes everything your kid does on Snapchat to build advertising profiles. When tested, it advised a 13-year-old on how to conceal abuse from child protective services. Your kid can still send it messages, but it won't respond.

Review the friends list together, regularly.

Not as surveillance. As a shared practice. You're building the habit of questioning who has access to your life online.

Snapchat's controls stop at Snapchat. When a conversation moves to Discord or Instagram DMs, Family Center can't follow it. And Family Center can't see message content in the first place.

The Bark Phone is a Samsung Galaxy built on a custom version of Android with monitoring and controls built into the operating system. Your kid can't uninstall it, disable it, or delete their messages before you see them. On Snapchat specifically, Bark monitors direct messages, My AI conversations, and searches. If something looks wrong, you get an alert. It also monitors across platforms, so if a conversation starts on Snapchat and moves to Discord or Instagram, Bark is still watching. You can set app time limits your kid can't bypass.

If your kid already has a phone, the Bark app monitors many of the same things on Android devices. You can see exactly what Bark monitors on each device type.

Bark rates Snapchat 1 out of 5 stars and recommends age 16+. I've written a detailed review of the Bark Phone on my site.

Parental controls restrict. Monitoring watches. If your kid is on a platform where strangers can contact them, they need both.

These settings reduce risk. What eliminates it is what you model. If you use your phone at dinner, in the car, during conversations, your kid learns that a screen belongs between people. If you show them your own screen time stats and say "I'm not happy with this number, and here's what I'm going to do about it," they learn that struggling with technology is normal and recoverable. The parents with the best outcomes aren't the ones with the strictest settings. They're the ones honest about their own struggles with technology who build accountability together instead of enforcing it from above.

Warning signs to watch for

If your teen uses Snapchat, be alert for these behaviors:

  • Secretive behavior around their phone, or hiding the screen when you approach
  • Emotional changes after using the app
  • Unusual gift card purchases (common in sextortion payoffs)
  • Late night device usage
  • Withdrawal from family or other activities
  • Anxiety when unable to access the app
  • New contacts or friends you don't recognize

The age question

Every parent asks: what's a safe age for Snapchat?

There isn't one. Not because all ages are equally dangerous, but because the question itself misses the point. Snapchat combines addictive algorithms, anonymous communication, and AI in a single app. Those don't become safe at 14 or 16 or 18.

The better question is: how do I raise a kid who can navigate this when they inevitably encounter it as an adult?

That's a skills question, not a settings question. And the answer starts with you, not with them.

I interviewed a father whose son went to a Waldorf school. Minimal internet exposure. The family was intentional about all of it. When the son reached high school, the dad relented. All his friends had Snapchat. He didn't want his kid to be the outcast. Within a year, his son bought MDMA through the app, took too much, and died.

No parent has this figured out. I don't either. What I do know: the families with the best outcomes are the ones where the parents model the behavior they want to see, build skills through honesty and shared accountability, and treat technology boundaries as something the whole family works on together.

Expert voices

Mike McLeod is a speech-language pathologist who has personally helped over 500 families disconnect their kids from the internet. Snapchat comes up in nearly every case. Parents say the same thing: "If I take away Snapchat, my kid will have no friends." McLeod's experience is the opposite. Every time, the real friendships got stronger. The Snapchat friendships were false signals.

"If your child's relationships are dependent on Snapchat, then they have no relationships at all. Friendships are not reliant on an app or a method of communication."

Mike McLeod (Watch the full conversation)

Dr. Lisa Strohman, a clinical psychologist who has done profiling work for the FBI, studies what happens to developing brains when human connection is replaced by screens. The research points to oxytocin: less face-to-face interaction means less oxytocin, which means less empathy and less compassion. This isn't just behavioral. It's neurochemical and neurostructural.

"The more we disconnect human-to-human contact, the further apart we get, the less oxytocin in our system, the less empathy, the less compassion we have for one another."

Dr. Lisa Strohman (Watch the full conversation)

Is every kid going to die if they go on Snapchat? Of course not. Are there lots and lots of stories where the starting point was Snapchat and, not long after, there was serious damage? Yes. I think these systems are a net negative. I look at them like the meth of the digital world.

If you take one thing from this article: the two questions that matter for any app are whether it has an algorithm and whether it has a chat function. Snapchat has both, plus an AI chatbot pointed at 60 million teenagers. Whatever decision you make for your family, make it with the full picture.

Resources

Protect Your Child Today

Start with our step-by-step guide to setting up your child's iPhone safely