How Algorithms Work
Behind every social media platform and app your child uses is a sophisticated algorithm—a set of instructions designed to control what content they see, when they see it, and for how long. These algorithms are not passive tools; they are engineered specifically to capture and maintain attention, and they're particularly effective on developing brains.
Source: Whatagraph, 2023
To understand why algorithms pose such a significant risk to children, we first need to understand how they operate and the business model that powers them.
How Social Media Algorithms Work
The algorithm tracks every interaction: what content you view, like, share, or comment on; how long you pause on each post; and even your mouse movements or screen taps.
Machine learning identifies patterns in your behavior to determine what keeps you engaged longest. It learns your emotional triggers and content preferences.
Source: Sprout Social, 2023
Based on this data, the algorithm selects content most likely to keep you scrolling, often prioritizing emotionally triggering or controversial content.
As you interact with this curated content, the algorithm refines its understanding, creating a feedback loop that becomes increasingly effective at holding your attention.
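To make that loop concrete, here is a deliberately simplified sketch in Python. Everything in it is illustrative: real platforms use large machine-learning models trained on billions of interactions, not a dictionary of topic scores, but the incentive structure (score whatever holds attention, then serve more of it) is the same.

```python
# A minimal sketch of the engagement feedback loop described above.
# All names and numbers are illustrative, not real platform behavior.

from collections import defaultdict

# Engagement score the system has learned for each content topic.
interest_scores = defaultdict(float)

def record_interaction(topic: str, seconds_viewed: float, liked: bool) -> None:
    """Every pause, tap, and like nudges the learned score upward."""
    interest_scores[topic] += seconds_viewed * 0.1 + (2.0 if liked else 0.0)

def rank_feed(candidate_posts: list[dict]) -> list[dict]:
    """Order posts by predicted engagement, not by recency or quality."""
    return sorted(candidate_posts,
                  key=lambda post: interest_scores[post["topic"]],
                  reverse=True)

# The feedback loop: whatever the user lingers on is scored higher,
# so it is shown more, so it is lingered on more.
posts = [{"topic": "sports"}, {"topic": "diet"}, {"topic": "music"}]
record_interaction("diet", seconds_viewed=12.0, liked=True)
print(rank_feed(posts)[0])  # {'topic': 'diet'} now leads the feed
```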
This process happens invisibly every time a child opens a social media app. The algorithm's sole purpose is to maximize "engagement time"—the amount of time a user spends on the platform. More time viewing content means more opportunities to show advertisements, which generates more revenue for the platform.
Source: MIT Technology Review, 2021
The Attention Economy
To understand why algorithms are engineered this way, we need to recognize the fundamental business model of social media: the attention economy. In this economic system, user attention is the primary commodity being bought and sold.
"If you're not paying for the product, you are the product. Or more specifically, your child's developing brain and attention is the product."
Ben Gillenwater, Family IT Guy

Social media platforms generate revenue primarily through advertising. The more time users spend on the platform, the more ads they can be shown, and the more detailed behavioral data can be collected to further optimize both content delivery and ad targeting.
Source: Center for Humane Technology, 2023
This creates a fundamental conflict of interest: what's best for the platform (maximizing time spent) is often directly at odds with what's best for the user, especially when that user is a child whose brain is still developing.
- $232 billion global social media advertising revenue in 2023
- $17.64 average revenue per user for Meta (Facebook/Instagram) in North America
- 70+ data points collected per user to optimize content delivery
- 90 minutes average daily time teens spend on TikTok alone
Source: Common Sense Media, 2023
The incentives driving these platforms are clear: the more attention captured, the more profit generated. This is why algorithms are specifically designed to be addictive—creating a dopamine-driven feedback loop that keeps users, especially children, coming back repeatedly.
The Instagram Example: A Visual Walkthrough
To understand how these algorithms work in practice, let's examine Instagram—one of the most popular platforms among children and teens—and walk through exactly how its algorithm manipulates user behavior.
Inside Instagram's Algorithm
Instagram's recommendation system uses machine learning to predict what content will keep you engaged longest. Here's how it works step by step:
Initial Content Assessment
When someone creates a post, Instagram shows it to a small percentage of that person's followers to test engagement. Posts that receive rapid, positive engagement (likes, comments, saves, shares) are promoted more widely.
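A toy sketch of that test-and-promote pattern, with made-up sample fractions and thresholds (real platforms tune these with machine learning rather than fixed constants):

```python
import random

# Hypothetical "test, then promote" gate. The 5% sample and 10%
# threshold are invented illustrative values.

def assess_post(followers: list[str], engaged: set[str],
                sample_fraction: float = 0.05,
                promote_threshold: float = 0.10) -> bool:
    """Show a new post to a small follower sample; promote it widely
    only if the sample's engagement rate clears the threshold."""
    sample = random.sample(followers,
                           max(1, int(len(followers) * sample_fraction)))
    engagement_rate = sum(1 for user in sample if user in engaged) / len(sample)
    return engagement_rate >= promote_threshold

followers = [f"user{i}" for i in range(200)]
engaged = {"user3", "user17", "user42"}
print(assess_post(followers, engaged))  # True only if the sample engaged enough
```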
Interest Signals
Instagram tracks which posts you interact with, how long you look at certain types of content, and what you search for. These become "signals" of your interests.
For example, if a teen pauses on fitness content, the algorithm might begin showing more fitness posts, potentially escalating to extreme dieting or body image content that can trigger insecurities.
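The escalation in that example can be pictured as a walk through a topic-adjacency graph. The map below is entirely hypothetical, invented for illustration; real systems learn topic relationships from engagement data rather than hand-written tables:

```python
# Illustrative only: a hand-written adjacency map showing how one
# observed interest "signal" can pull in progressively more extreme
# neighboring topics.

ADJACENT_TOPICS = {
    "fitness": ["workout tips", "dieting"],
    "dieting": ["calorie counting", "extreme dieting"],
    "extreme dieting": ["body checking"],
}

def expand_interests(signal: str, hops: int = 2) -> set[str]:
    """Follow topic adjacency a few hops out from one observed signal."""
    frontier, seen = {signal}, {signal}
    for _ in range(hops):
        frontier = {n for t in frontier for n in ADJACENT_TOPICS.get(t, [])}
        seen |= frontier
    return seen

print(expand_interests("fitness"))
# A single pause on fitness content can license "dieting" and
# "extreme dieting" recommendations just two hops later.
```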
Relationship Signals
The algorithm prioritizes content from accounts you interact with frequently, creating social reciprocity pressure ("I should check and respond to maintain this relationship").
For children and teens, this creates significant social anxiety around missing posts from friends or not responding quickly enough.
Negative Emotion Optimization
Internal Facebook documents revealed by whistleblower Frances Haugen showed that content triggering negative emotions (outrage, insecurity, fear) often generates more engagement than positive content.
This means the algorithm can systematically promote content that makes children feel inadequate, anxious, or upset, because that content keeps them scrolling longer.
Rabbit Hole Effect
As children interact with algorithmically recommended content, their feed becomes increasingly narrow and extreme—a phenomenon called "the rabbit hole effect."
For example, a teen initially interested in healthy eating might gradually be shown more restrictive diet content, eventually seeing pro-anorexia material that would never have been actively sought out.
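A toy simulation shows how quickly a feed can narrow when each round of engagement boosts the engaged topic's share. The boost factor and starting shares are arbitrary illustrative numbers, not measured platform behavior:

```python
# Toy model of the "rabbit hole effect": nudging the feed toward
# whatever got engaged with last round collapses it onto one theme.

def narrow_feed(shares: dict[str, float], engaged_topic: str,
                rounds: int = 5, boost: float = 1.8) -> dict[str, float]:
    shares = dict(shares)  # work on a copy
    for _ in range(rounds):
        shares[engaged_topic] *= boost                      # reward engagement
        total = sum(shares.values())
        shares = {t: s / total for t, s in shares.items()}  # renormalize
    return shares

feed = {"healthy eating": 0.34, "sports": 0.33, "music": 0.33}
print(narrow_feed(feed, "healthy eating"))
# After a few rounds, "healthy eating" dominates the feed, crowding
# out everything else: the narrowing that precedes escalation.
```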
This system creates what former Google design ethicist Tristan Harris calls "a race to the bottom of the brain stem"—with algorithms competing to trigger the most primitive emotional responses that keep users engaged regardless of the psychological cost.
Source: Center for Humane Technology, 2023
Algorithmic vs. Chronological Feeds: The Critical Difference
To understand just how dramatically algorithms have changed social media, let's compare an algorithmic feed to a chronological one:
The chronological feed primarily shows content from people your child has actually chosen to follow, in the order it was posted. The algorithmic feed is curated to maximize emotional response and engagement time, often showing content from accounts your child doesn't even follow but that the algorithm predicts will trigger strong reactions.
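The difference is easy to see in code. In this hypothetical sketch, `followed`, `age_min`, and `predicted_engagement` are invented fields standing in for a platform's real data and learned model:

```python
# Side-by-side sketch of the two feed types. All fields are hypothetical.

posts = [
    {"author": "friend_a", "followed": True,  "age_min": 5,   "predicted_engagement": 0.2},
    {"author": "stranger", "followed": False, "age_min": 300, "predicted_engagement": 0.9},
    {"author": "friend_b", "followed": True,  "age_min": 60,  "predicted_engagement": 0.4},
]

# Chronological: only accounts the child follows, newest first.
chronological = sorted((p for p in posts if p["followed"]),
                       key=lambda p: p["age_min"])

# Algorithmic: anyone at all, ranked purely by predicted engagement.
algorithmic = sorted(posts, key=lambda p: p["predicted_engagement"],
                     reverse=True)

print([p["author"] for p in chronological])  # ['friend_a', 'friend_b']
print([p["author"] for p in algorithmic])    # ['stranger', ...]
```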
How Algorithms Affect Developing Minds
Children and adolescents are particularly vulnerable to algorithmic manipulation for several key neurological reasons:
1. Incomplete Prefrontal Cortex Development
The prefrontal cortex—responsible for impulse control, decision-making, and understanding future consequences—isn't fully developed until approximately age 25. This makes children and teens:
- Less able to recognize manipulative design patterns
- More susceptible to immediate rewards (likes, comments) despite long-term costs
- More likely to act on algorithm-induced emotions without critical evaluation

Source: American Academy of Child & Adolescent Psychiatry, 2020
2. Heightened Sensitivity to Social Feedback
During adolescence, the brain undergoes a period of intense social recalibration, making teens:
- Hypersensitive to social approval and rejection
- More vulnerable to social comparison
- Prone to rumination over perceived social failures
Algorithms exploit this vulnerability by highlighting social content that triggers insecurity and comparison, creating a cycle where teens seek validation through continued platform use.
Source: Current Opinion in Psychology, 2021
3. Dopamine System Vulnerability
The brain's reward system is particularly sensitive during adolescence, making teens:
- More responsive to unpredictable rewards (like those provided by social media)
- More susceptible to behavioral addiction patterns
- More likely to experience withdrawal symptoms when unable to access platforms
Source: US Surgeon General, 2023
Many features of social media platforms were explicitly designed using the same principles as slot machines and other addictive technologies:
- Variable reward schedules (not knowing when you'll get likes or comments)
- Infinite scroll (removing natural stopping points)
- Autoplay features (reducing the decision to continue)
- Notification systems (creating urgency and fear of missing out)
These features are especially effective on developing brains with limited self-regulation capabilities (see the variable-reward sketch below).
Source: MUD\WTR, 2023
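Here is a minimal sketch of the variable-ratio pattern, the same reinforcement schedule slot machines use. The 30% payout probability is an arbitrary illustrative value; the point is that unpredictable rewards drive far more compulsive checking than a fixed schedule (a like on every Nth refresh) would:

```python
import random

# Variable-ratio reward schedule: each pull-to-refresh may or may not
# pay out, and the unpredictability is what makes it gripping.

def refresh_feed() -> int:
    """One pull of the lever; returns the number of new likes, if any."""
    return random.randint(1, 5) if random.random() < 0.3 else 0

for pull in range(1, 11):
    likes = refresh_feed()
    if likes:
        print(f"refresh {pull}: +{likes} likes!")        # unpredictable payout
    else:
        print(f"refresh {pull}: nothing... refresh again?")
```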
4. Identity Formation Vulnerability
Adolescence is a critical period for identity development. Algorithms can interfere with this process by:
- Creating echo chambers that limit exposure to diverse perspectives
- Amplifying extreme viewpoints that can shape developing beliefs
- Promoting appearance-focused content that ties self-worth to external validation

Source: American Academy of Child & Adolescent Psychiatry, 2020; Current Opinion in Psychology, 2021
"Social media wasn't designed to help your child develop into a healthy adult. It was designed to capture and monetize their attention, regardless of the cost to their mental health."
Ben Gillenwater, Family IT Guy

Protecting Your Child from Manipulative Algorithms
While the ideal solution would be to delay social media use entirely until a child's brain is more fully developed (around age 16), this isn't always realistic for every family. Here are evidence-based strategies to reduce algorithmic harm:
Source: Pew Research Center, 2023
Protection Strategies
- Disable algorithmic feeds when possible - On platforms that allow it, switch to chronological viewing
- Use time limits - Set and enforce daily limits on social media use through parental controls
- Teach algorithm literacy - Explain to your child how algorithms manipulate their attention and emotions
- Disable notifications - Turn off all non-essential notifications to reduce the "slot machine" effect
- Regular digital detox periods - Implement tech-free times and zones in your home
- Monitor the explore/discover pages - These algorithmically curated sections often contain the most problematic content
- Encourage diverse content consumption - Help your child follow a wide range of accounts to prevent algorithmic narrowing
Platform-Specific Settings
Here are specific settings changes you can make on popular platforms to reduce algorithmic influence:
Instagram
- Switch to the "Following" feed instead of "For You" by tapping the Instagram logo and selecting "Following"
- Disable "Suggested Posts" in settings: Profile → Menu → Settings → Privacy → Suggested Posts → toggle off
- Clear search history regularly: Profile → Menu → Settings → Security → Clear Search History
TikTok
- Use "Following" feed instead of "For You" feed
- Enable "Restricted Mode": Profile → Menu → Settings → Digital Wellbeing → Restricted Mode
- Set time limits: Profile → Menu → Settings → Digital Wellbeing → Screen Time Management
YouTube
- Disable autoplay: toggle the switch at the top of the Up Next feed
- Clear watch and search history: Account → Settings → History & Privacy → Clear Watch History
- Consider YouTube Kids for younger children (though still algorithmic)
The Conversation to Have
Perhaps the most important protection is education. Here's a simplified way to explain algorithms to your child:
"Social media apps are designed by very smart people who want you to stay on the app as long as possible, because that's how they make money. They use computer programs called algorithms that track everything you do—what you like, what makes you angry, what makes you sad, how long you look at different posts—and then shows you more of whatever keeps you scrolling longest.
It's a bit like if someone followed you around a store, watching what catches your eye, and then rearranged the store every time you blink to put more of those things in front of you. That might sound helpful, but it can actually trick your brain into feeling bad about yourself or spending time on things that aren't really good for you.
When you notice yourself feeling upset, inadequate, or like you can't stop scrolling, that's often the algorithm working exactly as designed—not because there's something wrong with you, but because the app is designed to keep you engaged no matter what."
Conclusion: Algorithms as Tools, Not Masters
Understanding how algorithms work doesn't mean rejecting technology entirely. Rather, it's about helping children develop a healthier relationship with digital tools—teaching them to recognize when they're being manipulated and providing them with the skills to use technology intentionally rather than being used by it.
By implementing the protection strategies outlined above and maintaining open conversations about algorithmic manipulation, parents can significantly reduce the harmful impacts of social media on their children's developing minds.
The goal isn't to create fear of technology, but to foster awareness and agency—helping children understand that they, not the algorithm, should be in control of their digital experiences.
Sources

- Center for Humane Technology, "Social Media and Youth Mental Health," https://www.humanetech.com/youth, accessed March 2024
- Stanford Internet Observatory research on filter bubbles (2022), https://cyber.fsi.stanford.edu/io/publications
- "The Social Dilemma" documentary (Netflix, 2020)
- Frances Haugen Congressional Testimony (October 5, 2021), Senate Commerce Committee
- Journal of Adolescent Health (2023), "Social Media Use and Adolescent Mental Health," https://www.jahonline.org/article/S1054-139X(23)00711-1/fulltext
- Common Sense Media, "Social Media and Teens" research report (2023), https://www.commonsensemedia.org/research
- American Academy of Child & Adolescent Psychiatry, "Teen Brain Development" (2020), https://www.aacap.org/AACAP/Families_and_Youth/Facts_for_Families/FFF-Guide/The-Teen-Brain-Behavior-Problem-Solving-and-Decision-Making-095.aspx
- US Surgeon General, "Social Media and Youth Mental Health" (2023), https://www.hhs.gov/surgeongeneral/reports-and-publications/youth-mental-health/social-media/index.html
- Pew Research Center, "Teens and Social Media" (2023), https://www.pewresearch.org/internet/2023/12/11/teens-social-media-and-technology-2023/