
When American streaming sensation iShowSpeed arrived in Kenya on January 11, 2026, the response was seismic. The 20-year-old content creator gained 360,000 subscribers in a single day and pushed his global following past 48 million. President William Ruto welcomed him with an official video message. But beneath the spectacle lies an uncomfortable truth: most parents have no idea their children are watching iShowSpeed, let alone the dozens of other streamers occupying significant spaces in young people’s digital lives.
The Hidden Problem
iShowSpeed is relatively benign compared to other mega-streamers. Unlike some creators, he isn’t known for promoting gambling or deliberately sexualizing women. This distinction matters—and reveals the real problem. If parents don’t know about the “safer” streamers their kids follow, they certainly don’t know about the dangerous ones.
Only 39% of parents use parental controls, and just 16% restrict cell phone usage. Meanwhile, 11% of Twitch's users are aged 13 to 17, and YouTube reaches nearly 9 out of 10 teens. Research reveals a consistent pattern: many parents remain largely unaware of what their children consume online.
This knowledge gap is structural. Children develop digital literacy faster than their guardians. The sheer volume of creators makes comprehensive parental knowledge nearly impossible without active engagement.
The Toxic Creators Parents Don’t Know About
While media celebrated iShowSpeed’s tour, other streamers with comparable or larger young audiences broadcast deeply problematic content that has drawn criticism from researchers and child safety advocates.
Kai Cenat is among the most prominent Twitch streamers globally, with research indicating his audience skews significantly younger, including viewers under 13. According to reporting on streaming culture, his content has been noted for objectifying language and commentary that critics argue reduces women to conquests rather than people. The dynamics of streaming such behavior to predominantly young audiences raise significant concerns about the messages being normalized.
Adin Ross has been banned from Twitch multiple times for platform violations and has hosted guests including Andrew Tate, who faces serious criminal allegations. Tate's TikTok content had approximately 11.6 billion views before deletion. Research and reporting on misogynistic influencers show concerning patterns of young people, particularly boys and young men, adopting problematic attitudes toward women that have led to documented harassment in school settings.
These aren’t edge cases. They are among the most-watched streamers in the world.
How Algorithms Push Harm
Children don’t deliberately seek toxic content—platforms deliver it to them. Amnesty International’s investigation of TikTok’s algorithm found that after just 5 to 6 hours on the platform, almost 1 in 2 videos shown were mental health-related and potentially harmful, roughly 10 times the volume served to accounts with no interest in mental health. When researchers simulated 13-year-old user accounts, the algorithm pushed increasingly dark content.
The algorithm doesn’t distinguish between entertainment and toxic influence. It simply learns: this user watched this, now show them more of it.
Gaming Culture and Normalized Misogyny
The problem extends beyond individual streamers. Research found that 85% of respondents had witnessed problematic or toxic behavior in gaming spaces, which function as incubators for hostile sexist attitudes and dehumanizing slurs. Young people exposed to constant gendered slurs and misogynistic attacks form emotional and social connections around this normalized hatred.
The psychological impact is measurable: Gen Z shows a wider gender divide on women's rights than older generations, with young men's support declining relative to previous cohorts.
Why This Ignorance Is Dangerous
This isn't merely embarrassing; it's dangerous. An estimated 15.6% of young people have experienced online child sexual abuse, 11.0% image-based sexual abuse, and 5.4% online grooming. In African contexts, the figures are more alarming still: 19% of South African children aged 9 to 17 have received unwanted sexual requests online, and 21% of Ugandan 15- to 17-year-olds have experienced the same.
Predators exploit parental ignorance. Beyond abuse, exposure to toxic or misogynistic content is linked to increased aggressive behavior, sleep disruption, and problematic media use, particularly in children aged 8 and younger.
What Parents Must Actually Do
Parental controls aren’t the solution—they can’t prevent children from adopting toxic ideologies or bonding with harmful communities. Research shows that parents who combine clear boundaries with open dialogue and active involvement see better outcomes.
Parents must have ongoing, non-judgmental conversations about what their children consume; understand the platforms themselves, including how algorithms work and what makes creators popular; teach critical thinking about content authenticity and creator motivations; seek out co-viewing opportunities; and negotiate boundaries rather than imposing them, ensuring children understand the reasoning behind each rule.
The Larger Reckoning
The discovery that children quietly follow toxic creators while parents remain oblivious reveals a deeper problem: we’ve built a media landscape optimized for engagement at the expense of safety, leaving parents to navigate it alone.
Platforms engineer algorithms to maximize engagement regardless of children’s wellbeing. They resist transparency and regulation. But this systemic reality doesn’t absolve parents of their fundamental duty to know what influences shape their children’s thinking and behavior.
The real question isn’t whether iShowSpeed is good or bad. It’s whether parents will finally acknowledge the vast, hidden ecosystem of influencers central to how young people develop values, understand relationships, and see themselves and others.
That acknowledgment must precede action. And that action must begin with genuine, non-judgmental conversation with the young people in our lives about what they’re actually watching and what these creators are teaching them about how to be human.