However, reports of suicide instructions being spliced into children's videos are true — they just don’t involve Momo.
As a result, some parents have demanded better automatic restriction filters and a faster response time on videos flagged for inappropriate content on YouTube Kids — a supposedly kid-friendly version of the video-streaming app intended for audiences 8 and younger.
The dangerous content has been spliced into otherwise seemingly innocent YouTube Kids videos featuring children’s show characters such as Nick Jr.’s Peppa Pig.
Florida mom and pediatrician Free Hess recorded and documented one on pedimom.com.
“It makes me angry and sad and frustrated,” Hess told CNN. “I’m a pediatrician, and I’m seeing more and more kids coming in with self-harm and suicide attempts. I don’t doubt that social media and things such as this is contributing.”
In some cases, in the middle of the episode, Peppa Pig is shown being tortured at the dentist’s office, harming herself with a knife or even killing her father. These clips were made by users, not the creators of Peppa Pig.
In other cases, the video will pause — seemingly hacked — to allow a man in sunglasses to walk on the screen, hold his arm up and tell children how to slit their wrists.
“Remember kids, (this way) for attention, (that way) for results. End it,” he says before walking back off the screen and allowing the video to resume.
The whole splice lasts just nine seconds, meaning parents may not even be aware of what their children are being exposed to.
Some of these videos, which were then flagged by parents and other viewers as disturbing, have been removed from YouTube.
Hess wrote about the issue on her blog and said in a quick search on the “kid-friendly” version of YouTube, she found videos “glorifying not only suicide but sexual exploitation and abuse, human trafficking, gun violence and domestic violence.” She also shared an example of one video, inspired by the Minecraft video game, which depicted a school shooting.
In response to Hess' allegations, YouTube said in a statement that it works to make the videos on YouTube Kids family-friendly and takes feedback seriously, CNN reported.
"We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video," the statement to CNN said. "Flagged videos are manually reviewed 24/7 and any videos that don't belong in the app are removed. We've also been investing in new controls for parents including the ability to hand pick videos and channels in the app. We are making constant improvements to our systems and recognize there's more work to do."
YouTube Kids relies on an automatic algorithm designed to filter out content that may be inappropriate for young children. Once a video on either YouTube or YouTube Kids is flagged, it is reviewed for removal by the YouTube team. It may also instead be labeled with a warning advising that content may be inappropriate for some audiences.
YouTube has removed tens of millions of predatory comments and terminated more than 400 channels that were found to be posting predatory messages, according to The Verge. The company also terminated several channels, including FilthyFrankClips, for inserting disturbing content into children’s programming.
A recent major scandal involved the discovery of a “soft paedophile ring” operating in YouTube comments, where users left chilling comments on videos of children and exchanged numbers to share further images, according to The Verge.
Norwalk Police Chief Mike Conney said his department has not received reports about inappropriate children’s videos.
However, Conney said he wants to “remind parents to take an active role in monitoring their children’s activities, online and otherwise.”