ORLANDO, Fla. – First there was the Momo Challenge: videos about suicide, with the unsettling voice of a child saying, “Momo’s gonna kill you.”
Now, there are reports of violent scenes being slipped into popular cartoons that children are watching on YouTube Kids and other internet sites and apps.
Chris Hadnagy, CEO of Social-Engineer LLC, said his team searched for violent cartoons and found versions of "Peppa Pig" with extremely violent scenes added, tagged and listed to look identical to the real cartoon.
“They didn’t hack YouTube. They didn’t hack into dad’s account,” Hadnagy said. “What they did was, they basically lied about the content and put it into a thread of other kids’ videos.”
News 6 at Nine Executive Producer Tara Evans said her 2-year-old daughter was watching what she thought was an episode of "Doc McStuffins" when she realized her daughter was instead watching pure violence.
“My reaction was, 'Get rid of it. Get rid of it,'" Evans said. “I’ve approved the real versions of the cartoons for her to watch, so you don’t think she’s going to see that.”
Hadnagy said parents can't always count on those trusted sources being safe.
“This is literally shock and awe," he said. “For all we know, someone hates YouTube and they threw a firebomb in the middle of YouTube.”
Susie Raskin, a licensed mental health specialist for the Teen Xpress program at Arnold Palmer Hospital for Children, said she was surprised to hear cartoons were being altered to deliver violent scenes to children.
“It’s disturbing. It's scary, but now we can do something about it," Raskin said. “I would address it head-on, but address it at a level that is appropriate for the age of the child.”
A YouTube spokesperson told News 6 the company has been screening millions of videos and removing anything that doesn’t meet its standards.
“We work hard to ensure YouTube is not used to encourage dangerous behavior, and we have strict policies that prohibit videos which promote self-harm. We rely on both user flagging and smart detection technology to flag this content for our reviewers. Every quarter, we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views," the spokesperson said. "We are always working to improve our systems and to remove violative content more quickly, which is why we report our progress in a quarterly report and give users a dashboard showing the status of videos they’ve flagged to us."