What parents need to know about cross-platform sharing of inappropriate content

Anyone can post anything online. It’s simultaneously one of the internet’s most joyous qualities and one of its greatest perils. For every upload that’s heart-warming, helpful or just plain hilarious, you’ll find – unfortunately – just as much content that is shocking, frightening or upsetting … particularly to young eyes and ears.

Some of this material is placed deliberately, out of unfathomable malice. More commonly, though, it is created for mature viewers and ends up on a young person’s device by accident or oversight: seeping into child-friendly areas from more adult-oriented corners of the web. Our online safety guide analyses the possible risks of cross-platform sharing.

In 2017, having been alerted to large quantities of videos which – at first glance – were geared to pre-schoolers but in fact featured disturbing images, YouTube pledged to purge itself of “content that attempts to pass as family friendly, but clearly is not.” Almost as quickly as platforms like YouTube remove these videos, however, new ones are uploaded in their place. Many of these clips evade automated filters by using child-friendly search tags. Others are deliberately designed to resemble normal episodes of, say, Peppa Pig, but include violent or sexual content. Our #WakeUpWednesday guide from NOS this week examines the potential dangers of inappropriate content being shared across different platforms.