How would you respond if a stranger came up to you in a public space and asked: "Can I follow your child?"
A little over a year ago we launched a campaign to draw attention to the dangers lurking online – dangers our children are vulnerable to every day.
The campaign was provocative. The video depicted a man in a shopping centre asking parents if they could follow their children. We also backed it up with the tagline: “You do everything you can to keep your child safe in the real world – why wouldn’t you do the same online?”
This had the potential to unsettle parents and carers, perhaps get their backs up.
It was intended to.
The campaign needed to unsettle and provoke. If you haven’t seen the video yet, you can watch it below.
Online safety: the good
It’s not that we wanted to scare parents and carers into thinking that all online activity is negative or dangerous. We didn’t want parents to feel overwhelmed with fear every time their child reached for the iPad.
Let’s not forget, the online world provides a wealth of positive opportunities for young people. From building support networks to finding information and entertainment, behind the screen isn’t always a scary place.
And it looks like things are getting better. A year on from this campaign, some inroads have been made in the arena of online safety. In October last year, Facebook revealed they had introduced software that uses machine learning to automatically flag and remove child sexual abuse material.
This was an important step forward. Facebook says the software removed 8.7 million pieces of content that violated its child nudity or sexual exploitation policies in one quarter alone.
The social media giant also revealed separate software that identifies perpetrators by analysing how often they contact children en masse, or how frequently they are blocked by other users.
Online safety: the bad
If you’re expecting a ‘but’, you’d be right to do so, because it’s clear that Facebook could be doing so much more. For example, for reasons unknown, none of this new software (at the time of writing) has been rolled out on Instagram.
It certainly hasn’t been rolled out on WhatsApp (also owned by Facebook), which remains an entirely closed book to both law enforcement and the standard moderation systems used on other platforms.
Also, very little (read: nothing) is being done about Facebook's child users who aren't technically old enough to be on these platforms in the first place. (The minimum age is supposed to be 13.)
Online safety: the ugly (truth)
So while we should, and do, welcome action where it’s taken, this mustn't weaken calls for statutory regulation for the tech giants that are such a huge part of our young people’s lives (whether we like it or not).
Making it compulsory for these companies to tackle online safety for children will always be more effective than simply hoping that they might be willing to at some point.
The simple fact is that the software Facebook has implemented costs money – money the tech companies haven’t factored in because they’re not using a safety-by-design approach.
Here’s some more ugly: The European Commission have written a new regulation that will make it illegal for tech companies to investigate child sexual abuse material unless they have the consent of all parties. That’s all parties – including the offender. We will fight this in every way we can but we might need public support to get there, so watch this space.
What can we do?
Those with specialised knowledge and expertise can make a difference.
Here at Barnardo’s we're part of the ‘Trusted Partner’ scheme. This means we’ve joined forces with Google and Facebook so that any Barnardo’s employee who sees inappropriate or potentially harmful content can email a dedicated Barnardo’s email address and the issue will be flagged directly with Google or Facebook.
It’s necessary to have this in place. We cannot trust the tech companies alone to keep our children safe online. Nobody seems to be steering the ship in the right direction, so somebody needs to get behind the wheel. Barnardo’s employees are uniquely placed to identify and flag content that could be harmful to children.
There’s plenty more to do. The tech giants should be required through domestic legislation to adopt the software Facebook has started to use – and we are pushing for this. We’re backing Government plans to introduce a new legal duty on internet companies. This would make them responsible for ensuring children stay safe online.
This makes perfect sense. We expect influential companies to take precautions to keep our children safe offline – why should it be any different online?
The Government’s Online Harms paper is due shortly and indications are that it will call for this duty of care approach, meaning regulation – and enforcement of the new rules. We will respond to this, with a particular focus on the safety of the most vulnerable children, such as sexual abuse and exploitation victims, as well as children in care – i.e. the children and young people our 1000+ services help keep safe every day.
As an organisation, this is our forte – we care about the safety and welfare of all children of course, but our particular focus is on protecting the most vulnerable in society. Because somebody has to.
If we want to shift the responsibility away from users (i.e. children), the tech companies need to take responsibility and be held accountable. They need to start factoring in safety because a reactive approach is just not working, and unfortunately that means the online world is currently not a safe place for our children.
If you’re a parent or carer and concerned about your child’s online world, check out our post on 5 ways to keep your child safe online.
If you’re concerned about online sexual abuse or the way someone has been communicating with a child, contact the Child Exploitation and Online Protection (CEOP) command.