Ian Stevenson recently collaborated with International Justice Mission in the debate around Online Safety, resulting in an insightful interview with Gertjan de Jong. You can read the translated interview below, and the full original interview here.
According to Ian Stevenson, CEO of British tech company Cyacomb, legislation is urgently needed to protect children from online abuse. “This is not just about children somewhere far away, it’s about children on your street.”
The figures on online abuse are alarming. Sexually explicit images of one in five children worldwide have been shared without their consent. In the Philippines alone, nearly half a million children were abused for live streams, online photos, and videos in 2022. Despite these figures, large-scale outrage is not forthcoming, Stevenson notes. "Painfully enough, we have become accustomed to large numbers."
That is why Ian argues for telling the stories behind those figures. "Realize that behind these large numbers are individual children," he emphasises. "One problem is that many people think this only happens to children far away. However, this happens to children in your community. There are children living on your street who share a nude photo of themselves because they think they are in a relationship with someone, but are then blackmailed instead. I have met people who have lost their children to suicide after they were blackmailed with nude images. Stopping this problem means protecting your neighbours. Yes, it also means protecting a child in the Philippines, but it means protecting your children and the children on your street."
Ian is the co-founder of a company that provides tools to quickly find sexual abuse material, helping companies detect and remove illegal content. He also works with governments and police, helping them collect and track down images of abuse.
"Stopping this problem means protecting your neighbours."
The impact of online abuse is enormous, Stevenson emphasises. "I meet women in their thirties who were raped some twenty years ago and whose abuse images are still circulating on the internet. They are still approached by sick people who say: 'I will post your abuse on your social media profiles unless you send me new nude photos.'"
"One of the cases our technology was used on by police in the US was an 11-year-old girl who was sold by her parents, smuggled across the border into the US and taken to all kinds of places to be abused by groups of men. Fortunately, she was saved, thanks to the actions of online platforms that made intelligence and law enforcement investigations possible. These are the kinds of stories behind those figures. Girls become victims, but boys just as much. In blackmail (often with AI-generated images) it is just as often boys who are vulnerable."
Children must be protected from these horrific traumas; everyone agrees on that. The question is: how do you do that? Views on this vary considerably. "The conversation about online safety is not easy," notes Stevenson. "That has to do with the lobby of so-called 'privacy absolutists', people who believe in an absolute right to privacy in the online world. In no other context is there an absolute right to privacy. You have the right to privacy in your own home, but in almost every country in the world, the police are allowed to enter your house if they think a crime is being committed. Still, some believe there should be an absolute right to privacy online, and that any threat to privacy, no matter how negligible, must be opposed. You hear their voices the loudest."
And how do you explain that?
"People don't seem to want a nuanced conversation; they want a soundbite. That is the direction politics has gone, even on big and complex topics such as migration. Unfortunately, some issues are simply too complicated for soundbites. It's very easy to produce a soundbite that says, 'This is a threat to your privacy.' And yes, there is plenty of truth in it, because almost everything is a threat to your privacy. What you have to do then is have a more nuanced conversation. Okay, what do we mean by a threat to your privacy? Is this a real threat, or is it proportional? Think of metal detectors at airports. They take away some of your privacy, but they only detect metal. If a detector goes off, you are not immediately arrested or thrown in jail; an officer simply checks you. So it is a proportional violation of privacy compared to the alternative, which is planes being hijacked and falling out of the sky."
Do you understand the concerns about privacy?
"Certainly. In the dictatorships of this world, you see how disastrous it is when people have lost their privacy. There have previously been discussions within the EU about creating a general backdoor to encryption in order to detect child abuse and the like. That seems like a very dangerous path to take. That's exactly why I founded a company and created technology that can block known abuse material without fundamentally compromising privacy and security."
What technical solutions are there to combat online child abuse that are not controversial?
"Tools that report abuse to the police meet a lot of resistance. I notice there is much less resistance when tools simply block illegal material without reporting it to law enforcement.
Suppose you legally require tech companies to block existing abuse material. I expect such a law would have a hugely positive effect on reducing the circulation of known child abuse material. People will notice that ordinary images are not blocked. They'll say: 'My photos aren't blocked. I walk through the metal detector without being stopped every time.' These tools do not require a human moderator to review images. It's not complicated: blocking is automatic, with extremely few false positives – so few that they have no practical impact.
This would prevent a lot of the damage that occurs because people assume that things they see in mainstream apps must be safe or OK. Even with images of child abuse, they think it's acceptable if, for example, the material is forwarded to them in a messaging app. They wonder: I just received this on my mobile, how bad can it be? This normalisation process makes the step to more extreme offences quicker. Simply blocking the transmission of known abuse material would have a huge impact on that, without affecting free speech or human rights.
The next step may be tools that detect online abuse and do report it. Ultimately, you want to stop perpetrators, but that has to be done very carefully, and that takes time. We must build safeguards into this reporting so that, if mistakes are made, privacy is violated as little as possible."
Cyacomb focuses on blocking known child abuse material. What do you think of tools to recognise new abuse content, using artificial intelligence?
"Yes, and it is now possible to build phones or apps that are safe in this way. I think we definitely have to put this technology on children's phones. It will prevent them from sharing intimate images of themselves, accidentally or otherwise, or receiving obscene images that damage their development. This technique is also necessary in the fight against online abuse; blocking known material and detecting new material solve different pieces of the same puzzle."
The numbers are huge and you hear very intense stories about online child abuse. Do you ever feel powerless in the fight against this injustice?
"That feeling can indeed creep up on me. But whenever I feel this way, I think of a comparison by Julie Inman Grant*, the Australian eSafety Commissioner. She compares the fight for a safe internet with the fight for road safety. If you had a car accident in America in the 1960s, you probably died. If you hit something, you flew through your windshield, and if the accident wasn't fatal, the landing was. Or the steering column collapsed and went through your chest. If you were hit from behind, the fuel tank ruptured, and you died in a fire in your car. And at the time, the car manufacturers said: 'The car is great. Look at the mobility. Plan your holidays. The car gives us so much freedom. And those car accidents? They will always happen.'
Still, scientists found that some cars had a much lower rate of fatal accidents than others, because they were designed to be safer. So people said: 'Wait a minute, why don't we make all cars as safe as the safest cars?' Seat belts, for example, became mandatory, which met a lot of resistance from car manufacturers: 'You will nip innovation in the bud,' they shouted. 'Cars will become too expensive. Consumers have the right to choose what they buy; we should not force seat belts on them. We will no longer be able to make innovative designs. People will no longer be able to afford cars.'
These are all the same arguments we hear from tech companies today about online safety measures. Despite all the protests, seat belts became mandatory. And guess what? The number of road deaths fell enormously.
Julie Inman Grant's thesis is that we are currently just before the 'seatbelt moment' for online safety. The internet and social platforms are wonderful inventions; they have done a lot of good by bringing people into contact with each other. But they are also used for horrible things. Just as the first step for cars was to fit seat belts, the first step here is to block known child abuse material wherever it occurs. It is now up to the politicians. We need courageous politicians who are willing to tackle this scourge."
* Julie Inman Grant is Australia's eSafety Commissioner, responsible for leading the world's first government agency dedicated to the online safety of citizens.
What do you think of IJM's work?
"I think IJM is doing a great job right now in bridging the gap between people like me, who are technologists and domain experts, and European governments and the European Parliament. IJM brings technologists and policymakers together with a deep understanding of the problem and a strong motivation to solve it. That's why I love working with IJM in any way I can. We are fighting against the same injustice and are different pieces of the solution. We can develop great technology against online child abuse, but without policy advocacy, these tools will remain on the shelf. That's where IJM comes in as a bridge and connector."
"We need courageous politicians who are willing to tackle this scourge of violence."