In this guest blog, our colleague John Carr brings us some good news on the latest moves from Apple, which will help protect children from sexual exploitation on the internet. You can find all of John’s blogs at Desiderata.

“There has been great rejoicing at ECPAT International’s global HQ in Bangkok. Last week the work we have been doing with our partners around strong encryption received an enormous boost when Apple made a hugely important announcement about their plans to keep children safe online. Not everyone likes it but we love it.

The cat is out of the bag

The cat is now very definitely out of the bag. Apple has confirmed a core contention advanced by ECPAT: there are scalable solutions available which do not break encryption and which respect user privacy while at the same time bearing down significantly on certain types of criminal behaviour, in this case terrible crimes which harm children.

If people believe Apple or malevolent Governments could misuse the technology, that is an extremely important point. However, it is a different one. It speaks to how we regulate or supervise the internet. It is emphatically not an argument which allows companies to continue doing nothing to curb illegality where technology exists which allows them to do so. Equally it is not an argument for Apple to “uninvent” what it has already invented.

What it is is an argument for Governments and legislatures to catch up. Quickly. 

In the world of tech, alibis for inaction are always thick on the ground. Apple should be applauded. They have not just moved the needle, they have given it an enormous and wholly beneficial shove. The company has not moved fast and broken things. It has taken its time and fixed them.

So what is Apple planning to do?

Apple’s announcement contained three elements. Later this year, in the next version of their operating system, first in the USA and then country by country, they will:

  1. Limit users’ ability to locate child sexual abuse material (CSAM) and warn about online environments which are unsafe for children.
  2. Introduce new tools to help parents help their children stay safe in relation to online communications, in particular by warning about sensitive content which may be about to be sent or has been received.
  3. Enable the detection of CSAM on individual devices before the image enters an encrypted environment. This will make it impossible for the user to upload CSAM or distribute it further in any other way.

Number three is what has prompted the greatest outcry.
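To make that third element a little more concrete, here is a loose Python sketch of what on-device matching before upload looks like in outline. It is emphatically not Apple’s actual system: the names used here (perceptual_hash, KNOWN_BAD_HASHES, upload_to_encrypted_store) are hypothetical, and Apple’s announced design uses a perceptual NeuralHash matched against a blinded hash database with threshold secret sharing, not a plain lookup. The sketch only shows the general shape of the idea: check the image against known hashes on the device, and only then let it enter the encrypted store.

```python
# A loose, hypothetical sketch of client-side matching before upload.
# Apple's real design (NeuralHash, blinded database, safety vouchers,
# threshold secret sharing) is far more involved than this.

import hashlib

# Stand-in for a database of hashes of known illegal images; in Apple's
# announced design this arrives on the device in blinded form.
KNOWN_BAD_HASHES = {
    "3f2a...",  # placeholder digests, not real values
}

def perceptual_hash(image_bytes: bytes) -> str:
    """Placeholder. A real system uses a perceptual hash that stays
    stable under resizing and re-encoding; a cryptographic hash such as
    SHA-256 only matches byte-identical files and is used here purely
    to keep the sketch runnable."""
    return hashlib.sha256(image_bytes).hexdigest()

def upload_to_encrypted_store(image_bytes: bytes) -> None:
    """Stand-in for encrypting the image and uploading it (e.g. to iCloud)."""
    ...

def scan_then_upload(image_bytes: bytes) -> bool:
    """Match the image on-device *before* it enters the encrypted
    environment. Returns True if the upload went ahead."""
    if perceptual_hash(image_bytes) in KNOWN_BAD_HASHES:
        # In Apple's announced design a match produces a cryptographic
        # "safety voucher"; only after a threshold of matches can the
        # vouchers be decrypted and reviewed by a human.
        return False
    upload_to_encrypted_store(image_bytes)
    return True
```

The key point the sketch illustrates is the ordering: the check happens on the device, before encryption, so nothing about the encryption itself needs to be weakened.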

A game changer

The mere fact a company like Apple has acknowledged they have a responsibility to act in this area, and have come up with a scalable solution, fundamentally changes the nature of the debate. Now we know something can be done, the “it’s not possible” position has been vanquished. Any online business which refuses to change its ways will likely find itself on the wrong side of public opinion and, probably, the law, as legislators around the world will now feel emboldened to compel firms to do what Apple has voluntarily chosen to do.

And all the angst?

Several commentators who otherwise appeared to express sympathy for Apple’s stated objective couldn’t quite resist trying to take the shine off their coup de théâtre by complaining about the way they did it. 

However, Facebook’s unilateral announcement in 2019 that it intended to do the exact opposite of what Apple is now proposing suggests the possibility of reaching an industry consensus was wholly illusory.

I am sure many “i’s” need to be dotted and many “t’s” need to be crossed, but sometimes I feel that, when it comes to protecting children, everything has to be flawless out of the traps. It is OK for Big Tech to move fast and break things everywhere else and fix them later, or not. But that cannot be allowed to happen in this department. It is OK to innovate madly, but not here. We are judged by a different standard.

Don’t get me wrong. I am not in favour of imperfection. I do not applaud innovation that pays no heed to the downside. 

The simple truth, though, is this whole business has been about values and priorities. It is wholly binary. Either you think steps should be taken to minimise risks to children before content is encrypted or you don’t. There is no middle way because when it is encrypted it is invisible forever. The bad guys win. Apple has shown how they lose.

Good news from Apple

Encryption is not broken. No new data is being collected or exploited

In a further statement issued yesterday, Apple makes it abundantly clear it is not breaking any kind of encryption. The company also makes clear its technology is limited in scope and will not be used for any other purpose.

If you don’t believe that, we are back to the point I made earlier. Let’s discuss it, but whatever the outcome of that discussion might be, Apple must be allowed and encouraged to carry on. I eagerly await hearing other companies pledge to follow in their footsteps. Soon.”

Letter in Financial Times

On 12 August 2021 John Carr also published a letter about this issue in the Financial Times. He applauded Apple’s decision concerning its plans to limit the possibility of child sexual abuse material being distributed via its devices or network. John also suggested it will force Facebook to reconsider its extremely bad intentions. The full text of the letter is…

“In his article about Apple’s plans to introduce new child protection policies, Richard Waters suggests the way Apple went about it had “cut short debate” about the potential impact of their planned measures (Opinion, August 10). 

Specifically Waters refers to Apple’s plan to inspect content on users’ devices before it is uploaded and placed into a strongly encrypted environment such as iCloud. Apple is going to do this in order to ensure the company is not aiding and abetting the distribution of child sexual abuse material. 

Sadly the “debate” has been going for at least five years and for the greater part of that time it has been completely frozen. Things intensified when, in March 2019, Facebook announced it (was) going to do the exact opposite of what Apple is now proposing. That too was a unilateral decision, made all the worse because, unlike with Apple, it was against a well-documented background of Facebook already knowing that its currently unencrypted Messenger and Instagram Direct platforms were being massively exploited for criminal purposes. 

In 2020 there were 20,307,216 reports to the US authorities of child sexual abuse material which had been exchanged over either Messenger or Instagram, but Facebook has so far given no sign that it will row back. 

A choice must be made

The argument is, I’m afraid, a binary one. Once material is strongly encrypted it becomes invisible to law enforcement, the courts and the company itself. So either you are willing to live with that or you are not. Facebook is. Apple isn’t. 

However, I suspect Apple’s decision will force Facebook and others to reconsider. There are scalable solutions available which can respect user privacy while at the same time bearing down on at least certain types of criminal behaviour, in this case terrible crimes which harm children.

If people believe Apple or indeed malevolent governments could misuse the technology, that is an important but different point which speaks to how we regulate or supervise the internet. It is emphatically not an argument which allows companies to continue doing nothing to curb illegality where technology exists which allows them to do so. Apple should be applauded. It has not just moved the needle, it has given it an enormous and wholly beneficial shove.”