Once again we turn to our colleague John Carr OBE for an informed view on how the new Online Safety Act 2023 will operate in practice to protect children. In Part Two of his blog Desiderata he sets out the details. Part One is available here. John is one of the world’s leading authorities on children’s and young people’s use of digital technologies. He is Senior Technical Adviser to the Bangkok-based global NGO ECPAT International and is Secretary of the UK’s Children’s Charities’ Coalition on Internet Safety. John is, or has formerly been, an Adviser to the Council of Europe, the UN (ITU), UNODC, the EU and UNICEF. John has advised many of the world’s largest technology companies on online child safety.
“The UK’s new online child protection framework. Part 2
I have decided there is no point writing at length about the provisions of the final text of the Online Safety Bill. Many hundreds of pages would be required. Lawyers and others are probably already halfway through their first draft. Here I will give only the headlines so you get the feel of what is in the new law.
The Government’s initial Impact Assessment for this legislation can be viewed here.
If you want a fuller picture of how it all finally turned out you should read the excellent account provided by the Carnegie Trust. Several law firms have also published their own synopses.
Ofcom is to be the principal Regulator of the new legal regime, although the UK’s privacy body, the ICO, and other agencies will also have important roles in the overall scheme of things. It will be interesting to see what kind of inter-agency working arrangements they establish and how well they work.
On the day the Bill completed its Parliamentary journey (19th September), Dame Melanie Dawes, Ofcom’s Chief Executive, said the following:
“Very soon after the Bill receives Royal Assent, we’ll consult on the first set of standards that we’ll expect tech firms to meet in tackling illegal online harms, including child sexual exploitation, fraud and terrorism.”
Such a clear statement of priorities is very welcome. Major elements of the new Bill will be phased in, probably over a period of up to 18 months. A lot of consultation will come first. Parliament will need to determine how it will scrutinise the operation and effectiveness of the new laws. As will we all.
Risk assessments are core
If you provide a social media service to the UK you must complete a risk assessment to determine whether, or to what extent, the service poses a risk to children and, where it does, you must take steps to mitigate those risks. This further embeds the notion of safety by design and safety by default.
Transparency rules OK!
Ignorance as to who your actual users are will no longer be an excuse, and the Regulator has powers to examine your risk assessment and take a view as to its adequacy. You will also be required to explain to users what steps you are taking to prevent and detect behaviour or postings which break the rules of the service, as stated in the Ts&Cs.
Your terms of service matter, particularly in respect of age
Ts&Cs can no longer be used simply as marketing tools. Nobody can any longer put stuff in and then fail to make any serious attempt to enforce it. If they do, they could land themselves in big trouble. Particular attention needs to be paid to a service’s stated age limits.
Proportionality rules OK!
The bigger the company, the bigger the expectations. It’s a statement of the obvious, but not everyone will be required to deploy the same level of resources to the task of keeping children safe. Context is everything. Risk assessments and evidence of actual or likely harm will be crucial.
Remove certain types of content quickly
We have already noted the priority being attached to child sexual abuse material. Relevant platforms will need to develop a capacity to identify and remove it swiftly and prevent it being reuploaded.
Content which advocates or promotes self-harm should likewise be removed quickly, and other forms of content judged to be harmful to children should not be accessible to them.
Greater protection for women and girls
Through this legislation, it will be easier to convict someone who shares intimate images without consent and new laws will further criminalise the non-consensual sharing of intimate deepfakes.
Criminal and other sanctions
In certain circumstances senior executives could face fines or imprisonment for failure to comply or for lying to the Regulator. Huge fines can be levied on non-compliant businesses: up to £18 million or 10% of global turnover, whichever is greater.
Pornography is a no-no for kids
Any site or service which provides access to pornography must make sure it is behind a robust age verification system.
App stores are within scope
Ofcom will research the role of app stores in allowing children to access harmful content, potentially with a view to requiring firms to take action to reduce risks. It is mad that app stores can categorise an app as suitable for 4/5 year olds when the app provider or the law specifies something else. And if an app appears with Apple’s or Android’s apparent seal of approval, that should mean it has passed certain basic honesty, technical and legal checks.
End-to-End Encryption
As I understand matters, as with the draft EU Regulation, the UK Bill contains no power to ban E2EE or to compel anyone not to use it. Equally, there is no power to compel a service provider to decrypt any particular message, much less all messages which pass across its network or use its app.
Having said that, in the UK at least, it should be noted that under Part 3 of the Regulation of Investigatory Powers Act 2000 (as amended) an individual can be required to disclose an encryption key or a password. Refusal to do so could lead to a prison sentence of up to 5 years in cases involving national security or child sexual abuse.
However, returning to the new legislation, where the risk assessment or actual evidence shows that a particular E2EE platform or system is being used, or is very likely to be being used, on an appreciable scale to conduct criminal activity which harms children, the provider of that E2EE service will be expected to demonstrate what steps it is taking to eliminate or minimise such large-scale illegality.
A failure or refusal to do so would render the platform liable to fines or other sanctions, but the choice of how they do it will be down to them.”