The age of online users really matters. What has happened in the world in 2025 to help prevent children from having easy access to online pornography? In this guest blog by our dear colleague John Carr OBE, we have the full story. Enjoy!

John’s thoughts on the age of online users…

I am not going to try to provide an overview of everything that happened in the online child protection world in 2025. There are too many moving parts, and they are still moving. But, unquestionably, last year was peppered with landmark moments.

Here are some of them….

A more age-aware internet is approaching fast

I have done a fuller note on the history of the emerging age-aware internet. I will put it up tomorrow or very soon. Meanwhile…

The porn problem and age verification

Early exposure to porn opens up multiple pathways to various forms of harm for young people. Despite constant assertions that they didn’t want children on their sites, for over thirty years online porn companies declined to do anything serious to keep children out.

Mandatory age verification was the obvious answer. Texas concurred.

In June 2025, in Free Speech Coalition v Paxton, the US Supreme Court upheld the constitutionality of a law from the Lone Star State. Age verification for porn sites publishing in Texas is now compulsory. Many other US states are following suit.

The way some states are taking up the cudgels is problematic, but in Paxton all of the US Supreme Court Justices acknowledged there is no insurmountable privacy or First Amendment objection to using age verification as a means of protecting children on the internet, at least in respect of pornography, although it seems unlikely the use of age verification will be limited to porn.

Had the Supreme Court decision gone the other way, it would not only have been bad for children in America, it would have set back the politics of age verification everywhere, and probably not just in respect of pornography. Nothing would have come to a complete stop, but everyone would have had to work a lot harder and longer.

So in June, when the decision in Paxton was handed down, we all let out a big “phew!”. However, looking at those who came together or worked to support Texas, the enduring and obvious point is that, across the political spectrum, there is a broad coalition of interests. That coalition is willing to put aside partisan or ideological considerations and work co-operatively on at least parts of a positive online child protection agenda.

A lesson for us all

Which further issues, if any, such a coalition can be reconstructed around will be of great interest to, and very instructive for, children’s advocates across the globe. There are some fights children’s groups alone cannot win. That’s a sad but indisputable fact.

Meanwhile in dear old Blighty

Staying with online pornography, an adverse decision in Paxton would not have helped in the UK. We had already legislated to similar effect, but it could have reopened aspects of the debate. We dodged that bullet.

In July all the relevant provisions of the UK’s Online Safety Act 2023 finally became operative. Age verification is now a legal requirement for porn sites publishing online in Britain.

The regulator, Ofcom, has begun formal enforcement action to secure compliance by 22 porn companies that appear not to have received the memo. About sixty others are being investigated.

Ofcom also announced that each of the top 10 most-visited pornography sites, and most of the top 100, were now deploying age-verification systems for UK users. This meant over three-quarters of daily traffic to the top 100 most popular porn sites was subject to age verification. That looks like a result.

Are many children using VPNs to try to circumvent the new laws? I doubt it; at least, younger children won’t be. But UK laws are clear: porn sites must deploy “highly effective” age verification solutions, and a solution which fails to detect and block a VPN is unlikely to pass muster. Ofcom will be watching the VPN issue closely.

Elsewhere

In Australia, comparable legislation in respect of pornography sites and age verification has now cleared its final Parliamentary hurdles and is scheduled to come into force in early 2026.

In the European Union, within the wider framework of the GDPR and the DSA, individual Member States are energetically pressing ahead with measures to protect children from online pornography, notably Germany, France, Italy, Spain and Poland. This has been accompanied by major technological advances in age verification technologies, particularly those which use zero-knowledge proofs.
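The zero-knowledge approaches mentioned above are too involved for a blog post, but the privacy property they aim for can be illustrated simply: the site only ever receives a yes/no answer to “is this user over 18?”, never a name or a date of birth. Below is a minimal, purely illustrative Python sketch of that data-minimising pattern. It uses a simple keyed hash in place of a real zero-knowledge proof, and the provider, keys and function names are all hypothetical, not a description of any actual deployed system.

```python
# Illustrative only: a data-minimising age check in the spirit of (but far
# simpler than) the zero-knowledge systems mentioned above. All names here
# are hypothetical.

import hashlib
import hmac
import json
import secrets
import time

# Shared secret between a hypothetical age-check provider and the site.
# A real system would use an asymmetric signature or a genuine ZK proof,
# so the two parties would not need to share a secret at all.
PROVIDER_KEY = secrets.token_bytes(32)


def issue_attestation(is_over_18: bool) -> dict:
    """The provider checks the user's age privately and returns only a
    signed yes/no claim: no name, no date of birth, no identity."""
    claim = {"over_18": is_over_18, "issued_at": int(time.time())}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}


def site_accepts(attestation: dict, max_age_seconds: int = 600) -> bool:
    """The site verifies the tag and freshness; it never learns who the user is."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, attestation["tag"]):
        return False
    if time.time() - attestation["claim"]["issued_at"] > max_age_seconds:
        return False
    return attestation["claim"]["over_18"]


if __name__ == "__main__":
    token = issue_attestation(is_over_18=True)
    print(site_accepts(token))  # True, yet the site saw no personal data
```

The point of the sketch is the shape of the exchange rather than the cryptography: the age check happens with the provider, and the only thing that crosses to the adult site is a short-lived, verifiable "over 18" claim.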

Moving on from porn but going back Down Under….

The big Australian experiment

Since early December 2025, various major social media platforms operating in Australia have been obliged to prevent children below the age of 16 from opening or retaining accounts with them. It’s a huge step. The world is watching. If there are demonstrable positive outcomes, other nations will give it a go, and no doubt schools and parents generally will take note.

In some quarters the Australian authorities have been criticised for doing what they did, the implication being that the critics thought the status quo ante was preferable.

Why 13 years of age?

So I have to ask: where’s the magic in the status quo ante of 13? That wasn’t a legally defined minimum age in Oz. It’s just the one they ended up with, for the same reason almost everyone else did.

13 is an age plucked from the 20th Century, when social media did not exist in anything approaching its current forms.

Based on research carried out in America in 1997/98, the Children’s Online Privacy Protection Act 1998 became Federal law, binding on all US-based businesses in respect of their activities in the USA and elsewhere. It nominated 13 as the threshold age for joining an online commercial service that required someone to surrender personal information about themselves, usually for the purposes of serving ads. That was the top and bottom of it. But because US companies dominated the internet, 13 became a de facto global standard. Not much science there.

The research referred to did not consider, and could not have considered, issues such as bullying, misogyny, body image, self-harm, “doom scrolling”, misinformation, addiction, extremism, and so on.

These are just some of the problems that have not been sufficiently ameliorated by improved media literacy initiatives, whether in the home or elsewhere. The Australian Parliament felt too many families were still floundering. Maybe trying something new would help?

It’s called innovation.

Age in the EU

A little reminder about the position in the EU wouldn’t come amiss here.

In 2016 the GDPR set the new default age for joining social media (and other) services at 16. However, individual Member States were allowed to choose any age between 13 and 16. Hmmm.

Please don’t ask what research was done, when, where and by whom, to justify or explain why the EU plumped for 16 or allowed a range of ages, but it is obvious there was no clear-cut consensus.

This is how things currently stand….

13

Only 8 countries stuck with 13. (Belgium, Denmark, Estonia, Finland, Ireland, Latvia, Portugal, Sweden).

Together the combined populations of these countries constitute about 12% of the total population of the EU.

14

5 countries (Austria, Cyprus, Italy, Lithuania, Spain)

15

4 countries (Czech Republic, France, Greece, Slovenia)

16

10 countries (Bulgaria, Croatia, Germany, Hungary, Luxembourg, Malta, The Netherlands, Poland, Romania and Slovakia) either positively chose 16 or did nothing. In these cases the default of 16 kicked in.

Between them these ten countries contain around 40% of the whole EU population. That’s 180+ million people.

Australia’s population is less than 30 million.

Are people OK with 16 but only if there is no possibility of it being enforced? I’ll leave that with you.

My 2026 agenda

It’s complicated.

Watch this space.

See the original here.