Finland

In August 2020 the Finnish National Audiovisual Institute, KAVI, published a report on parental engagement with the system of recommended age limits for children viewing different kinds of content. It found higher levels of parental engagement, and greater adherence to the recommended limits, among parents of younger children. The code only applies to broadcast media and officially classified content, such as film, television and games. It does not apply to pornography on the Internet.

Key new research

While Finland is far from world-leading in its legislative approach to age verification, it has other strengths. The civil society group Protect Children has recently carried out unprecedented research on users of child sexual abuse material, or CSAM, on the dark web. The results of this research are highly significant: they give the whole world additional motivation to separate children from pornography consumption.

Dr. Salla Huikuri, researcher and project manager at the Police University College, Finland, has commented: “Systematic research on child sexual abusers’ interactions in the dark web is of paramount importance while fighting CSAM use and online violence against children.”

Protect Children’s research into the dark web is revealing unprecedented data on CSAM users. Called the ‘Help us to help you’ survey, it was conducted as part of the two-year ReDirection project, funded by End Violence Against Children, and was answered by over 7,000 respondents.

The ‘Help us to help you’ survey, based on cognitive behavioural theory, asks users of CSAM about their behaviour, thoughts and emotions related to their use of CSAM. The data gathered has provided invaluable insight into the thoughts, habits and activities of CSAM users.

The project’s Legal Specialist in Finland commented: “We have seen that our ReDirection survey itself has served as an intervention for many CSAM users. Responding has allowed many to re-evaluate their behavior, thoughts, and emotions related to the use of CSAM.”

Escalation to CSAM Viewing

The survey also found substantial evidence to suggest that escalation of pornography use can lead individuals to view more extreme harmful content, including images of child sexual abuse.

The preliminary research has uncovered key findings, including that a majority of CSAM users were children themselves when they first encountered CSAM. Approximately 70% of users first saw CSAM when they were under 18, and approximately 40% when they were under 13. Additionally, users predominantly view CSAM depicting girls: approximately 45% of respondents said they use CSAM depicting girls aged 4-13, whilst approximately 20% said they use CSAM depicting boys aged 4-13.

Help to quit CSAM viewing

The preliminary results have shown that approximately 50% of the respondents have at some point wanted to stop their use of CSAM, but have been unable to do so. A majority, approximately 60% of respondents, have never told anyone about their use of CSAM.

Tegan Insoll, Research Assistant, said: “The results show that many individuals are motivated to change their behaviour, but have been unable to do so. The new data highlights the urgent need for the ReDirection Self-Help Program, to provide them with the help they need to stop their use of CSAM and ultimately protect children from sexual violence online.”

In June 2021, Protect Children was invited to join the expert roundtable discussion hosted by WePROTECT Global Alliance and the International Justice Mission’s Center to End Online Sexual Exploitation of Children. The discussion was called ‘Framing online child sexual abuse and exploitation as a form of human trafficking – opportunities, challenges, and implications’.

In light of the discussions on livestreaming, Protect Children took the opportunity to begin gathering new data on the use of livestreamed CSAM. Again, it will cover the whole world, not just Finland. Preliminary data from this new questionnaire has already been gathered, showing valuable results within a short time.

For other recent news on efforts to counter rising use of CSAM, see John Carr’s excellent blog.
