
Combating child sexual abuse online: the controversy of COM(2022)209

by Matilde Serena


CW // child sexual abuse


On 11 May 2022, the European Commission proposed a new regulation aimed at preventing and combatting child sexual abuse online: proposal COM(2022)209. The figures relating to child sexual abuse online are concerningly high: in 2021 alone, 85 million pictures and videos depicting sexual abuse were reported worldwide. Due to the absence of harmonized rules at EU level, detection and reporting, e.g. by digital service providers, take place on a voluntary basis. Arguably, this lack of harmonization slows down the fight against child sexual abuse online and leaves companies operating in the field of digital services subject to different (and often contradictory) rules. With this proposal, the EU calls for a comprehensive response to the growing threat of child sexual abuse online by improving prevention and investigation and by rendering better assistance to victims. According to the EU Commissioner for Home Affairs, Ylva Johansson, “Detection, reporting and removal of child sexual abuse online is urgently needed to prevent the sharing of images and videos of the sexual abuse of children, which retraumatises the victims often years after the sexual abuse has ended”. This post examines the current legal framework in place to combat child sexual abuse online, the COM(2022)209 proposal, and its controversial character.


When examining existing EU legislation to combat child sexual abuse material (“CSAM”), the “2020 EU strategy for a more effective fight against child sexual abuse” was instrumental in setting out the need for the implementation and development of an efficient and effective legal framework. At present, the prevailing legislation in the context of combatting online CSAM is the Interim Regulation. The latter essentially provides for temporary derogations from the e-Privacy Directive, which protects the confidentiality of traffic data and interpersonal communications. These derogations allow online service providers to voluntarily detect, report and remove CSAM. The practice of allowing temporary derogations had already started in 2020 with the European Electronic Communications Code. Other relevant instruments are the Digital Services Act (increasing digital service providers’ responsibility for the content on their platforms) and the General Data Protection Regulation (setting out the requirements for the lawful processing of children’s personal data). COM(2022)209 therefore draws on these instruments and aims at the creation of a harmonized and cohesive legislative framework.


This post now turns to the proposed piece of legislation itself. Firstly, COM(2022)209 sets out the basis for the creation of a new independent EU Centre on Child Sexual Abuse (“EU Centre”), namely a hub of experts. The envisaged tasks of the EU Centre would be the provision of reliable information on identified CSAM and the reception and analysis of reports from providers. Moreover, the EU Centre would distinguish between erroneous and relevant reports, forwarding only the relevant ones to the competent law enforcement authorities and providing support to victims. The EU Centre would therefore help both online service providers, by providing guidelines on how to comply with their new obligations, and national law enforcement authorities, by reviewing reports from providers.


Secondly, the new rules proposed in COM(2022)209 entail mandatory risk assessment and risk mitigation measures, targeted detection obligations based on detection orders, strong safeguards on detection, clear reporting obligations, effective removal of CSAM, reduction of exposure to grooming, solid oversight mechanisms, and judicial redress.

Online service providers will have to carry out an assessment of the risk that their services are used for the dissemination of CSAM or for grooming (soliciting children to provide CSAM). Moreover, designated national authorities of EU Member States will review these risk assessment reports and may request detection orders targeting a specific type of content on a specific service for a limited time. Most importantly, national authorities will be able to issue removal orders if the service provider in question fails to take down the identified CSAM, as well as blocking orders requiring internet access providers to disable access to images and videos that cannot be taken down.


Nevertheless, this proposal did not come without criticism. In particular, the European Commission has been accused of threatening to end encrypted communication (a method that renders data unreadable to unauthorized third parties) by imposing extremely strict requirements on messaging apps. This criticism derives from the fact that the proposed regulation requires messaging services and web hosts to search for and report CSAM. This would also apply to end-to-end encrypted messaging services such as iMessage and WhatsApp, where encryption is precisely supposed to prevent this kind of surveillance practice.
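To make the technical tension concrete, the sketch below shows, in heavily simplified form, why end-to-end encryption and server-side scanning are at odds: the provider relaying a message only ever sees ciphertext. This is a minimal illustration assuming a symmetric key already shared by the two endpoints; real messengers such as WhatsApp and iMessage use far more elaborate protocols (e.g. the Signal protocol), and the message content here is purely illustrative.

```python
# Minimal sketch of why end-to-end encryption blocks provider-side scanning.
# Requires the `cryptography` package (pip install cryptography). Real
# messaging apps negotiate keys via protocols like Signal; here we simply
# assume the two endpoints already share a secret key.
from cryptography.fernet import Fernet

# The key exists only at the two endpoints, never on the provider's servers.
key = Fernet.generate_key()
sender = Fernet(key)

ciphertext = sender.encrypt(b"photo bytes or message text")

# The service provider relaying the message sees only this: opaque bytes,
# with nothing it could scan for CSAM.
print(ciphertext)

# Only the recipient, holding the same key, can recover the content.
recipient = Fernet(key)
assert recipient.decrypt(ciphertext) == b"photo bytes or message text"
```

Any detection obligation on such a service would therefore have to happen before encryption (on the user's device) or after decryption, which is why critics describe it as a break with end-to-end encryption rather than an addition to it.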

Although systematically preventing avoidable child abuse and grooming wherever it takes place, including on private messaging platforms, is a pressing issue, it is questionable whether this should require companies to set up infrastructures for the detailed analysis of user messages, effectively creating a massive surveillance system. Nevertheless, it must be noted that COM(2022)209 requires companies to employ detection technologies only for the purpose of identifying CSAM and in the least privacy-intrusive fashion. Additionally, the EU Centre would scrutinize reports from companies and national authorities to minimize errors.


Big Tech companies such as Apple are no strangers to the idea of combatting CSAM within their digital services. In fact, Apple had already proposed a plan to begin scanning users’ images for CSAM by means of a mechanism called “perceptual hashing”. This process entails comparing users’ photos against a database of known images of child abuse; in case of a match, the photos would be flagged and reported to law enforcement authorities. Although this feature was supposed to be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, Apple abandoned the plan due to the negative feedback received from consumers, advocacy groups, and researchers. Against this background, it seems reasonable to expect that approval of COM(2022)209 could lead Apple and other Big Tech companies to revive their plans to prevent and combat child sexual abuse online. A year after its publication, COM(2022)209 is still being scrutinized and discussed by the competent EU bodies, as required by the ordinary legislative procedure. It must be noted that the afore-mentioned derogations allowed under the e-Privacy Directive and the European Electronic Communications Code will cease to exist upon the passing of COM(2022)209.
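For readers curious how perceptual hashing differs from ordinary cryptographic hashing, here is a toy sketch of an “average hash”: visually similar images produce similar bit strings, so known material can still be matched after resizing or re-encoding. Production systems such as Apple’s NeuralHash or Microsoft’s PhotoDNA are far more sophisticated; this is only an illustration of the principle, and the synthetic images stand in for a reference database and a user upload.

```python
# Toy "average hash" (aHash), a simple perceptual hash. Unlike a cryptographic
# hash, small visual changes (resizing, recompression) barely change the
# fingerprint, which is what makes matching against known imagery possible.
# Requires Pillow (pip install Pillow).
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Shrink to an 8x8 grayscale grid and encode each pixel as one bit:
    1 if brighter than the mean, 0 otherwise (a 64-bit fingerprint)."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits; a small distance means 'visually similar'."""
    return bin(h1 ^ h2).count("1")

# Demo with synthetic images: a gradient stands in for a known image, and a
# resized copy simulates the same picture after re-encoding by a user.
original = Image.linear_gradient("L").resize((256, 256))
recompressed = original.resize((128, 128))

known_hashes = {average_hash(original)}   # toy stand-in for a hash database
candidate = average_hash(recompressed)

if any(hamming_distance(candidate, h) <= 5 for h in known_hashes):
    print("Match above threshold: flag for human review / report")
```

The privacy debate turns precisely on the matching step: the comparison must run somewhere with access to the unencrypted image, whether on the user’s device or on the provider’s servers.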


To conclude, COM(2022)209 reflects the pressing need to prevent and combat child sexual abuse online, within the EU and worldwide. Nevertheless, enacting EU legislation on this matter requires a delicate balancing exercise between users’ privacy and the need to protect children. In the future, it will be interesting to examine the response of Big Tech companies to developments in this field.


Sources:


A Hern, ‘Apple delays plans to scan cloud uploads for child sexual abuse images’ (The Guardian, 3 September 2021) <https://www.theguardian.com/technology/2021/sep/03/apple-delays-plans-to-scan-cloud-uploads-for-child-sexual-abuse-images> accessed 24 April 2023.


A Hern, ‘Planned EU rules to protect children online are attack on privacy, warn critics’ (The Guardian, 12 May 2022) <https://www.theguardian.com/society/2022/may/12/planned-eu-rules-to-protect-children-online-are-attack-on-privacy-warn-critics> accessed 24 April 2023.


Commission Staff Working Document of 11 May 2022, ‘Impact Assessment Report accompanying the document Proposal for a Regulation of the European Parliament and the Council laying down rules to prevent and combat child sexual abuse’, SWD (2022) 209 final.


Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) OJ L 201.


Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast) OJ L 321.


European Commission, ‘Fighting child sexual abuse: Commission proposes new rules to protect children’ (Press Release, 11 May 2022) <https://ec.europa.eu/commission/presscorner/detail/en/ip_22_2976> accessed 24 April 2023.


EU Monitor, ‘COM(2022)209 - Regulation: Rules to prevent and combat child sexual abuse’ <https://www.eumonitor.eu/9353000/1/j9vvik7m1c3gyxp/vlsvpfrr9fzl> accessed 24 April 2023.


J Rossignol, ‘Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos’ (MacRumors, 7 December 2022) <https://www.macrumors.com/2022/12/07/apple-abandons-icloud-csam-detection/> accessed 24 April 2023.


M Negreiro, ‘Briefing: EU Legislation in Progress. Combating child sexual abuse online’ (European Parliamentary Research Service, December 2022) <https://www.europarl.europa.eu/RegData/etudes/BRIE/2022/738224/EPRS_BRI(2022)738224_EN.pdf> accessed 24 April 2023.


Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse of 11 May 2022, COM(2022) 209 final, 2022/0155 (COD).


Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) OJ L 119.


Regulation (EU) 2021/1232 of the European Parliament and of the Council of 14 July 2021 on a temporary derogation from certain provisions of Directive 2002/58/EC as regards the use of technologies by providers of number-independent interpersonal communications services for the processing of personal and other data for the purpose of combating online child sexual abuse OJ L 274.


Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277.


 
 
 
