EU Proposes Legislation to Force Scanning of User Data for Child Sexual Abuse Material

The European Commission has proposed new EU legislation that will force companies to scan private user data for child sexual abuse material. If adopted, the move will effectively kill the privacy of digital correspondence.

The proposed rules will oblige providers to detect, report and remove child sexual abuse material on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.

Those rules will include:
● Mandatory risk assessment and risk mitigation measures: Providers of hosting or interpersonal communication services will have to assess the risk that their services are misused to disseminate child sexual abuse material or for the solicitation of children, known as grooming. Providers will also have to propose risk mitigation measures.
● Targeted detection obligations, based on a detection order: Member States will need to designate national authorities in charge of reviewing the risk assessment. Where such authorities determine that a significant risk remains, they can ask a court or an independent national authority to issue a detection order for known or new child sexual abuse material or grooming. Detection orders are limited in time, targeting a specific type of content on a specific service.
● Strong safeguards on detection: Companies having received a detection order will only be able to detect content using indicators of child sexual abuse verified and provided by the EU Centre. Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.
● Clear reporting obligations: Providers that have detected online child sexual abuse will have to report it to the EU Centre.
● Effective removal: National authorities can issue removal orders if the child sexual abuse material is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions.
● Reducing exposure to grooming: The rules require app stores to ensure that children cannot download apps that may expose them to a high risk of solicitation of children.
● Solid oversight mechanisms and judicial redress: Detection orders will be issued by courts or independent national authorities. To minimise the risk of erroneous detection and reporting, the EU Centre will verify reports of potential online child sexual abuse made by providers before sharing them with law enforcement authorities and Europol. Both providers and users will have the right to challenge any measure affecting them in Court.

“The idea that all the hundreds of millions of people in the EU would have their intimate private communications, where they have a reasonable expectation that that is private, to instead be kind of indiscriminately and generally scanned 24/7 is unprecedented,” said Ella Jakubowska, policy adviser at European Digital Rights (EDRi), a network of 45 nongovernmental organizations.

Home Affairs Commissioner Ylva Johansson has claimed that technical solutions exist to keep conversations private while finding illegal content, a claim cybersecurity experts have repeatedly discredited.

“The EU shouldn’t be proposing things that are technologically impossible,” said Jakubowska.

Last year, Apple announced it would start scanning users' photos and messages for child pornography. The move was viewed as a shocking about-face for users who had relied on the company's leadership in privacy and security. Outrage was immediate from users, researchers, and organizations like the Electronic Frontier Foundation. Eventually, Apple abandoned its plan; however, unless there is strong pushback in the EU, it's likely the company will reintroduce it.


Read More [via Politico]
