Google, Meta, Microsoft and Snap will maintain efforts to detect child sexual abuse material (CSAM) using existing tools following the expiry of the ePrivacy derogation in the EU, the companies said in a joint announcement.
The firms said voluntary measures, including hash-matching technology, have for years been used to detect, remove and report CSAM and to protect victims and survivors.
The companies warned that the derogation's expiry "clouds the legal certainty that has helped responsible platforms try to protect our communities, safeguard child victims, and preserve the integrity of our services," adding: "We are disappointed by this irresponsible failure to reach an agreement to maintain established efforts to protect children online."
The signatories reaffirmed that they "will continue to take voluntary action on our relevant Interpersonal Communication Services" and urged EU institutions to conclude negotiations on a regulatory framework without delay. They also invited stakeholders to a webinar at 3pm CET explaining how hash-matching and CSAM detection tools work.
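Hash-matching, the core technique the companies reference, is conceptually straightforward: a hash of an uploaded file is compared against a database of hashes of previously verified material, so the material itself never needs to be re-inspected. The sketch below is purely illustrative and is not any company's actual pipeline; production systems use perceptual hashes (such as Microsoft's PhotoDNA) that tolerate resizing and re-encoding, whereas a cryptographic hash is used here only to show the matching step, and the sample hash set is hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known, verified material.
# The entry below is the SHA-256 digest of the bytes b"foo",
# used purely as a stand-in for demonstration.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def matches_known_hash(content: bytes) -> bool:
    """Return True if the content's digest appears in the known-hash set."""
    digest = hashlib.sha256(content).hexdigest()
    return digest in KNOWN_HASHES

# An upload pipeline would run this check before a file is stored or sent.
print(matches_known_hash(b"foo"))  # matches the sample digest above
print(matches_known_hash(b"bar"))  # unknown content, no match
```

Because only digests are compared, a match can trigger removal and reporting without a human reviewing every upload, which is why the companies describe the technique as privacy-preserving.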
The recap
- Companies say they will continue CSAM detection after derogation expiry
- Almost 250 child rights organizations share the companies' concern
- Companies urge EU institutions to conclude regulatory negotiations urgently