Instagram’s Child Safety Risks Prompt More Scrutiny from European Authorities

Meta has received another formal request for information (RFI) from European Union regulators seeking more details about its response to child safety concerns on Instagram, including what it's doing to tackle risks related to the sharing of self-generated child sexual abuse material (SG-CSAM) on the social network.

The Digital Services Act (DSA) began applying to larger in-scope platforms, including Instagram, in late August, putting obligations on Big Tech to tackle illegal content and protect minors.

The latest request to Meta comes hard on the heels of a Wall Street Journal (WSJ) report suggesting Instagram is struggling to clean up a CSAM problem the paper exposed this summer. In June, following the WSJ's exposé, the EU warned Meta that it risks "heavy sanctions" if it doesn't act quickly to tackle the child protection issues. Now the WSJ's follow-up report claims Meta has failed to rectify the issues identified.

Poor performance on tackling the sharing of illegal CSAM/SG-CSAM could prove expensive for Meta in the EU. The company was already fined just under half a billion dollars after Instagram violated the bloc's data protection rules for minors. Meta has been given a deadline of December 22 to provide the Commission with the latest requested child safety data.