A UK data privacy group has approached regulators seeking to stop Meta from scraping user data to train its AI models. Open Rights Group (ORG) said Meta changed its privacy policy on June 26, allowing the company to use content shared by its users as a source for developing AI systems.
ORG is a member-based organization that advocates for privacy and data protection. The group said that Meta would “rely on the legal basis called legitimate interests” to use personal information for its AI development. The privacy policy change could affect up to 50 million Facebook and Instagram users in the United Kingdom, it added.
Meta accused of ‘overriding user rights’
ORG wrote to the UK’s Information Commissioner’s Office requesting an investigation into the “undefined” nature of Meta’s AI uses, users’ inability to opt in or out, and the processing of data without consent. The formal complaint accuses Meta of “overriding user rights and legitimate expectations”.
According to the Open Rights Group, Meta emailed UK Facebook and Instagram users at the end of May, informing them that it intended to use their personal data to build its generative artificial intelligence models. Mariano delli Santi, the complainant and legal and policy officer at ORG, said:
“While Meta told users they had the right to object, it did not commit to honoring objections as a matter of course. Once a user’s data has been used by the company, it will likely be irreversible, so consent cannot be applied retrospectively.”
“It’s not acceptable that the company is making a half-hearted attempt to enable people to opt out rather than give their consent to such intrusive data processing,” he added.
Delli Santi suggested that Meta’s proposals violate the UK’s General Data Protection Regulation (GDPR) “on several levels.” He wants the regulator “to investigate thoroughly and stop them [Meta] once and for all.”
Meta says it complies with EU regulations
The Open Rights Group complaint is similar to the one made in the European Union by None of Your Business, a Europe-based data privacy pressure group. The action forced Meta to halt plans to train its AI programs using posts made by people in the EU on Facebook and Instagram. The decision also delayed the launch of Meta AI in Europe.
While the United Kingdom’s GDPR closely resembles the European Union’s, the EU decision does not apply to Meta in the UK because Britain left the economic and political bloc in 2020 through Brexit.
After the None of Your Business complaint, Stefano Fratta, Meta’s global engagement director of privacy policy, said the company is “highly confident that our approach complies with European laws and regulations. AI training is not unique to our services, and we’re more transparent than many of our industry counterparts.”