Meta has faced a backlash in the EU over its plans to use personal data to train its AI models without user consent.
On June 5, Meta Platforms was hit with 11 complaints over changes to how it uses personal data to train its artificial intelligence models without obtaining user consent. The changes may violate EU privacy regulations.
Privacy advocacy group None of Your Business (NOYB) has called on national privacy regulators to take immediate action to stop such use. The 11 complaints were filed in Austria, Belgium, France, Germany, Greece, Italy, Ireland, the Netherlands, Norway, Poland and Spain.
Content of the complaints
The complaints say changes to Meta's privacy policy, effective June 26, allow the company to use years of personal posts, private images and online tracking data for AI technology.
In response to the changes, NOYB has asked data protection authorities in the 11 countries to launch an urgent review.
According to NOYB's statement, Meta's recently updated privacy policy claims a legal basis for using personal data to train and develop its AI models and other AI tools, which it can share with third parties.
The policy change will affect millions of European users, preventing them from removing their data once it's in the system.
Related: Meta AI boss blames Elon Musk for hype, conspiracy theories
NOYB has previously filed several complaints against Meta and other Big Tech companies regarding violations of the European Union's General Data Protection Regulation (GDPR).
Meta ignores European Court of Justice ruling
NOYB founder Max Schrems said in his statement that the European Court of Justice issued a decisive ruling on the matter in 2021, which should serve as a reference point for Meta's planned use of personal data. He said:
“The European Court of Justice (CJEU) has already made it clear that Meta does not have a ‘legitimate interest' in overriding users' data protection rights with regard to advertising…”
Schrems argued that shifting the responsibility for protecting their privacy onto users is not reasonable. The law requires Meta to obtain users' express consent rather than offering a hidden and misleading opt-out option.
Related: EU launches investigation into Apple, Google and Meta for violating digital markets law
He emphasized that if Meta wants to use users' data, it must ask for their permission directly. Instead, Meta asks users to opt out of data usage, which he said is inappropriate.
In July 2023, Google was sued over a similar issue after it updated its privacy policy. The lawsuit alleged that the company misused large amounts of data, including copyrighted material, to train its AI.
Magazine: Make meth, napalm in ChatGPT, AI bubble, 50M deepfake calls: AI Eye