Italy fines OpenAI over ChatGPT privacy rules breach
Published by Jessica Weisman-Pitts
Posted on December 20, 2024
2 min read · Last updated: January 27, 2026

By Elvira Pollina and Alvise Armellini
MILAN (Reuters) - Italy’s data protection agency said on Friday it fined ChatGPT maker OpenAI 15 million euros ($15.58 million) after closing an investigation into use of personal data by the generative artificial intelligence application.
The fine comes after the authority found OpenAI processed users’ personal data to “train ChatGPT without having an adequate legal basis and violated the principle of transparency and the related information obligations towards users”.
OpenAI said the decision was “disproportionate” and that the company will file an appeal against it.
The investigation, which started in 2023, also concluded that the U.S.-based company did not have an adequate age verification system in place to prevent children under the age of 13 from being exposed to inappropriate AI-generated content, the authority said.
The Italian watchdog also ordered OpenAI to launch a six-month campaign on Italian media to raise public awareness about how ChatGPT works, particularly as regards the collection of data from users and non-users to train its algorithms.
Italy’s authority, known as Garante, is one of the European Union’s most proactive regulators in assessing AI platform compliance with the bloc’s data privacy regime.
Last year it briefly banned the use of ChatGPT in Italy over alleged breaches of EU privacy rules.
The service was reactivated after Microsoft-backed OpenAI addressed issues concerning, among other things, the right of users to refuse consent for the use of personal data to train the algorithms.
“They’ve since recognised our industry-leading approach to protecting privacy in AI, yet this fine is nearly twenty times the revenue we made in Italy during the relevant period,” OpenAI said, adding the Garante’s approach “undermines Italy’s AI ambitions”.
The regulator said the size of its 15-million-euro fine was calculated taking into account OpenAI’s “cooperative stance”, suggesting the fine could have been even bigger.
Under the EU’s General Data Protection Regulation (GDPR) introduced in 2018, any company found to have broken rules faces fines of up to 20 million euros or 4% of its global turnover.
($1 = 0.9626 euros)
(Reporting by Alessia Pe and Elvira Pollina; additional reporting by Supantha Mukherjee; editing by Alvise Armellini and Frances Kerry)
The General Data Protection Regulation (GDPR) is a comprehensive data protection law in the EU that governs how personal data is collected, processed, and stored, ensuring individuals' privacy rights are protected.
Personal data refers to any information that relates to an identified or identifiable individual, such as names, email addresses, and identification numbers, which must be handled according to privacy laws.
A data protection agency is a regulatory body responsible for enforcing data protection laws and ensuring compliance with regulations like GDPR, safeguarding individuals' privacy rights.
Artificial intelligence (AI) refers to the simulation of human intelligence in machines, enabling them to perform tasks such as learning, reasoning, and problem-solving.