Italy’s data protection authority (GPDP) has fined OpenAI €15 million ($15.6 million) for violating data privacy laws in relation to ChatGPT. This concludes a nearly two-year investigation into the company’s handling of personal data.
The fine followed GPDP’s decision in March 2023 to temporarily block ChatGPT, making Italy the first Western country to restrict the popular AI chatbot over privacy concerns.
Details of the Violation
The Italian Data Protection Authority revealed several key issues that led to the fine:
- OpenAI did not notify the GPDP about a data breach in March 2023.
- Users’ personal data was processed to train ChatGPT without a proper legal basis.
- The company violated transparency principles by failing to provide adequate information to users about how their data was used.
- There were no reliable mechanisms for age verification, potentially exposing children under 13 to inappropriate content.
OpenAI’s Cooperation Acknowledged
The authority noted that OpenAI had cooperated during the investigation, a factor taken into account when calculating the fine.
However, OpenAI is also required to run a six-month awareness campaign across various media (TV, print, and online) to educate the public about ChatGPT and its use of personal data.