OpenAI Faces Legal Issues in Europe Over Data Privacy Concerns

OpenAI, the company behind ChatGPT, could face substantial fines after Italy's data protection regulator accused it of violating Europe's data privacy rules. One of the concerns appears to be a failure to adequately control the content shown to young users.

Italy's Data Protection Authority, known as the Garante, notified OpenAI that it may have breached data protection rules. The regulator did not specify what OpenAI allegedly did wrong or what measures it might take.

The Italian regulator is concerned that young users may be exposed to inappropriate content generated by the chatbot. OpenAI's terms state that users must be at least 13 years old, and that those under 18 need permission from a parent or guardian.

The agency is also examining how OpenAI collects user data to train the chatbot. The Garante believes OpenAI may have violated the EU's General Data Protection Regulation (GDPR).

Italy had previously banned ChatGPT, the first such ban in Europe, but allowed it to return after OpenAI addressed the regulator's privacy concerns.

Under the GDPR, companies can be fined up to 20 million euros or 4% of their global annual turnover, whichever is higher, for breaking these rules.

It is not yet clear whether OpenAI will face another ban over the new allegations. The Garante has not commented further.

OpenAI disputes the Italian agency's assessment, saying its practices comply with the GDPR and other privacy laws.

The company also says it is working to minimize the amount of personal data used in its systems, and that it intends to keep cooperating with the Garante to resolve the concerns.

The Italian case is not OpenAI's only challenge. The company is also facing legal and regulatory scrutiny in the US, and authorities in both the US and Europe are examining OpenAI's relationship with Microsoft for potential competition issues.

That scrutiny intensified after Microsoft played a role in reinstating OpenAI's CEO, Sam Altman, who had been briefly ousted.

OpenAI is also facing a lawsuit from The New York Times, which alleges that OpenAI used its articles to train ChatGPT without permission or compensation.

Image: DIW
