LinkedIn Accused of Giving Third Parties Access to Premium Clients’ Private Messages for AI Training – UPDATED (Dismissed)

Updated on 1st Feb. 2025.

A proposed class action accusing LinkedIn of violating Premium customers' privacy by disclosing their private messages for generative AI training has been dismissed in San Jose, California. Plaintiff Alessandro De La Torre filed a notice of dismissal without prejudice just nine days after initiating the suit, after LinkedIn asserted that the claim had no merit. The complaint had alleged that LinkedIn broke its promise to restrict the use of personal data, pointing to a September privacy policy update which disclosed that opting out via the new data-sharing setting would not affect AI training that had already taken place. Industry observers expressed relief when LinkedIn vice president and legal counsel Sarah Wight confirmed publicly that private messages were never used for AI training, assuring users that their data remained secure.

--------------------------------------

Popular professional networking platform LinkedIn was recently accused of giving third parties access to Premium customers' information. Specifically, the company was alleged to have shared private InMail messages for the purpose of training AI models.

A lawsuit recently filed in federal court in California alleged that those private messages were fed into AI models, basing the claim on disclosures LinkedIn made in 2024. The Microsoft-owned company had announced policy changes that shed more light on this practice.

Those changes explained that member posts containing personal information could be used by the platform to train its AI models. They further stated that the data could be passed to third parties solely for that purpose.

LinkedIn exempted members based in the EU, EEA, UK, Switzerland, Canada, Hong Kong, and China from having their content used for AI training. For members based in the US, the practice was enabled by default, as the country still has no federal privacy law restricting it. American users were, however, given a new setting dubbed Data for Generative AI Improvement.

According to the platform, the setting controls whether a member's data is used to train the generative AI models that create content. When it is switched on, LinkedIn and its affiliates can use personal information and any content published on the platform for this purpose.

LinkedIn acknowledged that it would use personal data for AI training and even pass that data to third parties for training their models. The open question was whether the shared data included private InMail messages, a feature exclusive to Premium account holders.

According to the lawsuit, the platform breached its contractual promises by disclosing private messages to third parties for AI model training. Because the platform is used chiefly for professional purposes, such messages can be highly sensitive and could harm individuals and firms if made public, touching on subjects like employment, compensation, or personal matters.

The matter centers on Premium customers because paying subscribers sign a separate contract, giving them stronger privacy protections than the platform's non-paying members.

In one part of that agreement, referred to as the LSA, LinkedIn pledges never to disclose Premium customers' private data to third parties. The complaint argues that doing so would violate the US Stored Communications Act and California law, and would constitute a breach of contract.

The complaint, however, offered no further details on how, or even whether, InMail content was actually shared. Instead, the filing simply assumed that InMail messages might have been part of the AI training data.

Image: DIW-Aigen
