FTC Takes Aim at AI Scams with Proposed Rule Change

The FTC is proposing to expand an existing rule that bars the impersonation of businesses and government agencies so that it also protects individual consumers. The change could additionally prohibit AI platforms from offering products or services that are used to impersonate people and deceive consumers.

FTC Chair Lina Khan warned that scammers are using AI to impersonate individuals convincingly and at scale. These cloned voices and likenesses show up in schemes such as romance scams and frauds in which criminals pose as company employees to extract money, which is why the agency wants to strengthen its rules to combat AI-enabled impersonation.

Many Americans are worried about deepfake video and audio that is hard to distinguish from the real thing. Surveys show that a large share of respondents believe such fakes will make it easier to spread misinformation, especially around elections. The proposal also follows last week's news of a rule change targeting AI-generated robocalls, which came shortly after a fake call impersonating President Biden.

There is currently no US law that specifically targets these fakes. Public figures whose likenesses are copied can rely on other existing laws to push back, but doing so is difficult and time-consuming. Some states have passed their own laws against deepfakes, particularly those that exploit a person's image without consent. As the tools for creating them improve, more states are expected to follow with legislation of their own.

FTC Chair warns of AI-driven scams replicating individuals at scale, prompting push for stronger regulations.
Image: Digital Information World - AIgen
