EU Guidelines for Election Security on Online Platforms

The European Union is working on new rules to make elections safer. They want big online companies like Facebook, Google, TikTok, and Twitter to be more careful. These rules aim to reduce risks from new technologies that can create fake texts, images, or videos. The EU is worried these tools can trick voters or mess with elections. They mention that these technologies can make things that look real but are not. This includes fake events or false information about political figures.

The rules are part of the Digital Services Act, which focuses on keeping online spaces safe and trustworthy. The EU suggests that online platforms should clearly mark content made by artificial intelligence, especially if it looks like real people or events. They also say platforms should help users label AI-generated content.

For content that gets shared a lot, the EU advises using watermarks. This helps users know if what they're seeing was made by a computer. The guidelines also want platforms to update their systems to recognize these watermarks.

Other ideas include making sure AI-generated content uses reliable sources, especially for election information. Platforms should warn users about possible mistakes in AI content and direct them to trustworthy sources. They should also try to stop the creation of fake content that could influence people's actions.

The EU suggests platforms use "red teaming" to find security problems. This means they should constantly check how well their AI systems work and make them safer against misuse. For AI-generated text, platforms might show where the information came from. This helps users check the facts themselves.

Supporting research into AI risks is important, too. The EU wants platforms to give researchers tools to study AI content. When it comes to ads, the guidelines suggest clear labels for any AI-made content in advertisements.

These guidelines are still being discussed and will be finalized soon. They are not laws, but they match the Digital Services Act. This act requires platforms to reduce risks to democracy. The EU is making these rules with the upcoming European Parliament elections in mind. They want online platforms to be ready to protect democracy.

New EU guidelines aim to combat the spread of misleading AI-generated texts, images, and videos during elections.
Image: Digital Information World - AIgen

Read next: FCC Cracks Down on AI Voice Robocalls, Mandates Consent for Telemarketers Using AI Technology
Previous Post Next Post