Colorado's attorney general issues warning about “deepfakes”

DENVER (KKTV) – Colorado’s attorney general issued a warning Monday morning about “deepfakes.”

Attorney General Phil Weiser says the public needs to be on the lookout for election misinformation and disinformation in the form of realistic-looking images, videos, and audio created using artificial intelligence, known as “deepfakes.” You can see a public advisory that was issued at the bottom of this article.

This year, lawmakers passed and Gov. Polis signed into law HB24-1147. The new law requires anyone using AI to create communications to voters featuring images, videos, or audio of candidates for office to include a disclaimer explaining that the content is not real. Failure to provide such a disclaimer can result in fines and other penalties.

“Because images, videos, and audio created with artificial intelligence are becoming difficult to distinguish from the real thing, you should be cautious when forming opinions based on what you see and hear online, on TV, and receive in the mail,” said Weiser. “The sad reality is that even AI-powered tools designed to detect these deepfakes have difficulty catching them. I encourage voters to do your research, get your news and information from trusted sources, and be mindful that the sophistication of AI means you can’t always believe what you see and hear anymore.”

In the public advisory Weiser issued, he lays out what voters, candidates, and campaigns need to know about the new law:

  • Any visual or audio communication regarding candidates for office that uses deepfake images, audio, video, or multimedia is prohibited unless properly disclosed.
  • The required disclosures must be clear and conspicuous. A disclaimer notifying voters that the content “has been edited and depicts speech or conduct that falsely appears to be authentic or truthful” must be displayed or otherwise appear in the communication, and the law provides for exact font sizes and other requirements.
  • Exceptions to the law include protections for news outlets that discuss deepfake material in their reporting, so long as the broadcast makes clear the content includes a deepfake. Radio and television broadcast stations are also exempt if they run political advertisements containing deepfakes that lack proper disclaimers. The law exempts satire and parody as well.
  • Violations can result in legal action to prevent dissemination of the deepfake in question, and violators could be subject to financial liabilities or even criminal penalties.
