Chapter 7: Defending Democracy from Deepfake Deception Act of 2024
Section § 20510
This law is called the Defending Democracy from Deepfake Deception Act of 2024.
Section § 20511
This section highlights the challenges California faces with the use of generative artificial intelligence (AI) in elections. It acknowledges that AI-powered disinformation could significantly undermine voter trust by creating and spreading fake images, audio, and videos that mislead the public. For instance, people might be deceived by fabricated images of candidates or officials committing acts they did not do.
The text specifically mentions deepfakes—realistic fake videos or recordings—as a significant threat, especially heading into the 2024 presidential elections. These could distort election outcomes by spreading misleading content quickly and widely. To combat this, the law requires clear labeling of artificial content to inform consumers of its inauthenticity, thus reducing deception.
The overarching goal is to ensure elections remain free and fair, emphasizing the importance of addressing disinformation to protect the electoral process in California.
Section § 20512
This section defines key terms used in this chapter of California election law:
- "Advertisement": a paid public communication that supports or opposes an election candidate.
- "Broadcasting station": media outlets such as radio, television, and streaming services.
- "Candidate": a person running for certain public offices, including President.
- "Deepfake": manipulated media that falsely appears authentic.
- "Election communication": a general public communication related to elections, excluding advertisements.
- "Election in California": an election featuring specific candidates or statewide measures.
- "Elections official": certain government officials, such as the Secretary of State.
- "Large online platform": a digital application or website with more than one million users in California in a year.
- "Materially deceptive content": significantly altered media that would mislead a reasonable person, excluding minor adjustments.
Section § 20513
This section requires large online platforms to identify and remove materially deceptive content that could harm a candidate or undermine trust in elections. Specifically, if content falsely portrays a candidate, an election, or an elected official as doing or saying something they did not, and that portrayal could damage reputations or confidence in the election, the platform must remove it within 72 hours of a report. Platforms must use advanced techniques to identify such content and must also remove identical or substantially similar content.
To ensure transparency, if a candidate uses manipulated media of themselves, they must disclose it clearly during election periods. The rules apply from 120 days before an election until election day, or slightly longer if the content involves election officials.
Section § 20514
This section requires large online platforms to identify and label misleading content that could affect elections. Platforms must use advanced technology to find and label fake or altered posts when they are reported, especially if they appear as advertisements or election communications. Once such content is identified, platforms have 72 hours to label it with a clear message, such as "This video has been manipulated and is not authentic." The label must allow users to click through for more information. The requirement applies during specific periods before and after an election, particularly if the content involves election-related items such as ballots or voting machines.
Section § 20515
This law requires large online platforms to make it easy for California residents to report content that may need to be removed or labeled. Once a report is made, the platform must respond within 36 hours, explaining their actions on the reported content.
If a candidate, elected official, or elections official reports content and either receives no response within 36 hours or disagrees with the outcome, they can take legal action within 72 hours. They can ask a court to force the platform to remove or label the content, or to ensure the platform follows the reporting process. The person seeking legal action must prove the platform's failure to comply with clear and convincing evidence.
Section § 20516
This section allows the Attorney General, district attorneys, or city attorneys to take legal action against large online platforms that fail to remove certain content, label it correctly, or follow the required reporting procedures. The officials must prove the platform's violation with clear and convincing evidence. Such cases are given priority on the court calendar.
Section § 20517
This section applies to misleading content regardless of the language it is in. If the content is not in English, the required notices and labels must appear in both the original language and English.
Section § 20518
This section allows large online platforms to block, remove, or label content that is deemed materially deceptive at any time, even outside certain specified periods. Also, online platforms that aren't governed by this chapter can still take the same actions against deceptive content.
Section § 20519
This section sets out exceptions to the content restrictions for online publications, broadcasting stations, and satire or parody. Specifically, a regularly published online newspaper or magazine may publish deceptive content as long as it clearly discloses that the content does not depict real events. A broadcasting station may likewise air deceptive content in a news format if it makes clear that the content does not represent real events. There are also exceptions for paid advertisements that meet specific requirements and for satirical or parody content.