Chapter 25. AI Transparency Act
Section § 22757
This section establishes the short title of the law, the California AI Transparency Act, and frames the transparency requirements that follow in this chapter.
Section § 22757.1
This section defines the key terms used in this chapter. 'Artificial intelligence' (AI) means a machine-based system that can influence physical or virtual environments through its outputs. A 'covered provider' is a person that creates a generative AI system that is publicly accessible in the state and has over 1,000,000 monthly users or visitors. A 'GenAI system' is an AI system that can generate content such as text, images, video, and audio. 'Latent' means present but not readily perceivable, while 'manifest' means easily perceived, understood, or recognized by a natural person. 'Metadata' is data about other data, and 'personal information' carries its definition from the Civil Code. 'Provenance data' verifies the authenticity and history of content; 'personal provenance data' is provenance data associated with a specific user, while 'system provenance data' is device- or system-level information that establishes authenticity without identifying a user.
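The distinction between system and personal provenance data can be sketched as hypothetical Python types. All names here are illustrative assumptions, not statutory terms:

```python
from dataclasses import dataclass

@dataclass
class SystemProvenanceData:
    """Device/system information that verifies content authenticity and
    history without being associated with a specific user."""
    provider_name: str
    system_name: str
    system_version: str
    created_at: str  # ISO 8601 timestamp

@dataclass
class PersonalProvenanceData(SystemProvenanceData):
    """Provenance data that additionally carries user-associated personal
    information, which a detection tool must not expose."""
    user_id: str
```

Modeling personal provenance data as a subtype makes the compliance boundary explicit: anything a detection tool returns should be typed as the base class only.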
Section § 22757.2
This law requires covered providers to offer users a free AI detection tool. The tool must let a user assess whether content such as an image, video, or audio file was created or altered by the provider's GenAI system, and it must output any system provenance data found in the content without revealing personal provenance data. The tool must be publicly accessible, must accept uploads or links to content, and must offer an application programming interface (API) so it can be used without visiting the provider's website. Providers must collect user feedback to improve the tool, may not retain personal information from users without consent, and may not keep submitted content or personal provenance data longer than necessary, except for a user's contact information when the user opts in for feedback purposes.
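A minimal sketch of the detection side of such a tool, under the assumption that provenance is embedded as a trailing JSON record behind an ad hoc marker (real systems would use a standard such as C2PA Content Credentials; all names and the marker format here are hypothetical):

```python
import json

# Hypothetical delimiter for an embedded provenance record.
MARKER = b"\x00PROV\x00"

def detect(content: bytes):
    """Return system provenance data if the content carries a latent
    disclosure, with personal provenance fields stripped; else None."""
    idx = content.find(MARKER)
    if idx == -1:
        return None  # no disclosure found
    record = json.loads(content[idx + len(MARKER):].decode("utf-8"))
    # The statute bars the tool from outputting personal provenance data.
    personal_fields = {"user_id", "device_id"}
    return {k: v for k, v in record.items() if k not in personal_fields}
```

Filtering at the output boundary, rather than at parse time, keeps the full record available internally (e.g., for abuse investigations) while guaranteeing that user-facing results contain only system provenance data.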
Section § 22757.3
This law requires covered providers whose GenAI systems create or alter digital content (such as images, video, or audio) to include disclosures identifying the content as AI-generated. The latent disclosure must be embedded in the content, must be difficult to remove to the extent technically feasible, must convey information such as the provider's name, the GenAI system's name and version, and the time and date of creation or alteration, and must be detectable by the provider's AI detection tool. If a covered provider licenses its GenAI system to a third party, it must ensure the licensed system retains the ability to include these disclosures. If a licensee modifies the system so that it can no longer do so, the provider must revoke the license within 96 hours of discovering the modification, and the licensee must then stop using the system.
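The embedding side can be sketched the same way. This is an illustrative toy, assuming a trailing JSON record behind a hypothetical marker; production systems would embed signed metadata per a standard such as C2PA so the disclosure is verifiable and harder to strip:

```python
import json
from datetime import datetime, timezone

# Hypothetical delimiter for an embedded provenance record.
MARKER = b"\x00PROV\x00"

def embed_latent_disclosure(content: bytes, provider: str,
                            system: str, version: str) -> bytes:
    """Append a latent provenance disclosure carrying the fields the
    statute calls for: provider, system name/version, and timestamp."""
    record = {
        "provider": provider,
        "system": system,
        "version": version,
        "created": datetime.now(timezone.utc).isoformat(),
    }
    return content + MARKER + json.dumps(record).encode("utf-8")
```

Note that a trailing block like this is trivially removable, which is exactly why the statute's "difficult to remove to the extent technically feasible" language pushes providers toward cryptographically bound metadata or watermarking instead.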
Section § 22757.4
A covered provider that violates this chapter is liable for a civil penalty of $5,000 per violation, with each day a violation continues counting as a separate violation. Penalties are recovered in a civil action brought by the Attorney General, a city attorney, or a county counsel, and a prevailing plaintiff may recover reasonable attorney's fees and costs. Authorities may also sue third-party licensees that violate the applicable provisions, seeking injunctive relief as well as fees and costs.
Section § 22757.5
This law exempts from this chapter any product or service whose exclusive function is to provide video games, television, streaming, movies, or interactive experiences whose content is not user-generated.
Section § 22757.6
This section provides that the chapter becomes operative on January 1, 2026.