The proposed laws still alive would take aim at “high-risk AI systems,” the use of deepfakes in elections, and the impact of artificial intelligence in facial recognition.
The Consumer Protections for Artificial Intelligence bill, Senate Bill 24-205, cleared the Colorado Legislature on May 8. The legislation imposes regulations on what are considered “high-risk AI systems” — generally, those AI systems with the potential to impact critical infrastructure.
Under the bill, deployers of this technology would be responsible for implementing a risk management policy and program, and for completing impact assessments of the high-risk systems, to guard against discrimination or bias from AI algorithms.
S.B. 24-205 is part of Colorado’s Consumer Protection Act, which also includes the Colorado Privacy Act. If Gov. Jared Polis signs it, it would take effect Feb. 1, 2026.
A piece of legislation on “candidate election deepfake disclosures,” House Bill 24-1147, has also passed the statehouse. It seeks to curb deepfakes or AI impersonations in political communications. The bill would make deceptive portrayals of presidential candidates potential crimes of perjury and forgery. It would also impose civil penalties for distributing any communication about a candidate for elective office that uses deepfake technology. The bill headed to Polis’ desk May 15 for his signature.
House Bill 24-1468 would not only expand the duties of the already formed Colorado task force on facial recognition and biometric technologies — it would also expand the task force’s membership. Established in 2022, the task force focuses on analyzing how state and local governments are using facial recognition.
To address potential bias within AI systems, the bill would expand the group’s scope, renaming it the Biometric Technology and Artificial Intelligence Policy Task Force. It would also increase membership from 15 to 17, with one of the two additions being an expert in generative artificial intelligence technology, charged with studying how AI use in facial recognition will impact “vulnerable communities.”
H.B. 24-1468 would also adjust reporting obligations for state agencies that use facial recognition. Agencies would be required to maintain internal records on its usage and keep them open for inspection and review by the state Office of Information Technology. The bill is still under consideration by the Legislature.
Senate Bill 24-158, which focused on transparency in social media governance, has been postponed indefinitely. It would have given social media companies until July 1, 2025, to make their usage policies public on their platforms, and required them to post policy updates within 14 days of implementation. Social media companies would have had to provide contact information for users with questions or concerns about those policies, and to document their processes for flagging content or groups they believed violated the policies.
The bill sparked a fervent discussion on data privacy and corporate accountability before lawmakers voted to postpone consideration indefinitely.