April 1, 2026

AI and Housing: Preparing for Colorado’s High-Risk Artificial Intelligence Regulations


Update: Colorado AI Act Enforcement Delayed. The Colorado Attorney General's office has stated it will not enforce the Act until it has finished interpretive rulemaking (timing uncertain).

Artificial Intelligence (AI) is rapidly transforming property management operations, and states are beginning to regulate its use and provide oversight of this monumental shift. The first such law is Colorado’s AI Act (Senate Bill 24-205), which, after a delay last year, will take effect June 30, 2026. The AI Act establishes a comprehensive framework for regulating "High-Risk" Artificial Intelligence Systems, specifically targeting technologies used to make consequential decisions in housing.

Defining "High-Risk" AI System

The law defines a “High-Risk Artificial Intelligence System” as any AI system that, when deployed, makes or is a substantial factor in making a consequential decision. “Consequential decision” means a decision that has a material legal or similarly significant effect on the provision or denial of certain categories of services to a consumer, such as education enrollment, employment, or financial lending services. Importantly for the rental industry, it also includes the provision or denial, or the cost or terms, of housing. This definition may cover owners who use AI-driven applicant screening, such as a predictive, continuous modeling system that makes decisions or recommendations, as well as certain AI algorithmic rent-pricing models.

Property managers using AI should evaluate whether each system falls under the definition of a “High-Risk Artificial Intelligence System.” If it does, the following requirements may apply.

The Duty of Reasonable Care

Both developers (the software companies) and deployers (the property managers using the “High-Risk AI System”) have a legal "duty of reasonable care" to protect consumers from algorithmic discrimination. This is defined as any condition in which the use of an AI system results in unlawful differential treatment based on protected classes like race, religion, sex, or disability.

Requirements for Property Managers (Deployers)

As a "deployer" of an AI system, property managers subject to the AI Act can implement several proactive measures to create a rebuttable presumption that a deployer used reasonable care. A few measures are summarized below. Please review the entire Act for a complete list.

  1. Risk Management Policy: Deployers must implement a risk management policy and program to identify and mitigate risks of algorithmic discrimination. This program must be reviewed and updated systematically throughout the AI system's lifecycle.
  2. Annual Impact Assessments: Deployers (or a third party) must complete an impact assessment for the High-Risk AI System annually and within 90 days of any "substantial modification" of the system. This assessment must analyze the purpose of the AI system, its data inputs, and whether it poses a risk of discrimination.
  3. Consumer Notifications: Before a decision is made, deployers must notify the applicant that a high-risk AI system is being used. If the decision is adverse (e.g., a denied application), deployers must provide the "principal reasons" for that decision and an opportunity for the applicant to appeal and correct inaccurate data.

Developer Obligations

To help deployers comply with the Act, software developers of High-Risk AI Systems are required to provide you with detailed documentation. This includes summaries of the data used to train the system, known limitations, and the measures taken by the developer to mitigate discrimination risks.

Small Business Exemption

There is a limited "safe harbor" for small businesses. If a deployer employs fewer than 50 full-time employees, it may be exempt from the formal risk management policy and impact assessment requirements, provided it does not use its own data to train the AI and it makes the developer’s impact assessments available to consumers.

Enforcement and Risk

There is no private right of action (consumers cannot sue directly under this law), and the Attorney General has exclusive authority to enforce the Act. Violations are classified as unfair trade practices. However, the law provides an affirmative defense if you discover and cure a violation through internal reviews or "red teaming".

Entrata is Here to Help

For additional news and updates related to rental regulation changes, please visit the Compliance Center (Entrata login required). This resource is available to all clients and will be continuously updated to reflect new legal developments in the multifamily industry in 2026.
