Construction workers at the groundbreaking ceremony for a new residential building with a set percentage of affordable housing units on February 9, 2024, in Brooklyn, New York. (Getty/Andrew Lichtenstein/Corbis)

See other chapters in CAP’s Report: Taking Further Agency Action on AI

Authors’ note: For this report, the authors use the definition of artificial intelligence (AI) from the 2020 National Defense Authorization Act, which established the National Artificial Intelligence Initiative.1 This definition was also used by the 2023 “Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence.”2 Similarly, this report makes repeated reference to “Appendix I: Purposes for Which AI is Presumed to be Safety-Impacting and Rights-Impacting” of the 2024 OMB M-24-10 memo, “Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence.”3

Read the fact sheet

The fact sheet lists all of the recommendations detailed in this chapter of the report.

The U.S. Department of Housing and Urban Development (HUD) and other housing regulators should consider addressing potential artificial intelligence (AI) risks to housing fairness, including discrimination, using existing statutory authorities in the Fair Housing Act and the Dodd-Frank Wall Street Reform and Consumer Protection Act. While Governing for Impact (GFI) and the Center for American Progress have extensively researched these existing authorities in consultation with numerous subject matter experts, the goal is to provoke a generative discussion about the following proposals rather than to outline a definitive executive action agenda. Each potential recommendation will require further vetting before agencies act. Even if additional AI legislation is needed, this menu of potential recommendations demonstrates that agencies have more options to explore beyond their current work and should immediately utilize existing authorities to address AI.

While the use of AI in housing decisions is not the root cause of discrimination, it has amplified existing historical inequalities and further obscured housing providers’ decision-making metrics. As Lisa Rice, president and CEO of the National Fair Housing Alliance, stated in a Senate AI insight forum:

These systems are still performing their originally-intended function: perpetuating disparate outcomes and generating tainted, bias-laden data that serve as the building blocks for automated systems like tenant screening selection, credit scoring, insurance underwriting, insurance rating, risk-based pricing, and digital marketing technologies. The ability of automated systems to scale can lead to, reinforce, or perpetuate discriminatory outcomes if they are not controlled.4

While the administration should address these underlying inequities, the 2023 AI executive order has specifically tasked HUD with at least addressing the harms caused by AI in housing,5 and many of the recommendations below build on this directive.

AI risks and opportunities

Access to housing is imperative to overall well-being, economic and social advancement, and safety. As the 2023 “Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence” highlights, AI has the potential to exacerbate unlawful discrimination in housing, including by automated or algorithmic tools used to make decisions about access to housing and in other real estate transactions.6

A note on terminology

In this section, the authors define “AI” expansively to refer not just to technologies incorporating recent advances in machine learning but also to the algorithmic and automated decision-making technologies that have enabled discrimination in housing for many years, most notably through the widespread use of tenant screening tools.7

Specifically, the 2023 executive order on AI requires the Department of Housing and Urban Development and the Consumer Financial Protection Bureau (CFPB) to issue guidance targeting tenant screening systems and detailing how the Fair Housing Act (FHA), Consumer Financial Protection Act, and Equal Credit Opportunity Act (ECOA) apply to the discriminatory effects of AI in housing advertising, credit, or other real estate transactions8—ECOA proposals are included in chapter 5 of this report. Furthermore, the Office of Management and Budget (OMB) M-24-10 memorandum on “Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence” identifies minimum risk management practices that must be applied when using AI for rights-impacting purposes, including: “Screening tenants; monitoring tenants in the context of public housing; providing valuations for homes; underwriting mortgages; or determining access to or terms of home insurance.”9 The AI section on HUD’s own website identifies “potential risks associated with AI systems, such as fairness, bias, privacy concerns, and security vulnerabilities.”10

From the authors’ perspective, several kinds of AI applications could further entrench harms to consumers and tenants, including:

  • Surveillance in public housing: Facial recognition and other biometric data are being used in public housing to evict residents for minor infractions as part of a broader surveillance network, another example of automated and machine learning technologies enabling and exacerbating long-standing harmful activities.11 Types of facial recognition are also identified in the OMB M-24-10 AI memo as presumed rights-impacting uses of AI.12 Reports indicate that HUD grant money has been used to install surveillance cameras, some of which are equipped with AI technology.13 In the wake of these reports, HUD has said that it will not fund future grants that use facial recognition.14
  • Tenant screening: AI and adjacent tools—including automated processes that compound existing discriminatory assessments—can be used in public and private housing screening contexts, further perpetuating discrimination. Automated tenant screening programs often conceal factors that result in a negative recommendation, sometimes including years-old eviction notices.15 “Screening tenants” is cited in the OMB M-24-10 AI memo as an AI use by federal agencies that is presumed to be rights-impacting.16
  • Allocation of subsidized housing: Relatedly, AI tools may be used by federal and state agencies to determine the allocation of subsidized housing or other federal programs. A 2019 report that analyzed common screening and prioritization programs used by federal housing agencies, including the Vulnerability Index – Service Prioritization Decision Assistance Tool (VI-SPDAT), found that communities of color received lower prioritization scores than their white counterparts and that individual white applicants were more likely to be prioritized for permanent supportive housing than people of color.17
  • Appraisal: As HUD has recognized in recent proposed rulemaking, home appraisal programs have negatively affected marginalized communities: “While AVMs [automated valuation models] have the potential, if properly used, to reduce human bias and improve consistency in decision-making, they are not immune from the risk of discrimination. For example, the models may rely upon biased data that could replicate past discrimination or even data that could include protected characteristics, such as race, or very close proxies for them. Moreover, if an algorithm were to generate discriminatory results, the harm could be widespread because of an AVM’s scale.”18

    The undervaluation of homes and entire neighborhoods can help fuel generational wealth gaps.19 A report from the Terner Center for Housing Innovation at the University of California, Berkeley, found that home valuations below the contract price are more common for households of color and significantly diminish a homeowner’s overall wealth.20

  • Online advertising: Online advertising is a critical component of many industries today, including housing. In 2019, Facebook (now Meta) settled with various civil rights groups and private parties over allegations of discriminatory ad targeting practices.21 Shortly after, HUD charged Facebook with violating the Fair Housing Act, a matter that ultimately resulted in a settlement in 2022.22 As HUD’s charge alleged, online platforms may perpetuate racial discrimination in access to housing opportunities by targeting ads in ways that exclude certain protected classes or other characteristics.23 AI and algorithms that target advertisements could cause similar harms and violations of the Fair Housing Act. As HUD noted in recent guidance, such discrimination can be deliberate or unintentional, but it is illegal either way.24

  • Privacy: Online advertisements for housing and housing applications may be predicated on private or confidential information about potential consumers.25 Beyond data collection, housing providers and screening companies can misuse sensitive and personal information, especially in the context of eligibility determinations.26 This risk is amplified when AI-driven systems make automated decisions without transparency, leading to possible exclusion from housing opportunities based on obscure or inaccurate data.27 Aggregation and analysis of sensitive and personal data by AI can also result in profiling and discrimination, further exacerbating existing inequalities in the housing market.28

Current state

The Biden administration has already taken steps to address housing discrimination writ large, which can be exacerbated by the unregulated use of AI and adjacent tools. For example, the Action Plan to Advance Property Appraisal and Valuation Equity (PAVE) specifies several actions to bring AVMs into compliance with existing anti-discrimination laws.29 As part of the PAVE plan, agencies are currently engaging in rulemaking under Section 1473(q) of the Dodd-Frank Act to address potential bias by including nondiscrimination quality control standards in the proposed rule.30 According to the PAVE plan, the CFPB, the Department of Justice (DOJ), the Department of Veterans Affairs (VA), and HUD will issue guidance on how the Fair Housing Act and ECOA apply to the appraisal industry.31 HUD has already issued a letter informing all Federal Housing Administration program participants that appraisals must comply with the Fair Housing Act.32

HUD also recently issued two guidance documents—one on tenant screening, “Guidance on Application of the Fair Housing Act to the Screening of Applicants for Rental Housing,” and one on advertising, “Guidance on Application of the Fair Housing Act to the Advertising of Housing, Credit, and Other Real Estate-Related Transactions through Digital Platforms”—that explain how the FHA protects certain rights when housing providers use AI technologies.33

The new HUD screening guidance outlines liability for housing providers and screening companies under the FHA, explaining how intentional and unintentional discrimination facilitated by AI technology may violate the FHA.34 Furthermore, the guidance highlights important considerations for both housing providers and tenant screening companies when using AI technologies, including choosing relevant screening criteria; ensuring the accuracy of records; providing transparency to applicants; allowing applicants to challenge negative information; and designing and testing models for FHA compliance.35 The guidance points specifically to credit history, eviction history, and criminal records as underlying information that is susceptible to recreating bias.36
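To make the guidance’s “designing and testing models for FHA compliance” consideration more concrete, the following minimal sketch shows one way a screening company might audit whether an automated model’s recommendations diverge across demographic groups. It is illustrative only: the sample data, group labels, and the four-fifths benchmark used as a review trigger are assumptions for demonstration, not standards drawn from the HUD guidance or the Fair Housing Act.

```python
from collections import defaultdict


def approval_rates_by_group(decisions):
    """Compute the approval rate for each demographic group.

    `decisions` is a list of (group_label, approved_bool) pairs drawn from
    a model audit sample -- purely illustrative data.
    """
    totals = defaultdict(int)
    approvals = defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        if approved:
            approvals[group] += 1
    return {group: approvals[group] / totals[group] for group in totals}


def adverse_impact_ratios(rates):
    """Compare each group's approval rate with the highest-rate group.

    Ratios well below 1.0 flag outcomes that warrant further review for a
    possible discriminatory effect; the 0.8 cutoff used below mirrors the
    familiar "four-fifths" benchmark and is an assumption, not an FHA rule.
    """
    benchmark = max(rates.values())
    return {group: rate / benchmark for group, rate in rates.items()}


if __name__ == "__main__":
    # Hypothetical audit sample: (group, did the model recommend approval?)
    sample = (
        [("A", True)] * 80 + [("A", False)] * 20
        + [("B", True)] * 55 + [("B", False)] * 45
    )
    rates = approval_rates_by_group(sample)
    for group, ratio in adverse_impact_ratios(rates).items():
        flag = "review" if ratio < 0.8 else "ok"
        print(f"group {group}: approval={rates[group]:.2f}, ratio={ratio:.2f} ({flag})")
```

In practice, any such audit would sit alongside the guidance’s other considerations, such as ensuring the accuracy of underlying records and allowing applicants to challenge negative information.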

The HUD guidance on advertising through digital platforms describes the responsibilities and liability of advertisers and ad platforms under the FHA.37 Specifically, the guidance illustrates several ways advertisers and ad platforms may violate the FHA, including by segmenting and selecting audiences based in part on protected characteristics or proxies, including via custom or mirror audience tools; limiting protected class groups’ access to housing-related ads; reverse redlining; and showing different content or pricing to different groups based on protected characteristics.38 The guidance also recommends that advertisers use platforms that manage the risk of discriminatory delivery, follow ad platform instructions, carefully consider the source of audience datasets, and monitor the outcomes of advertising campaigns.39 For ad platforms, the guidance recommends running housing-related ads through a separate process with a specialized interface designed to avoid discrimination in audience selection and delivery; avoiding targeting options that directly or indirectly relate to protected characteristics; conducting regular testing; identifying and adopting less discriminatory alternatives for AI models and algorithmic systems; ensuring that algorithms are similarly predictive across protected groups; ensuring that ad delivery systems do not result in differential pricing; and documenting information about ad targeting functions and internal auditing.40
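As a rough illustration of what “monitor[ing] the outcomes of advertising campaigns” and checking for differential pricing might involve, the sketch below summarizes delivery rates and average advertised prices by audience group from a hypothetical campaign log. The field names, example values, and the practice of retaining group labels solely for after-the-fact auditing are assumptions for demonstration, not requirements stated in the HUD guidance.

```python
from dataclasses import dataclass
from statistics import mean


@dataclass
class AdEvent:
    """One impression opportunity from a hypothetical housing ad campaign log."""
    group: str        # audience segment, retained only for after-the-fact auditing
    delivered: bool   # whether the housing ad was actually shown
    price: float      # advertised rent or sale price shown (0 if not delivered)


def delivery_and_price_by_group(events):
    """Summarize delivery rates and average advertised prices per group."""
    summary = {}
    for group in sorted({event.group for event in events}):
        rows = [event for event in events if event.group == group]
        shown = [event for event in rows if event.delivered]
        summary[group] = {
            "delivery_rate": len(shown) / len(rows),
            "avg_price": mean(event.price for event in shown) if shown else None,
        }
    return summary


if __name__ == "__main__":
    # Hypothetical log; large gaps between groups would prompt further review.
    log = [
        AdEvent("A", True, 1800), AdEvent("A", True, 1850), AdEvent("A", False, 0),
        AdEvent("B", True, 2100), AdEvent("B", False, 0), AdEvent("B", False, 0),
    ]
    for group, stats in delivery_and_price_by_group(log).items():
        print(group, stats)
```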

HUD has also appointed a chief artificial intelligence officer (CAIO) in accordance with the taskings from the 2023 executive order on AI and the OMB M-24-10 AI memo.41

In 2013, the Obama administration implemented the discriminatory effect rule, formalizing HUD’s long-held interpretation that the FHA prohibits discriminatory effects regardless of intent to discriminate.42 In 2020, however, the Trump administration issued a revised rule, which purported to create defenses to disparate impact claims for entities relying on algorithms and other automated technologies.43 Of note, the 2020 rule allowed defendants to show that “predictive analysis accurately assessed risk” as a defense to a challenged policy.44 The 2020 rule never took effect, and the Biden administration’s HUD rescinded it, finalizing a rule in 2023 that mainly returned to the 2013 paradigm and eliminated the Trump rule’s defenses related to algorithmic technologies.45

Relevant statutory authorities

This section explains how some statutes currently enforced by housing regulators could apply to AI. As explained in the introduction to this report, this list is by no means exhaustive, and each potential proposal would benefit from additional research and vetting.

Fair Housing Act

The Fair Housing Act makes it unlawful for “any person or other entity whose business includes engaging in residential real estate-related transactions to discriminate against any person in making available such a transaction, or in the terms or conditions of such a transaction” based on any protected class under the statute.46 The statute specifically prohibits discrimination in advertising, appraisal, public housing, and tenant screening.47 The act gives HUD the authority to conduct formal adjudications of complaints and to promulgate rules to interpret and carry out the act.48 Under this authority, HUD recently promulgated a rule reinstating HUD’s discriminatory effect standard, which clarifies that a discriminatory effect—unjustified discrimination caused by a facially neutral practice—is sufficient to violate the act’s prohibition on discrimination.49

Regarding advertising, Section 804(c) of the Fair Housing Act, 42 U.S.C. 3604(c), as amended, states:

It shall be unlawful to make, print, or publish, or cause to be made, printed, or published, any notice, statement, or advertisement, with respect to the sale or rental of a dwelling, that indicates any preference, limitation, or discrimination because of race, color, religion, sex, handicap, familial status, or national origin, or an intention to make any such preference, limitation, or discrimination.50

HUD has implemented this provision through rulemaking,51 and subsequent rulemaking expanded the definition of prohibited discrimination to include discriminatory effect.52 As highlighted above, the 2023 discriminatory effect rule determined that the 2020 rule’s third-party and outcome prediction defenses—both of which would have made AI-based discrimination easier to defend—were unnecessary.53 HUD has also provided general guidelines for advertising and marketing, as well as investigation procedures.54

The FHA covers tenant screening that results in discrimination.55 For example, a 2016 guidance document outlined that housing providers violate the FHA when the provider’s “policy or practice has an unjustified discriminatory effect, even when the provider had no intent to discriminate.”56 The guidance explains circumstances under which utilizing criminal records, which are inherently biased due to the criminal justice system’s disproportionate targeting of African American and Hispanic communities,57 may subject housing providers to liability under the FHA.58 Advocacy groups and the DOJ, in a statement of interest,59 have also argued that the same logic should apply to other screening factors, such as credit history and rental and eviction records.60

The FHA also covers residential appraisal. The term “residential real estate-related transaction” is defined in the statute to include the “appraising of residential real property.”61 Courts have also relied on other provisions of the Fair Housing Act, including 42 U.S.C.A. § 3605, which prohibits real estate discrimination because of “race, color, religion, sex, handicap, familial status, or national origin,” to bar discrimination in the appraisal industry.62 This prohibition extends to real estate businesses providing housing-related services whose practices “otherwise make unavailable or deny” housing.63 Courts have observed that “an appraisal sufficient to support a loan request is a necessary condition precedent to a lending institution making a home loan.”64 Moreover, HUD has updated its general appraiser requirements to include nondiscrimination principles65 and has begun rulemaking on automated valuation models, which is discussed below.66

Under 42 U.S.C.A. § 3608, HUD must administer public housing programs in a “manner [that] affirmatively [furthers] the purpose of [the FHA],”67 including its nondiscrimination provisions. HUD has promulgated several rules under this authority, known as the Affirmatively Furthering Fair Housing (AFFH) rules.68 The AFFH rules apply to all federally funded housing programs, which must not only abide by nondiscrimination principles but also “take meaningful actions to overcome patterns of segregation and foster inclusive communities.”69 In 2015, the Obama administration promulgated an AFFH rule under the FHA’s mandate to affirmatively further fair housing.70 In 2020, the Trump administration effectively eliminated the AFFH rule, leaving only a general statement of what constitutes a fair housing approach, with few policy requirements for local governments.71 In 2023, HUD proposed a new AFFH rule, largely restoring the 2015 rule and adding several key provisions, including requirements that localities develop equity plans, track their progress toward fair housing goals, and increase accountability through direct public complaints.72

Recommendation

Based on the aforementioned authorities, HUD could take the following action:

  • Update the “Fair Housing Advertising” guidelines—a separate document from the newly released advertising guidance—to clarify how Section 804(c)’s prohibition against discrimination in the advertisement of housing opportunities applies to online advertising that relies on algorithmic tools or data, as required by the 2023 executive order on AI and consistent with the recent HUD guidance on advertising through digital platforms.73 Such guidance would be consistent with the DOJ’s settlement with Facebook, which targeted similar practices,74 and could specifically highlight practices that steer housing advertisements away from protected communities.75 Furthermore, the guidance should specify that companies providing advertising services using AI technologies can be held liable. The guidelines should mirror the responsibilities and liabilities outlined in HUD’s recent guidance.76

Dodd-Frank Act

Section 1473(q) of the Dodd-Frank Act amended Title XI of the Financial Institutions Reform, Recovery, and Enforcement Act of 1989 (Title XI) to add a new section, 1125, requiring automated valuation models to adhere to certain quality control standards.77 Under this authority, the Federal Housing Finance Agency (FHFA) and other agencies have proposed a rule to improve the quality control standards for AVMs.78 The proposed rule applies to AVMs used in making credit decisions or covered securitization determinations regarding a mortgage but neither mandates specific policies that institutions must follow nor covers nonbank entities.79 Key provisions in the rule require AVMs to “ensure a high level of confidence in the estimates produced; protect against the manipulation of data; seek to avoid conflicts of interest; and require random sample testing and reviews.”80
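To illustrate the kind of “random sample testing and reviews” the statute contemplates, the hypothetical sketch below backtests AVM estimates against later sale prices and compares the typical valuation error across neighborhoods. The sample data, area labels, and choice of summary statistic are illustrative assumptions, not requirements drawn from Section 1125 or the proposed rule.

```python
import random
from statistics import median


def percent_error(estimate, sale_price):
    """Signed valuation error relative to the realized sale price."""
    return (estimate - sale_price) / sale_price


def backtest_by_area(records, sample_size, seed=0):
    """Randomly sample closed transactions and summarize AVM error by area.

    `records` is a list of dicts with keys "area", "avm_estimate", and
    "sale_price" -- a stand-in for an institution's transaction history.
    """
    rng = random.Random(seed)
    sample = rng.sample(records, min(sample_size, len(records)))
    errors_by_area = {}
    for record in sample:
        err = percent_error(record["avm_estimate"], record["sale_price"])
        errors_by_area.setdefault(record["area"], []).append(err)
    return {area: median(errors) for area, errors in errors_by_area.items()}


if __name__ == "__main__":
    # Hypothetical history; a persistent negative error in one area would
    # prompt a closer review of the model and its input data.
    history = [
        {"area": "north", "avm_estimate": 310_000, "sale_price": 300_000},
        {"area": "north", "avm_estimate": 295_000, "sale_price": 305_000},
        {"area": "south", "avm_estimate": 180_000, "sale_price": 210_000},
        {"area": "south", "avm_estimate": 175_000, "sale_price": 200_000},
    ]
    for area, med_err in backtest_by_area(history, sample_size=4).items():
        print(f"{area}: median AVM error {med_err:+.1%}")
```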

Recommendations

Based on this authority, the FHFA should take the following actions:

  • Continue the rulemaking process on the proposed AVM rule but expand its application to cover all mortgage lenders—specifically nonbanks, given that more than half of annual residential real estate loans were made by nonbanks in 2022.81 Furthermore, the rule should include specific minimum standards for each proposed goal, potentially incorporating the National Institute of Standards and Technology (NIST) AI guidelines82 or relevant minimum standards developed in response to the minimum risk management practices anticipated by the OMB M-24-10 AI memo.83
  • Specify, through the proposed AVM rule or additional rulemaking, that companies using AVMs must disclose their use to customers and allow customers to request nonautomated appraisals or seek valuation from alternative AVMs. The FHFA can do so using its broad authority in Section 1125 to “account for any other such factor that the agencies … determine to be appropriate.”84 This would align with the statute’s purpose to “ensure a high level of confidence in [AVMs],” “protect against the manipulation of data,” and “avoid conflicts of interest.”85

Fair Credit Reporting Act

While HUD does not administer the Fair Credit Reporting Act (FCRA), it can help the Consumer Financial Protection Bureau and the Federal Trade Commission communicate statutory and regulatory obligations to affected entities in the housing space. Especially if the FCRA’s primary regulators update regulations and guidance to account for novel AI development, as the authors recommend in Chapter 5,86 HUD can collaborate on guidance explaining entities’ obligations in the context of AI. For example, under some of the recommendations the authors propose in Chapter 5, credit reporting agencies, such as certain tenant screening firms,87 would need to disclose their use of AI technologies; periodically assess whether their machine learning or other automated technologies result in discriminatory outcomes or take into account information prohibited by statute; and provide for human review of reinvestigation requests, which, in practice, would require individual traceability and legibility.88 Furthermore, users of credit reports, including landlords and property managers, may eventually need to disclose information about the use of AI or related technologies in adverse decision notices.89

Conclusion

The Department of Housing and Urban Development and other housing regulators play a critical role in ensuring fairness in housing and contemplating how to address the potential challenges AI creates. HUD’s AI work directed by the AI executive order is a critical start, and further utilizing its existing authorities, as outlined in this chapter, is essential. GFI and CAP hope this chapter will inspire regulators, advocates, and policymakers interested in how the federal government could update regulatory regimes to account for this new AI moment as it affects housing.


Endnotes

  1. Office of the Law Revision Counsel, “15 USC 9401(3): Definitions,” available at https://uscode.house.gov/view.xhtml?req=(title:15%20section:9401%20edition:prelim) (last accessed May 2024).
  2. Executive Office of the President, “Executive Order 14110: Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence,” Federal Register 88 (210) (2023): 75191–75226, available at https://www.federalregister.gov/documents/2023/11/01/2023-24283/safe-secure-and-trustworthy-development-and-use-of-artificial-intelligence.
  3. Shalanda D. Young, “M-24-10 Memorandum for the Heads of Executive Departments and Agencies: Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence” (Washington: Office of Management and Budget, 2024), available at https://www.whitehouse.gov/wp-content/uploads/2024/03/M-24-10-Advancing-Governance-Innovation-and-Risk-Management-for-Agency-Use-of-Artificial-Intelligence.pdf.
  4. Lisa Rice, “Written Statement of Ms. Lisa Rice: Presented at Senate AI Forum: High Impact AI,” National Fair Housing Alliance, October 27, 2023, available at https://nationalfairhousing.org/wp-content/uploads/2023/11/Lisa-Rice-Written-Remarks-at-Senate-AI-Forum_10.27.2023-1.pdf.
  5. Executive Office of the President, “Executive Order 14110: Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence.”
  6. Ibid., 75213.
  7. Kaveh Waddell, “How Tenant Screening Reports Make It Hard for People to Bounce Back From Tough Times,” Consumer Reports, March 11, 2021, available at https://www.consumerreports.org/electronics/algorithmic-bias/tenant-screening-reports-make-it-hard-to-bounce-back-from-tough-times-a2331058426/.
  8. Executive Office of the President, “Executive Order 14110: Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence.”
  9. Young, “M-24-10 Memorandum for the Heads of Executive Departments and Agencies: Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence,” p. 32.
  10. U.S. Department of Housing and Urban Development, “Artificial Intelligence,” available at https://www.hud.gov/program_offices/cfo/ai (last accessed May 2024).
  11. Douglas MacMillan, “Eyes on the poor: Cameras, facial recognition watch over public housing,” The Washington Post, May 16, 2023, available at https://www.washingtonpost.com/business/2023/05/16/surveillance-cameras-public-housing/.
  12. Young, “M-24-10 Memorandum for the Heads of Executive Departments and Agencies: Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence,” p. 32.
  13. Ibid.
  14. Ibid.
  15. See Waddell, “How Tenant Screening Reports Make It Hard for People to Bounce Back From Tough Times.”
  16. Young, “M-24-10 Memorandum for the Heads of Executive Departments and Agencies: Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence,” p. 32.
  17. Catriona Wilkey and others, “Coordinated Entry Systems: Racial Equity Analysis of Assessment Data” (Needham, MA: C4 Innovations, 2019), p. 4, available at https://c4innovates.com/wp-content/uploads/2019/10/CES_Racial_Equity-Analysis_Oct112019.pdf.
  18. Interagency Task Force on Property Appraisal and Valuation Equity, “Action Plan to Advance Property Appraisal and Valuation Equity: Closing the Racial Wealth Gap by Addressing Mis-valuations for Families and Communities of Color” (Washington: U.S. Department of Housing and Urban Development, 2022), available at https://pave.hud.gov/sites/pave.hud.gov/files/documents/PAVEActionPlan.pdf.
  19. Rashawn Ray and others, “Homeownership, racial segregation, and policy solutions to racial wealth equity” (Washington: Brookings Institution, 2021), available at https://www.brookings.edu/articles/homeownership-racial-segregation-and-policies-for-racial-wealth-equity/.
  20. Terner Center for Housing Innovation at UC Berkeley, “Reducing Bias in Home Appraisals: The Roles for Policy and Technology” (Oakland, CA: 2022), available at https://ternercenter.berkeley.edu/research-and-policy/reducing-bias-in-home-appraisals-the-roles-for-policy-and-technology/.
  21. Sheryl Sandberg, “Doing More to Protect Against Discrimination in Housing, Employment and Credit Advertising,” Facebook/Meta, March 19, 2019, available at https://about.fb.com/news/2019/03/protecting-against-discrimination-in-ads/; American Civil Liberties Union, “Summary of Settlements Between Civil Rights Advocates and Facebook: Housing, Employment, and Credit Advertising Reforms,” March 18, 2019, available at https://www.aclu.org/wp-content/uploads/document/3.18.2019_Joint_Statement_FINAL.pdf.
  22. Ariana Tobin, “HUD Sues Facebook Over Housing Discrimination and Says the Company’s Algorithms Have Made the Problem Worse,” ProPublica, March 28, 2019, available at https://www.propublica.org/article/hud-sues-facebook-housing-discrimination-advertising-algorithms; Jeanine Worden, Kathleen M. Pennington, and Ayelet R. Weiss, “Charge of Discrimination: U.S. Department of Housing and Urban Development v. Facebook,” U.S. Department of Housing and Urban Development, March 28, 2019, available at https://www.hud.gov/sites/dfiles/Main/documents/HUD_v_Facebook.pdf; Naomi Nix and Elizabeth Dwoskin, “Justice Department and Meta settle landmark housing discrimination case,” The Washington Post, June 21, 2022, available at https://www.washingtonpost.com/technology/2022/06/21/facebook-doj-discriminatory-housing-ads/; Office of Public Affairs, “Justice Department Secures Groundbreaking Settlement Agreement with Meta Platforms, Formerly Known as Facebook, to Resolve Allegations of Discriminatory Advertising,” Press release, U.S. Department of Justice, June 21, 2022, available at https://www.justice.gov/opa/pr/justice-department-secures-groundbreaking-settlement-agreement-meta-platforms-formerly-known.
  23. Harlan Yu, Aaron Rieke, and Natasha Duarte, “Urging the Biden Administration to Address Technology’s Role in Housing Discrimination,” Upturn, July 13, 2021, available at https://www.upturn.org/work/proposals-for-the-biden-administration-to-address-technology-housing/; Worden, Pennington, and Weiss, “Charge of Discrimination: U.S. Department of Housing and Urban Development v. Facebook”; Muhammad Ali and others, “Discrimination through optimization: How Facebook’s ad delivery can lead to skewed outcomes,” Proceedings of the ACM on Human-Computer Interaction 3 (2019): 1–30, available at https://arxiv.org/abs/1904.02095.
  24. Office of Fair Housing and Equal Opportunity, “Guidance on Application of the Fair Housing Act to the Advertising of Housing, Credit, and Other Real Estate-Related Transactions through Digital Platforms” (Washington: U.S. Department of Housing and Urban Development, 2024), available at https://www.hud.gov/sites/dfiles/FHEO/documents/FHEO_Guidance_on_Advertising_through_Digital_Platforms.pdf.
  25. Federal Trade Commission, “How Websites and Apps Collect and Use Your Information,” available at https://consumer.ftc.gov/articles/how-websites-and-apps-collect-and-use-your-information (last accessed February 2024); National Fair Housing Alliance, “Facebook Settlement,” March 14, 2019, available at https://nationalfairhousing.org/facebook-settlement/.
  26. The Economic Times, “AI and Privacy: The privacy concerns surrounding AI, its potential impact on personal data,” April 25, 2023, available at https://economictimes.indiatimes.com/news/how-to/ai-and-privacy-the-privacy-concerns-surrounding-ai-its-potential-impact-on-personal-data/articleshow/99738234.cms?from=mdr.
  27. Consumer Financial Protection Bureau, “CFPB Reports Highlight Problems with Tenant Background Checks,” November 15, 2022, available at https://www.consumerfinance.gov/about-us/newsroom/cfpb-reports-highlight-problems-with-tenant-background-checks/.
  28. Valerie Schneider, “Locked Out by Big Data: How Big Data, Algorithms and Machine Learning May Undermine Housing Justice,” Columbia Human Rights Law Review 52 (1) (2020): 256–260, available at https://hrlr.law.columbia.edu/files/2020/11/251_Schneider.pdf.
  29. Interagency Task Force on Property Appraisal and Valuation Equity, “Action Plan to Advance Property Appraisal and Valuation Equity: Closing the Racial Wealth Gap by Addressing Mis-valuations for Families and Communities of Color.”
  30. Federal Housing Finance Agency, “Proposed Rule: Quality Control Standards for Automated Valuation Models,” Regulations.gov, June 21, 2023, available at https://www.regulations.gov/document/FHFA-2023-0015-0001. “Section 1473(q) of the Dodd-Frank Act amended Title XI of the Financial Institutions Reform, Recovery, and Enforcement Act of 1989 (title XI) to add a new section 1125 relating to the use of automated valuation models (AVMs) in valuing real estate collateral securing mortgage loans (section 1125) … Section 1125 requires that AVMs, as defined in the statute, adhere to quality control standards designed to ‘(1) ensure a high level of confidence in the estimates produced by AVMs; (2) protect against the manipulation of data; (3) seek to avoid conflicts of interest; (4) require random sample testing and reviews; and (5) account for any other such factor that the agencies determine to be appropriate.’ … Section 1125 provides the agencies with the authority to ‘account for any other such factor’ that the agencies ‘determine to be appropriate.’ Based on this authority, the agencies propose to include a fifth factor that would require mortgage originators and secondary market issuers to adopt policies, practices, procedures, and control systems to ensure that AVMs used in connection with making credit decisions or covered securitization determinations adhere to quality control standards designed to comply with applicable nondiscrimination laws.” See also Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010, Public Law 203, 111th Cong., 2nd sess. (July 21, 2010), available at https://www.govinfo.gov/content/pkg/PLAW-111publ203/pdf/PLAW-111publ203.pdf.
  31. Interagency Task Force on Property Appraisal and Valuation Equity, “Action Plan to Advance Property Appraisal and Valuation Equity: Closing the Racial Wealth Gap by Addressing Mis-valuations for Families and Communities of Color.”
  32. Lopa P. Kolluri, “Appraisal Fair Housing Compliance and Updated General Appraiser Requirements,” U.S. Department of Housing and Urban Development, November 17, 2021, available at https://www.hud.gov/sites/dfiles/OCHCO/documents/2021-27hsgml.pdf.
  33. Office of Fair Housing and Equal Opportunity, “Guidance on Application of the Fair Housing Act to the Screening of Applicants for Rental Housing” (Washington: U.S. Department of Housing and Urban Development, 2024), available at https://www.hud.gov/sites/dfiles/FHEO/documents/FHEO_Guidance_on_Screening_of_Applicants_for_Rental_Housing.pdf; Office of Fair Housing and Equal Opportunity, “Guidance on Application of the Fair Housing Act to the Advertising of Housing, Credit, and Other Real Estate-Related Transactions through Digital Platforms.”
  34. Office of Fair Housing and Equal Opportunity, “Guidance on Application of the Fair Housing Act to the Screening of Applicants for Rental Housing,” pp. 4–6.
  35. Ibid., pp. 11–15.
  36. Ibid., pp. 15–22.
  37. Office of Fair Housing and Equal Opportunity, “Guidance on Application of the Fair Housing Act to the Advertising of Housing, Credit, and Other Real Estate-Related Transactions through Digital Platforms.”
  38. Ibid., pp. 4–10.
  39. Ibid., p. 11.
  40. Ibid., pp. 11–12.
  41. Madison Alder, “Housing and Urban Development names Vinay Singh as chief AI officer,” FedScoop, November 22, 2023, available at https://fedscoop.com/hud-names-vinay-singh-chief-ai-officer; Executive Office of the President, “Executive Order 14110: Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence”; Young, “M-24-10 Memorandum for the Heads of Executive Departments and Agencies: Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence.”
  42. Office of the Assistant Secretary for Fair Housing and Equal Opportunity, “Implementation of the Fair Housing Act’s Discriminatory Effects Standard,” Federal Register 78 (32) (2013): 11460–11482, available at https://www.federalregister.gov/documents/2013/02/15/2013-03375/implementation-of-the-fair-housing-acts-discriminatory-effects-standard.
  43. Office of the Assistant Secretary for Fair Housing and Equal Opportunity, “HUD’s Implementation of the Fair Housing Act’s Disparate Impact Standard,” Federal Register 85 (186) (2020): 60288–60333, available at https://www.federalregister.gov/documents/2020/09/24/2020-19887/huds-implementation-of-the-fair-housing-acts-disparate-impact-standard.
  44. Ibid. This section was an alternative to the 2019 proposed rule’s explicit algorithmic defenses. See U.S. Department of Housing and Urban Development, “Reconsideration of HUD’s Implementation of the Fair Housing Act’s Disparate Impact Standard,” Federal Register 83 (118) (2018): 28560–28561, available at https://www.federalregister.gov/documents/2018/06/20/2018-13340/reconsideration-of-huds-implementation-of-the-fair-housing-acts-disparate-impact-standard.
  45. U.S. Department of Housing and Urban Development, “Reinstatement of HUD’s Discriminatory Effects Standard” (Washington: 2021), available at https://www.hud.gov/sites/dfiles/FHEO/documents/6251-F-02_Discriminatory_Effects_Final_Rule_3-17-23.pdf.
  46. Fair Housing Act of 1968, 90th Cong., 2nd sess. (April 11, 1968), 42 U.S.C. 3605, available at https://uscode.house.gov/view.xhtml?path=/prelim@title42/chapter45&edition=prelim.
  47. Ibid., 42 U.S.C. 3604–5.
  48. Ibid., 42 U.S.C. 3608(a).
  49. Office of the Assistant Secretary for Fair Housing and Equal Opportunity, “Reinstatement of HUD’s Discriminatory Effects Standard,” Federal Register 88 (62) (2023): 19450–19500, available at https://www.federalregister.gov/documents/2023/03/31/2023-05836/reinstatement-of-huds-discriminatory-effects-standard.
  50. Fair Housing Act, 42 U.S.C. 3604(c), as amended.
  51. Office of the Assistant Secretary for Fair Housing and Equal Opportunity, “Implementation of the Fair Housing Amendments Act of 1988,” Federal Register 54 (13) (1989): 3232–3280, available at https://www.govinfo.gov/content/pkg/FR-1989-01-23/pdf/FR-1989-01-23.pdf.
  52. Office of the Assistant Secretary for Fair Housing and Equal Opportunity, “Reinstatement of HUD’s Discriminatory Effects Standard.”
  53. Office of the Assistant Secretary for Fair Housing and Equal Opportunity, “HUD’s Implementation of the Fair Housing Act’s Disparate Impact Standard.” “Defendants may also argue that the policy or practice is reasonably necessary to comply with a third-party requirement which limits the defendant’s discretion. HUD believes that this is an appropriate defense at the pleading stage where the defendant can show, as a matter of law, that the plaintiff’s case should not proceed beyond the pleading stage when considered in light of a binding authority which limits the defendant’s discretion in a manner which shows that the defendant’s discretion could not have plausibly been the direct cause of the disparity. … HUD has included language allowing a defendant to demonstrate that the policy or practice being challenged is intended to predict the occurrence of an outcome, the prediction represents a valid interest, and the outcome predicted by the policy or practice does not or would not have a disparate impact on protected classes compared to similarly situated individuals not part of the protected class.”
  54. See, for example, U.S. Department of Housing and Urban Development, “Advertising and Marketing,” available at https://www.hud.gov/program_offices/fair_housing_equal_opp/advertising_and_marketing (last accessed February 2024); Roberta Achtenberg, “Guidance Regarding Advertisements Under §804(c) of the Fair Housing Act,” Office of the Assistant Secretary for Fair Housing and Equal Opportunity, January 9, 1995, available at https://www.hud.gov/sites/documents/DOC_7784.PDF.
  55. U.S. Department of Housing and Urban Development, “Housing Discrimination Under the Fair Housing Act,” available at https://www.hud.gov/program_offices/fair_housing_equal_opp/fair_housing_act_overview (last accessed February 2024).
  56. Helen R. Kanovsky, “Office of General Counsel Guidance on Application of Fair Housing Act Standards to the Use of Criminal Records by Providers of Housing and Real Estate-Related Transactions,” U.S. Department of Housing and Urban Development, April 4, 2016, available at https://www.hud.gov/sites/documents/HUD_OGCGUIDAPPFHASTANDCR.PDF.
  57. Nichole Nelson, “Tenant Screening Systems Are Unfair To Black and Brown People. Here’s What Our Government Should Do About It,” National Community Reinvestment Coalition, June 27, 2023, available at https://ncrc.org/tenant-screening-systems-are-unfair-to-black-and-brown-people-heres-what-our-government-should-do-about-it.
  58. Kanovsky, “Office of General Counsel Guidance on Application of Fair Housing Act Standards to the Use of Criminal Records by Providers of Housing and Real Estate-Related Transactions.”
  59. Louis v. SafeRent Solutions, statement of interest, U.S. District Court for the District of Massachusetts, Case No. 22cv10800-AK (January 9, 2023), available at https://www.justice.gov/opa/press-release/file/1561526/download.
  60. Mariah de Leon and Natasha Duarte, “Comments to the FHFA on tenant protections for Enterprise-backed multifamily properties,” Upturn, July 31, 2023, available at https://www.upturn.org/work/comments-to-the-fhfa-on-tenant-protections-for-enterprise-backed-multifamily/; U.S. Department of Housing and Urban Development, “Office of Fair Housing and Equal Opportunity (FHEO) Guidance on Compliance with Title VI of the Civil Rights Act in Marketing and Application Processing at Subsidized Multifamily Properties” (Washington: 2022), available at https://archives.hud.gov/news/2022/HUD-Title-VI-Guidance-Multifamily-Marketing-and-Application-Processing.pdf.
  61. Fair Housing Act, including 42 U.S.C.A. § 3605.
  62. Ibid.
  63. United States v. American Institute of Real Estate Appraisers, U.S. District Court for the Northern District of Illinois, 442 F. Supp. 1072 (November 23, 1977), available at https://law.justia.com/cases/federal/district-courts/FSupp/442/1072/2285128/; United States v. American Institute of Real Estate Appraisers, appeal dismissed, 7th U.S. Circuit Court of Appeals, 590 F.2d 242 (December 21, 1978), available at https://casetext.com/case/us-v-am-inst-of-real-estate-appraisers; Fair Housing Act, 42 U.S.C. 3604(a), as amended.
  64. Steptoe v. Savings of America, U.S. District Court for the Northern District of Ohio, 800 F. Supp. 1542, 1546 (August 24, 1992), available at https://law.justia.com/cases/federal/district-courts/FSupp/800/1542/1393612/.
  65. Kolluri, “Appraisal Fair Housing Compliance and Updated General Appraiser Requirements” (noting that “no part of the appraisal analysis or reporting may be based on [protected classes]”).
  66. Federal Housing Finance Agency, “Proposed Rule: Quality Control Standards for Automated Valuation Models.”
  67. Fair Housing Act, 42 U.S.C.A. § 3608.
  68. U.S. Department of Housing and Urban Development, “Affirmatively Furthering Fair Housing (AFFH),” available at https://www.hud.gov/AFFH (last accessed February 2024).
  69. Ibid.
  70. U.S. Department of Housing and Urban Development, “Affirmatively Furthering Fair Housing,” Federal Register 80 (136) (2015): 42272–42371, available at https://www.govinfo.gov/content/pkg/FR-2015-07-16/pdf/2015-17032.pdf.
  71. Office of Fair Housing, “Preserving Community and Neighborhood Choice” (Washington: U.S. Department of Housing and Urban Development, 2020), available at https://www.hud.gov/sites/dfiles/ENF/documents/6228-F-01%20Preserving%20Housing%20and%20Neighborhood%20Choice.pdf.
  72. U.S. Department of Housing and Urban Development, “Affirmatively Furthering Fair Housing,” Federal Register 88 (27) (2023): 8516–8590, available at https://www.federalregister.gov/documents/2023/02/09/2023-00625/affirmatively-furthering-fair-housing.
  73. Executive Office of the President, “Executive Order 14110: Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence”; Office of Fair Housing and Equal Opportunity, “Guidance on Application of the Fair Housing Act to the Advertising of Housing, Credit, and Other Real Estate-Related Transactions through Digital Platforms”; American Civil Liberties Union and others, “Re: Addressing Technology’s Role in Housing Discrimination,” July 13, 2021, available at https://www.upturn.org/static/files/letter-to-ostp-on-housing-technologies-20210713.pdf; Office of Fair Housing and Equal Opportunity, “Part 109–Fair Housing Advertising,” available at https://www.hud.gov/sites/dfiles/FHEO/documents/BBE%20Part%20109%20Fair%20Housing%20Advertising.pdf (last accessed March 2024).
  74. Office of Public Affairs, “Justice Department Secures Groundbreaking Settlement Agreement with Meta Platforms, Formerly Known as Facebook, to Resolve Allegations of Discriminatory Advertising.”
  75. See, for example, Yu, Rieke, and Duarte, “Urging the Biden Administration to Address Technology’s Role in Housing Discrimination.”
  76. Office of Fair Housing and Equal Opportunity, “Guidance on Application of the Fair Housing Act to the Advertising of Housing, Credit, and Other Real Estate-Related Transactions through Digital Platforms.”
  77. U.S. Government Publishing Office, “12 U.S.C.A. § 3354,” available at https://www.govinfo.gov/content/pkg/USCODE-2022-title12/pdf/USCODE-2022-title12-chap34A-sec3354.pdf (last accessed May 2024).
  78. Federal Housing Finance Agency, “Proposed Rule: Quality Control Standards for Automated Valuation Models.”
  79. Office of the Comptroller of the Currency and others, “Quality Control Standards for Automated Valuation Models,” Federal Register 88 (118) (2023): 40638–40675, available at https://www.federalregister.gov/documents/2023/06/21/2023-12187/quality-control-standards-for-automated-valuation-models.
  80. Ibid.
  81. Dennis Kelleher, “Re: Quality Control Standards for Automated Valuation Models – OCC Docket ID OCC – 2023-0002; Board Docket No. R-1807 and RIN No. 7100 AG60; FDIC RIN 3064-AE68; NCUA Docket Number NCUA-2023-0019 and RIN 3133-AE23; CFPB Docket No. CFPB-2023-0025; FHFA RIN 2590-AA62; 88 Fed. Reg. 40638 (Jun. 21, 2023),” Better Markets, August 21, 2023, available at https://www.regulations.gov/comment/OCC-2023-0002-0011. See Rica Dela Cruz and Gaby Villaluz, “Nonbank lenders shed mortgage market share as originations plummet in 2022,” S&P Global, July 13, 2023, available at https://www.spglobal.com/marketintelligence/en/news-insights/latest-news-headlines/nonbank-lenders-shed-mortgage-market-share-as-originations-plummet-in-2022-76481554.
  82. National Institute of Standards and Technology, “NIST AI RMF Playbook,” available at https://airc.nist.gov/AI_RMF_Knowledge_Base/Playbook (last accessed February 2024).
  83. Young, “M-24-10 Memorandum for the Heads of Executive Departments and Agencies: Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence.”
  84. Legal Information Institute, “12 U.S.C. § 3354 – Automated valuation models used to estimate collateral value for mortgage lending purposes,” available at https://www.law.cornell.edu/uscode/text/12/3354 (last accessed May 2024). See, for example, Alexei Alexandrov, Laurie Goodman, and Michael Neal, “Reengineering the Appraisal Process: Better Leveraging Both Automated Valuation Models and Manual Appraisals” (Washington: Urban Institute, 2023), p. 18, available at https://www.urban.org/sites/default/files/2023-01/Reengineering%20the%20Appraisal%20Process.pdf.
  85. Legal Information Institute, “12 U.S.C. § 3354(a)(1)–(3).”
  86. For a more fulsome analysis of the FCRA’s history and authority, see “Fair Credit Reporting Act” section of Todd Phillips and Adam Conner, “Financial Regulatory Agencies,” in Will Dobbs-Allsopp and others, Taking Further Agency Action on AI (Washington: Center for American Progress, 2024), available at https://www.americanprogress.org/article/financial-regulatory-agencies-chapter/.
  87. Consumer Financial Protection Bureau, “CFPB Reports Highlight Problems with Tenant Background Checks.” The CFPB has already recognized that tenant screening reports often rely on untrustworthy data, creating barriers to housing.
  88. “Fair Credit Reporting Act” section of Phillips and Conner, “Financial Regulatory Agencies”; Legal Information Institute, “15 U.S.C. § 1681c – Requirements relating to information contained in consumer reports,” available at https://www.law.cornell.edu/uscode/text/15/1681c (last accessed May 2024).
  89. “Fair Credit Reporting Act” section of Phillips and Conner, “Financial Regulatory Agencies.”



Author

Anna Rodriguez

Policy Counsel

Governing for Impact
