Executive Order on Artificial Intelligence Includes Actions Impacting Consumer Financial Service Providers

On October 29, the Biden Administration issued a broad Executive Order (Order) on artificial intelligence (AI).  Titled “Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence,” the Order establishes guidelines for AI safety and security, aims to shield Americans’ data privacy, and emphasizes equity and civil rights.  As stated by the White House in its Fact Sheet about the Order, the Order “stands up for consumers and workers” while fostering innovation and competition.

Ballard Spahr has issued a legal alert that provides an overview of the Order.  In this blog post, we highlight the provisions of the Order that are most noteworthy for providers of consumer financial services that use AI.

Guidelines and Best Practices

For consumer financial services providers that use proprietary AI, the Order includes provisions directed at AI developers and designers.  It directs the Secretary of Commerce, acting through the Director of the National Institute of Standards and Technology, in coordination with certain other agencies, to “establish guidelines and best practices, with the aim of promoting consensus industry standards for developing and deploying safe, secure, and trustworthy AI systems.”

AI and Civil Rights

Last October, the White House identified a framework of five principles, also known as the “Blueprint for an AI Bill of Rights,” to guide the design, use, and deployment of automated systems and AI.  One of those principles is that automated systems should be used and designed in an equitable way to prevent algorithmic discrimination.  For instance, measures should be taken to prevent unfavorable outcomes based on protected characteristics.  In April 2023, the CFPB, FTC, Justice Department, and Equal Employment Opportunity Commission issued a joint statement about enforcement efforts “to protect the public from bias in automated systems and artificial intelligence.”  (CFPB Director Chopra has repeatedly raised concerns that the use of AI can result in unlawful discriminatory practices.)

Building on those developments, in order “to address discrimination and biases against protected groups in housing markets and consumer financial markets,” the Order encourages the CFPB Director and the Director of the Federal Housing Finance Agency to consider using their authorities, as they deem appropriate, to require their respective regulated entities, where possible, to do the following:

  • Use appropriate methodologies, including AI tools, to ensure compliance with federal law; 

  • Evaluate their underwriting models for bias or disparities affecting protected groups; and

  • Evaluate automated collateral valuation and appraisal processes in ways that minimize bias.

The Order also requires the Secretary of Housing and Urban Development and “encourage[s]” the CFPB Director, in order “to combat unlawful discrimination enabled by automated algorithmic tools used to make decisions about access to housing and in other real estate-related transactions,” to issue additional guidance within 180 days of the date of the Order that addresses:

  • The use of tenant screening systems in ways that may violate the Fair Housing Act, the Fair Credit Reporting Act, or other relevant federal laws, including how the use of data, such as criminal records, eviction records, and credit information, can lead to discriminatory outcomes in violation of federal law; and

  • How the Fair Housing Act, the Consumer Financial Protection Act, or the Equal Credit Opportunity Act apply to the advertising of housing, credit, and other real estate-related transactions through digital platforms, including those that use algorithms to facilitate advertising delivery, as well as best practices to avoid violations of federal law.

Protecting Consumers  

The Order encourages independent regulatory agencies, as they deem appropriate, to consider using the full range of their authorities to protect consumers from fraud, discrimination, and threats to privacy, and to address other risks that may arise from AI, including risks to financial stability.  The agencies are also encouraged to consider rulemaking, as well as emphasizing or clarifying where existing regulations apply to AI; to clarify the responsibility of regulated entities to conduct due diligence on, and monitor, any third-party AI services they use; and to emphasize or clarify requirements and expectations related to the transparency of AI models and regulated entities’ ability to explain their use of AI models.

The impact of AI on the consumer financial services industry has been the focus of two episodes of our Consumer Finance Monitor Podcast.  The episodes are available here and here.
