CFPB issues guidance on credit denials that use artificial intelligence


The Consumer Financial Protection Bureau warned lenders of the requirement to provide specific and accurate reasons when denying credit to a consumer, reiterating the agency's skepticism of artificial intelligence and advanced algorithms in underwriting decisions. 

On Tuesday, the CFPB issued guidance on the use of artificial intelligence in underwriting and the explanations given to consumers who are denied credit. The bureau said that creditors are relying inappropriately on a checklist of reasons provided by the CFPB in sample forms. Instead, the bureau said, creditors must provide specific reasons and details explaining why a consumer is denied credit or why a credit limit is changed.

"Creditors must be able to specifically explain their reasons for denial. There is no special exemption for artificial intelligence," CFPB Director Rohit Chopra said in a press release. "Technology marketed as artificial intelligence is expanding the data used for lending decisions, and also growing the list of potential reasons for why credit is denied."

The agency also warned lenders against using data harvested from consumer surveillance or data that is not typically found in a consumer's credit file or credit application. The bureau said that consumers can be harmed by the use of surveillance data given that "some of these data may not intuitively relate to the likelihood that a consumer will repay a loan." 

Under the Equal Credit Opportunity Act, a landmark 1974 anti-discrimination statute, a creditor is required to tell an applicant why it denied an application or why it revoked or changed the terms of an existing extension of credit. The explanation is known as an adverse action notice.

Credit applicants and borrowers receive adverse action notices when credit is denied, an existing account is terminated or an account's terms are changed. The notices discourage discrimination and help applicants and borrowers understand the reasons behind a creditor's decisions, the CFPB said. The CFPB said that a lender is not in compliance with ECOA if the reasons given to the consumer are "overly broad, vague, or otherwise fail to inform the applicant of the specific and principal reasons for an adverse action."

The guidance serves as a warning to lenders that rely on the CFPB's sample forms and their accompanying checklist of reasons for denying credit. The bureau said that creditors that select inaccurate reasons from the checklist are not in compliance with the law.

The CFPB's guidance states that the specific reasons disclosed for denying credit or changing the terms of an account must "relate to and accurately describe the factors actually considered or scored by a creditor." Such "specificity" is necessary to ensure that a consumer understands the explanation and that the lender does not obfuscate the principal reasons for the decision, the bureau said.

As an example, the CFPB said that if a creditor decides to lower the limit on a consumer's credit line based on behavioral spending data, the creditor would need to provide more details about the specific negative behaviors that led to the reduction beyond checking a general reason such as the consumer's "purchasing history."

Last year the CFPB issued an advisory opinion clarifying that lenders must provide adverse action notices to borrowers with existing credit whenever an unfavorable decision is made against them. At the time, the CFPB did not analyze how lenders that use complex algorithms can meet the adverse action requirements in ECOA. The current guidance attempts to bridge that gap.

The CFPB has taken a variety of regulatory actions related to AI in recent years, including telling landlords to notify prospective tenants when they are denied housing, and issuing a proposed rule with other federal agencies on automated valuation models. The bureau is working to ensure that black-box AI models do not lead to what it calls digital redlining in the mortgage market.
