CFPB tells businesses ‘black box’ credit models used by banks, other lenders must not discriminate | Ballard Spahr LLP


[author: Richard Satran]

The U.S. Consumer Financial Protection Bureau told the companies it regulates that federal anti-discrimination law extends to banks’ and other lenders’ use of algorithmic models in credit decisions. The consumer credit agency’s decision to police bias in what it calls “black box” credit decisions was disclosed in a “consumer financial protection circular” that asserted authority to regulate fintech operations widely used by lenders.

The agency underlined its intention to begin implementing the decision immediately and called on tech workers to file whistleblower complaints when they learn of programming features that promote disparate lending practices. While the CFPB itself does not have a program that offers tip rewards, it could rely on law enforcement partners that do offer whistleblower incentives.

The CFPB used its “circular” policy-making pathway to implement administrative priorities, instead of the much slower actions required in rule-making, as it has aimed for broader oversight of financial firms in a series of recent initiatives. In the new circular published on May 26, it cited provisions of the ECOA, which require lenders to explain to consumers “adverse actions” such as credit denials, as the legal authority supporting its regulatory action.

The CFPB could face legal challenges and obstacles to implementation

The CFPB circular lacked specifics about what actions constitute violations and how the agency will enforce the rule, said Michael Gordon, a partner in the law firm Ballard Spahr LLP and a former senior CFPB official involved in the start-up of the agency.

“But it sends a strong signal that the CFPB will look closely at the use of algorithms in credit decision-making – and will have little tolerance when companies fail to meet their obligation to notify consumers of unfavorable decisions,” Gordon said.

The CFPB’s algorithmic bias initiative follows orders from the Biden administration asking regulators both to crack down broadly on discriminatory lending and to modernize oversight of the financial industry’s digital practices. CFPB Director Rohit Chopra has moved faster than other U.S. agencies to map out fintech oversight in a series of lending bias studies and initiatives. The agency has pushed its oversight beyond financial services companies to include big tech companies that have recently launched consumer credit offerings.

At the same time, the CFPB has stepped up enforcement of existing discriminatory lending laws in a series of recent actions against financial firms, marking a major shift from the Trump era, when relatively few enforcement actions were brought and cases were often resolved with warning letters in place of penalties.

The revamped CFPB innovation unit

The circular carries weight similar to the Securities and Exchange Commission’s risk alerts, which outline priority areas for examination based on the results of past reviews. It warns the industry that discriminatory fintech lending will not be viewed any differently from fair lending violations by brick-and-mortar lenders.

“Companies are not exempt from their legal responsibilities when they let a black box model make lending decisions,” Chopra said. He cited the specific Equal Credit Opportunity Act requirements that companies issue notices explaining the adverse credit decisions they make.

The CFPB, in another similar policy interpretation, recently said it would monitor discriminatory practices in how companies handle credit cards and loans after an application is approved. In that earlier interpretation, the CFPB warned lending institutions that “anti-discrimination protections do not disappear once a customer obtains a loan.”

The agency also recently signaled, in what was considered its most significant recent policy decision, that it would review lenders and service providers for discriminatory practices by applying previously unused authority to combat unfair, deceptive, or abusive acts and practices (UDAAP), encompassed in the Dodd-Frank Act of 2010, which created the bureau.

That authority, which the agency has not invoked in the past, allows it to use UDAAP provisions to police lending practices in actions that can carry large fines. UDAAP gives the CFPB broad authority to pursue practices that show consumer abuse without requiring proof of intent to mislead.

The CFPB changes course with more force

Last month, the CFPB signaled its tougher regulation of fintech when it revamped the Office of Innovation, which under the previous administration had been a vehicle to encourage fintech development with “no-action” letters and regulatory “sandboxes” allowing developers to launch new applications. The revived Office of Competition and Innovation aims to expose anti-competitive practices of big banks and technology companies that hinder smaller innovators.

The new competition unit will “analyze barriers to opening up markets, better understand how big players are crowding out smaller players, organize incubation events and generally facilitate switching financial services providers.” The agency has made clear in its new black box initiative that creditors cannot justify non-compliance with ECOA on the simple ground that the technology they use to assess credit applications is too complicated, too opaque in its decision-making, or too novel.

The agency last October ordered major American technology companies Apple, Facebook, Google, PayPal and Square to provide information about their online payment products and use of customer data, in a move that also put a focus on Chinese tech giants outside its regulatory oversight, including Alipay and WeChat Pay.

The new move toward regulating algorithmic processes could prompt companies to challenge the CFPB’s authority in emerging areas, legal experts said. The CFPB’s legal status was upheld in a contested Supreme Court split decision that nevertheless limited the unchecked authority of the agency’s director. The Supreme Court has generally taken a more skeptical view of regulatory authority in recent decades.

CFPB director targets abuse in digital practices

Weeks after Chopra took over as director of the CFPB last October, the agency launched an initiative to review the practices of big tech companies as consumer finance entities, examining programs such as Buy Now, Pay Later, used by millions of consumers for online purchases. Since then, it has initiated reviews of real estate valuation data metrics and other uses of technology in consumer credit.

“This CFPB circular is part of a broader effort by the agency to prioritize fair lending compliance and improve scrutiny of technological innovations in consumer credit markets,” Gordon said. The CFPB said in its advice on black box practices that its oversight would “extend beyond adverse action notices and ECOA,” citing its recent investigation into automated valuation models in the home appraisal process as an example of the types of behavior its expanded oversight could include.

The CFPB said it plans to invoke the adverse action provision of the ECOA, which “entitles each applicant to a specific explanation if their application for credit has been denied, and that right is not diminished simply because a company uses a complex algorithm that it does not understand.”

The CFPB said it plans to take strong action against data abuses, as “the collection of data about Americans has become large and ubiquitous, giving companies the opportunity to know highly detailed information about their customers before they even interact with them.”

Companies, it said, use “detailed datasets to power their algorithmic decision-making, which is sometimes marketed as ‘artificial intelligence.’” The information gathered from analyzing that data has a wide range of commercial uses by financial companies, including targeted advertising and credit decision-making.
