Fed banking regulator warns A.I. could lead to illegal lending practices like excluding minorities


Michael Barr, vice chair for supervision of the board of governors of the Federal Reserve, testifies during a House Committee on Financial Services hearing on Oversight of Prudential Regulators, on Capitol Hill in Washington, DC, on May 16, 2023.
Mandel Ngan | AFP | Getty Images

The Federal Reserve’s top banking regulator expressed caution Tuesday about the impact that artificial intelligence can have on efforts to make sure underserved communities have fair access to housing.

Michael S. Barr, the Fed’s vice chair for supervision, said AI technology has the potential to get credit to “people who otherwise can’t access it.”

However, he noted that it can also be used for nefarious purposes, specifically to exclude certain communities from housing opportunities through a practice traditionally known as “redlining.”

“While these technologies have enormous potential, they also carry risks of violating fair lending laws and perpetuating the very disparities that they have the potential to address,” Barr said in prepared remarks for the National Fair Housing Alliance.

As an example, he said AI can be manipulated to perform “digital redlining,” which can result in majority-minority communities being denied access to credit and housing opportunities. “Reverse redlining,” by contrast, happens when lenders push “more expensive or otherwise inferior products” to minority areas.

Barr said work being done by the Fed and other regulators on the Community Reinvestment Act will focus on making sure underserved communities have equal access to credit.
