Algorithms amplify historic racism of US financial systems

Federal regulators must be vigilant against sophisticated digital discrimination by financial institutions, online banks and social media platforms like Facebook, said Rohit Chopra, the director of a government consumer watchdog agency.

“We are quite focused on the future of how technology is going to reshape lending and financial services,” said Chopra, director of the Consumer Financial Protection Bureau. “A lot of that relates to what we think is digital redlining.”

In the past, redlining meant financial institutions refusing to lend in particular geographic areas, he said.

“But now with these highly targeted algorithms it’s becoming more and more difficult to understand where there might be exclusion or discrimination,” Chopra said.

He was in Philadelphia for a conference at the Federal Reserve Bank of Philadelphia and gave The Philadelphia Tribune an exclusive interview about digital discrimination.

In March, a Wells Fargo customer, Aaron Braxton, who is Black, filed a lawsuit in US District Court in Northern California, alleging racial discrimination after he was denied a mortgage to refinance his home. According to the complaint, the bank’s algorithms have amplified the historic racism of the US financial system.

In June, the US Justice Department reached an agreement with Meta, which owns Facebook, to settle a discrimination lawsuit in federal court charging the social media platform with a violation of the Fair Housing Act for the use of an algorithm.

In the complaint, the government alleged that Facebook was allowing landlords to market their housing advertisements in discriminatory ways. For instance, Facebook allowed advertisers to target housing-related ads by race, gender, religion and other characteristics, in a way that enabled discrimination.

Kristen Clarke, assistant attorney general of the US Justice Department's civil rights division, said then that it was the first time Meta had agreed to terminate one of its algorithmic tools and modify its tools for delivering housing ads in response to a discrimination lawsuit.

As part of the settlement, Facebook agreed to scrap one of its algorithmic tools and build a new automated advertising system to make sure that its housing ads are seen by a more equitable audience. The company also agreed to pay a $115,054 penalty, the largest amount allowed by law.

A spokesperson for Facebook said that it would refrain from using ZIP codes alone to target customers, as part of the agreement.

An algorithm is a complex formula or an advanced computational method that can be used to make decisions.

“Think of it as, it’s using variables to come up with some sort of decision. That decision may be whether or not you get a loan or whether or not you receive an advertisement for it,” Chopra said. “A lot of these algorithms rely on a huge amount of data about you.”

For example, the data a company relies on to make a decision may have nothing to do with your creditworthiness or past debts, but could be based on things like your browsing history, your geolocation or whether or not you attend church, Chopra said. “It might also include something totally unrelated to the commercial transaction.”

Google and Facebook hold the largest shares of digital ad revenue in the US, with Google at 28.6% and Facebook at 23.8% — together, more than 50% of the total.

Meanwhile, the June settlement with the Justice Department resolved just one of several sets of allegations against the social media platform over its algorithms related to housing, employment and credit.

In July 2018, Washington state Attorney General Bob Ferguson announced that Facebook had signed a legally binding agreement to end advertisers' ability to exclude people from seeing certain ads based on race, ethnicity, religion, sexual orientation and other protected characteristics through its targeting tools and algorithms.

According to Ferguson, third parties were able to discriminate by not allowing certain individuals to see their ads for credit, employment, housing, insurance and lending.

As part of the agreement, Facebook was required to stop allowing advertisers to use algorithms to exclude people from seeing ads for public accommodations, such as restaurants and housing, as well as ads for credit, employment and insurance. In addition, the social media platform was required to pay $90,000 in costs and fees.

During a 20-month investigation, the Washington attorney general's office placed 20 fake ads on Facebook, posing as bankers, employers, insurance firms and apartment landlords. In each case, the office was able to exclude certain minorities from seeing its ads.

“When you search listings, there has been empirical research that shows that people with equal qualifications, a minority candidate might not even be shown the listing, because they (advertisers) are using targeted analytics,” said Chopra, CFPB director.

To counteract these practices, Chopra said the government can utilize one of its longstanding anti-discrimination lending laws: the Equal Credit Opportunity Act.

Under the Equal Credit Opportunity Act, if you are denied credit or receive another adverse decision, the lender is required to give you a clear statement of the reasons why that adverse action occurred, he said.

“We have issued a legal interpretation that says just because you use a fancy algorithm doesn’t mean you don’t have to do that,” Chopra said. “If you can’t tell us what that machine is saying, you can’t use it.”
