Subject: Computer Science
Redlining" occurred when banks denied mortgage loans in certain neighborhoods because those neighborhoods were determined to be 'high risk.' Independent analysis founds that these city neighborhoods were predominantly populated by minorities. Federal courts determined that denying loans based solely on neighborhood or location was a surrogate for denying loans based on race. The Fair Housing Act outlawed the practice of 'redlining' in 1968.
Today, algorithms use many factors to determine whether a loan application is reasonable or is too risky. Should these algorithms be allowed to use input factors that are highly correlated with race?
Should companies that use algorithms be required to show that, statistically speaking, the algorithms are not racially biased? If so, how would this work?
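One way such a requirement could work in practice is a statistical audit of the algorithm's outcomes by demographic group. A minimal sketch, assuming made-up loan decisions and using the "four-fifths" (80%) rule, a common legal screen for disparate impact:

```python
# Sketch of one simple statistical bias check: the "four-fifths" (80%) rule,
# a common screen for disparate impact. All data below is hypothetical.

def approval_rate(decisions):
    """Fraction of applications approved (1 = approve, 0 = deny)."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower group's approval rate to the higher group's.
    A ratio below 0.8 is commonly treated as evidence of disparate impact."""
    ra, rb = approval_rate(group_a), approval_rate(group_b)
    low, high = sorted([ra, rb])
    return low / high

# Hypothetical decisions produced by a lending algorithm for two groups.
group_a = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]   # 80% approved
group_b = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0]   # 40% approved

ratio = disparate_impact_ratio(group_a, group_b)
print(f"Disparate impact ratio: {ratio:.2f}")  # prints 0.50
if ratio < 0.8:
    print("Fails the four-fifths rule: possible disparate impact.")
```

This checks only outcome rates, not the input factors; a fuller audit would also test whether error rates (false denials) differ by group, since an algorithm can equalize approval rates while still misclassifying one group more often.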
Should employers be allowed to look at a job applicant's credit report when making a hiring decision? If companies are allowed to consider credit score when hiring, what are the potential unintended consequences?