We should think about how this idea might apply to our community...
A recent ProPublica analysis of The Princeton Review’s prices for online SAT tutoring shows that customers in areas with a high density of Asian residents are often charged more. When presented with this finding, The Princeton Review called it an “incidental” result of its geographic pricing scheme. The case illustrates how even a seemingly neutral price model can lead to inadvertent bias—bias that’s hard for consumers to detect and even harder to challenge or prove.
Over the past several decades, an important tool for assessing and addressing discrimination has been the “disparate impact” theory. Attorneys have used it to successfully challenge policies that have a discriminatory effect on certain groups of people, whether or not the entity that crafted the policy intended to discriminate. It has been deployed in lawsuits involving employment decisions, housing, and credit. The question now is whether the theory can be applied to bias that results from new technologies that rely on algorithms.