MATH 385 Week 10 Reading Assignment

Readings

Prompts

    Risk assessment software is thought to take biases out of the decision-making process by giving an objective estimate of the likelihood of reoffending. The issue is that these algorithms have been found to contain racial biases. A study by ProPublica found that COMPAS, risk assessment software used by the Florida judicial system, was wrongly assessing people in a way that correlated with race
  • What are the basic issues with relying on software to assess the risk posed by convicted criminals?
  • the program wrongly labeled African American defendants as future criminals at twice the rate of white defendants
  • Is this as bad as it sounds? What accuracy did the court systems have before they implemented COMPAS?
  • the percentage of white defendants labeled low risk who went on to reoffend is 47.7%, while for African American defendants it is 28.0%
  • What type of statistic is being quoted here?
  • Similarly, it seems suspicious that the vast majority of white defendants receive the minimum risk score of one, while black defendants receive a score of one about as often as they receive a score of ten
  • What distributions are they describing for black and white defendants? What distributions would be more fair?
  • While the algorithm cannot create its own biases, these past convictions are affected by disproportionate arrest rates of African American men and other inequities that have been consistently present in the justice system. ... It should therefore be the responsibility of the engineers to evaluate racial biases and remove them from the system’s learning
  • So what could Northpointe, the company that owns COMPAS, do to de-bias their software? How could they do better?
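The percentages quoted in the prompts above are conditional error rates computed from a confusion matrix. A minimal sketch, using made-up counts for a single group (not ProPublica's actual data), showing how the false positive and false negative rates in the debate are defined:

```python
# Hypothetical counts for one group of defendants, chosen only for
# illustration -- these are NOT ProPublica's or Northpointe's numbers.
# "high risk" = COMPAS predicted reoffense; "reoffended" = actually did.
high_risk_reoffended = 30      # true positives
high_risk_no_reoffense = 20    # false positives
low_risk_reoffended = 10       # false negatives
low_risk_no_reoffense = 40     # true negatives

# False positive rate: of defendants who did NOT reoffend,
# what fraction were labeled high risk?
fpr = high_risk_no_reoffense / (high_risk_no_reoffense + low_risk_no_reoffense)

# False negative rate: of defendants who DID reoffend,
# what fraction were labeled low risk? The 47.7% vs 28.0% figures
# quoted above are this kind of conditional statistic, computed
# separately for white and African American defendants.
fnr = low_risk_reoffended / (low_risk_reoffended + high_risk_reoffended)

print(f"False positive rate: {fpr:.1%}")
print(f"False negative rate: {fnr:.1%}")
```

Comparing these rates across racial groups, rather than overall accuracy, is what revealed the disparity ProPublica reported: the two groups can have similar overall accuracy while their false positive and false negative rates differ sharply.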