# MATH 385 Week 10 Reading Assignment
Please submit a Markdown file to your GitHub repository for the Week 10 Reading Assignment.
Please write at least 400 words on your opinions, thoughts, concerns, questions, and/or comments about the article(s) below.
## Readings

- ProPublica, "Machine Bias" (Angwin, Larson, Mattu, and Kirchner, 2016), on the COMPAS risk assessment software
## Prompts
- What are the basic issues raised by relying on software to produce risk assessments of convicted criminals?
- Is this as bad as it sounds? What accuracy did the court systems have before they implemented COMPAS?
- What type of statistic is being quoted here?
- What distributions are they describing for black and white defendants? What distributions would be more fair?
- So what could Northpointe, the company that owns COMPAS, do to de-bias their software? How could they do better?
Risk assessment software is thought to take bias out of the decision-making process by giving an objective estimate of a defendant's risk of reoffending. The issue is that these algorithms have been found to contain racial biases. A study by ProPublica found that COMPAS, the risk assessment software used by the Florida judicial system, was producing erroneous assessments in a way that correlated with race.
The program wrongly labeled African American defendants as future criminals at twice the rate of white defendants.
Meanwhile, among defendants who went on to reoffend, 47.7% of white defendants had been labeled low risk, compared with 28.0% of African American defendants; in other words, the false negative rate also differs sharply by race (see the sketch below).
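Both of those figures are group-conditional error rates, not a single overall accuracy number. Here is a minimal Python sketch, using made-up counts purely for illustration (these are not ProPublica's actual numbers), of how the two rates are computed per group:

```python
# Hypothetical per-group confusion-matrix counts (fabricated for illustration;
# not ProPublica's raw data). tp/fp/tn/fn treat "high risk" as the positive label.
counts = {
    "white": {"tp": 52, "fp": 24, "tn": 76, "fn": 48},
    "black": {"tp": 72, "fp": 45, "tn": 55, "fn": 28},
}

for group, c in counts.items():
    # False positive rate: labeled high risk among those who did NOT reoffend.
    fpr = c["fp"] / (c["fp"] + c["tn"])
    # False negative rate: labeled low risk among those who DID reoffend.
    fnr = c["fn"] / (c["fn"] + c["tp"])
    print(f"{group}: FPR = {fpr:.1%}, FNR = {fnr:.1%}")
```

The pattern mirrors the statistics quoted above: the false positive rate is higher for black defendants, while the false negative rate is higher for white defendants.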
Similarly, it seems suspicious that the vast majority of white defendants receive the minimum score of one, while black defendants receive a score of one about as often as they receive a score of ten.
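One quick way to see that kind of skew is to compare the decile-score histograms for the two groups directly. A sketch with fabricated scores (the weights below are assumptions chosen only to mimic the shape being described, not real COMPAS data):

```python
from collections import Counter
import random

random.seed(0)

# Fabricated decile scores (1-10): one group piled up at the low end,
# the other spread nearly uniformly across all ten scores.
white_scores = random.choices(range(1, 11), weights=[10, 6, 4, 3, 2, 2, 1, 1, 1, 1], k=1000)
black_scores = random.choices(range(1, 11), weights=[1] * 10, k=1000)

for name, scores in [("white", white_scores), ("black", black_scores)]:
    hist = Counter(scores)
    print(name)
    for score in range(1, 11):
        share = hist[score] / len(scores)
        print(f"  score {score:2d}: {'#' * round(share * 100)} {share:.0%}")
```

What a "fair" pair of distributions should look like is exactly what the prompt asks; matched shapes are one candidate answer, but only if the groups' underlying reoffense rates are actually comparable.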
While the algorithm cannot create its own biases, it learns from historical records such as past convictions, and those records are affected by disproportionate arrest rates of African American men and other inequities that have been consistently present in the justice system. ... It should therefore be the responsibility of the engineers to evaluate racial biases and remove them from the system's learning.
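One concrete, much-debated way engineers could act on a measured disparity is post-processing in the spirit of equalized odds (Hardt et al., 2016): keep the scores but pick a separate high-risk threshold per group so the false positive rates line up. The sketch below is an illustration of that idea under assumed data, not Northpointe's actual method; the scores, labels, and the 0.25 target are all fabricated.

```python
def fpr_at_threshold(scores, reoffended, threshold):
    """False positive rate if scores >= threshold are labeled high risk."""
    negatives = [s for s, y in zip(scores, reoffended) if not y]
    return sum(s >= threshold for s in negatives) / len(negatives)

def pick_threshold(scores, reoffended, target_fpr):
    """Smallest decile threshold whose FPR does not exceed the target."""
    for t in range(1, 11):
        if fpr_at_threshold(scores, reoffended, t) <= target_fpr:
            return t
    return 10

# Fabricated example data: decile scores 1-10 and whether each defendant
# actually reoffended (True/False). Illustrative only.
white = ([2, 3, 5, 7, 1, 4, 8, 2], [False, False, True, True, False, False, True, False])
black = ([6, 7, 9, 4, 8, 5, 10, 3], [False, True, True, False, True, False, True, False])

target = 0.25  # shared false positive rate both groups must meet
for name, (scores, labels) in [("white", white), ("black", black)]:
    t = pick_threshold(scores, labels, target)
    print(f"{name}: threshold {t}, FPR = {fpr_at_threshold(scores, labels, t):.0%}")
```

The catch, shown by later impossibility results (e.g., Chouldechova 2017), is that when the groups' base rates of reoffending differ, equalizing error rates like this generally breaks calibration, which is the trade-off at the center of the COMPAS debate.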