“The study compared the crime-predicting powers of an algorithm called COMPAS, already used by multiple states, to those of Amazon’s Mechanical Turk, a sort of micro TaskRabbit where people are paid to complete small assignments. Using an online poll, the researchers asked “turks” to predict recidivism based on a few scant facts about offenders. Given the sex, age, crime charge, criminal degree, and prior convictions in juvenile, felony, and misdemeanor courts of 50 offenders, each of the 400 survey takers had to assess the offenders’ likelihood of reoffending. The Dartmouth researchers had information on whether the offenders in question actually did reoffend…”
“Overall, the turks predicted recidivism with 67 percent accuracy, compared to COMPAS’ 65 percent. Even without access to a defendant’s race, the turks incorrectly predicted that black defendants would reoffend more often than they incorrectly predicted that white defendants would reoffend—that is, their false positive rate was higher for black defendants. This indicates that even when racial data isn’t available, certain data points—like number of convictions—can become proxies for race, a central obstacle to eradicating bias in these algorithms.”
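To make the two metrics in the excerpt concrete, here is a minimal sketch of how overall accuracy and per-group false positive rates are computed. The data, group labels, and function names below are entirely hypothetical illustrations, not the study’s actual data or code.

```python
# Illustrative sketch: overall accuracy and per-group false positive rate.
# 1 = predicted/actual reoffense, 0 = no reoffense. Toy data only.

def accuracy(preds, actuals):
    """Fraction of predictions that match actual outcomes."""
    return sum(p == a for p, a in zip(preds, actuals)) / len(preds)

def false_positive_rate(preds, actuals):
    """Among people who did NOT reoffend, the fraction predicted to reoffend."""
    negatives = [(p, a) for p, a in zip(preds, actuals) if a == 0]
    return sum(p == 1 for p, a in negatives) / len(negatives)

# Hypothetical records: (prediction, actual outcome, group label).
records = [
    (1, 0, "A"), (1, 1, "A"), (0, 0, "A"), (1, 0, "A"),
    (0, 0, "B"), (1, 1, "B"), (0, 0, "B"), (0, 0, "B"),
]

preds = [r[0] for r in records]
actuals = [r[1] for r in records]
print(f"overall accuracy: {accuracy(preds, actuals):.2f}")

for g in ("A", "B"):
    gp = [r[0] for r in records if r[2] == g]
    ga = [r[1] for r in records if r[2] == g]
    print(f"group {g} false positive rate: {false_positive_rate(gp, ga):.2f}")
```

The point the article makes is visible even in this toy setup: two groups can face very different false positive rates even though the classifier never sees the group label, so long as some input feature correlates with group membership.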