From predictive policing to the social media tracking of potential terrorists, governments worldwide rely increasingly on data-driven risk assessments. On March 21-22, the Robert L. Bernstein Institute for Human Rights hosted its second annual conference to address the human rights implications of this trend. The conference, “Tyranny of the Algorithm? Predictive Analytics & Human Rights,” featured over 30 speakers, including Samuel Rascoff, NYU professor and former director of intelligence analysis at the NYPD, and Latanya Sweeney, Harvard professor and former Chief Technologist at the Federal Trade Commission. It was attended by over 200 academics and practitioners. Consistent with the Institute's interdisciplinary mission, speakers and attendees came from a range of backgrounds, including legal academia, computer science, business, anthropology, and law enforcement.
Some speakers suggested that data-driven risk assessments may be a promising way to mitigate racial, gender, or religious bias in policing and counter-terrorism operations. Jeff Brantingham, UCLA professor of anthropology and co-founder of the predictive policing company PredPol, argued that algorithms can predict certain crime “hot spots” more accurately and objectively than human analysts. But the technology comes with risks. Patrick Ball, co-founder of the Human Rights Data Analysis Group, explained how machine learning produces wrong answers when unrepresentative data is fed into predictive algorithms. Jason Q. Ng, a research fellow at The Citizen Lab and a data analyst at Tumblr, raised potential human rights concerns about the Chinese government's use of seemingly benign predictive technology.
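Ball's point about unrepresentative data can be illustrated with a toy simulation (a hypothetical sketch, not any speaker's actual model): two districts with the same underlying crime rate, where one is patrolled more heavily and so has more of its incidents recorded. A naive “hot spot” model trained on the recorded incidents flags the heavily patrolled district, mistaking patrol intensity for crime.

```python
import random

random.seed(0)

# Hypothetical setup: districts A and B have the SAME true crime rate,
# but A is patrolled more heavily, so more of its incidents are recorded.
TRUE_RATE = 0.10                      # identical underlying crime rate
RECORDING_PROB = {"A": 0.8, "B": 0.4}  # chance an incident enters the data

records = []
for _ in range(10_000):
    district = random.choice(["A", "B"])
    crime_occurred = random.random() < TRUE_RATE
    # Only recorded incidents reach the training data.
    if crime_occurred and random.random() < RECORDING_PROB[district]:
        records.append(district)

# A naive "hot spot" predictor: rank districts by recorded incidents.
counts = {d: records.count(d) for d in ("A", "B")}
hot_spot = max(counts, key=counts.get)
print(counts)
print(hot_spot)  # "A" -- an artifact of recording bias, not of crime
```

Roughly twice as many incidents are recorded in district A, so the model labels A the hot spot even though the true crime rates are identical; directing more patrols there would then generate still more recorded incidents, reinforcing the skew.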
Speakers also addressed ways to improve human rights accountability in the field of predictive analytics. One approach, discussed by Frank Romo, an analyst with Million Hoodies for Justice, is for communities to use the tools of predictive analytics to monitor government activity: just as predictive technology can help police anticipate sites of future crime, it can also be used to predict sites of future police violence. And Sorelle Friedler, a computer science professor at Haverford College and a fellow at Data & Society, spoke about her work on building fair algorithms.
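One common fairness check in this research area, sketched below purely for illustration (the function, groups, and numbers are hypothetical, not drawn from any speaker's system), is the disparate-impact ratio: comparing the rate of favorable outcomes across two groups, with a ratio below 0.8 often treated as a warning sign under the “four-fifths rule” from U.S. employment law.

```python
def disparate_impact_ratio(outcomes_a, outcomes_b):
    """Ratio of favorable-outcome rates between two groups.

    Each argument is a list of 1 (favorable outcome) or 0 (unfavorable).
    A ratio below 0.8 is commonly read as evidence of disparate impact.
    """
    rate_a = sum(outcomes_a) / len(outcomes_a)
    rate_b = sum(outcomes_b) / len(outcomes_b)
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Toy audit: group A is scored "low risk" 30% of the time, group B 60%.
group_a = [1] * 30 + [0] * 70
group_b = [1] * 60 + [0] * 40
ratio = disparate_impact_ratio(group_a, group_b)
print(round(ratio, 2))  # 0.5 -- well below the 0.8 rule of thumb
```

Checks like this are diagnostics rather than fixes; the harder problem, which work on building fair algorithms takes up, is adjusting models so such disparities do not arise in the first place.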