CULTURE & SOCIETY: Big Data

Code 101

Do computer algorithms perpetuate built-in biases?

Kelly Joyce

Joyce is a professor, associate dean for humanities and social science research, and director of the master's program in Science, Technology & Society in the College of Arts and Sciences.

Professor of sociology Kelly Joyce wanted to find out who makes computer algorithms and code, and how their values translate into the work they do.

Joyce partnered with Kristene Unsworth, a visiting assistant professor at Stockton University. They studied teams of computer scientists and engineers who build big data sets, conducting interviews and sitting in on dozens of meetings. Based on those observations, they wrote data-driven scenarios that engage future computer scientists in ethical problem solving.

The result was a study on “The Ethics of Algorithms,” funded by the National Science Foundation. It is one of the first studies to create scenarios specific to algorithms and big data, employing a critical “upstream” approach focused on education.

The decision to focus on big data was influenced by examples of algorithms increasing inequalities in a variety of fields, from health care to law enforcement and business.

As Joyce says, although data about humans may be perceived as “neutral, inclusive or representative,” it is often more complicated than it appears. To be meaningful, data needs context.

“We don’t know the decisions that were made about what to include or exclude in an algorithm — we just encounter the effects of them,” Joyce says. “We don’t want to keep reacting. We want to figure out what is causing the effects and try to prevent them.”
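The effect Joyce describes can be sketched in a few lines of code. The example below is a purely hypothetical illustration (the data, groups, and threshold rule are invented, not drawn from the study): a score cutoff is tuned using records from only one group of applicants, then applied to everyone. The exclusion decision is invisible in the final rule, but its effects are not.

```python
# Hypothetical illustration: each record is (group, score, actually_qualified).
records = [
    ("A", 72, True), ("A", 65, True), ("A", 58, False), ("A", 40, False),
    ("B", 55, True), ("B", 50, True), ("B", 45, False), ("B", 30, False),
]

def best_threshold(data):
    """Pick the score cutoff that maximizes accuracy on the given data."""
    candidates = sorted({score for _, score, _ in data})
    def accuracy(t):
        return sum((score >= t) == qualified
                   for _, score, qualified in data) / len(data)
    return max(candidates, key=accuracy)

# The "exclusion" step: the rule is tuned on group A's records alone.
t_partial = best_threshold([r for r in records if r[0] == "A"])

def error_rate(group, t):
    """Share of a group's records the cutoff t misclassifies."""
    rows = [r for r in records if r[0] == group]
    return sum((score >= t) != qualified
               for _, score, qualified in rows) / len(rows)

print(t_partial)                   # cutoff chosen from group A alone
print(error_rate("A", t_partial))  # low error for the included group
print(error_rate("B", t_partial))  # higher error for the excluded group
```

In this toy data, group B's qualified applicants simply score lower on the same scale, so a cutoff tuned only on group A rejects all of them. Someone who only sees the final threshold encounters the effect of the exclusion without ever seeing the decision that caused it, which is exactly the opacity Joyce describes.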