The Badger Herald
Independent Student Newspaper Since 1969

Perfectly imperfect machines: How artificial intelligence algorithms produce biased decisions

‘This is about social inequality being reproduced in this new merging platform,’ UW sociologist says
Flickr user hackNY.org

Google, Amazon, schools, the government and many other organizations across the world use some form of artificial intelligence (AI). As the algorithms and data science behind AI become more ubiquitous in society, some researchers are focusing their efforts on making these technologies more equitable for everyone.

University of Wisconsin assistant professor of sociology Eunsil Oh is working on a collaborative project with researchers in the computer science department to study how bias can enter algorithms and perpetuate social inequalities.

“I thought algorithm fairness is such a great topic because it kind of prepares you to try to solve these problems that might happen,” Oh said. “I wanted to use my expertise on gender and sociology which is about inequality [and] that does not just include gender. It also includes race. It also includes ethnic backgrounds, sexuality and ableism.”

Oh’s main focus in this project is the gender wage gap. Big companies such as Google and Amazon use algorithms for the first round of screening in their hiring processes. When algorithms are fed years of biased data, they ultimately make biased decisions, Oh said. This can perpetuate gender inequalities in hiring, wages and productivity predictions. Because women have been paid less than men for years, that statistical discrimination persists in the data fed into AI systems.

Additionally, Oh and her team are conducting a meta-analysis of hiring practices. They create fake resumes, controlling for variables such as gender, race and name, then submit the resumes to real job openings and observe which variables affect the likelihood of getting hired.

Concurrently, the project’s computer science team created an algorithm that analyzes these resumes, decides whether or not to hire each candidate and assigns the candidate a starting wage. By doing this, Oh said, the team is able to see how biased algorithms can be and suggest ways to fine-tune them to overcome those biases.
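
To make the dynamic Oh describes concrete, here is a minimal sketch that fits a simple wage predictor to synthetic “historical” data in which women were paid less for identical skill. Every number, variable name and modeling choice below is invented for illustration; this is not the research team’s actual pipeline.

```python
# Illustrative sketch only: a wage model fit to historically biased data
# reproduces the gap it was trained on. All numbers are made up.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Synthetic "historical" records: identical skill distributions,
# but past wages carry a penalty for one group.
gender = rng.integers(0, 2, n)           # 0 = men, 1 = women (toy encoding)
skill = rng.normal(50, 10, n)            # same distribution for both groups
past_wage = 40_000 + 800 * skill - 6_000 * gender + rng.normal(0, 2_000, n)

# Fit an ordinary least-squares wage predictor on skill and gender.
X = np.column_stack([np.ones(n), skill, gender])
coef, *_ = np.linalg.lstsq(X, past_wage, rcond=None)

# Two candidates identical on every merit variable get different offers.
candidates = np.array([[1, 60, 0], [1, 60, 1]])
pred = candidates @ coef
print(f"Predicted starting wage, man:   {pred[0]:,.0f}")
print(f"Predicted starting wage, woman: {pred[1]:,.0f}")  # roughly 6,000 lower
```

Statistically, the model has done nothing unusual; it has simply learned the penalty already present in the historical data, which is the kind of bias the team’s fine-tuning is meant to surface and correct.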

UW computer science professor Jerry Zhu studies methods to ensure AI does its intended job and doesn’t produce unexpected outcomes. He said natural language systems, such as chat bots, can be vulnerable to displaying bias. These systems are designed to “parrot” back things humans have said and done in the past. Since minorities can be underrepresented in the data set, these systems can reproduce social inequalities.

“This is potentially a way for the system to magnify those biases, especially if the system is used to make important decisions,” Zhu said.

AI has already disproportionately affected minorities on some occasions. Facebook had to issue an apology after its software prompted users to keep seeing “videos about primates” after they watched a video featuring Black men, according to NPR. In 2015, Google removed the words gorilla, chimp, chimpanzee and monkey from its image recognition software after the software labeled Black men as “gorillas.”

Additionally, Oh studies how the criminal justice system uses AI. She said bias could lead to unfair bail determinations or sentencing. Each case is unique, though, making it hard to tell whether the AI’s decision was right or wrong.

Police departments often use AI as well, further contributing to the existing systemic issues. 

“I would say the usage of big data has been increasing in policing, and it has been reproducing existing inequality rather than reducing it so far,” Oh said.

The AI used in policing is focused on zip codes, Oh said, which can be problematic since police will patrol neighborhoods that have historically recorded more crime. This leads to more arrests in those neighborhoods. More arrests lead to more police on patrol, and the system starts to spiral out of control.
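
A toy simulation shows how that spiral can get started. In the sketch below, two zip codes have identical true crime rates, but one begins with slightly more patrols; allocating next year’s patrols by each zip code’s share of recorded arrests locks in the imbalance. The rates, numbers and allocation rule are assumptions for illustration, not any department’s actual system.

```python
# Toy feedback-loop simulation; purely illustrative.
import numpy as np

rng = np.random.default_rng(1)
true_crime_rate = np.array([0.5, 0.5])   # identical underlying crime rates
patrols = np.array([100.0, 120.0])       # slight initial imbalance
total_arrests = np.zeros(2)

for year in range(10):
    # Arrests scale with how many officers are present to record crime.
    arrests = rng.poisson(true_crime_rate * patrols)
    total_arrests += arrests
    # Next year's 220 patrol units are allocated by each zip code's
    # share of all arrests recorded so far -- the "data-driven" rule.
    patrols = 220 * total_arrests / max(total_arrests.sum(), 1)
    print(f"year {year}: patrols = {patrols.round(1)}")
# The over-patrolled zip code keeps generating more recorded arrests and
# therefore keeps its larger share, even though true crime is identical.
```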

AI is not all bad, and biased algorithms are usually not intentional. Zhu said any respectable AI software company has no intention of producing biased AI, yet bias can still leak in through its data. Zhu suggested these systems need some kind of guardrail, such as keeping a human in the decision-making process or building a system that is more resistant to biases.

“You could constrain it so that it is aware of certain biases, and you can instruct it to never do something that’s ethically wrong,” Zhu said.
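
One way to picture that kind of guardrail is an explicit check wrapped around the automated decision, with anything that fails the check routed to a human reviewer instead of being applied automatically. The class, field names and threshold below are invented for illustration and are not Zhu’s system.

```python
# Sketch of a guardrail: flag suspicious automated decisions for a human.
from dataclasses import dataclass

@dataclass
class Decision:
    candidate_id: str
    hire: bool
    offered_wage: float

def violates_constraints(decision: Decision, median_wage_for_role: float) -> bool:
    # Example constraint: never auto-approve an offer far below the median
    # wage already paid for the same role, a possible marker of bias.
    return decision.hire and decision.offered_wage < 0.9 * median_wage_for_role

def apply_with_guardrail(decision: Decision, median_wage_for_role: float) -> str:
    if violates_constraints(decision, median_wage_for_role):
        return "escalate to human reviewer"   # human stays in the loop
    return "apply automatically"

print(apply_with_guardrail(Decision("c-001", True, 52_000), 60_000))  # escalate
print(apply_with_guardrail(Decision("c-002", True, 61_000), 60_000))  # apply
```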

Zhu compared AI to a food product, which comes with a nutrition label and an ingredient list. Future AI systems could be required to include a similar “nutrition label” to help users know what they are using, Zhu said. AI is currently missing that label.
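
One possible form for such a label is a short, machine-readable summary of what the system was trained on and where it falls short, loosely inspired by the “model card” idea from the AI research literature. The fields and values below are assumptions for illustration, not a standard Zhu described.

```python
# Hypothetical "nutrition label" for an AI hiring tool; fields are invented.
model_label = {
    "intended_use": "first-round resume screening",
    "training_data": "historical hiring records, 2010-2020",
    "known_gaps": ["women underrepresented in technical roles",
                   "few applicants over age 55"],
    "evaluated_groups": ["gender", "race"],
    "measured_disparities": {"selection_rate_gap_by_gender": 0.08},
    "not_recommended_for": ["final hiring decisions without human review"],
}
```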

The challenge comes in identifying how an AI system makes its decisions. Zhu called these systems “black boxes” since there is currently no way to dissect and fully understand them. Still, he believes continuing this research is important to make AI that is fair for everyone.

“It is extremely important because AI is just like any other product, and you don’t want a product that is defective,” Zhu said.

These instances highlight why it will be important for the development of AI systems to include input from sociologists like Oh.

Oh thinks of herself as a qualitative scholar, meaning her data and observations come from interacting with real people. But once the pandemic hit, she could not gather observations through those interactions. She was drawn to this study by her colleague in the computer science department, Kangwook Lee, who proposed they collaborate on a project centered on social issues. This way, Oh could bring her sociology background to the emerging field of big data.

Oh feels lucky to have her sociological training when approaching this technical field. She hopes studies like this one will lead to more collaboration between sociology and data science, and she believes people should continue to bring a broader philosophical perspective to these data-driven projects to tackle social issues.

“My role is really giving the bigger picture of what this is about,” Oh said. “This is about social inequality being reproduced in this new merging platform and that it’s inevitable for us.”
