The Badger Herald
Independent Student Newspaper Since 1969

UW researchers untangle bias in technology, social media

Photo: Flickr user hackNY.org

As the world continues to depend on global interconnection, individuals rely heavily on technology and software applications to drive the economy, news, communication and entertainment. But even technology and media aren’t immune to hidden, underlying bias, the Pew Research Center reports, and that bias continues to grow.

According to the Harvard Business Review, the roots of racism and biased microaggressions are subconscious, shaped by people’s surroundings and the heavy weight of U.S. history. Yet they’re almost impossible to recognize, especially in the digital age.

Eric Ely, teaching faculty in the University of Wisconsin Information School, said the historical foundation of the United States, where colonization pushed Indigenous people out, normalizes whiteness in U.S. consumer culture and contributes to the racial biases seen today.


Ely said bias in technology starts with the developers, who are mostly white men in computer science fields. That homogeneity contributes to racial and gender disparities and the underrepresentation of many groups.


“The people who are building [technology], are doing so from their own experiences and frame of reference,” Ely said. “These biases are prevalent and pervasive, and you can’t recognize what you don’t know.”

In 2014, the U.S. Equal Employment Opportunity Commission reported the high tech sector employed 63.5–68.5% white people, 5.8–14% Asian Americans and 7.4–14.4% African Americans.

Among high tech executives, 80% were men, and African Americans made up only 2–5.3%, according to the EEOC.

Racial bias can also creep into training sets, Ely said. According to the Data Conversation Laboratory, training sets are the collections of examples used to teach algorithms how to apply concepts and produce results.

For example, Ely said facial recognition software can run into problems when the faces in its training set look similar or share the same skin tone.
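
How a skewed training set translates into unequal performance can be shown with a toy example. The sketch below is purely illustrative and not drawn from any system the researchers discussed: the data is synthetic, the two “groups” are hypothetical stand-ins for demographic groups, and scikit-learn’s off-the-shelf logistic regression plays the role of the recognition model. Trained mostly on one group, the classifier scores noticeably worse on the other.

```python
# Illustrative sketch: a classifier trained on a skewed dataset
# performs unevenly across groups. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, center, label_noise=0.1):
    # Synthetic "face feature" vectors clustered around a group-specific center.
    X = rng.normal(loc=center, scale=1.0, size=(n, 5))
    # Toy ground-truth label: whether the features sum above the group's mean.
    y = (X.sum(axis=1) > center * 5).astype(int)
    flip = rng.random(n) < label_noise  # a little label noise
    y[flip] = 1 - y[flip]
    return X, y

# Skewed training set: 950 examples from group A, only 50 from group B.
Xa, ya = make_group(950, center=0.0)
Xb, yb = make_group(50, center=2.0)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb])
)

# Balanced test sets expose the gap the skew creates.
for name, center in [("group A", 0.0), ("group B", 2.0)]:
    X_test, y_test = make_group(500, center)
    print(name, "accuracy:", round(model.score(X_test, y_test), 3))
```

In this toy setup, simply collecting more group B examples narrows the gap, which is one reason the makeup of training data gets so much attention from researchers.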

Jacob Thebault-Spieker, UW researcher and assistant professor in the Information School, said automatic soap dispensers that fail to detect darker skin tones are one example of technology facilitating racist outcomes. Another is Twitter’s former image-cropping algorithm, which cropped the faces of darker-skinned users out of previews.


While Thebault-Spieker said software developers may not be inherently biased, he offered a different perspective: developers are increasingly cognizant of certain biases, yet others still persist.

Combating these ingrained biases starts with an awareness that they exist, Ely said.

“In general, a lot of people who use social media are not necessarily aware of all of these issues,” Ely said. “Raising awareness can be done in schools and different social institutions. Social media companies can be proactive in discussing issues that exist with their products and dealing with these issues in a more transparent way.”

Thebault-Spieker said bias in technology is often more complicated than pointing at the designers and software developers who write the algorithms.

Though biases will always exist, there is significant room for improvement, according to Thebault-Spieker.


Thebault-Spieker said artificial intelligence systems like ChatGPT may not be biased in their algorithms so much as in the searches and prompts people feed into them.

The same applies to social media platforms, which portray exactly what people choose to share, Thebault-Spieker said.

“The other piece is that big data companies rely on social data, and much of data is shaped by the social world that it sits in,” Thebault-Spieker said. “People’s inputs and searches into data dictate how the system operates.”
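
Thebault-Spieker’s point about social data can be seen in a few lines of code. The sketch below is purely illustrative, with an invented “search log” standing in for the data a platform collects: a frequency-ranked autocomplete has no biased rule written into it, yet its top suggestion simply echoes whatever its users typed most often.

```python
# Illustrative sketch: a system built on social data inherits its slant.
from collections import Counter

# Invented search log; the skew below is the users', not the code's.
search_log = [
    "nurses are women", "nurses are women", "nurses are women",
    "nurses are underpaid",
    "nurses are men",
]

def autocomplete(prefix, log, k=3):
    # Rank completions purely by how often users searched them.
    counts = Counter(q for q in log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(k)]

print(autocomplete("nurses are", search_log))
# -> ['nurses are women', 'nurses are underpaid', 'nurses are men']
```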

To combat the issue, Thebault-Spieker said it’s important for research and literature to elevate these issues in academia, where many algorithms and systems are born and built. Many academic conferences have added sections dedicated to diversity, equity and inclusion.

The American Psychological Association was one of many organizations close to academia to condemn racism and acknowledge past wrongs against African American people. In its statement, the APA apologized to people of color for being complicit in racially motivated psychological experiments that contributed to systemic inequalities.

Thebault-Spieker said bias and racism in technology are a difficult problem, and in some ways a social one. But the social world people live in isn’t static either.

“Technology development can help shape and sort what the social world looks like in the same way that the social world can shape technology,” Thebault-Spieker said.
