Independent Student Newspaper Since 1969

The Badger Herald


Mathematician Cathy O’Neil discusses big data algorithms as ‘weapons of math destruction’

Speaker argues algorithms can have racist implications
Mackenzie Christman

The University of Wisconsin Department of Mathematics and UW-iSchool partnered with the University Lectures Committee to host mathematician Cathy O’Neil Tuesday evening at the Fluno Center.  

O’Neil is a multi-faceted individual with a Ph.D. in mathematics from Harvard University. She has published multiple research papers focused on arithmetic algebraic geometry, as well as a book titled Weapons of Math Destruction. Currently, O’Neil works as a data scientist in New York.

O’Neil elaborated on a topic she discussed in her book Weapons of Math Destruction — her belief that big data increases inequality and acts as a threat to democracy.


“Facebook and Google have so much influence over the way we think about truth,” O’Neil said.


We run algorithms in our heads every day, and only some of them are formalized in code, O’Neil said. When individuals are in charge of their own decisions, they recognize those choices as subjective.

O’Neil argued algorithms become problematic when they are applied in situations involving large amounts of data. Algorithms have been used in workplace personality tests to filter out risky job applicants, and in the value-added model created to judge the performance of teachers.

“We think it’s objective because it’s a mathematical algorithm, but we cease to evolve [our thought processes],” O’Neil said.

The biggest problem with using algorithms in this fashion is there are no rules in the world of big data, O’Neil said. Determinations made by algorithms can be overtly racist or sexist just by inputting questions that allow certain inferences to be made.

A practice known as predictive policing is an example of an algorithm with racist implications, O’Neil said. The algorithms involved in predictive policing look at locations of arrests, series of events surrounding the area and other crime data to determine who is more likely to commit crimes.


The problem with predictive policing, besides its racial implications, is the lack of applicable data, O’Neil said.

“There is no crime data,” O’Neil said. “We have arrest data. Most crime does not lead to arrest.”

Algorithms need greater scrutiny, O’Neil said. People should be able to see scores like their credit scores, because those scores are used to make determinations about them that remain secret.

O’Neil suggested a data rights bill or antitrust laws as potential ways to rein in the unrestricted application of algorithms to the daily lives of everyday citizens.

“We need accountability, we need enough transparency so that we can find the harm [in big data algorithms],” O’Neil said.
