Independent Student Newspaper Since 1969

The Badger Herald

AI embodies stereotypes with real consequences, UW professor says

Leslie Bow researches implications of robots depicting female Asian features
Courtesy of pixabay.com

“The Polar Express” is both a classic children’s movie and a classic example of a phenomenon known as the “uncanny valley” because of its use of 3D computer animation to generate animated versions of real humans.

The “uncanny valley” is a term for the point at which feelings toward artificial intelligence, including robots and other anthropomorphized forms, shift from affinity to revulsion and fear. This most commonly occurs when a lifelike appearance is sought but not attained, according to Spectrum.

Leslie Bow, a professor of Asian-American studies at the University of Wisconsin, conducts her research within the uncanny valley. Her work examines how artificial intelligence has begun to embody stereotypes of Asian women seen in popular culture.


“The saturation of Asianized gynoids in popular culture dovetails with actual robots who take the form of young, attractive Asian women,” Bow said.  

Bow has asked why society must imagine AI as having any sort of physical, human-like form when AI at its core is just machinery.

Bow found that giving AI a body allows for particular affective, or emotional, relationships and can help mitigate the fear of advanced technology, so long as its appearance avoids crossing into the uncanny valley.

“Imagining AI as young, female and Asian taps into the common stereotypes surrounding Asian women: As innocent, passive yet willing to please, as sexually desirable yet vulnerable,” Bow said. “[AI] are conceived as non-threatening service bots that mitigate the fear of advanced technology. The anxiety that AI could represent a superior, more powerful being is offset by this stereotype of Asian women.”

Bow believes stereotypical stories surrounding Asian women are also expressed when AI bears their resemblance. Asian women are depicted as continually in need of rescue, Bow said.

These narratives derive from the large numbers of Asian women and children involved in human trafficking and sex slavery, Bow said.

“Fantasies of female embodied AI, clones or cyborgs enable two seemingly contradictory pleasures: Witnessing exploitation, imagined as sexual assault, and witnessing its transcendence, imagined as rescue,” Bow said.

The #MeToo movement has encouraged sexual assault survivors to show solidarity with one another. In light of the #MeToo movement, women have asked that Siri be programmed to push back against sexual harassment, Bow said.

Vladimir Lumelsky, UW professor emeritus of mechanical engineering, also focuses his research on human-robot interaction. He believes interactions between humans and robots will become increasingly complex.

“The interaction between a robot and a human that we shoot for is much like an interaction between two humans,” Lumelsky said. “That’s what we want to achieve.”

As AI continues to evolve, increasingly human-like interactions could raise further problems, especially given the stereotypes AI has already begun to represent, Bow said.

It’s possible to rid ourselves of the stereotypes we’ve attached to artificial intelligence by working to acknowledge them in our daily lives, Bow said. To do so, we must be willing to catch ourselves rather than fall back on stereotypes to “give coherence and fixity to [our] worlds.”

Stereotyping is just another form of overgeneralization, and an inaccurate one at that, Bow said. A single characterization of a large group of people is difficult to maintain because it doesn’t apply to every individual in the group.

“We notice the identity of people when their behaviors correspond to the stereotype of that group,” Bow said. “We attach meaning only to the pattern [and] other information is discarded.”

Looking forward, the future is still in our hands, both Bow and Lumelsky said. People have the opportunity to reject the stereotypes presented to them on a daily basis while still deepening the complexity of our interactions with robots and other forms of artificial intelligence.

“Whatever you’d expect from me if two of us were doing something together, expect that from a robot of the future,” Lumelsky said.
