Governor Tony Evers signed an executive order Aug. 23 establishing the Governor’s Task Force on Workforce and Artificial Intelligence, according to a press release from the Office of the Governor.
The task force will help Wisconsin develop best practices for utilizing generative AI and will study its potential effects on various industries to help increase equity and economic opportunity, according to the release.
Leading the task force is State Representative Nate Gustafson (R-Fox Crossing), according to WisPolitics.
The task force is designed to analyze the current state of generative AI’s impact on Wisconsin’s labor market and to predict how it will influence and shape the workforce in the near- and long-term future, according to the press release.
The press release also said the task force will analyze the presence of AI in not only Wisconsin’s workforce, but also in a range of industries important to Wisconsin, including manufacturing, healthcare, education, transportation, agriculture and more.
To illustrate the task force’s challenge of analyzing the role of AI in Wisconsin industries, University of Wisconsin Data Science Institute Director Kyle Cranmer shared insight into generative AI’s multifaceted nature.
AI systems have the potential to be biased toward specific groups of people depending on the historical data they were trained on. Because of this, it is important to know the ramifications of AI’s decisions and why those decisions are being made, Cranmer said.
“It’s easy to think that if a computer or algorithm is making a decision it can’t be biased. And that’s definitely not true,” Cranmer said. “There are a number of automated systems that are biased against some groups of people, and it is often more extreme in AI systems that are trained with historical data in that they can propagate certain existing biases and narratives. You have to take all these things in account at the bare minimum when deploying an automated system, especially one using AI techniques.”
Cranmer said one significant challenge that accompanies the regulation of AI systems is the methods they use when drawing on data to provide responses to queries.
Generative AI tools like ChatGPT can appear to work like a Google search engine. When you enter a Google search, the request goes out to a body of knowledge, and the engine retrieves the answer from there, Cranmer said. AI queries appear to do the same, but in reality the models are doing their best to produce a response based on information they have encountered before, which is not always accurate, Cranmer said.
AI tools often cite facts and references, with responses that seem very authoritative and trustworthy. But they are forced to produce these responses based on material they encountered during training, and they are not adept at telling the user when they are unsure whether the answer is actually valid, Cranmer said.
“With things like generative AI, it gets a lot murkier because it feels like some trusted interface,” Cranmer said. “But the current versions do not have that trust built in. The term ‘hallucination’ is often used with this, referring to AI’s tendency to cite things that look like facts but are not actually true. And so that definitely has a lot of ramifications, especially if the people that are using it aren’t really aware that what’s being generated is not actually correct.”
Regarding AI in education, UW professor David Shaffer, founder of the field of quantitative ethnography, expanded on the ethics and regulation of AI.
Shaffer emphasized that large language models like ChatGPT are still incredibly new, which makes it all the more difficult to predict what kind of changes they are going to produce in both education and the workforce.
According to a 2023 Forbes survey, more than 75% of consumers are worried about AI producing misinformation.
“In this case, it’s especially problematic because these large language models are kind of in their infancy,” Shaffer said. “Things are uncertain. I think people are correct to be concerned. But I also think we need to be a little cautious about both our predictions and our worst fears in terms of the ethical use of these models.”
Shaffer said the way text-based AI models scrape the internet for information to construct seemingly logical responses is one of the main characteristics the governor’s task force should be aware of when analyzing the models’ impact on Wisconsin industries.
AI engines also have an effect on the environment and Wisconsin’s energy efficiency, Shaffer said.
“It also uses a tremendous amount of energy. Every query is costly in terms of electricity and water,” Shaffer said. “The water is the cost of cooling the machines that are doing all this computing to produce whatever recipe it is you just asked of ChatGPT. Any time you use the internet or computer you’re using energy. However this is way more than you would ordinarily think. People need to be aware of that.”
Though Shaffer’s research principally investigates AI’s impact on education rather than on the workforce, he suggested AI regulation is one of the most important elements the task force should consider when carrying out its work and agenda.
As seen with social media, tech companies left to develop and release technologies on their own often do so in a manner that maximizes their profit, which doesn’t necessarily benefit the public good, Shaffer said.