Since 2021, Microsoft has released an annual Work Trend Index report, offering insights into U.S. workplace trends. The reports initially focused on the hybrid work environment shaped by the COVID-19 pandemic.
But now, Microsoft has shifted its focus to artificial intelligence.
The 2024 report, titled “AI At Work Is Here. Now Comes the Hard Part,” reveals 76% of professionals believe AI skills are essential to stay competitive in the job market, and 69% say strong AI skills will accelerate their path to promotion.
For today’s university students, mastering AI is no longer a lofty, optional pursuit — it’s a necessity for career success.
For University of Wisconsin computer science major Tanvi Wadhawan, envisioning a future where artificial intelligence is not only present but omnipresent has been a no-brainer. Growing up in the Silicon Valley area, Wadhawan has long understood the potential of AI, so much so that it caused her to switch career paths.
“It’s [AI] why I switched gears from straight software engineering to security… it 100% has made me rethink my entire career,” Wadhawan said. “If ChatGPT or cloud AI can do my homework, it can do my job.”
Wadhawan said worrying about AI altering jobs, or worse, replacing them, is typical among computer science majors.
“My friends and I have talked about this before. It’s really prominent if you’re a computer science major,” Wadhawan said. “If you’re in Comp Sci you know what you’re building, you know what’s coming for you.”
Michael Dinh, a fellow UW computer science senior, echoes Wadhawan’s concerns. Dinh said the company he interns at as a software developer has held multiple sessions on AI chatbots like ChatGPT and Microsoft Copilot and whether or not they pose a threat to programming jobs.
“I know a lot of computer science majors that I guarantee have nearly all thought about the prospect of A.I. taking jobs or making the possibility of finding a job harder,” Dinh said.
And it’s hard to blame them. Following OpenAI’s November 2022 release of the generative AI chatbot ChatGPT, classrooms and office spaces were left scrambling to adapt to the new technology’s powerful text generation.
What is Higher Education’s Role in AI?
ChatGPT garnered attention in 2023 when, as part of a two-week experiment, Harvard University undergraduate Maya Bodnick had the chatbot complete seven take-home writing assignments, according to Slow Boring. At one of the most prestigious universities in the world, ChatGPT earned a GPA of 3.57 across the essays.
At UW, administrative and faculty reactions to ChatGPT were mixed. Some professors embraced the technology, while others considered its use plagiarism, employing AI detection software like Turnitin, which scans submissions for hallmarks of artificially generated text.
Though pure text generation is still deemed academic misconduct in most syllabi, UW now discourages AI detection software and encourages professors to market AI as a useful tool to students — so long as students cite when and how they use AI per UW’s research guides.
UW’s recent promotion of AI may relate to its increasing relevance to the job market. Jirs Meuris, an assistant professor of management and human resources at UW, researches the consequences of AI use in the workplace for workers. Here’s how he described AI’s importance as a skill.
“If you are graduating from this point on, you should have some understanding of how AI tools are being integrated into your field,” Meuris said. “College is a great opportunity to be exposed to the latest advancements.”
If college is meant to prepare students for their future, then it must prepare them for a world of AI, regardless of potential academic misconduct, University of Illinois Springfield professor Ray Schroeder said in a piece for the Online Professional Education Association.
“It is not the academic integrity issue that looms the largest in higher education use of Generative AI (GenAI) apps,” Schroeder wrote in the story. “Rather, it is our students’ needs to gain knowledge, experience and skills with the technologies before they submit applications to employers.”
The question remains, then: Do UW faculty aptly teach their students how to use AI? Wadhawan and Dinh say no.
“I don’t think the university does a good job at that [teaching students about AI]. I’ve only had a couple professors talk about it… there’s specific professors that do it but most of them don’t,” Wadhawan said.
Dinh said some of his professors still view AI as a way for students to cheat.
“It’s dependent on the professor in terms of their AI awareness and their AI experience,” Dinh said. “Some professors have no idea how powerful it can be and how educational it can be, they just see it as a cheating bot because that’s all maybe they’ve encountered from their students.”
UW does a good job of hosting AI-related events, with five scheduled in both September and October 2024, and it has invested particularly in the Wisconsin School of Business (WSB) for AI research and skill development, according to Meuris.
In May 2024, the WSB hosted a workshop that paired a handful of graduate business students with professionals from consulting firm McKinsey & Company, who taught the students how to use generative AI models to create marketing strategies.
While these are valuable opportunities, items on the UW events calendar are encouraged but optional, and workshops like the WSB’s with McKinsey are exclusive to business school students.
So, how can UW help all of its students — not just those of a particular major or those interested in AI — add experience using AI to their skill list?
A Canvas module all students can access would be a good place to start, Dinh said.
“I think it would be very useful if the university did release some sort of campus-wide canvas instructional course that just teaches how powerful these chatbots can be, the dos and don’ts of it and how you can use it to your advantage,” Dinh said. “I think that would be a big step in the right direction.”
Others, like UW Industrial & Systems Engineering Professor Ranjana Mehta, are calling for a more systemic approach to establishing AI literacy among students. Mehta’s research at the NeuroErgonomics Lab at UW in part focuses on studying the interaction between humans and AI systems.
“It’s a systemic challenge that needs to be looked at through a systems lens,” Mehta said. “Let’s say we want a health policy that needs to go out in the nation, it needs to be done at various levels, at the federal level, then at the state level, then at the local level, family level, we need to look at how to align AI into our everyday learning and our curriculum in a similar fashion.”
Professors should begin adding elements of AI in everyday assignments, thus embedding AI into their department’s curriculum, Mehta said. And not just departments like computer science or business, Mehta added, using psychology as an example.
“For a curriculum in psychology, it is also very important to include how AI tools can change how people think and how people behave,” Mehta said. “It might be slower in some of the disciplines than others, but I do think that the change is coming, and I just hope that it catches up to the rate of adoption of AI tools.”
Students outside of tech majors
LinkedIn has seen a 142x increase in users adding AI-related skills, such as proficiency with ChatGPT and Microsoft Copilot, to their profiles, according to the Work Trend Index report. Designers, marketers and writers, all roles outside of technology, topped the list of such users.
Similarly, LinkedIn learning modules created to strengthen AI skills have experienced a 160% increase in usage among non-technical workers, with architects, administrative assistants and project managers leading the way, according to the report. This broadening adoption of AI skills reflects a growing recognition of their importance across diverse career paths.
Jennifer Orozco, a senior studying journalism at UW, exemplifies this trend. Despite her major being outside of the STEM field, Orozco recognizes the necessity of understanding AI in a technology-driven workforce.
“We all live in the same world surrounded by technology,” Orozco said. “I believe that it’s important that we all stay in the loop. All majors should learn how to navigate AI. It’s critical now that it’s still fairly new.”
Orozco said her professors in the J-School encourage the use of AI under some conditions.
“In the J-school they acknowledge that AI is cutting edge technology and the future so we should utilize it, but with caution. Still sticking to being original and authentic to our own ideas and writing.”
Generative AI models like ChatGPT can be used to produce basic who, what, when, where and why reporting when given enough information. You can even ask the model to write in Associated Press style.
Below is ChatGPT’s output when asked to report on former President Donald Trump’s most recent rally in Mosinee, Wisconsin.
MOSINEE, Wis. (AP) — Former President Donald Trump held a rally in Mosinee, Wisconsin, on Sept. 7, 2024, to energize his base and discuss his political agenda as the election approaches.
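For readers curious what issuing such a prompt looks like programmatically, below is a minimal sketch using OpenAI’s Python client. The model name, prompt wording and fact list are illustrative assumptions, not the exact prompt behind the output above.

```python
# Minimal sketch: prompting a generative model for an AP-style news brief.
# Assumes the openai package is installed and OPENAI_API_KEY is set in the
# environment; the model name, prompt and fact list are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

facts = (
    "Who: former President Donald Trump. What: campaign rally. "
    "When: Sept. 7, 2024. Where: Mosinee, Wisconsin. "
    "Why: to energize supporters ahead of the election."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-capable model would work
    messages=[
        {"role": "system",
         "content": "You are a wire reporter. Write in Associated Press style."},
        {"role": "user",
         "content": f"Write a one-paragraph news brief from these facts: {facts}"},
    ],
)

print(response.choices[0].message.content)
```

The same pattern works for any set of facts a reporter chooses to supply.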
AI is also used by researchers like Hernando Rojas at UW’s School of Journalism and Mass Communication. Rojas is a public opinion researcher who uses questionnaires and survey data to understand a population’s stance on given topics.
“You can use AI for generating questionnaires and asking survey questions,” Rojas said. “You can use AI to assist you in your analysis of collected data. You can use AI to program an online survey and administer an online survey, so there’s multiple ways in which AI is being used.”
In Spring 2024, Rojas had his public opinion students practice using AI as part of their research in an assignment titled “AI powered question development.”
In the assignment, students prompted ChatGPT to create questionnaire items based on a given topic and requirements, such as the use of a Likert scale.
Below is a portion of ChatGPT’s output when asked to create a Likert scale questionnaire which measures political attitudes among rural vs. urban Americans.
Instructions: Please indicate how much you agree or disagree with each statement by selecting one of the following options:
1 = Strongly Disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, 5 = Strongly Agree
- Economic Opportunities
  - Rural areas offer fewer economic opportunities compared to urban areas.
  - Urban areas provide better job prospects than rural areas.
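Items like these are typically scored on the 1-to-5 scale above and compared across groups once responses come in. The snippet below is a minimal sketch of that step in Python; the responses and column names are hypothetical placeholders, not real survey data.

```python
# Minimal sketch: scoring Likert items like the ones above.
# The responses are hypothetical placeholders, not real survey results;
# 1 = Strongly Disagree ... 5 = Strongly Agree, as in the generated scale.
import pandas as pd

responses = pd.DataFrame(
    {
        "group": ["rural", "rural", "urban", "urban"],
        "fewer_econ_opportunities": [4, 5, 3, 2],    # hypothetical item scores
        "better_urban_job_prospects": [3, 4, 5, 4],  # hypothetical item scores
    }
)

# Compare mean agreement with each statement between rural and urban respondents
print(responses.groupby("group").mean(numeric_only=True))
```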
“I really think that a student that comes out of college without knowing how to use AI within their field is going to be at a serious disadvantage because you’re going to be less productive than someone who is using AI,” Rojas said.
A lack of understanding
In June, Forbes published an article titled “What Jobs Will AI Replace First,” framing the replacement of jobs by AI not as a possibility but an inevitability. Hands-on jobs like customer service and retail checkout, as well as basic technical jobs like graphic design, topped the list.
Similarly, CNN published an article that same month titled “AI is replacing human tasks faster than you think,” arguing that companies are adopting AI systems to cut costs and automate tasks previously done by employees.
AI’s potential to replace jobs seems to be the new flashy headline.
Such a framing, though, in which some jobs are certain to be replaced, is not grounded in reality and can even hinder humans from fully utilizing AI, according to Mehta.
“Why are people worried [about AI replacing jobs]? A lot of it is a lack of understanding and knowledge on what AI is, what the capacity of AI is,” Mehta said.
Mehta said the vast majority of AI researchers are motivated by the desire to unlock human potential.
In other words, researchers are creating AI systems that work alongside humans, not ones that replace them.
“Most folks who are AI scientists really want to harness human potential, we want to develop tools and automation that helps us do our jobs better,” Mehta said. “[There’s] a lack of understanding of what drives advancement in these technologies, what is the motivation behind it? And I think if we make that clear, if we have workforce development, it would be a much easier implementation and adoption of this technology in our society.”
Mehta’s own research applies AI systems, in the form of robotics, to real-life scenarios like emergency health and natural disaster response.
Her field, known as disaster robotics, was first put to use in response to the 2001 terror attacks on the World Trade Center.
Disaster robotics serves as an extension of human ability, according to Mehta.
“The [9/11] disaster site was deemed unsafe,” Mehta said. “Humans couldn’t really get to crevices and pockets of air that small, but a system could. And in that case, you think about an extension of an individual being that robotic system, that the human is now teleoperating.”
Since the 9/11 attacks, disaster robotics have been used in 33 other instances. Mehta and her students conducted AI research in response to Hurricane Ian just last year.
“We have been embedded in four disasters so far,” Mehta said. “They [Mehta’s students] had a very good sort of insight into it, it’s not easy developing AI systems, unless you take into consideration the human who’s going to be using these tools.”
In the food service industry, household-name companies have been using AI to streamline business operations. In 2014, for example, Domino’s introduced the AI voice assistant “Dom,” which receives online orders, allowing human employees to focus on other tasks.
Dom can now be used as an online chatbot much like ChatGPT and Copilot.
Perhaps anxieties about AI replacing jobs seem justified when looking at technologies like Dom, which can take orders and respond to customer inquiries. Why would a company hire a human for tasks AI can handle?
Striking a balance
In February 2023, a shooting on the Michigan State University campus killed three students: 19-year-old Arielle Anderson and 20-year-olds Brian Fraser and Alexandria Verner.
In response to the tragedy, universities across the country sent MSU letters of condolence. Vanderbilt University’s letter, though, contained some troubling text at the bottom.
(Paraphrase from OpenAI’s ChatGPT AI language model, personal communication, February 15, 2023).
It turned out leaders from Vanderbilt’s Peabody Office of Equity, Diversity and Inclusion had used AI to write their letter and forgotten to erase ChatGPT’s signature at the bottom.
Vanderbilt apologized, but the incident left an impression on many following the story, including UW Marketing Professor Moses Altsech. Altsech works as a consultant, advising businesses on how to prepare for and use AI. He cites what happened between Vanderbilt and MSU as an example of overreliance on AI.
“If there was ever a case where, skip ChatGPT for crying out loud, just write something heartfelt, it’s two sentences,” Altsech said. “I think ChatGPT and AI have valuable uses as long as we compensate in other ways so we don’t lose the skill set of being able to write not just correctly but in a compelling, captivating and elegant way.”
According to Altsech, being a compelling and creative writer is just as important as any other skill in the business world. He fears overreliance on text generation through AI may undermine students’ ability to write.
“It makes people who really don’t read and write very much in the first place read and write even less, because a computer can do it for them,” Altsech said.
Altsech has students in his marketing classes complete two creative writing assignments a semester. Such assignments aren’t commonplace for business classes, but students should learn to think like storytellers, Altsech said.
Even Altsech, though, can’t deny AI’s power as a tool and its relevance in the workplace. A balance needs to be struck between AI’s power to create and students’ creativity.
“However, I know that I have a lot of colleagues who use it as a tool, and I suspect that one day I’m going to move in that direction too,” Altsech said.
In the evolving landscape of higher education and the job market, the integration of AI presents both challenges and opportunities. AI skills are now essential for career advancement — this urgency is felt across disciplines, from computer science to journalism.
While universities like UW are beginning to recognize the necessity of AI literacy, some students think the university should be doing more to establish a systemic, widespread education system for AI literacy.
But some faculty, like Altsech, remain wary of AI’s integration, arguing that while AI has undeniable power as a tool, overreliance on it could undermine critical skills like creative writing.