A May 2024 survey by Inside Higher Ed and Generation Lab asked students whether they knew when, how, or whether to use generative artificial intelligence in their classes. While the responses underscored the importance of faculty communicating clear generative AI policies, they also revealed some learners’ outright disdain for using the technology in any way.
Of the 5,025 survey respondents, approximately 2% (n=93) provided open-ended responses to questions about AI policy and classroom use. More than half of those respondents (55) flatly refused to engage with AI. Others said they did not know how to use AI tools or were unfamiliar with them, which limited their ability to use them appropriately in their coursework.
But as generative AI penetrates deeper into the workplace and higher education, more professors and industry experts believe it will become a necessity in every student’s classroom and life beyond academia.
Methodology
Inside Higher Ed’s annual Student Voice survey was conducted in May in partnership with Generation Lab, and a total of 5,025 students responded.
The sample included over 3,500 students from four-year institutions and over 1,400 from two-year institutions. More than one-third of respondents are post-traditional students (attending a two-year institution or age 25 or older), 16 percent are exclusively online learners, and 40 percent are first-generation students.
The complete dataset with interactive visualizations is available here. In addition to asking questions about academics, the survey also asked about health and wellness, the college experience, and preparation for life after college.
“The big picture is that this is not slowing down or going away, so we need to work quickly to secure the workforce of the future,” says Shawn VanDerziel, president and CEO of the National Association of Colleges and Employers (NACE). “That’s what employers want. They want a well-prepared workforce, and they want to know that higher education is positioned to meet the needs of industry.”
What students say
The Student Voice survey mirrors other national surveys of student perceptions of generative artificial intelligence: while some learners are ready to embrace the technology head-on, they remain a minority.
A summer 2023 study by Chegg found that 20% of U.S. students surveyed (n=1,018) said they had used generative AI for learning, the second-lowest adoption rate among the countries surveyed. A majority of U.S. students believe the use of generative AI tools should be restricted in assessed assignments (53%), and 10% think it should be prohibited.
Less than half (47%) of U.S. learners say they want their curriculum to include training on AI tools. A quarter of respondents said AI was not relevant to their future career, and 17% said they did not want any training at all.
What’s the holdup?
Participants in the Student Voice survey gave a variety of reasons for not wanting to use AI tools. Some disparaged the technology as a whole, while others suggested it was unsuitable for use in higher education.
In the Chegg survey, when asked about their top three concerns regarding the use of generative AI in education, students cited cheating (52%), receiving false or inaccurate information (50%), and data privacy (39%).
“Whether we’re deeply concerned about this for a variety of reasons, ethical, environmental, social, economic, or we’re just enthusiastic about it, I think we need to recognize that it’s going to be complicated,” says Chuck Lewis, director of writing at Beloit College in Wisconsin.
In a University of California, Irvine, study recently published on ScienceDirect, researchers surveyed 1,001 students to understand their use of and concerns about ChatGPT. The most common themes among students with concerns related to ethics, quality, careers, accessibility, and privacy or surveillance.
Some survey respondents said they worried that using ChatGPT could lead to unintentional plagiarism, compromising their work and bringing repercussions from their institutions.
“I refrain from using it at all because I’m afraid of being flagged,” a third-year student at Florida Gulf Coast University wrote in the Student Voice survey.
Others surveyed by the Irvine researchers were concerned about the quality of ChatGPT’s output, which they worried could dampen student creativity or contain inaccurate information.
“I don’t see any application for chatbots. You spend more time correcting their mistakes than actually writing,” a University of Wisconsin-Milwaukee junior said in the Student Voice survey.
Additionally, the Irvine study found that some students worried reliance on ChatGPT could erode their critical thinking skills and make them “too comfortable” bypassing the learning process, which could ultimately hurt their job prospects.
Reversing the trend
Afia Tasneem, senior director of strategic research at the consulting firm EAB, attributes some students’ anti-AI sentiment to institutions’ early reluctance to engage with AI and their negative framing of the technology. In fall 2022, universities rushed to implement anti-AI policies to limit plagiarism and other academic misconduct, instilling fear among students.
Lewis has found that a learner’s inclination toward the technology can be related, in part, to the student’s field of study: humanities students, for example, are much more likely to express disdain for AI than STEM students.
“I felt there was a kind of duality in students’ attitudes,” Lewis says. “Some people say, ‘Oh, crap, that’s not why we’re here.’ When you talk to creative writers about AI, for example, they really say, ‘This is just bad news.’ But on the other end of the spectrum, there are a lot of students who ask, ‘Why wouldn’t I want to use a tool that lets me complete this task faster and easier?’”
As more industry experts now believe that AI literacy and skills are essential, universities must fundamentally change their culture, which is no easy task. But some believe higher education would be doing a disservice to students if it allowed them to opt out of AI use entirely.
According to a May survey conducted by Cengage Group (n=1,000), 70% of recent college graduates believe basic generative AI training should be included in their courses, and 69% say they need more training on how to work with new technologies in their current roles.
“Certainly there are people who oppose the use of AI in many situations, and we need to put clear guardrails around AI, but as instructors, mentors, and experts, we also need to help the next generation of workers apply their skills … to make smarter decisions about AI,” says NACE’s VanDerziel.
Looking to the future
Since 2022, the capabilities and availability of generative AI tools have exploded, heightening excitement among organizations and employers about the next evolution.
“Companies want to adopt it for all the right reasons: to improve revenue, increase competitiveness, and boost efficiency. Those are the reasons they adopt technology in the first place. In a sense, it’s just a new technology that companies have to adopt,” says James DiLellio, a professor of decision sciences at Pepperdine University’s Graziadio Business School.
But understanding the future impact of AI on today’s college students is like looking into a crystal ball: it’s largely unclear and open to interpretation.
“I think a lot of universities started thinking about this as a new competency, kind of an essential skill for the workforce, and moved on it pretty quickly,” says Dylan Ruediger, senior program manager for research at Ithaka S+R. “I think it’s still hard to know whether that proves to be true. There seems to be a bit of disillusionment around the technology in the business world, and I don’t know whether that’s temporary or a permanent trend.”
VanDerziel says that employers are generally not requiring workers to use AI today, and he emphasizes that AI is part of a larger technological competency that students will need in the future, applied alongside other skills.
According to a May survey by NACE, 75% of employers had not used AI in the past year, and only 3% planned to incorporate AI into workplace operations within the next year.
“From the internship research we released in the spring, we found that fewer than 10% of interns acquire AI skills during their internship,” VanDerziel says. “That really speaks to how employers are currently using AI. Only a small percentage of our students have probably even been exposed to it in their internships, and that’s exactly where we would expect the application to take place. It just hasn’t happened yet.”
Dylan Cyphers, a physics professor at Eastern Washington University, thinks generative AI is a fad that has received outsized attention in higher education.
“It’s not what most people think it is. It’s not intelligent, it’s not conscious, and it’s not taking away our jobs,” Cyphers says. “It’s just really interesting software.”
For Cyphers, conversations about AI and student workforce readiness feel like a direct response to national pressure on higher education to prove its value. But generative AI tools are evolving rapidly, and making students AI-competent has become a major goal at many institutions.
Instead, Cyphers argues, the role of higher education is to equip students for a career, not just their next job, by fostering communication, critical thinking, and other durable skills.
Thinking about pedagogy and curriculum
If AI skills are essential to the jobs of the future, as some experts believe, the question becomes how to provide those skills equitably across academic programs. A recent trend in higher education is for institutions to engage early in students’ career development and planning, ensuring that all students receive individualized support as they begin their post-college journeys.
“To level the playing field and ensure that no student is left behind with AI, we need to integrate AI across disciplines and curricula,” VanDerziel says. “That’s the only way students, no matter what course load they have, can be exposed to a technology that a large portion of the population uses and that we know the future workforce will need.”
But bringing generative AI into the classroom is more difficult than teaching teamwork or communication skills.
“As long as individual instructors have the final say over how AI is used in their classrooms, some will choose not to allow generative AI,” Ruediger says. “I’d be surprised if that went away on its own quickly.”
As a Pepperdine faculty member, DiLellio considers it his mission to prepare students to immediately apply what they learn on the job, and that includes using new technology.
“We want students to take advantage of [generative AI] because we know these tools are not going away in the workplace,” DiLellio says. “We have to find ways to encourage students to be willing to embrace the technology, and faculty can help with that.”
Some of DiLellio’s MBA students use ChatGPT to perform analytical calculations they would otherwise do in Excel, faster and more efficiently. “That’s extremely valuable. It lets you think more critically about results, rather than just how to generate them,” DiLellio says.
Cyphers, on the other hand, believes the rigor of working through calculations is the very reason students study and attend college.
“I’m not asking introductory physics students to solve problems because the world needs to know the answers to those problems,” he says. “They have been solved many times before. I ask them to solve those problems as an intellectual exercise to improve themselves.”
Ultimately, understanding where AI fits in the curriculum requires instructors to distill a course’s core learning outcomes, such as creative thinking, problem solving, communication, analysis, and research, says Beloit’s Lewis.
“As educators, I think we’re in an uncanny valley, where we don’t really know what we mean by what’s supposed to be human and what’s supposed to be machine,” Lewis says.
Does your institution require students to use AI? Please tell us more.