As generative AI platforms and tools (a type of artificial intelligence that generates human-like responses) become more available in the classroom, professors are raising a variety of concerns as they adapt to the technology's new presence.
Associate Professor Kathryn Stolee from the School of Computer Science said generative AI analyzes existing data in order to generate new responses.
“This is effectively a large-scale model of code that has been trained on a large amount of data that is in the public domain. This means we have built a model of the language people have been using, based on what is available out there,” Stolee said. “And what it does is remember the connections between things. So when you ask a question, it can go back across multiple data sources and generate what is commonly generated for users.”
There are concerns about the role of generative AI in academia, with critics claiming it could enable plagiarism. However, many North Carolina State University professors are embracing generative AI as a learning tool.
Paul Fyfe, an associate professor in the English department, is incorporating generative AI into his curriculum and using it to show students the challenges and limitations of the software.
“I asked my students to use this tool to cheat on their final papers and then write reflections on what they thought about it. Did it work?” Fyfe said. “What was it like working with this as a writing partner? Did you feel like the content was your own? How much of this is your writing? And that exercise of having students work with AI made the limitations and problems of working with it, and for some people, the real opportunities of how it can help in certain ways, clearer to them than I could have.”
Stolee said the benefits AI brings to computer science can help students learn new programming languages. However, she also pointed out the shortcomings of AI as a learning tool.
“I have students do a kind of thought experiment where they put their own thoughts into it, and then I ask them what ChatGPT would say instead,” Stolee said. “Then you can see how what you say relates to the context, and how incorrect what ChatGPT might say can be, as a way to highlight its limitations.”
Stolee said AI also has various shortcomings, such as producing non-existent images, information and quotes, which she called hallucinations and described as a form of misinformation. These can be misleading because the answers may sound authoritative.
David Rieder, a professor in the English department, said he expects generative AI to solidify its place in the classroom, despite opposition from some professors.
“I was reminded a few years ago that there are historical parallels between the introduction of calculators in math classes and, for example, the introduction of these large language models in English classes,” Rieder said. “There were protests; math teachers were picketing. And now, clearly, calculators are part of the math curriculum. Similarly, large language models are basically our calculators, so I think we need to find a way to make them fit.”
Fyfe said AI literacy is needed to understand how to use AI as a tool in the classroom.
“So all of this leads to a broader mission, if not a crisis, of establishing what we call AI literacy, not only among students but also among faculty and instructors, about the possibilities and pitfalls of these technologies,” Fyfe said. “We also need the right kind of guidelines and norms for using them.”
Rieder said he aims to teach students how to critically engage with AI in the classroom.
“You can ask it to teach you a basic recipe or write a simple poem, but for high-quality research and professional work, it takes creating the right kind of prompt, and it takes a lot of effort to learn different strategies to use these machines,” Rieder said.