When ChatGPT was released in November of 2022, University of Alaska Anchorage nursing student Sarah Baisden was still a high school sophomore on the Kenai Peninsula.
Baisden is now in her first semester at UAA, where both staff and students are grappling with how to use — or not use — the ubiquitous artificial intelligence chatbot that’s become a mainstay on college campuses across the country.
Baisden is cautious when it comes to AI. Her tuition costs tens of thousands of dollars, and she needs to pass several crucial exams in order to earn her nursing degree in a few years. She worries that if she relies too much on AI while doing assignments, she won’t perform well in class or on tests.
“I want to get my full education, and that’s all there is to it. I don’t want to be doing it halfway,” Baisden said. “There has to be some trials and difficulties in that and that’s what I really like about school, is failing things and going past that. That accomplishment is just so good.”
Baisden will sometimes use AI tools, but said she does so sparingly. She once prompted ChatGPT to generate a study schedule for a particularly stressful week when she had several tests and wasn’t sure what to prioritize. She admits to struggling with transitional language in her writing, and has used Grammarly, a writing assistant that incorporates AI, to edit those sections. She’s also used the tool to simplify confusing language from professors and make it easier to understand.
“If there’s bigger words or if the wording is weird to understand, then they’re able to simplify it down to a very basic level, and that makes it to where it’s easier to understand. And that, I would say, is pretty good,” Baisden said.
But she said she’s more likely to use ChatGPT to generate a pros and cons list to decide if she should get a sweet treat on her way home from class than she is to outsource her homework entirely to a chatbot. She’s not exactly sure what constitutes a “good” use of AI, or whether it’s worth the potential environmental harm caused by using AI-powered tools at all, among other concerns.
Those are questions that Baisden, University of Alaska professors, other UA students and educators across the country are unsure how to answer.
As generative AI technology improves, educational institutions have largely been hesitant to enact specific restrictions or all-out bans on the tools until they’ve had ample time to learn more about them.
Currently, the University of Alaska has no uniform, blanket policy governing the use of AI on its campuses across the state. While professors are seeing students use AI frequently, they say policing its use can be tricky, and detecting when students use AI for an assignment has become nearly impossible as the technology develops rapidly.
A rise in AI-related misconduct

Kenrick Mock, UAA professor of computer science and dean of the College of Engineering, leads an activity for UAA’s Artificial Intelligence and Robotics Club in which students are tasked with creating an AI player for a board game. (Marc Lester / ADN)
UAA Engineering College Dean and computer science professor Kenrick Mock said most UAA faculty members address whether using AI is permissible in their syllabuses so students know what to expect.
“The University does have some really general policies … but they’re so general that they don’t really have any use for them. There’s things about, you know, ethical usage, and what does that mean?” Mock said. “It would be really difficult to come up with a really strong university policy that applies everywhere, just because of the different disciplines.”
Michael Votava, UAA’s assistant dean of students and director of student conduct and ethical development, said AI use is addressed by a section of the student code of conduct that prohibits “utilizing devices not authorized by faculty members.”
“It gives our faculty members the flexibility where individual faculty members can say generative AI is absolutely prohibited, others can say that generative AI is permissible to be used in full, and there’s a third set of professors that are going to be able to say that generative AI can be used, but it needs to be documented how it’s used,” Votava said.
A report from the Dean of Students Office showed that of UAA’s more than 8,800 students, 115 were found responsible for academic misconduct last year. About two-thirds of those cases were AI-related.
The report noted a sharp increase in academic misconduct involving AI. It pointed to developmental writing courses as the most common place where students attempted to pass off AI-generated work as their own. Votava said he’s dealt with cases from upper-level courses as well.
“There are some common themes that I’ve seen over the last 17 years that I’ve been in this role, and one of the reasons that students commit academic misconduct is time management,” Votava said.
The data collected last year also showed a spike in students using AI in math classes. Votava said one UAA professor hid a nearly invisible prompt in an assignment, using light-colored ink and a small font. Students who completed their own work never saw the “Trojan horse.” But the AI chatbots that six students pasted the problem into did, producing answers noticeably different from their classmates’ and alerting the professor to the students’ misuse of AI.
Votava said faculty who build specific guardrails into the work they assign see less AI misuse than those who don’t.
Most students caught using AI to cheat are required to complete a tutorial on academic integrity, and may be given a written warning. Severe cases are very rare, according to Votava, who said suspensions for academic dishonesty are only issued about once every two years. While Votava isn’t seeing pervasive misuse among UAA students, he said his office is finding that students caught using AI for one class are likely to have used it in others as well.
“There haven’t been that many examples yet where we have seen pervasive use, but it’s difficult to quantify, because that’s part of the problem is that it’s so hard to detect,” Votava said. “I think it’s going to lead to a lack of students’ ability to think critically and evaluate information by which to make decisions.”
‘Trying to strike that balance’
Mock, the Engineering College dean, researches AI. Last month, he called on Alaskans to share community problems involving large datasets that could be solved using AI’s predictive analysis.
Mock said limits placed on AI for a writing class may not work for his engineering or computer science classes, where prompting a chatbot to generate a piece of code or solve lengthy equations wouldn’t raise ethical questions about academic integrity. Mock recalled a former student of his who spent months combing through what he called a “messy” dataset to make sense of the results. That student’s work could have been completed in a fraction of the time if they had an AI tool to organize the data more efficiently, Mock said.
“Especially having worked in the space for a long time where it couldn’t do these things, where it was like, ‘oh, one day it’ll do this,’” Mock said. “Now we seem to be getting there.”
Mock stressed that students need to go back through and check the work generated by AI to verify its accuracy and remove errors before turning that assignment in. Mock is concerned that AI has gotten too good, and worries that students won’t develop the skills they’ll need once they enter the workforce after graduation.
Mock said AI tools can quickly become a crutch for students, and warned against placing false confidence in AI to accurately solve complicated problems, when it may not scale up to the level required by professional engineers.
“We really want students to be able to at least verify that what the AI has generated is really correct,” Mock said. “We don’t want to be producing students that are endangering the public.”
The University of Alaska has not shied away from teaching students about AI, either. UAA added a Master of Science degree in artificial intelligence, data science and engineering this fall.
“We’re trying to strike that balance between using these things in a responsible way, and still taking advantage of them, but learning what you need to learn,” Mock said.
Spencer Douthit, vice president of UAA’s Artificial Intelligence and Robotics Club, leads a meeting in UAA’s Engineering and Industry Building. (Marc Lester / ADN)

Cognitive offloading
UA students and professors say some students have no qualms about using AI for assignments outside their major, conserving their mental effort for their own field in what’s been dubbed “cognitive offloading.”
Andrew Harnish, an assistant professor of writing at UAA, said that part of the purpose of college is learning to complete difficult tasks independently.
“I think it’s important for us to both prepare students for how they will work when they leave our campus, and also to exercise their minds and their sense of civic engagement while they’re here, and so I think ‘cognitive offloading’ is a concern,” Harnish said. “The idea is not just to give them assignments that are going to make them do these things, it’s to give them assignments that will help them to care about these things and help them to care about their minds, and citizenship, and engagement with their fellow students and community members.”
Harnish teaches a sophomore-level class in which he requires students to produce their own professional portfolio and reflect on their growth as writers in the AI era. The assignment calls for students to ask AI to help them find job opportunities, search industry job boards and analyze job descriptions. But it draws clear lines around where AI use is allowed, explicitly noting that using AI where they shouldn’t could get students fired from a job.
Harnish’s assignment asks, “Are these technologies helping you as a learner? Is it harming you? Is it some of both? Or are you avoiding it entirely?”
Harnish said that each semester, roughly one student in each of the four classes he teaches commits an academic integrity infraction stemming from the improper use of AI on assignments.
“I think it’s a mistake to suppose that most students are just gung-ho to use these tools as much as they can. A lot of them have real concerns about what impact they might have on their job prospects, on society as a whole, on this information. So I don’t think that our campus is rife with misuse,” Harnish said.
‘Coming back to humanity’
UAA Writing Department Chair and author Don Rearden let an AI write the section of his class syllabus addressing its use, which is meant to serve as a starting point for student discussions on the topic.
“The analogy I use with my students is that we’re going to climb Denali,” Rearden said. “Denali is graduation, but we’re walking from here, and any shortcuts you take might get you there faster, but when you’re standing on top of the peak, if you didn’t earn those steps to get there, you’re not going to know how to get down.”
He’s concerned about AI companies using the work of writers and artists without their permission and without compensation. He also worries that the readily accessible tools may erode his students’ collective love of literature.
“My concern is the stagnation of language and ideas. If we’re all just regurgitating it, at what point are the students going to be not writing their own work and the professors are going to be not grading that work?” Rearden said. “If they’re just getting summaries of stories and regurgitating that, they’re going to miss out on the beauty of it.”
“They’re going to miss out on having writing that will make them cry or laugh or be afraid,” he added.
Rearden was shocked to learn through a database maintained by The Atlantic that AI companies had scraped all of his published books — including one in French — to train their models. He was also unnerved by a poem he prompted AI to produce that was able to mimic his distinct writing voice.
But when asked what safeguards should be placed on the tools, Rearden said that’s above his pay grade.
“Where it’s been interesting is coming back to humanity and the appreciation for the process and the beauty and power of writing, and how that can’t be predicted with the large language models,” Rearden said. “For me, the process is the important part.”
He’s hopeful that his students will continue to develop an affinity for strong writing the old-fashioned way.
“I’m trying to have students feel powerful with their own voice and not have that diminished or be afraid to write,” Rearden said. “So many people are just afraid to write, and it’s one of the most powerful things as humans we do.”