When ChatGPT was released in November of 2022, University of Alaska Anchorage nursing student Sarah Baisden was still a high school sophomore on the Kenai Peninsula.
Baisden is now in her first semester at UAA, where both staff and students are grappling with how to use, or not use, the ubiquitous artificial intelligence chatbot that's become a mainstay on college campuses across the country.
Baisden is cautious when it comes to AI. Her tuition costs tens of thousands of dollars, and she needs to pass several crucial exams in order to earn her nursing degree in a few years. She worries that if she relies too much on AI while doing assignments, she won't perform well in class or on tests.
"I want to get my full education, and that's all there is to it. I don't want to be doing it halfway," Baisden said. "There has to be some trials and difficulties in that and that's what I really like about school, is failing things and going past that. That accomplishment is just so good."
Baisden will sometimes use AI tools, but said she does so sparingly. She once prompted ChatGPT to generate a study schedule for a particularly stressful week when she had several tests and wasn't quite sure what to prioritize. She admits to struggling with transitional language in her writing, and has used the writing assistant software Grammarly, which uses AI, to edit those sections. She's also used the tool to help simplify confusing language from professors to make it easier to understand.
"If there's bigger words or if the wording is weird to understand, then they're able to simplify it down to a very basic level, and that makes it to where it's easier to understand. And that, I would say, is pretty good," Baisden said.
But she said she's more likely to use ChatGPT to generate a pros and cons list to decide whether she should get a sweet treat on her way home from class than she is to outsource her homework entirely to a chatbot. She's not exactly sure what constitutes a "good" use of AI, or whether it's worth the potential environmental harm caused by using AI-powered tools at all, among other concerns.
Those are questions Baisden, University of Alaska professors, other UA students and people in education across the country are unsure how to answer.
As generative AI technology improves, educational institutions have largely been hesitant to enact specific restrictions or all-out bans on the tools until they've had ample time to learn more about them.
Currently, the University of Alaska has no uniform policy governing the use of AI on its campuses across the state. While professors see students using AI frequently, they say policing its use can be tricky, and detecting when a student has used AI on an assignment has become nearly impossible as the technology develops rapidly.
A rise in AI-related misconduct
Kenrick Mock, UAA professor of computer science and dean of the College of Engineering, leads an activity for UAA's Artificial Intelligence and Robotics Club in which students are tasked with creating an AI player for a board game. (Marc Lester / ADN)
UAA Engineering College Dean and computer science professor Kenrick Mock said most UAA faculty members address whether using AI is permissible in their syllabuses so students know what to expect.
"The University does have some really general policies … but they're so general that they don't really have any use for them. There's things about, you know, ethical usage, and what does that mean?" Mock said. "It would be really difficult to come up with a really strong university policy that applies everywhere, just because of the different disciplines."
Michael Votava, UAA's assistant dean of students and director of student conduct and ethical development, said AI use is addressed by a section of the student code of conduct that prohibits "utilizing devices not authorized by faculty members."
"It gives our faculty members the flexibility where individual faculty members can say generative AI is absolutely prohibited, others can say that generative AI is permissible to be used in full, and there's a third set of professors that are going to be able to say that generative AI can be used, but it needs to be documented how it's used," Votava said.
Of UAA's more than 8,800 students, 115 were found responsible for academic misconduct last year, according to a report from the Dean of Students Office. About two-thirds of those cases were AI-related.
The report noted a sharp increase in academic misconduct involving AI. It pointed to developmental writing courses as the most common place where students attempted to pass off AI-generated work as their own. Votava said he's dealt with cases from upper-level courses as well.
"There are some common themes that I've seen over the last 17 years that I've been in this role, and one of the reasons that students commit academic misconduct is time management," Votava said.
The data collected last year also showed a spike in students using AI for math classes. Votava said one UAA professor hid a nearly invisible prompt in an assignment using light-colored ink and a small font. Students who completed their own work did not see the "Trojan horse." But the AI chatbots that six students pasted the problem into did, producing answers noticeably different from their classmates' and alerting the professor to their misuse of AI for the assignment.
Votava said faculty who add specific guardrails to the work they assign see less AI misuse than those who don't.
Most students caught using AI to cheat are required to complete a tutorial on academic integrity, and may be given a written warning. Severe cases are very rare, according to Votava, who said suspensions for academic dishonesty are only issued about once every two years. While Votava isn't seeing pervasive misuse among UAA students, he said his office is finding that students caught using AI for one class are likely to have used it in others as well.
"There haven't been that many examples yet where we have seen pervasive use, but it's difficult to quantify, because that's part of the problem is that it's so hard to detect," Votava said. "I think it's going to lead to a lack of students' ability to think critically and evaluate information by which to make decisions."
"Trying to strike that balance"
Mock, the Engineering College dean, researches AI. Last month, he called on Alaskans to share community problems involving large datasets that could be solved using AI's predictive analysis.
Mock said limits placed on AI for a writing class may not work for his engineering or computer science classes, where prompting a chatbot to generate a piece of code or solve lengthy equations wouldn't raise ethical questions about academic integrity. Mock recalled a former student of his who spent months combing through what he called a "messy" dataset to make sense of the results. That student's work could have been completed in a fraction of the time with an AI tool to organize the data more efficiently, Mock said.
"Especially having worked in the space for a long time where it couldn't do these things, where it was like, 'oh, one day it'll do this,'" Mock said. "Now we seem to be getting there."
Mock stressed that students need to go back through and check work generated by AI to verify its accuracy and remove errors before turning an assignment in. He's concerned that AI has gotten too good, and worries that students won't develop the skills they'll need once they enter the workforce after graduation.
Mock said AI tools can quickly become a crutch for students, and warned against placing false confidence in AI to accurately solve complicated problems, when it may not scale up to the level required by professional engineers.
"We really want students to be able to at least verify that what the AI has generated is really correct," Mock said. "We don't want to be producing students that are endangering the public."
The University of Alaska has not shied away from teaching students about AI, either. UAA added a Master of Science degree in artificial intelligence, data science and engineering this fall.
"We're trying to strike that balance between using these things in a responsible way, and still taking advantage of them, but learning what you need to learn," Mock said.
Spencer Douthit, vice president of UAA's Artificial Intelligence and Robotics Club, leads a meeting in UAA's Engineering and Industry Building. (Marc Lester / ADN)
Cognitive offloading
UA students and professors say some students have no issue using AI for assignments outside the subject they're majoring in, conserving their brain power through what's been dubbed "cognitive offloading."
Andrew Harnish, an assistant professor of writing at UAA, said that part of the purpose of college is learning to complete difficult tasks independently.
"I think it's important for us to both prepare students for how they will work when they leave our campus, and also to exercise their minds and their sense of civic engagement while they're here, and so I think 'cognitive offloading' is a concern," Harnish said. "The idea is not just to give them assignments that are going to make them do these things, it's to give them assignments that will help them to care about these things and help them to care about their minds, and citizenship, and engagement with their fellow students and community members."
Harnish teaches a sophomore-level class in which he requires students to produce their own professional portfolio and reflect on their growth as writers in the AI era. The assignment calls for students to ask AI to help them find job opportunities, search industry job boards and analyze job descriptions. But it draws clear lines around where AI use is allowed, explicitly noting that using AI where they shouldn't could get students fired from a job.
Harnish's assignment asks, "Are these technologies helping you as a learner? Is it harming you? Is it some of both? Or are you avoiding it entirely?"
Harnish said that in each of the four classes he teaches, he's dealt with about one student per semester who has committed an academic integrity infraction stemming from the improper use of AI on assignments.
"I think it's a mistake to suppose that most students are just gung-ho to use these tools as much as they can. A lot of them have real concerns about what impact they might have on their job prospects, on society as a whole, on this information. So I don't think that our campus is rife with misuse," Harnish said.
"Coming back to humanity"
UAA Writing Department Chair and author Don Rearden allowed an AI to write the section of his class syllabus addressing its use, which is meant to act as a starting point for student discussions on the topic.
"The analogy I use with my students is that we're going to climb Denali," Rearden said. "Denali is graduation, but we're walking from here, and any shortcuts you take might get you there faster, but when you're standing on top of the peak, if you didn't earn those steps to get there, you're not going to know how to get down."
He's concerned about AI's use of work from writers and artists without their permission and without compensation. He also worries that the readily accessible tool may erode his students' collective love of literature.
"My concern is the stagnation of language and ideas. If we're all just regurgitating it, at what point are the students going to be not writing their own work and the professors are going to be not grading that work?" Rearden said. "If they're just getting summaries of stories and regurgitating that, they're going to miss out on the beauty of it."
"They're going to miss out on having writing that will make them cry or laugh or be afraid," he added.
Rearden was shocked to learn through a database maintained by The Atlantic that AI companies had scraped all of his published books, including one in French, to train their models. He was also unnerved by a poem he prompted AI to produce that was able to mimic his distinct writing voice.
But when asked what safeguards should be placed on the tools, Rearden said that's above his pay grade.
"Where it's been interesting is coming back to humanity and the appreciation for the process and the beauty and power of writing, and how that can't be predicted with the large language models," Rearden said. "For me, the process is the important part."
He's hopeful that his students will continue to develop an affinity for strong writing the old-fashioned way.
"I'm trying to have students feel powerful with their own voice and not have that diminished or be afraid to write," Rearden said. "So many people are just afraid to write, and it's one of the most powerful things as humans we do."