Members of the animation and visualization cohort are concerned about what the new UCF course, called the Art of AI, could mean for artificial intelligence implementation in their studies.
Asa Carvajal
For Dr. Rudy McDaniel, director of UCF’s School of Visual Arts and Design, witnessing today’s rapid evolution of artificial intelligence reminds him of watching the internet come onto the scene as he grew up in the early ’90s.
“It changed everything,” McDaniel said. “I think we’re going through a similar evolution right now with technology, and some of the concerns are very similar to what we saw with concerns about the internet.”
When UCF introduced the Art of AI, a new course within the College of Arts and Humanities, it ignited concern among students and added to the broader, ongoing conversation surrounding AI integration in creative practice.
Benny Rosenberg, a junior emerging media major on the animation and visualization track, said he's paying tuition to be taught skills, not how to use generative AI to avoid using them.
“I don’t think it makes sense for a student to take a class that teaches them how not to use skills,” Rosenberg said. “I don’t really like the fact that the school’s resources are being put to that instead of teaching us the skills we’re paying to come learn.”
The course explores the intersection of art and AI, and provides students with a “comprehensive understanding of AI tools, techniques, and ethical implications in creative industries,” according to the course’s syllabus. The course is open to all university students interested in the subject who have completed the introduction to computer art course.
According to the syllabus, “students will learn to create AI-assisted artworks” through “hands-on activities, theoretical discussions, and critical analysis,” while also addressing issues such as copyright and societal implications.
Additionally, the course’s learning outcomes include “demonstrate fluency in using AI-based creative tools” and “create AI-integrated art projects blending human and machine creativity.”
“So it’s kind of just like, how do we get used to this?” McDaniel said. “How do we still market ourselves as being able to have careers and do interesting things in the world and not just let this AI take over everything?”
McDaniel said students in the course work through exercises such as hypothetical case studies examining AI in various ethical contexts. He said he thinks students are becoming more interested in these scenarios as they become more common.
“Whether we like it or not, AI is not going away, right?” McDaniel said. “I think it’s something that is going to continue to grow, and it’s our obligation to try to teach students to know how to use it once they go out there and graduate and start looking for jobs.”
However, the AI industry is being marketed significantly differently from the internet in its infancy, according to Time magazine. Time reports that technology companies often downplay the worries of workers who are now being forced to use technology that can change or even eliminate their jobs, and that tech companies are encouraging other businesses to require employees to use their products.
Today, some jobs require AI skills, while many others are threatened with elimination by AI.
Around 64% of U.S. adults believe AI will eliminate jobs over the next two decades, according to a study by the Pew Research Center. According to Time magazine, the conclusion most American adults have drawn, that AI will render certain jobs obsolete and humans inconsequential to society, is not implausible.
McDaniel said he thinks that in a world where AI skills are being asked for on LinkedIn and in job postings, being able to use AI as a brainstorming, collaborative, procedural tool is a great skill set to have.
“Some students may choose to engage AI as a tool in their art-making process, but others might avoid it entirely, and that’s perfectly okay, too,” McDaniel said.
Professor Stephen Kuebler, founding associate director of UCF’s Center for Ethics and co-organizer of REALity of ARTificial Intelligence, has concerns that the ease with which anyone can create something using AI might be limiting professional growth, whether in art or STEM writing. He also worries about the massive data set from which most AI programs draw, which includes artwork posted on various platforms across the internet.
While he agrees AI can be used as a different sort of "paintbrush," Kuebler said that with a lack of control over the palette, it becomes hard to tell what the artist did and what they didn't.
“I’m concerned about credit,” Kuebler said. “If you use this tool, are we properly crediting the work of others from which it draws? You know, is it effectively just copycatting in ways? That’s a fine line there.”
McDaniel said student concerns are legitimate because the "idea of theft" is built into a lot of models, but not all AI systems are built the same, with different products, algorithms and processes. He said even the way images are sourced can vary widely: tools like Adobe Firefly compensate artists who consent to their work being used in the model, while other AI programs like Gemini, ChatGPT and DALL-E do not.
“So maybe the misunderstanding is that AI is a monolithic entity rather than a set of tools developed by different companies with different ethical parameters and guidelines,” McDaniel said.
Much generative AI is trained on practically all of the internet, including copyrighted materials used without creator consent, blurring the lines of authorship.
Kuebler said that in creating, whether it’s art or STEM, everyone is influenced by everything that has gone into their education, observations, and lived experiences. He said that as a chemist, he can’t do anything he does without standing on the shoulders of all of the greats before him.
“So when I create something, it’s really even hard to say this is totally my creation, and the same has got to be true for artists and musicians and people doing theater,” Kuebler said. “But that line of ‘I did this’ versus ‘someone else did this’ becomes much, much blurrier with current implementations of AI.”
McDaniel said at SVAD, he wants students to understand the ethical concerns of this rapidly evolving technology, including how images are sourced and if artists who contributed work can give consent or be compensated, while balancing the fact that AI can also be used as a unique creative medium, like many other digital tools artists may be more familiar with.
“We try to help them [students] have a more balanced perspective, rather than a reactive one, where it’s immediately ‘AI is evil’, or ‘AI is the best thing ever,’” McDaniel said. “The truth is more complicated than that.”
McDaniel said teaching students to use AI for art in a more iterative way, rather than as a tool that spits out any image, is important. He said that this cannot be done well without first understanding and learning art fundamentals, which give artists the language to generate the type of output they are looking for.
McDaniel said AI also opens an avenue for people without an artistic background to express themselves and communicate in artistic ways they may not be skilled or trained in.
If students are reluctant to use image generation, McDaniel said AI can also be used as a brainstorming tool for their artistic processes or a marketing tool for their portfolios.
“I think we tend to pigeonhole AI in the art world as just generating art, but there are lots of other ways that it can be used as well that are useful to artists, writing artist statements or refining artist statements, things like that,” McDaniel said.
Kuebler said he is glad there is a class raising student awareness of AI and getting students to think about the technology, given its significant impacts on society across various sectors.
However, he said appreciating art depends in large part on knowing the process and dedication it takes, and that some of that appreciation will be lost if everything is AI-generated and easy to do.
Kuebler said what he loves about UCF is that it’s not just a research school but one that hosts many creative majors, spaces and events, such as UCF Celebrates the Arts.
“I think having the arts and the sciences and education and all the different institutes that allow students to train and people to produce in all these areas of human creative endeavor make us a university,” Kuebler said. “Without that, we would not be a university, and so I hope it will always be there.”
Justin Braz, junior emerging media major on the animation and visualization track, has been creating art since he was very young. He said being able to study something to work in a discipline he has loved for so long feels like turning his childhood dreams into reality.
However, Braz and others in the animation and visualization cohort are concerned about what the Art of AI course could mean for the implementation of the technology in creative practice at UCF. The course is taught by Dr. Matt Dombrowski, an associate professor of emerging media and assistant director of SVAD, who also oversees the highly competitive animation and visualization program, of which Braz is a part.
“It does feel like a bit of an insult to how much work people such as me and the rest of my cohort have put in, when our main person in charge of our degrees [Dombrowski] is promoting this thing that steals from other artists, is bad for the environment and all that,” Braz said.
The growth of AI technology has led to the rapid creation of AI data centers. There are already 151 centers in Florida, mostly in urban locations like Orlando, Tampa, Jacksonville and Miami, according to DataCenters.com.
Braz is particularly concerned with the environmental costs of AI data centers, which rely heavily on drinking water for cooling, stressing the Floridan Aquifer. The aquifer provides Florida residents with most of the water they use and is already strained by over-extraction, rising sea levels and increased saltwater intrusion, according to National Geographic.
A study at the University of Michigan found that many AI data centers use more than 10 million gallons of drinkable water every year, making it difficult to balance residents' need for water. The study says that, on average, one data center can draw up to two megawatts of electricity, roughly the power a small town would consume.
According to the study, data centers consumed more than 4% of U.S. electricity in 2023, with estimates suggesting that consumption could rise to 12% by 2028.
Rosenberg said he shares Braz's concerns about the environment and water shortages.
Alexander Tyson, a senior emerging media major and president of UCF's chapter of Women in Animation, said AI has become so controversial among students that attendance at meetings specifically discussing the topic has been an issue. Tyson said WIA's first meeting had 50 members in attendance, but a recent meeting on how AI is impacting the animation industry drew only three attendees.
“Unfortunately, talking about AI is a bit taboo to artists, especially newer artists who don’t really have all of the knowledge of how it has impacted things, both good and bad,” Tyson said.
Kuebler said UCF students are going to become leaders in their organizations and workplaces, run their own museums or companies, and elect, and become, our political figures. He thinks that learning about something many anticipate will be ever-present in the modern world can make UCF graduates more well-versed in what future workplaces may demand.
“If they’re [students] thinking about these issues, they have a chance to not just navigate, but to chart a course, and take us in directions with AI or whatever that are good and more beneficial for the whole community,” Kuebler said.