As technology and artificial intelligence become increasingly embedded in education and student life, Lehigh professors are taking varied approaches to their use in the classroom.
Some adopt a hands-off stance, while others restrict devices entirely. Approaches to AI are similarly mixed, with some professors limiting its use in class and others encouraging students to engage with it responsibly in an effort to boost focus and meaningful learning.
Psychology professor Michael Gill implemented a no-technology policy in all of his classes last semester, prohibiting the use of computers, iPads and phones during class.
Gill said he’d considered the policy for at least a decade but acted after noting a sharp decline in exam grades following COVID. He attributed the drop to students becoming overly reliant on their devices and struggling to readjust to in-person learning.
He initially hesitated, unsure how students would respond and wanting to preserve their autonomy. But after sitting in the back of a colleague’s lecture, he reconsidered.
“I was really bothered by what I saw. Students were playing games, shopping, doing all sorts of off-task things,” Gill said.
He said visible distractions not only affect individual students but also undermine the classroom environment by discouraging engagement.
He added that device use can be disrespectful to both instructors and students who are trying to learn. Gill also cited research suggesting that handwritten notes are more effective than typed ones as another motivation behind the policy.
After implementing the rule, Gill surveyed his students. About 40 students responded on a scale of 0 to 100, with 0 indicating strong dislike and 100 indicating strong support. The median response was 85.
“For me learning is a sacred thing,” Gill said. “You come to class to listen to professors who’ve been studying things for decades and putting together this knowledge.”
He said around four respondents argued that students should be able to choose whether to use technology, and he said he would agree with that position if the choice weren’t harming other students’ learning.
He said he also noticed increased participation since introducing the policy.
Other professors take a more flexible approach.
Computer science and engineering professor George Witmer allows students to decide how they want to use technology in class.
He encourages students to learn how to use AI, saying it will be essential in the workplace. At the same time, he acknowledged the growing challenge of distinguishing AI-generated work from original student work.
“AI has become a huge game changer,” Witmer said.
He said AI can be useful for completing tedious tasks but shouldn’t interfere with learning.
Witmer said while students are often distracted by technology, he understands its addictive nature and focuses instead on making lectures engaging.
To help maintain attention, he distributes worksheets during class to keep students involved and improve attendance. Still, he emphasized that students ultimately control their own choices.
“I am dismayed, maybe disappointed, given how much students are spending to be here,” Witmer said.
Marketing professor K. Sivakumar views AI as a neutral tool whose value depends on context.
He said debates about whether AI is “good” or “bad” often miss the point.
“I’m not for AI or against AI,” Sivakumar said. “In some cases, it’s appropriate, and in some cases, it isn’t.”
He doesn’t support a one-size-fits-all policy, noting that expectations vary by discipline. For example, he would allow tools like Grammarly in an upper-level marketing course but understands why an English professor might prohibit it.
Sivakumar said his main concern is that students may use AI to shortcut the thinking process. He encourages them to use it thoughtfully and refine its output with their own judgment.
He also said many students misunderstand AI, treating it like a calculator or search engine rather than a system that can generate varied responses.
In his own research, Sivakumar uses AI to reformat citations or summarize literature, allowing him to focus on developing new ideas.
Looking ahead, he said AI will reshape employer expectations and that students should demonstrate AI literacy on their resumes.
“Companies want people who can use AI to augment their work, not replace it,” Sivakumar said. “You cannot fear AI, you cannot ignore AI and you cannot be a slave to AI. As long as you use it thoughtfully, we’ll be alright.”
Kate Jackson, a professor in the College of Health in the Community and Global Health department, takes a stricter stance, prohibiting AI use in her classes and viewing technology as a barrier to student engagement.
She acknowledges that her position makes her an outlier in a college she described as “progressive” and “exploratory” in its approach to AI.
Jackson said her concerns stem from the environmental and social impacts of AI.
“For me, AI is not necessary and I find its environmental impacts don’t outweigh the benefits for me and the work that I do,” Jackson said.
She said many of her students, who often focus on sustainability and community issues, are relieved by the policy, especially in courses centered on human interaction and qualitative work.
Jackson also said technology can create distance in the classroom.
“Sometimes there’s full classes where I’m like, is anyone here?” Jackson said.
She said heavy screen use signals disengagement and can make instructors feel undervalued.
Jackson also pointed to broader concerns, including the environmental costs of AI — such as water usage, noise pollution and strain on local ecosystems — and the risk of bias in AI systems.
“We’re going to need really ethical leaders who are thinking about the complexities of our impact on the environment,” she said.
Jackson said she’d reconsider her stance if more environmentally responsible AI systems were developed.
“Until the benefits of it outweigh the environmental and social harms, I’m not going to use it,” she said.