As artificial intelligence becomes more accessible and embedded across K-12 education, school systems are grappling with how to harness its potential while safeguarding sensitive student and staff data.
That was the subject of a webinar Oct. 7, hosted by the cloud software company PowerSchool, in which experts discussed the growing threats schools face, the need for organizationwide frameworks, and strategies for educators and leaders to use AI tools safely and effectively.
Panelists from Microsoft and the cybersecurity companies Abnormal AI and Secure Schools all agreed that AI can strengthen operations and instruction, but only if schools take a systemwide approach that aligns cybersecurity, governance and professional learning. They also agreed that K-12 institutions are now a top target for cyber attacks, and that districts must focus on collaboration, transparency and continuous improvement to keep pace with evolving threats.
THE EVOLVING THREAT LANDSCAPE
According to several panelists, K-12 schools are prime targets for cyber attacks because they manage large volumes of student data — including medical records, documentation regarding students’ disabilities and faculty financial information — often with limited resources.
Zach Oxman, area vice president at Abnormal AI, said AI has effectively leveled the playing field for malicious actors, noting that hackers are constantly improving their skills.
“The tooling that is now available to adversaries is shocking,” he said. “We’ve armed the most novice hackers with some of the tools that used to only be available to some of the most organized groups in the world.”
Secure Schools co-founder and CEO Paul Alberry echoed this point, noting that the capabilities of potential attackers have grown rapidly.
“I don’t think we can talk about threat without talking about capability. And I think with the introduction of AI tools, everybody and everything just became a lot more capable,” he said.
The panelists highlighted that ransomware and data extortion remain the most common threats against K-12 institutions. While AI has increased the speed and precision of attacks, the speakers told school leaders that investing in basic defenses — multifactor authentication, unique passwords and timely patching — is more critical than ever.
“As the use of AI develops, making sure that humans stay in the loop is super important,” Alberry said. “Fact-checking by a human within the verification process of any output from AI is really important at the moment.”
BUILDING SECURE FRAMEWORKS
To mitigate risks, experts emphasized the need for clear governance structures and consistent communication across all levels of a district. The panelists said AI adoption should begin with policies that prioritize security, transparency and ethical use.
Geri Gillespy, Microsoft’s director and industry adviser for the Americas, said school leaders must remain actively engaged in cybersecurity decision-making.
“We have to be aware and include everyone in our learning community,” said Gillespy, who spent the first three decades of her career as an educator, instructional coach and superintendent. “It’s always difficult as an educational leader to see how much oversight to give without trying to micromanage.”
Gillespy also said district partnerships with technology vendors play a major role in establishing sustainable frameworks for AI use and implementation, and urged schools to demand clarity in how data is collected, used and protected.
Oxman, speaking from the vendor perspective, agreed.
“From the vendor community, it’s up to us to reach out and understand how can we create courses, internships or opportunities, even at the K-12 level, to bring students into these programs?” he said. “And how can we create the next generation of [people who] are going to be able to operate with these tools when they get into the work world, or as they’re using it to apply to colleges?”
PROFESSIONAL LEARNING AND CONTINUOUS IMPROVEMENT
The speakers argued that building a security-first culture requires more than policies — it demands ongoing professional learning for staff, students and families that empowers community members to take ownership of their digital safety.
Gillespy said cybersecurity and AI ethics should be woven into communitywide discussions throughout the school year, rather than confined to one-time training sessions.
“This is not something we can talk about at the beginning of a school year. It’s something that needs to be talked about regularly,” she said. “Our students will always outnumber us, and they’re the largest resource we have. So how are we engaging them in this conversation?”
Alberry noted that training should be personalized and engaging, reflecting different roles and responsibilities across a district.
“Not everyone has the same training needs,” he said. “No teacher wants to sit in on a lecture and check off a box that a requirement is completed.”
He instead suggested that school leaders separate initial onboarding from refresher training.
INTEGRATING SYSTEMS WHILE PROTECTING PRIVACY
Districts often operate with disconnected systems, which can create vulnerabilities, according to the panelists. As AI tools begin to interact across platforms, the speakers recommended that data integration be managed carefully.
“Where I see us moving, and where I see, especially in education, a big opportunity or move for advancements, is LLMs [large language models] built and contained within specific environments,” Oxman said. “So how are we going to be able to share data across districts to provide for better outcomes? It’s a huge challenge.”
Oxman also stressed the importance of balancing safety and convenience, urging school leaders not to introduce AI into a system for the sake of having it, but rather to carefully vet and understand the technology before implementing it in school operations.
“If we think of tools and tooling that we’ve all become very aware of and use every single day, convenience allows us to have maybe one sign-on piece that gives us access to an entire suite of tools in our organization and in our environment,” he said. “While that’s great, it also creates a very simple entry point for a bad actor to go spread damage across a large attack surface and compromise student or faculty and staff data.”
The speakers agreed that adopting AI securely is not just a technical project but a cultural shift. By embedding cybersecurity into governance, training and vendor relationships, and by engaging entire school communities in ongoing dialogue, they said districts can create resilient systems that balance innovation with responsibility.