Education Lab is a Seattle Times project that spotlights promising approaches to persistent challenges in public education. The Seattle Foundation serves as the fiscal sponsor for Education Lab, which is supported by grants from the Gates Foundation and Ballmer Group.
Amid questions about artificial intelligence’s role in K-12 schools, 10 Washington districts are diving into an 18-month Microsoft-funded program to work on AI initiatives.
The 615-student Manson district in Chelan County intends to explore a mandatory ninth grade AI literacy class, which it hopes to pilot this fall.
Seattle is planning to translate AI guidance to help teachers, students and families understand how and when they’re allowed to use AI.
And Issaquah is working on an AI agent for high schoolers who receive special education services to navigate their education plans in real time and ensure the legally mandated services are working.
The $75,000-per-district grants, announced in February, are part of the Microsoft Elevate Washington initiative to expand AI access and related skills statewide. Districts can also get up to $25,000 to work with a Microsoft-certified consultant. The other grantees are Bellevue, Federal Way, Kennewick, Highline, Puyallup, Quincy and Walla Walla school districts.
The grants come as tech companies — from Microsoft to OpenAI to Anthropic — offer their products to districts or partner with educators.
AI boosters say the tools could reduce the time educators spend on administrative tasks — from creating lesson plans to analyzing large data sets — and free up teachers to work with students. Teachers and students are already using AI regardless of district policies, and students will need to know how to use AI ethically and responsibly in the workplace, proponents say.
That push lands amid increased calls from parents and others to roll back technology — from cellphones to laptops — in schools over concerns about privacy and the effects on students’ learning, critical thinking and well-being. They worry that current AI tools are untested and unsafe for children, and that the companies’ rush to partner with districts is akin to customer-acquisition programs that build lifelong brand affinity, potentially at children’s expense.
Still, students’ use is increasing. A Pew Research Center report in February found that 54% of students ages 13 to 17 used chatbots to assist with schoolwork and 57% used them to find information, more than double the share who reported using ChatGPT for schoolwork in a 2024 survey. Twelve percent said they’d used chatbots for emotional support or advice.
Meanwhile, a RAND survey released in March showed that students are also concerned about how AI may affect their ability to think critically, with 67% agreeing that their critical thinking skills would suffer as more students used AI for homework.
Jolenta Coleman-Bush, Microsoft’s director of education, workforce and AI skills, said the company was not dictating what schools should do with AI. But it’s important educators understand the tools, as they are best-suited to determine appropriate use in schools, to help shape policy, and “to prepare the next generation for an AI-enabled world,” she said.
“That’s our intent: It’s really to build up the capacity of educators,” she said.
But Benjamin Riley, the founder of Cognitive Resonance, which educates people on generative AI, said districts should resist the pressure and “stop chasing technology hype.”
Riley thinks the tools are pernicious because they outsource thinking and impede the student-teacher relationship.
“These are tools of cognitive automation, and since the business of education is to build knowledge and improve cognition, these are really profoundly unhelpful tools by and large,” he said.
Riley pushed back on the argument “the future demands this.” Not too long ago, the same people were saying everyone needs to learn how to code, he said.
Schools are in a hard spot because they’re under pressure to “do something,” but it’s still unclear what’s good AI policy, professional development or best practice, said Justin Reich, an associate professor of digital media at the Massachusetts Institute of Technology.
Districts, he said, should be upfront about their experimentation. They could run small, local experiments, gather data, evaluate and change accordingly, said Reich, who is also director of the Teaching Systems Lab.
“Step one is humility, and to say, ‘we do not know what to do …,’ ” he said. “I think there are a lot of elite actors that are not doing a very good job with this humility.”
District officials in the Microsoft project who were interviewed said they don’t feel like they’re being pitched a product — some already use Copilot, or other products, like Google Gemini.
Meg O’Conor, a Microsoft spokesperson, said the skepticism was fair, but the company’s education work predates AI, and it’s agnostic about the products the districts use.
District officials said they applied for the Microsoft grant because they were already experimenting with AI — with policies, teacher professional development or student use. The grant provides access to expertise the districts might not otherwise have, and the cohort of diverse districts offers a chance to learn from peers, they said.
“It’s been an equalizer,” said Tina Brewer, director of professional learning and assessment in Kennewick, where some teachers have embraced AI while others remain skeptical.
Kennewick is considering adding more professional development for teachers and a chatbot for teachers to help strengthen and differentiate instruction.
“Everybody is trying to find their way through this … and we’re all trying to support each other in making the right decisions, because we are all wanting to do the right things for students,” Brewer said.
Networking was also a big draw for Seattle, said Debra Knickerbocker, the district’s digital learning manager.
Seattle has been releasing AI guidance over the last year and a half, she said. Part of its project will address AI literacy and communicating to teachers and students the district’s expectations for AI use, including in specific classes.
She said the district wants to “help our students figure out when AI would make sense in our educational journey and when it might impact their critical thinking.”
Manson started an AI class in its career-and-technical-education pathway last year, but Superintendent Tabatha Mires said educators found students were missing key digital skills, including how to assess answers for bias.
She said the district is exploring AI because it’s aligned with Manson’s goal to graduate students ready for a changing world.
“If we just bury our heads in the sand, I don’t think we are doing a great job of ensuring that we are prepared for some of those dangers that we have heard about,” Mires said. “We’ve got to stay pretty well-educated about both the benefits and the possible harm that can come through use of AI with children.”
In Issaquah, officials are envisioning an AI agent that will enable students with disabilities to pull up the legal documents detailing how they will be educated and get real-time answers about those plans in language they can understand.
The project grew from panels the district hosted last year with secondary students who receive special education services. Those students said it was burdensome to discuss their accommodations with every teacher they encountered, according to Sharine Carver, the district’s executive director of special education.
Carver acknowledged there are challenges, including ensuring student data — like medical information that may be included in those documents — is protected and won’t be used for training AI models. The district plans to get student input on any product developed.
Emily Cherkin, a teacher, Seattle Public Schools parent and ed-tech writer, is also concerned about student privacy. She understands why children need to learn about AI, but believes there are ways to teach students about risks without exposing them to potential harm.
Both Cherkin and Riley signed a recent petition for a five-year pause on generative AI in pre-K-12 schools nationwide.
“Honestly, the best thing you can do to prepare children for the world we live in is to give them the critical thinking and executive function and citizenship skills offline,” she said. “Those are the skills that are going to raise critical thinkers to go into the tech fields to know why and when and how to use these tools.”
Microsoft Philanthropies underwrites some Seattle Times journalism projects.
Denisa R. Superville: 206-464-8216 or dsuperville@seattletimes.com. Denisa R. Superville is an Education Lab reporter at The Seattle Times, where she writes about school districts as they emerge from the pandemic years.