NEW YORK CITY — As artificial intelligence becomes more prevalent in classrooms nationwide, district leaders are navigating what it means to pilot, safeguard and scale new tools responsibly.
According to a panel of school leaders speaking Oct. 22 at the EdTech Week conference, doing so effectively depends as much on training, privacy and alignment with learning goals as on the technology itself. Together, they described a moment of both experimentation and caution, and the need to balance innovation with trust, clarity and evidence of impact.
PILOTS: AI TOOLS IN CLASSROOMS
As districts move from industry hype to hands-on testing of AI tools, leaders say many pilots begin with tools already embedded in district systems or with tightly scoped, curriculum-aligned experiments.
For Boston Public Schools (BPS), the exploration of AI began with tools already in use.
“We wanted to look at existing tools that we had in our district,” said Rhianon Gutierrez, the district’s director of digital learning. “At the height of the pandemic, we were already using a platform called Panorama, which supports student success. And we also have our climate survey data in there. And so it is ultimately tied to our student data.”
That existing relationship led Boston to pilot the AI-powered software Solara, which the school system has been using for nearly a year, Gutierrez said.
“Solara … was created by the team at Panorama in partnership with us,” Gutierrez said. “They were already thinking about how to use AI in their platform, and because we were a longtime valued partner of Panorama, they said, ‘We have this tool. Are you interested in looking at it?’”
Training and support, she added, were essential.
“It was really important for us to understand how the tool worked and how we can train our teachers and staff at the central office level on using this tool,” she said.
In Denver, school leaders started by assessing which uses of AI interested both teachers and administrators, then assembled a team to identify use cases for AI and figure out what infrastructure and training were needed to implement them.
“We had a pretty solid bottom-up strategy, meaning a lot of interest from teachers across the network using AI, high usage, but we’re not seeing the best outcomes, and it seemed like a lot of it was wasting time,” said Zachary Kennelly, director of STEM and AI partnerships for Denver’s STEM-focused charter school system. “We built a cross-functional team to identify those use cases … really figuring out, how do we ensure that our staff understand how to get the best out of AI through upskilling them and building the infrastructure that they need to do that?”
That work included a partnership with the tech nonprofit Playlab.ai, whose software, Kennelly said, allows educators to create and customize AI tools for their classrooms without having to code. Teachers can design AI-powered learning experiences like chatbots and differentiated activities for varying student levels.
The main ongoing goal of the network's partnership with Playlab, he added, was to ensure that the software yielded more deterministic and reliable data, and that it was used to improve student outcomes.
Alexander Fishman, manager of digital learning design and instructional GenAI at Chicago Public Schools (CPS), said his district decided to create its own tool after struggling to find an external product that fit its scale and needs.
CPS has been developing Skyline AI (SKAI) for a year, he said, designing it to support internal curriculum work before expanding its use.
“We’re first building a tool for our curriculum designers — about 25 folks — so we’re de-risking a little bit, and then looking to scale that out to our educators first, and only then our students,” Fishman said. “While you’ll hear me mention a lot of things we’re doing with students, with families and literacy work, when it comes to building, we’re building small.”
SAFETY, INSTRUCTIONAL ALIGNMENT AND LONG-TERM VALUE
Amid so many experiments with AI, some panelists emphasized that responsible use requires grounding technology in pedagogy, privacy and sustainability.
“We really believe in our educators, and … starting with bottom-up demand — what are the bottom-up demands for the people who have deep expertise in the contextualized problems of our schools, our system and our community?” Kennelly said.
He added that most challenges are long-standing rather than novel, which encouraged the district to shift from an “innovation mindset” to an “outcomes mindset.”
“Most of our problems are deeply entrenched and extremely difficult to address,” he said. “What we’re most interested in are the things that we get right, what we know to be true about teaching and learning, and then technologies that can help us unlock bottlenecks in adaptive ways.”
For BPS, data privacy is a central concern, Gutierrez said: the school system does not want to put student data into any tool without established, effective safeguards. For that reason, she encouraged vendors to understand district data protocols before reaching out.
“If you’re interested in partnering with a district, large or small, look at their tech standards,” she said. “Be willing to sign data privacy agreements. If you’re going to be working in Boston Public Schools, there’s the Massachusetts Data Privacy Agreement, and you have to be prepared to sign it.”
Gutierrez also pointed to financial sustainability as a growing barrier. But, she added, integration and interoperability of ed-tech tools can help offset cost burdens.
Fishman echoed Gutierrez, saying CPS focuses on interoperability for any tools it intends to implement at scale. However, he cautioned against both unchecked optimism and fear in conversations surrounding AI.
“This swing between the extreme optimism and that energy, the LinkedIn energy, and then the fear and the dread and the concern … we need to exist in the space in between,” he said. “We’re also ready to look to the future. We’re just ready to look at it without rose-colored glasses.”