
California’s two biggest school districts botched AI deals. Here are lessons from their mistakes.


With all the hubris of a startup founder, Alberto Carvalho, superintendent of Los Angeles Unified School District, took to the stage in March to launch Ed the chatbot. He told parents and students it had “the potential to personalize the educational journey at a level never before seen in this district, across the country, or around the world.”

“No other technology can deliver real time on this promise,” he said. “We know it will succeed.”

In June, after only three months and nearly $3 million, the district shelved Ed following layoffs of more than half of the staff at AllHere, the startup that made the conversational AI assistant. District spokesperson Britt Vaughan refused to answer questions about the bot’s performance or say how many students and parents used it before the shutdown.

Also in June, an AI controversy unfolded in San Diego, where school board members reportedly weren’t aware that the district last summer bought a tool that automatically suggests grades for writing assignments. The dustup began after Point Loma High School teacher Jen Roberts told CalMatters that using the tool saved her time and reduced burnout but also sometimes gave students the wrong grade.

A week later, Voice of San Diego quoted two members of the school board saying they were unaware the district had signed a contract involving AI. In fact, no one on the board seemed to know about the tool, the news outlet said, since it was included in a broader contract with Houghton Mifflin that was approved unanimously, without discussion, alongside more than 70 other items. (None of the board members responded to CalMatters’ requests for comment. San Diego Unified School District spokesperson Michael Murad said that since AI is a quickly evolving technology, “we will make an increased effort to inform board members of additional relevant details related to contracts presented to them in the future.”)

The mistakes in Los Angeles and San Diego may trace back to growing pressure on educators to adopt AI, and they underline the need for decision-makers to ask more, and tougher, questions about such products before buying them, said people who work at the intersection of education and technology. Outside experts can help education leaders vet AI products, these people said, but even asking basic questions, and demanding answers in plain English, can go a long way toward avoiding buyer’s remorse.

No one disputes that educators face increasing demands to find ways to use AI. Following the release of OpenAI’s generative AI tool ChatGPT nearly two years ago, the California Department of Education released guidance referencing an “AI revolution” and encouraging adoption of the technology. Educators who previously spoke with CalMatters expressed fear that if they miss the revolution, their students could fall behind in learning or workforce preparedness.

Grading AI tools

Staff shortfalls, techno-optimism, a desire to be on the cutting edge and a fear of missing out all push educators to adopt AI, said Hannah Quay-de la Vallee, a senior technologist at the Center for Democracy and Technology, a nonprofit that’s studied how teachers and students are adopting generative AI.

She thinks recent events in Los Angeles and San Diego show that more education leaders need to engage in critical analysis before bringing AI tools into classrooms. But whether a particular AI tool deserves more scrutiny depends on how it’s used and the risk that use poses to students. Some forms of AI, like those used for grading or for predicting whether a student will drop out of school, she said, deserve high-risk labels.

The European Union regulates AI differently based on risk level, and in the U.S., the National Institute of Standards and Technology has released a framework to help developers, government agencies, and users of AI technology manage risk.

California’s state schools superintendent, Tony Thurmond, was unavailable to answer CalMatters’ questions about what action he could take to help prevent future school AI snafus.

Lawmakers are considering a bill that would require the superintendent to convene a working group to make recommendations on the “safe and effective” use of artificial intelligence in education. The bill was introduced by state Sen. Josh Becker, a Democrat from Silicon Valley, and is supported by Thurmond and the California Federation of Teachers.

Quay-de la Vallee suggested that educators work with organizations that vet and certify education technology tools, such as the nonprofit Project Unicorn.

When education leaders rush to adopt AI from education technology providers eager to sell it, both sides may cut corners, said Anaheim Union High School District Superintendent Michael Matsuda, who hosted an AI summit in March attended by educators from 30 states and more than 100 school districts.

He thinks the recent AI problems in San Diego and Los Angeles demonstrate the need to avoid getting caught up in hype and to vet claims made by companies selling AI tools.

School districts can assess how well AI tools perform in classrooms with help from tech-minded teachers and internal IT staff, Matsuda said. But assistance is also available from nonprofits like The AI Education Project, which advises school districts across the nation about how to use the technology, or a group such as the California School Boards Association, which has an AI task force that tries to help districts and counties “navigate the complexities of integrating artificial intelligence.”

“We have to work together, consider what we learned from missteps, and be open about that,” he said. “There’s a lot of good products coming out, but you have to have the infrastructure and strategic policies and board policies to really vet some of these things.”

Education leaders don’t always have an intimate understanding of the tech used by teachers in their districts. Matsuda said Anaheim Union High School District uses AI to personalize student learning materials and even offers classes for students interested in a career in AI, but he doesn’t know whether Anaheim educators currently use AI for grading. Following the events in San Diego, he said, his district may consider high-risk labels for certain use cases, such as grading.

Using common sense

You don’t have to be an expert in AI to be critical of claims about what AI can do for students or teachers, said Stephen Aguilar, co-lead of the Center for Generative AI and Society at the University of Southern California and a former developer of education technology. District officials who sign contracts with AI companies need to know their own policies, know what the district seeks to achieve by signing, and ask questions. If contractors can’t answer those questions in plain English, that may be a signal they’re overselling what’s possible or hiding behind technical jargon.

“I think everyone should take the lessons learned from LA Unified and do the post-mortem, ask questions that weren’t asked, and slow things down,” Aguilar said. “Because there’s no rush. AI is going to develop, and it’s really on the AI edtech companies to prove out that what they’re selling is worth the investment.”

The challenge, he said, is that you don’t evaluate an AI model just once. Different versions can produce different results, which means evaluation should be a continuous process.

Aguilar said that while events in Los Angeles and San Diego schools demonstrate the need for greater scrutiny of AI, school district administrators seem convinced that they have to be on the cutting edge of technology to do their jobs, and that’s just not true.

“I don’t quite know how we got into this cycle,” he said.

The market is pressuring edtech providers to include AI in their products and services, foundations are pressuring school leaders to include AI in their curriculum, and teachers are told that if they don’t adopt AI tools then their students might get left behind, said Alix Gallagher, head of strategic partnerships at the Policy Analysis for California Education center at Stanford University.

Since AI is getting built into a lot of existing products and contracts involving curriculum, it’s highly likely that San Diego’s school board is not alone in discovering AI unexpectedly bundled into a contract. Gallagher said that administrative staff will need to ask questions about supplemental curricula or software updates.

“It’s close to impossible for districts and schools to keep up,” she said. “I definitely think that’s even more true in smaller school districts that don’t have extra people to devote to this.”

Gallagher said AI can do positive things like reduce teacher burnout, but individual teachers and small school districts won’t be able to keep up with the pace of change, so trusted nonprofits or state education officials should help determine which AI tools are trustworthy. The question in California, she said, is who will step up and lead that effort.

___

This story was originally published by CalMatters and distributed through a partnership with The Associated Press.
