Nearly 400 cameras with artificial-intelligence capabilities are scattered across the Cheyenne Mountain School District in Colorado Springs — and they can find you.
AI facial-recognition functionality means school administrators or security officers can upload into the system a photo of someone flagged as a “person of interest.” When anyone matching that photo is caught on camera, school officials are notified and immediately given the relevant video footage.
The cameras can also pinpoint and track people using search terms. For example, a principal could type in that a student wearing a red shirt and yellow backpack ran away from their classroom, and the AI-enabled camera system could find students matching that description and quickly determine which way they went.
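Verkada has not published how its matching works, but person-of-interest search in systems like this generally rests on face embeddings: the uploaded photo is mapped to a numeric vector, and faces seen on camera are flagged when their vectors land close enough to it. Below is a minimal sketch of that general technique, not Verkada's code; the embedding function, the similarity threshold and the frame format are all hypothetical.

```python
import numpy as np

# Hypothetical stand-in for a real face-embedding model, which would map a
# face crop to a unit vector so that similar faces land near each other.
def embed_face(face_pixels: np.ndarray) -> np.ndarray:
    v = np.resize(face_pixels.astype(float).ravel(), 128)
    norm = np.linalg.norm(v)
    return v / norm if norm else v

MATCH_THRESHOLD = 0.75  # assumed cutoff; real systems tune this per deployment

def scan_frames(reference_photo: np.ndarray, frames: list[dict]) -> list[tuple]:
    """Flag frames whose detected faces match the person-of-interest photo."""
    ref = embed_face(reference_photo)
    hits = []
    for frame in frames:
        for face in frame["faces"]:  # face crops found upstream by a detector
            if float(ref @ embed_face(face)) > MATCH_THRESHOLD:
                hits.append((frame["camera_id"], frame["timestamp"]))
    return hits  # where and when the match was seen, for pulling footage
```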
“There are some interesting cases of how it can be used to quickly find people in an emergency and enhance building security in an emergency,” said Colorado state Sen. Chris Hansen, who sponsored legislation that created a task force to discuss how the state can effectively govern AI usage. “We need to balance that for potential misuses and overly zealous surveillance. That’s what we’ve been grappling with.”
A handful of Colorado school districts and higher education institutions have implemented AI surveillance technologies in a bid to keep students safe, though a statewide moratorium has prevented most from doing so. That could change next summer, when the prohibition ends.
At the same time, state legislators and technology experts are debating how to best regulate AI usage in schools, where security concerns butt up against the ethics of using artificial intelligence to surveil children.
Last month, California-based security technology company Verkada held a three-day conference at the Colorado Convention Center in Denver, showcasing its cutting-edge products to thousands of potential customers, including local school district officials.
Existing Colorado customers, including representatives of the Cheyenne Mountain School District and Greeley’s Aims Community College, came to talk up the technology they said has drastically improved their campus security.
AI cameras weren’t the only tech offering of the day.
Verkada sells wireless systems that can lock all doors in seconds with the push of a button in the event of a school lockdown. The company also offers mounted panic buttons with 24/7 monitoring and the option for immediate police dispatch.
The company also makes air quality sensors that Cheyenne Mountain hangs in secondary school bathrooms to detect students vaping. A software upgrade on the sensors can even detect whether the vape smoke contains THC. Verkada cameras stationed outside school bathrooms can then help identify a culprit, said Greg Miller, executive director of technology for the Cheyenne Mountain district.
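Neither Verkada nor the district has described the plumbing behind that workflow, but in its simplest form it amounts to matching a sensor alert's timestamp against footage from nearby cameras. A rough sketch, with hypothetical names and a made-up time window:

```python
from datetime import datetime, timedelta

def clips_near_alert(alert_time: datetime, clips: list[dict],
                     window: timedelta = timedelta(minutes=2)) -> list[dict]:
    """Return camera clips recorded within `window` of a vape-sensor alert."""
    return [c for c in clips if abs(c["start"] - alert_time) <= window]

# Hypothetical example: a sensor trips in a bathroom, and staff pull clips
# from the hallway camera outside to see who entered or left around then.
alert = datetime(2024, 9, 3, 10, 15)
clips = [{"camera": "hall-cam-2", "start": datetime(2024, 9, 3, 10, 14)}]
print(clips_near_alert(alert, clips))
```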
“We went down a road of Verkada, even though it’s definitely not the cheapest, because it’s something as a small district without adding more people that we could easily support,” Miller said. “It’s been critical in multiple incidents where we can click on a face and know which door that child exited so they can find them and safely make sure they aren’t harming themselves. We can do that in under 30 seconds with the staff at a school having access.”
While the conference drew people from all over the country, Colorado’s tragic school-shooting history offered an unfortunate array of options for booking speakers who addressed the need for robust campus security.
Frank DeAngelis, who was principal of Columbine High School during the 1999 massacre, was a panelist alongside Pat Hamilton with the “I Love U Guys” Foundation, founded in 2006 after the Platte Canyon High School hostage crisis in Bailey, during which 16-year-old Emily Keyes was shot and killed. The foundation’s programs for crisis response are used in more than 50,000 schools and other organizations worldwide.
DeAngelis stressed the importance of cameras and push-button locking doors in schools, noting that security measures were “not meant to scare, but to prepare.” Seconds matter in disaster preparedness, DeAngelis said, and if AI-enabled cameras allow law enforcement to more quickly assess a threat and respond, then that’s worth it, he said.
Hamilton spoke of the necessity of having a standard response protocol like the one the “I Love U Guys” Foundation teaches, including the lockdown procedure of “locks, lights, out of sight” used to keep classrooms quiet and orderly during an emergency.
While Hamilton said some of the newer school safety technologies may seem beyond the reach of public schools, they could be funded by money raised through bonds and mill levies.
In 2021, K-12 schools and colleges in the United States spent an estimated $3.1 billion on security products and services, up from $2.7 billion in 2017, according to an American Civil Liberties Union report issued last year on the education technology surveillance industry.
Cheyenne Mountain officials declined to say how much the district has spent in total on Verkada products and services, but noted it has paid an average of $35,000 per building for the company’s access-control system and $60,000 per building for cameras across its eight schools.
Aims Community College officials said they have spent about $1.25 million on Verkada products in the five years the school has used the company’s services.
The Aims campus uses Verkada cameras and the company’s monitoring system. John Fults, director of campus safety and security, said the school can set up a geofence around building perimeters and receive alerts and video footage when people cross the invisible line.
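Fults did not describe how the alerts are computed, but geofencing of this kind typically reduces to a point-in-polygon test: the perimeter is stored as a polygon, and each detected person's position is checked against it. A minimal sketch using the standard ray-casting test, with made-up coordinates:

```python
# Ray-casting point-in-polygon test: count how many polygon edges a
# horizontal ray from the point crosses; an odd count means "inside."
def inside_fence(point, fence):
    x, y = point
    inside = False
    for i in range(len(fence)):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % len(fence)]
        if (y1 > y) != (y2 > y):  # this edge straddles the ray's height
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

perimeter = [(0, 0), (40, 0), (40, 25), (0, 25)]  # hypothetical building outline
if inside_fence((12, 10), perimeter):
    print("alert: person detected inside the geofenced perimeter")
```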
Fults said the college tells students upfront about the nearly 300 AI cameras on the Greeley campus.
“We always relate it to there is nowhere you can go in a city or building where there is not a camera watching every move you make,” he said.
The increasing use of AI led the Colorado legislature to form a task force to discuss best practices and regulations around artificial-intelligence technologies, including notice and disclosure requirements, protections against algorithmic discrimination for disproportionately impacted communities, privacy concerns and data retention.
In 2022, state legislators passed a bill that placed a statewide moratorium on public schools contracting with vendors that offer AI facial-recognition technologies until July 2025. However, districts that already had these technologies in place before the bill passed, like Cheyenne Mountain, are allowed to continue using them.
Hansen, a Denver Democrat, sponsored the 2022 bill and the 2024 bill creating the AI task force, a 26-member group drawn from across the technology, surveillance and artificial-intelligence fields.
The task force is trying to determine how many school districts in Colorado are currently using AI or facial-recognition surveillance.
Hansen said he expects there will be legislation in the 2025 session that would create safeguards around the technologies.
“More and more school districts are talking about it,” Hansen said. “With many of these new technologies like AI biometrics, there are great upsides and some significant potential downsides.”
Namely, there’s no consensus as to whether AI technologies actually make schools safer.
Anaya Robinson, senior policy strategist at the ACLU of Colorado who is serving on the legislative AI task force, said everyone agrees keeping kids safe at school is a top priority, but there’s disagreement over how to do so.
“We don’t think that the potential benefits — and there’s not a whole lot of data to prove those exist — outweigh the harms not only to privacy but also the general safety and comfort and ease that students should get to feel in the place they spend the vast majority of their youth,” he said.
Robinson said he struggles to understand how the money spent on education technology surveillance wouldn’t be better spent on more school staffing.
Among the leading reports on school safety technology, a 2016 Johns Hopkins study found there are no “honest brokers” to test or recommend specific technologies or vendors to schools, leading many campus officials to rely on vendor-sponsored research, word of mouth or advice from law enforcement. The study found limited evidence of the success or cost-effectiveness of technology in schools to prevent and mitigate crime, disorder or catastrophic events.
The 2023 ACLU report found that marginalized students, including those in the LGBTQ community, students of color, low-income students and those who are undocumented or have undocumented family members, are particularly susceptible to harmful consequences brought on by school surveillance.
One of the reasons for the temporary prohibition on the technologies, Hansen said, is that the legislature was presented with evidence of AI cameras misidentifying students, particularly students of color.
Miller said the Cheyenne Mountain School District doesn’t use its cameras to target students.
“If a kid is not a risk to himself for some reason — like a preschooler who is running off — outside of things like that, we do not keep kids uploaded to where a principal or vice principal at a site level could continually watch,” Miller said. “We’re not in the business of trying to shift a bias on a kid. We are only in the business of trying to keep a kid as safe as possible.”
When parents, teachers or community members express concerns over the use of AI in schools, Miller said, their fear often comes from ignorance of the technology, and educating them on its benefits is key.
“It’s like, are you more worried that someone is going to steal this data or that someone is going to get hurt and killed?” Miller said during his Verkada conference panel.
Mollie Markey, Verkada’s associate communications manager, said the company focuses on privacy-protecting measures such as the option to blur faces on video security feeds and to configure privacy zones that block certain areas from being recorded. The firm’s technology builds in permissions so administrators can determine who can access which features, and AI features can be toggled on and off, she said.
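Verkada's actual implementation isn't public, but a privacy zone is conceptually simple: configured regions of each frame are masked before the video is stored, so those pixels never enter the recording. A toy sketch, with hypothetical zone coordinates:

```python
import numpy as np

# Hypothetical privacy-zone config: (top, bottom, left, right) pixel bounds.
PRIVACY_ZONES = [(0, 120, 500, 640)]  # e.g. a window that shouldn't be recorded

def redact(frame: np.ndarray) -> np.ndarray:
    """Black out each privacy zone in a copy of an H x W x 3 video frame."""
    out = frame.copy()
    for top, bottom, left, right in PRIVACY_ZONES:
        out[top:bottom, left:right] = 0
    return out

frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
assert redact(frame)[:120, 500:640].max() == 0  # the zone is fully masked
```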
Kenneth Trump, president of National School Safety and Security Services, said there is little to no evidence that AI security measures make schools safer. Instead, Trump said vendors of these technologies practice “marketing on steroids” that bombards school administrators.
“Administrators are under an enormous amount of pressure in school communities to do something,” Trump said. “That creates a ‘do something, do anything, do it fast and do it now’ policy.”
Trump said administrators should invest in proactive rather than reactive approaches to keeping schools safe. Funding more educators, staff, counselors and better training for the people on the ground should be the future of school security, he said.
“The real future is not in bells and whistles and shiny objects,” he said. “It’s in your people.”