From Face Mask Detection to Temperature Checks, Districts Bought AI-Surveillance Cameras to Fight COVID. Why Critics Call Them ‘Smoke and Mirrors’
Support The 74's year-end campaign. Make a tax-exempt donation now.
This story is part of a series produced in partnership with The Guardian exploring the increasing role of artificial intelligence and surveillance in our everyday lives during the pandemic, including in schools.
When students in suburban Atlanta returned to school for in-person classes amid the pandemic, they were required to cover their faces with cloth masks, as in many places across the U.S. Yet in this 95,000-student district, officials took mask compliance a step further than most.
Through a network of security cameras, officials harnessed artificial intelligence to identify students whose masks drooped below their noses.
“If they say a picture is worth a thousand words, if I send you a piece of video — it’s probably worth a million,” said Paul Hildreth, the district’s emergency operations coordinator. “You really can’t deny, ‘Oh yeah, that’s me, I took my mask off.’”
The school district in Fulton County had installed the surveillance network, by Motorola-owned Avigilon, years before the pandemic shuttered schools nationwide in 2020. Under a constant fear of mass school shootings, districts in recent years have increasingly deployed controversial surveillance networks like cameras with facial recognition and gun detection.
With the pandemic, security vendors switched directions and began marketing their wares as a solution to stop the latest threat. In Fulton County, the district used Avigilon’s “No Face Mask Detection” technology to identify students with their faces exposed.
During remote learning, the pandemic ushered in a new era of digital student surveillance as schools turned to AI-powered services like remote proctoring and digital tools that sift through billions of students’ emails and classroom assignments in search of threats and mental health warning signs. Back on campus, districts have rolled out tools like badges that track students’ every move.
But one of the most significant developments has been in AI-enabled cameras. Twenty years ago, security cameras were present in 19 percent of schools, according to the National Center for Education Statistics. Today, that number exceeds 80 percent. Powering those cameras with artificial intelligence makes automated surveillance possible, enabling things like temperature checks and the collection of other biometric data.
Districts across the country have said they’ve bought AI-powered cameras to fight the pandemic. But as pandemic-era protocols like mask mandates end, experts said the technology will remain. Some educators have stated plans to leverage pandemic-era surveillance tech for student discipline while others hope AI cameras will help them identify youth carrying guns.
The cameras have faced sharp resistance from civil rights advocates, who question their effectiveness and argue they trample students’ privacy rights.
Noa Young, a 16-year-old junior in Fulton County, said she knew that cameras monitored her school but wasn’t aware of their high-tech features like mask detection. She agreed with the district’s now-expired mask mandate but felt that educators should have been more transparent about the technology in place.
“I think it’s helpful for COVID stuff but it seems a little intrusive,” Young said in an interview. “I think it’s strange that we were not aware of that.”
‘Smoke and mirrors’
Outside of Fulton County, educators have used AI cameras to fight COVID on multiple fronts.
In Rockland, Maine’s Regional School Unit 13, officials used federal pandemic relief money to procure a network of cameras with “Face Match” technology for contact tracing. Through advanced surveillance, the cameras by California-based security company Verkada allow the 1,600-student district to identify students who came in close contact with classmates who tested positive for COVID-19. In its marketing materials, Verkada explains how districts could use federal funds tied to the public health crisis to buy its cameras for contact tracing and crowd control.
At a district in suburban Houston, officials spent nearly $75,000 on AI-enabled cameras from Hikvision, a surveillance company owned in part by the Chinese government, and deployed thermal imaging and facial detection to identify students with elevated temperatures and those without masks.
The cameras can screen as many as 30 people at a time and are therefore “less intrusive” than slower processes, said Ty Morrow, the Brazosport Independent School District’s head of security. The checkpoints have helped the district identify students who later tested positive for COVID-19, Morrow said, although a surveillance testing company has argued that accurately scanning 30 people at once, as Hikvision claims, is not possible.
“That was just one more tool that we had in the toolbox to show parents that we were doing our due diligence to make sure that we weren’t allowing kids or staff with COVID into the facilities,” he said.
Yet it’s this mentality that worries consultant Kenneth Trump, the president of Cleveland-based National School Safety and Security Services. Security hardware for the sake of public perception, the industry expert said, is simply “smoke and mirrors.”
“It’s creating a façade,” he said. “Parents think that all the bells and whistles are going to keep their kids safer and that’s not necessarily the case. With cameras, in the vast majority of schools, nobody is monitoring them.”
‘You don’t have to like something’
When the Fulton County district upgraded its surveillance camera network in 2018, officials were wooed by Avigilon’s AI-powered “Appearance Search,” which allows security officials to sift through a mountain of video footage and identify students based on characteristics like their hairstyle or the color of their shirt. When the pandemic hit, the company’s mask detection became an attractive add-on, Hildreth said.
He said the district didn’t actively advertise the technology to students, but they likely became aware of it quickly once classmates were called out for breaking the rules. He doesn’t know students’ opinions about the cameras — and didn’t seem to care.
“I wasn’t probably as much interested in their reaction as much as their compliance,” Hildreth said. “You don’t have to like something that’s good for you, but you still need to do it.”
A Fulton County district spokesman said they weren’t aware of any instances where students were disciplined because the cameras caught them without masks.
After the 2018 mass school shooting in Parkland, Florida, the company Athena Security pitched its cameras with AI-powered “gun detection” as a promising school safety strategy. Similar to facial recognition, the gun detection system uses artificial intelligence to spot when a weapon enters a camera’s field of view. By identifying people with guns before shots are fired, the service is “like Minority Report but in real life,” a company spokesperson wrote in an email at the time, referring to the 2002 science-fiction thriller that predicts a dystopian future of mass surveillance. During the pandemic, the company rolled out thermal cameras that a company spokesperson wrote in an email could “accurately pre-screen 2,000 people per hour.”
The spokesperson declined an interview request but said in an email that Athena is “not a surveillance company” and did not want to be portrayed as “spying on” students.
Among the school security industry’s staunchest critics is Sneha Revanur, a 17-year-old high school student from San Jose, California, who founded the youth-led group Encode Justice to highlight the dangers of artificial intelligence to civil liberties.
Revanur said she’s concerned by districts’ decisions to implement surveillance cameras as a public health strategy and that the technology in schools could result in harsher discipline for students, particularly youth of color.
Verkada offers a cautionary tale about the potential harms of pervasive school surveillance and student data collection. Last year, the company suffered a massive data breach when a hack exposed the live feeds of 150,000 surveillance cameras, including those inside Tesla factories, jails and at Sandy Hook Elementary School in Newtown, Connecticut. The Newtown district, which suffered a mass school shooting in 2012, said the breach didn’t expose compromising information about students. The vulnerability hasn’t deterred some educators from contracting with the California-based company.
After a back-and-forth, a Verkada spokesperson would not grant an interview or respond to a list of written questions.
Revanur called the Verkada hack at Sandy Hook Elementary a “staggering indictment” of educators’ rush for “dragnet surveillance systems that treat everyone as a constant suspect” at the expense of student privacy. Constant monitoring, she argued, “creates this culture of fear and paranoia that truly isn’t the most proactive response to gun violence and safety concerns.”
In Fayette County, Georgia, the district spent about $500,000 to purchase 70 Hikvision cameras with thermal imaging to detect students with fevers. But it ultimately backtracked and disabled them after community uproar over their efficacy and Hikvision’s ties to the Chinese government. In 2019, the U.S. government imposed a trade blacklist on Hikvision, alleging the company was implicated in China’s “campaign of repression, mass arbitrary detention and high-technology surveillance” against Muslim ethnic minorities.
The school district declined to comment. In a statement, a Hikvision spokesperson said the company “takes all reports regarding human rights very seriously” and has engaged governments globally “to clarify misunderstandings about the company.” The company is “committed to upholding the right to privacy,” the spokesperson said.
Meanwhile, Regional School Unit 13’s decision to use Verkada security cameras as a contact tracing tool could run afoul of a 2021 law that bans the use of facial recognition in Maine schools. The district didn’t respond to requests for comment.
Michael Kebede, the ACLU of Maine’s policy counsel, cited recent studies on facial recognition’s flaws in identifying children and people of color and called on the district to reconsider its approach.
“We fundamentally disagree that using a tool of mass surveillance is a way to promote the health and safety of students,” Kebede said in a statement. “It is a civil liberties nightmare for everyone, and it perpetuates the surveillance of already marginalized communities.”
In Fulton County, school officials wound up disabling the face mask detection feature in cafeterias because it was triggered by people eating lunch. Other times, it identified students who pulled their masks down briefly to take a drink of water.
In suburban Houston, Morrow ran into similar hurdles. When white students wore light-colored masks, for example, the face detection sounded alarms. And if students rode bikes to school, the cameras flagged their elevated temperatures.
“We’ve got some false positives but it was not a failure of the technology,” Hildreth said. “We just had to take a look and adapt what we were looking at to match our needs.”
With those lessons learned, Hildreth said he hopes to soon equip Fulton County campuses with AI-enabled cameras that identify students who bring guns to school. He sees a future where algorithms identify armed students “in the same exact manner” as Avigilon’s mask detection.
In a post-pandemic world, Albert Fox Cahn, founder of the nonprofit Surveillance Technology Oversight Project, worries the entire school security industry will take a similar approach. In February, educators in Waterbury, Connecticut, sparked controversy when they proposed a new network of campus surveillance cameras with weapons detection.
“With the pandemic hopefully waning, we’ll see a lot of security vendors pivoting back to school shooting rhetoric as justification for the camera systems,” he said. Due to the potential for errors, Cahn called the embrace of AI gun detection “really alarming.”
Disclosure: This story was produced in partnership with The Guardian. It is part of a reporting series that is supported by the Open Society Foundations, which works to build vibrant and inclusive democracies whose governments are accountable to their citizens. All content is editorially independent and overseen by Guardian and 74 editors.