John Anderton, the chief of a futuristic police force known as PreCrime, stands before a large translucent screen. He is waiting for a ball, which indicates a future crime, to drop. Anderton, a character played by Tom Cruise in the 2002 film Minority Report, represents a world where all premeditated crime has been eradicated, aided by a hybrid system of tech and meta-humans that can predict and spot ne'er-do-wells.
The concept feels both futuristic and familiar. The original source material may have been published in 1956, but a little more than two decades after Minority Report was released, and a little less than three decades before the events of the film take place, we're actually closer to that world than ever before. For that, we can thank AI.
A crime-fighting AI assistant?
Herman DeBoard is the founder and CEO of Airez, a small AI start-up pitching a revolutionary real-time security system that uses AI neural networks to spot potential crime.
I posed the Minority Report comparison to DeBoard. "That's pretty close to what we've created here," he said, although they normally get compared to Skynet, the malevolent superintelligence system from the Terminator series. He said PreCrime's human overseers in Minority Report, who have to vote on whether or not to act on a crime ball, mirror the agentic system they've set up.
Here's how it actually works: A client, say a Vegas casino or an NFL stadium, reaches out to Airez about streamlining their security systems. The Airez team runs a non-invasive pilot, which involves the client sending over facility sensor data (mainly video footage, but eventually audio recordings, security systems, environmental or biometric information, you name it), which is then run through the Airez model. Airez scans through the information, flagging any pieces of data that stand out as anomalous.
"We give you contextual stories of what's happening. Everything from an emotional evaluation of the people involved, how tall they are, what their cultural makeup is, what they're wearing, to what direction they're walking and what they actually did." They're looking at how the environment around them changed too, DeBoard explains. "We paint these pictures in little 60-second clips, and then we send them to a security operations center or someone who's monitoring this."
To paraphrase DeBoard: Airez is merely looking for things that are out of place.
If the facility likes what Airez finds and signs on, the company launches a full integration, connecting the Airez software and dashboard to the facility's existing security system. The company can provide additional sensors, like infrared cameras or Airez's multi-patented FoRi (Fiber Optic Ring Interferometer) system, which captures vibrations or audio signals and determines precise geographic locations. It was designed by a former lead scientist at Halliburton and expands on existing fiber optic tech for listening to oil leaks in the ocean — it's mainly deployed in parking garage pilots, for now. Airez is ready to start monitoring right away, no acclimation period needed.
The AI is a "true agentic AI system," DeBoard says, utilizing real machine learning built on multiple Large Language Models (LLMs) and Vision Language Models (VLMs). He explains it as a proprietary blend of in-house and external models that form a super-powered GPT with contextual intelligence, fusing data from cameras, sensors, and external feeds. The company is currently running active pilot programs with three transportation networks — an interstate public transit provider and two inner-city public transit systems — and an international oil and gas company.
He speaks of "her" — DeBoard describes the AI as female — with a sense of awe.
"This is going to sound a little creepy. It does to me, and I'm the creator, but she's currently cognitive," DeBoard says, insisting he's not delusional. "She has the five senses. She even can smell. We do gas sensors and ammonia sensors. And then she makes sense of it in a way that she can then speak to you."
DeBoard wants Airez to talk to clients like the semi-sentient AIs that populate popular culture: a "living, breathing creature" birthed from a simple structure. She can text her clients or send them video run-downs of what's going on at their facilities. Eventually he wants her to be able to act autonomously based on the sensor data, like deploying drones that can investigate anomalies.
"She would see where the emergency exits are, and she could start to calmly talk to people with a voice. She could start to change screens. She could lower the temperature a little bit to get people more calm."
I can't help but recall the AI-computer-turned-antagonist VIKI, also a hyper-intelligent algorithm in the guise of a human woman, from I, Robot, starring Will Smith. I tell DeBoard (somewhat jokingly) that the only difference between Airez and Minority Report's process is that Airez doesn't have the psychic, future-telling abilities of PreCrime. Not yet, he says, but maybe soon.
DeBoard believes the vast trove of information held in LLMs and VLMs is not being used to its full potential; Airez, he says, offers a glimpse of what that future could look like. The AI could be deployed to any industry, any venue, any client, seamlessly, as an all-purpose security system, he claims, adding that it could one day monitor operational efficiency or track consumer trends for marketing.
"We want to use AI to make humanity's situation better, not worse," says DeBoard.
An artificial Big Brother?
Darrell West, senior fellow at the Brookings Institution and co-editor-in-chief of TechTank, tells Mashable that AI is being integrated across firms in the security sector. "AI is very good at pinpointing anomalies and then referring those situations to a human who can then assess if it is really a problem," he explained. "This has been happening for a while, but the tools are getting more powerful and there is more information available. Just on the video side, there's been a tremendous proliferation of cameras in public places."
Privacy watchdogs have kept their eyes trained on AI's integration into mass surveillance systems, including the use of algorithms to scour and flag "dissenting" opinions or improper behavior on social media and in workplaces.
"AI can’t understand human behavior — it’s a pattern-recognition machine trained on data that’s rife with society’s biases, and so its output reflects those biases too. If paired with invasive technologies such as face recognition, which itself reflects biases and makes errors that have led to false arrests, the potential harm is exponential," says Matthew Guariglia, senior policy analyst for the Electronic Frontier Foundation (EFF).
West also says that the advancement of AI stokes greater privacy fears, especially in a country with no national privacy laws. Companies in the space should keep data storage windows short and publicly disclose their use of surveillance tools, he says. Americans may want new tools to fight crime, but they value freedom even more, he adds.
DeBoard has heard the concerns before. He explains that Airez doesn't focus on facial or license plate recognition tech, two hot-button topics among watchdogs, because of their inherent privacy risks. That said, Airez can help clients add those features if they want them. Airez is also instructed not to build personal profiles of people caught on camera at its clients' venues, which would pose a potential nightmare in places governed by specific privacy laws, like hospitals.
"I agree that we shouldn’t normalize unnecessary surveillance. That’s not what Airez is about," says DeBoard. "Our focus is on event detection, not personal profiling. We’re not interested in who someone is — we’re interested in whether an unsafe event is occurring, like a weapon being brought into a school, a patient collapsing in a hallway, or a worker entering a restricted substation without PPE."
"But you can't walk down the city street these days without being on camera," he adds. "Whether you want to be or not, you're on the camera. There's cameras at every retail establishment. There's cameras at restaurants. There's street cameras. If you have a phone, you're being listened to right now. So all we're doing, essentially, is being a real time data analyst."
Parents were wary of piloting Airez in a school transportation system, DeBoard notes. Specifically, they didn't like the idea of more cameras on their kids. "Their kids are already on video. Their kids are everywhere on social media, most of them anyway, and we don't transmit that data. All of that data stays with the school system, and once we show them that path, they're usually fine."
The Electronic Frontier Foundation and other entities have been resolutely against government and private industries using AI to make decisions that impact quality of life, like mortgage approvals, job hiring, federal benefits, and, in the case of Airez's real-time security tools, any sort of crime prediction. "The underlying idea that employees and/or the public should be in a constant fishbowl of all-seeing, all-hearing surveillance is terrifying, especially at a time when civil liberties are threatened by creeping authoritarianism," Guariglia says.
But Airez doesn't decide to act on a potential crime; it just notes when something weird is going on. DeBoard himself says concerns about AI are warranted, but should be directed toward the tech's creators and the industry's power holders, whose motives the public should interrogate. He cites the recent public killing of Charlie Kirk and the fatal crowd crush at Travis Scott's 2021 Astroworld festival as examples of where Airez could have intervened. "For us, it’s not about normalizing surveillance — it’s about making the data that’s already there work smarter to protect people’s lives."
And what about this growing anxiety around safety in public spaces? Could artificial intelligence allay such worries? "There's not a strong correlation between fear and actual crime. The fear almost always outruns actual crime statistics," says West. "And I don't think an AI tool is going to reduce people's fears."