Jay Stanley, Senior Policy Analyst, ACLU Speech, Privacy, and Technology Project


The cloud Automatic License Plate Reader (ALPR or LPR) company Flock is building a dangerous nationwide mass-surveillance infrastructure, as we have been pointing out for several years now. The problem with mass surveillance is that it always expands beyond the uses for which it is initially justified — and sure enough, Flock’s system is undergoing insidious expansion across multiple dimensions. If your community adopts this technology, you need to know it’s doing more than just recording what car is driving where and at what time. It’s worth stepping back and looking at an overview of what’s going on.

The company’s surveillance data is being used by ICE.
First, as has received wide attention, this system is being used by ICE to help carry out the Trump Administration’s abusive removal program.

Flock sells its cloud-connected cameras to police departments and private customers across the nation, pulls the license plate readings they collect into its own servers, and allows police to run nationwide searches of the resulting database, giving even the smallest-town police chief access to an enormously powerful driver-surveillance tool. The tech news outlet 404 Media obtained records of nationwide searches, which include a field in which officers list the purpose of their search. These records revealed that many of the searches were carried out by local officers on behalf of ICE for immigration purposes, including its notorious Enforcement and Removal Operations division. Emails from police departments in Oregon also shed light on how local police are providing informal assistance to ICE.

It’s safe to say that even many who support the use of ALPR programs by their local police to catch local criminals do not support funneling the data that is collected to the Trump Administration and those carrying out its abusive and often unlawful immigration program.

A search for a recipient of an illegal abortion
The same kinds of police department logs that revealed ICE’s access to Flock’s dragnet also revealed that a police officer in Texas used the system to search nationwide for a woman who’d had a self-administered abortion — illegal in the state. An abortion rights group told 404 Media that, based on calls to their hotline, already “there is an overwhelming fear” among women that they’re “being watched and tracked by the state” — and such reports are hardly going to help. This mass surveillance tool is creating fear among those targeted by immigration, anti-abortion, and other regressive actions, but eventually everyone will become aware that their movements are being tracked. That’s no way to live in a democratic society.

Plugging in to data brokers
Meanwhile, as police around the nation expand their uses of this surveillance machinery, Flock is expanding the power of the system itself. For example, the company is planning to plug its systems into commercial data brokers that offer services such as “people lookup.” Flock has long claimed that its LPRs don’t collect personally identifiable information, as if license plates can’t easily be connected to specific people. That claim was always bogus, but the company now makes the falsity explicit, boasting that its new product will let police “jump from LPR to person.”

In the 1970s, after some government agencies were found to be compiling Stasi-like dossiers on people not suspected of any involvement in crime, Congress enacted the Privacy Act, banning agencies from such recordkeeping. Yet the ethically shady and frequently inaccurate data broker industry does basically the same thing, and when law enforcement becomes a customer of those data brokers, it represents an end run around the law. By tying its LPR data together with data brokers, Flock is effectively automating and scaling the end run around our checks and balances that law enforcement data broker purchases represent. (A proposal called the Fourth Amendment Is Not For Sale Act that would ban this practice was passed by the U.S. House in 2024, but was blocked in the Senate.)

From still to video, and with AI
In another major expansion, Flock is turning its plate readers into surveillance cameras. The company has announced that police departments will soon be able to obtain not just still photos from ALPR cameras, but also video, with the ability to request live feeds or 15-second clips of cars passing by the cameras. And Flock is using AI to let law enforcement search through that data using natural language searches. The company uses the example of searching for “landscaping trailer with a ladder,” but we have to assume searches could encompass descriptions of anything captured by one of their cameras, including vehicle occupants and bystanders.

We recently wrote about how generative AI is turbo-charging video search and surveillance, and this is an example of that trend. Imagine that a police officer stood on your street writing detailed notes about you every time you drove or walked by: all the details about what your car looks like (make, model, color, distinguishing characteristics, bumper stickers, etc.), as well as details about visible occupants and pedestrians — how many, at what time, their activities, demographic data, what they are wearing, attributes they may have such as a beard, hat, tattoo, or t-shirt, and what that hat, t-shirt, or tattoo might say. Now imagine an army of police officers doing this on every block.

This is the surveillance world that Flock is building.

Creating an infrastructure for corporate blacklisting and surveillance
In June, Flock also announced the launch of a “Flock Business Network,” a “collaborative hub designed to help private sector organizations work together to solve and prevent crime.”

This will sound ominous to anyone familiar with the very long history of private companies and government agencies working together to create watch lists, blacklists, and databases about people in the United States. In the heyday of the labor movement (and perhaps today), organizers were commonly blacklisted as “troublemakers,” and could have trouble getting a job as their names were shared among employers. During the civil rights, antiwar, and other social justice movements of the 20th century, a number of private databases were created by shady collections of right-wing vigilantes and super-patriots who took it upon themselves to compile dossiers on activists they disagreed with. These private databases, such as the San Diego Research Library and the Western Goals Foundation, were shared with police and government security agencies and took on quasi-official roles in the efforts of police “intelligence” arms to combat those progressive movements, while remaining outside the normal checks and balances of government.

Today, face recognition technology threatens to make these lists easier than ever to create and administer — and so does license plate surveillance. In its announcement, Flock boasted that its service would allow companies to “add vehicles to Flock Hotlists… so any user subscribed to that Hotlist is alerted the next time that vehicle is detected by a Flock LPR,” giving the private sector “the power of a shared network to identify threats.” Elsewhere Flock says, “By sharing insights and intelligence, companies can identify patterns, suspects, and criminal networks that might not be apparent to a single security team.”

Investigating criminals should be the job of law enforcement, not big companies that have strong incentives to use these infrastructures against labor activists, disfavored customers, and others — to use them not to fight crime, but to protect the bottom line.

Generating suspicion
Finally, as I recently wrote, Flock has also introduced AI analytics products that shift the company from providing tools for officials to use in investigating suspicion to generating suspicion itself. Because the company funnels plate reads from customers across the nation into its own centralized database, it is able to run analytics on that dataset. One such analysis service that it has begun selling attempts to identify “large-scale criminal activities” by scanning the movement patterns of all vehicles contained in its dataset and alerting law enforcement to those that its algorithm decides are “suspect.”

Overall, this explosion of new uses is what happens when you build an authoritarian tracking infrastructure — it expands in more and more ways. State legislatures and local governments around the nation need to enact strong, meaningful protections of our privacy and way of life against this kind of AI surveillance machinery.

 

Date: Saturday, August 16, 2025 - 12:45pm

Teaser: Build it (an authoritarian tracking infrastructure) and they (expanded uses) will come.


The Trump Administration’s targeting of immigrants, combined with growing awareness that ICE is making use of local automated license plate reader (ALPR) systems in its mass deportation efforts, appears to be fueling growing opposition to such driver-surveillance programs around the country.

As I discussed in my Monday overview of the company, the most prominent seller of ALPR cameras, Flock, allows police departments to make the data from their plate readers available to other departments across the country in what amounts to a giant nationwide warrantless surveillance infrastructure. In May, we learned that national searches are being carried out by local officers on behalf of ICE for immigration purposes.

This data sharing shouldn’t have been surprising; ICE has long shown an interest in using this technology to locate people. In 2019, the ACLU of Northern California explained how ICE was using a system run by Flock competitor Motorola to target immigrants — in part by making requests to friendly local officers who had access to the company’s giant location database.

Higher stakes
Still, this newest revelation seems to be reverberating across the country, fueling debates over the technology. That’s likely due in part to greater awareness of the general privacy problems with the technology. But it’s definitely also because the stakes are higher given the Trump Administration’s abusive and illegal (and unpopular) anti-immigrant program, which has targeted not only undocumented immigrants but also green card holders, others with various forms of legal status, and naturalized citizens.

Adding to the angst over LPR programs has been the revelation that a police officer in Texas used Flock’s system to search nationwide for a woman who’d had a self-administered abortion — illegal in the state. LPR data can be abused not just by ICE, and not just for immigration purposes, but for other purposes too. The Trump Administration has been going after all manner of perceived enemies, and there are many scenarios in which location data could be used to compromise someone.

I warned about ICE usage of Flock in a 2022 white paper, where I quoted Flock CEO Garret Langley’s response when asked by Vice News whether Flock could be used for immigration purposes: “Yes, if it was legal in a state, we would not be in a position to stop them,” adding, “We give our customers the tools to decide and let them go from there.” Of course, if your town’s data is being shared across the nation, you don’t actually have the tools to decide — anyone with access to that data could share it with ICE. Or anti-abortion state officials. Or others.

Policymakers need to recognize that the boundaries between local surveillance and the Trump Administration (as well as malevolent administrations in other states) are porous and hard to maintain — that if you collect local data on your residents’ comings and goings, it will be hard to keep that from being used in unintended ways. And recognition of this does seem to be growing. Examples in recent months include:

  • In Gig Harbor, Washington, the police chief found himself being grilled by city council members over ICE data sharing. “You sent some information I think about the city of Puyallup and they said that I believe they belong to 593 networks,” asked one. “So … if we share data with Puyallup what's to keep Puyallup from putting that data all over 593 networks?”
  • In Syracuse, New York, councilors demanded to know how their local LPR data ended up being searched 4.4 million times by police around the country without a warrant and shared with ICE despite promises to the contrary.
  • The Evanston police department revoked out-of-state access to its ALPR database after revelations that the city’s data had been subject to at least seven ICE searches, despite a state law that bans data sharing with ICE.
  • Austin officials (shown in photo above questioning police officials) decided not to renew the city’s contract with Flock. “Austin should not be participating in Trump’s mass surveillance programs,” declared Council Member Mike Siegel.
  • In Denver the city council rejected a Flock contract proposal unanimously. “We acknowledge that today’s environment is much different than when the pilot began in early 2024,” said the mayor.
  • In Richmond, the police agreed to share LPR data with the Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF). In June they found out that the ATF had been making immigration-related searches without notifying the city. Richmond then cut off the data sharing.

Even in places that ultimately approve Flock contracts, I’m hearing more questioning and opposition from elected officials and community members. And we’re starting to see interest among state legislators in putting new limits on ALPR surveillance. A lawsuit in Virginia rightly claims that LPR networks of a certain density violate the Fourth Amendment by routinely tracking people not suspected of wrongdoing.

I’m not the only one to notice this trend — after I initially drafted this piece, the surveillance research publication IPVM published a similar report, “Flock Facing Rising Opposition,” detailing local opposition across the country.

Behind much of this activity stand local activists working against this program — for example, a coalition pushing back hard on LPR in San Diego, and a man named Will Freeman, who runs a site called “DeFlock” that seeks to map out the exact locations of LPR devices across the country. Flock tried to scare Freeman by sending him a letter threatening legal action, but fortunately he obtained representation by our friends at the Electronic Frontier Foundation, who decisively shut down the company’s unethical intimidation attempt.

Increasing awareness of how data collected locally can be used by ICE and the Trump Administration should also extend to the private customers of companies like Flock. That includes not just customers like homeowner associations but also the managers — and customers — of companies like Home Depot and Lowe’s, which records show are feeding data from hundreds of cameras they maintain to law enforcement through Flock.

The spreading opposition to mass driver surveillance is great, but we need to see a lot more of it. Let’s hope that state legislatures and local governments around the nation not only move in the same direction as these towns and officials, but go on to enact strong, meaningful protections of our privacy against this AI surveillance machinery.

 

Selected ACLU coverage of license plate scanners:

Flock’s Aggressive Expansions Go Far Beyond Simple Driver Surveillance (August 2025)

Surveillance Company Flock Now Using AI to Report Us to Police if it Thinks Our Movement Patterns Are “Suspicious” (July 2025)

Communities Should Reject Surveillance Products Whose Makers Won't Allow Them to be Independently Evaluated (March 2024)

Californians Fought Hard for Driver Privacy Protections. Why Are the Police Refusing to Follow Them? (February 2024)

How to Pump the Brakes on Your Police Department’s Use of Flock’s Mass Surveillance License Plate Readers (February 2023)

White paper: Fast-Growing Company Flock is Building a New AI-Driven Mass-Surveillance System (March 2022)

Report: You Are Being Tracked (July 2013)

Date: Tuesday, August 19, 2025 - 11:00am

Featured image: Police being grilled on Flock at Austin City Council meeting

Teaser: Policymakers are beginning to recognize that the boundaries between local surveillance and the Trump Administration are hard to maintain.
