
2025 became a turning point in the debate on digital surveillance. Investigations by the Electronic Frontier Foundation (EFF) into Flock Safety revealed that automated license plate recognition (ALPR) systems have long moved beyond “public safety” tools and evolved into an infrastructure of mass control—open to abuse, discrimination, and political pressure.
This is not about isolated incidents or “misuse.” The problem lies in the very architecture of the system.
By analyzing logs of more than 12 million search queries conducted by nearly 4,000 law enforcement agencies, EFF identified a clear pattern: Flock’s system was actively used to monitor protests.
Police tracked participants in mass demonstrations—from the 50501 protests in February to the No Kings movement in June and October. Queries explicitly or implicitly referenced protest activity, and in some cases used neutral wording to mask surveillance of the constitutional rights to free speech and assembly.
Particularly revealing was the targeting of Direct Action Everywhere, an organization that exposes cruelty in industrial animal agriculture through civil disobedience. ALPR enabled law enforcement to precisely track activists’ movements—without warrants or judicial oversight—because they challenged powerful industries.
This is a classic chilling effect: when people know their movements can be recorded, they are less likely to participate in protests.
One of EFF’s most disturbing findings was large-scale ethnic discrimination amplified by technology.
More than 80 law enforcement agencies used ALPR queries containing racist and stereotypical language targeting Roma people. Searches such as “Roma traveler” or openly derogatory terms were executed without any link to criminal activity. The Convoy feature allowed tracking groups of vehicles—effectively entire communities—based solely on ethnic identity.
This is not merely individual bias. It is an example of how digital systems automate and scale discrimination, making it faster, cheaper, and less visible.
Another defining issue of 2025 was the use of ALPR in abortion-related cases. In Texas, law enforcement used the Flock network under the pretext of a “missing persons investigation,” which in reality targeted a woman who had a self-managed abortion.
A single query labeled “had an abortion, search for female” unlocked access to more than 83,000 cameras nationwide. No warrant. No meaningful oversight. Maximum intrusiveness.
This case highlights a core danger: a centralized surveillance network turns local persecution into a nationwide operation.
In response to criticism, Flock Safety announced data retention limits, geofencing, and other “privacy-friendly” features. But as EFF rightly notes, cosmetic changes do not cure a systemic disease.
The issue is not configuration—it is the business model. Flock profits from building a single, interconnected surveillance network, where every new customer amplifies risks for everyone else.
In 2025, Flock announced expanded functionality: always-on microphones capable of detecting “human distress,” including screaming. This marks a shift from visual monitoring to audio surveillance in public spaces.
Such systems create serious risks to privacy and civil liberties.
After public backlash, Flock removed the word “screaming” from its marketing materials—but did not abandon development of the technology. This is a textbook case of regulatory theater, not real accountability.
2025 also showed something else: communities can push back. Following EFF’s reporting, cities from Austin to Evanston terminated contracts with Flock. In California, a lawsuit was filed over millions of warrantless searches. Federal and state authorities launched investigations.
This proves that technology is not inevitable. It exists only as long as society consents to its use.
The situation described in the Flock Safety case in the U.S. is particularly relevant for Ukraine for several reasons. First, since 2022, Ukrainian cities and government agencies have been actively deploying digital video surveillance, facial recognition, traffic analysis, and "smart city" solutions for public safety. This creates a risk that technologies intended for law enforcement may transform into tools of mass control without proper legal and ethical safeguards.
Second, Ukraine’s legal framework for privacy and personal data protection is still developing. The EU’s GDPR and global cases like Flock Safety show that even in countries with long-standing traditions of protecting citizens’ rights, mass surveillance systems are quickly used for discrimination or political pressure. In Ukraine, without clear rules and procedures, “red lines” can be easily crossed, endangering the rights of minorities, activists, and participants in protest movements.
Third, the risk of a technological “cold trail”—when data is collected centrally and remains accessible to authorities or private companies—is particularly high in Ukrainian regions with unstable security situations. Using ALPR systems to track people or vehicles without judicial oversight can become a tool of pressure on political activity, freedom of movement, and the exercise of civil rights.
Finally, Ukrainian society already demonstrates a high level of digital awareness: activists, human rights defenders, and media monitor the use of technology by police and municipalities. Contemporary cases like Flock Safety serve as a signal for developing national rules—from transparency in video surveillance use to limits on data storage and sharing. Without such regulations, any innovation in the “smart city” domain could become not a tool for safety, but a mechanism of control and potential abuse.
Thus, the lessons of 2025 for Ukraine are clear: implementing public safety technologies without clear legal frameworks creates risks of human rights violations, discrimination, and political pressure. ALPR and mass surveillance are not just technological issues—they are political and social. Ukraine must prevent the “normalization” of total control before it becomes part of everyday life.
Flock Safety is not about safety. It is an example of how, under slogans of “efficiency” and “crime prevention,” a surveillance infrastructure incompatible with democratic values is built.
For Europe and Ukraine—where video analytics and “smart city” systems are rapidly expanding—the lesson is clear:
architectures of total surveillance will always be used against the most vulnerable—protesters, minorities, women, and activists.
The question is no longer whether abuses are possible.
The question is whether society is willing to stop them before they become the norm.
Contact us: business@avitar.legal
Violetta Loseva