Reverse Surveillance: Recalibrating the Panopticon through Government Watch

Domestic Data Streamers


Words by Pau Aleikum, edited by Jaya Bonelli

In today’s ever-connected world, we’re under constant observation. Surveillance cameras dot our urban landscapes, data companies track our digital footprints, and governments worldwide have been known to monitor citizens’ communications. This network of watchful eyes has become part of the fabric of modern life, so embedded in our daily experiences that we barely notice it anymore. Surveillance, from the French word for “watching over”, has traditionally been a tool of power: governments and corporations hold the keys to vast surveillance systems, casting a wide net of control and influence. Ever since Jeremy Bentham’s 1791 work Panopticon; or, The Inspection-House, which describes a prison in which inmates can be watched without knowing whether they are being observed, the power dynamic has been top-down: a central authority monitoring those beneath it.

This is exactly what the French philosopher Michel Foucault went on to criticize in his famous book Discipline and Punish, where he identifies the panopticon-prison with our modern disciplinary society, in which every thought, movement and action is constantly monitored and surveilled, a profound hindrance to the idea of personal freedom.

Panopticon illustration by Adam Simpson for the New York Times

The idea of the panopticon is all the more relevant today, when we see how far surveillance laws have expanded across Europe in the last decade:

In Spain, the use of surveillance systems by police is enabled by the Organic Law on the Protection of Citizen Security, also known as the “Gag Law”. This law was adopted in 2015 and has been criticized for its impacts on freedom of assembly and expression, as well as for targeting journalists covering police actions. The law is currently undergoing a reform process, but there are concerns that the repressive nature of the law has not been adequately addressed in the proposed changes. (Civicus)

In France, the use of surveillance systems by police is regulated by a security bill passed in 2021. This law extends police powers and restricts the publishing of images with the intent to harm on-duty police officers. It also gives more autonomy to local police and extends the use of surveillance drones. Critics have raised concerns about the potential impact of the law on civil liberties and its potential to enable abusive legal proceedings. (France24)

In 2016, the European Court of Human Rights (ECHR) found that broad secret surveillance activities conducted by the Hungarian Anti-Terrorism Task Force violated the rights of the applicants, citing Hungary’s failure to provide judicial oversight over Task Force actions and other sufficiently precise and effective safeguards. The applicants, two Hungarian lawyers, challenged the country’s 2011 anti-terrorism legislation due to its permitting sweeping secret surveillance activities. After the Hungarian Constitutional Court dismissed their case, the lawyers filed a complaint with the ECHR, which ruled in their favour. (LOC)

In the UK, protest policing has long been governed by the Public Order Act 1986. The Police, Crime, Sentencing and Courts Bill, proposed in 2021, has led to controversy due to its potential impact on the right to protest: it would give the police more power to set conditions on non-violent protests, including those deemed to be a nuisance. Critics argue that this infringes on the right to peaceful protest.

Flipping the script

Yet as technology advances, so does the nature of surveillance, and so does who is doing the watching. Improved access to technology, such as the ubiquity of smartphones and the internet, has given citizens the means to reverse this gaze. Reverse surveillance, or sousveillance, a term coined by Steve Mann (Monahan, T. 2006) from the French “sous” (meaning “under”) coupled with “veillance” (meaning “watching”), is the act of monitoring those traditionally in positions of power: a subversive vigilance from “underneath”, involving the recording, monitoring, study or analysis of surveillance systems and their proponents. This phenomenon represents a significant shift in the dynamics of surveillance and a potential counterbalance to the overwhelming “top-down” power of the Panopticon.

Sousveillance with wearable cameras: Comparison of surveillance and sousveillance devices. From “Sousveillance,” by S. Mann, 1998. Copyright 1998 by Steve Mann. Retrieved from http://wearcam.org.

The project

This project comes as an answer to this growing mobilization against surveillance laws, and continues The Flemish Scrollers, first developed by the creative technologist Dries Depoorter in 2021. In that project, he used the YouTube channel of the Flemish government in Belgium to track the moments when representatives got distracted by their phones.

Dries Depoorter, 2021

In our case, we extend the scope of the project by making use of the widespread access to online streaming cameras in different parliaments (Spanish, French, UK and more). The point is to subvert new computer vision technologies, generally used by governments in acts of surveillance, into tools that can help us detect behaviours and moments of disengagement during government sessions, such as sleepiness, signs of boredom or absences, thus holding politicians accountable for their actions (or lack thereof).

First tests on Reverse Surveillance. DDS May 2023

This project, while seeming to be an exercise in voyeurism, is deeply political in nature. It is about keeping in check those in power, demanding transparency, and invoking the right of citizens to know and understand the actions of their representatives. It is an exercise in democratic vigilance, a counter-measure against the apathy and indifference that can creep into any political system.

Risks

When assessing the risks of this project, two stand out. First, there’s a risk that the data might be misinterpreted or used out of context: a politician appearing disinterested or tired might be so for reasons unrelated to their dedication to the job, and the images we create could easily be manipulated to serve the political agendas of other parties. Second, such a project, presented without the necessary context, could contribute to the normalization of constant surveillance, potentially aggravating the erosion of privacy norms in our societies.

Future debate

Research and public discussion on surveillance and data breaches have been around for a while now. Already in 1997, Latanya Sweeney, then a graduate student at MIT, ran an experiment to see whether she could identify the Governor of Massachusetts using information that was available to the public; she succeeded in finding his medical records. The experiment had a big impact on privacy laws, including the Health Insurance Portability and Accountability Act (HIPAA), which protects people’s health information.

Then again, in today’s day and age, government surveillance is no longer the most important breach in the Panopticon. For instance, a special article from the New York Times, One Nation, Tracked, published in December 2019, uncovers the pervasive location-tracking practices of numerous largely unregulated companies. These firms collect and store location data from tens of millions of mobile phones, creating extensive data files. One such file, obtained by The Times Privacy Project, contained over 50 billion location pings from the phones of more than 12 million Americans across major cities in 2016 and 2017. This data was not derived from telecoms, tech companies, or government surveillance but from location data companies, which can track individuals from almost every neighbourhood and block, making it virtually impossible to avoid such surveillance. And since there are no federal laws limiting this vast human-tracking trade, protections against misuse often rest solely on internal company policies.

Companies argue that people consent to be tracked and that the data is anonymous and secure. However, the New York Times refutes these claims, stating that it’s relatively simple to connect real names to the anonymized data points on the maps and that precise, longitudinal geolocation information is impossible to anonymize.

That brings us to the question: What would be the “sous-veillance” of private tech companies? How can we build reverse surveillance mechanisms for things out of the public eye?

ANNEX: Technical Development

This project extends our ongoing research on AI while aiming to create an open, replicable methodology that can be adopted by fellow artists and activists worldwide. Here, we detail the core technologies employed and provide resources to facilitate their implementation in other contexts.

Our goal is to provide an open, repeatable process for identifying moments of inactivity — like sleeping — in parliamentary streams. Below is a step-by-step guide to help artists and activists adapt this approach in their own contexts, along with ideas for alternate tools that might work better depending on local needs or hardware constraints.

1. System Overview: we start with a YouTube link to a parliamentary broadcast (for example, the European Parliament). The framework downloads the video and reads key metadata (frame rate, duration). From there, it detects human figures, filters for meaningful moments where at least two individuals appear, and flags potential sleeping behavior through visual question-answering. Finally, the system uses Grad-CAM overlays to highlight which parts of each frame informed the sleeping classification.

2. Video Retrieval: use youtube-dl (https://github.com/ytdl-org/youtube-dl) or a similar command-line tool to pull the video locally. Then, extract metadata with FFmpeg (https://ffmpeg.org/). Pay attention to frames per second (FPS) so you can plan how many frames to process.

3. Person Detection: feed the video frames into Tiny YOLOv3 (https://pjreddie.com/darknet/yolo/) or an equivalent lightweight detector. If YOLOv3 feels like overkill, consider MobileNet SSD (https://github.com/chuanqi305/MobileNet-SSD) as a smaller alternative. Track how many people are in each frame. Store only frames where two or more individuals appear. This can reduce CPU/GPU usage and speed up the workflow.
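Steps 2 and 3 reduce to two small pieces of logic that can be sketched independently of any particular downloader or detector: parsing the frame rate that FFmpeg’s ffprobe reports (it comes back as a ratio string such as "30000/1001" in the `r_frame_rate` field) and keeping only the frames where enough people appear. A minimal sketch, with function names of our own choosing:

```python
def parse_frame_rate(ratio: str) -> float:
    """Convert an ffprobe frame-rate ratio such as "30000/1001" to FPS."""
    num, den = ratio.split("/")
    return int(num) / int(den)


def frames_of_interest(detections_per_frame, min_people=2):
    """Keep indices of frames where at least `min_people` persons were detected.

    `detections_per_frame` is a list of per-frame detection lists, each
    detection a (label, confidence) tuple as a detector like Tiny YOLOv3
    might produce after decoding its output.
    """
    return [
        idx
        for idx, detections in enumerate(detections_per_frame)
        if sum(1 for label, _ in detections if label == "person") >= min_people
    ]
```

Filtering before any expensive model call is what keeps the pipeline cheap: a multi-hour broadcast shrinks to the handful of frames actually worth analyzing.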

4. Sleep Identification: feed the selected frames into BLIP (https://arxiv.org/abs/2201.12086) using the visual question-answering feature. Ask a direct query like: “Are there any sleeping politicians in this image?” BLIP provides a quick yes/no answer or a short description. Retain images it flags as containing someone sleeping. If needed, adjust the prompt to refine results (e.g., “Does this image show anyone dozing off?”).
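The prompt-and-parse side of this step can be sketched separately from the model itself. In the sketch below the actual BLIP visual question-answering call is injected as a plain function (frame path and question in, answer string out), so the selection logic stays model-agnostic; `SLEEP_QUESTIONS` and `flag_sleeping_frames` are illustrative names of our own, not part of BLIP’s API:

```python
# Prompts from step 4; add or reword these to tune recall.
SLEEP_QUESTIONS = [
    "Are there any sleeping politicians in this image?",
    "Does this image show anyone dozing off?",
]


def flag_sleeping_frames(frame_paths, vqa, questions=SLEEP_QUESTIONS):
    """Keep frames where the VQA model answers 'yes' to any sleep question.

    `vqa` is a callable (frame_path, question) -> answer string, wrapping
    whatever BLIP implementation you use.
    """
    return [
        path
        for path in frame_paths
        if any(vqa(path, q).strip().lower().startswith("yes") for q in questions)
    ]
```

In practice `vqa` might wrap, for instance, the Hugging Face transformers port of BLIP (`BlipForQuestionAnswering` with the `Salesforce/blip-vqa-base` checkpoint), but any implementation that returns a short answer string will slot in.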

5. Human Validation: even with good AI tools, mistakes happen. Check flagged frames manually to confirm that someone looks genuinely asleep. This step prevents overcounting or mislabeling. Multiple team members can vote on each flagged frame. If at least two people agree, move the frame to the next step.
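The two-reviewer agreement rule in step 5 is simple enough to encode directly. A sketch (the naming is ours), mapping each flagged frame to its reviewers’ votes:

```python
def confirmed_frames(votes_per_frame, threshold=2):
    """Keep frames where at least `threshold` reviewers voted 'asleep'.

    `votes_per_frame` maps a frame identifier to a list of booleans,
    one per reviewer.
    """
    return [
        frame
        for frame, votes in votes_per_frame.items()
        if sum(votes) >= threshold
    ]
```

Only frames surviving this filter move on to the Grad-CAM step, which keeps false positives out of anything published.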

6. Grad-CAM Overlay for Interpretability: pass confirmed frames into ALBEF (https://arxiv.org/abs/2107.07651) with Grad-CAM activated. Provide a caption like “A sleeping person in a parliamentary session.” ALBEF highlights the face or posture that triggered its sleeping detection. Save the overlay images separately. This makes it easier to present the system’s reasoning to the public.

7. Hardware and Environment: if you plan to do large-scale scans (several hours of video a day), a machine with a decent GPU is helpful. Google Colab (https://colab.research.google.com/) or Kaggle Notebooks (https://www.kaggle.com/) can offer GPU resources. Keep track of GPU usage and runtime. Processing thousands of frames can be time-consuming.

8. Alternatives and Extensions: if bandwidth is limited, sample frames at fixed intervals (e.g., every second or every five seconds). For better face detection or posture analysis, consider RetinaFace (https://github.com/serengil/retinaface) or OpenPose (https://github.com/CMU-Perceptual-Computing-Lab/openpose). For real-time detection, explore a streaming approach with a lightweight model like YOLO Nano or MobileNet variants, then apply the Grad-CAM step offline. If you worry about privacy issues, you can blur the faces of non-relevant participants or anonymize identifying details in post-processing.
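The fixed-interval sampling suggested above cuts the workload by orders of magnitude: one frame per second of a 25 FPS stream keeps only 1 frame in 25. A minimal sketch of the index arithmetic, under our own naming:

```python
def sample_indices(total_frames, fps, every_seconds=5.0):
    """Frame indices sampled once every `every_seconds` of video."""
    step = max(1, round(fps * every_seconds))
    return list(range(0, total_frames, step))
```

For a three-hour 25 FPS session (270,000 frames), five-second sampling leaves roughly 2,160 frames to push through the detector.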

9. Practical Tips: keep a clear record of timestamps. This makes it easier to produce highlight reels or summaries. Store the original frames in a secure location. If needed, produce a separate set of images with watermarks or anonymized faces for public release. Consider publishing your code and dataset on GitHub or a similar repository to invite feedback, replication, or improvements from a wider community.
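For the timestamp records in step 9, converting a frame index back to a wall-clock offset only needs the FPS. This helper (again, our naming) renders an HH:MM:SS.mmm string suitable for highlight reels or video-editor markers:

```python
def frame_timestamp(index: int, fps: float) -> str:
    """Render a frame index as an HH:MM:SS.mmm offset into the video."""
    seconds = index / fps
    hours, rem = divmod(seconds, 3600)
    minutes, secs = divmod(rem, 60)
    return f"{int(hours):02d}:{int(minutes):02d}:{secs:06.3f}"
```

Storing these strings alongside each flagged frame makes it trivial to jump back into the original broadcast and verify context before publishing anything.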

Feel free to use this approach in other spaces, from smaller city council meetings to corporate board streams, or any other setting where accountability is a priority.

References:

  • Streaming Video Sources
  • Surveillance Video Dataset
  • Monahan, Torin (2006). Surveillance and Security: Technological Politics and Power in Everyday Life, p. 158. ISBN 9780415953931.
