By Kieran Birse
In 1786, Jeremy Bentham introduced the panopticon, envisioning a circular prison structure designed for constant yet uncertain surveillance in order to control the behaviour of prisoners. The very uncertainty of that surveillance encourages self-discipline. George Orwell's popular and influential novel "1984" warned against pervasive surveillance of this nature, nefariously clawing its way into the minds of citizens to control their thoughts in the service of totalitarian control. I believe ideas like this have led to an ingrained and irrational fear of government overreach and have blinded us to the potential benefits of a public or national panopticon. Could mass surveillance be the key to a utopia of security, justice, and equality, instead of the prison it is currently associated with?
The belief that harsh punishment does very little to deter crime is supported by the US Department of Justice and the UK's College of Policing, both of which emphasise that ‘certainty of apprehension’ is a far superior deterrent. Despite 7.5 million CCTV cameras in the UK, crime persists: the crime rate is currently 76 crimes per 1,000 people, an 8% rise since 2021. Legislation for a national panopticon, blanketing all public and private spaces with surveillance technology linked to a central policing hub, would align with Bentham's panopticon design, with a modern twist. It would provide the total yet uncertain surveillance that encourages self-discipline. Consider the prevalence of domestic abuse: 1 in 5 adults experience domestic abuse in their lifetime. According to the NSPCC, nearly half a million children are victims of abuse every year, and 1 in 20 children in the UK have experienced sexual abuse. A panopticon reaching into private homes could significantly reduce such crimes by introducing a certainty of being caught in the act. The perceived ‘intrusion’ could be justified under exceptions in the Human Rights Act 1998, specifically ‘public safety’, ‘prevention of disorder or crime’, and the ‘protection of health or morals’. Implementing a panopticon might then render a large police force unnecessary, redirecting some of the £23.5 billion police budget towards technology, education, mental health services, and the maintenance of public spaces and institutions.
The UK’s terrorism threat level is currently ‘Substantial’. A panopticon could enhance counter-terrorism efforts, eradicating domestic threats and potentially reducing the UK's £15.1 billion expenditure in this area. Racially motivated incidents, which often follow terrorist attacks, could also decrease. Research suggests that prejudices often stem from a fear of ambiguity or the unknown. A national panopticon could remove that ambiguity by ensuring that illegal activity by individuals (such as terrorism) could not go unseen, levelling the playing field so that citizens are truly treated equally in this regard and thereby curbing fear and prejudice. While some could argue this challenges privacy rights, the exception for ‘national security’ in the Human Rights Act 1998 could justify the surveillance.
Corruption and distrust of the state arise as clear concerns. Trust in the management of surveillance systems is crucial, and human corruption must be eradicated to ensure a panopticon's acceptance within society. With advances in AI and machine learning, artificial entities could replace aspects of human management, offering safer, unbiased, and impartial surveillance. Machine learning's potential to predict crimes and interpret emotions presents a solution to persistent issues in law enforcement. At its core, machine learning involves using algorithms to discern patterns. The National Institute of Justice acknowledges that these algorithms have the potential to match faces, identify weapons and other objects, and “detect complex events such as accidents and crimes”. Additionally, the capability of machine learning to interpret body language, facial expressions, and emotions raises the prospect of lie detection reaching unprecedented levels. There is a plausible scenario in which machines, through predictive modelling, could predict crimes before they occur, but this would require surveillance on the scale of a panopticon to be effective. Privacy concerns could be addressed through GDPR-like controls, for example erasing non-criminal information after review, but trust in such a system would require robust checks and balances.
Mass surveillance is believed to negatively impact creativity and mental health, as the denial of privacy deeply affects human psychology; and as technology continues to advance, so does the ability to trace our every move. Aside from the previous examples, there is also evidence that programmers have been able to measure someone’s job satisfaction through facial recognition, analyse a social media post for signs of depression, or even determine someone’s heart rate through a webcam. In some contexts, the effects of mass surveillance can be as severe as inducing PTSD. There is a clear connection between the health of the human body and the level of privacy one is afforded. Edward Snowden (someone whose career and subsequent charges under the Espionage Act centred on mass surveillance) describes privacy as the fountainhead of all other rights, emphasising the risk of never reaching one's full potential under constant surveillance. A fair application of the panopticon, in its most basic form, should ensure that legitimate private acts stay private; again, this could work in the same vein as GDPR. Machine learning could potentially handle this process, addressing these privacy concerns and minimising the need for human eyes on sensitive information.

Admittedly, this would be incredibly difficult to implement, and arguably the societal challenges could be tackled by much less invasive means, such as addressing poverty, lack of opportunity or educational inequality. A study from 2019 suggests that investment in early childhood education programmes can lead to long-term reductions in crime rates. Solutions can even be found in the use of environmental design as a crime prevention strategy, but can they go far enough to keep everyone safe at all times? The social and economic cost of crime is not an issue that will solve itself, and solutions can only be implemented if governments can muster the political will. More dialogue is needed; more questions need to be asked of ourselves. Today, ‘tough on crime’ is a tired slogan, not a roadmap to a safer future. Let's ask ourselves and our leaders: what can we do better?
Balancing the goal of eradicating crime with the potential drawbacks of a national panopticon requires careful consideration. If implemented with finesse, care, and a morally intelligent use of technology, panopticonism could become an accepted part of the social contract. The question lies in how much freedom we are willing to trade for a potentially safer society, and reassessing aspects of that freedom forces us to confront the trade-offs involved in building such a society through mass surveillance. The key is to ensure a fair and just application, leveraging advances in technology to improve outcomes for citizens while maintaining the essential trust in the system. There may be far less invasive means of addressing crime, but can they go far enough to truly eradicate it?
Kieran Birse is in his second year of studying Politics, Philosophy and Economics part-time at the Open University. He works in the animal health sector and with NGOs to improve the food supply chain in Africa.