The Pitfalls of Predictive Policing in Minority Report
With the rise of artificial intelligence and the use of personal data to enhance algorithms,
law enforcement agencies in the United States and around the world have begun to rely on
algorithmic crime-reporting systems that forecast potential criminal activity and drive police
operations. These systems are marketed as neutral because they draw on historical crime reports, policing trends, and machine learning models to identify high-risk communities and individuals. However, as their application grows, critics highlight that these systems reinforce longstanding racial biases within policing and are used in a manner that undermines autonomy, democracy, and fundamental Western legal principles. Minority Report (2002), directed by Steven Spielberg, offers a perspective on the ethical
problems associated with predictive policing. The film takes place in a future Washington, D.C.,
where the policing system operates under “Precrime,” an arrest system in which citizens are
arrested based on predictions of their likelihood to commit a crime, sourced from three
genetically modified beings called the “precogs.” The film, along with scholarly research, presents the ethical dilemmas of contemporary algorithmic policing systems, highlighting that the very implementation of predictive policing models is ethically unjust. Minority Report exposes how the use of predictive policing models, in both the fictional and contemporary contexts, reduces humans to lines of data, creating a technocratic system that devalues autonomy and consent and amplifies structural inequalities through over-policing.
Ethical Foundation Conflicts Within the Legal System of Minority Report:
Central to the ethical problems highlighted in The Minority Report is the failure of the
predictive policing model to uphold the values of autonomy, legal due process, and justice that
are foundational to Western legal systems. Spielberg’s Minority Report presents the philosophical
foundation of the Western legal system regarding the relationship between free will,
self-determination, and the shift from punitive justice to preemptive policing. The PreCrime system raises major ethical concerns under three foundational approaches to Western legal ethics: deontology, utilitarianism, and distributive justice. The first concern centers on agency and autonomy, as PreCrime directly conflicts with deontological ethics, the moral theory grounded in rule-based principles of right and wrong. Scholars Salvi and Nigri argue that predictive policing presents a deterministic approach to crime, in which the possibility of future action is treated as an undeniable fact rather than a statistical probability (Salvi & Nigri 8). In the film, the use of the PreCogs aligns with risk assessments in
contemporary models of predictive policing. Once a violent crime is forecast and attributed to a suspect, that suspect automatically loses all rights to agency, and the right to self-determination is eroded. Chief Inspector John Anderton describes the PreCogs as “pattern-recognition filters” who “are connected to computer-based neurotechnologies that can stream their thoughts as images containing details of events” (Krahn, Fenton, and Meynell 76), denoting that risk assessments reduce humans to patterns of existing behavior.
Scholars Salvi and Nigri argue that although potentially efficient, “the potential for algorithmic discrimination in the equitable application of criminal justice is a strong risk” (Salvi & Nigri 9). As such, the predictive policing model can erode autonomy in favor of efficiency.
Spielberg’s portrayal of John Anderton highlights the moral crisis at the heart of this issue. Although Anderton was a respected and prominent police officer in Washington, D.C., the moment the PreCrime system predicted that he would murder Leo Crow, he lost all social status and was branded a criminal, without having committed the crime or shown any sign of intent. Thus, the application of predictive policing within the film shifts the foundational presumption of innocent until proven guilty to guilty until proven innocent, as due process is abandoned and a trial is deemed unnecessary. As scholars such as Krahn, Fenton, and Meynell highlight, these predictive policing technologies directly violate the deontological principle that punishment requires an actual act, a principle on which Western legal systems were built (Krahn, Fenton, and Meynell 83).
Furthermore, predictive policing models raise serious concerns under utilitarian ethics, which focuses on maximizing overall societal welfare. The PreCrime system initially appears to yield stark benefits within Washington, D.C., as it is touted as having made the city “murder-free for six years,” yet as the film progresses it reveals the inherent negatives of predictive policing models (Krahn, Fenton, and Meynell 82). As depicted in the film, the PreCogs exist in a perpetual state of suffering, exploited for the betterment of society regardless of their condition. Scholars such as Cynthia Bond suggest that the PreCrime system’s utilitarian approach to creating a surveillance state reflects the modern use of predictive policing models that exploit existing data from marginalized communities in the name of technological precision. The system is ultimately exposed as fallible, showing that its utilitarian justification rested on a false sense of fact, much like contemporary precrime systems.
Finally, the film highlights foundational concerns of distributive justice in the widespread use of predictive policing. Throughout the film, the PreCrime system creates a binary class structure in which those untouched by the PreCogs are seen as inherently better, while those persecuted by the system are treated as lesser. This creates an inherently unjust foundation for the legal system, in which a person, simply for a predicted potential to commit a crime, is deemed lesser and denied the rights afforded to everyone else. Bond highlights that the film presents law as a “visual regime” in which the justice system is, in fact, unjust (Bond 32). The use of drone searches, biometric identification scans, and the constant uncertainty of appearing in a PreCog vision shifts the Western punitive system from reactive to proactive. Batey notes that PreCrime, like modern algorithmic policing systems, reinforces divisions between those who hold institutional power and those who do not. As such, the application of this model fundamentally fails to meet the ethical standards on which the Western legal system was founded.
The Transformation of Humanity into Data Insights Through Technocentrism:
A more profound ethical dilemma embedded within Minority Report is the use of predictive policing through a technocentric lens. In line with PreCrime’s failures to uphold the values of the
Western legal system, there are strong ethical concerns about its foundational nature and how,
like modern-day algorithmic systems, it is built on faulty, unethically sourced insights. Within
the movie, the PreCogs are presented as exploited subjects whose humanity is overlooked in the
pursuit of raw data for the security of the masses. Scholars suggest that the condition of the PreCogs is a profoundly unethical aspect of the surveillance state, one that mirrors modern policing systems. Krahn, Fenton, and Meynell highlight that the PreCogs’ perpetual state of suffering, in which their bodies are treated as mere inputs to the legal system, reflects how the suffering of heavily policed communities is likewise reduced to mere data (Krahn, Fenton, and Meynell 75). Nevertheless, the suffering of the PreCogs is marketed as inherently good, as their output serves the masses.
Physically and mentally enslaved, suspended in a milk bath, the PreCogs have virtually no autonomy and are stripped of all semblance of personality. Their conditions reduce them to machines, not people, and alone highlight a moral dilemma: the elimination of human agency for the public good in the name of technological innovation. Within this model, the PreCogs’ forecasting abilities stem from severe neurological damage they suffered as the children of drug addicts. Their exploitation reflects how the system is predicated on suffering rather than genuine scientific innovation, denoting a prominent strain of technocentrism and techno-optimism that treats technology as a cure-all for crime. The PreCogs do not volunteer their information; their traumatic visions are forced upon them through involuntary experiences of pain. Irrespective of their condition, the PreCrime police force takes these involuntary visions, converts them into data, and derives actionable insights and legal predictions of violent crime. This directly parallels modern data-cultivation practices, which ignore the conditions of those whose experiences generate crime data and yet act on that information; because these practices never seek the consent of the communities they surveil, the inherent flaws of technocentrism reveal a failure to uphold any moral grounding.
Central to the dehumanization within the predictive policing model is the stark depersonalization in the police force’s relationship with the PreCogs, which reflects the broader philosophy of technocentric policing. Within the film, Anderton refers to Agatha (one of the PreCogs) as “the Senior” rather than by her actual name, no differently than one would refer to a computer program. Director Lamar Burgess likewise considers the PreCogs valuable assets rather than people, viewing them as tools rather than individuals. In Bond’s analysis of the film, PreCrime’s use of the PreCogs embodies a legal framework that values humans only so long as they hold use value within the justice system. This framework resonates in the contemporary context, where minority populations are often dehumanized into points of data for predictive policing models, their traumatic experiences under discrimination extracted and weaponized in much the same capacity as the PreCogs’ visions.
PreCrime and the Erosion of Free Will:
The application of PreCrime not only represents technocentrism but also follows a logic strikingly similar to that of modern big-data policing practices and their erosion of free will. The film’s opening sequence depicts the arrest of Mr. Marks moments before he attempts to kill his wife upon discovering her affair. The sequence presents the ritual of officers swarming the house, Anderton apprehending the man, and the predicted crime being read aloud. Marks pleads his innocence but is jailed without a trial for a crime that was never actually committed, since he never murdered his wife. Spielberg uses this scene and the ritualistic nature of the arrests to examine the absurdity of a system centered on the criminalization of thoughts rather than actions. As such, the arrest is fundamentally unjust, as no action has been committed. This largely reflects Brayne’s emphasis on modern “Big Data” platforms such as Palantir’s Gotham, which combines arrest records and gang affiliation with geographic data to promote the heavy policing of often minority communities, punishing residents for past criminal records rather than legitimate intent to commit crimes (Brayne 1006). The fragility of this reasoning is exposed when John Anderton, a devout rule follower, is himself predicted to commit a murder; the film thereby raises the central philosophical issue of predictive policing: whether individuals, despite what the data may suggest, retain the ability to choose different actions irrespective of the predictions. Although the PreCogs are touted within the film as having a one hundred percent crime avoidance rate, much as Palantir’s Gotham is marketed under the guise of “objective” truth, such systems erode free will, and their findings can never be verified, since no one can know whether a predicted crime would actually have come to fruition had the PreCrime police not intervened. This is reflected in Anderton’s desperate attempts to escape his predicted fate, which highlight the possibility of an individual acting in opposition to their prediction, and thus against the very ideology on which the system was built.
The Consent Paradox:
In tandem with the erosion of free will, an ethical dilemma arises regarding consent to predictive policing systems. In both contemporary society and Minority Report, individuals are subjected to invasive forms of surveillance without the chance to provide meaningful, revocable, or informed consent to the system’s place in their lives. In the film, the citizens of Washington, D.C., live under a biometric tracking system that identifies individuals through retinal scanners, pushes targeted advertisements, and exploits the uncertainty of persistent PreCrime surveillance. Despite this, none of the citizens have given any semblance of approval or outward consent to being subjected to this surveillance state. The film’s futuristic depiction of Washington, D.C., portrays a society in which public consent is largely irrelevant to government operations, as the policing system’s invasiveness is presented as a universal public good. This ethical issue is mirrored in contemporary policing platforms such as the United Kingdom’s Xcaliber and Palantir’s Gotham, which collect unprecedented amounts of personal data, including movement patterns, arrest histories, gang databases, personal bills, and many other forms of identifiable information, without the knowledge or consent of those from whom the data is collected.
The unethical nature of the PreCogs within Minority Report is mirrored in Amnesty International’s comprehensive report, Automated Racism, which documents the use of predictive policing across the United Kingdom. The report observes that communities under predictive policing, especially those that are historically marginalized, “have no meaningful opportunity to know how their personal data is used, to challenge inaccuracies, or to refuse inclusion in algorithmic systems that shape the police actions” (Amnesty International UK 16). Under this system, individuals are forced, without their consent, to become data points within predictive policing systems that rely entirely on historical data and surveillance technology. As such, anyone who has any interaction with the police, just or unjust, inherently becomes data that is permanently fed into algorithmic policing systems to which the subject never agreed. This mirrors the involuntary condition of the PreCogs, whose bodies and minds are used as permanent instruments of the state’s predictive policing, without any possibility of opting out. Both systems invoke the necessity of their measures for the public good while failing to acknowledge the erosion of individual consent.
This paradox becomes an increasingly harrowing reality when predictive policing is used against communities facing structural inequality. Communities and individuals of marginalized identities have historically been overpoliced. Subsequently, they are overrepresented in the crime data, surveillance, and interviews used to train predictive policing models. The communities that have historically been overpoliced never consented to serving as the raw material for algorithmic policing models, yet they are the most vulnerable to their outcomes. In Minority Report, the PreCogs are a visual representation of this injustice, as their autonomy has been datafied. This paradox, in tandem with existing inequalities, highlights the fundamental flaw of the PreCrime system: it is predicated on non-consensual data extraction for actionable insights. Despite claims of legitimacy invoking public safety, the erosion of autonomy undermines the rights of those predicted to commit a crime.
Reinforcing Racial Inequality Through Predictive Policing:
Beyond the unethical nature of its data cultivation, the use of that data systematically reproduces and reinforces existing racial inequality within policing by optimizing biased historical data to make future predictions about “high-risk” individuals, a dynamic strongly present within the film as well as the modern day. In Amnesty International’s report, Automated Racism, policing data from 33 offices using the predictive policing model Xcaliber strongly indicates that these systems reflect existing “structural and institutional racism and discrimination” within British society (Amnesty International UK 9). The report found that Black British nationals were 3.3 times as likely as their white counterparts to be stopped by police, with 80 percent of these stops resulting in no further legal action (Amnesty International 34). This mirrors the deployment of PreCrime in Minority Report, which directly impacts the most vulnerable and politically disenfranchised in favor of those with power, exemplified by Lamar Burgess’s exploitation of the system to frame Anderton for his own interests. This power imbalance, present in both the film and contemporary applications, is illuminated by Safiya Noble’s Algorithms of Oppression, which argues that policing’s “contradictions inherent in its projects must be contextualized in the historical conditions that both create it and are created by it”; thus, preexisting power imbalances persist (Noble 163). Furthermore, the PreCrime system aligns with Virginia Eubanks’s view that real-world predictive systems transform criminal data into institutional suspicion. Within this approach, Noble notes that automated risk assessment directly punishes people for the institutional forces that shape their lives. Thus, predictive policing and PreCrime within Minority Report expose a feedback loop in which an unjust historical record is transformed into authoritative state action.
Conclusion:
The use of the PreCrime system throughout Minority Report depicts a legal system that relies on technological predictions rather than responses to actual crimes. Under predictive policing models, humans become data inputs, in direct opposition to the foundations of autonomy, consent, and equality that uphold the Western legal system. The PreCrime system’s reliance on the PreCogs highlights how data, although presented as neutral, can be used to justify existing systems of bias. The ethical concerns of predictive policing challenge the contemporary determination to promote technological efficiency at the expense of the very human values on which the legal system is predicated. Without tangible oversight, predictive policing presents stark philosophical and political risks.
Works Cited
Amnesty International. Automated Racism. Amnesty International UK, 2025, www.amnesty.org.uk/files/2025-02/Automated Racism Report – Amnesty International UK – 2025.pdf. Accessed 23 Nov. 2025.
Bond, Cynthia D. “Law as Cinematic Apparatus: Image, Textuality, and Representational Anxiety in Spielberg’s Minority Report.” Cumberland Law Review, vol. 37, 2006, pp. 25+. UIC Law Open Access Repository, repository.law.uic.edu/facpubs/99/. Accessed 23 Nov. 2025.
Brayne, Sarah. “Big Data Surveillance: The Case of Policing.” American Sociological
Review, U.S. National Library of Medicine, Oct. 2017,
pmc.ncbi.nlm.nih.gov/articles/PMC10846878/.
Krahn, Timothy, et al. “Novel Neurotechnologies in Film – A Reading of Steven
Spielberg’s Minority Report.” Neuroethics,
www.researchgate.net/publication/226402281_Novel_Neurotechnologies_in_Film-A
_Reading_of_Steven_Spielberg’s_Minority_Report. Accessed 23 Nov. 2025.
Minority Report. Directed by Steven Spielberg, Twentieth Century Fox, 2002.
Noble, Safiya. Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press, nyupress.org/9781479837243/algorithms-of-oppression/. Accessed 23 Nov. 2025.
Salvi, Nicolás, and Santiago Nigri. “Minority Report: The Road to a Deterministic Theory
for the Philosophy of Criminal Law.” Opinión Jurídica, Universidad de Medellín,
www.scielo.org.co/scielo.php?script=sci_arttext&pid=S1692-25302022000300002.
Accessed 23 Nov. 2025.