Secret History #4: The Architecture of How Evil Triumphs
The enduring myth of evil is that it arrives as a monstrous, thunderous eruption of violence led by singular, cinematic villains. History, however, whispers a more chilling truth: the most effective architectures of oppression are silent, administrative, and profoundly unremarkable. Hannah Arendt famously identified this as the "banality of evil" during the trial of Adolf Eichmann—a man who was not a demonic psychopath but a "terribly and frighteningly normal" bureaucrat who simply followed orders without thinking. In our modern epoch, this banality has migrated from the filing cabinet to the algorithm. As the research of Loubna Ali illuminates, cybersecurity has shifted from a depoliticized, technical shield into a sophisticated "technology of power." Evil triumphs today through the "terribly normal" systems of digital governance that shape behavior through normalization and surveillance. This report uncovers the secret history of how the modern subject is rendered a "docile body" within a digital panopticon, where the silent dispossession of the moral person is achieved not through force, but through the seamless integration of control into daily life.
To understand the hidden architecture of the algorithm, one must apply Michel Foucault’s framework of "Power/Knowledge." Modern governance no longer relies on the heavy hand of sovereign decree; instead, it operates through a productive force that categorizes and regulates. Cybersecurity is the modern apparatus of this power, an epistemological tool that defines what is "safe" versus "risky" or "trusted" versus "suspect." This is visible in the hypermodern panopticism of Pegasus spyware, which transforms the most intimate device—the smartphone—into a real-time surveillance instrument, inducing self-regulation through the mere suspicion of being watched. In China, the Social Credit System exemplifies behavioral normalization by encoding state-defined standards into algorithmic feedback loops, while Big Tech’s "surveillant assemblage" subtly disciplines the masses through a decentralized network of data doubles and recommendation systems. These infrastructures produce compliant subjects through "ambient visibility," where the constant, invisible threat of being monitored leads to an internalized discipline that erodes human freedom and replaces autonomy with a persistent, gnawing social anxiety.
The psychological foundation of this digital cage is rooted in the classical realism of Machiavelli and Hobbes, who understood that humans are, in Machiavelli's words, a "sorry lot"—fickle, greedy, and easily corrupted. Machiavelli observed in his Discourses on Livy that humans rarely have the courage to be "perfectly good" or "honourably bad." Instead, they choose a "hazardous middle course," a path of compromise and passive compliance that facilitates systemic corruption. This tendency is exploited by the modern state, which has evolved from the Hobbesian "Leviathan"—a protector designed to end the "war of all against all"—into a mechanism of "anticipatory governmentality." This modern Leviathan no longer merely protects; it treats every user as a potential threat to be eliminated or neutralized before they act. By exploiting the innate human fear of violence and disorder, the state justifies totalizing surveillance, moving from the role of guardian to that of a controlling mechanism that views the freedom of the individual as a risk to be managed.
When this systemic power remains unchecked, it progresses toward the "industrialization of evil" and the eventual superfluity of the human person. Hannah Arendt’s typology of concentration camps provides the terrifying blueprint for this end-state: "Hades" for the isolated undesirables, "Purgatory" for those undergoing ideological re-education, and "Hell" for those marked for systematic extermination. This process of dehumanization begins with legal dispossession and moves toward the destruction of the moral person, eventually reducing the individual to an "animal state" where survival overrides all human dignity and solidarity. Central to this is the concept of "superfluity"—the rationalization that certain groups have no place or utility in the system. The tragedy of this history is that such horrors are facilitated by "clownish" bureaucrats who, like Eichmann, abdicate their capacity for thought in favor of systemic efficiency. In the digital age, this manifests as the "diminished self-value" and psycho-emotional trauma experienced by those whose data is breached or whose digital identities are classified as "untrusted" by an unthinking machine.
The secret history of order also reveals the "Mimetic Trap," where communities restore peace through "Sacrificial Violence." René Girard’s theory of mimetic desire explains that humans copy the desires of others, leading to inevitable rivalry and a "spiral of violence." Modern algorithms drive this mimetic trap by showing us what others want, accelerating social tensions. When these tensions threaten the community, order is restored through the "scapegoat mechanism"—projecting collective aggression onto a marginalized target. In the digital context, this manifests as cancel culture and deplatforming, where the algorithmic targeting of "untrusted" users serves as a form of substitutive violence to restore a temporary, precarious peace. This mechanism conceals the true nature of social conflict under a veil of perceived necessity, ensuring that the cycle of exclusion remains a silent architect of digital stability.
A critical synthesis of modern data lends empirical weight to these philosophical claims. Survey results, analyzed through Principal Component Analysis, suggest that the digital panopticon functions with high efficiency: 71% of digital subjects report changing their behavior due to perceived online monitoring, while 41.9% explicitly avoid sharing sensitive opinions to escape profiling. These "Cautious Conformers"—representing roughly 45% of the surveyed population—consciously modify their tone and content to align with algorithmic expectations. This internalized discipline signifies a regression from the civilizational process, potentially sliding toward a state of "barbarism" where violence has no ritual or political objective but becomes a gratuitous end in itself. Yet the existence of "Resistant Independents" suggests a limit to this normalization. These users engage in what Foucault calls "counter-conduct," refusing to modify their behavior despite the presence of surveillance. Their resistance proves that while the state has transitioned from protector to oppressive controlling mechanism, human agency persists as a stubborn obstacle to total algorithmic control.
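For readers unfamiliar with the method named above, the following is a minimal sketch of how Principal Component Analysis can separate survey respondents into groups like "Cautious Conformers" and "Resistant Independents." The actual survey instrument and data are not available, so the Likert-scale responses below are entirely synthetic and the item names, group sizes, and threshold are illustrative assumptions, not the study's methodology.

```python
import numpy as np

# Synthetic survey matrix: rows = respondents, columns = Likert items (1-5),
# e.g. "I self-censor online", "I avoid sensitive topics", "I post freely
# despite monitoring". Purely illustrative data, not the study's dataset.
rng = np.random.default_rng(0)
conformers = rng.normal(loc=[4.5, 4.2, 1.5], scale=0.4, size=(45, 3))
independents = rng.normal(loc=[1.5, 1.8, 4.5], scale=0.4, size=(25, 3))
middle = rng.normal(loc=[3.0, 3.0, 3.0], scale=0.6, size=(30, 3))
X = np.vstack([conformers, independents, middle])

# PCA via SVD of the standardized matrix: the first principal component
# captures the dominant axis of variation (here, conformity vs. resistance).
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
pc1 = Z @ Vt[0]  # each respondent's score on the first component

# The sign of a principal component is arbitrary, so orient it toward the
# self-censorship items before grouping respondents by their PC1 score.
if Vt[0][0] < 0:
    pc1 = -pc1
share_conformers = np.mean(pc1 > 0.5)  # hypothetical cutoff
print(f"High-conformity cluster: {share_conformers:.0%} of respondents")
```

On this synthetic data the high-conformity cluster recovers roughly the proportion planted in it, which is how an analysis of this kind would arrive at a figure like the "roughly 45%" cited above; the real survey would of course use its own items, components, and cutoffs.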
The triumph of evil in the digital age is defined by the abdication of personal thought and the blind acceptance of systemic classification. We have moved from a world of coercive surveillance to one of internalized discipline, where the fear of being "untrusted" by an algorithm governs the soul. To prevent the individual from becoming superfluous in the era of Big Data, the informed citizen must demand more than just security; they must demand ethical accountability and "auditability" in the digital infrastructure. The transition from "coercive surveillance" to "internalized discipline" requires a robust "critique of thought" to ensure that user-platform interactions remain fundamentally human. True resistance lies in refusing the "middle course" of passive compliance and recognizing that every decision within a digital system—every score, every classification—is a political act with profound normative consequences. We must remain vigilant, for the architecture of evil is built not with monsters, but with the quiet, efficient tools of a system that we have been taught to trust.
Based on Professor Jiang Xueqin's Secret History series. Deep dive analysis and fact-checking generated with AI assistance.