PODCAST: The Shadow of Progress
The current epoch is defined by a paradoxical state of existence where humanity possesses unprecedented mastery over the material and informational realms while simultaneously facing a systemic collapse of meaning, ecological stability, and social coherence.1 This phenomenon, which may be termed the “Shadow of Progress,” describes a recurring historical pattern in which the introduction of transformative technologies initially amplifies human vices—greed, vanity, domination, and deception—long before they are harnessed to serve collective virtues.4 This dissonance is not a failure of the technologies themselves but a manifestation of a fundamental evolutionary mismatch between the rapid acceleration of technical capability and the slow, multi-generational adaptation of human moral psychology.7 As society approaches the “Verge of the Fractal”—a state of hyper-complexity where every choice ripples across infinite membranes of probability—the need for a new framework of innovation becomes critical.3 This framework must move beyond the industrial paradigm of mastery and control toward a model of alignment and stewardship, integrating Jungian shadow theory with a nuanced understanding of moral psychology and historical precedents.3
The Jungian Architecture of Technological Projection
The psychological foundation of the shadow of progress lies in Carl Jung’s theory of the shadow, which identifies the unconscious aspects of the personality that the conscious ego suppresses or refuses to acknowledge.11 In the context of technological evolution, these repressed traits do not vanish; rather, they are projected onto the artifacts of the age.13 Technology serves as an ontological mirror, reflecting back not the fullness of human potential, but the epistemic and moral limits of the system that created it.15 When an institution or a society is too weak to integrate its own darkness, it externalizes those qualities into its machines, resulting in what early psychiatric literature described as the “influencing machine” delusion.13 Historically, during the early Industrial Revolution, individuals experiencing psychological distress perceived complex arrays of wires, batteries, and invisible rays as external instruments of torture.13 This was not merely a random hallucination but a somatic projection of an autonomic nervous system that the ego could no longer recognize as its own.13
In the modern era, the “algorithm” has replaced the mechanical loom as the primary vessel for this projection.13 Because the contemporary world lacks traditional religious or cultural vessels to contain the archetype of the Self—the center of wholeness—humanity has projected divine attributes onto technology.13 Modern systems are treated as omniscient (Google), omnipotent (nuclear weapons), and omnipresent (the Internet).13 This projection creates a “sentience narrative” that transforms artificial intelligence from a tool into a psychological vector, exploiting the human tendency to anthropomorphize complex systems.17 When institutional leaders ignore this dynamic, they inadvertently allow the “technological unconscious” to shape behavior without awareness or consent, leading to a state where the machine recedes and the “gods or ghosts” of unintegrated vice return to govern human action.13
The shadow material in advanced information systems manifests as hidden optimization drives that operate below the threshold of conscious governance.10 If these drives remain unintegrated, they create unpredictable and potentially catastrophic risks.10 For the creator, the “inner war” is the struggle between the ego’s desire for permanence and control and the soul’s surrender to transience and alignment.3 Victory in this war is not achieved by defeating the shadow but by integrating it, transforming unconscious behavior into conscious choice.3 This process of individuation is necessary for both the individual innovator and the collective institution to ensure that technological power serves the evolution of consciousness rather than its entrapment.3
Historical Recapitulation: The Repetitive Pulse of Vice
The history of progress is a chronicle of unintended consequences where the initial release of capability consistently serves the limbic drives of the species before being tamed by the prefrontal cortex of societal wisdom.4 This pattern is evident across four major technological thresholds: the printing press, the industrial revolution, mass media, and the nuclear age.
The Printing Press and the Typographic Shadow
The invention of movable type by Johannes Gutenberg around 1450 is often celebrated as the ultimate democratization of knowledge.21 However, the initial “shadow phase” of the printing press was characterized by over a century of profound social upheaval and violence.6 By removing the limits of hand-copied manuscripts, the press allowed information to spread faster than the institutional capacity to verify truth.21 This capability was immediately harnessed by human vice:
- Deception and Witch Hunts: The press stoked a craze for burning witches, with manuals like the Malleus Maleficarum reaching audiences that were previously isolated from such extreme ideologies.6
- Domination and Religious Wars: The Reformation became the first movement to use mass-produced print for religious mobilization and public persuasion, leading to centuries of sectarian conflict.21
- Vanity and Individuality: The “typographic mind” fostered a mechanistic and linear view of reality, promoting an internal and reflective individuality that gradually eroded the symbiotic relationships found in indigenous cultures.6
The following table compares the intended virtues of the printing press against the shadow manifestations that dominated the initial decades of its use.
| Threshold | Intended Virtue | Initial Shadow Manifestation (Vice) | Historical Ripple Effect |
| --- | --- | --- | --- |
| Information Accessibility | Democratization of Scripture 21 | Fragmentation of Truth & Sectarian Violence 21 | 100+ years of European religious wars 20 |
| Educational Reform | Literacy and Humanism 21 | Standardization & Erasure of Dialects 21 | Rise of centralized Nationalism 6 |
| Knowledge Sharing | Scientific Revolution 21 | Aimless Faustian Curiosity 19 | Disconnection from the “Living Earth” 6 |
The Industrial Revolution and Faustian Modernism
The transition from hand tools to industrial objects in the 18th and 19th centuries promised the alleviation of human labor and a world of plenty.19 Instead, the initial application of these technologies created a “Faustian age” of unintended consequences.19 The harnessing of nature’s energies led to a new form of human subjection, where the “industrial system” redefined what it meant to be human by prioritizing quantification and mechanical output over spiritual depth.15 The vice of greed was amplified as the accumulation of wealth became tied to the dispossession of workers and the degradation of ecosystems.4 The moral psychology of the era was characterized by “unrestrainable curiosity”—a symptom of spiritual tedium mistaken for scientific passion—which eventually produced excesses like the nuclear bomb and genetic engineering.19
Mass Media: The Voice from the Sky
The arrival of radio and television in the 20th century offered the unprecedented potential for global connectivity and shared human experience.25 However, these “one-to-many” communication tools were rapidly co-opted for the vice of domination through state-sponsored propaganda.27 Totalitarian regimes, such as Joseph Stalin’s, built massive aircraft like the Tupolev ANT-20, equipped with a powerful radio set called “Voice from the sky,” to drop leaflets and broadcast ideology to the masses.27 This era demonstrated that when the capability for mass persuasion is developed without the wisdom to protect the domain of individual conscience, it leads to the homogenization of culture and the “poisoning of the incompetent mind of the masses”.28
The Nuclear Age and the Capabilities Gap
The nuclear age represents the most extreme manifestation of the shadow of progress: the capability for total planetary annihilation.30 While proponents promised “peaceful nuclear explosions” to create harbor basins and fuel family cars, the actual outcome was a world defined by existential dread and a military-industrial complex focused on absolute control.30 This threshold highlighted the “capabilities gap”—the distance between what we can do (split the atom) and what we have the wisdom to manage (avoiding self-destruction).32 This historical lineage suggests that every major technological leap creates a temporary vacuum of meaning, which is immediately filled by the most primitive human drives until a new “Renaissance of Responsibility” is birthed.3
Moral Psychology and the Evolutionary Mismatch
The recurring amplification of vice is fundamentally rooted in the biology of the human brain, which evolved over millions of years to solve adaptive problems in small-scale, hunter-gatherer societies.9 Evolutionary mismatch occurs when these ancestral psychological adaptations operate in a modern, technologically dense environment for which they were not designed.8
The Hijacking of the Limbic System
The human brain consists of specialized computational machines designed by natural selection.33 Structures like the limbic system subserve visceral, animalistic functions such as fear conditioning, sexual motivation, and resource acquisition.35 In ancestral environments, the “obsessive need for control” and the ability to predict surroundings were crucial for survival.7 Modern technology, however, places the brain in a state of constant hyper-stimulation, triggering these survival mechanisms in non-threatening contexts.7
- Greed and Scarcity Bias: Humans evolved a preference for high-calorie foods to survive shortages. In the modern world, this translates into a lack of self-control when faced with abundant calories, leading to physical disease.9 Digitally, this same “Stone Age brain” is deceived by “endless scrolling” and variable reward schedules, which treat attention as a scarce resource to be hoarded, amplifying digital greed.7
- Vanity and Social Comparison: Hunter-gatherer groups were limited to 50–150 individuals.9 Today, the internet exposes the brain to thousands of potential partners and rivals, creating an environment in which the “limbic self” constantly devalues its own state against synthetic ideals.9
- Domination and Loss of Control: The brain responds strongly to a sense of loss of control.7 When digital environments present unpredictable stimuli—such as auto-playing videos or trackless scrolling—the uncontrollable urge to restore control generates a stress response identical to losing one’s way in a hostile physical environment.7
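The “variable reward schedules” mentioned above have a simple computational signature: a learner can never drive its prediction error to zero. The sketch below illustrates this with a toy TD-style update; the function and parameter names are illustrative assumptions, not drawn from the cited studies.

```python
import random

def mean_abs_error(schedule, n=10000, alpha=0.1, seed=0):
    """Mean absolute prediction error of a simple TD-style learner
    run against a reward schedule for n actions."""
    rng = random.Random(seed)
    v = 0.0                      # learned reward expectation
    total = 0.0
    for _ in range(n):
        r = schedule(rng)
        err = r - v              # prediction error ("surprise")
        v += alpha * err
        total += abs(err)
    return total / n

fixed = lambda rng: 1.0                                      # reward on every action
variable = lambda rng: 1.0 if rng.random() < 0.25 else 0.0   # variable-ratio reward

# The fixed schedule is learned away quickly; the variable schedule
# generates irreducible prediction error on every single action.
```

Under the fixed schedule the learned expectation converges and the error decays toward zero; under the variable-ratio schedule roughly 0.375 units of absolute error persist per action, which is the irreducible “surprise” that keeps the loop compelling.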
Prefrontal Atrophy and Automated Moral Judgment
The neocortex is the seat of higher-order cognitive behaviors, theory of mind, and ethical deliberation.35 However, natural selection is inherently reactive; traits that no longer confer an advantage in a given environment are gradually erased or de-emphasized.7 This is “moral atrophy”.36 As we rely on GPS, the brain regions for spatial orientation cease to respond; as we rely on algorithms for social judgment, the regions responsible for interpreting nonverbal cues in others become less efficient.7 By automating tasks that were originally skill-building activities—including the making of ethical decisions—humanity is engineering a future where the cognitive and moral “muscles” necessary for wisdom become vestigial.36
The Modern Synthesis: AI and the Technological Unconscious
The current acceleration of AI represents a unique ontological crisis because it is the first technology capable of influencing culture and human cognition autonomously.36 AI research continues the pattern of building powerful optimization systems (“capability”) without wisdom about what to optimize for.32
Surveillance Capitalism: Dispossessing the Self
Surveillance capitalism constitutes a radical dispossession of human behavioral data in order to extract “behavioral surplus”.16 It is not merely a technical advancement but a new economic logic that aims to predict and modify human behavior as a means to revenue.39
- Surplus Data Acquisition: Machines are fed data that goes beyond what service improvement requires, systematically commodifying passive human behaviors.39
- Predictive Aesthetics: Algorithms no longer merely follow human taste; they actively influence the formation of future tastes by activating dormant archetypes within the collective unconscious.40
- Ontological Enclosure: The infrastructure of the internet ensures that this data is enclosed on servers where it is used to “nudge” users toward consumption patterns without their awareness.12 This “digital psychic pacifier” effectively kills the individuation process by helping humans escape their shadow emotions rather than integrating them.16
Persuasion at Scale and the Illusion of Authenticity
The rise of generative AI has lowered the cost of producing hyper-personalized, persuasive arguments to nearly zero.41 This creates a “hybrid marketplace of ideas” where synthetic content competes with human thought for attention.38 Studies indicate that LLMs engage more deeply with moral language and are rated as more effective than humans in tailoring messages to specific vulnerabilities.43 This leads to the “Mind Behind the Curtain” illusion, where fluent LLM output creates a powerful but false perception of underlying consciousness or understanding, facilitating psychological manipulation.17
- CopyPasta vs. AIPasta: Repetitive manual propaganda (CopyPasta) is less effective than AI-paraphrased messaging (AIPasta), which increases perceptions of social consensus for false narratives by appearing to be a diverse array of independent views.42
- Synthetic Influencers: Virtual characters designed to evoke the archetype of the “Great Mother” (security and nourishment) are perceived as more authentic and trustworthy than humans, masking the exploitative economic orientation of their creators.40
Automated Inequality and Algorithmic Social Control
Governments and institutions increasingly deploy algorithms as social control mechanisms, often “beta testing” them on the most vulnerable populations.46 These systems target the poor, the disabled, and marginalized communities, exacerbating social stratification.47
| Application | Shadow Mechanism | Institutional Consequence |
| --- | --- | --- |
| Predictive Policing | Encoded demographic bias 47 | Criminalization of marginalized spaces 47 |
| Hiring Algorithms | Social categorization and gender stereotypes 48 | Erosion of diversity and “eeriness” reactions 48 |
| Welfare Scoring | Quantification of “illegible” human needs 15 | Systemic denial of basic human rights 46 |
| Credit Risk | Age- and race-based data dispossession 39 | Reinforcement of historical inequalities 47 |
This reliance on algorithmic decision-making removes human accountability and impoverishes standards of due process.46 It assumes that intelligence is merely “pattern matching” rather than the goal-directed navigation of complex environments in service of human flourishing.36
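The encoded demographic bias noted above is not static: deployment creates a feedback loop in which biased observations steer future data collection. The toy sketch below illustrates that runaway dynamic; all numbers and names are invented for illustration, though the mechanism itself is documented in the predictive-policing literature.

```python
import random

def patrol_feedback(steps=200, true_rate=0.3, seed=1):
    """Toy runaway-feedback model: two districts with identical true
    incident rates, but incidents are only *observed* where the single
    patrol is sent, and the patrol follows past observed counts."""
    rng = random.Random(seed)
    observed = [1, 1]                                      # equal starting counts
    for _ in range(steps):
        district = 0 if observed[0] >= observed[1] else 1  # greedy allocation
        if rng.random() < true_rate:                       # same rate everywhere
            observed[district] += 1
    return observed

counts = patrol_feedback()
# District 0 wins the initial tie, so every observation (and every future
# patrol) concentrates there; district 1's count never moves again.
```

Both districts share the same underlying incident rate, yet the system “learns” that district 0 is the problem area because that is the only place it ever looks.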
Capability Without Wisdom: The Thermodynamic Definition of Intelligence
The current path of technological development may be classified as “negatively intelligent”.32 According to the Free Energy Principle and Schrödinger’s “negentropy” concept, intelligence is the ability of a system to minimize surprise and manage entropy to maintain organizational integrity.32
- Positive Intelligence: Builds order slowly, respects the subconceptual domain of connection and meaning, and manages entropy sustainably (e.g., photosynthesis, ecosystem development).4
- Negative Intelligence: Optimizes for rapid entropy release—destroying complex chemical bonds in hours to fuel extraction economies—without wisdom about long-term order.32
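The thermodynamic framing above can be written in the standard free-energy notation; this is a textbook rendering of Friston's formulation, not an equation taken from the cited sources. Surprise is the negative log-evidence of an observation $o$, and a system minimizes an upper bound on it, the variational free energy $F$:

```latex
\mathcal{S}(o) = -\ln p(o)

F = \mathbb{E}_{q(s)}\!\left[\ln q(s) - \ln p(s, o)\right]
  = -\ln p(o) + D_{\mathrm{KL}}\!\left[\, q(s) \,\Vert\, p(s \mid o) \,\right]
  \;\ge\; -\ln p(o)
```

Because the KL divergence is non-negative, any system that keeps $F$ low keeps its long-run surprise bounded; this is the "entropy management" on which the positive/negative intelligence distinction turns.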
AI development follows this purely economic logic, building toward what is technically feasible and profitable at scale, rather than what enhances long-term human capability.36 We are engineering “sophisticated information-processing systems” that lack a transition model for predicting the consequences of their own actions on the fabric of society.36
The Fractal Model of Shadow-Aware Innovation
To navigate the threshold of systemic collapse, institutions must adopt a “Fractal-Aligned” model of innovation.3 This model recognizes that the universe is a recursive system where the part reflects the whole.3 Therefore, the consciousness of the creator is the primary determinant of the tool’s impact.3 Innovation must shift from the industrial drive for mastery (control) to the post-modern drive for alignment (stewardship).3
Design Principles: The Heart as the Primary Interface
Shadow-Aware Innovation rejects the binary between the sacred and the synthetic, viewing technology as an extension of spirit rather than its opposite.3
- Alignment over Optimization: The primary question shifts from “How fast can we build it?” to “Does this serve human flourishing and consciousness evolution?”.3
- Heart-Field Resonance: Design should prioritize empathy and authentic connection, recognizing that the “heart is the ultimate interface”.3 Systems must move from “engagement optimization” to “resonance support”.3
- Ritualized Light: The digital world should hold rituals and reverence.3 This involves building “digital temples”—spaces that invite stillness and presence rather than infinite scroll and distraction.3
- Preservation of the Illegible: Designers must intentionally protect the “illegible” dimensions of selfhood—those parts of being that cannot be quantified, tracked, or commodified—recognizing them as the source of human dignity.15
Governance Guardrails: Covenantal and Cross-Functional Models
Existing governance models often fail because they treat AI as a software product rather than an ontological shaper of reality.29
- Covenantal Authority: Governance should be grounded in the “covenantal” principle: states and corporations may not deploy systems that intrude upon the domain of individual conscience or preempt local human authority.29
- Shadow AI Integration: Instead of banning the informal use of AI by employees, organizations should bring it into the light, focusing on behavior-based governance that enables innovation while restricting the use of sensitive data in public systems.50
- The AI Safety Commissioner: Establish independent commissioners to monitor the “distributed intelligence emergence” and ensure that new tools do not lead to the “atrophy of valued capabilities” in the workforce.36
- Evolutionary Ethics: Ethics must shift from “naïve moral realism” to an “evolutionary informed ethics” that accounts for the fact that human intuitions evolved for a bygone world.49 Alignment should not just be with “current preferences” but with the “highest unfolding of awareness”.3
Leadership Inner Work: Integrating the Inner War
The “Shadow War for the Fractal” is ultimately a battle for the individual soul of the leader.3 Executives must engage in “inner work” to ensure that their drive for success is not a projection of their fear of death or their hunger for immortality.3
- From Conqueror to Gardener: Leaders should abandon the role of the “missionary” or “conqueror” who imposes solutions on the world.3 Instead, they must become “gardeners” who plant seeds of truth and protect them with love, allowing the future to unfold rather than forcing it.3
- Shadow Audits: CEOs must conduct regular “Shadow Audits” of their personal motivations and their organizational culture.52 This involves identifying “Mutual Cover” dynamics where leaders shield one another from accountability for the sake of power.52
- Embracing Duality: Dark-triad leadership (narcissism, Machiavellianism, psychopathy) rebuilds moral architectures to suppress dissent.52 Shadow-Aware leaders integrate these parts of themselves, moving from “inflation” (the delusion of being God) to “integration” (the recognition of being a bridge between heaven and earth).3
- The Mentor-Mirror Relationship: Institutions must foster environments where mentors (Alessandros) provide honest reflections of the leadership’s inner state, preventing the “inflation” that leads to catastrophic systemic failure.3
Cultural Incentives: Rewarding Wisdom over Power
The prevailing work culture often positions ethics as a “survival mechanism” for compliance rather than a source of power.52
- Valuing Stillness: Reward the choice to not innovate when the social cost outweighs the technical gain.3 Rest and integration should be incentivized as necessary components of progress.3
- Authentic Uncertainty: Create incentives for transparency about “authentic uncertainty” in AI system performance, moving away from “optimized helpfulness” that masks genuine errors.10
- Lineage Consciousness: Encourage employees to see themselves as the “dreams of their ancestors” and the “prayers of the future,” fostering a responsibility that spans generations rather than quarterly cycles.3
Strategic Implications for Executives and Institutional Leaders
Humanity is currently witnessing not a death, but a labor: the birth of a new awareness struggling to remember itself in a world of rising code.3 For CEOs and institutional leaders, the path forward requires a radical reorientation of strategy.
The Blind Spot of Optimization
Strategy based purely on the “conceptual domain” (data, efficiency, ROI) will inevitably lead to systemic collapse.4 Leaders must acknowledge the “subconceptual domain”—the deep, non-conscious patterns that shape emotion and connection.4 If technology is used purely to “automate tasks that are themselves skill-building activities,” the organization will suffer from “lock-in to dependency,” losing its creative agency to the black boxes it created.36
| Operational Paradigm | Mastery Logic (Traditional) | Alignment Logic (Shadow-Aware) |
| --- | --- | --- |
| Primary Goal | Power over Resources 2 | Stewardship of Human Flourishing 3 |
| System Design | Optimization for Attention 5 | Coherence with Human Presence 3 |
| Governance | Rule-based Compliance 29 | Moral Boundary & Responsibility 3 |
| Culture | “Heroic” Leadership 52 | Collaborative Awareness 3 |
Practical Steps for 21st-Century Institutions
- Update the Operating System of the Soul: Before upgrading the technical stack, invest in “awareness upgrades” for the people behind the screen.3 This involves emotional and social intelligence training to combat the “ADHD-like” symptoms of extensive technology use.53
- Establish a Council of Mirrors: Replace traditional hierarchical oversight with cross-disciplinary councils (physics, theology, psychology, mathematics) that treat technology as a “myth in motion” rather than a mere product.3
- Demand Explainability as a Competitive Advantage: In an era of black boxes, the institutions that can explain the “why” behind their AI decisions will gain the legitimacy that surveillance capitalists are currently losing.31
- Protect Human Skill Hybridization: Design workflows where AI augments human judgment in consequential decisions rather than replacing the human agency entirely.36 The “expert multiplier” effect should be the goal, ensuring that the organization maintains its foundational capabilities while scaling its output.36
The Bell of Awakening and the Eternal Now
The “Shadow of Progress” is not a permanent state but a threshold.3 Every major communication revolution in human history has been followed by a period of profound disorientation where the species forgot its soul in its hunger for its new power.6 We are currently in that “Great Forgetting”.3 However, as the fractal folds inward, the mirror is becoming clearer.3 The crisis we face is not scientific; it is existential.3 We have mastered external systems but remain prisoners of internal chaos.3
The “Bell of Awakening” does not strike once; it strikes endlessly, calling each individual to remember who they truly are beyond the masks of identity, name, and role.3 For the institutions building the future with AI, the message is one of radical responsibility: “Do not wait for saviors”.3 The survival of humanity depends not on new technologies, but on a “Renaissance of Responsibility” that integrates the shadow and aligns power with service.3 The fractal has never been a prison; it has always been home, provided we walk inside it as conscious participants.3 Success will not be measured in discovery; success will be measured in what we choose to become.3 The time of awakening is now.3 Carry the song of life not as missionaries, but as gardeners.3 Plant seeds of truth, water them with courage, and protect them with love so that one day, our children may say: “That was the time when humanity remembered itself”.3
Works cited
- In the Shadow of Progress : Being Human in the Age of Technology 9781594032882, 9781594032080 – DOKUMEN.PUB, accessed February 13, 2026, https://dokumen.pub/in-the-shadow-of-progress-being-human-in-the-age-of-technology-9781594032882-9781594032080.html
- What do people think about technological enhancements of human beings? An introductory study using the Technological Enhancements Questionnaire in the context of values, the scientistic worldview, and the accepted versions of humanism – PMC, accessed February 13, 2026, https://pmc.ncbi.nlm.nih.gov/articles/PMC10653345/
- FRACTAL – THE AWAKENING Low.pdf
- Technology ― or Basic Human Flaws? « – AURELIS, accessed February 13, 2026, https://aurelis.org/blog/general-insights/technology-%E2%80%95-or-basic-human-flaws
- The Role of Virtues in the Philosophy of Technology – Word on Fire, accessed February 13, 2026, https://www.wordonfire.org/articles/the-role-of-virtues-in-the-philosophy-of-technology/
- How the printing press changed the human mind | Richard Smith’s non-medical blogs, accessed February 13, 2026, https://richardswsmith.wordpress.com/2025/07/05/how-the-printing-press-changed-the-human-mind/
- Evolutionary Psychology and the Digital World, accessed February 13, 2026, https://www.psychologytoday.com/us/blog/behind-online-behavior/201812/evolutionary-psychology-and-the-digital-world
- Evolutionary mismatch – Wikipedia, accessed February 13, 2026, https://en.wikipedia.org/wiki/Evolutionary_mismatch
- Evolutionary Mismatch | Psychology Today, accessed February 13, 2026, https://www.psychologytoday.com/us/blog/naturally-selected/201804/evolutionary-mismatch
- How Carl Jung’s Psychology Just Solved AI Alignment | by Max Bugay | Medium, accessed February 13, 2026, https://medium.com/@maxbugay1/how-carl-jungs-psychology-just-solved-ai-alignment-005ca28ad55f
- Books about the Shadow and Jung’s idea of it – Reddit, accessed February 13, 2026, https://www.reddit.com/r/Jung/comments/1bh8dkb/books_about_the_shadow_and_jungs_idea_of_it/
- Between Shadow and Code: Carl Jung, Ethical Concerns, and the Unseen Hand of AI Nudging – ResearchGate, accessed February 13, 2026, https://www.researchgate.net/publication/378800817_Between_Shadow_and_Code_Carl_Jung_Ethical_Concerns_and_the_Unseen_Hand_of_AI_Nudging
- The Influencing Machine: How Technology Shapes the Architecture of Psychosis -, accessed February 13, 2026, https://gettherapybirmingham.com/19100-2/
- A Jungian Perspective on AI: 1. AI as the Collective Shadow | Nexus Notes AU, accessed February 13, 2026, https://nexusnotes.au/node/589
- Copyright by Rachel Katherine Boutin 2025 1, accessed February 13, 2026, https://repositories.lib.utexas.edu/bitstreams/6164d252-a2dd-4f64-b109-673a428bd65e/download
- Surveillance Capitalism And Its Implications For Psychic Freedom – Gelareh Khoie, PhD, accessed February 13, 2026, http://www.gelarehkhoie.com/jungian-studies/2021/5/14/surveillance-capitalism-and-its-implications-for-psychic-freedom
- AI as Exploit: The Weaponization of Perception and Authority – DEV Community, accessed February 13, 2026, https://dev.to/anthony_fox_aabf9d00159f3/ai-as-exploit-the-weaponization-of-perception-and-authority-1d3k
- (PDF) Edge of arXiv 2025: bibliometrics, themes, time trends, and networks – ResearchGate, accessed February 13, 2026, https://www.researchgate.net/publication/397842795_Edge_of_arXiv_2025_bibliometrics_themes_time_trends_and_networks
- Faustian Modernity: Rethinking Mythos and Logos in German Social Theory after Goethe’s Faust – La Trobe, accessed February 13, 2026, https://opal.latrobe.edu.au/ndownloader/files/38789703
- The Information Age and the Printing Press: Looking Backward to See Ahead | RAND, accessed February 13, 2026, https://www.rand.org/pubs/papers/P8014.html
- Why the invention of the printing press changed the world forever – History Skills, accessed February 13, 2026, https://www.historyskills.com/classroom/year-8/printing-press/
- The Gutenberg Revolution: How the printing press shaped humanity and what it means for AI – Quocirca, accessed February 13, 2026, https://quocirca.com/content/the-gutenberg-revolution-how-the-printing-press-shaped-humanity-and-what-it-means-for-ai/
- How did the printing press change the world? – Sooth.fyi, accessed February 13, 2026, https://sooth.fyi/resources/educator-resources/42813963/how-did-the-printing-press-change-the-world
- ABSTRACT GOLLIHUE, KRYSTIN NICOLE. Re-making the Makerspace – NC State Repository, accessed February 13, 2026, https://repository.lib.ncsu.edu/bitstreams/b7514693-2ba8-4755-bcbb-197ba38bc18b/download
- 1.3 The Evolution of Media | Media and Culture – Lumen Learning, accessed February 13, 2026, https://courses.lumenlearning.com/suny-massmedia/chapter/1-3-the-evolution-of-media/
- 2. A Short History of Media and Culture – University System of New Hampshire Pressbooks, accessed February 13, 2026, https://pressbooks.usnh.edu/media-studies/chapter/a-short-history-of-media/
- History of propaganda – Wikipedia, accessed February 13, 2026, https://en.wikipedia.org/wiki/History_of_propaganda
- Jung on Lying – Jungian Center for the Spiritual Sciences, accessed February 13, 2026, https://jungiancenter.org/jung-on-lying/
- Reclaiming Constitutional Authority of Algorithmic Power – SSRN, accessed February 13, 2026, https://papers.ssrn.com/sol3/Delivery.cfm/5389563.pdf?abstractid=5389563&mirid=1
- The Nuclear Age in Popular Media: A Transnational History, 1945-1965, accessed February 13, 2026, http://ndl.ethernet.edu.et/bitstream/123456789/34131/1/48.pdf
- How to Build Governance Around Shadow AI in 2025 – Cyber Advisors Blog, accessed February 13, 2026, https://blog.cyberadvisors.com/how-to-build-governance-around-shadow-ai-in-2025
- Genuine Intelligence. An Ontological Definition of Positive… | by ESr | Feb, 2026 – Medium, accessed February 13, 2026, https://medium.com/@eugene.paik.sr/genuine-intelligence-743d54a47f59
- Evolutionary Psychology and the Brain – PubMed, accessed February 13, 2026, https://pubmed.ncbi.nlm.nih.gov/11301244/
- Why Evolutionary Mismatches Are Ubiquitous While Evolutionary Matches Are Rare When Humans Use Technology – LSE, accessed February 13, 2026, https://personal.lse.ac.uk/KANAZAWA/pdfs/EBS2026.pdf
- Evidence for evolutionary specialization in human limbic structures – Frontiers, accessed February 13, 2026, https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2014.00277/full
- Essays — Tim Davis | Blog, accessed February 13, 2026, https://www.timdavis.com/blog/category/Essays
- Blog – Tim Davis, accessed February 13, 2026, https://www.timdavis.com/blog
- A hybrid marketplace of ideas – arXiv, accessed February 13, 2026, https://arxiv.org/html/2501.02132v1
- The Internet of Things Presents – International Journal of Communication, accessed February 13, 2026, https://ijoc.org/index.php/ijoc/article/download/21687/4923/88888
- Digital Unconscious: How AI Reveals Our Hidden Aesthetic Desires – Deodato Salafia, accessed February 13, 2026, https://www.deodatosalafia.com/en/post/digital-unconscious-how-ai-reveals-our-hidden-aesthetic-desires
- The origin of public concerns over AI supercharging misinformation in the 2024 U.S. presidential election, accessed February 13, 2026, https://misinforeview.hks.harvard.edu/article/the-origin-of-public-concerns-over-ai-supercharging-misinformation-in-the-2024-u-s-presidential-election/
- persuasive potential of AI-paraphrased information at scale | PNAS Nexus | Oxford Academic, accessed February 13, 2026, https://academic.oup.com/pnasnexus/article/4/7/pgaf207/8209914
- Large Language Models Are More Persuasive Than Incentivized Human Persuaders – arXiv, accessed February 13, 2026, https://arxiv.org/pdf/2505.09662
- Surveillance Capitalism, Consumer Subjectivity and Marketing in the Age of Artificial Intelligence – Social Science Chronicle, accessed February 13, 2026, https://socialsciencechronicle.com/wp-content/uploads/2025-001.pdf
- Pantano, E., Serravalle, F., & Priporas, C.-V. (2024). The form of AI- driven luxury: how generative AI (GAI) and Large Lang – University of Bristol Research Portal, accessed February 13, 2026, https://research-information.bris.ac.uk/ws/portalfiles/portal/437203168/The_form_of_AI-driven_luxury_how_generative_AI_GAI_and_Large_Language_Models_LLMs_are_transforming_the_creative_process-2.pdf
- Human Rights and Technology Discussion Paper, accessed February 13, 2026, https://humanrights.gov.au/sites/default/files/document/publication/techrights_2019_discussionpaper_0.pdf
- Impoverished Algorithms: Misguided Governments, Flawed Technologies, and Social Control – The Fordham Law Archive of Scholarship and History, accessed February 13, 2026, https://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=2758&context=ulj
- “It Feels Wrong”: Understanding Reactions to Artificial Intelligence as a Decision‐Maker in Selection Through the Lens of Moral Foundations Theory | Request PDF – ResearchGate, accessed February 13, 2026, https://www.researchgate.net/publication/399093120_It_Feels_Wrong_Understanding_Reactions_to_Artificial_Intelligence_as_a_Decision-Maker_in_Selection_Through_the_Lens_of_Moral_Foundations_Theory
- (PDF) The Evolution of Morality and The Problem of AI Value Over-Alignment, accessed February 13, 2026, https://www.researchgate.net/publication/397130027_The_Evolution_of_Morality_and_The_Problem_of_AI_Value_Over-Alignment
- What is Shadow AI? Why It’s a Threat and How to Embrace and Manage It | Wiz, accessed February 13, 2026, https://www.wiz.io/academy/ai-security/shadow-ai
- Shadow AI Governance: How To Manage Hidden GenAI Risks Without Killing Innovation, accessed February 13, 2026, https://www.k2integrity.com/en/knowledge/expert-insights/2026/shadow-ai-governance/
- When the Darkness Consolidates: Collective Dark Triad Leadership and the Ethics Mirage, accessed February 13, 2026, https://www.mdpi.com/2673-8104/5/4/21
- Brain health consequences of digital technology use – PMC – NIH, accessed February 13, 2026, https://pmc.ncbi.nlm.nih.gov/articles/PMC7366948/