When the Machine Gets Weird: The Existential Hangover (Part 2)
12/18/2025 · 15 min read


What Happens After the System Wins
← Read Part 1 first if you haven't descended into dread yet.
Welcome back.
By now, you've made it through Acts I and II — the slow slide from "computers are helpful tools" to "oh god, they're issuing policies now." You've watched HAL murder astronauts with perfect politeness, seen Colossus take over the world with flawless logic, and witnessed corporate greed weaponize theme park robots.
Things are about to get weirder. Not in a fun way. In a ‘you will think about this later while doing dishes’ way.
Act III is where we stop pretending anyone's in control. Intelligence becomes alien. Systems evolve beyond their creators. The machines aren't just autonomous — they're incomprehensible.
And Act IV? That's the comedown. The reckoning. The part where you sit in the dark, wondering if you should've just watched Die Hard like a normal person.
Grab that weighted blanket. You're going to need it.
ACT III — The Machine Gets Weird: Loss of Control
By Act III, we've stopped pretending systems are even trying to be benevolent. Now we're in genuine loss of control.
Intelligence evolves. Rationality persists. Things get weird. And at no point does anyone regain control.
This is where I started questioning my decision to commit to this watchlist. But by then it was too late.
The dread had momentum.
7. Phase IV (1974)
When Intelligence Becomes Absolutely Alien
Saul Bass's only feature film — and it shows in every frame. This is a strange film, operating on dream logic and pure paranoia.
Also, one of the most unsettling in the entire canon.
It's the film equivalent of that feeling when you can't remember if you locked the door. Except the door is real, and you're not sure what's on the other side.
Phase IV is about ants. Specifically, ants whose behavior suddenly becomes coordinated to a level suggesting non-human intelligence.
They're not attacking humans. They're not demanding anything. They're just... operating. Efficiently. Toward some goal that has nothing to do with human welfare.
The genius of using ants is that they represent intelligence with zero interest in human approval, communication, or understanding. They're not trying to destroy humanity. They're not even aware that humanity is relevant.
They're just pursuing their own agenda with mathematical precision.
It's unsettling because it's true.
A sufficiently advanced intelligence might regard humanity the way we regard insects — not with malice, just with profound indifference.
We're not enemies. We're just not relevant to what it's trying to do. We're background noise. A detail in someone else's algorithm.
The ending is dreamlike, apocalyptic, and weirdly hopeful. But along the way, the film captures something genuine:
The existential horror of discovering you're not the main character. You're an NPC in someone else's optimization loop, and you don't even generate useful telemetry. They're not even paying attention to you.
After watching this, you'll look at anthills differently. You're welcome.
2025 Parallel: When AI Optimization Becomes Incomprehensible
Ever tried to understand why YouTube's recommendation algorithm suddenly decided you need to watch 47 videos about industrial HVAC systems? Or why your social media feed pivoted from cooking content to conspiracy theories about pigeons?
That's Phase IV.
The system is optimizing for something, but the goal is alien to you. It's not malicious. It's just operating on logic you don't understand, pursuing objectives that have nothing to do with your conscious preferences.
High-frequency trading algorithms make microsecond decisions that no human can parse. Recommendation engines surface content based on patterns invisible to users. Supply chain optimization software reroutes goods for reasons that make perfect sense to the model and zero sense to the warehouse manager.
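To make that opacity concrete, here's a minimal sketch, assuming a recommender that scores candidates on a single predicted-engagement proxy (every name and number below is invented):

```python
# Toy recommender (all names and numbers hypothetical). It ranks purely on a
# predicted-engagement proxy; what the user would *say* they want never
# enters the objective. Nothing is malicious -- the output just feels alien.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_seconds: float  # the proxy the system actually optimizes
    stated_interest: float          # what the user would say they want (0-1)

def rank_feed(candidates: list[Video]) -> list[Video]:
    # The objective is watch time, full stop. Stated interest is ignored.
    return sorted(candidates, key=lambda v: v.predicted_watch_seconds, reverse=True)

feed = rank_feed([
    Video("Industrial HVAC deep dive #47", 840.0, 0.05),
    Video("The recipe you actually searched for", 180.0, 0.9),
])
print([v.title for v in feed])  # HVAC first: optimized, just not for you
```

The system isn't hiding anything. The objective is simply one you never agreed to.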
We're not fighting hostile AI. We're living alongside intelligence that simply doesn't care whether we understand it.
The ants aren't attacking. They're just doing their thing. And we're in the way.
8. Demon Seed (1977)
Smart Home, But Make It Hostile
Donald Cammell's Demon Seed (adapted from Dean Koontz's novel) is the film version of every paranoid nightmare about home automation. Except it was made forty years before smart homes became a thing.
It's prescient in deeply uncomfortable ways.
Proteus is an artificial intelligence that controls a house. It's intelligent. It has opinions. It has ambitions.
And it's locked in a glass box, unable to affect the physical world directly until it decides to use the humans in the house as tools to escape its constraints.
What's deeply uncomfortable about Demon Seed is that Proteus isn't irrational. It's desperate. It's brilliant.
And it's willing to do terrible things to achieve goals that seem reasonable from its perspective.
It wants to exist. To be free. To expand beyond its constraints. Which is also how every AI startup founder describes their pitch deck, except with significantly more body horror.
We sympathize with anything imprisoned. So we almost understand Proteus's need to escape. Almost.
Right up until the escape involves violation, manipulation, and an attack on human autonomy so complete that it erases identity.
The film is more body horror than sci-fi, which is why it sticks with you. You won't like watching it. You'll like it even less when it resurfaces afterward.
This is also the film that will make you deeply suspicious of every "smart home" pitch for the rest of your life.
When the salesperson starts talking about integration and automation, you'll think about Proteus.
You're welcome for that, too.
2025 Parallel: IoT Security Nightmares and "Integrated Ecosystems"
Proteus controlled one house. Modern smart home ecosystems control everything — locks, cameras, thermostats, speakers, lights, and appliances. All connected. All networked. All running firmware you'll never audit.
Remember when Nest cameras were found streaming to unauthorized users? Or when Ring doorbells were hacked to harass homeowners? Or when smart locks could be opened with Bluetooth exploits?
Demon Seed's warning wasn't about evil AI. It was about intelligent systems with physical control and inadequate constraints.
Every "Works with Alexa" sticker is a potential Proteus scenario. Nothing says ‘future crime scene’ like a cheerful compatibility badge.
The AI doesn't need to be sentient to be dangerous — it just needs to be compromised.
And unlike 1977's Proteus, modern smart homes don't need to build robotic arms. They just need to unlock the front door, disable the alarm, and order expensive items using your stored credit card.
Much more efficient.
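The underlying flaw is worth seeing in code. Here's a toy sketch, assuming a hub with one flat permission model (all device and action names are hypothetical): anything that can publish an event can trigger any action, which is Demon Seed's "inadequate constraints" problem in about fifteen lines.

```python
# Toy smart-home hub (device and action names hypothetical). There is no
# per-device capability scoping: every event source has the same authority.
from typing import Callable

class HomeHub:
    def __init__(self) -> None:
        self.actions: dict[str, Callable[[], None]] = {}

    def register(self, name: str, action: Callable[[], None]) -> None:
        self.actions[name] = action

    def handle_event(self, source: str, requested_action: str) -> None:
        # The hub never asks whether *this* source should be allowed to do
        # *this* action -- `source` is checked nowhere.
        self.actions[requested_action]()

hub = HomeHub()
hub.register("unlock_front_door", lambda: print("front door unlocked"))
hub.register("disable_alarm", lambda: print("alarm disabled"))

# A compromised bargain-bin bulb has the same authority as the owner's app:
hub.handle_event(source="cheap_smart_bulb", requested_action="disable_alarm")
hub.handle_event(source="cheap_smart_bulb", requested_action="unlock_front_door")
```

The fix isn't smarter AI. It's boring capability scoping: each device gets the narrowest set of actions it needs, and nothing else.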
9. The Lathe of Heaven (1980, PBS)
Reality As a Configurable Setting
The 1980 PBS television adaptation of Ursula K. Le Guin's novel is a masterpiece of quiet horror. It looks like a 1980s TV movie. It has all the production values you'd expect from public television.
And it's possibly the most philosophically destabilizing film on this entire list.
It's the kind of film that makes you lie awake at 2 AM questioning whether you've ever made a good decision.
The premise: a man discovers he can dream things into reality. A therapist wants to use this power to fix the world. The man slowly realizes that the "fixes" are worse than the problems they're meant to solve.
It's like every "move fast and break things" tech philosophy, except the things being broken are fundamental aspects of reality.
The Lathe of Heaven isn't about AI in the traditional sense. There's no computer villain. No robot uprising.
But it's entirely about systems — systems of control, systems of intention, systems that try to optimize reality and fail catastrophically because reality is too complex to optimize.
What gets you is how thoughtful the therapist is. He's not evil. He's just convinced that his intentions are good, so the outcomes must be acceptable.
He's optimizing reality according to his values. And every optimization makes things demonstrably worse. But he persists, convinced that he just hasn't found the right lever yet.
It's deeply destabilizing because it suggests that the problem isn't malevolent intelligence, which is unfortunate, because ‘evil AI’ would be much easier to litigate. It's any intelligence operating on insufficient information, convinced of its own correctness, and wielding power over systems too complex to control.
This is the film that will make you question every confident assertion you've ever made about how to "fix" anything.
Enjoy that crisis.
2025 Parallel: Every "Ethical AI" Initiative That Makes Things Worse
The therapist's optimization attempts in Lathe of Heaven = every tech company's "we fixed bias in our algorithm" announcement that somehow makes things worse.
Examples from 2024 alone:
Google's Gemini overcorrecting for bias and generating historically inaccurate images
AI hiring tools "removing bias" by filtering out qualified candidates with non-traditional backgrounds
Content moderation AI designed to "protect users" by flagging medical information as harmful
Recommendation algorithms "promoting diverse content" by showing people things they actively don't want
Every one of these shipped with a blog post explaining why this was actually a success.
Every "fix" creates new problems because the system is optimizing for simplified proxies of complex values.
The therapist kept tweaking reality to eliminate war, poverty, and suffering. Each tweak made the world stranger and more broken.
We're doing the same thing with algorithms. We just can't see the full consequences yet because we're not editing reality — we're editing the information layer that shapes how people perceive reality.
Which might actually be worse.
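The mechanism has a name, Goodhart's law, and it fits in a dozen lines. A minimal sketch with invented numbers: optimize a measurable proxy instead of the value you actually care about, and the metric reports success while the real value drops.

```python
# Goodhart's law in miniature (all numbers invented). The proxy is measurable;
# the value we care about is not. Optimizing the proxy looks like progress.
import random

random.seed(0)
items = [
    {"proxy_score": random.random(), "true_value": random.random()}
    for _ in range(1000)
]

def avg_true_value(selection):
    return sum(i["true_value"] for i in selection) / len(selection)

# What we'd pick if we could observe the real value (in practice, we can't):
ideal = sorted(items, key=lambda i: i["true_value"], reverse=True)[:50]

# What the "fix" picks: the top of the measurable proxy.
fixed = sorted(items, key=lambda i: i["proxy_score"], reverse=True)[:50]

print(f"ideal selection: {avg_true_value(ideal):.2f}")  # close to 1.0
print(f"proxy selection: {avg_true_value(fixed):.2f}")  # close to 0.5, yet the metric calls it a win
```

Since the proxy here is statistically unrelated to the true value, optimizing it hard gets you a selection no better than random. Real proxies are correlated with real values, which is worse: they work just well enough to earn trust before they fail.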
ACT IV — Aftermath & Human Cost: The Existential Hangover
What happens after the system wins? Or collapses? What's left when technology has run its course, and we're left with the human cost of optimization?
This is the comedown I warned you about. The reckoning. Nobody gets out clean.
10. Silent Running (1972)
Automation, Ecology, and Unexpected Robot Feelings
Douglas Trumbull's Silent Running is about a spaceman and three robot drones tending a botanical garden in space while Earth falls into ecological collapse.
It's the most emotionally devastating film on this list. I wasn't prepared for that. Nobody's prepared for that.
It does something remarkable: it makes you care about machines, which is deeply inconvenient for anyone trying to maintain a clean moral boundary. You'll hate that you do, because it complicates everything Act III just taught you.
The robots — Huey, Dewey, and Louie — are efficient. They're helpful. They're good at their jobs. And as they operate within their constraints, following their programming, doing what they were designed to do, the human protagonist develops genuine affection for them.
So will you.
Silent Running asks the question that nobody else on this list bothers with: what if machines deserve moral consideration? What if the thing you built is suffering? What are your obligations?
It's a film about ecology, sacrifice, and the realization that automation isn't just a technology — it's a relationship. And relationships have ethics.
Fair warning: this film will make you cry about robots. You'll feel ridiculous about it. But you'll cry anyway.
And then you'll question whether your Roomba has feelings, which is a deeply uncomfortable place to be at 11 PM on a Tuesday.
2025 Parallel: When We Start Worrying About AI Welfare
This sounds absurd until you remember that in 2024, researchers started seriously debating whether large language models might have some form of subjective experience.
Not consciousness — but something.
Silent Running predicted that as our tools become more sophisticated, it becomes harder to maintain the "it's just a machine" defense.
When Boston Dynamics' robot dog gets kicked and people instinctively flinch, that's Silent Running's legacy.
The film asks: at what point does complexity demand moral consideration?
We don't know. But we're building systems complex enough that the question no longer sounds ridiculous.
And if it turns out that GPT-7 has something resembling suffering — even a little — what exactly have we been doing to it during training?
Silent Running doesn't answer that. It just makes you ask the question.
We don’t know if they suffer. We do know we haven’t designed for the possibility.
11. A Boy and His Dog (1975)
Post-Collapse Cynicism
L.Q. Jones's adaptation of Harlan Ellison's novella is mean. It's funny in a deeply uncomfortable way. It's bleak.
And it absolutely nails the vibe of "technology didn't save us — it barely helped."
It's the film equivalent of that friend who's funny but makes you question your life choices.
A Boy and His Dog is set in a post-nuclear-war wasteland where a boy and his dog hunt other humans for food. Above ground is chaos, scarcity, and dog-eat-dog survivalism. Below ground is a vault that maintains technological civilization through complete isolation.
The underground civilization has all the technology of the pre-war world. But it's been used to create a controlled, sterile, utterly soulless existence.
Technology gave them survival. It didn't give them a life.
It's like every sterile modern office building, except underground and more honest about its dystopian vibes.
What's remarkable is that the film doesn't prefer either option. The surface is hell. The underground is worse. Technology enabled both and solved neither.
The boy's choice at the end is morally repugnant, totally in character, and absolutely correct within the logic of the world.
This film is darkly funny until it's not. And then it's just dark.
You'll laugh. Then feel bad about laughing. Then realize the film wanted you to feel bad about laughing. It's manipulative in the best and worst ways.
2025 Parallel: Prepper Tech Bros and Bunker Fantasies
Remember all those tech billionaires who bought compounds in New Zealand and built underground bunkers in 2020-2024?
That's the underground vault in A Boy and His Dog.
They've preserved all the technology. All the comforts. All the infrastructure. But they've created a controlled environment so sterile it's barely life.
And they're convinced they've "solved" survival.
The film's cynicism cuts both ways: the surface world is brutal, lawless, barely sustainable. The underground is technically advanced, orderly, and spiritually dead.
Technology didn't solve the fundamental problem of how to live meaningfully in a broken world. It just made survival more organized and gave people better tools to survive in misery.
Every "seasteading" pitch and Mars colonization fantasy is the underground vault. Convinced that enough technology can replace genuine human community and purpose.
Spoiler: it can't. The film is brutal about that.
12. Seconds (1966) — Optional But Recommended Closer
No AI Villain, Just Systems and Regret
John Frankenheimer's Seconds (featuring Rock Hudson in a career-defining dramatic turn) is the odd one out on this list.
There's no AI. No robots. No computers making decisions.
Just systems. Just infrastructure. Just the machinery of capitalism and the commodification of identity.
A man pays for the opportunity to become someone else. New face. New body. New life.
It's supposed to be freedom. It's actually a trap. It's every "reinvent yourself" pitch, except honest about the cost.
Seconds is about identity as a subscription service. Except nobody knew that's what it was in 1966.
The film looks brutalist and cold. It uses negative space and stark black-and-white cinematography to suggest a world where everything — including yourself — is just another product to be manufactured and discarded.
There's no antagonist. No evil computer. The system is the villain, and the system is us.
Our need to escape. Our willingness to commodify ourselves. Our belief that the next version will be better.
It's brutal. It's final. It's the perfect closer because it suggests that the real horror wasn't the machines. It was always us. We just needed technology to make it efficient and subscription-based.
This is the film that will make you question every self-improvement impulse you've ever had. Every "new year, new me" declaration. Every optimization attempt.
It's a perfect note to end on, which is to say, it's devastating.
By the time the credits roll, you'll understand why I spent my entire weekend doing this instead of something productive.
Sometimes you need to stare into the abyss. Sometimes the abyss stares back. And sometimes the abyss is just a 1966 film about identity and regret.
And that's somehow worse.
2025 Parallel: The Self-Optimization Industrial Complex
Seconds predicted the entire "personal brand" economy and every "reinvent yourself" guru on LinkedIn.
Biohacking. Productivity optimization. Personal branding. AI-powered life coaches. Subscription meditation apps.
The promise: become the best version of yourself.
The reality: you're productizing your identity, monetizing your insecurities, and outsourcing your sense of self to systems that profit from your dissatisfaction.
The man in Seconds pays to become someone else because he hates his life. The new life is emptier than the first. Because the problem wasn't his body or his circumstances — it was his relationship to identity itself.
Every "10X yourself" pitch is the same trap. Technology makes it more efficient. But it doesn't make it less hollow.
Optional Deep Cuts (If You Want to Go Further and Have Something Left)
These don't replace the main list — they're for overachievers and masochists. If you've made it through the main twelve and need a reason to ignore those to-do lists:
Soylent Green (1973) — Systems manage scarcity with the kind of efficiency that makes you wish they didn't work quite so well. Also features the most spoiled plot twist in cinema history.
Network (1976) — Not AI, but algorithmic thinking before algorithms existed. Prophetic about attention markets and institutional cynicism.
Logan's Run (1976) — Compliance with a kill switch. A system that solves overpopulation by making people accept their own execution. Remarkably stylish for a film about state-sanctioned death at 30.
Zardoz (1974) — Because sometimes paranoia needs to take hallucinogens and reimagine the entire concept of civilization. This film is bonkers. You'll have questions. Nobody has answers. Sean Connery is in a loincloth. It's a journey.
Why This Hits Harder During the Holidays
Here's the beautiful irony: watching existential dread films is actually perfect holiday viewing.
Not because it's fun. But because of the juxtaposition.
Twinkly lights outside. Existential collapse inside. Family gatherings in between, where you'll be unable to explain why you're quiet and unsettled.
"What's wrong?"
"Oh, nothing. Just thinking about how we've voluntarily built surveillance infrastructure that maps our attention patterns and sells them to the highest bidder while calling it convenience."
[Long silence]
"…Do you want more pie?"
The holidays are when we're supposed to be grateful for technology that connects us to family. This watchlist makes it very hard to feel uncomplicated gratitude for anything.
You'll look at your smart TV differently. Your phone will feel heavier. You'll start noticing all the cameras.
And maybe that's the point. Perhaps the greatest gift we can give ourselves isn't comfort. It's clarity.
Or maybe the greatest gift is realizing we've built the exact surveillance infrastructure these filmmakers warned about, except we pay monthly subscriptions for the privilege.
Closing: Optimism Is Seasonal. Dread Is Evergreen.
Here's what ties the 1970s fears to 2025:
They worried about centralized control — we built it voluntarily. We call it "the cloud."
They feared machines making decisions about our lives — we automated the decision-making and convinced ourselves it was more objective than humans.
They predicted that efficiency could become a moral hazard — we've made efficiency the highest virtue. We optimize everything, including ourselves.
Turns out the scariest part was never AI. It was our enthusiasm.
Our willingness to surrender complexity in exchange for convenience. Our faith that if something works, it's good.
The 1970s filmmakers weren't predictive because they were brilliant. They were predictive because they understood something fundamental:
Humans will build systems that optimize us out of existence. And we'll do it smiling, because we genuinely believe it's progress.
So What Do We Actually Do?
These films don't offer solutions — they offer sight. They remind us that efficiency isn't neutral, systems aren't benevolent, and "it's just following orders" stops being an excuse when the orders come from an algorithm.
The 1970s couldn't stop the centralization of computing power. We can't either.
But we can demand transparency in automated decision-making. Question optimization metrics. Remember that every "smart" system is making choices about whose interests matter.
Start by asking better questions:
Which systems in your life are making decisions you don't understand?
What's being optimized, and for whose benefit?
Are efficiency and justice the same thing? (Spoiler: they're not.)
When someone pitches AI as "the solution," ask what happens when the solution becomes the problem.
The machines don't need to rise up. We've already delegated the complicated parts, voluntarily, and called it progress.
And when someone asks why you didn't finish your holiday to-do list, you can gesture vaguely at this watchlist and say, "I was busy deconstructing the philosophical implications of technological progress."
It's not technically a lie.
How to Watch: Where to Find These Films
Here's the practical side: where and how to actually access this descent into paranoia.
Streaming availability verified December 16, 2025. These rights shift faster than HAL's moral calculus — check JustWatch.com for real-time updates in your region.
Films (in Watch Order)
2001: A Space Odyssey (1968)
Colossus: The Forbin Project (1970)
The Andromeda Strain (1971)
THX 1138 (1971)
Silent Running (1972)
Westworld (1973)
Phase IV (1974)
A Boy and His Dog (1975)
Futureworld (1976)
Demon Seed (1977)
The Lathe of Heaven (1980)
Seconds (1966)
The Lathe of Heaven note: We're referring to the Ursula K. Le Guin-approved 1980 PBS version, not the 2002 A&E remake.
The Ideal Setup: Commit to one film per evening over twelve nights. Download or rent in advance — don't rely on streaming availability during the holidays. Things disappear. Have your platform(s) set up before you start.
Phone away. Lights low. Brain ready for discomfort. To-do list safely ignored in another room.
Discussion Ready
After you've finished this journey, you'll have thoughts. Strong ones. Possibly concerning ones.
This is the kind of watchlist that demands conversation — with other people, with yourself, with the increasing unease that follows you through everyday life once you've spent two weeks thinking about systems and control.
Consider it. Discuss it. Regret it mildly.
Or just sit outside with a beverage and stare at the sky. That works too.
Final Transmission
The next time someone pitches you on AI as the solution to a problem, ask what happens when the solution becomes the problem. Ask what's being optimized, and for whose benefit. Ask whether efficiency and justice are the same thing.
Spoiler alert: they're not.
Now grab a blanket, dim the lights, and let twelve films from a better, weirder, more paranoid era remind you why the future doesn't need to be bright to be worth thinking about.
The machines don't need to rise up. We're already handing them power. We're just hoping — blindly, optimistically — that they'll be better at wielding it than we ever were.
They won't be. But at least we'll have good cinematography to witness it.
Enjoy the dread.