In Apple TV’s ‘Pluribus,’ the biggest ethical dilemmas ‘are our fault,’ a philosopher says
Could the apocalypse actually be a good thing for humanity? A Northeastern University philosopher and ethicist digs into the heady questions posed by Apple TV’s new hit science fiction show.

Spoiler alert: This story contains details about the first five episodes of “Pluribus.”
In “Pluribus,” Apple TV’s new hit science fiction show, the apocalypse seems downright cheery.
After humanity receives a signal from outer space, scientists accidentally manufacture an alien virus that infects the world and turns almost everyone into part of a peaceful hive mind. The contented collective consciousness seemingly has only one goal: tend to the needs of the few unaffected people left on the planet. For one of those people, author Carol Sturka (Rhea Seehorn), that might be even worse than all the death and destruction that came with the birth of the hive mind.
Take away all the mystery, drama and dark humor from “Pluribus,” and at its core is the kind of ethical dilemma that defines the best sci-fi stories. Would Carol, and humanity with her, be better off joining the collective, or is she right to cling stubbornly to her individuality?
It’s a complex question without an easy answer, and it’s not the only one the show poses. To understand more about the ethics of “Pluribus” and why it has drawn such huge interest, Northeastern Global News sat down with John Basl, an associate professor of philosophy at Northeastern University.
“For me, when I’m thinking about all the different questions [‘Pluribus’] is asking, it’s about how do we navigate very difficult tradeoffs for which we don’t have any answers?” Basl said. “How do we think about the different dimensions of ethics when we’re trying to navigate big and small choices?”

Although Carol is the protagonist, the show’s writers poke and prod at her worldview in ways that complicate the assumption that she is right about wanting to return the world to the way it was before the virus.
“There’s no doubt that when the world gets taken over by zombies, that you don’t want to turn into a zombie,” “Pluribus” creator Vince Gilligan said in a video released by Apple. “But in this show, with what happens, I hope you kind of ask yourself, ‘Maybe this isn’t so bad.’ I think everybody is going to have their own opinion. To me, that’s the best part.”
Despite the complex ethical questions “Pluribus” poses, Basl is resolute on one piece of the philosophical puzzle.
“Everyone who has become part of the collective has died,” Basl said. “The thing that they become might have a better overall life. It might be good for me to be radically cognitively different than I am … but it won’t be me.”
For Basl, “Pluribus” tests the boundaries between the needs of the individual and the needs of the many in a way that complicates both sides of that binary.
Unlike humanity before the virus, the Others, as members of the collective are known in the show, are nonviolent in the extreme; they won’t even hurt a fly. In a post-Others world, there is no war and no crime. However, when the hive mind encounters negative emotions, every member of the collective on Earth is immediately incapacitated, shaking violently as if in the grip of a seizure.
As officially the most miserable person on the planet, Carol ends up killing millions of people just by yelling at one of the Others. For Carol and the audience, it’s the deadliest possible lesson in how every action has a consequence and every person is part of a whole.
“My actions as the individual, when they’re negative, can really harm a huge amount of people, the collective,” Basl said. “What does that do to our obligations?”
For Basl, an AI ethicist, it’s impossible to think about these questions and not read “Pluribus” as a metaphor for AI. The constantly upbeat, flattering way the Others talk mirrors AI sycophancy. The way the Others share knowledge evokes real-world conversations about the potential for AI to become superintelligent and surpass human cognitive abilities, Basl said.
Like AI, the virus in “Pluribus” exists only because of human ingenuity. Human-made satellites detect the alien signal; human-run labs create the virus.
“All the philosophical challenges that arise in the show are our fault,” Basl said.
However, Basl mostly sees the show as an analogue for the paperclip maximizer, a thought experiment coined in 2003 by philosopher Nick Bostrom. In this scenario, humanity tasks an AI superintelligence with maximizing paperclip production.
“Then the machine might reason, ‘What’s the biggest risk to paper clip production? People might turn me off. So, I’ll just kill them all,’” Basl said. “The idea is that it carrying out its task might lead to the destruction of us even if that’s not its intention.”
Similarly, the Others view themselves as a more peaceful alternative to humanity. They don’t kill each other, they don’t harm animals or the planet, and they are totally unified. Yet there is so much that Carol, and the audience, don’t know about the Others. They can suffer, but can they experience joy? What do they want other than to persist?
So, is it truly better for Carol to give in and join, or to remain stubbornly herself? According to Basl, the existential and ethical tradeoffs in each case are too significant, and too personal, for there to be a right answer. That’s what makes “Pluribus” sing.
“For me, what hits hardest is appreciating all the little value tradeoffs and the big value tradeoffs,” Basl said. “For a woman who wants to join the collective because her child’s part of the collective, that’s a very understandable instinct. … There’s the small but deeply important ethical issues in this broader ethical issue.”