
The moral of the story in John Marrs’ 2019 near-future book, The Passengers, is that artificial intelligence isn’t actually the problem per se; it’s still us, the humans — particularly, and importantly, the humans with the power of government behind them. At least, that’s my takeaway from Marrs’ book. The worst villain in the book isn’t some AI robot gaining sentience to harm humans, but a government bureaucrat enriching himself. So, I didn’t leave Marrs’ book so much worried about AI-backed vehicles — although there are certainly issues to consider — as reaffirmed in my distrust of giving fallible human beings the power governments afford them to wield over others.
In The Passengers, we have a juicy near-future premise: Eight people trapped in what are considered Level 5 AI-vehicles — these are vehicles without even manual overrides, such as steering wheels and brakes, which doesn’t make much sense to me! — ranging from a 78-year-old TV star to a pregnant young woman, who are told that in two and a half hours, they will die. Oh, and it’s all being broadcast to billions of people the world over via traditional media and social media. Set in Britain, the leader in the “Road Revolution,” as it’s called, the book introduces us to a “jury” known as the Vehicle Inquest Jury, a brilliant little red-tape addition by Marrs, which is tasked with judging ethically and morally whether an AI-vehicle or the human was at fault in a car crash. However, the jury seems like a sham, since the human is almost always found to be at fault. Libby, the one non-governmental, non-medical, non-religious pluralist on the jury, is already predisposed to distrust AI-vehicles after seeing an AI-vehicle swerve and kill three generations of a family in front of her. That’s the utilitarian calculation an AI-vehicle always makes: protect the Passenger (the person in the vehicle) and/or others if the cost of killing someone else is lower.

Of course, my natural ethical question — and I know this is all an elongated version of the classic trolley problem — is, why is this a binary ethical question? For philosophical purposes, we can, of course, reduce it to a fun trolley problem that is binary, but the real world is never binary. The real world is never “the ticking time bomb is about to go off, so we must torture the prisoner or else.” The real world is more complex, and thus, more philosophically interesting in that way! But I digress.
What becomes interesting is that the Hacker who has “hacked” these AI-vehicles and is sending these eight people to their death then makes this impaneled jury decide who to save and who to send to their fiery death. They can only save one. The Hacker does so in the way it appears the jury was already doing in actual cases: based on only the slimmest of information known about the Passengers, i.e., the occupants. First, there’s Claire Arden, the pregnant woman. We later learn in a twist that her husband is dead in her “boot,” aka trunk. But then we learn he had an aneurysm in his brain ready to blow, and he had told her that, if it blew, she should take his body to work to get a better insurance payout. So, even that “twist” needed more context to understand. Then, there’s Jude Harrison, who is suicidal and, we later learn, the object of Libby’s affection after a one-off bar encounter (a bit weird how infatuated she and he are with each other after a single meeting!); still later, we learn that Jude never existed and is probably the Hacker (actually, it’s his brother, Alex, out for revenge as the lover of one of the victims of the crash Libby witnessed). Sofia Bradbury is our aging TV actress, who spends the first chunk of the book thinking she’s actually on a reality television series and none of this is real, when in fact, she is on a reality television series in a manner of speaking, but it is all too real. Keeping the twist theme going, it’s revealed by the Hacker that Sofia protected her pedophile husband primarily to protect her own career, even though the pedophile husband preyed on her own niece. There is not much further context that can abate that crap storm. She later kills herself instead of facing the public.
Sam and Heidi Cole are a fun pair in a macabre way because they’re a married couple, and we learn that Sam is married to another woman (and married her at virtually the same time, thanks to another near-future dystopian subplot: the ability to find your “soulmate” through DNA matching) and, to keep his brain straight, when he had a boy and girl with the second wife, he gave them the same names as his boy and girl with the first wife! What a sleazeball, and in her grief and distress, Heidi, a police officer, blackmailed him using her police resources. The last three we don’t spend much time with: Shabana Khartri, a woman fleeing a domestic abuse situation, is killed almost instantly by the Hacker, as is a double-amputee war veteran, and a Somali refugee with five children to care for. The latter three don’t even get twist reveals to show they are complicated figures; they’re just completely sympathetic characters, if you’re a normal person. Of course, Jack, the member of Parliament, and others on social media find fault with them: Shabana doesn’t speak English despite being in Britain for decades, the Somali immigrant … is an immigrant, and, well, I guess nothing much was said negatively about the double-amputee.
In the end, the jury decides to save Claire because her water breaks and the birth of the child she’s nicknamed Tate is imminent, as is the supposed collision the Hacker is sending them toward. But before that, we learn that, according to the Hacker, Jack and the British government have effectively been playing God with the AI-vehicles, determining — in the most charitable reading, in a Platonic way, and in the worst, as Libby accuses him, in a Nazi eugenics way — who is best fit to survive for the betterment of society, like saving someone in the military over a drug user. That’s where the whole “it’s not actually the robots’ fault” part I mentioned at the top comes into play; it’s the humans.
Anyhow, the Hacker, as I was starting to suspect, doesn’t make all of them collide and die. Instead, those vehicles stop, and the Hacker makes all of the other vehicles on the road collide with each other, causing more than 1,000 deaths and more than 4,000 injuries. Libby goes on to continue crusading against AI-vehicles and agitating for more transparency, and Claire leverages the terrorist ordeal into media fame and money. As I mentioned, we learn that it’s “Jude’s” brother, Alex, behind it all, and at the end of the book, Libby confronts him and, with the help of the police, he’s killed. That then leads to the arrest of the Hacker Collective, apparently a group of people who helped Alex. My one lingering question: The Hacker threatened that if the British government tried to stop the hacked AI-vehicles, they would explode and/or schools throughout Britain would be bombed, and if they tried to evacuate the schools, nail bombs would go off. Did anyone ever check whether that threat was legitimate?! And if it was, how could the Hackers even pull that off? It never gets brought back up. Not to mention, it’s never addressed how the Hackers pulled off infiltrating the government’s top-secret jury room to begin with, but I suppose if they could hack the apparently unhackable (like unsinkable!) AI-vehicles, then they could hack an offsite, secret government jury room.
Overall, this was a fast-paced (short chapters, often with fun graphic introductions, like what was being said on social media or in the news), fun near-future thriller I didn’t take too seriously, but I also appreciated it for not being the typical “The robots are going to kill us all!” book. Rather, again, it’s the humans who are the problem, after all. This was my first Marrs book, and I look forward to reading more, and soon, because I also bought his book The One.

