This short but fascinating article gives a quick overview of the decisions programmers will have to make when coding self-driving cars. If an emergency arises, should the car save you or save others? For example, if your auto-drive car calculates that you will kill twenty pedestrians if you swerve in one direction, while you will be slaughtered by an oncoming Mack truck if you swerve in the other, it might just decide it would rather have you bite the dust than those twenty people on the sidewalk.
The car has all the Vulcan logic of Mr. Spock in The Wrath of Khan. But do people really want cars making those decisions, or do we want our own human preservation instinct to kick in when these situations arise? It’s a question programmers have to ask (AND answer, which is the hard part. Asking is easy. I just did that, and wow, was it really, really easy. I can ask other questions, too, if you like. But you’d probably think I was digressing, so I won’t ask any more right now.)
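To make the dilemma concrete, here’s a minimal sketch of what a purely utilitarian “needs of the many” policy might look like. To be clear: the names here (Maneuver, choose_maneuver, and so on) are my own invention for illustration, not anything from a real autopilot codebase.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    """One possible evasive action and its predicted outcome."""
    name: str
    pedestrian_casualties: int  # predicted deaths outside the car
    occupant_casualties: int    # predicted deaths inside the car

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    """Pure Spock logic: pick whatever kills the fewest people,
    counting the car's occupants no differently than anyone else."""
    return min(options, key=lambda m: m.pedestrian_casualties + m.occupant_casualties)

# The scenario from above: swerve one way into twenty pedestrians,
# or swerve the other way into the Mack truck and sacrifice the driver.
options = [
    Maneuver("swerve left", pedestrian_casualties=20, occupant_casualties=0),
    Maneuver("swerve right", pedestrian_casualties=0, occupant_casualties=1),
]
print(choose_maneuver(options).name)  # -> "swerve right" (sorry, driver)
```

Notice that the tally weighs your life exactly the same as everyone else’s. The whole debate is really over whether that sum should be unweighted, or whether your car should quietly put a thumb on the scale for the person who bought it.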
Since I’m not digressing, let’s talk about how this relates to Life First. In the novel, the society of the future is all about preserving life as a whole. They’ve faced the edge of extinction and lived to tell the tale. So their belief system would be very much like Mr. Spock’s, who said, “The needs of the many outweigh the needs of the few.” (For those who bear the shame of never having seen Star Trek II: The Wrath of Khan, Mr. Spock gets radiation poisoning when he repairs the warp drive, saving all the lives aboard the ship. As an aside, you don’t have to wear that shame forever. Go watch the movie.)
So, in the Federation of Surviving States, where Life First takes place, it is quite clear what they’d program the cars to do. If more lives would be saved by you dying, then it’s time for you to say, “Sayonara.” 🙁
Gee, that feels like a downer (not the saving-lives part; the dying part). However, we’re not quite to the point where automated cars are in every garage. At the moment, only Google is publicly road-testing self-driving cars. To end this post, here are a couple of questions (since we’ve already established it’s easy for me to ask them): Would you ride in a car programmed to kill you if that’s what saved more lives? And do you think programmers should make these choices, or should they punt and offer a manual override (sketched below), so you have to make the call right as you’re soiling your pants because your car is hurtling toward doom at 60 mph?
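If they punt, the earlier sketch might grow a handoff step; something like the hypothetical below, where resolve_emergency and driver_input are, again, names I’m making up rather than anything from a real autopilot:

```python
def resolve_emergency(options, driver_input=None):
    """The 'punt' option: honor whatever maneuver the human managed
    to pick in time; if they froze (driver_input is None), fall back
    to the Spock logic above. Reuses the Maneuver class from the
    earlier sketch."""
    if driver_input is not None:
        return driver_input  # human preservation instinct gets the wheel
    # No override arrived before impact: minimize total casualties.
    return min(options, key=lambda m: m.pedestrian_casualties + m.occupant_casualties)
```

Either way, somebody decides: the programmer, calmly, years in advance, or you, mid-scream, with about half a second to spare.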