Monday, 11 July 2016

Why You Should Wait A Year Or Two Before Rushing Out To Buy That Delicious Tesla


My wife sometimes complains that I rely too much on cruise control; that I'm putting my confidence in a computer that could malfunction in a heartbeat and send the vehicle hurtling into the back end of the one in front of us. I like cruise control: I can stretch out my legs and avoid that cramping sensation between the top of my foot and my ankle, and I can cruise along at a consistent speed. Following behind someone who's also on cruise control is comfortable, too, because I can act on the assumption that, barring sudden debris or a wandering animal crossing the road, the vehicle ahead will hold a consistent speed. It's frustrating to drive behind someone who pumps their foot on the gas.

I do place a great deal of trust in my vehicle's cruise control; but how much more trust must one put in a self-driving system like the Autopilot in Tesla's Model S? With Autopilot engaged, you can pretty much let the car do the driving. You're still required to keep your hands near the wheel and your eyes on the road; you're not yet permitted to fall asleep in the driver's seat and wake up at your destination. You are, after all, relying on a machine.

Our trust in Tesla's Autopilot system was put to the test this week, when a Model X, allegedly on Autopilot, veered out of control and crashed into a guard rail. This comes on the heels of a May 7th fatality in which a Tesla Model S went under the trailer of an 18-wheeler semi and had its roof torn off on impact. The car then veered off the road before slamming through two fences and into a power pole.

When you download the latest OS X release from Apple, for example, you can expect a number of bugs that will be worked out over time; each upgrade you install is a better iteration of the operating system. But what happens when the buggy operating system is controlling a passenger vehicle traveling at 100 km/h? This is the issue with self-driving vehicles: the technology can't advance without the bugs being worked out, and the bugs can't be worked out without crashes of one kind or another. Does the bug issue justify the loss of lives, or of a $100K vehicle? Certainly not. But we have to accept a level of risk when we entrust our own and our passengers' lives to a technology this new. Will the bugs get worked out? Over time, yes; eventually to such an extent that these vehicles will be considered safer than human-operated ones.
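To make the analogy concrete, here is a minimal sketch of the kind of guarded over-the-air update pattern that safety-critical systems commonly use: verify the downloaded image, install it alongside the known-good version, and roll back if it misbehaves. Every name in it (check_signature, run_health_checks, and so on) is a hypothetical illustration of the general pattern, not Tesla's or Apple's actual update mechanism.

```python
# A minimal, hypothetical sketch of a guarded over-the-air (OTA) update.
# None of these names reflect a real vendor's API; they illustrate the
# general verify-install-check-rollback pattern.

import hashlib

def check_signature(image: bytes, expected_sha256: str) -> bool:
    """Verify the downloaded firmware image against a published hash."""
    return hashlib.sha256(image).hexdigest() == expected_sha256

def apply_update(image: bytes, expected_sha256: str,
                 install, run_health_checks, rollback) -> bool:
    """Install an update only if it verifies, and roll back if it misbehaves."""
    if not check_signature(image, expected_sha256):
        return False             # refuse a corrupted or tampered image
    install(image)               # write to the inactive partition (A/B scheme)
    if not run_health_checks():  # e.g. sensors respond, control loop is stable
        rollback()               # reboot into the previous, known-good version
        return False
    return True
```

The point of the pattern is that a bad build never becomes the only build on the vehicle; but health checks can only catch the failures someone thought to test for, which is exactly why bugs still reach the road.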

Regarding the Florida fatality, I find it hard to believe that it took almost two months for Tesla to disclose the accident publicly. It seems a similar reflex to Google's when its self-driving car collided with a bus: blame the operator, not the system itself.

These technologies are exciting and futuristic, and they dramatically raise the comfort of driving; however, there is a price to pay for the technology to advance, and early adopters of self-driving cars must expect to encounter bugs, because they are testing the product. Indeed, driving a Tesla right now is an extended test run: the vehicle is recording your data, and engineers, software and otherwise, are analyzing it to make corrections and eliminate bugs in the system. Glitches are part of the process; unfortunately, those glitches can cost lives.

For now, I'll stick to my van, my cruise control, and, endearing as ever, my wife's periodic disparagement.



