Steve’s breakdown: A brand agency should give Tesla a call, because “Autopilot” should never have been branded Autopilot. The name is, by definition, incorrect.
PALO ALTO, CA: This spring I road-tested a Tesla Model S P90D in Autopilot mode. The first thing to understand about Autopilot is that it’s not one.
Unlike airliner autopilot systems that allow pilots to set a course so they can work on other tasks, Tesla explicitly warns customers at its stores that its Autopilot is for “driver assistance only.” Drivers must pay attention.
After all, at 30,000 feet there are no semis. No trees. No stoplights. On solid ground, Tesla Autopilot demands the pilot be engaged.
In my half-hour drive on Telegraph and the streets of Oakland County, my hands were never far from the steering wheel. Had they strayed, I would have courted any number of risks, from veering off the road to running red lights.
Sadly, since Autopilot was introduced last fall, some drivers haven’t taken these warnings to heart. In the past week, we’ve learned of a fatality alleged to involve a Tesla Model S on Autopilot in Florida and a major accident involving a Tesla Model X on the Pennsylvania Turnpike.

Blame the drivers, sure, but the automotive press needs to be more cautious about over-hyping the potential of self-driving cars. Truly self-driving cars are years away, and full autonomy may never arrive. And Tesla itself has over-hyped what is clearly a test-phase program in order to get a leg up in the dog-eat-dog luxury segment.
Take the misleading name Autopilot. A quick primer on how it works: Though Autopilot is the most advanced driver-assist feature on the road today, its technology is part of the digital revolution that has transformed autos over the last five years. Most luxury brands (BMW, Audi, Mercedes) employ similar systems using cameras and radar. Even many nonluxury vehicles now offer adaptive cruise control and lane assist as options.
Adaptive cruise control uses radar in the front grille to hold a set speed while maintaining a safe following distance from the car ahead. Add a camera above the rearview mirror and the car can also monitor lane markings to keep itself in its lane.
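The control idea is simple enough to sketch in a few lines. This is an illustrative toy, not Tesla’s (or any automaker’s) actual code: the function name, gains and time-gap constant are all assumptions.

```python
# Toy adaptive cruise-control step. With no car ahead, converge on the
# driver's set speed; with a car ahead, follow it at a time-based gap.
# All constants are illustrative, not taken from any real system.

def cruise_command(set_speed, current_speed, gap, lead_speed,
                   time_gap=2.0, k_speed=0.5, k_gap=0.3):
    """Return a speed adjustment (m/s) for one control step.

    set_speed, current_speed, lead_speed: speeds in m/s
    gap: radar distance to the car ahead (m), or None if none detected
    time_gap: desired following time in seconds
    """
    if gap is None:
        # Open road: close the error between current and set speed.
        return k_speed * (set_speed - current_speed)
    desired_gap = time_gap * current_speed
    # Track the lead car's speed, corrected by the gap error,
    # but never target more than the driver's set speed.
    target = min(set_speed, lead_speed + k_gap * (gap - desired_gap))
    return k_speed * (target - current_speed)
```

With nothing ahead the command is positive below the set speed; tailgating a slower car yields a negative (braking) command.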
With Tesla’s version 7.0 software update, its Model S and X put these driver-assist capabilities on steroids. In addition to radar and front and rear cameras, 12 ultrasonic sensors wrap the car in a 360-degree digital cocoon, monitoring vehicles in front, beside and behind. The software also enables self-parking and the ability to remotely extract the car from a parking space.
But my Model S tester was no Google car. The marshmallow-shaped autobot I drove — more accurately, rode in — at Google headquarters last year was truly autonomous. It didn’t even have a steering wheel. In addition to radar and camera sensors, the Google car is equipped with a roof-mounted lidar that constantly scans its surroundings.
Tesla’s system is less ambitious. Think of it as a safety monitor — or just a cool toy.
For example, Tesla programs its hardware to perform neat tricks like automatic lane changes. On Telegraph, I pulled the left-turn signal — but the Tesla did not immediately react. The ultrasonic sensors had detected a car next to me. Once that car glided past, my Model S switched lanes without me touching the wheel. Neat, but don’t get too comfortable.
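The gate described above — signal requested, but hold position until the adjacent lane reads clear — can be sketched as a single check. The threshold and names here are hypothetical, chosen only to illustrate the logic.

```python
# Toy auto-lane-change gate: the turn signal requests the change, but
# the car steers over only when every ultrasonic reading on that side
# shows clear space. The 3 m threshold is an illustrative guess.

CLEAR_DISTANCE = 3.0  # meters

def may_change_lanes(signal_on, side_readings):
    """side_readings: distances (m) from the ultrasonic sensors on the
    signaled side; float('inf') means nothing detected."""
    if not signal_on:
        return False
    return all(d > CLEAR_DISTANCE for d in side_readings)
```

In my test drive, the car beside me kept the gate closed; once it glided past, every reading opened up and the change went through.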
In fact, I was aware I shouldn’t be relying on Autopilot much at all. “Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel,” warns Tesla’s online manual. The reason became immediately clear as I approached a stoplight. The camera couldn’t see it. I put my hands back on the steering wheel and braked to a stop. No wonder my Tesla contact advised that “we only recommend using Autopilot on the highway” — there are no stoplights on the open road.
Engaging Autopilot again — a simple tug on the cruise-control stalk — I pulled away from the traffic signal and was back up to speed. Then Telegraph’s right-lane marker suddenly disappeared at a neighborhood entrance. “BING!” The Model S’s chime told me the system was confused — and a message in the digital instrument display warned “hold steering wheel.”
You get the idea. At no time was I tempted to text on my phone, much less watch a video or take a nap. Autopilot it is not. But clearly some in Tesla’s army of fanatical customers are willing to act as beta testers. Beta-testing is alien to Detroit’s lawyered-up auto industry, but in the Silicon Valley computer culture out of which Tesla was born, it’s second nature.
“We still think of it as a public beta, so we want people to be quite careful,” Tesla CEO Elon Musk said when unveiling Autopilot last October.
But the consequences of beta-testing automobile software are much greater than, say, the latest version of “World of Warcraft.”
“Slow-moving gridlocked traffic on Autopilot works super well,” Musk enthused last fall, “almost to the point where you can take your hands off. I won’t say you should. Some people may — we don’t advise that.”
But with a name like Autopilot, I can see where folks might be tempted.