
Did you hear? Self-driving cars are here! Or at least that’s what the headlines suggested last week, as Tesla released an update for their pricey cars to enable a new “autopilot” system. As it turns out, the new system isn’t new – but the deployment is. And, unfortunately, that’s what makes it so dangerous. Tesla is setting expectations for their “autopilot” system much higher than what the car can actually do – and someone is probably going to get killed.
Let’s start by reviewing some of the Friday headlines:
- Welcome to the self-driving car revolution: Tesla releases Autopilot patch (Computer World)
- Tesla rolls out self-driving car software (FT.com)
- Tesla cars gain self-driving sentience overnight (WaPo)
- With New Software Rollout, Tesla Accelerates Toward Fully Self-Driving Cars (Recode)
- Tesla amps up Model S self-driving capabilities with firmware update (Extreme Tech)
The Washington Post had an especially starry-eyed (and factually incorrect) write-up, which reads like a PR dream:
With one over-the-air update Wednesday night, Tesla Motors has brought a new breed of self-driving car to American roads.
Tens of thousands of Tesla’s all-electric sedan, the Model S, bought in the U.S. over the last year have already started downloading or installing “Autopilot” mode, one of the first great breakthroughs for making the kind of driverless magic seen mostly in Google-car demos.
With “Autopilot,” the Tesla S can steer, change lanes and drive at highway speeds with little to no help from the human behind the wheel. It can parallel park, using its banks of cameras and sensors, and slow to a stop if the driver happens to drift asleep. In the next update, it may even be able to rouse itself from its parking space and pick the driver up.
Washington Post
Wow!
But hold on a second. Is this stuff actually new? And is it as great as Tesla is making it out to be?
As it turns out, the software package isn’t quite the revolution that Tesla’s PR buzz would have you believe. Essentially, the technology has been around for a good five years, sold under different marketing names, and available almost exclusively in high-end German luxury cars.
While features like adaptive cruise control and active lane assist have been around for a few years, it was in 2013 when Mercedes bundled them all together in one “self-driving” package.
Competing carmakers offer some of the same assistance and safety features. But only Mercedes has integrated the sensors, controls and 36 separate technologies — of which 11 are new or updated — to work together in a bundle, which it calls Intelligent Drive. One Daimler executive said the system has been in the works for 15 years.
Probably the most advanced feature is traffic-jam assist, which rivals are scrambling to get into their vehicles. In congested traffic, a driver can let the car steer, brake and accelerate itself at speeds lower than 37 mph.
Combining the features to work together is the tricky part, says Richard Wallace, director of transportation systems analysis at the Center for Automotive Research in Ann Arbor, Mich. “Sophistication is needed to integrate these features. … They do seem to be ahead of the pack,” Wallace said. “It is easier said than done.”
Intelligent Drive includes systems to help prevent collisions, a pedestrian and animal recognition feature, lane keeping, parking assistance, rear-crash monitoring, crosswind stabilization, distance control, night vision and a suspension that automatically adjusts before the car hits an imperfection on the road.
Automotive News
In fact, the Mercedes system still offers a lot more than what Tesla rolled out, specifically the pedestrian detection, night vision, and crosswind stuff.
If Mercedes isn’t your cup of tea, here’s an Infiniti doing some self-driving over a year ago.
https://www.youtube.com/watch?v=zY_zqEmKV1k
So what makes the Tesla launch special?
1) It’s coming out of Silicon Valley, so it MUST be revolutionary.
2) Tesla is less concerned about safety than the established automakers
To the first point: it’s an unfortunate trend we see in tech reporting – which is where Tesla is generally covered, rather than by the established automotive press. It’s always report first, ask questions later. The website that gets the PR out first wins. If the PR says it’s revolutionary, then damn it, it must be.
It reminds me of how Uber introduced the revolutionary idea of “collection points” to save everybody time and money by making the trip more efficient. Because it was “introduced” by a hip tech company, it is apparently the next big thing. Of course, one might point out that the concept was actually pioneered by the public buses and their “bus stops.”
One might also be reminded of the hysteria around the Hyperloop announcement.
But that’s not really important here. What is important is that Tesla’s hype, plus a tech audience, might lead to some very dangerous results – especially because, unlike the German companies, Tesla seems to not give a crap about safety.
Tesla released their new software update with three important legal caveats:
1) It is a beta test
2) It should only be used on limited access highways
3) Drivers should keep their hands on the steering wheel
Or so they claim. But that’s not what’s happening.
1) A beta test implies a limited release – not software pushed to EVERY car in the fleet, as Tesla did.
2) If Tesla cared about this “highway only” limitation, they would use their great software and embedded navigation system to only allow autopilot activation on limited-access highways – especially because the definition of “highway” varies greatly. Instead, they’ve put zero limits in place (see the sketch after this list). Combine that with the PR tour (journalists were taken on a test of the software on urban streets), and they’re encouraging widespread use of the software package in all scenarios.
3) If Tesla cared about drivers keeping their hands on the wheel, they would require it with a sensor. Many of the existing German systems will disengage autopilot if they do not sense hands on the wheel, because, you know, they’re serious about this point. And again, Tesla did PR where “hands free” was a feature.
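To be clear about how little this would take: both fail-safes boil down to a simple gate in front of the autopilot logic. Here’s a rough sketch – purely illustrative Python, with made-up names like `road_class` and `hands_off_seconds`, and nothing to do with Tesla’s actual software – of the kind of checks I mean:

```python
# Purely illustrative sketch – NOT Tesla's code. Assumes the navigation map
# can classify the current road, and that a steering sensor reports how long
# it has been since hands were last detected on the wheel.
from dataclasses import dataclass

HANDS_OFF_LIMIT_S = 10.0  # assumed grace period before forcing a disengage


@dataclass
class VehicleState:
    road_class: str           # e.g. "motorway" vs. "residential", from the map
    hands_off_seconds: float  # time since hands were last detected on the wheel


def may_engage_autopilot(state: VehicleState) -> bool:
    """Geofence check: only allow activation on limited-access highways."""
    return state.road_class == "motorway"


def should_disengage(state: VehicleState) -> bool:
    """Disengage (with a warning) if the car leaves the highway or the driver
    keeps their hands off the wheel too long – the way the German
    driver-assist systems described above already behave."""
    return (state.road_class != "motorway"
            or state.hands_off_seconds > HANDS_OFF_LIMIT_S)
```

The point isn’t that this particular code is what anyone should ship – it’s that gating the feature on map data and a hands-on-the-wheel sensor is the kind of thing the German systems already do, and Tesla chose not to.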
Also alarming is the way this update was sent out. It was released on a Friday, with huge PR buzz, pushed to every vehicle. The natural result is that Tesla owners jumped into their cars to test it out.
This is especially problematic because, as a “cool tech gadget,” a Tesla may be seen and treated like a toy – rather than heavy machinery.
Browsing through the Tesla subreddit reveals the glee with which owners are experimenting with the tech. Not just on limited-access highways, of course, but EVERYWHERE.
Autosteer, by contrast, is designed for freeway use at this point, but I engage it everywhere I possibly can – to learn the limits (and because it’s fun).
Formidable Ventures
Here’s a video of someone taking it onto a bi-directional suburban road:
https://youtu.be/tH1ipC4MBZ0?t=3m47s
While the video looks great, it was day one, when the driver was paying full attention. What happens in two weeks, when the driver gets used to the product and dozes off?
Remember, the car does not respond to traffic signs or signals. If there is no car stopping in front of it (5:38), it will blow through a stop sign or light. At 6:36, you see the car does not know what to do at the roundabout – the driver has to immediately take control. Later in the video, he slows the car down to see if it will manage another roundabout. Nope – it immediately starts accelerating, and he has to take over.
Here’s some more “fun”
A driver on a bidirectional roadway, at night, in the rain, spends about 20 seconds looking at the monitor trying to turn the feature on. Yes, while moving – and that’s before the software is even doing anything!
https://youtu.be/gYBV9Dqmsxs?t=41s
So much fun!
The reports of alarming situations are not limited to user error, as in the case of that video.
One YouTube video shows a potentially fatal collision narrowly avoided – by the driver
After several seconds of successful hands-free driving, I admit I started to ignore the warning to keep my hands on the wheel so that I could record the moment to share with friends. That’s when all hell broke loose. My car was tracking the car in front of me to successfully steer, but I was going slower and that car eventually drifted far ahead of me. Shortly after that, another car was coming in my car’s direction from the opposite side of the road. I can only guess at what happened next. My car apparently thought the oncoming car was supposed to be followed, and suddenly veered to the left, crossing the double-yellow road divider line right into its path. Had I not reacted quickly to jerk the steering wheel in the opposite direction, a devastating head-on collision would have occurred.
https://www.youtube.com/watch?v=MrwxEX8qOxA
Yup, he wasn’t on a limited access highway. But again, isn’t this something that Tesla can control? Shouldn’t the navigation system make it clear they’re not on an interstate? Is that not terrifying?
Another owner talks about some of the hazardous situations the car can put itself into:
If for some reason it loses the lane markings, it falls back to following the car in front of you. It can lose the lane markings because they don’t exist (in intersections) or because it can’t see them (deep shadows, cresting a hill, or faded markings.)
When it makes this transition, if the car in front of you does something dumb, you’re going too. I’ve had this happen in intersections where people make illegal lane changes in the intersection – the Model S tries to follow them. Also, if you have close oncoming traffic it could mistake an oncoming car as the car to follow. That’s where it gets bad and you have to intervene.
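What the owner is describing is essentially a priority order: steer by the lane markings when they’re visible, otherwise steer by whatever the sensors think is the lead vehicle. A toy sketch (again, illustrative Python only – not Tesla’s implementation, and the names are made up) shows why that second branch is the scary one:

```python
# Toy sketch of the fallback behavior the owner describes – not Tesla's
# implementation. The hazard is the second branch: the "lead vehicle" can be
# a car making an illegal lane change, or even oncoming traffic, and it gets
# tracked anyway.

def choose_steering_target(lane_markings, lead_vehicle):
    """Pick what to steer by, in priority order."""
    if lane_markings is not None:          # markings visible: follow the lane
        return ("lane", lane_markings)
    if lead_vehicle is not None:           # markings lost: follow the lead car,
        return ("vehicle", lead_vehicle)   # whatever that car happens to be doing
    return ("handover", None)              # nothing left to track: driver takes over
```

The fallback itself is reasonable in a traffic jam; it becomes a problem when the “lead car” is making an illegal lane change or coming at you head-on, which is exactly what the near-miss video above shows.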
Another common problem seems to be in its natural habitat – the limited-access highway. Driving along briskly at 65 mph on a highway, the car decides to follow the fog line straight into a low-speed exit – without slowing down.
“I hope the next update will teach AP about how offramps work. Most offramps required me to place my hands on the wheel and force it to stay on the highway.” 1
“The only surprise was when I was in the right lane on the freeway, the car took the next exit by itself, following the righthand line!” 2
“Stay out of far right line, or it does like to exit the freeway. With a car in front of me it stayed on, but once tracking by itself it would leave if the chance came up.” 3
“But it might be a bit of a surprise to the driver, since it maintains highway speeds on the offramp. When driving on a freeway which terminated into a hard turn and conversion to a surface street, the autopilot seemingly kept continuing on at freeway speeds. I turned off autopilot before risking a crash.” 4
Watch this driver almost crash on his off-ramp, as the Tesla appears to think the landscaped berm is part of the highway.
https://youtu.be/oIxNgts8y_E?t=2m52s
One sharp-eyed reader noticed that the Tesla website advertising the feature was stripped of the following sentence right before release:
Standard equipment safety features are constantly monitoring stop signs, traffic signals and pedestrians, as well as for unintentional lane changes.
That’s right. The self-driving Tesla does NOT respond to stop signs, traffic signals, or, apparently, pedestrians. And that limitation was communicated only by omission – someone who has been following Tesla news might well think this update includes all the originally advertised features.
I haven’t seen anything about bicycles yet, but one of the off-ramp anecdotes hints at potential problems with the car and bike lanes.
Potential bike lane failure opportunity:
“Most offramps were preceded by dashed white lines, and it happily drifted over those lines.” 5
Autopilot is not quite ready for prime time. That’s understandable. What’s not understandable is pretending it is, and releasing the update in such an irresponsible way, with no fail-safes. Sure, if anything happens, Tesla will point to the fine print. Legally, that might help them out. But sadly, the marketing team isn’t quite on the same page.
After all, just last week, the Tesla marketing team visited automobile magazines around the country and took them on test drives – on urban (non-highway) roads, where the journalists were allowed to ride with no hands on the wheel.
Here’s another test drive, done by Tesla, with CNET, showing them encouraging the same dangerous behavior – urban road, hands and feet off.
That looks like an endorsement of the behavior to me.
And that’s extremely dangerous.
Incidentally, owners of the luxury German cars with similar software were not immune to abusing their systems:
Romanian police have issued a warning over reckless motorists using autopilot systems to let cars drive on their own following a new, worrying trend. The warning came after a shocking video showing a ‘driver’ sitting in the backseat of his car while driving at 40mph as his terrified girlfriend screams from the passenger seat. The Mercedes S Class in the footage is fitted with dozens of sensors which read the road ahead to adjust the steering, speed and brakes. The system only works while the driver’s hand is on the wheel, but it appears the sensor is easily fooled when any other object, such as a bottle, is hung there instead. Speaking in Romanian, the man is heard telling his girlfriend not to worry, and boasting that the car can drive itself.
Daily Mail (sorry) with video – from January 2015
But again, these companies actually built in various fail-safes, like sensors requiring hands on the wheel, and advertise their product as driver assistance – not autopilot. Look again at the image of the Mercedes: every feature “helps” the driver, or is an “assist” or “protection” – not a “pilot” or “self-driving”.
Language is important in setting expectations. Mercedes nailed the language from a safety perspective, but it looks like Tesla values press over safety. And so far it has worked – outlets like the Washington Post truly believe Tesla invented this stuff. I just hope the next stage of this “beta” doesn’t involve a fatality.
The most worrying thing is the degree to which Musk appears to believe his own blathering (on this and many other issues), even when it's obvious to anyone with a clue that he's completely full of it….
[Elon, you know you're not actually Tony Stark, right?]