Tesla upgrades its Autopilot

Tesla meets UFO (image via Tesla)

The new system should be able to detect "a UFO in dense fog."

A driver of a Tesla running on Autopilot was killed in May in a side underride crash. The Autopilot didn't recognize the side of the truck; the driver was likely not keeping his hands on the steering wheel, as you are supposed to do even on Autopilot; and none of the safety devices built into the Tesla helped, because that's what happens in side underride crashes.

One might wonder whether this would make Tesla reconsider whether Autopilot is actually ready for prime time, but Tesla is not giving up on it; they are introducing an upgrade in October with "more advanced signal processing to create a picture of the world using the onboard radar." Tesla explains that previously the camera did most of the work, but that now they have figured out how to put the radar to better use.

After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar. Photons of that wavelength travel easily through fog, dust, rain and snow, but anything metallic looks like a mirror. The radar can see people, but they appear partially translucent. Something made of wood or painted plastic, though opaque to a person, is almost as transparent as glass to radar. On the other hand, any metal surface with a dish shape is not only reflective, but also amplifies the reflected signal to many times its actual size. A discarded soda can on the road, with its concave bottom facing towards you can appear to be a large and dangerous obstacle, but you would definitely not want to slam on the brakes to avoid it.
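To make that false-positive problem concrete, here is a toy sketch (entirely my own illustration, not Tesla's code, with hypothetical names and numbers): a naive "brake if the reflection is strong" rule cannot tell a truck from a soda can, because the can's concave bottom amplifies the return far beyond its physical size.

```python
# Toy illustration (not Tesla code): a naive power threshold cannot tell a
# dangerous obstacle from a harmless reflector, because a concave metal
# surface amplifies the radar return far beyond the object's physical size.

from dataclasses import dataclass


@dataclass
class RadarTarget:
    label: str               # only for the demo printout
    apparent_rcs_m2: float   # apparent radar cross-section (can be wildly inflated)
    physical_size_m: float   # true extent, which radar-only logic does not know


def naive_should_brake(target: RadarTarget, rcs_threshold: float = 5.0) -> bool:
    """Brake whenever the reflection looks big -- the failure mode in the text."""
    return target.apparent_rcs_m2 >= rcs_threshold


targets = [
    RadarTarget("truck trailer", apparent_rcs_m2=50.0, physical_size_m=15.0),
    RadarTarget("soda can, concave bottom facing us", apparent_rcs_m2=20.0, physical_size_m=0.1),
]

for t in targets:
    print(f"{t.label}: brake={naive_should_brake(t)}")
# Both print brake=True: by reflection strength alone, the can's amplified
# return is indistinguishable from a real obstacle, which is why Tesla needs
# the fleet-learned context described below.
```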

The most difficult problem is dealing with objects like overhead signs; the Autopilot is not smart enough to determine whether a car can pass under an object like that. It's possible that the car in the crash thought that the truck was in fact a sign overhead. So now, the cars will use "fleet learning" to build a database, "mapping the world according to radar."
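One plausible reading of "mapping the world according to radar" is a geocoded whitelist: if many cars have previously seen a strong stationary return at a given spot and driven through it without incident, later cars treat that return as an overhead sign rather than an obstacle. The sketch below is my own guess at that shape; every name in it is hypothetical, not Tesla's API.

```python
# Hedged sketch of "fleet learning" as a geocoded whitelist (my own guess at
# the idea; all names are hypothetical and not Tesla's actual system).

import math


class RadarWhitelist:
    """Locations where the fleet has repeatedly seen a strong stationary
    radar return (e.g. an overhead sign) and passed safely underneath it."""

    def __init__(self, min_confirmations: int = 3, radius_m: float = 20.0):
        self.min_confirmations = min_confirmations
        self.radius_m = radius_m
        self.entries: list[tuple[float, float, int]] = []  # (lat, lon, confirmations)

    @staticmethod
    def _distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
        # Equirectangular approximation; fine for distances of tens of metres.
        dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
        dy = math.radians(lat2 - lat1)
        return 6_371_000 * math.hypot(dx, dy)

    def report_safe_passage(self, lat: float, lon: float) -> None:
        """A car drove through a strong return here with no contact: count it."""
        for i, (elat, elon, n) in enumerate(self.entries):
            if self._distance_m(lat, lon, elat, elon) <= self.radius_m:
                self.entries[i] = (elat, elon, n + 1)
                return
        self.entries.append((lat, lon, 1))

    def is_known_overhead_object(self, lat: float, lon: float) -> bool:
        """Should a strong stationary return at this location be ignored?"""
        return any(
            n >= self.min_confirmations
            and self._distance_m(lat, lon, elat, elon) <= self.radius_m
            for elat, elon, n in self.entries
        )
```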

The net effect of this, combined with the fact that radar sees through most visual obscuration, is that the car should almost always hit the brakes correctly even if a UFO were to land on the freeway in zero visibility conditions.

Tesla is also making it harder to let the Autopilot do all the work, and will require drivers to touch the wheel more often. Musk is quoted in The Verge:

With the software update, drivers who ignore the warnings often enough to receive three audible warnings will see the Autopilot system disabled until they bring the car to a halt and put it in park. "New users with Autopilot are incredibly tentative. They pay attention very closely," Musk said. "But the people who know it best are where we see the biggest challenges."
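The three-strikes rule described above reduces to a small piece of state: count ignored audible warnings, disable Autopilot at three, and reset only when the car is stopped and in park. Here is a minimal sketch of that logic as I read it; it is my own illustration, not Tesla firmware.

```python
# Sketch of the "three audible warnings" lockout described above (my own
# reading of the rule, not Tesla firmware): after three ignored audible
# warnings, Autopilot stays unavailable until the car is halted and in park.

class AutopilotLockout:
    MAX_WARNINGS = 3

    def __init__(self) -> None:
        self.warnings = 0
        self.locked_out = False

    def register_ignored_warning(self) -> None:
        """Driver let an audible hands-on-wheel warning go unanswered."""
        if self.locked_out:
            return
        self.warnings += 1
        if self.warnings >= self.MAX_WARNINGS:
            self.locked_out = True  # Autopilot disabled for the rest of the drive

    def can_engage_autopilot(self) -> bool:
        return not self.locked_out

    def car_parked(self) -> None:
        """Bringing the car to a halt and putting it in park clears the lockout."""
        self.warnings = 0
        self.locked_out = False
```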

People who use the car a lot tend to rely on the Autopilot a lot, which raises the question once again: Is this a good idea? Google took the approach that a self-driving car shouldn't have a steering wheel at all, on the theory that people are the weakest link in the system. By sort of demanding that the driver pay some kind of attention, is Tesla asking for trouble? Or is it really just a fancy driving assist, an upgraded cruise control? What do you think?


Tags: Self-driving car | Tesla Motors
