I wanted to start a thread to discuss the risks of partially autonomous, not-quite-self-driving cars. Basically, when you have a “self-driving car,” or a car with enough features to make it somewhat autonomous, the driver is likely to stop paying full attention, assuming the car has things under control. Google decided to address the issue by designing a car without even a steering wheel, but it seems regulators aren’t fully on board with that idea.
This past week Tesla released its Autopilot feature – costing $2,500 to activate on one of the coolest cars around. Based on what I’ve seen, Autopilot may be the most advanced step toward a self-driving car available to the general public at this time. Unfortunately, because it is not a complete self-driving product, humans are misusing it. There are already plenty of videos of people using it on side streets or rural roads, only to have it fail to detect the lane properly and swerve out of it. The fact that people are filming this is a good sign that they are not fully paying attention. There is already one video of a guy explaining how he got a ticket because the car didn’t properly adjust to the speed limit – and he’s explaining all this and showing off multiple different apps while the car drives itself.
I’m waiting to hear about actual accidents caused by people not paying proper attention. It is one thing to have it as a back-up system in case the driver gets briefly distracted. It is another for people to rely on it (in its current state) to take them from one place to another without watching the road.
Unfortunately, I think that if there are accidents (or more near misses posted online), it will embolden the anti-autonomous-vehicle crowd with examples of the risks. And having a good system that isn’t yet perfect might actually be more dangerous than not having the system at all: people will get too complacent, and based on what I’ve seen, the system isn’t quite ready for that.
On the upside, the system looks pretty amazing and seems to work extremely well. It shows what the technology can already accomplish, and it is mighty impressive. I can’t wait until there is a mainstream autonomous car to drive me to and from work each day.
Surprisingly, I haven’t heard of any accidents yet involving Tesla’s Autopilot. There are plenty of videos of people doing really stupid things with it, as I mentioned above, but so far, fortunately, no accidents that I know of. There was one posted video of the car stopping to avoid an accident when a car cut in front – an Uber car, actually. I think it makes for good promotional material for both Uber and Tesla, but I also think it shows that Autopilot drives differently than most human drivers. Specifically, I wouldn’t be driving that fast in the open lane, figuring there’s a chance someone might pull out quickly to get into it. Either way, the car worked wonderfully well at stopping when it detected an obstacle:
And then there are videos like this one, of a complete idiot who got out of the driver’s seat and sat in the back. It sucks that any accidents caused by people being stupid will make the technology look bad, because the technology itself is amazing. I’m so jealous that I can’t afford one of these Teslas to take me to and from work every day.