Tesla Stans Accidentally Highlight Shady AP Behavior

A couple of weeks ago, I came across a very interesting article at Electrek that’s worth talking about. In it, Fred Lambert (a man who has become a target for Tesla Stan hatred in recent years) discusses something kind of crazy that happened when the Stans tried to defend a very poor performance by Tesla’s vision-only driver-assist features in a recent Mark Rober video.

In the video, Mark Rober took several different cars with semi-autonomous features and tested them against different situations. Children in the road, smoke, heavy rainfall, and blinding light were all simulated. Most comical of all was a Looney Tunes–style test with a fake roadway, like the ones Wile E. Coyote uses against the Road Runner. Lidar-based systems easily saw through the trickery, but Tesla’s vision-only system plowed right into the (thankfully styrofoam) wall.

Instead of admitting that the god-king might actually poo and pee like us mere mortals, the Tesla Stans went into conspiracy mode. They found some frames in the video showing that the Autopilot system wasn’t on and concluded that Rober must have manually driven the vehicle into the barrier to make Tesla and Dear Leader look bad.

But a frame of video out of context didn’t tell the whole story. Lambert pointed out that just a few seconds earlier, the system had, in fact, been on. Only at the last second, when the Autopilot system detected an imminent and unavoidable crash, did it shut itself off. So, in reality, the Autopilot system could not cope with the Wile E. Coyote test.

The article goes on to point out that this is far from the only time we’ve seen this behavior by Tesla’s systems. On multiple occasions, Tesla’s systems have been caught disengaging themselves just before crashes, possibly with the intent of evading responsibility. If the system wasn’t technically on at the time of the crash, Tesla can (as it has in the past) claim that the driver was entirely at fault. Divorcing this fact from the context that the system shut off at the last second is deceptive.

The article goes into greater detail and points out that we can’t be 100% sure why Tesla has the system behave this way. But watching Tesla Stans wield the footage like this shows just how deceptive the practice can be. I don’t doubt that at least some of them genuinely believed Autopilot was turned off, but I also don’t doubt that some of them picked through the video and tried to use it to deceive. And that’s all assuming the accounts in question on “X” aren’t Tesla sock puppets.

Sadly, the kind of thinking that keeps Tesla from going back to radar and lidar, and that sustains the cult, can’t work out in the long run. Nobody on the planet always makes the right call. No matter how awesome, intelligent, and competent someone is, they will choose poorly at times. When someone believes they are always right and nobody around them pushes back on mistakes, the problem compounds until disaster happens.

Hopefully, Tesla’s board and the remaining pro-Elon Tesla Stans will soon figure out that Tesla the company is in bad shape and needs real leadership while Elon carries on with his quixotic quest in politics.

Featured image by CleanTechnica.