> The car was making a turn. Something felt off—the steering wheel jerked one way, then the other, and the car decelerated in a way I didn’t expect.
I use the latest FSD in a Model 3 and have noticed it behave indecisively when changing lanes, less so when turning, but I believe the author's account.
> I turned the wheel to take over. I don’t know exactly what the system was doing, or why. I only know that somewhere in those seconds, we ended up colliding with a wall.
The author disengaged FSD (reasonable when concerned) and ran into a wall.
I almost never let go of the steering wheel when FSD is driving. I want to be able to take over with the minimum delay. I don't know that I'll ever trust it to drive unsupervised.
It's an incredible driver-assistance system. But you need to treat it as exactly that. Tesla may market it, and name it, as something more, but anybody who uses FSD should quickly realize its limits.
In aircraft it's very rare for the pilot to receive control back from the autopilot in an upset state. One exception is trim runaway, a serious emergency that pilots specifically train for.
In a car you don't have that luxury: you're far more likely to have to act immediately, on very short notice. Human drivers are not good at that.
So why pay extra to take risks?
>I don’t know enough about what actually happened during my accident to say that Tesla’s technology crashed the car.
The cause may have been the combination of FSD and the human takeover. When the car fucks up, the driver can take over poorly, similar to how overcorrecting on a highway can spin you out of control.
There's a gray area where a car's guidance drives stupidly yet wouldn't actually cause an accident. The hot take: a driver with his face buried in his phone the whole time might have had a better outcome.