- I strongly believe LIDAR is the way to go and that Elon's vision-only move was extremely "short-sighted" (heheh). There are many reasons, but the one that drives it home for me multiple times a week is that my Tesla's wipers will randomly sweep the windshield for absolutely no reason.
This is because the vision system thinks there is something obstructing its view when in reality it is usually bright sunlight -- and sometimes, absolutely nothing that I can see.
The wipers are, of course, the most harmless way this goes wrong. The more dangerous kind is when it phantom-brakes at highway speeds, with no warning, on a clear road on a clear day. I've had multiple other scary incidents of different types (swerving back and forth at exits is a fun one), but phantom braking is the one that happens quasi-regularly. Twice it has happened with another car right behind me.
As an engineer, this speaks volumes to me about what's going on in the computer vision system, and it's pretty scary. Basically, the system detects patterns it infers as its vision being obstructed, and so it is programmed to brush away some (non-existent) debris. That is, it thinks there could be a physical object where there is none. If this were an LLM, you would call it a hallucination.
But if it's hallucinating crud on a windshield, it can also hallucinate objects on the road. And it could be doing it every so often! So maybe there are filters to disregard unlikely objects as irrelevant, which act as guardrails against random braking. And those filters are pretty damn good -- I mean, the technology is impressive -- but they can probabilistically fail, resulting in things we've already seen, such as phantom braking or, worse, driving through actual objects.
This raises so many questions: What other things is it hallucinating? And how many hardcoded guardrails are in place against these edge cases? And what else can it hallucinate against which there are no guardrails yet?
And why not just use LIDAR that can literally see around corners in 3D?
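A rough way to see why even "pretty damn good" filters still fail quasi-regularly: per-frame error rates compound over the millions of frames a car processes. Here is a minimal back-of-the-envelope sketch; every number in it is invented purely to illustrate the compounding, not measured from any real system:

```python
# Hypothetical numbers only: how often does a hallucinated object
# survive a confidence-threshold guardrail over many hours of driving?
frames_per_second = 30
hours_driven = 100
n_frames = frames_per_second * 3600 * hours_driven  # 10.8M frames

p_false_detection = 1e-4  # assumed per-frame phantom-object rate
p_passes_filter = 1e-3    # assumed chance the guardrail fails to discard it

p_phantom_event = p_false_detection * p_passes_filter
expected_events = n_frames * p_phantom_event
p_at_least_one = 1 - (1 - p_phantom_event) ** n_frames

print(f"expected phantom events: {expected_events:.1f}")      # ~1.1
print(f"P(at least one phantom brake): {p_at_least_one:.2f}")  # ~0.66
```

Under these made-up rates, a guardrail that rejects 99.9% of hallucinated objects still lets roughly one phantom event through per hundred hours of driving, which matches the "happens quasi-regularly" experience described above.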
by dlcarrier
10 subcomments
- This looks to me like they are acknowledging that their claims were premature, possibly in response to false-advertising allegations, but are otherwise carrying on as they were.
Maybe they'll reach level 4 or higher automation, and will be able to claim full self driving, but like fusion power and post-singularity AI, it seems to be one of those things where the closer we get to it, the further away it is.
by an0malous
2 subcomments
- How have they gotten away with such obvious false advertising for this long? It has undeniably misled customers and inflated their stock value.
by AbrahamParangi
5 subcomments
- I use self-driving every single day in Boston, and I haven't needed to intervene for safety in about 8 months. The interventions I do make are because I want to go a different route.
Based on the rate of progress alone I would expect functional vision-only self-driving to be very close. I expect people will continue to say LIDAR is required right up until the moment that Tesla is shipping level 4/5 self-driving.
by Nitsua007
1 subcomment
- Small correction: LiDAR can’t literally see around corners — it’s still a line-of-sight sensor. What it can do is build an extremely precise 3D point cloud of what it can see, in all lighting conditions, and with far less susceptibility to “hallucinations” from things like glare, shadows, or visual artifacts that trip up purely vision-based systems.
The problem you’re describing — phantom braking, random wiper sweeps — is exactly what happens when the perception system’s “eyes” (cameras) feed imperfect data into a “brain” (compute + AI) that has no independent cross-check from another modality. Cameras are amazing at recognizing texture and color but they’re passive sensors, easily fooled by lighting, contrast, weather, or optical illusions. LiDAR adds active depth sensing, which directly measures distance and object geometry rather than inferring it.
But LiDAR alone isn’t the endgame either. The real magic happens in sensor fusion — combining LiDAR, radar, cameras, GNSS, and ultrasonic so each sensor covers the others’ blind spots, and then fusing data at the perception level. This reduces false positives, filters out improbable hazards before they trigger braking, and keeps the system robust in edge cases.
And there’s another piece that rarely gets mentioned in these debates: connected infrastructure. If the vehicle can also receive data from roadside units, traffic signals, and other connected objects (V2X), it doesn’t have to rely solely on its onboard sensors. You’re effectively extending the vehicle’s situational awareness beyond its physical line of sight.
Vision-only autonomy is like trying to navigate with one sense while ignoring the others. LiDAR + fusion + connectivity is like having multiple senses and a heads-up from the world around you.
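To make the fusion point concrete, here is a toy illustration (not how any production stack actually works): if the camera's inferred range and the lidar's measured range each carry an uncertainty, inverse-variance weighting gives a fused estimate tighter than either alone, and a large disagreement between modalities is itself a cheap phantom-object flag. All values below are made up:

```python
import math

def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance (1-D Kalman-style) fusion of two range estimates."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Camera depth inferred by a network: noisy at range (assumed values).
cam_range, cam_var = 48.0, 25.0      # meters, m^2
# Lidar return: directly measured, tight variance (assumed values).
lidar_range, lidar_var = 51.2, 0.04

r, v = fuse(cam_range, cam_var, lidar_range, lidar_var)
print(f"fused range: {r:.2f} m +/- {math.sqrt(v):.2f} m")  # ~51.19 +/- 0.20

# A camera-only detection with no corroborating lidar return can be
# down-weighted before it ever reaches the braking logic.
sigma_gap = abs(cam_range - lidar_range) / math.sqrt(cam_var + lidar_var)
print(f"modality disagreement: {sigma_gap:.2f} sigma")
```

The design point is simply that two independent error models rarely hallucinate the same thing at the same time.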
- They made tons of money on the Scam of the Decade™ from Oct 2016 (see their "Driver is just there for legal reasons" video) to Apr 2024 (when they officially changed it to Supervised FSD), and now it's not even that.
by mettamage
1 subcomment
- I’m not surprised. As a former Elon fan, it never struck me that he thought about this from first principles, whereas for SpaceX he did.
For as long as we can’t understand AI systems as well as we understand normal code, first principles thinking is out of reach.
It may be possible to get FSD another way but Elon’s edge is gone here.
- Here's what Claude has to say about electrek.co:
Tesla Headlines Sentiment Analysis - Electrek.co
Bottom Line: Strongly Negative Sentiment
Based on analysis of Tesla headlines and articles from Electrek over the past few months, the sentiment is overwhelmingly negative (approximately 85% negative, 10% neutral, 5% positive). The coverage reveals a company in decline across multiple fronts.
- I think I’d call what Tesla did fraud. Or scam. Or both.
- War Is Peace. Freedom Is Slavery. Ignorance Is Strength. FSD is... whatever Elon says it is.
by IgorPartola
4 subcomments
- I don't need self-driving cars that can navigate alleys in Florence, Italy and also parkways in New England. Here is what we really need: put transponders into the roadway on freeways and use those for navigation and lane positioning. Then you would be responsible for getting onto the freeway and off at your exit, but could take a nap in between. This would be done by the DOT, supported by all car makers, and would benefit everyone. LIDAR could be used for obstacle detection but not for navigation. And whoever figures out how to do the transponders, lands a government contract, and gets at least one major car manufacturer on board would make bank.
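The lane-positioning math for that idea is pleasantly simple. A minimal sketch, assuming a flat 2-D geometry with transponders embedded at known spacing along a lane line and a car that can range to two of them at once (the setup, spacing, and accuracy are all hypothetical):

```python
import math

def car_position(r1, r2, spacing):
    """2-D trilateration against two in-road transponders placed
    `spacing` meters apart along the lane line. Returns the car's
    along-track distance x and lateral offset y from that line."""
    x = (r1**2 - r2**2 + spacing**2) / (2 * spacing)
    y_squared = r1**2 - x**2
    if y_squared < 0:
        raise ValueError("inconsistent ranges (measurement noise)")
    return x, math.sqrt(y_squared)

# Car 1.8 m to the side of the transponder line, 4 m past the first
# unit, transponders every 10 m (all numbers invented):
true_x, true_y, spacing = 4.0, 1.8, 10.0
r1 = math.hypot(true_x, true_y)
r2 = math.hypot(true_x - spacing, true_y)
print(car_position(r1, r2, spacing))  # ~(4.0, 1.8)
```

Two ranges give you lateral position without any vision at all; the hard parts are cost, maintenance, and getting a DOT to standardize it.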
by shadowgovt
2 subcomments
- "Full Self Driving (Supervised)." In other words: you can take your mind off the road as long as you keep your mind on the road. Classic.
Tesla is kind of a joke in the FSD community these days. People working on this problem a lot longer than Musk's folk have been saying for years that their approach is fundamentally ignoring decades of research on the topic. Sounds like Tesla finally got the memo. I mostly feel sorry for their engineers (both the ones who bought the hype and thought they'd discover the secret sauce that a quarter-century-plus of full-time academic research couldn't find and the old salts who knew this was doomed but soldiered on anyway... but only so sorry, since I'm sure the checks kept clearing).
- My experience working at an automotive supplier suggests that Tesla engineers must have always known this, and that the real strategy was to provide the best ADAS experience with the cheapest sensor architecture. They certainly achieved that goal.
There were aspirations that the bottom-up approach would work with enough data, but as I learned about the kind of long-tail cases we solved with radar/camera fusion, camera-only seemed categorically less safe.
Easy edge case: a self-driving system cannot be allowed to become inoperable due to sunlight or fog.
A more Hacker News-worthy consideration: calculate the angular pixel resolution required to accurately range and classify an object 100 meters away (roughly the distance needed to safely stop if you're traveling 80 mph). Now add a second camera for stereo and calculate the camera-to-camera extrinsic sensitivity you'd need to stay within to keep error sufficiently low in all temperature/road-condition scenarios.
The answer is: screw that, I should just add a long-range radar.
There are just so many considerations showing you need a multi-modality solution, and using human biology as a what-about-ism doesn't translate to currently available technology.
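Taking that exercise literally, here is a back-of-the-envelope sketch. The FOV, sensor width, baseline, and object size are all assumed round numbers, not any particular camera:

```python
import math

fov_deg    = 50.0    # assumed horizontal field of view
width_px   = 3840    # assumed sensor width in pixels
baseline_m = 0.30    # assumed camera-to-camera baseline
range_m    = 100.0   # distance to the object

# Focal length in pixels from the pinhole model.
f_px = width_px / (2 * math.tan(math.radians(fov_deg / 2)))  # ~4117 px

# A 0.5 m-wide object at 100 m subtends 5 mrad:
print(f"pixels across a 0.5 m object: {f_px * 0.5 / range_m:.1f}")  # ~20.6

# Stereo depth error per pixel of disparity error: dZ ~= Z^2 / (f * B).
dz_per_px = range_m**2 / (f_px * baseline_m)
print(f"range error per 1 px disparity error: {dz_per_px:.1f} m")  # ~8.1

# Extrinsic sensitivity: 1 px of disparity shift is roughly a relative
# yaw error of 1/f_px radians between the two cameras.
print(f"yaw misalignment worth 1 px: {math.degrees(1 / f_px):.4f} deg")  # ~0.014
```

Roughly 8 m of range error per pixel of disparity error, and an alignment budget of hundredths of a degree held across every temperature and pothole, is exactly the kind of tolerance stack that makes the radar answer the sane one.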
- I thought we would have almost AGI by now? https://x.com/elonmusk/status/1858747684972048695
by ratelimitsteve
0 subcomments
- Bad angle shot: This thing where advertisers exploit the need to clarify ambiguity in order to smuggle in custom, private definitions of words that mean the opposite of the agreed-upon definitions of those same words is a problem. Calling something "full self-driving" when it doesn't drive by itself fully is lying even if you put in the fine print that "full" means "not full" and "self-driving" means "not driving by itself"
- Lidar is the first thing brought up in these discussions, but it isn't that great of a sensor. It does one thing well, and that's measure distance. A visual sensor can be measured along the axes of spatial resolution (x, y, z), temporal resolution (fps), and dynamic range (bit depth); you could add things like light frequency and phase, etc. Lidar is quite poor in all of these except the spatial z dimension, measuring distance as mentioned before. Compared to a cheap camera, the fps is very low, and the spatial resolution in x and y is pathetic: 128 lines in the vertical, higher in the horizontal, but it's not megapixels. Finally, the dynamic range is 1 bit (something is there or not).
Lidars use near infrared and are just as susceptible to problems with natural fog (not the theatrical fog like in that Rober video) and rain.
Multiple cameras can do good-enough depth estimation with modern neural networks. But cameras are vastly better at making sense of the world. You can't read a sign with lidar.
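To put rough numbers on that resolution gap, the spec values below are illustrative (a generic 128-beam spinning lidar versus a generic 4K camera), not any specific product:

```python
import math

dist_m = 100.0  # range to target

# Illustrative spinning-lidar specs (assumed, not a specific product):
lidar_h_res_deg = 0.2                 # horizontal angular resolution
lidar_v_res_deg = 26.0 / 128          # 26 deg vertical FOV over 128 beams

# Illustrative camera: 3840 px across a 50 deg horizontal FOV (assumed):
cam_res_deg = 50.0 / 3840

for name, res in [("lidar horiz", lidar_h_res_deg),
                  ("lidar vert ", lidar_v_res_deg),
                  ("camera     ", cam_res_deg)]:
    gap_cm = dist_m * math.tan(math.radians(res)) * 100
    print(f"{name}: {res:.4f} deg -> {gap_cm:5.1f} cm between samples at 100 m")

# Samples across a 0.5 m-wide pedestrian at 100 m:
ped_deg = math.degrees(math.atan(0.5 / dist_m))
print(f"lidar returns on pedestrian: {ped_deg / lidar_h_res_deg:.1f}")  # ~1.4
print(f"camera pixels on pedestrian: {ped_deg / cam_res_deg:.1f}")      # ~22
```

A couple of returns versus a couple dozen pixels is why lidar alone can't classify and the camera alone can't trust its depth, which is the fusion argument made elsewhere in the thread.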
by AndrewKemendo
1 subcomment
- Karpathy should be held liable for this (maybe less so than Musk), but he should at least be considered persona non grata for pushing it.
It was his idea, his decision to build the architecture, and he led the entire vision team during this.
Yet he remains free from any of this fallout and is still widely considered an ML god.
https://youtu.be/3SypMvnQT_s?si=FDmyA6amWnDpMPEj
by starchild3001
4 subcomments
- Feels like Musk should step down from the CEO role. The company hasn’t really delivered on its big promises: no real self-driving, Cybertruck turned into a flop, the affordable Tesla never materialized. Model S was revolutionary, but Model 3 is basically a cheaper version of that design, and in the last decade there hasn’t been a comparable breakthrough. Innovation seems stalled.
At this point, Tesla looks less like a disruptive startup and more like a large-cap company struggling to find its next act. Musk still runs it like a scrappy startup, but you can’t operate a trillion-dollar business with the same playbook. He’d probably be better off going back to building something new from scratch and letting someone else run Tesla like the large company it already is.
- The lesson here is to wait for a chill SEC and friendly DOJ before you recant your fraudulent claims, because then they won’t be found to be fraudulent
by RyanShook
1 subcomment
- Looking forward to the class action on this one…
- It needs to be known that Fred Lambert pushes out so much negative Tesla press that it's reasonable to say he's on a crusade, and not a very fact-based one.
Like with this. No, Tesla hasn't communicated any such thing. Everyone knows FSD is late. But Robotaxi shows it is very meaningfully progressing toward true autonomy. For example, it crushed the competition (not literally) in a recent very high-effort test of avoiding crashes on a highway with obstacles that were hard to read for almost all the other systems: https://www.youtube.com/watch?v=0xumyEf-WRI
by paradox460
0 subcomments
- One of the shower thoughts I've had is: why don't we start equipping cars with UWB tech? UWB can identify itself, and two UWB nodes can measure short-range distances between each other (around 30 m) with fairly decent accuracy and directionality.
Sure, it wouldn't replace any other sensing tech, but if my car has UWB and another car has UWB, they can telegraph where they are and what their intentions are a lot faster and in a "cleaner" manner than using a camera to watch the rear indicator for illumination.
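The ranging primitive already exists in standardized form (UWB two-way ranging in IEEE 802.15.4z, the same family used for phone-as-car-key). A toy single-sided version of the math, ignoring the clock-drift correction that real double-sided exchanges add; the turnaround time and distance are invented:

```python
C = 299_792_458  # speed of light, m/s

def ranged_distance(t_round_s, t_reply_s):
    """Single-sided two-way ranging: the initiator times a full
    poll/response round trip; subtracting the responder's fixed
    turnaround time leaves twice the time of flight."""
    time_of_flight = (t_round_s - t_reply_s) / 2
    return time_of_flight * C

# Simulate a car 25 m ahead with a 200 microsecond turnaround (assumed):
true_distance = 25.0
t_reply = 200e-6
t_round = t_reply + 2 * true_distance / C
print(f"{ranged_distance(t_round, t_reply):.2f} m")  # 25.00
```

The identity and "intent" payload would just ride along in the same frames, which is what makes this cleaner than inferring intent from a blinking indicator.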
- This nazi-saluting manchild has been purposefully lying about self-driving for close to 10 years now, with it always coming "next year". How is this legal and not false advertising?
by AdmiralAsshat
0 subcomments
- Kinda wish we as consumers had some way to fight back against this obvious bullshit, since lord knows the government won't do anything.
Like if a company comes out with a new transportation technology and calls it "teleportation", but in fact is just a glorified trebuchet, they shouldn't be allowed to use a generic term with a well-understood meaning fraudulently. But no, they'll just call it "Teleportation™" with a patented definition of their glorified trebuchet, and apparently that's fine and dandy.
I am still bitter about the hoverboard.
by ChrisArchitect
0 subcomments
- Earlier:
Tesla’s autonomous driving claims might be coming to an end [video]
https://news.ycombinator.com/item?id=45133607
- And stock is up $15
- What I don't understand about this is that, in my experience being driven around in friends' Teslas, it's already there. It really seems like legalese vs. technical capability. The damn thing can drive with no input and even find a parking spot and park itself. I mean, where are we even moving the goalposts at this point? Because there have been some accidents it's not valid? The question is how that compares to the accident rate of human drivers, not whether there should be an expectation of zero accidents ever.
by amanaplanacanal
0 subcomments
- I wonder if this change came from the legal department after their loss in the lawsuit over that poor woman who was killed.
by GUNHED_158
0 subcomments
- The link contains malicious scripts.
by diebeforei485
0 subcomments
- This article makes no sense to me. They aren't changing the meaning of anything for consumers, it's only defining it for the purpose of the compensation milestone.
- Honest question: did Tesla in the past promise that FSD would be unsupervised? My based-on-nothing memory is that they weren't promising that you wouldn't have to sit in the driver's seat, or that your steering wheel would collect dust. Arguing against myself: they did talk about Teslas going off to park themselves and returning, but that's a fairly limited use case. Maybe in the robotaxi descriptions?
My memory was more that you'd be able to get into (the driver's seat of) your Tesla in downtown Los Angeles, tell it you want to go to the Paris hotel in Vegas, and expect generally not to have to do anything to get there. But not guaranteed nothing.
- One problem might be that American driving is not exactly... well, great, is it? Roads are generally too straight and driving tests too soft. And for some weird reason, many US drivers seem to have a poor sense of situational awareness.
The result is that many drivers seem unaware of the benefits of defensive driving. Take all that into account, and safe 'full self-driving' may be tricky to achieve.
- The title is rather misleading. They haven't given up on the promise of autonomy...
- I guess you can either go full Waymo or full comma. The rest is just hype.
- Guess living and working in the factory ain’t working out so well
- So what does it mean for Tesla's "Robotaxi"? Is that being shut down?
It's pathetic. The Austin Robotaxi demo had a "safety monitor" in the front passenger seat, with an emergency stop button. But there were failures where the safety monitor had to stop the vehicle, get out, walk around the car, get into the driver's seat, and drive manually. So now the "safety monitor" sits in the driver's seat.[1] It's just Uber now.
Do you have to tip the "safety monitor"?
And for this, Musk wants the biggest pay package in history?
[1] https://electrek.co/2025/09/03/tesla-moves-robotaxi-safety-m...
by mensetmanusman
0 subcomments
- Our current infrastructure isn’t compatible with lidar. We were consulted to fix it, but governments have no idea how to approach this problem so it won’t happen for years.
- Given this move, like the rest of TSLA's inane investor base, I wholeheartedly support the potential $1 trillion pay package for Musk
by MagicMoonlight
0 subcomments
- There needs to be a class-action against Tesla. It’s blatant fraud.
- For a long time now, I haven't thought full self-driving makes economic sense. Would this hurt car sales in the long term?
- Most Honest Company (Sarcasm)
- It was a fool's game from the start, with only negative aspects: what could possibly go wrong?
- Electrek has been anti Tesla for a long time now.
by yencabulator
0 subcomments
- Full (Limited)
by olyellybelly
0 subcomments
- Shock! Horror!
by aamargulies
0 subcomments
- I knew that FSD was nonsense when I tried to use Tesla's autopark feature under optimal conditions and it failed to park the car satisfactorily.
- If you can’t reach the goal, move the goal posts!
- [flagged]
by jqpabc123
1 subcomment
- [flagged]
- My 1993 Nissan has FSD. I can fully drive myself anywhere.
by freerobby
3 subcomments
- This is clickbait from a publication that's had it out for Tesla for nearly a decade.
Tesla is pivoting messaging toward what the car can do today. You can believe that FSD will deliver L4 autonomy to owners or not -- I'm not wading into that -- but this updated website copy does not change the promises they've made to prior owners, and Tesla has not walked back those promises.
The most obvious tell of this is the unsupervised program in operation right now in Austin.
by resfirestar
0 subcomments
- I don't read the article (beyond the clickbait headline and the author's "take") as Tesla "giving up". No marketing is changing, and no plans for taxi services are changing. This is about the company's famously captured board giving their beloved CEO flexibility on how to meet their ambitious-sounding targets, by using vague language in the definitions. This way, if Tesla fails to hit 10 million $100/month FSD subscriptions, they could conceivably come up with a cheaper, more limited subscription and still get Elon his pay.