Discussion
darkwizard42: Not too much to specifically take away yet, but it appears that the degradation detection system did not function well. That is pretty egregious for FSD, given that a human won't be able to tell if FSD is confident or needs the human to intervene.

Overall, yikes.
jonthepirate: Tesla is a premium product. If someone is going to use FSD, they know it's a luxury feature, and they should pay for the most comprehensive safety features available, which in my mind would (of course) require lidar.
Animats: "In the crashes that ODI has reviewed, the system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred."Does it not detect them at all, or fail to deal with detected sensor degradation adequately? Does "Full Self Driving (assisted)" slow down under conditions of poor visibility?Does Tesla even look for the road surface? One big advantage of those up-top LIDAR units is that you have a good scan of the pavement ahead. If you're not sensing flat pavement ahead, don't go there. That's basic. Vision-only systems, going back to Mobileye, have been overly dependent on looking for known kinds of obstacles. Original Mobileye could only detect car rear ends.
Sohcahtoa82: I think vision-only can certainly work for 99.9% of driving. But it's that 0.1% of situations where the results will be catastrophic. Sure, you can detect vehicles, traffic cones, bikes (both bicycles and motorcycles), people, mopeds, traffic lights, lane markings, everything you'd expect on a road.

But what about the mattress that fell out of someone's truck? If the car doesn't know what a mattress is and what it looks like, it can't really adequately determine its size based on the monocular vision that Tesla has. Sure, maybe it could use motion vectors between video frames to make a guess, but I'm not convinced that's going to work well, especially relative to LIDAR.

Steering back to the subject at hand:

> "In the crashes that ODI has reviewed, the system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred."

I don't think I've ever had my Tesla disable Autopilot based on road conditions, though maybe it's because when conditions are bad, I've just taken manual control preemptively. I've let it go through construction areas where cones are guiding traffic outside the painted lines, and surprisingly, it's handled it fine, though I've only done this at low speeds (~20 mph).

Camera visibility is another story. In heavy rain at night, I've had it not allow me to enable AP, though I've never had it disable AP and tell me to take control. However, it HAS limited the cruise speed based on visibility.

All this to say...

...anybody buying Tesla's FSD is being swindled, as far as I'm concerned. "FSD (Supervised)" is a scam. If you have to supervise it, it's not self-driving. It's just a party trick that you have to watch to make sure nothing goes wrong.
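For what it's worth, the "motion vectors between video frames" guess usually takes the form of time-to-contact from angular expansion: an object you are closing on grows in the image, and its apparent size divided by its growth rate gives seconds to impact, with no need to know what the object is. A toy version with made-up numbers shows both the trick and its fragility:

    # Time-to-contact from image-space expansion; no object class or
    # true size needed. All numbers below are illustrative only.
    def time_to_contact(width_px_prev, width_px_curr, dt):
        growth = (width_px_curr - width_px_prev) / dt  # px/sec
        if growth <= 0:
            return float("inf")  # not expanding: no closing detected
        return width_px_curr / growth

    # A mattress-sized blob growing from 80 px to 84 px over one 30 fps frame:
    tau = time_to_contact(80.0, 84.0, dt=1 / 30)
    print(f"~{tau:.1f} s to contact")  # ~0.7 s
    # A single pixel of flow noise moves this estimate between roughly
    # 0.6 s and 0.9 s, which is exactly the reliability concern above.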
starkeeper: "Cameras only" is a cost cutting for profit only feature that is subject to Wile E. Coyote attacks.It is a shameful engineering design to leave out LIDAR and it has cost human lives.Let's hope Musk does not leave out something important for the moon landing. His proposal for it is absolutely ridiculous, it looks like a children's book fantasy and many smaller top-heavy craft have already toppled on the moon!
DonHopkins: His moon lander will deploy a parachute that keeps the lander suspended for as long as it takes for the AI to grok the fact that there is no air on the moon, and then it finally falls according to cartoon physics.
mrguyorama: > In each of these crashes, FSD also lost track of or never detected a lead vehicle in its path.

Oh good, Tesla vehicles apparently struggle with the task of "Hey, there's a car there" in degraded conditions. Probably don't need to worry about that while driving, though.

> Tesla also described internal data and labeling limitations that prevented a uniform identification and analysis of crash events with the subject system engaged. ODI believes this limitation could have led to under-reporting of subject crashes over portions of the defined time-period.

I thought Tesla was a "Software" company!

This report is insanely vague, though. It's very preliminary; it opened yesterday.
kvuj: > This report is insanely vague, though. It's very preliminary; it opened yesterday.

Yeah, I think posting this here is premature without any details.

Maybe I'm misremembering things, but I feel like 4-5 years ago we didn't have these clickbait headlines that fed political discourse. It feels like reddit culture has permeated this place for a while. Anytime one of Elon Musk's companies has a misstep, the headline violently shoots to the top of the front page.
dstroot: I own two Teslas. When conditions are adverse, e.g. fog or heavy rain, the system simply shuts off and reverts back to manual driving. Elon has said several times that humans can drive with two eyes and Tesla should be able to drive with X number of cameras. However, it suffers from the same problems humans do: if it can't see, it can't drive, and ironically that's when it reverts back to human control.
9dev: Birds can fly with two wings and humans should be able to fly with X number of limbs.
loxodrome: The introduction of self-driving technology at scale will inevitably result in a few accidents, no matter how many sensors are used. It's the same with every new technology deployed in high-risk situations, including motor vehicles themselves. Even malfunctioning airbags have caused fatalities. The important thing is to identify the issue early so the company can address it before more people get hurt, which the ODI in this case is thankfully doing.
michaelmrose: Remember that Elon isn't actually an engineer.
mbreese: > Tesla is a premium product

I'm not sure that's the case anymore. Each Tesla model has gotten more spartan over the years. And the interiors have never been all that "premium" when compared with other manufacturers. They should still offer the most comprehensive safety features, but whether that's because of "luxury" or not, I'm not sure.
ModernMech: Yes, this was something that the industry figured out in 2007. But because Musk has a lot of money, people do whatever he says, no matter how ignorant and dangerous. The shame is so profound and widespread it's hard to fathom, really.
dkenyser: > clickbait headlines that fed political discourse

Eh, while I agree with you on the permeation of reddit culture on this board, this post is in no way clickbait or political in nature. In fact, the title of this post is literally copied and pasted from the problem description.
UltraSane: "Elon has said several times "At this point I truly don't understand why anyone cares what that liar says.
corygarms: To be fair, Tesla's vehicles are recalled more than any other automaker's, and it isn't close: https://www.autoweek.com/news/industry-news/a43625242/tesla-...
jedberg: Ironic to see this on the front page just next to the report about Waymos being 13 times safer than humans.
conductr: My Lexus does this too. I rarely get it due to weather; more often it's how I know I'm past due for a car wash (dust on the sensors).
ares623: This moved from the top spot on the front page so fast
renlo: > When conditions are adverse, e.g. fog or heavy rain, the system simply shuts off and reverts back to manual driving.

I also own a Tesla, and there is no indication shown to the user that FSD's vision is degraded. They need to add this in.

For example, numerous times I have been driving my Tesla with FSD activated, with an ostensibly clean and clear windshield, when suddenly the car will do the "clean the windshield in front of the camera" routine without any indication that the car's camera is degraded. If people haven't seen this routine: wiper fluid is dispensed and the wiper vigorously wipes in front of the camera only; the rest of the windshield only gets a cursory wipe.

This indicates to me that the camera has poor visibility and I am not informed or aware of this as a driver, which is concerning. I am often curious if there is a thin occluding film on the windshield in the camera box in front of the camera, or something else that has degraded FSD's vision, but they do not give you the ability to view the camera feed, nor do they notify you that the vision is degraded. I think a "thin occluding film" may be in the camera box because my normal windshield outside of the camera box started to show a thin chemical film after a couple of months, which apparently (according to a Google search) happens when a new car off-gasses, depositing a thin film of chemical byproduct on the windshield. This is my first new car, so I've no idea if this is normal or not.
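A degradation signal of the kind being asked for here doesn't require anything exotic; one standard blur/occlusion heuristic is the variance of the Laplacian over the camera frame. A sketch, with an invented threshold that would need per-camera calibration in practice:

    # Standard blur/occlusion heuristic: low Laplacian variance means
    # few edges in the frame, i.e. blur, film, or occlusion. The
    # sharpness_floor value here is invented, not from any real system.
    import cv2

    def camera_looks_degraded(frame_bgr, sharpness_floor=50.0):
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
        return sharpness < sharpness_floor

Run per camera every few seconds, a check like this could drive a visible "camera degraded" warning to the driver instead of only a silent wiper routine.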
lateforwork: > humans can drive with two eyes and Tesla should be able to drive with X number of cameras

Systems built from cameras that are only nearly as capable as human eyes, and software that is only nearly as capable as the human brain, will fall short overall. To match or surpass human performance, the individual components need to exceed human abilities where possible, and that's where LiDAR provides an advantage.
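This point can be made concrete with a back-of-envelope series-reliability calculation, using purely illustrative numbers: stages in a pipeline multiply, so components that each merely match the human end up below the human overall.

    # Illustrative only: reliabilities of serial stages multiply.
    human = 0.999                 # human copes with 99.9% of situations
    camera = perception = planning = 0.999
    system = camera * perception * planning
    print(f"system: {system:.4f} vs human: {human:.4f}")
    # system: 0.9970 vs human: 0.9990; the chain only matches the human
    # if individual stages exceed human ability, hence the sensor margin.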
kirubakaran: Not surprising at all though, considering the "safety first" vs "yolo" approaches, right?
IncreasePosts: What are wile e coyote attacks? Painting a tunnel entrance on a wall?

If Tesla FSD is better than the average driver using it, isn't it still a net win, even if it might crash in different scenarios than a human? Especially because a human has a window to override FSD, but FSD doesn't really get a chance to override a human, except in limited scenarios like automatic emergency braking. And doesn't providing FSD at a lower cost get more people using it?
ModernMech: > What are wile e coyote attacks?

https://www.thedrive.com/news/tesla-autopilot-fails-wile-e-c...

> If Tesla FSD is better than the average driver using it, isn't it still a net win, even if it might crash in different scenarios than a human?

That was the analysis when the industry was in its infancy. I think a lot more work has to go into that argument for people to accept it now that the driverless car industry has been operating for a decade-plus; it's not really clear that this pans out.

For example, today you can look at a car and predict how it's going to behave, because you have a good model for how people drive. But let's say in the future driverless cars are much "safer" on paper than human drivers, but they behave very differently from them, such that it's hard for people to predict their behavior.

Now you've created a highly dynamic system where you don't have a good model for all the actors, because some of them behave one way and others behave a completely different way. Does this increase the overall safety of the system or decrease it, despite the new actors being statistically safer than the current ones?

I don't think you can say with great confidence what's going to happen just by looking at crash rates and comparing them to the current system. You're going to change the whole system by introducing large numbers of actors who "crash in different scenarios than a human".
bradfox2: Maybe you have not received an alert, but yes, it does, and it's annoying as all hell. Dirt, sun, etc. all pop an alert about degraded performance.
starkeeper: The renderings look like the cover of a young-adult sci-fi novel by Robert A. Heinlein. Have Spacesuit, Will Travel comes to mind. Probably the first true science fiction I ever read!
starkeeper: > What are wile e coyote attacks? Painting a tunnel entrance on a wall?

Yes!!! Thank you, hopefully I will get credit for inventing this attack :)

> If Tesla FSD is better than the average driver using it, isn't it still a net win, even if it might crash in different scenarios than a human?

I don't think so, because it is fooled by simple things that could easily be prevented, and counting on a human to override is very risky because the human is simply not alert in passive mode.

I think cameras are great, but there is no excuse not to also use LIDAR.
whoknowsidont: It's not premature. Every single expert in this field has warned about these issues since the ~2012 days, when these types of platforms were first being publicly discussed. This is an expected and understood result given the hardware and software involved.

You will not get past these issues without a RADICAL improvement in camera technology, paired with specialized, dedicated processing hardware matched against several (and I mean several, several) "common" environment profiles.

FSD is a scam. It's not safe. It is not technically sound. The fact that there aren't many more accidents with the system is a byproduct of consistent and well-thought-out road standards, car standards, other safety systems present on cars, and driver education.
resfirestar: You’re just reciting your priors, which I think supports GP’s point: no one is getting new information out of the posted link, so it’s probably premature to comment on it.
whoknowsidont: You are misusing some of those words, and I'm not even sure how to interpret them, even with a hefty dose of good-faith reading. The report is not premature, and it's not premature to comment on it.

Can you clearly and explicitly state why you feel the report or the commentary is premature?
resfirestar: I was agreeing with kvuj and mrguyorama that the original link is to an announcement that an investigation is happening, and it's too early in the process to productively discuss it. People have very strong and emotional pro or anti stances on the Tesla Vision system in general, and love an excuse to have the debate again, but in the comments here where people are talking about their stance, you might notice that they don't reference any specific facts from the linked report to support their arguments. This is because the report is still vague at this stage and doesn't provide any specifics that inform the discussion.
dawnerd: It absolutely could be a clouded windshield on the inside (where it's really hard for normal people to clean). When I got my last Model Y, I brought up that it was foggy, and they said it was "fine". I took it into service over a year ago and noticed they had cleaned it. Clearly it's a problem, but they're not being too transparent about it. I suspect they don't want to be, because removing the cover to clean it isn't easy for normal people.