
410 points jjulius | 74 comments
bastawhiz ◴[] No.41889192[source]
Lots of people are asking how good the self driving has to be before we tolerate it. I got a one month free trial of FSD and turned it off after two weeks. Quite simply: it's dangerous.

- It failed with a cryptic system error while driving

- It started making a left turn far too early that would have scraped the left side of the car on a sign. I had to manually intervene.

- In my opinion, the default setting accelerates way too aggressively. I'd call myself a fairly aggressive driver and it is too aggressive for my taste.

- It tried to make way too many right turns on red when it wasn't safe to. It would creep into the road, almost into the path of oncoming vehicles.

- It didn't merge left to make room for vehicles merging onto the highway. The vehicles then tried to cut in. The system should have avoided an unsafe situation like this in the first place.

- It would switch lanes to go faster on the highway, but then missed an exit on at least one occasion because it couldn't make it back into the right lane in time. Stupid.

After the system error, I lost all trust in FSD from Tesla. Until I ride in one and feel safe, I can't have any faith that this is a reasonable system. Hell, even autopilot does dumb shit on a regular basis. I'm grateful to be getting a car from another manufacturer this year.

replies(24): >>41889213 #>>41889323 #>>41889348 #>>41889518 #>>41889642 #>>41890213 #>>41890238 #>>41890342 #>>41890380 #>>41890407 #>>41890729 #>>41890785 #>>41890801 #>>41891175 #>>41892569 #>>41894279 #>>41894644 #>>41894722 #>>41894770 #>>41894964 #>>41895150 #>>41895291 #>>41895301 #>>41902130 #
1. modeless ◴[] No.41889518[source]
Tesla jumped the gun on the FSD free trial earlier this year. It was nowhere near good enough at the time. Most people who tried it for the first time probably share your opinion.

That said, there is a night and day difference between FSD 12.3 that you experienced earlier this year and the latest version 12.6. It will still make mistakes from time to time but the improvement is massive and obvious. More importantly, the rate of improvement in the past two months has been much faster than before.

Yesterday I spent an hour in the car over three drives and did not have to turn the steering wheel at all except for parking. That never happened on 12.3. And I don't even have 12.6 yet, this is still 12.5; others report that 12.6 is a noticeable improvement over 12.5. And version 13 is scheduled for release in the next two weeks, and the FSD team has actually hit their last few release milestones.

People are right that it is still not ready yet, but if they think it will stay that way forever they are about to be very surprised. At the current rate of improvement it will be quite good within a year and in two or three I could see it actually reaching the point where it could operate unsupervised.

replies(11): >>41889570 #>>41889593 #>>41890163 #>>41890174 #>>41890177 #>>41890374 #>>41890395 #>>41890547 #>>41893442 #>>41893970 #>>41894426 #
2. seizethecheese ◴[] No.41889570[source]
If this is the case, the calls for heavy regulation in this thread will lead to many more deaths than otherwise.
3. jvanderbot ◴[] No.41889593[source]
I have yet to see a difference. I let it highway drive for an hour and it cut off a semi, coming within 9 to 12 inches of the bumper for no reason. I heard about that one, believe me.

It got stuck in a side street trying to get to a target parking lot, shaking the wheel back and forth.

It's no better so far and this is the first day.

replies(3): >>41889602 #>>41889978 #>>41890441 #
4. modeless ◴[] No.41889602[source]
You have 12.6?

As I said, it still makes mistakes and it is not ready yet. But 12.3 was much worse. It's the rate of improvement I am impressed with.

I will also note that the predicted epidemic of crashes from people abusing FSD never happened. It's been on the road for a long time now. The idea that it is "irresponsible" to deploy it in its current state seems conclusively disproven. You can argue about exactly what the rate of crashes is but it seems clear that it has been at the very least no worse than normal driving.

replies(1): >>41889623 #
5. jvanderbot ◴[] No.41889623{3}[source]
Hm. I thought that was the latest release, but it looks like no. But there seem to be no improvements since the last trial, so maybe 12.6 is magically better.
replies(1): >>41889648 #
6. modeless ◴[] No.41889648{4}[source]
A lot of people have been getting the free trial with 12.3 still on their cars today. Tesla has really screwed up on the free trial for sure. Nobody should be getting it unless they have 12.6 at least.
replies(1): >>41889731 #
7. jvanderbot ◴[] No.41889731{5}[source]
I have 12.5. Maybe 12.6 is better, but I've heard that before.

Don't get me wrong: without a concerted data team building maps a priori, this is pretty incredible. But from a pure performance standpoint it's a shaky product.

replies(1): >>41889788 #
8. KaoruAoiShiho ◴[] No.41889788{6}[source]
The latest version is 12.5.6; I think he got confused by the .6 at the end. If you think that's bad, then there isn't a better version available. However, it is a dramatic improvement over 12.3; I don't know how much you tested on it.
replies(1): >>41889893 #
9. modeless ◴[] No.41889893{7}[source]
You're right, thanks. One of the biggest updates in 12.5.6 is transitioning the highway Autopilot to FSD. If he has 12.5.4 then it may still be using the old non-FSD Autopilot on highways which would explain why he hasn't noticed improvement there; there hasn't been any until 12.5.6.
10. hilux ◴[] No.41889978[source]
> ... coming within 9 to 12 inches of the bumper for no reason. I heard about that one believe me.

Oh dear.

Glad you're okay!

11. snypher ◴[] No.41890163[source]
So just a few more years of death and injury until they reach a finished product?
replies(4): >>41894278 #>>41894434 #>>41895317 #>>41895493 #
12. misiti3780 ◴[] No.41890174[source]
I have the same experience: 12.5 is insanely good. HN is full of people that don't want self-driving to succeed for some reason. Fortunately, it's clear as day to some of us that Tesla's approach will work.
replies(4): >>41890270 #>>41890473 #>>41893961 #>>41897823 #
13. bastawhiz ◴[] No.41890177[source]
> At the current rate of improvement it will be quite good within a year

I'll believe it when I see it. I'm not sure "quite good" is the next step after "feels dangerous".

replies(1): >>41894658 #
14. ethbr1 ◴[] No.41890270[source]
Curiosity about why they're against it, and articulating why you think it will work, would be more helpful.
replies(1): >>41890659 #
15. delusional ◴[] No.41890374[source]
> That said, there is a night and day difference between FSD 12.3 that you experienced earlier this year and the latest version 12.6

>And I don't even have 12.6 yet, this is still 12.5;

How am I supposed to take anything you say seriously when your only claim is a personal anecdote that doesn't even apply to your own argument? Please think about what you're writing, and please stop repeating information you heard on YouTube as if it's fact.

This is one of the reasons (among many) that I can't take Tesla boosters seriously. I have absolutely zero faith in your anecdote that you didn't touch the steering wheel. I bet it's a lie.

replies(3): >>41890454 #>>41890462 #>>41891686 #
16. wstrange ◴[] No.41890395[source]
I have a 2024 Model 3, and it's a great car. That being said, I'm under no illusion that the car will ever be self-driving (unsupervised).

12.5.6 still fails to read very obvious signs for 30 km/h playground zones.

The current vehicles lack sufficient sensors, and likely do not have enough compute power and memory to cover all edge cases.

I think it's a matter of time before Tesla faces a lawsuit over continual FSD claims.

My hope is that the board will grow a spine and bring in a more focused CEO.

Hats off to Elon for getting Tesla to this point, but right now they need a mature (and boring) CEO.

replies(1): >>41891728 #
17. eric_cc ◴[] No.41890441[source]
Is it possible you have a lemon? Genuine question. I’ve had nothing but positive experiences with FSD for the last several months and many thousands of miles.
replies(4): >>41890737 #>>41893857 #>>41894414 #>>41898678 #
18. modeless ◴[] No.41890454[source]
The version I have is already a night and day difference from 12.3 and the current version is better still. Nothing I said is contradictory in the slightest. Apply some basic reasoning, please.

I didn't say I didn't touch the steering wheel. I had my hands lightly touching it most of the time, as one should for safety. I occasionally used the controls on the wheel as well as the accelerator pedal to adjust the set speed, and I used the turn signal to suggest lane changes from time to time, though most lane choices were made automatically. But I did not turn the wheel. All turning was performed by the system. (If you turn the wheel manually the system disengages). Other than parking, as I mentioned, though FSD did handle some navigation into and inside parking lots.

19. eric_cc ◴[] No.41890462[source]
I can second this experience. I rarely touch the wheel anymore. I’d say I’m 98% FSD. I take over in school zones, parking lots, and complex construction.
20. eric_cc ◴[] No.41890473[source]
Completely agree. It’s very strange. But honestly it’s their loss. FSD is fantastic.
replies(1): >>41895165 #
21. jeffbee ◴[] No.41890547[source]
If I had a dime for every hackernews who commented that FSD version X was like a revelation compared to FSD version X-ε I'd have like thirty bucks. I will grant you that every release has surprisingly different behaviors.

Here's an unintentionally hilarious meta-post on the subject https://news.ycombinator.com/item?id=29531915

replies(3): >>41890595 #>>41890815 #>>41896509 #
22. modeless ◴[] No.41890595[source]
Sure, plenty of people have been saying it's great for a long time, when it clearly was not (looking at you, Whole Mars Catalog). I was not saying it was super great back then. I have consistently been critical of Elon for promising human level self driving "next year" for like 10 years in a row and being wrong every time. He said it this year again and I still think he's wrong.

But the rate of progress I see right now has me thinking that it may not be more than two or three years before that threshold is finally reached.

replies(1): >>41890818 #
23. misiti3780 ◴[] No.41890659{3}[source]
It's evident to Tesla drivers using Full Self-Driving (FSD) that the technology is rapidly improving and will likely succeed. The key reason for this anticipated success is data: any reasonably intelligent observer recognizes that training exceptional deep neural networks requires vast amounts of data, and Tesla has accumulated more relevant data than any of its competitors. Tesla recently held a robotaxi event, explicitly informing investors of their plans to launch an autonomous competitor to Uber. While Elon Musk's timeline predictions and politics may be controversial, his ability to achieve results and attract top engineering and management talent is undeniable.
replies(5): >>41892088 #>>41893945 #>>41894709 #>>41895153 #>>41895855 #
24. ben_w ◴[] No.41890737{3}[source]
I've had nothing but positive experiences with ChatGPT-4o, that doesn't make people wrong to criticise either as modelling their training data too much and generalising too little when they need to use it for something where the inference domain is too far outside the training domain.
25. Laaas ◴[] No.41890815[source]
Doesn’t this just mean it’s improving rapidly which is a good thing?
replies(1): >>41891562 #
26. ben_w ◴[] No.41890818{3}[source]
The most important lesson I've had from me incorrectly predicting in 2009 that we'd have cars that don't come with steering wheels in 2018, and thinking that the progress I saw each year up to then was consistent with that prediction, is that it's really hard to guess how long it takes to walk the fractal path that is software R&D.

How far are we now, 6 years later than I expected?

Dunno.

I suspect it's gonna need an invention on the same level as Diffusion or Transformer models to be able to get all the edge cases we can get, and that might mean we only get it with human level AGI.

But I don't know that, it might be we've already got all we need architecture-wise and it's just a matter of scale.

Only thing I can be really sure of is we're making progress "quite fast" in a non-objective use of the words — it's not going to need a re-run of 6 million years of mammalian evolution or anything like that, but even 20 years wall clock time would be a disappointment.

replies(1): >>41890938 #
27. modeless ◴[] No.41890938{4}[source]
Waymo went driverless in 2020, maybe you weren't that far off. Predicting that in 2009 would have been pretty good. They could and should have had vehicles without steering wheels anytime since then, it's just a matter of hardware development. Their steering wheel free car program was derailed when they hired traditional car company executives.
replies(1): >>41891427 #
28. ben_w ◴[] No.41891427{5}[source]
Waymo for sure, but I meant also without any geolock etc., so I can't claim credit for my prediction.

They may well best Tesla to this, though.

replies(1): >>41896217 #
29. jeffbee ◴[] No.41891562{3}[source]
No, the fact that people say FSD is on the verge of readiness constantly for a decade means there is no widely shared benchmark.
30. jsjohnst ◴[] No.41891686[source]
> I have absolutely zero faith in your anecdote that you didn't touch the steering wheel. I bet it's a lie.

I’m not GP, but I can share video showing it driving across residential, city, highway, and even gravel roads all in a single trip without touching the steering wheel a single time over a 90min trip (using 12.5.4.1).

replies(1): >>41892020 #
31. pelorat ◴[] No.41891728[source]
The board is family and friends, so them ousting him will never happen.
replies(1): >>41893514 #
32. jsjohnst ◴[] No.41892020{3}[source]
And if someone wants to claim I'm cherry-picking the video, I'm happy to shoot a new one with this post visible on an iPad in the seat next to me. Is it autonomous? Hell no. Can it drive in Manhattan? Nope. But can it do >80% of my regular city (suburb outside NYC) and highway driving? Yep.
replies(1): >>41903350 #
33. ryandrake ◴[] No.41892088{4}[source]
Then why have we been just a year or two away from actual working self-driving, for the last 10 years? If I told my boss that my project would be done in a year, and then the following year said the same thing, and continued that for years, that’s not what “achieving results” means.
34. m463 ◴[] No.41893442[source]
> the rate of improvement in the past two months has been much faster than before.

I suspect the free trials let tesla collect orders of magnitude more data on events requiring human intervention. If each one is a learning event, it could exponentially improve things.

I tried it on a loaner car and thought it was pretty good.

One bit of feedback I would give tesla - when you get some sort of FSD message on the center screen, make the text BIG and either make it linger more, or let you recall it.

For example, it took me a couple tries to read the message that gave instructions on how to give tesla feedback on why you intervened.

EDIT: look at this graph

https://electrek.co/wp-content/uploads/sites/3/2024/10/Scree...

35. dboreham ◴[] No.41893514{3}[source]
At some point the risk of going to prison overtakes family loyalty.
replies(1): >>41894646 #
36. kelnos ◴[] No.41893857{3}[source]
If the incidence of problems is some relatively small number, like 5% or 10%, it's very easily possible that you've never personally seen a problem, but overall we'd still consider that the total incidence of problems is unacceptable.

Please stop presenting arguments of the form "I haven't seen problems so people who have problems must be extreme outliers". At best it's ignorant, at worst it's actively in bad faith.
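
To put rough numbers on this (mine, purely illustrative, not from the thread): even a meaningful per-drive problem rate leaves plenty of room for any one driver to see a long clean streak.

```python
# Illustrative sketch: assume each drive independently has a problem
# with probability p. One driver's clean record is weak evidence.
def p_clean(p: float, drives: int) -> float:
    """Probability of seeing zero problems across `drives` drives."""
    return (1 - p) ** drives

# Even at a 5% per-drive problem rate, roughly 1 in 5 daily drivers
# goes a whole month without ever seeing an issue.
print(round(p_clean(0.05, 30), 3))
```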

37. kelnos ◴[] No.41893945{4}[source]
> It's evident to Tesla drivers using Full Self-Driving (FSD) that the technology is rapidly improving and will likely succeed

Sounds like Tesla drivers have been at the Kool-Aid then.

But to be a bit more serious, the problem isn't necessarily that people don't think it's improving (I do believe it is) or that they will likely succeed (I'm not sure where I stand on this). The problem is that every year Musk says the next year will be the Year of FSD. And every next year, it doesn't materialize. This is like the Boy Who Cried Wolf; Musk has zero credibility with me when it comes to predictions. And that loss of credibility affects my feeling as to whether he'll be successful at all.

On top of that, I'm not convinced that autonomous driving that only makes use of cameras will ever be reliably safer than human drivers.

replies(1): >>41896091 #
38. kelnos ◴[] No.41893961[source]
> HN is full of people that dont want self driving to succeed for some reason.

I would love for self-driving to succeed. I do long-ish car trips several times a year, and it would be wonderful if instead of driving, I could be watching a movie or working on something on my laptop.

I've tried Waymo a few times, and it feels like magic, and feels safe. Their record backs up that feeling. After everything I've seen and read and heard about Tesla, if I got into a Tesla with someone who uses FSD, I'd ask them to drive manually, and probably decline the ride entirely if they wouldn't honor my request.

> fortunately, it's clear as day to some of us that tesla approach will work

And based on my experience with Tesla FSD boosters, I expect you're basing that on feelings, not on any empirical evidence or actual understanding of the hardware or software.

replies(1): >>41903313 #
39. latexr ◴[] No.41893970[source]
> At the current rate of improvement it will be quite good within a year and in two or three I could see it actually reaching the point where it could operate unsupervised.

That’s not a reasonable assumption. You can’t just extrapolate “software rate of improvement”, that’s not how it works.

replies(1): >>41895883 #
40. quailfarmer ◴[] No.41894278[source]
If the answer was yes, presumably there’s a tradeoff where that deal would be reasonable.
41. londons_explore ◴[] No.41894414{3}[source]
I suspect the performance might vary widely depending on if you're on a road in california they have a lot of data on, or if its a road FSD has rarely seen before.
42. josefx ◴[] No.41894426[source]
> it will be quite good within a year

The regressions are getting worse. For the first release anouncement it was only hitting regulatory hurdles and now the entire software stack is broken? They should fire whoever is in charge and restore the state Elon tried to release a decade ago.

43. londons_explore ◴[] No.41894434[source]
So far, data points to it having far fewer crashes than a human alone. Teslas data shows that, but 3rd party data seems to imply the same.
replies(2): >>41894584 #>>41895126 #
44. rvnx ◴[] No.41894584{3}[source]
It disconnects in dangerous situations, so a disengagement every 33 to 77 miles driven (depending on the version), versus roughly 400,000 miles between crashes for a human.
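
Taking those figures at face value (and noting that a disengagement is not a crash, so this is only a crude comparison):

```python
# Crude gap implied by the cited numbers. Disengagements and crashes
# measure different events, so treat this as a rough bound, not a rate.
human_miles_per_crash = 400_000          # figure cited for human drivers
fsd_miles_per_disengagement = (33, 77)   # cited range, version-dependent

for miles in fsd_miles_per_disengagement:
    ratio = human_miles_per_crash / miles
    print(f"every {miles} mi: {ratio:,.0f}x short of the human benchmark")
```
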
45. dlisboa ◴[] No.41894646{4}[source]
There is no risk of going to prison. It just doesn't happen, never has and never will, no matter how unfair that is. Board members and CEOs are not held accountable, ever.
replies(2): >>41894892 #>>41895119 #
46. rvnx ◴[] No.41894658[source]
"Just round the corner" (2016)
replies(1): >>41897808 #
47. Animats ◴[] No.41894709{4}[source]
> and Tesla has accumulated more relevant data than any of its competitors.

Has it really? How much data is each car sending to Tesla HQ? Anybody actually know? That's a lot of cell phone bandwidth to pay for, and a lot of data to digest.

Vast amounts of data about routine driving is not all that useful, anyway. A "highlights reel" of interesting situations is probably more valuable for training. Waymo has shown some highlights reels like that, such as the one where someone in a powered wheelchair is chasing a duck in the middle of a residential street.

replies(1): >>41896324 #
48. rvnx ◴[] No.41894892{5}[source]
https://fortune.com/2023/01/24/google-meta-spotify-layoffs-c...

As they say, they take "full responsibility"

49. llamaimperative ◴[] No.41895119{5}[source]
https://www.justice.gov/opa/pr/former-enron-ceo-jeffrey-skil...
50. llamaimperative ◴[] No.41895126{3}[source]
Tesla does not release the data required to substantiate such a claim. It simply doesn’t and you’re either lying or being lied to.
replies(1): >>41895194 #
51. llamaimperative ◴[] No.41895153{4}[source]
The crux of the issue is that your interpretation of performance cannot be trusted. It is absolutely irrelevant.

Even a system that is 99% reliable will honestly feel very, very good to an individual operator, but would result in huge loss of life when scaled up.

Tesla can earn more trust by releasing the data necessary to evaluate the system's performance. The fact that they do not is far more informative than a bunch of commentators saying "hey, it's better than it was last month!" for the last several years, even if it is true that it's getting better and even if it's true it's hypothetically possible to get to the finish line.
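
The scale argument above is easy to make concrete; the fleet numbers here are made up for illustration, not from the thread:

```python
# Hypothetical scale numbers: a 99%-per-trip success rate feels
# flawless to any one driver, yet fails constantly fleet-wide.
per_trip_failure_rate = 0.01   # "99% reliable"
fleet_size = 1_000_000         # assumed number of deployed cars
trips_per_day = 2              # assumed trips per car per day

failures_per_day = fleet_size * trips_per_day * per_trip_failure_rate
print(failures_per_day)  # 20000.0 failure events, every single day
```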

52. llamaimperative ◴[] No.41895165{3}[source]
Very strange not wanting poorly controlled 4,000 lb steel cages driving around at 70 mph, stewarded by people who call "only had to stop it from killing me 4 times today!" a great success.
53. londons_explore ◴[] No.41895194{4}[source]
tesla releases this data: https://www.tesla.com/VehicleSafetyReport
replies(3): >>41895375 #>>41896186 #>>41897290 #
54. Peanuts99 ◴[] No.41895317[source]
If this is what society has to pay to improve Tesla's product, then perhaps they should have to share the software with other car manufacturers too.

Otherwise every car brand will have to kill a whole heap of people too until they manage to make a FSD system.

replies(1): >>41896045 #
55. rainsford ◴[] No.41895375{5}[source]
That data is not an apples to apples comparison unless autopilot is used in exactly the same mix of conditions as human driving. Tesla doesn't share that in the report, but I'd bet it's not equivalent. I personally tend to turn on driving automation features (in my non-Tesla car) in easier conditions and drive myself when anything unusual or complicated is going on, and I'd bet most drivers of Teslas and otherwise do the same.

This is important because I'd bet similar data on the use of standard, non-adaptive cruise control would similarly show it's much safer than human drivers. But of course that would be because people use cruise control most in long-distance highway driving outside of congested areas, where you're least likely to have an accident.
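
This selection effect is a textbook Simpson's-paradox setup. With toy numbers of my own choosing (not Tesla's), a system that is worse in every condition can still post a better blended rate:

```python
# Toy numbers, purely illustrative: autopilot logs mostly easy highway
# miles; humans drive all the hard miles. Even if autopilot is *worse*
# in every single condition, the aggregate rate can look better.
#                 miles driven, crashes per million miles
autopilot = {"highway": (90, 0.2), "city": (10, 2.5)}
human     = {"highway": (30, 0.1), "city": (70, 2.0)}

def blended_rate(mix):
    miles = sum(m for m, _ in mix.values())
    crashes = sum(m * r for m, r in mix.values())
    return crashes / miles

print(round(blended_rate(autopilot), 3))  # looks "safer" in aggregate
print(round(blended_rate(human), 3))      # despite better per-condition rates
```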

56. the8472 ◴[] No.41895493[source]
We also pay this price with every new human driver we train, again and again.
replies(1): >>41898694 #
57. KaiserPro ◴[] No.41895855{4}[source]
Tesla's sensor suite does not support safe FSD.

It relies on inferred depth from a single point of view. This means that the depth/positioning info for the entire world is noisy.

From a safety-critical point of view it's also bollocks, because a single birdshit/smear/raindrop/oil patch can render the entire system inoperable. Does it degrade safely? Does it fuck.

> recognizes that training exceptional deep neural networks requires vast amounts of data,

You missed good data. Recording generic drivers' journeys isn't going to yield good data, especially if the people driving aren't very good. You need a bunch of decent drivers doing specific scenarios.

Moreover, that data isn't easily generalisable to other sensor suites. Add another camera? Yeah nah, new model.

> Tesla recently held a robotaxi event, explicitly informing investors of their plans

When has Musk ever delivered on time?

> his ability to achieve results

most of those results aren't that great. Tesla isn't growing anymore; it's reliant on state subsidies to be profitable. They still only ship 400k units a quarter, which is tiny compared to VW's 2.2 million.

> attract top engineering and management talent is undeniable

Most of the decent computer vision people are not at Tesla. Hardware-wise, their factories aren't fun places to be. He's a dick to work for, capricious and vindictive.

58. modeless ◴[] No.41895883[source]
The timing of the rate of improvement increasing corresponds with finishing their switch to end-to-end machine learning. ML does have scaling laws actually.

Tesla collects their own data, builds their own training clusters with both Nvidia hardware and their own custom hardware, and deploys their own custom inference hardware in the cars. There is no obstacle to them scaling up massively in all dimensions, which basically guarantees significant progress. Obviously you can disagree about whether that progress will be enough, but based on the evidence I see from using it, I think it will be.

59. modeless ◴[] No.41896045{3}[source]
Elon has said many times that they are willing to license FSD but nobody else has been interested so far. Clearly that will change if they reach their goals.

Also, "years of death and injury" is a bald-faced lie. NHTSA would have shut down FSD a long time ago if it were happening. The statistics Tesla has released to the public are lacking, it's true, but they cannot hide things from the NHTSA. FSD has been on the road for years and a billion miles and if it was overall significantly worse than normal driving (when supervised, of course) the NHTSA would know by now.

The current investigation is about performance under specific conditions, and it's possible that improvement is possible and necessary. But overall crash rates have not reflected any significant extra danger by public use of FSD even in its primitive and flawed form of earlier this year and before.

60. modeless ◴[] No.41896091{5}[source]
I have consistently been critical of Musk for this over the many years it's been happening. Even right now, I don't believe FSD will be unsupervised next year like he just claimed. And yet, I can see the real progress and I am convinced that while it won't be next year, it could absolutely happen within two or three years.

One of these years, he is going to be right. And at that point, the fact that he was wrong for a long time won't diminish their achievement. As he likes to say, he specializes in transforming technology from "impossible" to "late".

> I'm not convinced that autonomous driving that only makes use of cameras will ever be reliably safer than human drivers.

Believing this means that you believe AIs will never match or surpass the human brain. Which I think is a much less common view today than it was a few years ago. Personally I think it is obviously wrong. And also I don't believe surpassing the human brain in every respect will be necessary to beat humans in driving safety. Unsupervised FSD will come before AGI.

61. llamaimperative ◴[] No.41896186{5}[source]
Per the other comment: no, they don't. This data is not enough to evaluate its safety. This is enough data to mislead people who spend <30 seconds thinking about the question though, so I guess that's something (something == misdirection and dishonesty).

You've been lied to.

62. IX-103 ◴[] No.41896217{6}[source]
Waymo is using full lidar and other sensors, whereas Tesla is relying on pure vision systems (to the point of removing radar on newer models). So they're solving a much harder problem.

As for whether it's worthwhile to solve that problem when having more sensors will always be safer, that's another issue...

replies(1): >>41896547 #
63. jeffbee ◴[] No.41896324{5}[source]
Anyone who believes Tesla beats Google because they are better at collecting and handling data can be safely ignored.
replies(1): >>41900783 #
64. kylecordes ◴[] No.41896509[source]
On one hand, it really has gotten much better over time. It's quite impressive.

On the other hand, I fear/suspect it is asymptotically, rather than linearly, approaching good enough to be unsupervised. It might get halfway there, each year, forever.

65. ben_w ◴[] No.41896547{7}[source]
Indeed.

While it ought to be possible to solve for just RGB… making it needlessly hard for yourself is a fun hack-day side project, not a valuable business solution.

66. FireBeyond ◴[] No.41897290{5}[source]
No, it releases enough data to actively mislead you (because there is no way Tesla's data people are unaware of these factors):

The report measures accidents in FSD mode. Qualifiers to FSD mode: the conditions, weather, road, location, traffic all have to meet a certain quality threshold before the system will be enabled (or not disable itself). Compare Sunnyvale on a clear spring day to Pittsburgh December nights.

There's no qualifier to the "comparison": all drivers, all conditions, all weather, all roads, all location, all traffic.

It's not remotely comparable, and Tesla's data people are not that stupid, so it's willfully misleading.

This report does not include fatalities. It also doesn't consider any incident where there was not airbag deployment to be an accident. Sounds potentially reasonable until you consider:

- first gen airbag systems were primitive: collision exceeds threshold, deploy. Currently, vehicle safety systems consider duration of impact, speeds, G-forces, amount of intrusion, angle of collision, and a multitude of other factors before deciding what, if any, systems to fire (seatbelt tensioners, airbags, etc.) So hit something at 30mph with the right variables? Tesla: "this is not an accident".

- Tesla also does not consider "incident was so catastrophic that airbags COULD NOT deploy" to be an accident, because "airbags didn't deploy". This umbrella could also include egregious cases of "systems failed to deploy for any reason, up to and including poor assembly-line quality control" as not an accident and also not counted.
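
The effect of that filter is mechanical. With assumed numbers (mine, for illustration only):

```python
# Assumed numbers, purely illustrative: if only airbag-deployment
# events are counted, the reported rate scales down by the deployment
# probability, regardless of what the true accident rate is.
true_accidents_per_million_miles = 2.0
p_airbag_deploys_given_accident = 0.4   # modern systems often hold fire

reported_rate = true_accidents_per_million_miles * p_airbag_deploys_given_accident
print(reported_rate)  # undercounts the true rate by a factor of 2.5
```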

67. FireBeyond ◴[] No.41897808{3}[source]
Musk in 2016 (these are quotes, not paraphrases): "Self driving is a solved problem. We are just tuning the details."

Musk in 2021: "Right now our highest priority is working on solving the problem."

68. FireBeyond ◴[] No.41897823[source]
I would love self-driving to succeed. I should be a Tesla fan, because I'm very much a fan of geekery and tech anywhere and everywhere.

But no. I want self-driving to succeed, and when it does (which I don't think is that soon, because the last 10% takes 90% of the time), I don't think Tesla or their approach will be the "winner".

69. dham ◴[] No.41898678{3}[source]
A lot of haters mistake safety-critical disengagements for "oh, the car is doing something I don't like or wouldn't do."

If you treat the car like it's a student driver or someone else driving, disengagements will go down. If you treat it like you're the one driving, there's always something to complain about.

70. dham ◴[] No.41898694{3}[source]
You won't be able to bring logic to people with Elon derangement syndrome.
71. ethbr1 ◴[] No.41900783{6}[source]
The argument wouldn't be "better at" but simply "more".

Sensor platforms deployed at scale, that you have the right to take data from, are difficult to replicate.

replies(1): >>41905959 #
72. misiti3780 ◴[] No.41903313{3}[source]
Time will show I'm right and you're wrong.
73. lucianbr ◴[] No.41903350{4}[source]
It's so obviously cherry-picking, I have no idea what you're even thinking. For it not to be cherry-picking, the system would have to actually be ready and work fine in all situations, and there's no way Musk wouldn't shout that from the rooftops and sell it yesterday.

Obviously it works some of the time on some roads, but not all the time on all roads. A video of it working on a road where it works is cherry-picking. Look up what the term means.

74. jeffbee ◴[] No.41905959{7}[source]
For most organizations data is a burden rather than a benefit. Tesla has never demonstrated that they can convert data to money, while that is the sole purpose of everything Google has built for decades.