Yes, there are actual "legal" notions of driver assistance "levels," but out of those 3 companies Tesla seems to be taking it the most seriously. Elon has high conviction in the product and has continued to pour billions of dollars into GPUs, talent, supply chains, etc. to make it happen. Other manufacturers don't take it nearly as seriously, unfortunately.
Sure, you might not be able to legally claim that a Tesla is totally "autonomous," but the fact of the matter (to me, at least) is that they are putting in way more effort to solve the problem than legacy auto manufacturers are.
I can step into a Tesla today, press a destination, and go there without touching the wheel or pedals. Sure, it won't be flawless, but the fact is, I can. I can't do the same in any other consumer car, and the closest thing is a Waymo. The effort is there; I think it's just a matter of time before we start seeing the legal stuff play out.
I think there's some chance that Tesla's approach will work out for them, but I'm not optimistic. Getting self-driving software to "impressively good" happened pretty quickly, and even systems from 2016 would usually pass your "go there without touching the wheel or pedals" test. But from there to "as safe as a human" let alone "as safe as we require human-replacing machines to be" turns out to be quite a hard gap to cross.
When you say "what was available," do you mean what a Tesla was capable of? Because I was trying to talk about things like Cruise driving for 90 minutes at night in a city with no intervention: https://youtu.be/KSRPmng1cmA
If you let me pick the roads, I could easily drive 90 minutes at night with no intervention in the LA area: driving from Ventura, down the 126 to the 5 to LAX, can be done with basically no interventions today. It could be done with EAP two years ago too. But FSD works well on essentially arbitrary roads now, and it didn't four years ago.
> driving from Ventura, down the 126 to the 5 to LAX
That's a lot simpler than what Cruise showed in their video. Dealing well with pedestrians, stopped cars, cross streets, etc. is a significant challenge.
Well, there's consistently and there's consistently.
In 2016 Waymo reported their safety drivers intervening once for every 5,100 miles driven [1], which implies to me that on 99% of journeys nobody touched the wheel or pedals.
The problem is that 99% isn't enough: there are tremendous numbers of cars out there, and a busy stretch of freeway would see a disengagement every minute.
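Rough back-of-envelope (the journey length, traffic volume and freeway length below are my own illustrative guesses, not Waymo numbers; only the once-per-5,100-miles rate comes from the report):

    # 1 intervention per 5,100 miles (Waymo's 2016 figure); everything else is a guess
    MILES_PER_INTERVENTION = 5_100

    # Per-journey view: a ~50-mile trip
    journey_miles = 50
    print(f"Chance of an intervention on one trip: {journey_miles / MILES_PER_INTERVENTION:.1%}")
    # -> ~1%, i.e. ~99% of journeys are hands-off

    # Fleet view: a busy 30-mile freeway with ~10,000 vehicles/hour past any point
    vehicle_miles_per_hour = 30 * 10_000          # 300,000 vehicle-miles per hour
    print(f"Disengagements per hour on that stretch: {vehicle_miles_per_hour / MILES_PER_INTERVENTION:.0f}")
    # -> ~59, i.e. roughly one per minute

Great per trip, constant trouble at fleet scale.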
It might be missing a few logging roads, private driveways and carpark lanes - but it shows Google is entirely capable of mapping streets in detail all over the world.
> and even systems from 2016 would usually pass your "go there without touching the wheel or pedals" test
This is complete bullshit. The systems in 2016 were restricted to pre-mapped areas with lots of training (i.e. the Waymo/Cruise approach today). So if you weren't in a tiny slice of San Francisco or a few other training areas, this didn't work.
Even if an area is pre-mapped, if you're operating on public streets you need to handle all sorts of unusual things. Here's a Cruise video recorded in 2016 on SF streets, showing tricky interactions with buses, stopped delivery vehicles, pedestrians, etc.: https://www.youtube.com/watch?v=1Tp6Ubf6mE4
Yeah, I'm not saying they weren't doing impressive things. I'm saying they were still severely limited and would not have passed a "go around without touching the wheel/brakes" test the way a Tesla can across the entire US now.
> of those 3 companies Tesla seems to be taking it the most seriously.
You call lying about it endlessly, misleading marketing, beta-testing on public roads, and not even having L3 properly in production "taking it the most seriously"?
No, I'm mostly referring to buying talent and GPUs.
I can't point to a single Andrej Karpathy or Chris Lattner or Jim Keller working at Mercedes, BMW, GM, etc. (not to mention the not-so-big-name people who are still very, very good). And I also don't see many people crossing over from legacy auto autonomous orgs to OpenAI, Anthropic, DeepMind, etc.
Other manufacturers don't have custom in-car inference chips either, or spend billions in R&D on custom training silicon. This is clearly not a side project for Tesla, whereas for other manufacturers it's an obvious afterthought.
My guess is other manufacturers will just license some AV product from whoever is most successful and try to sell products like they always have - through design language and brand feel, not through breakthrough technological innovation.
So yes, I don't think any other consumer car manufacturer is taking it seriously.
From what I've seen, Tesla FSD would fail a driving test here in the EU within the 1st minute. Even the latest and greatest version behaves like a drunk teenager.
I'm not in the EU, so I can't comment on whether they have special circumstances that make their driving test different from elsewhere in the world, but "behaving like a drunk teenager" is not a descriptor I would use. I use Tesla FSD every day, 2-3 times a day. Over the last few months I've had to intervene (of my own choosing) twice; otherwise it is a completely hands- and feet-off experience.
I travel a hundred miles a day on average, on a mix of local and highway driving, but all on major roads in the city and suburbs.
Driving tests in Europe, certainly in northern Europe and Scandinavia, are considerably stricter than in the US and many other places in the world. In Norway the test alone is not enough; there is also obligatory practice with a qualified instructor, including driving on motorways and on a skid pan to simulate driving on ice.
Would definitely pay more to avoid FSD pilots, because even the texters check their surroundings more often than drivers lulled by false advertising and (at best) beta-grade software.
The snarky posters can stay; they don't hurt my life or property.
Of course it would fail a driving test designed for humans.
Humans would fail a driving test designed for autonomous driving too. We don't have reaction times comparable to computers' to, say, avoid oncoming traffic or an object/animal on the road within a few milliseconds. Or identify dark objects that are easier to see in infrared. Or maintain an exact speed to a few significant digits. Or manually do a good job with traction control.
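For a sense of scale on the reaction-time point (the latencies and speed below are illustrative assumptions, not measurements):

    # Distance covered before any braking even starts, at 70 mph
    speed_m_per_s = 70 * 0.44704          # ~31.3 m/s

    for label, reaction_s in [("attentive human", 1.5), ("computer", 0.1)]:
        print(f"{label}: {speed_m_per_s * reaction_s:.0f} m travelled before reacting")
    # attentive human: ~47 m, computer: ~3 m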
> Yes there are actual "legal" notions of driver assistance "levels," but out of those 3 companies Tesla seems to be taking it the most seriously.
...and yet Tesla is quick to blame the drivers when accidents happen, and has disclaimers requiring supervision in its TOS. Mercedes, on the other hand, takes full responsibility for anything that happens while self-driving is engaged (DrivePilot, not ADAS).