Tesla braces for its first trial involving Autopilot fatality::Tesla Inc is set to defend itself for the first time at trial against allegations that failure of its Autopilot driver assistant feature led to death, in what will likely be a major test of Chief Executive Elon Musk’s assertions about the technology.
The headline makes it sound like Tesla is trialing a new ‘fatality’ feature for its Autopilot.
Well, someone has to invent the suicide booths featured in Futurama. Might as well be him.
I really want to believe you’re just making a dark joke, but the sheer concept of suicide booths is a very harsh critique of a failed society. A very failed society. For it to become a joke… Call me square, but that is a joke aimed at whoever laughs at it.
https://youtu.be/EbmQxZkSswI?si=0lcguQyWQxUggaB5
It’s a joke, but a suicide booth isn’t that bad; assisted, pain-free death is a right everyone should have.
But having it on a street corner for ease of access is pretty fucked
My country is going through a very disputed approval over legislation for medically assisted death, for incurable conditions.
It was sent to the Constitutional Court three times and twice vetoed by the president, once for political reasons.
The majority of the population supports it.
Good luck with that. Hope it can alleviate some people’s suffering.
We’ve basically got it all legal in Australia now, last state ratifies their laws in November.
Making it accessible on what might be a fleeting impulse would be a huge problem, though, in the case of Futurama-style suicide booths.
Autopilot is not safe.
https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/
Isn’t it a glorified cruise control/lane guidance system, rather than an actual automated driving system? So it would be about as safe as those are, rather than being something you can just leave alone to handle its own business, like a robotic vacuum cleaner.
Driving a car is not safe. 40,000 people die in car crashes every year in the US alone. Nothing in that article indicates that Autopilot/FSD is more dangerous than a human driver, just that they’re flawed systems, as is expected. It’s good to keep in mind that a 99.99% safety rating still means 33,000 accidents a year in the US alone.
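To make that arithmetic explicit, here’s a quick sketch (assuming ~330 million people in the US and reading “99.99% safe” as a 0.01% chance of a crash per person per year; both are assumptions for illustration):

```python
# Rough sketch of the "99.99% safe still means 33,000 accidents" claim.
# Assumptions (not from any official source): ~330M people in the US,
# and "99.99% safe" read as a 0.01% crash probability per person per year.
us_population = 330_000_000
safety_rate = 0.9999
crash_probability = 1 - safety_rate           # 0.01%
expected_crashes = us_population * crash_probability
print(f"{expected_crashes:,.0f} expected crashes per year")  # 33,000
```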
You can’t just put something on the streets without first verifying it’s safe and working as intended. This is missing for Autopilot. And the data that’s piling up is showing that Autopilot is deadly.
First of all, what is it that you consider safe? I’m sure you realize that a 100% safety rating is just fantasy, so what is the acceptable rate of accidents for you?
Secondly, would you mind sharing the data “that’s piling up showing that Autopilot is deadly”? Reports of individual incidents are not what I’m asking for because, as I stated above, you’re not going to get 100% safety, so there are always going to be individual incidents to talk about.
You also seem to be talking about the FSD beta and Autopilot interchangeably, though they’re different things. Hope you realize this.
There are very strict regulations around what is allowed to be on the streets and what isn’t. This is what protects us from sloppy companies releasing unsafe stuff onto the streets.
Driver assist features like Autopilot are operating in a regulatory grey zone. Regulation has not caught up with the technology, and this allows companies like Tesla to release unsafe software onto the streets, killing people.
Exactly. Driver assist features. These aren’t something to be blindly relied on, everyone knows this, and the vehicle will remind you. Every crash is the fault of the driver - not the system.
Now, if you don’t mind, show me the data that’s “piling up showing that Autopilot is deadly”.
Exactly. Driver assist features.
Except Tesla isn’t selling them as such. Their advertising videos as early as 2016 say “the driver is not necessary, the car is driving itself”. This is false marketing in its purest and simplest form: https://www.theguardian.com/technology/2023/jan/17/tesla-self-driving-video-staged-testimony-senior-engineer
I’m still waiting for the data that you said is piling up. You also did not specify what number of accidents you find acceptable for a self driving system. It’s almost like you’re trying to evade my questions…
Driving is not safe. These systems could be improved upon, but they’ve also saved numerous lives by preventing accidents from occurring in the first place. The example in the OP happened while this driver was sitting behind the wheel watching a movie. The first example in your article occurred with a driver behind the wheel. If either of them had been driving a 1995 Honda Civic, these accidents would have occurred just the same, but would anyone be demanding that Honda is to blame?
No, we would (rightfully so) blame the driver for merging into a semi truck that from my understanding was clearly visible.
but they’ve also saved numerous lives by preventing accidents from occurring in the first place.
There is no data to make this claim. You’re just making this up.
Give me a break. You think all these companies are dumping billions of dollars into technology that doesn’t work? You’re making stuff up. Go watch some dashcam videos on YouTube if you want some proof.
Are you kidding me? I never said it will never work. But that does not mean its current state is safe enough to trust with your life.
You did in fact just say that, by saying that I was making up the fact that these systems have saved lives. Moving the goalposts to “you can’t trust your life to it” doesn’t make your original argument any more accurate, nor does it reference anything in dispute. Nobody said you should trust your life to cruise control.
There is no doubt that one day these systems will be so good that they will make transportation much safer. But there is no data that shows that we’re already there.
You mean you’ve done zero research on the topic before injecting your opinions, so you simply haven’t seen any data?
https://thedriven.io/2023/04/27/accident-rate-for-tesla-80-lower-than-us-average-with-fsd/
New data released in its Impact Report show that Tesla vehicles with Autopilot engaged (mostly highway miles) had just 0.18 accidents per million miles driven, compared to the US vehicle average of 1.53 accidents per million miles.
A statistically significant 16% reduction in the risk of involvement in all casualty crashes of these types, and a 22% reduction for fatal and serious injury crashes, was estimated to be associated with LKA fitment to Australian light vehicles.
https://pubmed.ncbi.nlm.nih.gov/27624313/
The analysis showed a positive effect of the LDW/LKA systems in reducing lane departure crashes. The LDW/LKA systems were estimated to reduce head-on and single-vehicle injury crashes on Swedish roads with speed limits between 70 and 120 km/h and with dry or wet road surfaces (i.e., not covered by ice or snow) by 53% with a lower limit of 11% (95% confidence interval [CI]). This reduction corresponded to a reduction of 30% with a lower limit of 6% (95% CI) for all head-on and single-vehicle driver injury crashes (including all speed limits and all road surface conditions).
https://www.forbes.com/advisor/car-insurance/vehicle-safety-features-accidents/
ADAS functionalities can change the driving experience. According to research by LexisNexis Risk Solutions, ADAS vehicles showed a 27% reduction in bodily injury claim frequency and a 19% reduction in property damage frequency.
The second trial, set for early October in a Florida state court, arose out of a 2019 crash north of Miami where owner Stephen Banner’s Model 3 drove under the trailer of an 18-wheeler big rig truck that had pulled into the road, shearing off the Tesla’s roof and killing Banner. Autopilot failed to brake, steer or do anything to avoid the collision, according to the lawsuit filed by Banner’s wife.
Is this the guy who was literally paying no attention to the road at all and was watching a movie whilst the car was in motion?
I legit can’t find information on it now as every result I can find online is word for word identical to that small snippet. Such is modern journalism.
I know people like to get a hard-on over the word “autopilot”, but even real pilots with real autopilot still need to “keep an eye on things” when the system is engaged. This is why we have two humans in the cockpit on those big commercial jets.
The way Musk marketed it was as a “self driving” feature, not a driving assist. Yes, with all current smart assists you need to be carefully watching what the system is doing, but that’s not what it was made out to be. Because of that, I’d still say Tesla is responsible.
I think you’re referring to FSD beta and not Autopilot. One is supposed to be the self driving feature at some point while the other is simply lane keeping/cruise control. FSD wasn’t even available when this crash happened.
Tesla’s Autopilot is driving assistance. I don’t know where you saw Musk marketing it as a self driving feature. Hell, even for the misnomer “full self driving” they note:
The currently enabled features require a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.
The feature is called “Autopilot”, meaning that the car automatically pilots itself, rather than using a human pilot. The definition of autopilot is literally “a device for keeping an aircraft or other vehicle on a set course without the intervention of the pilot.” I’m not sure how he could have more explicitly misrepresented the product.
meaning that the car automatically pilots itself, rather than using a human pilot
No it doesn’t. Even an airplane autopilot only maintains the course set by the pilot; it’s not capable of making decisions and navigating autonomously.
All technologies in publicly sold vehicles today and in recent years are driver-assistance technologies and require the driver’s attention. Anybody using the tech without paying attention is being negligent.
Autopilot is capable of navigating, though, and it does make decisions like when to merge and when to execute a turn, by design. I don’t think it’s adequately equipped to make those decisions, but by design, it does. They even advertise it on their official YouTube channel, with a clip of them just plugging in a destination and letting the car get them there. Tesla is responsible for the advertising it does, and for claims it makes about its product that simply aren’t true.
This is FSD, not autopilot. Also note the driver is paying attention.
It is two different modes of the same system, one just has more features enabled than the other. You also can’t tell if the driver is paying attention, as they are mostly out of frame. Even if they are, their hands are entirely off the wheel, and it’s unlikely that they would be able to react in time to prevent an accident even if they are paying attention.
Autopilot is capable of basically flying the plane itself. A human is there for when shit goes wrong.
There are also two pilots, because they know people are people. And don’t brand it “self driving” and “full self driving”, then.
It seems like an obvious flaw that’s pretty simple to explain. The car is trained to process collision information at a set height, so the opening between the wheels of a truck’s trailer could be treated by it as free space. It’s a rare situation, but if it’s confirmed and reproducible, that at least raises concerns about how many other glitches drivers will learn about by surprise.
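As a toy illustration of that failure mode (entirely hypothetical logic and threshold, not Tesla’s actual perception code):

```python
# Toy sketch: a naive obstacle check that only treats sensor returns
# below a fixed height as obstacles. The threshold and the whole
# approach are invented for illustration, not Tesla's real pipeline.
OBSTACLE_HEIGHT_M = 1.0  # assumed cutoff: returns above this are ignored

def path_is_clear(detections):
    """detections: list of (distance_m, height_m) sensor returns."""
    return all(height > OBSTACLE_HEIGHT_M for _, height in detections)

# A trailer bed ~1.2 m above the road produces no returns below the
# cutoff, so this filter calls the path clear and drives under it.
trailer_returns = [(30.0, 1.2), (30.0, 1.5)]
print(path_is_clear(trailer_returns))  # True
```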
This is the best summary I could come up with:
SAN FRANCISCO, Aug 28 (Reuters) - Tesla Inc (TSLA.O) is set to defend itself for the first time at trial against allegations that failure of its Autopilot driver assistant feature led to death, in what will likely be a major test of Chief Executive Elon Musk’s assertions about the technology.
Self-driving capability is central to Tesla’s financial future, according to Musk, whose own reputation as an engineering leader is being challenged with allegations by plaintiffs in one of two lawsuits that he personally leads the group behind technology that failed.
The first, scheduled for mid-September in a California state court, is a civil lawsuit containing allegations that the Autopilot system caused owner Micah Lee’s Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour, strike a palm tree and burst into flames, all in the span of seconds.
Banner’s attorneys, for instance, argue in a pretrial court filing that internal emails show Musk is the Autopilot team’s “de facto leader”.
Tesla won a bellwether trial in Los Angeles in April with a strategy of saying that it tells drivers that its technology requires human monitoring, despite the “Autopilot” and “Full Self-Driving” names.
In one deposition, former executive Christopher Moore testified there are limitations to Autopilot, saying it “is not designed to detect every possible hazard or every possible obstacle or vehicle that could be on the road,” according to a transcript reviewed by Reuters.
The original article contains 986 words, the summary contains 241 words. Saved 76%. I’m a bot and I’m open source!
I can’t understand how anyone is even able to let the car do something on its own. I drive an old Dacia Logan and a Renault Scénic, but at work we have a Škoda Karoq, and I can’t even fully trust its beeping backing sensors or automatic handbrake. I can’t imagine the car steering, accelerating or braking without me telling it to.
Aviation is now mostly fully automatic. On the other hand, there are tons of beacons to help it.
There’s less stuff to hit in the air.
No it’s not, at all; there’s still a ton of work for the pilot and first officer despite the autopilot.
I think it’s fine at the level where you are there and ready to take control, but you need to be paying attention still. Humans aren’t flawless and we shouldn’t expect our automated systems to be either. This doesn’t excuse Tesla, because they’ve been marketing it as something it’s not for a long time now. They’re driver assist features, not self driving features. It can keep you in a lane and maintain speed well, but you shouldn’t fully trust it. If it’s better than humans at some tasks, it should be used for those regardless of if it will fail at it sometimes. People shouldn’t be lied to and convinced it’s more than it is though.
I actually think that the less a driver has to do, the worse they’ll be at reacting when a situation does come up.
If I’m actually driving and someone, say, runs out in front of me, I’ll slam on the brakes. I’ve had this happen, actually - it was scary as hell because my brain froze up, but…fortunately for us and the guy, my foot still knew what to do, and we stopped in time.
But if I’m sitting in the seat, just monitoring, not actively doing something, my attention is much more likely to wander, and when that incident happens, my reaction time is likely going to be a LOT slower, because I have to “mode shift” back into operating a car, whereas I was already in that mode in the incident above. I don’t think the manufacturers are adequately considering this factor.
(I recognize this might not be a perfect example with automatic brakes, but I think the point is clear.)
deleted by creator
Although it’s far from perfect, Autopilot gets into a lot fewer accidents per mile than drivers without Autopilot. They have some statistics here: https://www.tesla.com/VehicleSafetyReport
EDIT: As pointed out by commenters in this thread, Autopilot is mainly used on highways, whereas the crash average is for all roads. Also, Tesla only counts a crash if the airbag was deployed, but the numbers they compare against count every crash, including the ones without deployed airbags.
Oh yeah, potentially cherry-picked statistics straight from Tesla. I’ll believe those statistics when they come from someone without a horse in the race to adopt autonomous vehicles.
I think it’s been reported that the FSD statistics they put out are worthless because it tends to disable itself right before collisions.
What’s the motivation to cherrypick though?
Human drivers are bad enough that I don’t think there’s any doubt that autopilot puts them to shame with regards to safety, so they can either look way better and not be suspicious, or look way better and be suspicious… Sounds like an obvious choice to me
They have a financial motivation. You could also just Google self-driving car safety; one of the first hits is an article that calculated the safety of human drivers from data collected in 2021. Turns out humans are already pretty damn safe: there’s roughly a 99.9998% chance of driving with zero accidents.
there’s roughly a 99.9998% chance of driving with zero accidents
I assume you mean accidents with a fatal injury, given there is a ~1% chance that any given death will be from a car accident (17.4 deaths per 100k per year * 70 years = 1.2%). Using your statistic yields closer to 2.5%, and that’s with only one driver dying.
Turns out humans are pretty damn safe
Turns out you’ve been tricked by statistics. Driving is fucking lethal, and chances are most people know or are friends with someone who has died or will die in a car accident (assuming ~80 friends/acquaintances per person); the arithmetic is sketched below.
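A rough sketch of that arithmetic (the 17.4-per-100k rate is the figure quoted above; the 70-year horizon, ~80 acquaintances, and treating each acquaintance as an independent draw are simplifying assumptions):

```python
# Lifetime risk of dying in a car crash, from an annual per-capita rate.
deaths_per_100k_per_year = 17.4   # rate quoted in the thread
years = 70                        # assumed lifetime horizon
lifetime_risk = deaths_per_100k_per_year / 100_000 * years
print(f"lifetime risk: {lifetime_risk:.1%}")               # ~1.2%

# Chance that at least one of ~80 acquaintances dies in a car crash,
# treating each acquaintance as an independent draw with that risk.
acquaintances = 80
p_know_victim = 1 - (1 - lifetime_risk) ** acquaintances
print(f"chance of knowing a victim: {p_know_victim:.0%}")  # ~62%
```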
I don’t think there’s any doubt that autopilot puts them to shame with regards to safety
Where are the numbers to back this up?
Start with the numbers on humans driving drunk, tired, on their phone, while having a conversation, bored, or in practically any other state, and work backwards. Driving is dangerous as fuck, and it’s pretty much universally accepted that the biggest challenge for autonomous vehicles is humans doing unpredictable and stupid shit.
that the biggest challenge for autonomous vehicles is humans doing unpredictable and stupid shit
The biggest challenge when I’m driving is humans or AI doing unpredictable and stupid shit.
You still have not given any numbers to back up your claim. While we all expect that AI will one day be much better than humans in driving, there is no data to say that it currently is.
Ok so sure there’s nothing on Tesla’s autopilot, however that’s not to say there’s nothing on autonomous systems…
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8431415/
In 2018 and 2017, 6,735,000 and 6,453,000 traffic crashes occurred in the United States, which resulted in 33,919 and 34,560 deaths, respectively.
https://www.orsa.org.uk/reducing-occupational-road-risk/reducing-driver-error-accidents/
In reality, car crashes aren’t accidents, and 94% are due to human error. In 2011, British police officers attended 118,404 road traffic collisions (figures from the Department of Transport). In 42% of these crashes, the most frequently reported factor was that the driver ‘failed to look properly’. The second most commonly listed factor, for 21% of the crashes, was the driver ‘failing to judge the other person’s path or speed’. The third most common contributing factor was the driver being actually ‘careless, reckless or in a hurry’, and this accounted for 16% of the crashes.
There’s your stats on humans being reckless and dangerous when driving cars, and of course there’s nothing concrete for fully autonomous cars because they aren’t legal anywhere, but here’s some stats on pretty much every existing driver assist - notably they all prevent accidents compared to just a human driving: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8431415/
It really isn’t a stretch, given that the three most frequent crash causes are human error and that human-assistance tools reduce accident frequency a bunch, to say that all these systems coming together (as they cover near enough everything to do with driving a car) would be safer than a human driver. But I don’t doubt you’ll deny it, since you’re asking for something impossible to give (governments haven’t allowed fully autonomous cars yet, so there are no statistics on their use). You aren’t actually looking for information, but to confirm your biases and feel like you’ve “won”, despite the fact that there’s no objectively unsuspicious data on the exact situation you’re asking about, meaning you can’t prove yourself right either, beyond “I’m a little suspicious of this company so I must be right”.
Those stats are misleading though. Autopilot only runs on highways, which are much safer per mile even for human drivers.
Tesla are basically comparing their system, which only runs in pristine, ideal conditions, against an average human that has to deal with the real world.
As far as I’m aware they haven’t released safety per mile data from the FSD cars yet, and until they do I will remain skeptical about how much safer it currently is.
It actually would be really hard to get an unbiased estimate of safety from the current systems, because the data is inherently cherry-picked by drivers who can switch the feature on/off depending on how complex the driving task is. What a simple number like crashes per mile really measures is how likely FSD drivers are to overestimate the system’s ability, plus some unknown base rate of unavoidable accidents.
Probably the only way to control for this is looking at cars that are fully autonomous door to door and aren’t limited to pre-selected roads/areas. I don’t know that anyone is even doing that sort of testing.
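As a toy illustration of that selection effect, with entirely invented numbers: if assisted miles skew toward easy highway driving, a naive crashes-per-mile comparison flatters the system even when it is exactly as good as a human on every road type.

```python
# Hypothetical crash rates per million miles by road type (invented).
highway_rate = 0.2
city_rate = 3.0

# Humans drive a mix of roads; assume a 50/50 mileage split.
human_rate = 0.5 * highway_rate + 0.5 * city_rate      # 1.60

# Driver assist gets engaged mostly on highways; assume 95/5.
assist_rate = 0.95 * highway_rate + 0.05 * city_rate   # 0.34

print(f"human:  {human_rate:.2f} crashes per million miles")
print(f"assist: {assist_rate:.2f} crashes per million miles")
# The assist looks ~5x safer purely because of where it is used,
# even though per road type it performs exactly like the human.
```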
Doesn’t auto pilot kinda work on normal roads now? Not saying I trust the stat either.
That’d be the FSD stats, not Autopilot.
Hmm you’re right about autopilot mainly being used on highways and those roads are a lot safer. I’ll edit my main comment
According to this report, the average Tesla equipped with FSD Beta, driven on predominantly non-highway sections of road, crashes 0.31 times per million miles, a dramatic decrease from the average American, who crashes 1.53 times every million miles.
Does that report from Tesla include when autopilot turns off shortly before crashing into something?
That’s literally the only data we have, so that’s what I’m basing my opinion on. I’m fully aware that while I doubt these stats lie, they may well be misleading, as statistics often are.
My key argument still stands: Autopilot/FSD is not as bad or dangerous as people here make them out to be, and they’re getting better all the time.
If one is going to make the claim that these systems are more dangerous than a human driver, then show me the data that claim is based on. People surely don’t believe it just because they don’t like the company/CEO, right?
Do you have statistics not by Tesla?
They’re probably the only ones who even have access to such statistics. If you’re simply going to refute the stats because of the source, then at least provide some credible counter-evidence.
Even according to that article, Autopilot and FSD seem to be at about the same level as a human driver. I’m willing to accept that; many others aren’t.
The narrative here is that these systems are dangerous and shouldn’t be allowed on public roads. My argument is that they’re not as dangerous as reading stories about individual incidents makes them seem, and they’re getting better all the time. If they’re not significantly better than human drivers now, they will be soon, and Tesla is most likely going to lead the way.
If you’re simply going to refute the stats because of the source, then at least provide some credible counter-evidence.
Tesla’s numbers are trash. Tesla has been caught lying again and again.
…then provide some more trustworthy stats, because you just saying that is not it. This is literally like debating a climate change denier or flat earther.
“Here’s a picture of the earth from space”
- Lies! Nasa cannot be trusted. CGI.
Why would Tesla release any numbers that would make it look bad?
Why should we trust any numbers that comes from Tesla?
And when autopilot is at fault for an accident or fatality, who should be held responsible?
Just because it’s better shouldn’t absolve them of responsibility when it fails.
It’s an interesting question. But I would be disappointed if self-driving was basically killed off by the legal questions, since it has huge potential to save lives.
The driver is always responsible for using the tools within the car correctly and maintaining control of the vehicle at all times.
Either way the driver would be at fault. However, the driver might be able to make a (completely separate) case that the car’s defects made control impossible, but since the driver always had the option to disable self-driving, I doubt that would go anywhere.
Just like you don’t get off the hook if your cruise control causes an accident… and it doesn’t matter how much Tesla lied about what it may or may not be capable of, because at the end of the day it’s always the driver’s responsibility to know the limitations of the vehicle and disable the feature and take control when necessary.
Which is exactly what this case is claiming, that the software is defective.
And what happens when we progress beyond Level 2 or 3 automation? Then the car is making choices for the driver, choices the driver may not have any say in or realistically be capable of reacting to in an emergency?
Deferring responsibility to the driver under any scenario is a cop-out. We have a long history of engineering qualifications and regulations to ensure safety of the populace, engineers and architects design structures to be safe, plumbers have to plumb to code, heck even cars themselves have a mile long list of compliance requirements. All to ensure the thing that companies build aren’t killing the population, and when they do someone is responsible.
Yet as soon as we start talking about software, it’s “not my problem, dawg.”
This is a guy who was using a glorified cruise control (which is all AP is) at high speed whilst watching a DVD instead of looking at the road.
The software can only help so much. There’s a reason why there are laws requiring attentiveness checks now… people are reckless
This reminds me of when you google whether a certain company or product is good or legit, and the top result is from the company’s own website.