Tesla driver who killed 2 people while using Autopilot must pay $23,000 in restitution without having to serve any jail time

The case is believed to be the first time that U.S. prosecutors have brought felony charges against a motorist who was using a partially automated driving system.
Yeah, judging by the article, Tesla should take some responsibility here. Not that the driver should get off: if your car is blowing through a red light at 120 km/h, you're just not paying proper attention.
Sure, I'd prefer to know exactly how much time passed. Was it 2 seconds or 25? But my premise is that this shouldn't be possible in the software. I recall reading some time ago that Teslas had shut off the software moments before a collision, leaving no time to recover, but I'd have to double-check that. All to shift blame onto the customer.
Automakers should not be allowed to use the unsuspecting public as toys for their experimental software; a car quickly becomes a 1-4 ton death machine. But I think we agree on that.
Oh yeah, I work in software development myself. There's no way I'd trust my life to something like Tesla's Autopilot, which is perpetually in beta, relies on just the camera feed, and is basically run by a manager with clear issues of overpromising and underdelivering (among other things). You can get away with that for a website or a mobile app, but these are people's lives.
Only if the software caused the accident or prevented the driver from avoiding one. Here the software's fault was not slowing down after exiting the highway (which, from experience, must be a very specific situation, because it most certainly does slow down normally). The driver could have disengaged Autopilot or applied the brakes to stop at the red light. The software explicitly states that it can't stop at red lights and alerts the driver when it's about to run one. The fault here is 100% the driver's.
Solely? No. But if the airbag, seatbelt, or self-driving Autopilot feature that they created contributed to someone's death, they are partially responsible and should face consequences. Especially if they market it as a safety feature.
Because unless you plan on becoming a lobbyist, politician, or activist, nothing will change. Sitting around saying "they have to get in trouble in some manner" doesn't accomplish anything. If you want that to happen, given that they're legally protected from what you want, go make a change.