General Motors' robotaxi service suspends driverless operations nationwide
(apnews.com)
from Hypx@kbin.social to technology@lemmy.world on 28 Oct 2023 23:41
https://kbin.social/m/technology@lemmy.world/t/578944
General Motors’ Cruise says it's suspending its driverless operations nationwide as the robotaxi service works to rebuild public trust.
This is the best summary I could come up with:
(tldr: 14 sentences skipped)
In a Tuesday statement, Cruise said it is cooperating with regulators investigating the Oct. 2 accident, and that its engineers are working on a way for its robotaxis to improve their response “to this kind of extremely rare event.”
(tldr: 1 sentences skipped)
Bryant Walker Smith, a University of South Carolina law professor who studies automated vehicles, wants to know “who knew what when?” at Cruise, and maybe GM, following the accident.
(tldr: 3 sentences skipped)
In December of last year, the NHTSA opened a separate probe into reports of Cruise’s robotaxis stopping too quickly or unexpectedly quitting moving, potentially stranding passengers.
(tldr: 1 sentences skipped)
According to an Oct. 20 letter that was made public Thursday, since beginning this probe the NHTSA has received five other reports of Cruise AVs unexpectedly braking with no obstacles ahead.
(tldr: 3 sentences skipped)
Cruise has also previously maintained that its record over driverless miles has outperformed comparable human drivers on safety, notably crash rates.
(tldr: 1 sentences skipped)
Walker Smith notes that there are several possibilities, including distinguishing Cruise’s prospects from its competitors, particularly those who haven’t expanded as aggressively, or a “Tesla scenario” where initial outrage may not lead to prompt, significant changes.
(tldr: 7 sentences skipped)
The original article contains 885 words, the summary contains 213 words. Saved 76%. I’m a bot and I’m open source!
But is this actually true? I hate that they just printed this without any attempt to verify it. Surely some independent body has looked into this by now.
Pretty sure it IS factually true, but the real question is, “why?”. Is it because everyone is wary around a car with a huge-ass camera and sensor system on top that doesn’t have a driver? Or because the system is good?
I am sure it is true in at least some sense because they would be called on an outright lie but there are many ways you can deceive with true numbers. And I don’t trust them to be fully honest.
But if it is accurate I’d like to see an independent analysis rather than the company’s spin on it.
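For what an independent analysis would even be comparing: the usual metric is crashes normalized by exposure, i.e. crashes per million vehicle-miles. A minimal sketch, with entirely made-up numbers for illustration (neither Cruise’s nor NHTSA’s actual figures):

```python
# Hypothetical illustration of a crash-rate comparison per million
# vehicle-miles. All numbers below are invented, not real data.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a crash count by exposure (miles driven)."""
    return crashes / (miles / 1_000_000)

# Made-up exposure data:
av_rate = crashes_per_million_miles(crashes=3, miles=5_000_000)
human_rate = crashes_per_million_miles(crashes=12, miles=10_000_000)

print(f"AV: {av_rate:.2f} vs human: {human_rate:.2f} crashes per 1M miles")
```

Note that even with honest numbers, a raw comparison like this ignores differences in driving environment (routes, speeds, weather, time of day), which is exactly the kind of confounder a company’s own spin can hide and an independent body would need to control for.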
Oh definitely. It could be down to no quality of the cars themselves if their driving record comes from everyone else avoiding them on the road.
From their privacy policy:
getcruise.com/legal/us/privacy-policy/
I suspect there is something more to this than just that. After all, the car in question did this:
It seems like there are unsolvable safety problems going on.
There is no logic for handling this issue, and I swear we’ll see this problem reoccur for every scenario that hasn’t been accounted for. It will happen in almost every self-driving car, simply because the case hasn’t been accounted for.
Yes, the car does not appear to have safety features that let it know a body is caught underneath, but it did try to get out of traffic after the collision.
Since this never happens to human drivers that means autonomous cars are unfeasible.
Or it is an opportunity to add some additional sensors underneath that will make it miles better than human drivers.
Really the main problem with autonomous cars at this point in time is a combination of the companies hiding issues and the public expecting perfection. More transparency and a third-party comparison to human drivers would be the best way to both improve automation and gain public trust, once people actually see how bad human drivers can be.
Also, charge corporations for beta-testing on the fucking public… they’re using taxpayer-funded roads and putting our lives at risk for their profits. They should share those profits far, far more than they do.
You would think a self driving car could have 360 degrees of vision and not run into things, whether it’s a firetruck or a cardboard box or a person. That should be job 1 for self driving.
It’s not true, this is not the first time Cruise has been caught lying, and at some point an adult needs to step up and tell them to stop putting people in danger.
Even Waymo has commented in the past about Cruise playing fast and loose with the definitions of things that needed to be reported.
Waymo cars seem to operate much more sensibly than Cruise ones from what I’ve watched and read… although IMO that is mainly down to the car calling it quits much sooner and asking for an operator to take control, and driving in a different environment in general.
Cruise on the other hand seems to just carry on anyway, unless its lidar is blocked 😳
Yeah, I mean, some food for thought here is that Waymo started out as a research project, has been doing this since 2009, and is ultra conservative with its behaviors. When it started in 2009, the core of the team was recruited from DARPA Grand Challenge participants. And even they have major mishaps.
Cruise, on the other hand, started out trying to sell retrofit hardware right away. Then it tried convincing people it could do city driving right away. Now GM has revenue targets for them, like any adult business would, and they have no hope of ever accomplishing them. So they’re back to their old tricks: cutting down the number of miles driven for training models, rushing vehicles into service with no monitoring operators in them, and deceiving investors and regulators about remote operations.
One is a slow, methodical money furnace that attempts to solve the larger problem set. The other is a fast moving money furnace that tries to get people to pay them for half measures.
Damn, Waymo has been around for that long? TIL
Waymo’s progress is probably a good indicator as to how far along we are with self driving cars IMO. Given that Waymo has their cars pretty thoroughly trained on set routes (well, even us humans need to learn or try various routes before we’re fully confident on them sometimes), Cruise cheaping out on the whole training process is only going to accelerate their demise… especially when it’s at the expense of pedestrians’ safety
If you really want your mind blown, the first autonomous vehicle to drive coast to coast in the US did it in 1995: a vehicle from Carnegie Mellon University called NavLab. It used lidar, cameras, radar, and ultrasonics. Literally the same stuff we’re using today.
Driverless cars are certainly less error-prone overall than human operated ones. Distraction, sleepiness, intoxication, hubris, and other common "human error" causes of accidents are eliminated. Now we're seeing, though, that human beings - even pretty average ones - are still able to make better judgments in unique situations.
Because the recent incidents have been so laughably stupid from a human perspective, the instinct is to doubt the accuracy of driverless cars in all situations. The robots are able to do the comparatively simple things extremely well. It's just the more complex things they still have trouble with - so far. They're still safer than human operators, and will only continue to get better.
Humans make the same mistakes though. Backup cameras were added to cars because humans kept running over people, especially kids. People block emergency vehicles all the time.
Yes, the automation will always have room for improvement, but the current 'newsworthy' incidents are rarely in the news when humans do the exact same thing.
I would be pretty confused as well if someone ran up to my car and stuck a traffic cone on it.
Would you sit stopped in traffic for 20 minutes looking dumbfounded?
I would if I couldn’t get out of the car and remove it.
It would be nice if we had a Ralph Nader for AI driving.
He did a lot for safety decades ago. Feels like we need similar now.
I knew this was coming, with my soft, human brain.
Ironically, the accident that led them to decide Cruise is unsafe started with a hit-and-run by a human driver. So the humans are still less safe, but the robots are the ones being punished for it.