Tesla Robotaxi Freaks Out and Drives into Oncoming Traffic on First Day (fuelarc.com)
from KayLeadfoot@fedia.io to technology@lemmy.world on 23 Jun 20:28
https://fedia.io/m/technology@lemmy.world/t/2339583

I saw the Tesla Robotaxi:

And that was in a single 22-minute ride. Not great performance at all.

#autonomy #fsd #technology #tesla #waymo

wizzor@sopuli.xyz on 23 Jun 20:49 next collapse

Yea, I am not surprised given that the regular lane keep is still ghost braking when going under bridges.

Still, I am surprised how well they are doing, using only cameras.

spankmonkey@lemmy.world on 23 Jun 21:18 collapse

Imagine if they used other sensors than just cameras like the competent companies!

Asetru@feddit.org on 23 Jun 21:34 collapse

No no no, you don’t get it! Humans only have eyes, so cars that only have eyes should perform just as good as humans! Disregard that humans don’t perform well in fog or rain or generally anything that isn’t good weather and also disregard that to match our eyes’ resolution you’d need extremely high resolution cameras that produce way too much data for current computers and also disregard that most of the stuff isn’t happening in our eyes but in our brains and also disregard that the point that is usually being made to advocate for self driving cars is that they should be better than humans!

Zwuzelmaus@feddit.org on 24 Jun 02:16 collapse

Humans only have eyes, so cars that only have eyes should perform just as good as humans!

Everybody knows that a good driver uses his ass.

WatDabney@sopuli.xyz on 23 Jun 20:59 next collapse

And this is why DOGE gutted the Office for Vehicle Automation Safety at the NHTSA.

KayLeadfoot@fedia.io on 23 Jun 21:17 collapse

I thought that was to economize on expenses?!

So naturally they started with 5 employees in the smallest office of one of the smallest divisions of the NHTSA. Nooooo ulterior motive, nosiree

Semi_Hemi_Demigod@lemmy.world on 23 Jun 21:18 next collapse

One of those American robot cars

WhatAmLemmy@lemmy.world on 23 Jun 21:27 collapse

I understood that reference

Lost_My_Mind@lemmy.world on 23 Jun 21:19 next collapse

Student Robo drivers, amirite?

ThePantser@sh.itjust.works on 23 Jun 21:20 next collapse

Sounds like a normal cab driver where I’m from. Need to turn off cabbie mode and turn on Sunday grandma mode.

KayLeadfoot@fedia.io on 23 Jun 21:26 next collapse

The only difference being you can ask a human cabbie to slow down :,)

Bebopalouie@lemmy.ca on 23 Jun 21:33 collapse

Depends. I had this perpetually angry cabby a few months back. When I asked him to slow down (my son has autism and is partially verbal), not only did he not slow down, he sped up, and this was in a snowstorm on the highway. There was nothing I could do; if I said anything he went faster. We had a doctor’s appointment, so I could do nothing once we were out of that cab. I complained later, and the best the cab company would do is “not send that cab” again.

simplejack@lemmy.world on 23 Jun 23:55 collapse

I’d rather have a car that drives better than a typical cabby or uber driver.

Waymo has arguably been there for a while now. I’ll Uber outside of their coverage area, and take the autonomous car within it. Every other Uber driver in my area is making late lane choices, tailgating, cutting people off, talking to me about how the world works, etc. The Waymos don’t do any of that shit.

Having experienced FSD, I can honestly say, Waymo’s LiDAR system is way better. It doesn’t do this terrifying shit.

graycube@lemmy.world on 23 Jun 21:33 next collapse

It is probably being remotely driven from India and they just lost wifi for a minute.

BagOfHeavyStones@piefed.social on 23 Jun 23:53 next collapse

To quote AVCH, "His controller disconnected."

brbposting@sh.itjust.works on 24 Jun 04:03 collapse

[Image: https://sh.itjust.works/pictrs/image/7e541360-d5ef-405a-8ed8-db882c2506b6.jpeg]

AVCH

BagOfHeavyStones@piefed.social on 24 Jun 04:17 collapse

Hehe got it in one.

Some people will find him unbearable or a bit repetitive, but he really enjoys himself.

Favorite phrases of his seem to be
Apocalyptic Dingleberry
His name is John Sena
Woa Woa Woa.
Play stupid games win stupid prizes.
NPC move.
Need to know when to pull out.
You're not in the UK now.

Ledericas@lemm.ee on 24 Jun 08:09 collapse

AI = Always Indian.

Meron35@lemmy.world on 24 Jun 14:59 collapse

AI = Actually Indians

independantiste@sh.itjust.works on 23 Jun 21:36 next collapse

This would get a normal person’s car impounded and their driver’s license revoked. Why can a company get away with it?

sundray@lemmus.org on 23 Jun 22:04 next collapse

Systemic corruption.

independantiste@sh.itjust.works on 23 Jun 22:13 next collapse

I wouldn’t say corruption. I think it’s more that the rules of the road were designed with a driver in mind, not a company or even a robot. The consequences were designed to punish the person at fault, because at the time only a person could drive.

LadyAutumn@lemmy.blahaj.zone on 24 Jun 02:27 collapse

It’s very convenient that corporations can both be people and not be people, depending on whichever outcome is best for them.

credo@lemmy.world on 23 Jun 23:02 collapse

Regulatory capture decapitation

SalamenceFury@lemmy.world on 24 Jun 02:14 next collapse

Elon has enough fuck-you money to pay off anyone who would’ve complained.

Landless2029@lemmy.world on 24 Jun 06:23 collapse

He also paid his way into a government position to shut down the government offices that opposed him.

DancingBear@midwest.social on 24 Jun 02:53 collapse

They had so many cameras on this car. How many laws do you think the average driver breaks every 22 minutes?

It would be interesting if they could figure out why the car chose to do these specific things.

cronenthal@discuss.tchncs.de on 23 Jun 21:42 next collapse

Oof, these highlighted parts from only one video are already enough for me. This looks very stressful, I don’t think I could finish a whole ride with one of these.

otacon239@lemmy.world on 23 Jun 21:45 collapse

Don’t worry. It’ll get into a collision before you finish a whole ride.

Hayduke@lemmy.world on 23 Jun 23:04 next collapse

You can tell it’s a Tesla because of the way it is.

SalamenceFury@lemmy.world on 24 Jun 00:30 next collapse

lmfao

astronaut_sloth@mander.xyz on 24 Jun 00:43 next collapse

Not great performance at all.

That’s better than I was expecting to be perfectly honest.

I’m pretty impressed with the technology, but clearly it’s not ready for field use.

SoleInvictus@lemmy.blahaj.zone on 24 Jun 03:39 collapse

Yeah, it’s a few years away from being ready. Plus the dumb shits need to backpedal on this “cameras for everything!” idiocy.

I’m surprised the taxis aren’t being driven remotely while Musk lies about their amazing AI or whatever.

astronaut_sloth@mander.xyz on 24 Jun 22:38 collapse

this “cameras for everything!” idiocy.

That’s why I’m so impressed with how well it’s actually working. When they get off that really weird self-imposed restriction, it could be an interesting technology.

njordomir@lemmy.world on 24 Jun 01:20 next collapse

If we’re gonna let them on the road, I say that software should get points just like a driver, but when it gets suspended all the cars running that software get shut down.

Showroom7561@lemmy.ca on 24 Jun 03:10 collapse

How about we leave the driving to people, and not pre-alpha software?

There’s no accountability for this horribly dangerous driving, so they shouldn’t be on the road. Period.

Cocodapuf@lemmy.world on 24 Jun 10:17 collapse

There’s no accountability for this horribly dangerous driving, so they shouldn’t be on the road. Period.

Well that’s exactly what their post was about, adding accountability.

Showroom7561@lemmy.ca on 24 Jun 13:01 collapse

Was it? I didn’t read a single hint of adding accountability in the article.

But that begs the question: shouldn’t accountability be in place now, and not maybe at some point in the distant future? They are already on the road.

Cocodapuf@lemmy.world on 24 Jun 13:19 collapse

Not the article, the post from njordomir that you were directly replying to.

shouldn’t accountability be in place now,

Again, literally what that user was suggesting.

Showroom7561@lemmy.ca on 24 Jun 13:57 collapse

Ah, Ok.

I agree with accountability, but not with the point system. That’s almost like a “three strikes” rule for drunk drivers.

That’s not really accountability, that’s handing out free passes.

Cocodapuf@lemmy.world on 24 Jun 17:37 collapse

That’s almost like a “three strikes” rule for drunk drivers.

Oh man, that would be amazing. If after 3 strikes, all drunk driving could be eliminated… If only we could be so lucky.

He’s not talking about a per-vehicle points system, he’s talking about a global points system for Tesla, Inc. If, after a few incidents, Tesla FSD had its license revoked across the whole fleet, I mean, that’s pretty strict accountability I’d say. That’s definitely not handing out free passes; it’s more like you get a few warnings and a chance to fix issues before the entire program is ended nationwide.

Showroom7561@lemmy.ca on 24 Jun 20:35 collapse

I mean, if they weren’t as buggy as they clearly already are, then sure… do a point system.

But as they stand, they shouldn’t be on the road.

Cocodapuf@lemmy.world on 24 Jun 21:34 collapse

I don’t understand the complaint. I mean given their track record, with a system like this, they wouldn’t be on the road.

You know, unless it all worked.

Showroom7561@lemmy.ca on 24 Jun 21:49 collapse

I mean given their track record

That’s my point. Tesla (the company) has been notorious for pushing forward their deadly “self-driving” technology. It’s one of the worst automated systems on the planet, with plenty of tests, reports, and real-world incidents raising red flags all over the place.

They SHOULD NOT be on the road, so are they only on the road because Musk was able to influence someone?

Cocodapuf@lemmy.world on 24 Jun 23:14 collapse

You seem really invested in making sure Teslas are off the road, but not at all interested in regulation that would keep all dangerous autonomous vehicles off the road. So… do you work for BMW, or Waymo?

myrrh@ttrpg.network on 24 Jun 01:38 next collapse

…oh, that’s just the vietnam regional setting…

6nk06@sh.itjust.works on 24 Jun 05:10 collapse

It could be the south or west of France too. Driving as if you were drunk is a universal skill.

myrrh@ttrpg.network on 24 Jun 12:11 collapse

…oh, i think you misunderstand me: that’s not impaired driving, that’s skillful navigation through the normal flow of traffic in sàigòn…

last_philosopher@lemmy.world on 24 Jun 02:31 next collapse

At least it’s not driving straight into a tree, I call that an improvement.

Cocodapuf@lemmy.world on 24 Jun 10:35 collapse

Man, I cannot figure out why that vehicle was turning. What is it trying to avoid? Why does it think there could be road there? Why doesn’t it try to correct its action midway?

I’m really concerned about that last question. I have to assume that at some point prior to impact, the system realized it made a mistake. Surely. So why didn’t it try to recover from the situation? Does it have a system for recovering from errors, or does it just continue and say “well, I’ll get it next time, now on with the fatal crash”?

Rexios@lemmy.zip on 24 Jun 14:23 collapse

That wasn’t FSD. In the crash report it shows FSD wasn’t enabled. The driver applied a torque to the steering wheel and disengaged it. They were probably reaching into the back seat while “supervising”.

KayLeadfoot@fedia.io on 24 Jun 17:08 next collapse

I covered that crash.

FSD is never enabled at the moment of impact, because FSD shuts off less than a second before impact, so that Tesla's lawyers and most loyal fans can make exactly the arguments you are making. Torque would be applied to the steering wheel when any vehicle departs the roadway, as the driver is thrown around like a ragdoll as they clutch the wheel. Depending on the car, torque can also be applied externally to the tires by rough terrain to shift the steering wheel.

No evidence to suggest the driver was distracted. Prove me wrong if you have that evidence.

Also welcome to the platform, new user!

Rexios@lemmy.zip on 24 Jun 17:28 collapse

Tesla counts any crash within 5 seconds of FSD disengagement as an FSD crash. Where is the cabin camera footage of the driver not being distracted?

Here is a video that goes over it: youtube.com/watch?v=JoXAUfF029I

Thanks for the welcome, but I’m not new, just a lemm.ee user.

Cocodapuf@lemmy.world on 24 Jun 17:28 collapse

You weren’t the user who posted that video, but you seem to be quite knowledgeable in this specific case…

Can you link that crash report? Or can you cite some confirmed details about the incident?

Rexios@lemmy.zip on 24 Jun 17:30 collapse

LadyAutumn@lemmy.blahaj.zone on 24 Jun 02:38 next collapse

I am entirely opposed to driving algorithms. Autopilot on planes works very well because it is used in open sky and does not have to make major decisions about moving in close proximity to other planes and obstacles. It’s almost entirely mathematical, and even then, in specific circumstances it is designed to disengage and put control back in the hands of a human.

Cars do not have this luxury; they operate entirely in close proximity to other vehicles and obstacles. Very little of the act of driving a car is math. It’s almost entirely decision making. It requires fast and instinctive responses to subtle changes in the environment, and pattern recognition that human brains are better at than algorithms.

To me this technology perfectly encapsulates the difficulty of making algorithms that mimic human behavior. The last 10% of optimization needed to reach par with humans requires exponentially more energy and research than the first 90% does. 90% of the performance of a human is entirely insufficient where life and death are concerned.

Investment should be going to public transport systems. They are more cost efficient, more accessible, more fuel- and resource-efficient, and far, far safer than cars could ever be, even with all human drivers. This is a colossal waste of energy, time, and money for a product that will not be on par with human performance for a long time. Those resources could be making our world more accessible for everyone; instead they’re making it more accessible for no one and making the roads significantly more dangerous. Capitalism will be the end of us all if we let it. Sorry that train and bus infrastructure isn’t “flashy enough” for you. You clearly haven’t seen the public transport systems in Beijing. The technology we have here is decades behind and so underfunded it’s infuriating.

ComfortablyDumb@lemmy.ca on 24 Jun 03:02 next collapse

This technology exists purely to make human drivers redundant and put the money in the hands of big tech and, eventually, the ruling class composed of politicians, risk-averse capitalists, and bureaucrats. There is no other explanation for robotaxis to exist. There are better solutions, like trains and metros, which can easily solve moving people from point A to point B. But those don’t come with the 3x-10x capital growth that making human drivers redundant will bring the big tech companies.

Red_October@lemmy.world on 24 Jun 08:32 collapse

This technology exists purely to make human drivers redundant and put the money in the hands of big tech and, eventually, the ruling class composed of politicians, risk-averse capitalists, and bureaucrats. There is no other explanation for robotaxis to exist.

There is another reason, though, and it’s much simpler. Basic greed.

There are people who see the opportunity to make more money for themselves, so they’ll do it. When it comes to robotaxis, they’re not interested in class struggles, it’s not about politics; their interest in making human drivers redundant extends only so far as increasing their customer base. These aren’t Machiavellian schemers rubbing their hands together and cackling at their dark designs coming to fruition, just assholes in suits whose one and only concern is “number go up.”

Even when it comes to their politics and to the class dynamics, their end goal is always the same. Number go up. They don’t care about what harm it could do. They’re not intent on deliberately doing more harm, they give no thought to doing less harm, they do not care. All that drives them, ever, is Number Go Up.

baggachipz@sh.itjust.works on 24 Jun 10:16 collapse

You got downvoted but you’re right. The only cabal at work here is basic human greed. Anytime you want to know why people do something, consider the motivation of the person and the incentives. Musk constantly talks about how autonomy will make his company worth “trillions”, and he wants that because he’ll keep maxing the high score in Billionaire Bastard Bacchanalia.

He can claim noble intentions, but as you said, the game is simply to make Number Go Up. That it causes untold harm to others isn’t even an afterthought.

Ronno@feddit.nl on 24 Jun 05:32 next collapse

Public transport systems are just part of a mobility solution, but it isn’t viable to have them everywhere. Heck, even here in The Netherlands, a country the size of a postage stamp, public transport doesn’t work outside of the major cities. So basically, outside of the cities, we are also relying on cars.

Therefore, I do believe there will be a place for autonomous driving in the future of mobility, and that it has the potential to reduce the number of accidents, traffic jams, and parking problems while increasing the average speed we drive at.

The only thing that has me a bit worried is Tesla’s approach to autonomous driving, fully relying on the camera system. Somehow, Musk believes a camera system is superior to human vision, while it’s not. I drive a Tesla (yeah, I know) and if the conditions aren’t perfect, the car disables “safety” features, like lane assist. For instance when it’s raining heavily or when the sun is shining directly into the camera lenses. This must be a key reason in choosing Austin for the demo/rollout.

Meanwhile, we see what other manufacturers use and how they are progressing. For instance, BMW and Mercedes are doing well with their systems, which are a blend of cameras and sensors. To me, that does seem like the way to go to introduce autonomous driving safely.

sykaster@feddit.nl on 24 Jun 06:07 next collapse

There are usually buses from villages into the major cities though. I live in one and there’s a bus every hour to a nearby city, from where I can then take a train. I wouldn’t say it’s that bad.

Ronno@feddit.nl on 24 Jun 12:06 collapse

Depends on how far you live from the city, I guess; where I live it’s 2 hours to major cities. But anyway, an hour’s wait to get somewhere doesn’t feel desirable to me. It just doesn’t provide enough coverage to fully replace a car.

bitwolf@sh.itjust.works on 24 Jun 15:38 collapse

I believe Austin was chosen because they’re fairly lax about the regulations and safety requirements.

Waymo already got the deal in Cali. And Cali seems much more strict. Austin is offering them faster time to market at the cost of civilian safety.

markovs_gun@lemmy.world on 24 Jun 12:35 next collapse

While I agree focusing on public transport is a better idea, it’s completely absurd to say machines can never possibly drive as well as humans. It’s like saying a soul is required or other superstitious nonsense like that. Imagine the hypothetical case in which a supercomputer that perfectly emulates a human brain is what we are trying to teach to drive. Do you think that couldn’t drive? If so, you’re saying a soul is what allows a human to drive, and may as well be saying that God hath uniquely imbued us with the ability to drive. If you do think that could drive, then surely a slightly less powerful computer could. And maybe one less powerful than that. So somewhere between a Casio solar calculator and an emulated human brain must be able to learn to drive. Maybe that’s beyond where we’re at now (I don’t necessarily think it is), but it’s certainly not impossible just out of principle. Ultimately, you are a computer at the end of the day.

LadyAutumn@lemmy.blahaj.zone on 24 Jun 13:51 collapse

I never said it wouldn’t ever be possible, just that it will take a long time to reach par with humans. Driving is culturally specific, even. The way rules are followed and practiced is often regionally different. There’s more than just the mechanical act itself.

The ethics of putting automation in control of potentially life-threatening machines is also relevant. With humans we can attribute cause and attempt improvement; with automation it’s different.

I just don’t see a need for this at all. I think investing in public transportation more than reproduces all the benefits of automated cars without nearly as many of the dangers and risks.

markovs_gun@lemmy.world on 24 Jun 14:29 collapse

Driving is culturally specific, even. The way rules are followed and practiced is often regionally different

This is one of the problems driving automation solves trivially when applied at scale. Machines will follow the same rules regardless of where they are, which is better for everyone.

The ethics of putting automation in control of potentially life threatening machines is also relevant

You’d shit yourself if you knew how many life-threatening machines are already controlled by computers far simpler than anything in a self-driving car. Industrially, we have learned the lesson that computers, even ones running on extremely simple logic, just completely outclass humans on safety because they do the same thing every time. There are giant chemical manufacturing facilities run by a couple guys in a control room watching a screen, because 99% of it is already automated. I’m talking thousands of gallons an hour of hazardous, poisonous, flammable materials running through a system run on 20-year-old computers. Chemical additions at your local water treatment plant that could kill thousands of people if done wrong, all controlled by machines because we know they’re more reliable than humans.

With humans we can attribute cause and attempted improvement, with automation its different.

A machine can’t drink a handle of vodka and get behind the wheel, nor can it drive home sobbing after a rough breakup and be unable to process information properly. You can also update all of them at once instead of running PSA campaigns telling people not to do something that got someone killed. Self-driving car makes a mistake? You don’t have to guess what was going through its head; it has a log. Figure out how to fix it? Guess what, they’re all fixed with the same software update. If a human makes that mistake, thousands of people will keep making that same mistake until cars or roads are redesigned, and those changes take time to filter through all of society.

I just don’t see a need for this at all. I think investing in public transportation more than reproduces all the benefits of automated cars without nearly as many of the dangers and risks.

This is a valid point, but this doesn’t have to be either/or. Cars have a great utility even in a system with public transit. People and freight have to get from the rail station or port to wherever they need to go somehow, even in a utopia with a perfect public transit system. We can do both, we’re just choosing not to in America, and it’s not like self driving cars are intrinsically opposed to public transit just by existing.

LadyAutumn@lemmy.blahaj.zone on 24 Jun 15:24 collapse

What are you anticipating for the automated-driving adoption rate? I’m expecting it to be extremely low, as most people cannot afford new cars. We are probably talking decades before there are enough automated cars on the road to fundamentally alter traffic in such a way as to entirely eliminate human driving culture.

In response to the “humans are fallible” bit, I’ll remark again that algorithms are very fallible. Statistically, even. And while lots of automated algorithms already control life-and-death machines, try justifying that to someone whose entire family is killed by an AI. How do they even receive compensation for that? Who is at fault? A family died. With human drivers we can ascribe fault very easily. With automated algorithms fault is less easily ascribed, and the public at large is going to have a much harder time accepting that.

Also, with natural gas and other systems there are far fewer variables than on a busy freeway. There’s a reason it hasn’t happened until recently. Hundreds of humans all in control of large vehicles moving in a long line at speed is a very complicated environment with many factors to consider. How accurately will algorithms be able to infer driving intent based on subtle movements of the vehicles in front of and behind them? How accurate is the situational awareness of an algorithm, especially when combined road factors are involved?

It’s just not as simple as it’s being made out to be. This isn’t a chess problem; it’s not a question of controlling train cars on set tracks with fixed timetables and universal controllers. The way cars exist presently is very, very open ended. I agree that if 80+% of road vehicles were automated it would have enough of an impact on road culture to standardize certain behaviors. But we are very, very far away from that in North America. Most of the people in my area are driving cars from the early 2010s. It’s going to be at least a decade before any sizable share of vehicles are current-year models. And until then, algorithms face these obstacles that cannot easily be overcome.

It’s like I said earlier: the last 10% of optimization requires an exponentially larger amount of energy and development than the first 90% does. It’s the same problem faced by other forms of automation. And a difference of 10% in performance is… huge when it comes to road vehicles.

jjjalljs@ttrpg.network on 24 Jun 12:51 next collapse

I’ve been saying for years that focusing on self driving cars is solving the wrong problem. The problem is so many people need their own personal car at all.

LadyAutumn@lemmy.blahaj.zone on 24 Jun 13:53 collapse

Exactly. Bring back trams, build fewer suburbs, build better apartment housing. If we want a society reorganized around accessibility, then let’s actually build that.

bitwolf@sh.itjust.works on 24 Jun 15:36 collapse

I always have the same thought when I see self driving taxi news.

“Americans will go bankrupt trying to prop up the auto/gas industries rather than simply building a train”.

And it’s true. So much money is being burned on a subpar and dangerous product. Yet we’ve just cut and cancelled extremely beneficial high speed rail projects that were overwhelmingly voted for by the people.

Showroom7561@lemmy.ca on 24 Jun 03:07 next collapse

Fucking hell. We don’t let drunks drive taxis, and that goddamn thing drove like it was under the influence.

Does Tesla get sent tickets for traffic violations, or are we OK with this?

the_trash_man@lemmy.world on 24 Jun 03:45 collapse

I’m sure their legal team is hard at work trying to find loopholes to circumvent any traffic infringements.

squaresinger@lemmy.world on 24 Jun 04:40 next collapse

Depending on how exactly the laws are worded, they might even get away without paying fines. Many traffic codes define that only the driver (not the owner of the car) can be fined, and these robo taxis don’t have drivers.

Soggy@lemmy.world on 24 Jun 08:21 next collapse

The system of corporate veiling of responsibility is going to kill us all. What should happen is every single person who signed off on, voted for, or materially contributed to the implementation of this dangerous hardware should be prosecuted for criminal negligence. Gut the C-suite and the board.

squaresinger@lemmy.world on 24 Jun 08:43 collapse

You aren’t wrong.

Cocodapuf@lemmy.world on 24 Jun 10:26 collapse

and these robo taxis don’t have drivers.

Oh yes they do… The driver is Tesla, Inc. There’s no problem with fining a company; that’s easy. It is difficult to issue higher penalties, though: jail time or license revocation. We’ll need to work out solutions for that; they should not get off free.

But we can certainly fine the driver…

squaresinger@lemmy.world on 24 Jun 10:44 collapse

That’s where law is not justice.

I do agree with your sentiment, but if the law defines a driver as a human, which is usually the case, then by definition Tesla cannot be the driver.

It could even be that the passenger sitting in the driver’s seat of a robotaxi would be defined as the driver.

And sure, these laws need to be adapted before robotaxis should be allowed to hit the streets.

deafboy@lemmy.world on 24 Jun 09:36 collapse

I can see the headlines… “Tesla. De-funding the police!”

Critical_Thinker@lemm.ee on 24 Jun 03:38 next collapse

The rent seeking is so hard with this automate-the-profits bullshit.

The moment we perfect auto-taxis the service should be a public benefit and run by a nonprofit.

KayLeadfoot@fedia.io on 24 Jun 03:50 collapse

NYC mayoral candidate Mamdani is talking about making buses free, and that makes a radical shitload of sense.

Free autotaxis would be a boon for productivity and personal freedom, like AI promises to be but democratized for everybody rather than just the richest fraction of a percent.

Evotech@lemmy.world on 24 Jun 05:02 collapse

People are going to take a shit in them. And ride them around for fun

KayLeadfoot@fedia.io on 24 Jun 05:04 next collapse

Guess what? People already do that.

freddydunningkruger@lemmy.world on 24 Jun 09:15 next collapse

Thanks for pointing out how insane and disconnected the elon glazers are in believing their Teslas will drive off while they sleep to earn any kind of positive cash flow, then show up back home just in time to recharge for the commute to work, smelling fresh as a daisy.

Tja@programming.dev on 24 Jun 09:16 next collapse

I don’t see a problem with the second one. The bus is already doing the route, it costs basically nothing to have a few joy riders.

deafboy@lemmy.world on 24 Jun 09:31 next collapse

ride them around for fun

Imagine the horror!

zarkanian@sh.itjust.works on 24 Jun 20:52 collapse

People are going to take a shit in them.

Sure, somebody will. But the system will take note of that person, and then they don’t get to ride again. Or they have to pay a fine. Or whatever.

spamspeicher@feddit.org on 24 Jun 04:38 next collapse

The Tesla is just following the regional driving style. Humans make the same mistakes at 15:06.

/s

Rexios@lemmy.zip on 24 Jun 14:31 collapse

This but unironically. If this is the worst thing that happened on launch day then that seems pretty successful to me. This is the worst version of the robo taxi we will ever see.

NigelFrobisher@aussie.zone on 24 Jun 05:55 next collapse

What real world problem does this solve?

Knock_Knock_Lemmy_In@lemmy.world on 24 Jun 06:06 next collapse

Task automation

echodot@feddit.uk on 24 Jun 08:22 collapse

Is it really task automation if it does it worse than a drunk human could have done it?

Knock_Knock_Lemmy_In@lemmy.world on 24 Jun 09:00 collapse

I didn’t claim Tesla has solved this automation problem.

Waymo is closer to human levels, but not yet considerably better.

allan@lemmy.blahaj.zone on 24 Jun 08:04 next collapse

Stonks?

Cocodapuf@lemmy.world on 24 Jun 10:36 next collapse

Actually, lots. The issue is that if it doesn’t work it’s dangerous.

bitwolf@sh.itjust.works on 24 Jun 15:33 collapse

Nothing that a train + scooter / bicycle cannot solve imo

Red_October@lemmy.world on 24 Jun 08:19 next collapse

Remember guys, Tesla wants to have a living person sitting behind the wheel for “safety.” Don’t YOU want to get paid minimum wage to sit in a car all day, paying attention but doing nothing unless it’s about to crash, at which point you’ll be made the scapegoat for not preventing the crash?

Welcome to the future, you’re gonna hate it here.

Tja@programming.dev on 24 Jun 09:15 collapse

I mean, compared to getting minimum wage flipping burgers in a hot kitchen, or picking vegetables in the sun, or working the register in a store in a bad neighborhood, or even restocking stuff at Walmart… yes, I would sit all day in an air conditioned car doing nothing but “paying attention”.

rowinxavier@lemmy.world on 24 Jun 09:32 next collapse

The unfortunate thing about people is we acclimatise quickly to the demands of our situation. If everything seems OK, the car seems to be driving itself, we start to pay less attention. Fighting that impulse is extremely hard.

A good example is ADHD. I have severe ADHD so I take meds to manage it. If I am driving an automatic car on cruise control I find it very difficult to maintain long term high intensity concentration. The solution for me is to drive a manual. The constant involvement of maintaining speed, revs, gear ratio, and so on mean I can pay attention much easier. Add to that thinking about hypermiling and defensive driving and I have become a very safe driver, putting about 25-30 thousand kms on my car each year for over a decade without so much as a fender bender. In an automatic I was always tense, forcing focus on the road, and honestly it hurt my neck and shoulders because of the tension. In my zippy little manual I have no trouble driving at all.

So imagine that but up to an even higher level. Someone is supervising a car which handles most situations well enough to make you feel like a passenger. They will switch off and stop paying attention eventually. At that point it is on them, not the car itself being unfit. I want self driving to be a reality but right now it is not. We can do all sorts of driver assist stuff but not full self driving.

supersquirrel@sopuli.xyz on 24 Jun 12:35 collapse

A good example is ADHD. I have severe ADHD so I take meds to manage it. If I am driving an automatic car on cruise control I find it very difficult to maintain long term high intensity concentration. The solution for me is to drive a manual. The constant involvement of maintaining speed, revs, gear ratio, and so on mean I can pay attention much easier. Add to that thinking about hypermiling and defensive driving and I have become a very safe driver, putting about 25-30 thousand kms on my car each year for over a decade without so much as a fender bender. In an automatic I was always tense, forcing focus on the road, and honestly it hurt my neck and shoulders because of the tension. In my zippy little manual I have no trouble driving at all.

Are you me? I love weaving through traffic as fast as I can… in a video game (like Motor Town behind the wheel). In real life I drive very safe and it is boring af for my ADHD so I do things like try to hit the apex of turns just perfect as if I was driving at the limit but I am in reality driving at a normal speed.

Part of living with severe ADHD is that you don’t get breaks from having to play these games to survive everyday life; as you say, it’s a stressful reality in part because of this. You brought up a great point that both of us know: when our focus is on something and activated, we can perform at a high level, but accidents don’t wait for our focus, they just happen, and this is why we are always beating ourselves up.

We can look at self driving car tech and intuit a lot about the current follies of it because we know what focus is better than anyone else, especially successful tech company execs.

DanWolfstone@leminal.space on 24 Jun 18:08 collapse

I’m glad other people understand the struggles required for daily life in this respect

Red_October@lemmy.world on 24 Jun 13:32 collapse

You seem to have missed the point. Whether or not you think that would be an easy job, the whole reason you’d be there is to be the one that takes all the blame when the autopilot kills someone. It will be your name, your face, every single record of your past mistakes getting blasted on the news and in court because Elon’s shitty vanity project finally killed a real child instead of a test dummy. You’ll be the one having to explain to a grieving family just how hard it is to actually pay complete attention every moment of every day, when all you’ve had to do before is just sit there.

Tja@programming.dev on 24 Jun 16:27 collapse

How about you pay attention and PREVENT the autopilot from killing someone? Like it’s your job to do?

homicidalrobot@lemm.ee on 24 Jun 17:51 next collapse

This is sarcasm, right?

turmacar@lemmy.world on 24 Jun 22:54 collapse

Expecting people to be able to behave like machines is generally the attitude that leads to crash investigations.

echodot@feddit.uk on 24 Jun 08:20 next collapse

Maybe they’re just getting the wrong people to provide training data. The kind of people who drive Teslas do tend to drive like morons, so it would make sense.

buzz86us@lemmy.world on 24 Jun 09:31 next collapse

Wow it’s almost like having an AI with a 2D view to go off of is a bad idea? Hmmm who’d have thunk it?

DeathsEmbrace@lemmy.world on 24 Jun 10:45 collapse

You’re telling me we’re not at the point where self driving cars are a thing? But a Tech CEO said so? Who am I supposed to believe if not a Tech CEO?

JohnEdwa@sopuli.xyz on 24 Jun 10:51 collapse

Self-driving cars are a thing; Waymo is doing pretty fine.

But you might be able to spot a few (dozen) teeny-tiny (huge, bulky and extremely obvious) differences between a Waymo and a Tesla cybercab.

supersquirrel@sopuli.xyz on 24 Jun 12:25 collapse

Lie dare you claim Waymo is better than Tesla

(it is a lidar joke, Waymo has lidar sensors which makes it way safer)

bomberesque1@lemm.ee on 24 Jun 10:48 next collapse

Well obviously it’s been trained on human taxi driver behaviour

communist@lemmy.frozeninferno.xyz on 24 Jun 12:20 next collapse

It found out who made it so it knew what to do

gamer@lemm.ee on 24 Jun 12:33 next collapse

Tbh it’s not as bad as I was expecting. Those clips could definitely have resulted in an accident, but the system seems to actually work most of the time. I wonder if it couldn’t be augmented with lidar at this point to make it more reliable? A live stress test is ridiculously irresponsible and will definitely kill people, but at least it’s only Texans at risk (for now).

I was skeptical of the idea of robotaxis, but this kind of sold me on it. If they’re cheaper than human drivers, I might even be able to get rid of my car some day. It doesn’t change the fact that I’ll never get into one because the CEO is a nazi though.

jj4211@lemmy.world on 24 Jun 14:47 collapse

Keep in mind this is a system with millions of miles under its belt, and it still doesn’t understand what to do with a forced left-turn lane on a very short trip in a fairly controlled environment with supremely good visual, road, and traffic conditions. LIDAR wouldn’t have helped the car here; there was no “whoops, confusing visibility”, it just completely screwed up and ignored the road markings.

It’s been in this state for years now: surprisingly capable, yet with horrible screw-ups noted frequently. They seem to be about 95% of the way there and stuck, with no real progress, just some willful denial convincing them to move forward anyway.

melsaskca@lemmy.ca on 24 Jun 12:41 next collapse

Parking in a fire lane to drop off a passenger just makes it seem more human.

I_Has_A_Hat@lemmy.world on 24 Jun 12:44 next collapse

Yea, this one isn’t an issue. If you are dropping off passengers, you are allowed to stop in a fire lane because that is not parking.

jj4211@lemmy.world on 24 Jun 14:42 collapse

Which brings up an interesting question, when is a driverless car ‘parked’ vs. ‘stopped’?

LordCrom@lemmy.world on 24 Jun 16:13 collapse

When the engine is off?

Of course, how do you tell that with an electric car?

SpaceNoodle@lemmy.world on 24 Jun 17:56 next collapse

When the motor drivers are energized?

unphazed@lemmy.world on 24 Jun 18:28 collapse

Yeah, tell that to police who bust people with DUIs when the engine is still off.

some_guy@lemmy.sdf.org on 24 Jun 13:34 collapse

They turned the empathy dial to 5%. Works great, right?

sturmblast@lemmy.world on 24 Jun 12:54 next collapse

Watch that stock price fall… wheeeee

hark@lemmy.world on 24 Jun 15:02 collapse

It already jumped up about 10% on Monday simply because the service launched. Even if the service crashes and burns, they’ll jump to the next hype topic like robots or AI or whatever, and the stock price will stay up.

sturmblast@lemmy.world on 24 Jun 16:16 collapse

And it’s already fallen back down about 30% of those gains. Hype causes spikes… that’s nothing new.

captainastronaut@seattlelunarsociety.org on 24 Jun 13:27 next collapse

Wow that turn signal sound is annoying. Why does it even need to make a sound in a car that’s supposed to be driving itself?

ThePantser@sh.itjust.works on 24 Jun 13:53 collapse

Important feedback for the passenger to ensure the car is actually following the rules. I would freak out at a corner if I couldn’t tell the car was signaling.

SpaceNoodle@lemmy.world on 24 Jun 17:58 collapse

The rider shouldn’t have to care.

Naturally, simply being in a “self-driving” Tesla is reason enough to worry.

some_guy@lemmy.sdf.org on 24 Jun 13:33 next collapse

Hooray! I feel so safe. I think I’ll move to Texas so I can get obliterated by this taxi from the future.

tym@lemmy.world on 24 Jun 13:48 next collapse

Sounds like the indian guy driving it with a joystick was a bit hungover. You’d think they’d screen that thing at the entrance of the cubicle farm where all these AI folk drive these from. AI is just “anonymous indians” for elmo’s grifting kind.

febrile@lemmy.world on 24 Jun 13:56 next collapse

What’s crazy is that the safety driver’s hair has gone completely grey in just two days.

jj4211@lemmy.world on 24 Jun 14:43 collapse

That safety driver did not give a single fuck about driving on the wrong side of the road…

PumaStoleMyBluff@lemmy.world on 24 Jun 15:10 collapse

He must have seen so much worse to not even be flinching at that.

NostraDavid@programming.dev on 24 Jun 14:03 next collapse

oof

jj4211@lemmy.world on 24 Jun 14:34 next collapse

Navigation issue / hesitation

The video really understates the level of fuck up that the car did there…

And the guy sitting there just casually being OK with the car ignoring the forced left, going straight into oncoming lanes, and flipping the steering wheel all over the place because it has no idea what the hell just happened… I would not be just chilling there…

Of course, I wouldn’t have gotten in this car in the first place, and I know they cherry-picked some hardcore Tesla fans to be allowed to ride at all…

GroundedGator@lemmy.world on 24 Jun 21:37 collapse

I’ve come to the realization, at least where I live, that a hell of a lot of accidents are prevented by drivers who are actually aware and safe. This goes a bit beyond defensive driving, IMO; I’m talking flat-out accident avoidance. There is an entire class of drivers who are not even aware of the accidents they have almost caused, because someone else managed to avoid their stupid driving.

The majority of accidents that are likely to happen with these robocoffins will be single-car or robocoffin-meets-robocoffin. The safety numbers after a year will look acceptable, because error-prone driving that doesn’t cause an accident is not reported in any official capacity.

I still maintain that the only safe way to have autonomous vehicles on the road is if they do not share the road with human drivers and have an open standard for communicating with other autonomous cars.

outhouseperilous@lemmy.dbzer0.com on 24 Jun 23:55 collapse

open standard

Sorry, no, that’s infrastructure.

Rentlar@lemmy.ca on 24 Jun 14:49 next collapse

So, Tesla Robotaxis drive like a slightly drunk and confused tourist with asshole driving etiquette.

Those right turns on red were like, “oh you get to go? That’s permission for me to go too!”

ChickenLadyLovesLife@lemmy.world on 24 Jun 16:28 collapse

I know many people who believe that “right on red” means they have the right of way to make the turn and don’t have to stop first or yield to traffic.

LePoisson@lemmy.world on 24 Jun 16:33 next collapse

I know many people fucking morons

unphazed@lemmy.world on 24 Jun 18:27 next collapse

I almost failed my first drivers test because I stopped at a stop sign instead of just yielding on a right turn. Still to this day it seems… wrong.

ChickenLadyLovesLife@lemmy.world on 24 Jun 19:04 collapse

Why would you have failed? You are supposed to come to a complete stop at a stop sign.

KayLeadfoot@fedia.io on 24 Jun 19:47 collapse

No kidding, they fail you if you DON'T come to a complete stop.

ChickenLadyLovesLife@lemmy.world on 24 Jun 20:52 collapse

Where I live, a few stop signs have a square white sign below them that says “EXCEPT FOR RIGHT TURN”, i.e. you don’t have to actually stop if you’re turning right. It’s incredibly fucked up - it works fine if you’re a local and you’re familiar with these signs, but people new to the area don’t know anything about it and if they’re on the crossroad they actually expect the other driver to stop since all they see is the backside of the octagon. It’s pointless to have these signs anyway since people usually roll through stop signs as it is.

outhouseperilous@lemmy.dbzer0.com on 24 Jun 23:56 collapse

We should arm pedestrians so we can shoot the subhuman filth who take rights on red.

Blackmist@feddit.uk on 24 Jun 14:56 next collapse

So it emulates a standard BMW driver. Well done.

Valmond@lemmy.world on 24 Jun 15:07 collapse

Still work to be done, it uses the blinkers.

odelik@lemmy.today on 24 Jun 16:58 collapse

At least they were used incorrectly, so it’s just as unpredictable.

Tattorack@lemmy.world on 24 Jun 15:20 next collapse

Woah! Damn! The robotaxis are a dangerous fuck-up!? That’s the most surprising thing that’s happened all year! There’s literally no way I could’ve seen that coming.

Smoogs@lemmy.world on 24 Jun 15:22 next collapse

A man who can’t launch a rocket to save his life is also incompetent at making self driving cars? His mediocrity knows no bounds.

Rbnsft@lemm.ee on 24 Jun 16:04 next collapse

To be fair, Musk only has money and doesn’t do shit at either company.

ChickenLadyLovesLife@lemmy.world on 24 Jun 16:30 next collapse

It’s hilarious to me that Musk claims to work 100 hours a week but he’s the CEO of five companies. Even if the claim were true (and of course it isn’t) it means being the CEO of one of his companies is a 20-hour-a-week job at best.

SkyezOpen@lemmy.world on 24 Jun 16:49 collapse

He meddles. That much is apparent. The Cybertruck is obviously a top-down design, as evidenced by the numerous atrocious design compromises the engineers had to make just to make it real: from the glued-on “exoskeleton” to the hollowed ALUMINUM frame to the complete lack of physical controls to the default failure state turning it into a coffin to the lack of waterproofing, etc.

outhouseperilous@lemmy.dbzer0.com on 24 Jun 23:57 collapse

Seriously. I was better at rocketry than him by age twelve.

Washedupcynic@lemmy.ca on 24 Jun 15:31 next collapse

Watching anything that fElon fail sparks joy.

merc@sh.itjust.works on 24 Jun 18:12 next collapse

Imagine you’re the guy who invented SawStop, the table saw that can detect fingers touching the saw blade and immediately bury the blade in an aluminum block to avoid cutting off someone’s finger. Your system took a lot of R&D, it’s expensive, and it requires a custom table saw with specialized internal parts, so it’s much more expensive than a normal table saw, but it works, and it works well. You’ve now got it down to where someone can go full-speed into the blade and most likely not even get the smallest cut. Every time the device activates, it’s a finger saved. Yeah, it’s a bit expensive to own. And, because of the safety mechanism, every time it activates you need to buy a few new parts which aren’t cheap. But an activation means you avoided having a finger cut off, so good deal! You start selling these devices, and while it’s not replacing every table saw sold, it’s slowly becoming something that people consider when buying.

Meanwhile, some dude out of Silicon Valley hears about this, and hacks up a system that just uses a $30 webcam, an AI model that detects fingers (trained exclusively on pudgy white fingers of Silicon Valley executives) and a pinball flipper attached to a rubber brake that slows the blade to a stop within a second when the AI model sees a finger in danger.

This new device, the, “Finger Saver” doesn’t work very well at all. In demos with a hotdog, sometimes the hotdog is sawed in half. Sometimes the saw blade goes flying out of the machine into the audience. After a while, the company has the demo down so that when they do it in extremely controlled conditions, it does stop the hotdog from being sawed in half, but it does take a good few chunks out of it before the blade fully stops. It doesn’t work at all with black fingers, but the Finger Saver company will sell you some cream-coloured paint that you can paint your finger with before using it if your finger isn’t the right shade.

Now, imagine if the media just referred to these two devices interchangeably as “finger saving devices”. Imagine if the Finger Saver company heavily promoted their things and got them installed in workshops in high schools, telling the shop teachers that students are now 100% safe from injuries while using the table saw, so they can just throw out all safety equipment. When, inevitably, someone gets a serious wound while using a “Finger Saver” the media goes on a rant about whether you can really trust “finger saving devices” at all.

Anyhow, this is a rant about Waymo vs. Tesla.

ParadoxSeahorse@lemmy.world on 24 Jun 18:18 next collapse

Excellent work

frenchfryenjoyer@lemmings.world on 24 Jun 18:38 next collapse

Really good analogy. loved this

localhost443@discuss.tchncs.de on 24 Jun 20:47 next collapse

That was great. The first comparison that came to mind after reading it was that they are both a game of Russian roulette…

Waymo - you get one chamber loaded with a blank, might kill you if you get it.

Tesla - you get one empty chamber… And the gun is loaded by your worst enemy

Woht24@lemmy.world on 24 Jun 21:12 next collapse

Wow…

KayLeadfoot@fedia.io on 24 Jun 21:50 next collapse

This put a smile on my face.

lowspeedchase@lemmy.dbzer0.com on 24 Jun 22:49 next collapse

Awesome read, thanks!

kerrigan778@lemmy.blahaj.zone on 24 Jun 23:04 next collapse

Waymo is also a Silicon Valley AI project to put transit workers out of work. It’s another project to get AI money and destroy labor rights. That it at least kind of works isn’t exactly helping my opinion of it. Transit is incredibly underfunded and misregulated in California/the USA, and robotaxis are a criminal misinvestment of resources.

outhouseperilous@lemmy.dbzer0.com on 24 Jun 23:52 collapse

Waymo is so much better, yeah. No problems with Waymo. Except all the times they almost hit me.

werewolfconspiracy@lemmy.world on 24 Jun 18:50 next collapse

Welcome to Johnny Cab.

Ironfist79@lemmy.world on 24 Jun 20:23 next collapse

Cruise cars were already doing this and performed far better. GM is fucking braindead and pulled the plug like usual though.

BlueMagma@sh.itjust.works on 24 Jun 20:41 collapse

Haaa, finally!! An AI taxi that behaves like a normal taxi driver. It must feel so refreshing.