This post was submitted on 27 Jul 2023
392 points (95.2% liked)

Technology


cross-posted from: https://derp.foo/post/81940

There is a discussion on Hacker News, but feel free to comment here as well.

top 50 comments
[–] abraham_linksys@sh.itjust.works 98 points 1 year ago (4 children)

We need to build special roads so self driving cars can navigate properly.

You could even connect self-driving cars together; by letting the front car pull the others, they could save their batteries.

And with these "trains" of self-driving cars pulling each other, you wouldn't have to build the self-driving car roads very wide; they could just run on narrow "tracks" for the wheels.

Then we'd have more space for human stuff instead of car stuff like roads and parking lots everywhere.

He's done it again. Elon Musk is a god damn genius.

[–] whataboutshutup@discuss.online 33 points 1 year ago (3 children)

Would you also consider making an underground version?

[–] pineapplelover@lemm.ee 18 points 1 year ago

We could call it a subway. Since it's underground.

[–] CCatMan@lemmy.one 12 points 1 year ago (1 children)
[–] AnAngryAlpaca@feddit.de 8 points 1 year ago

Can we also put it on one rail instead of two?

[–] gary_host_laptop@lemmy.ml 16 points 1 year ago

It is amazing how corpos try to reinvent shittier versions of trains.

[–] iturnedintoanewt@lemmy.world 10 points 1 year ago (2 children)

This reminded me so much of this!

[–] PipedLinkBot@feddit.rocks 8 points 1 year ago

Here is an alternative Piped link(s): https://piped.video/watch?v=5eHWVjUAukU

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source, check me out at GitHub.

[–] stanleytweedle@lemmy.world 74 points 1 year ago (2 children)

Double the fine for 'self-driving' traffic violations and bill the manufacturer for half.

[–] amanneedsamaid@sopuli.xyz 76 points 1 year ago (6 children)

Bill the manufacturer 100%, IMO. That's why I think self-driving cars pose an unanswerable legal question: when the car drives for you, why would you be at fault? And how will businesses survive if they have to take full accountability for accidents caused by self-driving cars?

I think it's almost always pointless to hold back innovation, but in this case I think a full ban on self-driving cars would be a great move.

[–] DauntingFlamingo@lemmy.ml 25 points 1 year ago* (last edited 1 year ago) (5 children)

The most basic driving, like long stretches of highway, shouldn't be banned from using AI/automated driving. Fast-paced inner-city driving should be augmented but not fully automatic. The same goes for driving in inclement weather: augmented, with hard limits on speed and automated braking for anything that could result in a crash.

Edit: I meant this statement as referring to the technology in its current consumer form (what is available to the public right at this moment). I fully expect that as the technology matures, the rate of incidents will decline. We are likely to attain a largely driverless society one day in my lifetime.

[–] snooggums@kbin.social 19 points 1 year ago (2 children)

"Self driving with driver assist" or whatever they call it when it isn't 100% automated is basically super fancy cruise control and should be treated as such. The main problem with the term autopilot is that for airplanes it means 100% control and very misleading when used for fancy cruise control in cars.

I agree that it should be limited to highways and other open roads, like when cruise control should be used. People using cruise control in the city without being ready to brake is the same basic issue.

100% full automation with no expectation of driver involvement should only be allowed when it has surpassed regular drivers. To be honest, we might even be there already, given how terrible human drivers are...

[–] GonzoVeritas@lemmy.world 21 points 1 year ago

Autopilot systems on airplanes make fewer claims about autonomous operation than Tesla does. No pilot relies completely on autopilot functionality.

[–] amju_wolf@pawb.social 4 points 1 year ago (3 children)

Autopilot in aircraft is actually kinda comparable; it still needs a skilled human operator to set it up and monitor it (and the other flight controls) all of the time. And in most modes it's not even really all that autonomous; at most it follows a pre-programmed route.

[–] dudewitbow@lemmy.ml 2 points 1 year ago

It's why I'm all for automated trucking. Truck drivers are a dwindling resource, and living the lifestyle of a cross-country truck driver isn't a highly sought-after job. The self-driving should do the long trip from hub to hub, and each hub should handle the last few miles. That keeps drivers local and fixes a problem that is only going to get worse.

[–] dan1101@lemmy.world 2 points 1 year ago (2 children)

Long stretches of highway are good unless there is a stopped emergency vehicle.

[–] amju_wolf@pawb.social 5 points 1 year ago

I mean that's a huge issue for human drivers too.

We need assistive technologies that protect us, but if at any point the driver is no longer driving, the car manufacturer needs to take full responsibility.

[–] amanneedsamaid@sopuli.xyz 2 points 1 year ago

I disagree. I feel that no matter how good the technology becomes, the odd one-in-a-million glitch that kills someone is not preferable to me over the accidents caused by humans (even if we assume self-driving cars crash at a lower rate than human drivers).

The less augmentation past lane assist and automated braking, the better, IMO. I definitely disagree with a capped speed limit built into the vehicle; it should never be limited to less than what could melt engine components or something (and even that limit should take time to kick in). The detriments that system would cause when it malfunctions far outweigh the benefits it would bring to safety.

[–] stanleytweedle@lemmy.world 5 points 1 year ago* (last edited 1 year ago) (1 children)

I think it's almost always pointless to hold back innovation, but in this case I think a full ban on self-driving cars would be a great move.

I agree on both points. Also, I think it's important to characterize the 'innovation' of self-driving as more socio-economic than technological.

The component systems (sensing, processing, communications, power, etc.) have a wide range of engineering applications, and research and development will inevitably continue no matter the future of self-driving. Self-driving only solves a very particular socio-economic-technological issue that only exists because of how humans historically chose to address the same issue with older technology. Self-driving is more of a product than a 'technology' in my book.

So my point there is that I don't think a ban on full self-driving really qualifies as 'holding back innovation' at all. It's just telling companies not to develop a specific product. It's a hyperbolic example, but nobody would say banning companies from creating a nuclear-powered oven was 'holding back innovation'. If anything, forcing us to re-envision human transportation without integrating it into legacy requirements advances innovation more than just trying to use AI to solve the problems created by using humans to solve the original problem of how to move humans around in cars.

[–] amanneedsamaid@sopuli.xyz 4 points 1 year ago

I see it the same way, but an incredible number of people I've discussed this with say that it's stupid to hold back technological innovation "like self-driving cars". It's an unnecessary piece of technology.

I also just think the whole ethical complication is fucked. The way we have it now, every driver is responsible for their actions, and no driver ever glitches out on the freeway (and if they do, they bear the consequences). Imagine a man's wife and kids getting killed by a drunk driver vs. a self-driving car. In one scenario you can clearly place blame and take action in a much more meaningful way than just suing a car manufacturer.

[–] skullgiver@popplesburger.hilciferous.nl 5 points 1 year ago* (last edited 9 months ago) (1 children)

[This comment has been deleted by an automated system]

[–] schroedingershat@lemmy.world 16 points 1 year ago (1 children)

Nah. Give Tesla the same number of points everyone else gets on their license. If the company runs out, no more cars controlled by Tesla on the roads.

[–] Arotrios@kbin.social 11 points 1 year ago (1 children)

JFC, that's frightening. It blew that red at about 30 mph and didn't even really slow down except for the curve.

[–] killall-q@kbin.social 14 points 1 year ago (5 children)

Because the car didn't recognize it as a red light, probably due to all the green lights that were facing a similar direction.

The issue is not the speed at which it took the turn, but that it cannot distinguish which traffic lights are for the lane the car is in.

[–] Anticorp@lemmy.ml 8 points 1 year ago (1 children)

Then why have I been forced to do all those reCAPTCHAs?

[–] NotMyOldRedditName@kbin.social 5 points 1 year ago* (last edited 1 year ago)

If you've watched any of their recent AI talks, they talk a lot about these unusual and complex intersections. Lane mapping in complex intersections is one of the hardest problems. Currently they're taking data from numerous cars to reconstruct intersections like this, turn them into simulations, and train on them so the system learns more and more complex situations.

There really are only two options.

Solve this with vision and AI, or solve this with HD maps.

But it has to be solved.
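To make that contrast concrete, here is a minimal sketch (with entirely made-up lane and signal names; nothing here reflects Tesla's actual stack) of how an HD-map approach pre-resolves which signal governs the ego lane, so perception only has to classify the state of a light it already knows is relevant:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class SignalObservation:
    signal_id: str      # identity of a detected light, matched against the map
    state: str          # "red", "yellow", "green", or "unknown"
    confidence: float   # classifier confidence in [0, 1]


# Hypothetical pre-built HD map: each lane ID maps to the signal that controls it.
HD_MAP_LANE_TO_SIGNAL = {
    "lane_42_left_turn": "signal_17",
}


def relevant_signal_state(ego_lane_id: str,
                          observations: list[SignalObservation]) -> Optional[str]:
    """Return the state of the signal controlling the ego lane, if confidently seen."""
    signal_id = HD_MAP_LANE_TO_SIGNAL.get(ego_lane_id)
    if signal_id is None:
        return None  # lane not in the map: fall back to vision-only reasoning
    for obs in observations:
        if obs.signal_id == signal_id and obs.confidence > 0.8:
            return obs.state
    return None  # controlling signal not confidently observed
```

The vision-only route has to infer that lane-to-signal mapping on the fly from geometry and which lights face the car, which is exactly the association step that appears to have failed in the clip.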

[–] SheeEttin@lemmy.world 3 points 1 year ago (1 children)

If it sees red and green, it should take the safe option and stop until it is sure or the driver takes over.
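That fail-safe policy is easy to state in code. A minimal sketch, assuming a made-up list of (state, confidence) detections for lights that might apply to the current lane, plus a separate confidence that the right light has been picked:

```python
def should_stop(detections: list[tuple[str, float]],
                association_confidence: float) -> bool:
    """Conservative arbitration: treat anything ambiguous as a red light.

    detections: (state, confidence) pairs for lights that may govern our lane,
                where state is "red", "yellow", "green", or "unknown".
    association_confidence: how sure we are that we've picked the light that
                            actually applies to our lane (illustrative scalar).
    """
    if association_confidence < 0.9:
        return True  # not sure which light is ours: default to stopping
    confident_states = {state for state, conf in detections if conf > 0.8}
    if not confident_states:
        return True  # nothing confidently seen: stop and hand over to the driver
    if confident_states == {"green"}:
        return False  # every confidently seen relevant light is green
    return True  # red, yellow, unknown, or conflicting signals: stop


# Example: conflicting red and green detections -> stop.
print(should_stop([("red", 0.9), ("green", 0.95)], association_confidence=0.95))  # True
```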

[–] NotMyOldRedditName@kbin.social 4 points 1 year ago* (last edited 1 year ago)

It does do that if it's unsure, but for whatever reason that failed here; it seemed sure.

I've had the car slow down in unsure situations before, so it can and does.

It just got this one very wrong for some reason.

[–] hypnocoder@lemm.ee 5 points 1 year ago (1 children)

It is a common thing on FSD Beta; it gets posted all the time to https://teslamotorsclub.com/tmc/forums/ai-autopilot-autonomous-fsd.249/

The overall trend on the forums is that the latest versions are getting worse.

[–] drekly@lemmy.world 9 points 1 year ago

It blows my mind they decided not to use LIDAR anymore. Of course it's getting worse.

[–] Chariotwheel@kbin.social 4 points 1 year ago

Eech. The comments under the original tweet are rancid. Twitter is really Musk town now.

[–] adhdplantdev@lemm.ee 4 points 1 year ago

Man, Hacker News is full of people criticizing the poster, saying he should have disengaged the system so it learns, completely missing the point that FSD should not be considered safe.
