Elon Musk’s FSD v12 demo includes a near miss at a red light and doxxing Mark Zuckerberg

Elon Musk posted a 45-minute live demonstration of v12 of Tesla’s Full Self-Driving feature. During the video, Musk has to take control of the vehicle after it nearly runs a red light. He also doxxes Mark Zuckerberg.

Vub ,

I don’t understand much of this stuff but does this mean they (he…) threw a decade of research out the window and instead fed an AI loads of video data to start over from scratch?

ours ,

I’m guessing it’s just more complicated than that. Training an AI model with loads of video data is how they got this far, but they seem to be hitting the limits of the current process/sensors.

skymtf ,

near miss red light lmao

atfergs ,

I wonder who got fired after that.

DampSquid , (edited )

I assume this is all just bullshit and lies like last time?

Honytawk ,

When was it ever different from old musky?

How long before the cybertruck is released again?

EpsilonVonVehron ,

Cybertruck, hyperloop, etc, etc. Musk said in 2016: “I really would consider autonomous driving to be basically a solved problem. I think we’re basically less than two years away from complete autonomy.”

Thorny_Thicket , (edited )

AI DRIVR posted an interesting analysis of v12 on YouTube. Apparently it’s completely different from the previous versions: instead of being taught traffic rules, it learns from videos of people driving, which means it does things like not fully stopping at stop signs and driving over the speed limit, just like people do.

It’s interesting because by strictly following traffic rules you might in fact be a danger to others, but by driving like humans you’re also breaking the law. It’s a good example of a situation where the “right” thing to do might not be the most intuitive one, though in this case it’s still up for debate.

PipedLinkBot ,

Here is an alternative Piped link(s): piped.video/watch?v=ZI7-Swmuo4A

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source, check me out at GitHub.

ultratiem ,

That’s what we were all clamoring for: a self-driving machine that operates like a mouth breather late for work.

Elon is a masterclass of stupid.

Thorny_Thicket ,

Perhaps you should put your hatred towards Elon aside for a while and objectively consider what actually is the better solution here.

One could argue that strictly following the rules is the right approach, and perhaps it would be if everyone actually drove that way. However, in reality, that’s not usually the case. What truly increases traffic safety is predictability. If most drivers are rolling through stop signs and you’re the only one stopping completely, while you might technically be in the right, your behaviour could lead to accidents due to the unpredictability. The same applies to speeding. Driving significantly slower than the flow of traffic might slow down the traffic flow, leading to unsafe overtakings and such. While you might be legally correct here too, in practice, a slight increase in speed could lead to increased road safety.

These are complex issues. A dose of humility might go a long way instead of acting like the answer is obvious.

trashgirlfriend ,

The answer is clear and easy.

Don’t let computers have full control over freely moving several ton death machines.

i_r_n00b ,

This is such a cop-out. “No computers!”, but it’s okay to let someone drive who isn’t paying attention because they’re deep in their phone? I drive a motorcycle and I’ve had people stare me straight in the eye, only to pull out in front of me and nearly kill me.

People are notoriously bad at driving. The computer doesn’t have to be perfect, just better than the soccer moms or distracted dummies.

Honytawk ,

After a while the human will be the bottleneck in preventing accidents.

Computers are a lot better at following the law.

OnionQuest ,

It’s simply solved by the fact that I, as a human driver, can now recognize when a robo-taxi is driving and change my expectations of the car’s behavior. Right now it’s clearly evident what an autonomous car looks like, and a reasonable person will expect it to follow the letter of the law.

I interact with these vehicles on a daily basis in San Francisco and it would be weird if they weren’t driving perfectly.

Honytawk ,

What better predictability is there than actually following the law?

Self driving cars should be better than us, not be just like us.

Thorny_Thicket ,

Even if a self-driving car behaves like a human driver, it still exceeds humans a thousandfold in processing and reaction speed. For a truly advanced self-driving system, plowing through stop signs and speeding should be a non-issue, because unlike humans it can pay 100% attention to its surroundings 100% of the time and react instantly when needed.

SatanicNotMessianic ,

I do this kind of thing for a living, and have done so for going on 30 years. I study complex systems and how they use learning and adaptation.

Musk’s approach to these systems is idiotic and shows no understanding of or appreciation for how complex systems - animals, in particular - actually work. He wanted to avoid giving his vehicles lidar, for instance, because animals can navigate the world without it. Yet he didn’t give them the perceptual or cognitive capabilities that animals have, nor did he take into account that the problems of animal locomotion solved by evolution are very different from the problems solved by people driving vehicles. It, of course, didn’t work, and now Tesla is trailing the pack on self-driving capability, with the big three German carmakers and others prepping class 3 vehicles for shipping.

If he is trying to chatgpt his way out of the corner he’s painted himself into, he’s just going to make it worse - and, amusingly, for the same reasons. Vision is just one dimension of sensation, and cars are not people, or antelopes, or fish, or whatever his current analogy is.

This is just Elon Eloning again. No one predicts that a car coming towards them is going to do a California stop at a stop sign. If I’m pulling into an intersection and I see someone rolling through a stop sign, I’m hitting the brakes, because obviously a) they didn’t see me and b) they don’t know the rules of the road. Elon’s cars have a problem with cross traffic and emergency vehicles anyway; making the logic fuzzier is not going to improve the situation. If he thinks throwing video and telemetry data at a large model is going to overcome his under-engineered autonomous system, I suspect he’s in for a rude discovery.

If there’s anything kids today can learn from Elon (or from Trump for that matter), it’s how to be so confidently wrong that people throw money at you. The problem is that if you’re not already born into wealth and privilege, you’re likely to merely become the owner of the most successful line of car dealerships in a suburban county in Pennsylvania, or else in prison for fraud.

Thorny_Thicket ,

If FSD is trained from billions of hours of video data then it by definition drives like an average driver and thus is highly predictable.

SatanicNotMessianic ,

That’s not how it works, unfortunately. That’s how people want it to work, but it’s not how it works.

This is just more of Elon’s pie in the sky.

Thorny_Thicket ,

If you’ve done this kind of stuff for a living for the past 30 years, then I’m sure you can give me a better explanation than “that’s not how it works”.

ultratiem ,

The better solution is to not program your machine to act like a clown behind the wheel, committing all manner of traffic offences because ThAt’s HoW ReGulAr PeoPlE DrIve!

We aren’t trying to make autopilot act like a real, bona fide driver; we’re just removing the inconvenience of needing to do the driving.

Thorny_Thicket ,

That depends on what you value.

If you want self-driving cars that follow traffic rules to the letter even if that means more people are going to die, then that’s fine. I don’t agree, but I can see why someone would think that. Personally I would prioritize human life, so if it turns out this is one of the cases where bending the rules does in fact lead to fewer accidents, then that’s what I’m voting for.

I’m not claiming either is true. Just asking to consider the fact that the right thing to do is not always intuitive.

ultratiem ,

Oh we all know what Elon values 🤪

Let’s pluck out this forced choice fallacy first off. I’m going to opt for c) I want self driving cars to obey the rules of the road “to the letter” and keep people safe. If not, why do they even make traffic rules?

You and Elon want the cool self driving car that cruises 60 in a 50 with traffic and occasionally doesn’t check its blind spot but quickly recovers and gives a quick wave like sorry bro my bad.

I mean, okay Jerry.

Thorny_Thicket ,

That third option is the first option in my view.

For the sake of argument, let’s imagine that most people drive 10 km/h over the speed limit on highways, and that statistically a significant number of accidents happen when people overtake someone driving slower.

Now, by driving faster, these dangerous overtakes happen far less often, resulting in an overall increase in safety, but it’s also against the rules. So how does your “third option” solve this issue?

batmaniam ,

When someone is driving, if they misjudge and bend the rules at the wrong time and kill someone, they go to court. They can potentially be convicted of all sorts of things.

Who’s going to court when a car does it? Who serves the jail time?

Thorny_Thicket ,

With the current systems, the driver, obviously. These systems are not yet advanced enough to be blindly relied on.

batmaniam ,

I should have been more clear: I meant an AI trained to break the rules the way we’re talking about. Having the ability to make a judgement also means responsibility for that judgement. If I cross a double yellow to get around farm equipment on a back country road, and I misjudge and kill someone, it’s on me. It doesn’t matter if 999 times out of 1000 I could have broken the rules responsibly.

So who goes to jail when a car does it?

Thorny_Thicket ,

Well, it’s an ongoing discussion with no definite answer, but here’s how I see it:

Let’s say a car manufacturer comes up with a self-driving vehicle that is proven to be, let’s say, 3 times better than a skilled human driver. It is then objectively true to say that everyone would be safer in one of these cars. You could even argue it’s the responsible thing to do, especially compared to driving by yourself, right?

Well, maybe as a society, we don’t prohibit people from driving, but you must then acknowledge that if you cause an accident, you would also suffer the consequences. However, even these self-driving vehicles aren’t foolproof. Despite being 3 times safer, they will still end up in accidents. Who do we blame for this, then? That’s what I take it you’re asking?

No one, really, I guess. Assigning blame might not be the most productive thing to do, and it could be more reasonable to think of these accidents as a collective risk that users willingly accept when using these products. You’re already accepting that risk now, so taking a risk three times smaller shouldn’t be an issue. Perhaps it’s conceivable that the vehicle manufacturer pays some compensation to the victim/family too but not because it’s their fault per se, but because they can afford it and it seems like the fair thing to do.

batmaniam ,

Fun conversation.

I don’t think the statistics resolve the issue, though. At the end of the day, you can’t give something agency without accountability. I guess it’s similar to a well-behaved dog at a park that loses it and eats an old man or something. The statistics only matter so much: the owner introduced an unpredictable element with its own agency, and since you can’t hold a dog accountable, the owner inherits that responsibility.

When I drive, I do accept a risk, but I do so knowing there is a set of rules everyone is following to minimize that risk, and that there’s accountability should someone choose not to follow them. I guess what I’m saying is that an autonomous vehicle reducing my risk by 3x, 100x, 1000x doesn’t change the accountability for a single instance in which it got it wrong. Not when we’re talking about it knowingly and intentionally violating established traffic laws. That’s like saying a highly trained race car driver gets off the hook for hitting someone while driving way too fast in public because, statistically, they’re actually much less of a risk to the public than most drivers.

This is all assuming, by the way, that we’re talking about a well tested, well understood system. I think having vehicles on the road right now which are advertised as “full self driving”, when there are known issues, makes a whole group of people directly responsible for any deaths that occur.

Thorny_Thicket ,

Moral questions about autonomous vehicles are an interesting subject. There are a lot of difficult questions like this that we have to answer. For example, there’s also the issue of whether, in the case of an unavoidable accident, the car should prioritize the lives of the passengers over everyone else, meaning that given the choice it would rather drive over a pedestrian than hit a brick wall. A human doesn’t have time to think about this and react in time, but an AI does.

NotYourSocialWorker ,

If most drivers are rolling through stop signs and you’re the only one stopping completely, while you might technically be in the right, your behaviour could lead to accidents due to the unpredictability.

Simply no. If you as a driver aren’t prepared for the car in front of you to actually stop when there’s a sign that says stop, and if you aren’t keeping enough distance to be able to brake, then it isn’t the car in front that is the problem or the one causing the accident; it’s you and only you.

The same applies to speeding. Driving significantly slower than the flow of traffic might slow down the traffic flow, leading to unsafe overtakings and such.

Again no. If they are driving at the posted speed, keeping that speed and driving predictably, then the ones driving “significantly” faster are the ones decreasing road safety. No-one is forcing them to perform “unsafe overtakings and such”. Also, just because you, from your vantage point, can’t see a reason for the car in front of you driving slowly doesn’t mean that there isn’t one.

While a dose of humility is good, a dose of personal responsibility is also great.

Thorny_Thicket ,

then the ones driving “significantly” faster are the ones decreasing road safety. No-one is forcing them to perform “unsafe overtakings and such”.

I’m not claiming it is so, but I’m saying it’s conceivable that if the autonomous vehicle drives slightly over the speed limit, with the flow of traffic, it may actually lead to a statistically significant drop in accidents compared to the scenario where it follows the speed limit. Yes, no one is forcing other drivers to behave in such a way, but they do, and because of that, people die. In this case, forcing self-driving cars to follow traffic rules to the letter would paradoxically mean you’re choosing to kill and injure more people.

I don’t think the answer to this kind of moral question is obvious. Traffic is such a complex system, and there are probably many other examples where the actually safer thing to do is not what you’d intuitively think.

EpsilonVonVehron ,

Musk doesn’t care about laws. As mentioned in another article, he appears to be operating the phone by hand from the driver’s seat, which is both a driving violation and against Tesla’s own driver manual.

ultratiem ,

Same guy who parades around in his private jet calling everyone who doesn’t return to the office amoral and selfish.

So yeah. All that tracks. The entire “it’s different because it’s me” stench wafting in.

morriscox ,

WTH is wrong with “mouth breathers”? What ass grasped for some new insult and came up with that? It’s a lame, stupid insult.

AnUnusualRelic ,

Autonomous cars will only work properly in areas where humans aren’t allowed to drive.

variaatio , (edited )

It’s interesting because by strictly following traffic rules you might in fact be a danger to others, but by driving like humans you’re also breaking the law.

Well, the others should also stop breaking the law; then things are safe again. One doesn’t solve the problem of illegal murder by making murder legal. If someone is a danger to someone else while driving legally, then the source of the problem is the other person’s behaviour, since the rules don’t include anything like “be obnoxious and a hindrance to others”.

Other drivers must drive expecting that the others involved might be driving by the rules: leaving enough room in case the car in front does in fact stop at the stop sign, since it might have to emergency stop anyway. If one isn’t far enough back to leave room for a stop-sign stop, one certainly doesn’t have the safe distance needed to anticipate the car in front having to emergency stop at any moment due to a suddenly developing situation. One must always leave avoidance distance.

Drive at the speed limit and not a little over? It is the speeding overtaker’s fault that they took a dangerous overtake they shouldn’t have, because they were “annoyed” by someone driving at the speed limit, and thus caused a crash.

There are very few cases where driving by the rules is the cause of danger. Other drivers being foolhardy, emotional idiots is the source of danger, and the fault will and should land with the foolhardy idiot.

As the NHTSA said when making Tesla remove the “California stop” (rolling through the stop sign without stopping), others breaking the law doesn’t make it legal for you. In fact, such arbitrary cultural behavior, which some follow and some don’t, is a source of danger due to the uncertainty it causes.

edit: So in the long term the car is safer by following the rules, since it induces others to drive legally and predictably, especially since machines don’t use human non-verbal hints and so on. Thus the only sensible route for a driving machine, as opposed to a driving human, is to strictly follow traffic rules, since that makes it a predictable player. Humans have no cultural way to gauge how a “driving machine would behave” except by the one publicly known precedent it could be expected to follow: driving by the rules to the letter. Those rules do include the simple rule that every driver has an obligation to try to avoid a collision, or to minimize it when it can’t be avoided, and no amount of “but the rules say” overrules that. So there will be no cyborg car bowling down a pedestrian or another car because “technically the other person was breaking the law; the car had right of way”.

Thorny_Thicket ,

I obviously don’t know for sure, but at least it’s conceivable that, in fact, it may be the case that erratic behavior of other drivers, caused by someone else driving slower than them, leads to a significant number of accidents every year that would not have happened had they been driving at the same speed as everyone else.

In this case, forcing the self-driving vehicle to never go over the speed limit literally means you’re knowingly choosing an option that leads to more people dying instead of less.

I think there’s a pretty clear moral dilemma here. I’m not claiming to know the right way forward, but I just want to point out that strictly following the rules without an exception is not always what leads to the best results. Of course, allowing self-driving cars to break the rules comes with its own issues, but this just further points to the complexity of this issue.

variaatio ,

Then again, if “following the others’ behavior” means driving faster, that also leads to accidents: not many with the other, frustrated drivers, but with, say, wildlife. People more often fail to stop in time due to the increased speed and thus increased braking distance.

That is why bendy, narrow roads have lower speed limits. The limit is a function of the predicted reaction time and the sight distance one has.

You can’t cheat physics: the more speeding there is, the longer the braking distances, and the more often it isn’t a near miss thanks to braking in time but a full-on collision.

So sure, everyone is more in sync, but everyone is in sync with less reaction time available when the unavoidable chaos factor raises its head: chaos like wildlife (which is not obliged to follow traffic rules) or, say, someone blowing a tire, leading to a sudden change in speed and control.

Thorny_Thicket ,

When a self-driving car drives at or below the speed limit on a fast-moving highway, it can disrupt the natural flow of traffic. This can lead to a higher chance of accidents when other human drivers resort to aggressive maneuvers like tailgating, risky overtaking, or sudden lane changes. I’m not claiming that it does so for a fact, but it is conceivable, and that’s the point of my argument.

Now, contrast this with a self-driving car that adjusts its speed to match the prevailing traffic conditions, even if it means slightly exceeding the speed limit. By doing so, it can blend with the surrounding traffic and reduce the chances of accidents. It’s not about encouraging speeding but rather adapting to the behavior of other human drivers.

Of course, we should prioritize safety and adhere to traffic rules whenever possible. However, sometimes the safest thing to do might be temporarily going with the flow, even if it means bending the speed limit rules slightly. The paradox lies in the fact that by mimicking human behavior to a certain extent, self-driving cars can contribute to overall road safety. It’s a nuanced issue, but it underscores the complexity of integrating autonomous vehicles into a world where human drivers are far from perfect. This would not be an issue if every car were driven by a competent AI and there were no human drivers.

EdibleFriend ,

Eh, I hate the dipshit, but he has a point. It’s not really doxxing when he literally just googled it live.

just_another_person ,

Intent.

EyesEyesBaby ,

If we apply that same theory to @ElonJet, then that wouldn’t be doxxing either. Obviously Elon thinks otherwise. If @ElonJet is doxxing, then so is this.

EdibleFriend ,

Never said it wasn’t. It’s basically the same thing.

lazyvar ,

Isn’t that a little bit of circular reasoning?

If I doxx someone online and it then gets indexed by Google, does it stop being doxxing when someone later Googles the information?

I’d assume most doxxing isn’t done by someone who has unique firsthand knowledge (e.g. “Oh I know John, he lives on so and so road”) and instead is done by finding the information online whether via Google or a different public source.

At least in the US, where a ridiculous amount of private information is deemed “public”.

EdibleFriend ,

Not really? Because in your scenario, Musk would have had to be the person who originally posted his info. He didn’t even have to go drop a few bucks on Spokeo or something.

lazyvar ,

That’s what I’m saying. In most cases the doxxer isn’t the one who originally provided the info, but rather someone who has found the information online via a Google search or something similar.

EdibleFriend ,

And in most situations the person releasing the information isn’t doxxing someone who’s world famous and takes all of 4 seconds to find information on. Usually when you hear about doxxing it’s… well… someone like you: hidden behind an anon nickname, until some weird-ass neckbeard digs and digs to find enough information to get your name and go from there.

It’s quite a bit different when the person is one of the wealthiest and most powerful men on the planet and so much about him is just very basic, known information.

timkenhan ,

Releasing the information and acquiring already-released information are two different things.

lazyvar ,

Most doxxers don’t technically release the information, rather they’ve acquired it and point others to where they’ve acquired it or simply disseminate it further.

indepndnt ,

disseminate it further.

AKA release it.

silvercove ,

Is that why he has been trying to ban ElonJet?

EdibleFriend ,

Lol, already talked about that elsewhere. Yep, it’s exactly the same thing. I’m not saying he isn’t a hypocrite.

Aurenkin ,

Excuse me, this is the internet. You have to form a view on someone and then either agree or disagree with everything they do consistently otherwise it’s illegal.

EdibleFriend ,

Lol yep. People are talking to me like I’m some kind of fanboy when all I really said was that maybe, this time, he didn’t actually strangle a puppy.

autotldr Bot ,

This is the best summary I could come up with:


“That’s why we’ve not released this to the public yet.” (FSD is technically a beta software, though Musk has said that v12 will be the first time Tesla removes that label.)

But the moment when Musk was forced to intervene at the traffic light has already been seized upon by critics who say Tesla’s approach to autonomous driving is insufficient and reckless.

Musk has said that FSD is being tested as beta software to emphasize the need for drivers to pay attention to the road while using the driver-assist feature.

(Remember, Musk has banned the @ElonJet account that tracks his private jet from X/Twitter, claiming it was a “direct personal safety risk” to him.)

The broader context here is that the federal government’s two-year investigation into Tesla’s highway driver-assist feature, Autopilot, is nearing its end, which may have prompted Musk to post the video as provocation.

The government could force a recall of Autopilot and, by extension, FSD, which could affect Tesla’s valuation, much of which hinges on the company’s promise that it will offer full autonomy to its customers in the near future.


The original article contains 656 words, the summary contains 184 words. Saved 72%. I’m a bot and I’m open source!
