Tesla Driver Charged With Killing Motorcyclist After Turning on Autopilot and Browsing His Phone

illdog

Rising Star
BGOL Investor
Simply...Just another test driver providing data for development of the next update..

 

CurtDawg

Rising Star
Platinum Member
So..... Elon..... you might want to take your time with those driverless robo taxis :rolleyes:
 

Ballatician

Rising Star
BGOL Investor
Tesla's Full Self-Driving doesn’t feel safe or intelligent in my experience. I tried it once on my commute to work, and it fucked up big time. Upon exiting, I needed to turn right immediately, but the system veered to the far-left lane and then attempted to switch over three lanes just to make that right turn, even though it could have done so immediately upon exiting. Luckily, it was 5 am and no cars were on the road.
 

CurtDawg

Rising Star
Platinum Member
Tesla's Full Self-Driving doesn’t feel safe or intelligent in my experience. I tried it once on my commute to work, and it fucked up big time. Upon exiting, I needed to turn right immediately, but the system veered to the far-left lane and then attempted to switch over three lanes just to make that right turn, even though it could have done so immediately upon exiting. Luckily, it was 5 am and no cars were on the road.

Which version were you using?
Tesla has updated FSD recently
The latest version, #12, is actually pretty decent
It makes a few mistakes here & there
But otherwise it does a decent job
The previous version (version 11) was very robotic
I think they are on version 12.3.4 right now
 

DJCandle

Well-Known Member
BGOL Investor
Which version were you using?
Tesla has updated FSD recently
The latest version, #12, is actually pretty decent
It makes a few mistakes here & there
But otherwise it does a decent job
The previous version (version 11) was very robotic
I think they are on version 12.3.4 right now
The latest version has been smooth for me. I reiterate this every time these articles come out..

Tesla's Autopilot errors are 99.99% user error.

The tech itself is fine when used correctly.
 

COINTELPRO

Transnational Member
Registered

Safety is at the core of our design and engineering decisions. In 2021, we began our transition to Tesla Vision by removing radar from Model 3 and Model Y, followed by Model S and Model X in 2022. Today, in most regions around the globe, these vehicles now rely on Tesla Vision, our camera-based Autopilot system.

Since launch, we have continued to make incremental improvements in both feature parity and safety. Compared to radar-equipped vehicles, Model 3 and Model Y with Tesla Vision have either maintained or improved their active safety ratings in the U.S. and Europe, and perform better in pedestrian automatic emergency braking (AEB) intervention.

This has nothing to do with Autopilot; it's this vision crap he is trying to do that isn't working. Most cars today will pick up a pedestrian and engage automatic emergency braking.

A killer Tesla driven by some white jihadist almost assassinated me on the road. I'm pretty sure his vision system would not have engaged emergency braking.
 

Deezz

Rising Star
BGOL Investor
This autonomous driving feature is the worst thing invented for cars.

This kind of feature should only be used on a closed loop.

People have found so many different ways to defeat all the nannies put in place to make sure you are keeping your eyes on the road.
 

APOPHIS

Autodidact / Polymath
Platinum Member
The latest version has been smooth for me. I reiterate this every time these articles come out..

Tesla's Autopilot errors are 99.99% user error.

The tech itself is fine when used correctly.

The problem is that it can cause drivers not to pay attention to their surroundings due to the false sense of safety and security it creates.
Sometimes, people must be protected from themselves despite how accurate and reliable a technology may be.
 

Coldchi

Rising Star
BGOL Investor
Plenty of motorcyclists get hit by non-self-driving cars every day... it never makes the news.
 

DJCandle

Well-Known Member
BGOL Investor
The problem is that it can cause drivers not to pay attention to their surroundings due to the false sense of safety and security it creates.
Sometimes, people must be protected from themselves despite how accurate and reliable a technology may be.
Oh absolutely. I agree with the first part entirely.

I just hate the hit pieces blaming the tech when it’s clearly on the idiots who purchase the vehicles and don’t know how to use 'em.

That rarely gets acknowledged and in 2024, we fighting back against the lunatics.
 

CurtDawg

Rising Star
Platinum Member

I was thinking about this robo taxi shit
That Elon keeps saying is coming soon

I'm trying to figure this shit out.....
So for example, if I had a Tesla with FSD, could I just let it drive itself for Uber?
While I'm chillin at home getting paid
Multitasking like a mofo
:money::money::money::money:
 

Ballatician

Rising Star
BGOL Investor
Which version were you using?
Tesla has updated FSD recently
The latest version, #12, is actually pretty decent
It makes a few mistakes here & there
But otherwise it does a decent job
The previous version (version 11) was very robotic
I think they are on version 12.3.4 right now
I'm not sure which version I have but I'll double-check.

I've adjusted the settings to a more relaxed mode to minimize lane changes, but it still switches lanes, which isn’t my thing. My driving preference is to move to the far-right lane when my exit is within 1.5 miles. However, the FSD system insists on moving to the far-left lane to go faster. Then it attempts to navigate through three lanes within half a mile, which is impractical in heavy LA traffic at 5 pm. I feel uneasy when it initiates lane changes in traffic, especially with aggressive LA drivers.
 