
Tesla safety?

If Tesla's technology can't detect a motorcycle, it certainly won't spot a bicycle and rider, so this is of concern...

https://edition.cnn.com/2022/10/17/business/tesla-motorcycle-crashes-aut...

 

If you're new, please join in, and if you have questions, pop them below and the forum regulars will answer as best we can.


30 comments

hawkinspeter | 11 months ago
4 likes

ktache replied to hawkinspeter | 11 months ago
3 likes

Looks like Tesla has gone full-on arsehole driver mode.

Perhaps learned from its drivers?

Owd Big 'Ead | 1 year ago
2 likes

Surprised no one has picked up on this?

https://www.theguardian.com/technology/2023/feb/16/tesla-recall-full-sel...

Only a mere 363,000 vehicles across all model platforms, so probably far easier to blame it on individual driver error than complete system failure 

IanMSpencer replied to Owd Big 'Ead | 1 year ago
3 likes

Chuck might not be so confident pulling in front of them now.

wtjs replied to IanMSpencer | 1 year ago
5 likes

IanMSpencer wrote:

Chuck might not be so confident pulling in front of them now.

Hardly relevant, as virtually nothing he writes is either true or intended to be true.

IanMSpencer replied to wtjs | 1 year ago
4 likes

I know that and you know that, but does he know that?

hawkinspeter replied to Owd Big 'Ead | 1 year ago
1 like
Owd Big 'Ead wrote:

Surprised no one has picked up on this?

https://www.theguardian.com/technology/2023/feb/16/tesla-recall-full-sel...

Only a mere 363,000 vehicles across all model platforms, so probably far easier to blame it on individual driver error than complete system failure 

Some analysis here: https://slate.com/technology/2023/02/tesla-recall-full-self-driving-nhts...

Awavey replied to hawkinspeter | 1 year ago
0 likes

The problem is the analysis comes from the angle of: this "automated driving" software is bad because it's not fully automated, and I'll ignore that non-automated driving is invariably much worse and that there aren't long-winded analysis articles about that.

And because it allows the car to drive itself badly in some situations; again, I'll ignore that speeding, jumping yellow lights, rolling through stop signs and using the wrong lane at an intersection are pretty much standard behaviour among human drivers anyway, and wouldn't it be hilarious if the AI had just learnt these behaviours because it thought the software logic was showing it that this was how you drove in those situations. And that means Elon is a bad person and Tesla are doomed.

But Tesla's POV is that FSD is an add-on that drivers must still take full responsibility for monitoring and controlling properly, as it's a beta test programme, uniquely allowed in the US due to their laws.

It doesn't feel quite like the smoking gun when manufacturers like Ford, for instance, issued 65 recalls last year affecting 8.6 million vehicles, does it?

hawkinspeter replied to Awavey | 1 year ago
4 likes

The big issue is that FSD is not really ready for use on the roads yet (which is why the EU etc. don't allow it), and other road users are unwitting participants in Tesla's development cycle. After sitting in a colleague's Tesla and seeing the poor performance of the AI's view of the surroundings, I don't think it's even approaching the ability of the worst drivers.

ChuckSneed | 1 year ago
0 likes

I don't know about when cycling, but when I'm driving and want to overtake and can see a Tesla coming up behind in the next lane, I just pull out anyway, because they have automatic braking, so I know there's always time for me to go in front of them without any risk to me.

Backladder replied to ChuckSneed | 1 year ago
4 likes

Elon thanks you for being his crash test dummy.

charliepalooza replied to ChuckSneed | 1 year ago
3 likes

You don't want to do that. The automatic braking only works when cruise control or FSD is on.

Secret_squirrel replied to ChuckSneed | 1 year ago
2 likes

ChuckSneed wrote:

I don't know about when cycling, but when I'm driving and want to overtake and can see a Tesla coming up behind in the next lane, I just pull out anyway, because they have automatic braking, so I know there's always time for me to go in front of them without any risk to me.

Why would you do that to any driver?  That's careless driving. 

hawkinspeter | 1 year ago
2 likes

There are some details on the Bay Bridge pile-up investigation here:

https://www.notebookcheck.net/Government-investigation-finds-Tesla-FSD-was-indeed-engaged-during-the-Bay-Bridge-pileup.682641.0.html

Quote:

The Model S decided to merge into the fast lane, turning a left blinker on, then moved there and started sudden deceleration to a speed of 7 mph as if being pulled over, causing vehicles behind it to bump or crash into each other. The Tesla itself escaped relatively unscathed, as well as the driver, but some of the vehicles coming behind it couldn't react in the fast lane on time and slammed into the last cars in the pileup at significant speed. Some observers argued that a recently installed warning light at the beginning of the tunnel may have been mistaken for a traffic light, while subsequent media efforts to recreate the FSD confusion at the exact same place didn't pan out.

DylanmMackayv replied to hawkinspeter | 1 year ago
0 likes

I totally hear you about the safety concerns with Tesla's technology not detecting motorcycles. That's definitely a big worry. However, on a brighter note, have you checked out the best model 3 accessories? They can add some extra features to your ride and make it even more enjoyable. Personally, I love my Tesla's Autopilot feature. It makes driving a breeze, but I always keep my eyes on the road and my hands on the wheel. Safety is always a top priority. And, with the Model 3's built-in cameras and sensors, you can feel confident knowing that your vehicle is always keeping an eye out for you.

ktache replied to DylanmMackayv | 1 year ago
2 likes

WTF is this?

jh2727 replied to hawkinspeter | 1 year ago
2 likes
hawkinspeter wrote:

There are some details on the Bay Bridge pile-up investigation here:

https://www.notebookcheck.net/Government-investigation-finds-Tesla-FSD-was-indeed-engaged-during-the-Bay-Bridge-pileup.682641.0.html

Quote:

The Model S decided to merge into the fast lane, turning a left blinker on, then moved there and started sudden deceleration to a speed of 7 mph as if being pulled over, causing vehicles behind it to bump or crash into each other. The Tesla itself escaped relatively unscathed, as well as the driver, but some of the vehicles coming behind it couldn't react in the fast lane on time and slammed into the last cars in the pileup at significant speed. Some observers argued that a recently installed warning light at the beginning of the tunnel may have been mistaken for a traffic light, while subsequent media efforts to recreate the FSD confusion at the exact same place didn't pan out.

The first car that hit the Tesla (or its driver; it isn't clear if it was self-driving) may have an excuse for not avoiding the Tesla, if it cut in without leaving sufficient stopping distance for the vehicle behind, but every other vehicle and its driver must have been following too closely if they were unable to avoid a collision.

You don't get pile-ups when only a single (self-) driver makes an error.

Critman replied to jh2727 | 1 year ago
0 likes

Tesla has a sort of black box that records what is going on with the car. If it was in FSD mode, investigators can easily find out.

hawkinspeter replied to jh2727 | 1 year ago
0 likes

I agree, and drivers often don't leave enough space when following other vehicles.

Critman | 1 year ago
2 likes

There isn't a car made with full autopilot. Tesla states in their manuals and at the dealership that autopilot mode requires driver supervision; your hand(s) must be on the steering wheel.

hawkinspeter replied to Critman | 1 year ago
8 likes

Critman wrote:

There isn't a car made with full autopilot. Tesla states in their manuals and at the dealership that autopilot mode requires driver supervision; your hand(s) must be on the steering wheel.

Although it's promoted specifically as "Full Self Driving". There are also suggestions that the FSD mode disengages just before a crash, leaving only a second or two for the driver to take over, so that Tesla can claim it wasn't the FSD that crashed but the driver not providing sufficient supervision.
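
If that's true, it's also easy to see through, because what matters for attribution isn't whether the system was on at the instant of impact but whether it was engaged shortly beforehand; NHTSA's crash-reporting order is reported to use a 30-second window for exactly this reason. A toy sketch of that check (hypothetical log format and numbers, nothing to do with Tesla's actual telemetry):

```python
from dataclasses import dataclass

@dataclass
class LogEvent:
    t: float        # seconds into the drive
    engaged: bool   # True = FSD switched on, False = switched off

def fsd_involved(events, crash_t, window_s=30.0):
    """Was FSD engaged at any point in the window_s seconds before impact?

    Counting a crash as system-involved if the software was on shortly
    before impact defeats the "it was off when we hit" defence.
    """
    engaged = False
    for ev in sorted(events, key=lambda e: e.t):
        if ev.t > crash_t:
            break  # ignore anything logged after the crash
        if engaged and not ev.engaged and ev.t >= crash_t - window_s:
            return True   # disengaged inside the window: still involved
        engaged = ev.engaged
    return engaged        # True if still engaged at the moment of impact

# Hypothetical log: FSD on from t=0, off 1.3 seconds before a crash at t=120
events = [LogEvent(0.0, True), LogEvent(118.7, False)]
print(fsd_involved(events, crash_t=120.0))  # True
```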

Critman replied to hawkinspeter | 1 year ago
1 like

As stated above in response to jh2727, Tesla has a black box which records what the car is doing at all times. If it was in FSD, the box will show investigators the exact times when it was engaged and deactivated.

hawkinspeter replied to Critman | 1 year ago
0 likes
Critman wrote:

As stated above in response to jh2727, Tesla has a black box which records what the car is doing at all times. If it was in FSD, the box will show investigators the exact times when it was engaged and deactivated.

I don't trust Tesla to be entirely open and truthful with the data from their black box systems. They have an incentive to not assign blame to the FSD system.

ktache replied to Critman | 1 year ago
3 likes

But Musk has told us it works.

And drivers are lazy.

Then there is this.

https://road.cc/content/news/cyclist-spots-driverless-car-cycle-lane-298507

 

AlsoSomniloquism replied to Critman | 1 year ago
5 likes

Although for about 10 years, Elon has been arguing the car can drive itself NOW. So why wouldn't people believe him rather than some manual?

VeloUSA replied to AlsoSomniloquism | 1 year ago
6 likes

AlsoSomniloquism wrote:

Although for about 10 years, Elon has been arguing the car can drive itself NOW. So why wouldn't people believe him rather than some manual?

A Tesla does drive itself in FSD mode. What it fails to do is clearly identify every object it approaches or that is near the car. California has banned Tesla from promoting FSD as a feature.

hawkinspeter | 1 year ago
6 likes

It's astonishing that their "Full Self Driving" mode is allowed on public streets, as it seems very much like alpha-quality software with multiple flaws. I don't know if FSD mode is allowed on UK roads, but I hope it isn't.

I think Tesla is basically pulling a fast one on their customers: they advertise it as "Full Self Driving", but with the proviso that the driver has to stay alert and focussed to take over at any point that the software may give up (typically just before crashing).

I had the opportunity to travel in a colleague's Tesla the other week, and it's concerning just how flaky the detection algorithms seem to be. You can see a representation of the software's interpretation of the surroundings on the console tablet, and it's always shifting around and changing, which doesn't inspire confidence. It seemed to get very confused by white lines, but had a fixation with black bins for some reason: it picked out all the black bins on the pavements.

chrisonabike replied to hawkinspeter | 1 year ago
3 likes

Maybe the algorithms are indeed excellent but, like the case of the Soviet anti-tank dogs, there was an oversight in the models used in training?

Hopefully they're going to quickly get it up to speed on MAMILs and learner drivers, and then it'll be just as good as many UK drivers.

fukawitribe replied to chrisonabike | 1 year ago
2 likes

I don't think any of this is helped by Tesla moving completely away from radar sensors (they've never used lidar) and relying on vision-based systems alone. Can't help feeling that's a huge retrograde step, and I hope the hints about them bringing back radar, in high-resolution form, or adding lidar, come to something.

IanMSpencer replied to fukawitribe | 1 year ago
1 like

We combine multiple things to make sense of the visual information we get, which remember is only detailed in a pathetically small area of vision. We also have different ways of processing vision. However, we get fooled and constantly have to re-assess. We use sound to alert ourselves, not just vision - again an unreliable source of data on its own, but combined with vision it is a step forward.

We also use our "AI" to predict the movement of objects, and the difference between where we think they should be and where they are alerts us too. Even more sophisticatedly, we will use movements, or lack of movements, of other objects to predict the presence of objects that are not visible - for example, cars ahead pulling out unexpectedly might indicate a large pothole in the road that we cannot yet see, but that unusual movement alerts us to concentrate on what the threat is.

We also use context: we expect traffic lights to be around junctions but not randomly in the middle of the road, hence where lights are in unexpected places we are given warning signs in rural areas, though in urban areas we are alert to the possibility of crossings so no warning signs are needed.

Trying to do it all with cameras means that a consistent visual confusion such as shadows or unusual objects or lights can always beat the system. Add to that the eye has amazing adaptive capabilities that cameras don't. For example, although the pupil accommodates for a certain range of light, the retina itself adapts to brightness on a pixel-by-pixel basis. It is a problem with photographic cameras that they cannot capture the dynamic range that an eye can: we think nothing of looking at a scene where the sky is bright yet we can see into shadow, because the retina exposed to the shadowed area is able to adapt to some extent while the sky is not washed out. Of course, if we are being dazzled by the sun, we can also screen the sun - walking, we do this with our hands; driving, we do this with sun visors - and we use our knowledge of the sun to know instinctively how to do this. Are Teslas fitted with devices to stop direct sunlight interfering with their ability to process vision?
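
To put rough numbers on why several imperfect senses beat one camera: if each independent cue misses an obstacle some fraction of the time, the chance that all of them miss it shrinks multiplicatively. A back-of-envelope sketch, with miss rates invented purely for illustration:

```python
def fused_miss_rate(per_sensor_miss):
    """Probability that every independent cue misses the same obstacle.

    Assumes the failure modes are independent, which is optimistic
    (fog degrades cameras and lidar together), but the direction of
    the argument holds: redundant senses multiply misses away.
    """
    p = 1.0
    for miss in per_sensor_miss:
        p *= miss
    return p

# Invented miss rates, purely for illustration
print(f"camera only:                 {fused_miss_rate([0.05]):.3%}")
print(f"camera + radar/lidar:        {fused_miss_rate([0.05, 0.10]):.3%}")
print(f"+ sound + motion prediction: {fused_miss_rate([0.05, 0.10, 0.30, 0.20]):.3%}")
```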
