How about when it spots a pedestrian using a crossing but just decides to continue anyway?
https://arstechnica.com/cars/2023/05/teslas-full-self-driving-sees-pedestrian-chooses-not-to-slow-down/
https://cdn.arstechnica.net/wp-content/uploads/2023/05/WholeMarsBlog_1080p1657807019703943169.mp4
Looks like Tesla has gone into full-on arsehole driver mode.
Perhaps learned from its drivers?
Surprised no one has picked up on this?
https://www.theguardian.com/technology/2023/feb/16/tesla-recall-full-sel...
A mere 363,000 vehicles across all model platforms, so probably far easier to blame it on individual driver error than complete system failure.
Chuck might not be so confident pulling in front of them now.
Hardly relevant, as virtually nothing he writes is either true or intended to be true.
I know that and you know that, but does he know that?
Some analysis here: https://slate.com/technology/2023/02/tesla-recall-full-self-driving-nhts...
The problem is that the analysis comes from the angle of: this "automated driving" software is bad because it's not fully automated, and I'll ignore that non-automated driving is invariably much worse and that there aren't long-winded analysis articles about that.
And it's bad because it allows the car to drive itself badly in some situations; again, I'll ignore that speeding, jumping yellow lights, rolling through stop signs and using the wrong lane at an intersection are pretty much standard behaviour among human drivers anyway. And wouldn't it be hilarious if the AI had just learnt these behaviours because its training showed it that this was how you drove in those situations? And all that means Elon is a bad person and Tesla are doomed.
But Tesla's POV is that FSD is an add-on that drivers must still take full responsibility for monitoring, to control it properly, as it's a beta test programme, uniquely allowed in the US due to their laws.
It doesn't feel quite like the smoking gun when manufacturers like Ford, for instance, issued 65 recalls last year affecting 8.6 million vehicles, does it?
The big issue is that FSD is not really ready for use on the roads yet (which is why the EU etc. don't allow it) and other road users are unwitting participants in Tesla's development cycle. After sitting in a colleague's Tesla and seeing the poor performance of the AI's view of the surroundings, I don't think it's even approaching the ability of the worst drivers.
I don't know about when cycling but when I'm driving and want to overtake but can see a Tesla coming up behind in the next lane, I just pull out anyway because they have automatic braking so I know there's always time for me to go in front of them without any risk to me.
Elon thanks you for being his crash test dummy
You don't want to do that. The automatic braking is only active when cruise control or FSD is on.
Why would you do that to any driver? That's careless driving.
There's some details on the Bay Bridge pile-up investigation here:
https://www.notebookcheck.net/Government-investigation-finds-Tesla-FSD-was-indeed-engaged-during-the-Bay-Bridge-pileup.682641.0.html
I totally hear you about the safety concerns with Tesla's technology not detecting motorcycles. That's definitely a big worry. However, on a brighter note, have you checked out the best model 3 accessories? They can add some extra features to your ride and make it even more enjoyable. Personally, I love my Tesla's Autopilot feature. It makes driving a breeze, but I always keep my eyes on the road and my hands on the wheel. Safety is always a top priority. And, with the Model 3's built-in cameras and sensors, you can feel confident knowing that your vehicle is always keeping an eye out for you.
WTF is this?
The first car that hit the Tesla (or its driver - it isn't clear if it was self-driving) may have an excuse for not avoiding the Tesla (if it cut in without leaving sufficient stopping distance for the vehicle behind), but every other vehicle and its driver must have been following too closely, if they were unable to avoid a collision.
You don't get pile-ups when only a single (self-) driver makes an error.
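To put some rough numbers on that (a back-of-envelope sketch with textbook figures, nothing from the crash report):

```python
# Rough numbers behind the "following too closely" point: a car
# travelling at ~70 mph (~31 m/s) with ~0.7 g braking and a 1.5 s
# reaction time needs far more room than tailgaters usually leave.
# All figures are illustrative approximations.

def stopping_distance(speed_ms, reaction_s=1.5, decel_ms2=6.9):
    """Reaction distance plus braking distance, in metres."""
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

speed = 31.0  # ~70 mph
print(f"stopping distance at 70 mph: {stopping_distance(speed):.0f} m")
# ~116 m in total; if everyone brakes equally hard, your gap to the
# car ahead must at least cover your *reaction* distance (~47 m here),
# which is how one error cascades into a pile-up.
```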
Tesla has a sort of black box that records what is going on with the car. If it was in FSD mode, investigators can easily find out.
I agree, and drivers often don't leave enough space when following other vehicles.
There isn't a car made with full autopilot. Tesla states in their manuals and at the dealership that autopilot mode requires driver supervision; your hands must be on the steering wheel.
Although it's promoted specifically as "Full Self Driving". There are also suggestions that the FSD mode disengages before crashing, with only a second or two for the driver to take over, so that Tesla can claim it wasn't the FSD that crashed but the driver not providing sufficient supervision.
As stated above in response to jh2727, Tesla has a black box which records what the car is doing at all times. If it was in FSD, the box will show investigators the exact times when it was engaged and deactivated.
I don't trust Tesla to be entirely open and truthful with the data from their black box systems. They have an incentive to not assign blame to the FSD system.
But Musk has told us it works.
And drivers are lazy.
Then there is this.
https://road.cc/content/news/cyclist-spots-driverless-car-cycle-lane-298507
Although for about 10 years Elon has been arguing the car can drive itself NOW. So why wouldn't people believe him rather than some manual?
Tesla does drive itself in FSD mode. What it fails to do is clearly identify every object it approaches or that is near the car. California has banned Tesla from promoting FSD as a feature.
It's astonishing that their "Full Self Driving" mode is allowed on public streets as it seems very much like alpha quality software with multiple flaws. I don't know if FSD mode is allowed on UK roads, but I hope it isn't.
I think Tesla is basically pulling a fast one on their customers as they advertise it as "Full Self Driving", but with the proviso that the driver has to stay alert and focussed to take over at any point that the software may give up (typically just before crashing).
I had the opportunity to travel in a colleague's Tesla the other week and it's concerning just how flaky the detection algorithms seem to be. You can see a representation of the software's interpretation of the surroundings on the console tablet, and it's always shifting around and changing, which doesn't inspire confidence. It seems to get very confused by white lines, but has a fixation with black bins for some reason - it picked out all the black bins on the pavements.
Maybe the algorithms are indeed excellent but - like the case of the Russian tank attack dogs - there was an oversight in the models used in training?
Hopefully they're going to quickly get it up to speed on MAMILs and learner drivers, then it'll be just as good as many UK drivers.
I don't think any of this is helped by Tesla moving completely away from using lidar-type sensors and relying on vision-based systems. Can't help feeling that's a huge retrograde step, and I hope the hints about them going back to it, or to high-resolution radar, come to something.
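For what it's worth, the case for keeping a ranging sensor alongside cameras fits in a few lines. A toy sketch of inverse-variance fusion, with all the numbers invented and no claim about Tesla's actual pipeline:

```python
# Why multi-sensor fusion beats vision alone: each sensor's estimate
# carries an uncertainty, and fusing them (here with simple
# inverse-variance weighting) shrinks the combined uncertainty below
# either sensor's on its own. All numbers are made up for illustration.

def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two independent range estimates."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Camera-only depth estimate: 40 m away, but poor depth precision.
# Lidar return: 42 m, with roughly centimetre-level precision.
camera_est, camera_var = 40.0, 9.0   # std dev ~3 m
lidar_est, lidar_var = 42.0, 0.01    # std dev ~0.1 m

fused, fused_var = fuse(camera_est, camera_var, lidar_est, lidar_var)
print(f"fused range: {fused:.2f} m, variance: {fused_var:.4f}")
# The fused estimate sits almost on the lidar value; drop the lidar
# and you are back to the camera's multi-metre uncertainty.
```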
We combine multiple things to make sense of the visual information we get, which, remember, is only detailed in a pathetically small area of our vision. We also have different ways of processing vision. However, we get fooled and constantly have to reassess. We use sound to alert ourselves, not just vision - again, an unreliable source of data on its own, but combined with vision it is a step forward.

We also use our "AI" to predict the movement of objects, and the difference between where we think they should be and where they are alerts us too. Even more sophisticatedly, we will use movements, or lack of movements, of other objects to predict the presence of objects that are not visible - for example, cars ahead pulling out unexpectedly might indicate a large pothole in the road that we cannot yet see, and that unusual movement alerts us to concentrate on what the threat is.

We also use context: we expect traffic lights to be around junctions but not randomly in the middle of the road, hence where lights are in unexpected places we are given warning signs in rural areas, though in urban areas we are alert to the possibility of crossings so no warning signs are needed.
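That "difference between where we think they should be and where they are" trick is easy to caricature in code. A toy constant-velocity predictor, with the positions and threshold invented purely to illustrate the idea:

```python
# A toy version of "predict, then compare": track an object with a
# constant-velocity guess and raise an alert when the observed
# position drifts too far from the prediction. All values invented.

def predict(prev_pos, velocity, dt):
    """Constant-velocity prediction of the next position (metres)."""
    return prev_pos + velocity * dt

def surprise(predicted, observed, threshold=1.5):
    """Flag when reality diverges from expectation by more than the threshold."""
    return abs(observed - predicted) > threshold

pos, vel, dt = 0.0, 10.0, 0.1          # object at 0 m, moving at 10 m/s
observations = [1.0, 2.0, 3.0, 6.5]    # the last one swerves unexpectedly

for obs in observations:
    expected = predict(pos, vel, dt)
    if surprise(expected, obs):
        print(f"attention! expected {expected:.1f} m, saw {obs:.1f} m")
    pos = obs
```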
Trying to do it all with cameras means that a consistent visual confusion such as shadows or unusual objects or lights can always beat the system. Add to that the fact that the eye has amazing adaptive capabilities that cameras don't. For example, although the pupil accommodates for a certain range of light, the retina itself adapts to brightness on a pixel-by-pixel basis. It is a problem with photographic cameras that they cannot capture the dynamic range that an eye can; we think nothing of looking at a scene where the sky is bright yet we can see into shadow, because the part of the retina exposed to the shadowed area is able to adapt to some extent while the sky is not washed out.

Of course, if we are being dazzled by the sun, we can also screen it - walking, we do this with our hands; driving, we do this with sun shades - using our knowledge of the sun to know instinctively how to do it. Are Teslas fitted with devices to stop direct sunlight interfering with their ability to process vision?
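The retina's local adaptation can be mimicked, crudely, by normalising each pixel against its neighbourhood's brightness. A toy NumPy sketch (box-filter blur and a made-up two-region scene, nothing from any real camera pipeline):

```python
# Crude stand-in for per-region retinal adaptation: divide each
# pixel's luminance by a local average, so a bright "sky" and a dark
# "shadow" are both pulled towards mid-grey. All values invented.
import numpy as np

def box_blur(img, radius=8):
    """Cheap local-average estimate via a separable box filter."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 0, blurred)

def local_tone_map(luminance, eps=1e-4):
    """Normalise each pixel by its neighbourhood's brightness."""
    return luminance / (box_blur(luminance) + eps)

# Synthetic scene: bright 'sky' on top, deep 'shadow' below.
scene = np.vstack([np.full((32, 64), 1000.0), np.full((32, 64), 1.0)])
mapped = local_tone_map(scene)
print("raw sky vs shadow:", scene[16, 32], scene[48, 32])       # 1000.0 vs 1.0
print("after local adaptation:",
      round(float(mapped[16, 32]), 2), round(float(mapped[48, 32]), 2))
# Both regions map to ~1.0: detail survives in shadow without the
# sky washing out, which a single global exposure cannot manage.
```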