
Tesla car in Full Self-Driving Beta mode almost rams cyclist (+ video)

YouTube video shows shocking incident in San Francisco

A video posted to YouTube this week shows the shocking moment a Tesla car operating in Full Self-Driving (FSD) Beta mode in San Francisco suddenly veered towards a cyclist, who was oblivious to the danger the vehicle placed him in.

The footage was posted to the video-sharing website by vlogger and Tesla enthusiast Omar Qazi, who immediately before the near-collision had said, “You can actually make thousands of people drive safer — just with a software update,” and who had to grab the steering wheel to bring the car back on course and avoid hitting the bike rider.

Following the close call, the driver, who goes by the name HyperChange on social media, asked “are we gonna have to cut that?” and insisted that “it wouldn’t have hit him” – although we’re not sure any cyclist would willingly take their chances at sharing roadspace with an autonomous vehicle capable of suddenly changing direction in this way.

As journalist Jason Torchinsky pointed out in his report on the incident for the motoring website Jalopnik:

Omar goes on to suggest that this sort of thing is “shocking to people because it’s new,” though I may take the bold position that it’s shocking to people because it fucking turned right toward a cyclist who was clearly visible for no good discernible reason. I think maybe that’s a bigger shock than, you know, “newness.”

Qazi claims that the FSD system “functioned exactly as designed” since “it detected that there’s a potentially dangerous situation” – although as Torchinsky highlights, it was a situation entirely of the car’s making.

“Well, the whole time you’re driving on human pilot, you’re making your car avoid hitting a biker,” Qazi continues.

“You’re constantly making your car avoid hitting a biker ... but then you’re surprised that you’re doing it for one second while on FSD Beta” – leading Torchinsky to suggest that “if you’re characterising normal driving as ‘constantly making your car avoid hitting’ anything, let alone a person on a bike, then I’d have to say your fundamental view of driving is deeply, dangerously wrong.”

“This time, nobody got hurt, and it was all pretty funny,” the journalist added. “This time.”

It’s the second incident involving a Tesla in FSD Beta mode that we have reported on this week, despite the company’s CEO, Elon Musk, claiming last month that the technology had not been responsible for a single crash since its launch in October 2020.

> Tesla using Full Self-Driving Beta crashes into cycle lane bollard ... weeks after Elon Musk's zero collisions claim

But a video shot by San Jose-based YouTuber AI Addict showed his Tesla car crashing into a segregated cycle lane bollard as it made a right turn while the self-driving mode was engaged.

In a voiceover on the video, he said: “Changing lanes ... Oh ... S***. We hit it. We actually hit that. Wow. We were so close on the corner ... I can't believe the car didn’t stop.”

“Alright, YouTube, it's confirmed I have hit that pylon. It's a first for me to actually hit an object in FSD,” he added.

According to Tesla, FSD is “capable of delivering intelligent performance and control to enable a new level of safety and autonomy.”

The technology supposedly enables the vehicle to drive itself to a destination that has been input on the car’s navigation system, although the motorist has to be prepared to assume control should something go wrong.

In December, we reported how Musk had been accused of encouraging driver distraction and putting road users in danger when it emerged that Tesla owners are now able to play video games through the car’s infotainment system while the vehicle is in operation.

> Tesla owners can now play video games while their car is moving

And commenting on an earlier version of the motor manufacturer’s software, a researcher at Stanford University in California said in 2017 that Tesla’s autonomous vehicle technology had no place being used around cyclists.

> Never use Tesla Autopilot feature around cyclists, warns robotics expert

Post-doctoral robotics researcher Heather Knight wrote that she “found the Autopilot’s agnostic behaviour around bicyclists to be frightening.”

In a review posted to Medium, she said: “I’d estimate that Autopilot classified ~30 per cent of other cars, and 1 per cent of bicyclists.

“Not being able to classify objects doesn’t mean the Tesla doesn’t see that something is there, but given the lives at stake, we recommend that people NEVER USE TESLA AUTOPILOT AROUND BICYCLISTS!”

She concluded her review by saying: “Do not treat this system as a prime time autonomous car. If you forget that, bikers will die.”

Simon joined road.cc as news editor in 2009 and is now the site’s community editor, acting as a link between the team producing the content and our readers. A law and languages graduate, published translator and former retail analyst, he has reported on issues as diverse as cycling-related court cases, anti-doping investigations, the latest developments in the bike industry and the sport’s biggest races. Now back in London full-time after 15 years living in Oxford and Cambridge, he loves cycling along the Thames but misses having his former riding buddy, Elodie the miniature schnauzer, in the basket in front of him.
