
Video: Nissan driverless car in cyclist close pass

'I was a little scared for him', says nervous passenger...

A journalist from France has caught on camera the moment a Nissan driverless car passed a cyclist without leaving enough space.

The video, shot in London as Nissan showcased its driverless progress, shows the car’s console registering the cyclist, but the car then fails to move over to give him space.

Tetsuya Iijima, global head of autonomous drive development at Nissan, is behind the wheel, but he too fails to override the car and move out, as the video, spotted by BikeBiz, shows.

One of the French journalists in the car can be heard saying, in French: "I was a little scared for him".

Last year we reported how Adrian Lord, of the transport consultancy Phil Jones Associates, fears that once technology that prevents vehicles from hitting pedestrians and cyclists reaches our roads, it will open the door for vulnerable road users to take advantage of the fact that they cannot be injured.

He said: "Once people realise that an autonomous vehicle will stop [automatically], will pedestrians and cyclists deliberately take advantage and step out or cycle in front of them?

“If that’s the case, how long would such a vehicle take to drive down Oxford Street or any other busy urban high street?”

Meanwhile John Parkin, professor of transport engineering at the University of the West of England, told the Financial Times that much of the infrastructure being implemented to keep bikes and cars apart in inner-city environments will be made redundant once autonomous technology reaches maturity.

"When fewer cars are driven by humans, in cities at least," the professor said. "There would be less need to segregate cyclists from traffic. This would allow roads to be designed as more open, shared spaces."

47 comments
thesaladdays replied to Mungecrundle | 7 years ago
1 like
Mungecrundle wrote:

The difference is that the autonomous driving algorithm has the potential to gather data and effectively learn from every single scenario.

By the sounds of it they're learning to be more like real human drivers!  Next thing they'll be bleating on about road tax in electronic voices.

Seriously though, do these algorithms contain strict red lines on adhering to safe distance minimums, or is there some allowance for taking reasonable (clearly not in this case) risks if it decides it's safe to do so?  Hope it's not the latter...
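The distinction being asked about here - a hard "red line" versus a risk-weighted trade-off - is a genuine design choice in motion planners. A minimal sketch of the two approaches, in hypothetical Python (names and thresholds are illustrative, not Nissan's actual logic):

```python
# Hypothetical sketch: two ways a planner might treat passing distance.
# Names and numbers are illustrative, not from any real system.

MIN_PASS_GAP_M = 1.5  # a hard "red line": never pass closer than this

def may_pass_hard(lateral_gap_m: float) -> bool:
    """Hard constraint: a pass below the minimum gap is simply forbidden."""
    return lateral_gap_m >= MIN_PASS_GAP_M

def pass_cost_soft(lateral_gap_m: float, delay_s: float) -> float:
    """Soft trade-off: close passes are penalised, not forbidden, so a
    long enough delay behind the cyclist can eventually 'buy' one."""
    closeness_penalty = max(0.0, MIN_PASS_GAP_M - lateral_gap_m) * 100.0
    return closeness_penalty + delay_s  # planner picks the cheapest action

# A 1.0 m pass: the hard rule says no; the soft rule merely prices it.
print(may_pass_hard(1.0))                 # False
print(pass_cost_soft(1.0, delay_s=10.0))  # 60.0, weighed against waiting
```

Under the second scheme, whether the car squeezes past depends entirely on how the penalty is tuned - which is exactly the worry raised above.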

velo-nh replied to Mungecrundle | 7 years ago
1 like
Mungecrundle wrote:

Having experienced no fewer than 3 dangerous passes on the club ride this morning, I'd far sooner take my chances with software being in control.

How many crashes has Airbus had due to automation?  And that's without the complexities of being on a road in heavy traffic.

As a software engineer, I don't see self-driving cars being a thing anytime soon.  At least not safe ones, or ones that don't require a lot of human intervention.  Worse, this is just going to lead to drivers that will be incapable on the occasion where the self-driving doesn't work.  

CygnusX1 replied to velo-nh | 7 years ago
1 like
velo-nh wrote:
Mungecrundle wrote:

Having experienced no fewer than 3 dangerous passes on the club ride this morning, I'd far sooner take my chances with software being in control.

How many crashes has Airbus had due to automation?  And that's without the complexities of being on a road in heavy traffic.

As a software engineer, I don't see self-driving cars being a thing anytime soon.  At least not safe ones, or ones that don't require a lot of human intervention.  Worse, this is just going to lead to drivers that will be incapable on the occasion where the self-driving doesn't work.  

Please tell us, Velo, just how many crashes have Airbus had due to automation? And for the same period how many more crashes due to human error?

I've just gone through this list of A320 incidents and found only one incident where the autopilot was mentioned, and several where the cause was attributed to human actions:

https://en.wikipedia.org/wiki/Accidents_and_incidents_involving_the_Airbus_A320_family

Automation

  • 5 November 2014, Lufthansa Flight 1829. The aircraft, while on autopilot, lowered the nose into a descent reaching 4000 fpm. The uncommanded pitch-down was caused by two angle of attack sensors that were jammed in their positions. The crew disconnected the related Air Data Units and were able to recover the aircraft.

vs. Human

  • 28 July 2010, Airblue Flight 202. During a non-standard, self-created approach below the minimum descent altitude, the aircraft crashed into the ground after the captain ignored 21 cockpit warnings to pull up. 146 passengers and six crew were on board the aircraft. There were no survivors.
  • 28 December 2014, Indonesia AirAsia Flight 8501. The cause was initially a malfunction in two of the plane’s rudder travel limiter units. The crew ignored the recommended procedure to deal with the problem and disengaged the autopilot, which contributed to the subsequent loss of control. All 162 on board were killed.
  • 24 March 2015, Germanwings Flight 9525. The crash was deliberately caused by the co-pilot Andreas Lubitz, who had previously been treated for suicidal tendencies and been declared "unfit to work" by a doctor. 150 killed.

Maybe this last one is unfair... or maybe not. Outside of science fiction (at least as yet), computers aren't prone to emotional instability, and even then I'm willing to take my chances with Marvin the Paranoid Android from H2G2 over some of the psychos currently out there behind a wheel. HAL 9000, on the other hand...

barbarus replied to CygnusX1 | 7 years ago
1 like
CygnusX1 wrote:

I've just gone through this list of A320 incidents and found only one incident where the autopilot was mentioned

Don't come at me with facts, this is the internet 3.0 (post-fact version)!

FluffyKittenofTindalos replied to barbarus | 7 years ago
3 likes
barbarus wrote:
CygnusX1 wrote:

I've just gone through this list of A320 incidents and found only one incident where the autopilot was mentioned

Don't come at me with facts, this is the internet 3.0 (post-fact version)!

I thought the argument that's been made is that excessive automation 'de-skills' the pilots so when they _do_ need to take over, they can't cope and screw things up? So it's not that the autopilot makes the error, it's that it leaves human pilots more prone to do so than they used to be.

It seems to be one of those plausible-but-debatable theses that some expert argues for, and which journalists who aren't as clever as they think they are then keep excitedly telling us all about as if we hadn't heard it already.

Might be true, might not, dunno, but seems relevant to self-driving-cars, given that the people using them will probably be drunk or asleep or watching movies when they are suddenly called on to intervene.

Point being I don't think one can put much trust in self-driving cars that require the human being able to 'take over' when things get tricky. They are going to have to be able to cope on their own.

ollieclark replied to FluffyKittenofTindalos | 7 years ago
0 likes
FluffyKittenofTindalos wrote:

I thought the argument that's been made is that excessive automation 'de-skills' the pilots so when they _do_ need to take over, they can't cope and screw things up? So it's not that the autopilot makes the error, it's that it leaves human pilots more prone to do so than they used to be.

It seems to be one of those plausible-but-debatable theses that some expert argues for, and which journalists who aren't as clever as they think they are then keep excitedly telling us all about as if we hadn't heard it already.

Might be true, might not, dunno, but seems relevant to self-driving-cars, given that the people using them will probably be drunk or asleep or watching movies when they are suddenly called on to intervene.

Point being I don't think one can put much trust in self-driving cars that require the human being able to 'take over' when things get tricky. They are going to have to be able to cope on their own.

Absolutely. Who wants a driverless car that you still have to be ready to drive at a moment's notice? I want it to drive me home from the pub and then go and pick someone else up, like a taxi without the bad tempered driver.

The big advantage that cars have over aeroplanes though is that when they stop, they just sit there rather than plummeting out of the sky. So car autopilots can just slow and stop (hopefully safely*) if they can't work out what to do, no driver intervention required. In a plane, the pilot HAS to take over or everyone dies.

* There are some situations where there's going to be a crash no matter whether the driver's human or cyborg.
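To illustrate the "slow and stop" fallback described above - what the industry calls a minimal risk manoeuvre - here is a small state-machine sketch (hypothetical names and numbers, not any manufacturer's real logic):

```python
# Hypothetical sketch of a "slow and stop" fallback for a car autopilot.
# Names and thresholds are illustrative, not any manufacturer's logic.

from enum import Enum, auto

class Mode(Enum):
    DRIVING = auto()
    FALLBACK = auto()  # minimal risk manoeuvre: brake gently to a halt
    STOPPED = auto()

def step(mode: Mode, perception_confident: bool, speed_mps: float) -> tuple[Mode, float]:
    """One control tick: return the next mode and a target speed."""
    if mode is Mode.DRIVING and not perception_confident:
        mode = Mode.FALLBACK  # the car no longer trusts its own picture
    if mode is Mode.FALLBACK:
        target = max(0.0, speed_mps - 2.0)  # shed ~2 m/s per tick
        if target == 0.0:
            mode = Mode.STOPPED  # just sit there; no plummeting involved
        return mode, target
    if mode is Mode.STOPPED:
        return mode, 0.0
    return mode, speed_mps  # confident: carry on as planned

# A car at 12 m/s that loses confidence eases to a stop over six ticks,
# with no human asked to grab the wheel at a moment's notice.
mode, v = Mode.DRIVING, 12.0
while mode is not Mode.STOPPED:
    mode, v = step(mode, perception_confident=False, speed_mps=v)
    print(mode.name, v)
```

Unlike an aircraft, the worst this fallback usually risks is becoming an obstruction, which is why "no driver intervention required" is at least plausible on roads.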

velo-nh replied to CygnusX1 | 7 years ago
3 likes
CygnusX1 wrote:

Please tell us, Velo, just how many crashes have Airbus had due to automation? And for the same period how many more crashes due to human error?

Air France 296, June 1988. Controversial, but the plane delayed the pilot's command to throttle up before hitting the trees.

Air France 447, May 2009. Mixed blame, with the pilots not knowing how to react to the automated systems disengaging.

Air Asia 8501 (QZ8501), Dec 2014. Blamed on over-reliance on automation leading to an inability to control the aircraft without it.

Indian Airlines 605, Feb 1990. Controversial; some parties claim the crash was caused by the same throttle behavior that downed Air France 296.

I could go on, but I think you get the idea. Yes, the automation didn't intentionally down the plane, but automation mixed with human pilot interaction has led to disasters. "Self driving" cars will still have humans, and I presume the cars will still have manual control for some time, probably past our lifetimes. The video in this article shows the degradation of skills: the idiot "driver" didn't take control when the vehicle came too close to the cyclist. That type of over-dependence is what made me think of the airlines.

CygnusX1 replied to velo-nh | 7 years ago
0 likes
velo-nh wrote:

The video in this article shows the degradation of skills: the idiot "driver" didn't take control when the vehicle came too close to the cyclist. That type of over-dependence is what made me think of the airlines.

That's why most manufacturers are now looking at going straight to SAE Level 4 - full autonomy within defined operating conditions, with no driver intervention.

Quote:

Jim McBride, autonomous vehicles expert at Ford, [is] focused on getting Ford straight to Level 4, since Level 3, which involves transferring control from car to human, can often pose difficulties. "We're not going to ask the driver to instantaneously intervene—that's not a fair proposition," McBride said.

Good article on what the levels are and the current state of development (and it's where I've lifted the quote above from):

http://www.techrepublic.com/article/autonomous-driving-levels-0-to-5-understanding-the-differences/

 

J90 replied to velo-nh | 7 years ago
0 likes

Please stop highlighting plane crashes.

CygnusX1 replied to J90 | 7 years ago
0 likes
J90 wrote:

Please stop highlighting plane crashes.

Nervous flyer?

pasley69 replied to velo-nh | 7 years ago
0 likes
velo-nh wrote:
Mungecrundle wrote:

Having experienced no fewer than 3 dangerous passes on the club ride this morning, I'd far sooner take my chances with software being in control.

How many crashes has Airbus had due to automation?  And that's without the complexities of being on a road in heavy traffic.

As a software engineer, I don't see self-driving cars being a thing anytime soon.  At least not safe ones, or ones that don't require a lot of human intervention.  Worse, this is just going to lead to drivers that will be incapable on the occasion where the self-driving doesn't work.  

Sounds great - are all cars on the road going to be separated by 90-second gaps? With assistant drivers? And traffic controllers in constant contact with drivers?

If the cyclist in question had swerved and been hit, who would be held responsible and taken to court? The driver, or the computer programming team?

And if engineers are screening the results, I hope the team includes a few pedestrians, cyclists, mothers with young children and disabled pensioners. I rather suspect most highly paid engineers and computer programmers may not have the best interests of other road users in mind.

Leviathan | 7 years ago
3 likes

Le singe est dans l'arbre. [The monkey is in the tree.]
 

ConcordeCX replied to Leviathan | 7 years ago
2 likes
Leviathan wrote:

Le singe est dans l'arbre. [The monkey is in the tree.]
 

le singe est au volant [the monkey is at the wheel]

drosco | 7 years ago
2 likes

My mate was the victim of a close pass by a Tesla the other day. When he confronted him, the driver's response was 'sorry mate, it was on autopilot'.

bstock | 7 years ago
9 likes
Quote:

He said: "Once people realise that an autonomous vehicle will stop [automatically], will pedestrians and cyclists deliberately take advantage and step out or cycle in front of them?

“If that’s the case, how long would such a vehicle take to drive down Oxford Street or any other busy urban high street?”

 

Sounds like a plan for far more civilised cities to me.

Strathbean | 7 years ago
3 likes

The thought of these simpering geeks faffing about with this nonsense anywhere on public roads beggars belief.

Housecathst | 7 years ago
12 likes

People are obsessed by the idea that cyclists and pedestrians will 'take advantage of driverless cars'. It sounds like they're judging cyclists and pedestrians the same way motorists currently act towards vulnerable road users: 'it's OK, they'll get out of my way or they'll be dead'. The difference is that people in driverless cars aren't going to have their lives put in danger by the actions of a cyclist or a pedestrian on a daily basis; they're just going to have a slower journey. Oh, the humanity.

Aren't they going to pedestrianise Oxford Street anyway? So yeah, it's going to take a very long time to drive down there.
