Uber admits to self-driving car 'problem' for cyclists

Just saw this -

"Engineers were working to fix programming flaw that could have deadly results for cyclists days after Uber announced it would openly defy California regulators"

More -

https://www.theguardian.com/technology/2016/dec/19/uber-self-drivin...

Replies to This Discussion

If you cannot program a machine to make its road behaviour safe, you probably can't claim that a human driver can be expected to drive safely in those conditions either.

Logically, then, there may not be an acceptable safety case for cars driven by humans or by robots.

I'll be interested to see whether approval of these things is based on 'safe', or merely 'safer than a human', which is a very low bar indeed.

To prevent right-hook (left-hook here) crashes the car has, I suppose, to detect a cyclist to the side well before an intersection, slow down, let the cyclist get ahead, move to the side of the road (or into the bike lane, which is apparently the law in the US), sit behind the cyclist, and then turn, all while watching for pedestrians and other cars. A good human driver can do this, and I suppose it's all programmable, but you would hope they test it many times before release. Come to think of it, all drivers should be tested on this before licensing too.
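Just to make the ordering concrete, here's a minimal sketch of that decision sequence in Python. The `Perception` fields and action names are hypothetical placeholders of my own, not any real vehicle API; the point is only that yielding and merging are checked before the turn is ever attempted.

```python
from dataclasses import dataclass

@dataclass
class Perception:
    """Hypothetical sensor snapshot; field names are illustrative only."""
    cyclist_alongside: bool    # cyclist detected to the near side
    cyclist_ahead: bool        # cyclist has moved ahead of the car
    pedestrian_crossing: bool  # pedestrian in the crossing zone
    cross_traffic: bool        # other cars at the intersection

def next_action(p: Perception) -> str:
    """Pick the next manoeuvre on approach to a turn across a bike lane.
    Mirrors the sequence above: detect early, slow, yield, merge, then turn."""
    if p.cyclist_alongside and not p.cyclist_ahead:
        return "slow_down_and_yield"   # let the cyclist get ahead first
    if p.cyclist_ahead:
        return "merge_behind_cyclist"  # into the bike lane, sit behind
    if p.pedestrian_crossing or p.cross_traffic:
        return "wait"                  # still checking pedestrians and cars
    return "turn"

# Example: cyclist alongside, well before the intersection
print(next_action(Perception(True, False, False, False)))  # slow_down_and_yield
```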

Human drivers can make mistakes, be lazy, impatient or callous. Autonomous cars will be none of these things; they are machines, doing what they are told. But they are programmed by humans, who make mistakes, are lazy, impatient, callous, and under immense pressure.

As Uncle Bob said in a recent talk (YOW! 2016): every five years the number of programmers in the world doubles, so 50% of the programmers you meet have less than five years' experience. You know that somewhere a 22-year-old is rushing a code commit at 3am, and that code controls your car.
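The arithmetic behind that line is simple enough to check. Taking the doubling model at face value (my framing of the claim, not a measured figure), everyone with five or more years of experience was already in the population five years ago, and that group is exactly half of today's:

```python
# Back-of-envelope check of the doubling claim (assumed growth model).
# If the programmer population doubles every 5 years, the cohort with
# 5+ years of experience is simply the population as of 5 years ago.
population_today = 1.0                         # normalise today's total to 1
population_5_years_ago = population_today / 2  # one doubling period earlier
junior_share = population_today - population_5_years_ago
print(f"{junior_share:.0%} have under 5 years' experience")  # prints: 50%
```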

As a person who programs automatic things that do risky stuff, I hasten to point out that, in my domain at least, that's not true.

There's a robust process in aerospace at least. The equivalent should be applied to motor vehicles, for the same basic reasons. If it isn't, the community should be pressuring government to do its job.

Yes, Uncle Bob or another presenter, Martin Thompson, did make the point that eventually software engineering will have some kind of regulation. They used the example of the financial services industry, then noted it was not a good example and used accountancy instead.

"Driver behaviours such as speeding and texting continue to challenge efforts to reduce accidents,"

Despite test snafus in California, where driverless cars ran red lights and failed to respect bike lanes, ride-sharing upstart Uber says it's just a process of product improvement.

For all the talk of safety, the economics are clear. Trucks carrying themselves around the highways will save an estimated $US168 billion annually.

With "USD168 Billion annually" (just trucking)  up for grabs this is going to happen, no stopping it.

http://www.abc.net.au/news/2016-12-21/automation-threat-trucking-in...

We'll have autonomous trucks before cars because freeways and highways are much easier to program for than city streets, and there's a lot of money to be saved by not having to pay truck drivers.

But you still have to pay someone to sit in the cab and not drive, right?

Maybe nobody will be required to be in the cab. Or maybe a person will be required but they might be able to sleep, thus allowing truckies to work longer shifts and cut costs. And perhaps a truckie will be required for the city streets, but once they get to the freeway they can lie down and have a sleep.

Pay rates for sleep hours might be less than for driving hours.

This doesn't seem too alarming. The programmers were ignorant of the law regarding bike lanes, and now that they've been told, they're gonna change it.

It confirms the need for vigilance and political pressure to make sure autonomous cars are safe around people on bikes.

The alarming piece for me is that Uber have thumbed their nose at the SF city government and decided their cars are ready to test on public roads. From the incidents documented (running red lights, not yielding to pedestrians, hook turns across bike lanes), their cars are not ready. While this maverick, "disruptive" behaviour might have helped Uber break the taxi monopoly, it may end up causing a serious injury or death. That would trigger a huge backlash against driverless innovation, damaging those doing the right thing and testing within their limits (Google, Apple, Tesla, Volvo, Nissan/NASA etc.).

Going directly to autonomous vehicles seems wrong anyway.

Letting the robot help the driver, and deal with driver mistakes, seems to be the next step, not full on-road autonomy.

I guess Uber's programmers don't know how to drive. That's a cheery thought, eh?
