Is the Uber driverless car colour blind?
Only yesterday, Uber announced to much fanfare the expansion of its driverless car trial into its home city of San Francisco.
However, one of its autonomous vehicles has already been caught on video, posted to YouTube, blatantly running a red light as it zoomed across a pedestrian crossing outside the city's Museum of Modern Art.
![](http://web.archive.org./web/20161216164201im_/http://dynimages.themotorreport.com.au:80/tJ5EFsj6rlKUmJI-cnPh0yNoLc0=/fit-in/800x600/filters:stretch(FFFFFF)/editorial/articleLeadwide-uber-has-commenced-a-trial-of-its-self-driving-cargtbi3e.jpg)
The trial, which is a collaboration with Swedish car maker Volvo, uses the car maker's existing XC90 SUV but adds the ride-sharing company's own hardware and software, which is clearly visible on the roof.
Uber also stated when announcing the trial that every vehicle would still have an Uber technician behind the wheel to supervise the car's operation. But who will supervise the supervisor?
Driverless car technology is still in its infancy, and trials like this are intended to iron out the difficulties that arise with these vehicles.
Still, incidents like this raise interesting questions about who is at fault when driverless technology goes wrong.
Michigan this week enacted legislation making it the first US state to establish comprehensive regulations for the testing, use and eventual sale of driverless cars.
The law allows public road testing of vehicles without steering wheels, gas or brake pedals or any need for human control. It lets auto and tech companies operate driverless ride-sharing services and also lays out rules for how self-driving cars can be sold to the public once the technology has been tested and certified.
In Australia, no such laws exist, and the legal framework still insists that a driver be in charge of the vehicle.
4 Comments
Who would have thought robots would disobey rules?
Ha! Woop de doo, big deal!! If we had a headline every time a human driver ran a red light, tailgated, drove drunk/drugged, turned without indicating, etc etc, we'd have no room for any other news.
@Zen, so how many driverless cars are there on the road? One? That would make 100% bad drivers. Two? That would make 50%, etc... and add to that just how little they drive. I think you miss the point: if there is an accident, who is to blame? The car, the technology behind it, or the chump techie plonked behind the wheel?
@Fungus, actually you'll find that over the past few years Google and other companies have extensively tested, and are still testing, a considerable number of autonomous cars (dunno how many) over millions of kilometres in various scenarios, on public and private roads, with so far a 100% fatality-free record (apart from the two Tesla cases, which were due to drivers misusing Autopilot). It's early days, and regulation plus public acceptance still lag, but that's a very promising start. During the same period, cars driven by humans killed and injured several million people. Sure, blame the human, blame the mobile phone they were using instead of driving, or the drugs they took before they drove, blame the brewery that brewed the beer they drank, blame the jolly green giant if you want, but the numbers of dead and maimed clearly indicate that humans are the glaring weakest link, and the way to stop or drastically reduce over a million deaths on the road annually (according to WHO) is to remove the human. Blaming isn't going to solve a single thing.