Wonder if they will hold the software programmer liable for the death.
In reality it will be the first piece of much evidence used to ensure that, in the not-too-distant future, no pedestrians or cyclists will be allowed on roads with autonomous vehicles 'for their own safety', and that people will only be able to cross at crossings by law, even in the UK.
surely it should be the driver in the car who should be charged? He was there as a safeguard.
Wonder if they will hold the software programmer liable for the death.
This is why the Trolley Problem is a factor in autonomous car design and legislation.
surely it should be the driver in the car who should be charged? He was there as a safeguard.
In theory yes, but in practice no. The "driver" is only really of use if they have some warning. For any "oh shit, we need a response now" moment they aren't going to be much use.
Will be watching this with great interest. Having read about Arizona relaxing the rules on self-driving cars much further than other states, I wonder if the victim's family might sue the state government.
I do get the impression that "jaywalking" carries much more stigma over there though.
Well jaywalking is an offence in some jurisdictions. Don’t know about that one though. And of course it shouldn’t affect the software design (there are many reasons why a pedestrian or other obstacle might be in the road).
Has there been any context around the nature of the incident though?
All I've seen is 'there's been a death and an automated car is involved'.
Nope, no context that I’ve seen. For all we know she tripped and landed just in front of it
I'm somehow wholly unsurprised it's an uber.
The liability stuff is way more complicated than "sue the programmer", as there are a lot of options depending on where they are and what might be considered reasonable for the vehicle to have anticipated.
e.g. https://en.wikipedia.org/wiki/Autonomous_car_liability
The references section there gives you an idea of just how much work is going in to it.
On that note, it did say there was a driver in the car to take control, so assuming the human in charge of the car was paying attention, it may have been an unavoidable accident. But without more details it's all just conjecture.
16 pedestrians are killed in the states every day by drivered cars, I think we should ban all cars until we can reduce that to zero.
Robot car was busy twitter-botting the Putin election campaign. Unlike its undistractable meatsack counterparts, robot car should've been paying more attention.
Is it just me who thinks these things shouldn't be on the road yet?
Apparently a cyclist..
https://news.sky.com/story/uber-suspends-self-driving-car-testing-after-cyclist-is-killed-11297320
I work in the industry and IMO people have got wwaayy more confidence in this technology than they should have at the moment. Governments all over the world seem to be rushing to hand out licenses to test autonomous cars on the roads to attract the R&D centres, way before the technology is ready. The civil service should be doing work to make sure the legislation is up to scratch and everything is tested properly, but they are hopelessly outmatched in terms of salaries, skills and the sheer pace of change of the companies and technology they are trying to regulate. It seems like they are unable to tell how mature a particular company's technology is. Personally, I wouldn't trust anyone other than Google.
Spot on, northwind. If you want to keep killing people until you hit perfection then you are missing the point.
in the not too far off future no pedestrians or cyclists will be allowed on roads with autonomous vehicles ‘for their own safety’ and that people will only be able to cross on crossings by law even in the uk.
The UK has these already; we call them motorways and, in places, dual carriageways. See also tunnels and some bridges. It is a logical step to create higher-speed, closed, uninterrupted transit lanes so that cars etc. can move quickly without having to deal with merging and pedestrians.
Is it just me who thinks these things shouldn’t be on the road yet?
In the 2 months since I've been back in the UK I reckon I have seen a couple of thousand drivers that should be off the road before any driverless car.
Is it just me who thinks these things shouldn’t be on the road yet?
With a human driver overseeing things with complete manual override under strict conditions? I don't see a problem with it. To take issue would be the same as blaming ABS or cruise control for having an accident.
It wouldn't surprise me if this was driver error, or pedestrian error, or even a mechanical failure, but I'm reserving judgment.
It's a bit sensationalist just to assume the automation aspect of the car is at fault when there are so many variables and no specific information has been released.
Google cars could have killed dozens, you don't know about it because they'd remove it from their search results. (said in jest but when you hold the keys to the eyeball castle it's possible to guide opinions)
It's a statistical certainty autonomous cars will kill given enough road time, though it might be a lot less than meatbags do. What level is acceptable to society? If I was investing a billion or two in this tech it's a question I would have already asked.
TBH in autonomous mode you can't really expect the "driver" to take over in an emergency- in some slower emerging thing like a breakdown or contention or something, then sure but in a collision it's just not going to happen, the driver will be switched off and unready.
The bottom line is that real world testing is required to make these things better, because it's only connecting with the countless idiot things that happen in the real world that you'll get a result that works there. In exactly the same way as we instruct learners on the street then let them out when they're only barely capable. Self-driving clearly isn't there yet- but if we don't let it out in the wild then it probably never will be.
TBH in autonomous mode you can’t really expect the “driver” to take over in an emergency
A good point, but in a controlled test, or proof of concept test, which this hopefully was as it was on a real road with real hazards, you'd expect the human in the car to be ready to brake or take evasive action at any second if things didn't look right.
We really need more information on what exactly happened.
Picture in the link further up is pretty heartbreaking.
That picture says nothing, we already know that there was a collision between the car and a pedestrian pushing a bike across the road.
Presumably the car was fully camera'd up and that footage will reveal more of what actually happened.
Interesting angle from the independent article
Until recently, they have required a real person to be sat in the front of the car and ready to take over – but recently California officials approved the testing of such vehicles without humans in the front seats.
I wonder if uber were trying to run before they could walk, considering the first company to market with a viable autonomous vehicle stands to make a (forgive the pun) killing.
It will be interesting to see if the human in the car was actually ready to intervene, or if there even was a human in the car.
Either way it looks very bad for Uber, and California I guess... there's some criminal negligence there for sure.
I wonder whether any of those calling for such cars to be banned as a result of this will manage the logical connection that exactly the same arguments could be used for banning cars with human drivers.
Sad news.
Unfortunately I think the pedestrian will be found to be at fault here, seeing as the police are quoted as saying she wasn't using a pedestrian crossing. This is the US after all where the car is king.
I'm not 100% happy with autonomous cars driving around urban areas until they have done many miles on easier roads like Google has done. Uber have been doing this a lot less so should be restricted to main routes. Not saying it would have prevented this crash but it seems like some tech companies are running way before they can walk.
Listening to someone on Radio 4 the other day, it is clear that laws, governance, insurance policies etc. have not kept up and all need to change to cater for the questions raised here.
The key one is that the insurance companies need to treat the autonomous car in the same way as a person for liability.
A policy could also be void if you haven't applied the latest patch, which could catch a lot of people out in future...
in the not too far off future no pedestrians or cyclists will be allowed on roads with autonomous vehicles ‘for their own safety’ and that people will only be able to cross on crossings by law even in the uk.
I reckon you will only be allowed to use the road system with some kind of "transponder" which can be pinged for a response of who, where, what, how fast and in what direction. The transponder box will also control the speed of driven powered vehicles so there are no surprises for self-driving ones. Having this would be useful for helping with things like "blind" overtakes, who gets to go at box junctions, congestion reduction, forming convoys on motorways etc.
Bez's piece seems highly relevant here:
https://singletrackmag.com/wp-content/uploads/2017/09/the-law-will-be-fixed/
forget the law and technology..
who is actually mad enough to get into a coffin on wheels with a bit of software in charge?
Bez’s piece seems highly relevant here
Not really
As per comments under the article. No one is using V2X in any real sense for actual driving, as it's a logistical and technical nightmare. Plus the biggest markets in the world have patchy network coverage and large numbers of people with no access to that level of tech. Not to mention things like wild animals that you don't tag. (not a problem in the UK where the biggest thing most people hit is usually a malnourished badger, but hitting a kangaroo or Elk/Moose is a serious issue.)
It's more in use for things like improving traffic flow (vehicle-to-traffic-lights is something that's been talked about).
who is actually mad enough to get into a coffin on wheels with a bit of software in charge?
About 3.5 billion a year as far as I can tell. ;o) And most of them are several thousand metres up.
The thing that interested me about the news coverage was that the victim's prior criminal misdemeanours were being highlighted within hours of her death. Marijuana convictions, complete with mugshots of her looking a bit ropey.
Must be a lot of people with significant sums invested in the 'transport of the future' to need the poor woman to be trashed so comprehensively, so quickly. In truth we have no idea who was to blame for her death.
People will have to get used to the fact that, in some circumstances, even the best autonomous vehicle will run people over and kill them. You would hope that they return better figures than the present arrangement though.
Money is driving the development, not some need for them. Regulators won't be able to regulate fast enough and we will end up with them being pushed onto the roads, ready or not.
We don't live in a perfect world, no software will be perfect and no human will be. In both situations we still don't have a need for software driven cars.
There's some more context here: Tempe police chief says early probe shows no fault by Uber. Seems like she stepped out in front of the car.
Edit: to remove random garbage put in by the forum software.
16 pedestrians are killed in the states every day by drivered cars, I think we should ban all cars until we can reduce that to zero.
And how many "drivered cars" are there in comparison to driverless ones?
People are far too trusting in this technology IMO.
In reality it will be the first piece of much evidence used to ensure that in the not too far off future no pedestrians or cyclists will be allowed on roads with autonomous vehicles ‘for their own safety’ and that people will only be able to cross on crossings by law even in the uk.
Unlikely, unless you're somehow also going to legislate against kids, animals, fallen trees, and any other type of obstacle that can turn up unannounced on a road. Which is the same reason Bez's article doesn't make much sense.
not a problem in the UK where the biggest thing most people hit is usually a malnourished badger
Deer, horses, cows and sheep cause a lot of damage, even potentially fatal for those in the car.
This is like the 1986 film Maximum Overdrive all over again.
From the article linked above:
Traveling at 38 mph in a 35 mph zone on Sunday night,
Regardless of whether the death was preventable, if confirmed this seems like a massive and very basic fail by autonomous software.
16 pedestrians are killed in the states every day by drivered cars, I think we should ban all cars until we can reduce that to zero.
Fair point, except autonomous vehicles aren't just being touted as a replacement for cars. https://www.wired.com/story/las-vegas-shuttle-crash-self-driving-autonomous/
you’d expect the human in the car to be ready to brake or take evasive action at any second if things didn’t look right.
I don't think you realise just how many miles are being racked up by autonomous cars in the USA. In May last year Google had done 3 million miles, and bearing in mind it took them from 2009 to 2016 to reach 2 million, the current total will be far higher, and that's just Google. These "tests" are becoming really routine operations now.
In theory the person should remain alert, but what do you reckon the chances are of that in reality?
But how many of those miles are trundling up and down the freeway between Silicon Valley and Vegas, easy as pie, and how many are nighttime passes along Mill St, Tempe past the college bars?
In theory the person should remain alert, but what do you reckon the chances are of that in reality?
Google were letting staff test the cars - they had to stop the tests because people were falling asleep. Remaining alert just wasn't possible.
We don’t live in a perfect world, no software will be perfect and no human will be. In both situations we still don’t have a need for software driven cars.
Except the approach you're advocating, maintaining the status quo, leaves the USA killing 36,000 people a year on their roads. That's even more than are killed with guns. Shouldn't we try to fix this problem if we can? If the introduction of driverless cars means you kill one less human being prematurely, isn't it worth it?
The police said that the vehicle was traveling 38 miles per hour in a 35 mile-per-hour zone, according to the Chronicle, though a Google Street View shot of the roadway taken last July shows a speed limit of 45 miles per hour along that stretch of road.
We don’t live in a perfect world, no software will be perfect and no human will be. In both situations we still don’t have a need for software driven cars.
Whilst there are all kinds of arguments for and against, it cannot be denied that our roads are a dangerous place, and always have been, with thousands of people killed and seriously injured in the UK alone, every year. And it might come as a surprise to some, but we have a pretty good standard of driving compared to many countries.
Most of those accidents are caused by human factors, so there is a lot of sense in removing the human.
Clearly software, and the engineers creating it, are not infallible. It comes with its own set of inadequacies, which are still somewhat unknown. However, with the danger already present on our roads, we should always strive to improve conditions.
In a capitalist society money will always be the driving force. But to make money, a product must have value. There is very much a need to make our roads safer.
Butcher, that's right. But software won't be the answer when used in the real world.
In the real world software will fail.
Better training of drivers would be a good idea. Or we just accept that driving can be dangerous. Bit like riding a mountain bike, you can kill yourself.
If technology can make cars safer then we should be focusing on how to integrate it into cars with drivers. Eg. making speed limits compulsory.
Who the **** let Uber put driverless cars on the roads?
Every single thing about that company's MO is dodgy as hell and will use every legal and illegal trick in the book to further their own ends - acting fast to stay ahead of regulation. The most irresponsible form of capitalism.
As a software engineer - I think it's a bonkers idea for so many reasons.
Top reason - machine vision just isn't sophisticated enough. I don't mean the cameras... no processor and software algorithm will ever get anywhere close to beating the human brain in its ability to make sense of the world around it. The road network and the world in which it exists is just too complicated - there is too much going on - too many objects look too similar - an infinite number of possible scenes to try and interpret. The human brain can make sense of all this clutter in an instant, put it all in context, filter the noise and immediately derive meaning in order to react. It's hard enough to get a machine to recognise "this is a lamp post", "this is a person". It's always going to be screwing up, misinterpreting and making mistakes - fatal ones - and it will do for at least the next 100 years I reckon.
Second reason... this is primarily because machine vision doesn't work - the framework that the car exists in (the roads) needs to be vastly simplified and standardised before you even try this... as in, the roads need to be designed for autonomous cars (not the other way around). There need to be sensors that the car can talk to every 100 yards to work out where it is. Realistically pedestrians can't be allowed anywhere near. You need to limit to an absolute minimum the variables that the car has to deal with. Think about it - we still have people driving trains! All they have to control is a throttle and a brake, and the only time they are near people is in a few well-defined locations (stations).
We do have a few roads currently that driverless cars have a chance of working on... motorways. I think it might be possible for motorways. But why? What is so terrible about driving a car? I get really bored sitting in the passenger seat - I don't get the need at all.
Nothing to add, but if you think about it, this car has now killed more people than the Terminator... cos the Terminator is a science fiction character... but maybe Skynet is coming...
Butcher, that's right. But software won't be the answer when used in the real world.
In the real world software will fail.
Better training of drivers would be a good idea. Or we just accept that driving can be dangerous. Bit like riding a mountain bike, you can kill yourself.
In the real world people fail all the time. Like it or not, this is happening.
As a software engineer – I think it's a bonkers idea for so many reasons.
Top reason – machine vision just isn't sophisticated enough. I don't mean the cameras... no processor and software algorithm will ever get anywhere close to beating the human brain in its ability to make sense of the world around it.
A number of massive tech corporations (not least of which Google) disagree with you. As above, this is going to happen. It won't happen for every household, everywhere in the world at once, but it is happening.
Second Reason…this is primarily because machine vision doesn’t work – The framework that the car exists in (the roads) needs to be vastly simplified and standardised before you even try this…
They are already trying it. There are fully autonomous cars on our roads, and more and more US states and countries are opening up their legislation in order to facilitate the testing and ultimately the switch to autonomous cars. The iPad generation aren't interested in driving; they'd rather play with their phone, which is what they do now, while driving.
If self-driving cars take over as our primary form of transport, they'll kill an awful lot of people. Just not in a spectacular way that generates news headlines.
We need to stop building towns and cities on the self-fulfilling assumption people will travel by car. There is no future in which humans can sit down all day without paying an enormous health price. If driverless cars appear in streets anything like today’s, we risk falling into the most pathetic of robot uprisings, where they transport us helpfully from place to place while we remain inactive, growing fat and increasing our risk of cancer and diabetes.
"As a software engineer – I think it's a bonkers idea for so many reasons."
Googles car uses LIDAR as well. This is what it sees:
[image: the Google car's LIDAR view (google_car.jpg, broken link)]
Top reason – machine vision just isn’t sophisticated enough. I don’t mean the cameras…no processer and software algorithm will ever get anywhere close to beating the human brain in its ability to make sense of the world around it.
Is it not a case of: they don't necessarily need to know what they're seeing, but they do need to know it's there and not hit it?
As an aside, an algorithm has now been developed which means that AI using LIDAR can see round corners (the processing part of the algorithm is fast but the collecting-data bit is very slow, so it's a ways off being useful yet), which I think is pretty cool!
They are already trying it
I know - that is why it is bonkers!
expect more carnage - or more likely - expect a lot of money to be wasted producing something that isn't safe and is eventually scrapped
As a human I think putting human-driven cars on the road is a terrible idea and the sooner they are replaced by autonomous vehicles the better for everyone.
[i]Trimix wrote:[/i]
Or we just accept that driving can be dangerous.
So we shouldn't bother doing anything to make it less dangerous?
[i]ndthornton wrote:[/i]
As a software engineer – I think it's a bonkers idea for so many reasons.
All of which appear to be based on the premise that driverless cars have to be perfectly safe. I've got news for you - the current system screws up, misinterprets and makes mistakes, fatal ones. It's kind of like the scenario where you're out walking and come across a bear - you only have to be able to run faster than the other person, not the bear. As mentioned numerous times on this thread, we already accept a ridiculously high death rate on the roads, and any argument against autonomous cars on safety grounds applies even more to human drivers. I'm fairly confident that current autonomous cars are already safer on average than human drivers.
Think about it – we still have people driving trains! All they have to control is a throttle and a brake, the only time they are near people is in a few well defined locations (stations)
DLR. Though a lot of the trains on the mainline network could run completely autonomously (and largely do) - the tech is all in place.
But why! what is so terrible about driving a car – I get really bored sitting in the passenger seat – I don’t get the need at all.
Well apart from the thousands of people killed on the roads every year, no, no need at all 🙄
expect a lot of money to be wasted producing something that isn’t safe and is eventually scrapped
That's pretty much the opposite of what I'm expecting. And as pointed out (repeatedly) it doesn't need to be perfectly safe, just safer than the current mess we have.
[i]ndthornton wrote:[/i]
expect more carnage – or more likely – expect a lot of money to be wasted producing something that isn’t safe and is eventually scrapped
What, more than we have already? I'm still not sure you appreciate just how flawed the current system is.
no processer and software algorithm will ever get anywhere close to beating the human brain in its ability to make sense of the world around it
As a software engineer I'm surprised you take that view. You'd be denying the idea that software and processors develop, which clearly they do, very quickly.
Googles car uses LIDAR as well. This is what it sees:
But what does it interpret from all that - that's the important bit. I can interpret that as an intersection in a busy town. Not quite so easy for a machine.
Even if it interprets perfectly, I think you can agree it's lacking in detail. Would you be comfortable driving past a school if that was the view through your eyes?
Is it not a case of they don’t necessarily need to know what they’re seeing but they do need to know its there and not hit it?
Then expect to spend a lot of time sat in the road not moving drumming your fingers on the dashboard 🙂
I’m fairly confident that current autonomous cars are already safer on average than human drivers.
And what are you basing that on? There's not exactly a huge pool of data out there in terms of autonomous car safety in the real world, is there? In comparison, there are far too many cars on the road for you to make a judgement on how dangerous they are just by reading the news.
Last year in the US there were 1.18 fatalities per 100 million vehicle miles.
Have we had a 100 million autonomous vehicle miles yet for this 1 fatality?
I don't know the answer to that, and it's statistically insignificant anyway with just 1 fatality - but I'd be interested to know.
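As a rough sketch, the break-even arithmetic above is easy to check. The autonomous-mileage figure below is a made-up assumption purely for illustration; as the post says, the real total is unknown and a single fatality is statistically insignificant.

```python
# Compare fatality rates per 100 million vehicle miles, using the
# human-driver figure quoted in the thread (1.18 per 100M miles).

HUMAN_RATE_PER_100M = 1.18  # US fatalities per 100 million vehicle miles

def rate_per_100m_miles(fatalities, miles):
    """Fatalities per 100 million miles for a given mileage total."""
    return fatalities / miles * 100_000_000

# Assumed autonomous mileage -- illustrative only, not a real statistic.
autonomous_miles = 5_000_000
autonomous_rate = rate_per_100m_miles(1, autonomous_miles)

# Mileage at which 1 fatality would merely match the human-driver rate:
breakeven_miles = 100_000_000 / HUMAN_RATE_PER_100M

print(f"autonomous rate: {autonomous_rate:.2f} per 100M miles")
print(f"break-even mileage for 1 fatality: {breakeven_miles:,.0f} miles")
```

In other words, a fleet would need roughly 85 million miles per fatality just to match human drivers on this crude measure, which is why the sample size matters so much.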
I wonder what rules the people in the 'driverless' cars have, in terms of when they should take control.
For example if you see a pedestrian looking like they might cross without looking properly (on the phone, drifting to kerb edge), do you:
a). Take over and slow down just in case (as you probably would if driving a normal car)
b). Wait until the car is a certain distance away to see if the car spots the issue (probably not if the pedestrian is still on the pavement but maybe it's a factor in some systems)
c). Wait and see what happens hoping you can brake quickly enough if the pedestrian does suddenly step out onto the road and the car doesn't auto brake?
Maybe it depends how experienced the guardian 'driver' is, I can see if you keep stepping in early (before a situation occurs) the car is never going to learn and you're not going to gather useful data. Over time you probably start to trust the car decision making more and give it more leeway, but obviously leave it too late and an accident is going to happen.
But what does it interpret from all that
Probably more than you do.
It'll have seen everything on and around the junction (and for 100m down the road as well) and also be tracking it in real time.
I’m fairly confident that current autonomous cars are already safer on average than human drivers.
This might have been true last week. Doubtful now.
We need to stop building towns and cities on the self-fulfilling assumption people will travel by car.
This I do agree with. Ultimately we will accept a number of accidents because of the benefits the road network brings to our society. It goes without saying that it's a fundamental piece of infrastructure, so much so, that any alternative is to most people unthinkable.
We probably really should be thinking about the alternatives though. Our roads are a hostile and unpleasant environment, and in an age where we can be in various places without actually physically travelling, there is less need for the cars to be on them. We have the potential to work local again and reduce our reliance on cars. We can encourage healthier, less damaging forms of transport. And something where an autonomous model might excel is in public transport. There's a lot of scope for improvement in the wider picture and I think we have to look well beyond the way we travel now.
There's some more context here: Tempe police chief says early probe shows no fault by Uber. Seems like she stepped out in front of the car.
Edit: to remove random garbage put in by the forum software.
Please, don't let whoever coded this forum anywhere near the team developing the driverless car software
I wonder what rules the people in the ‘driverless’ cars have, in terms of when they should take control.
For example if you see a pedestrian looking like they might cross without looking properly (on the phone, drifting to kerb edge), do you:
People often assume that the decision is the difficult bit. In reality, just identifying the fact that there IS a human and it IS looking at its phone... This level of detail is impossible to start with.
I don't understand why we think making road transport autonomous is just round the corner when we haven't even been able to make railways autonomous which should be several orders of magnitude easier than making cars and tracks autonomous.
My view is that there is no way an autonomous vehicle can interpret the road and its surroundings as well as a good, experienced, driver. Without knowing it, good drivers often spot hazard cues long before they are identifiable as actual hazards and the brain is a very good filter of useful/irrelevant information.
However, it's likely that an autonomous vehicle will be significantly better than a large number of the other drivers who grace our roads.
If we ever get to a future where autonomous vehicles are far more prevalent, one side effect is that there will gradually be fewer of these experienced drivers, as people generally spend less of their time behind the wheel. You'll end up with an average standard which may be slightly higher than at present, and hopefully with a slightly lower mortality.
Expecting these vehicles to cut massively the number of casualties is unrealistic.
[i]ndthornton wrote:[/i]
People often assume that the decision is the difficult bit. In reality, just identifying the fact that there IS a human and it IS looking at its phone… This level of detail is impossible to start with.
No, it really isn't. What sort of software do you do? Because you seem to have a very poor understanding of just how good sensing and AI systems are nowadays. It's an area where a lot of work has been done as part of this development (one I touched upon with work I did many years ago, but it's moved on a lot since then) and it's pretty amazing how well they can interpret the environment.
I'm basing my judgement of the relative safety on my knowledge of how good those systems are and how poor the average human is (if you search you'll find plenty of reports of how they did better than humans would have in incidents they encountered, and it's a piece of piss to find reports of where people were killed by human drivers in incidents autonomous cars would easily have avoided).
[i]chakaping wrote:[/i]
This might have been true last week. Doubtful now.
😆 - you're basing that on a single incident which from all available information it's very unlikely would have resulted in a different outcome had a human been driving?
We make judgements on pedestrians as we drive and drive accordingly. We even change according to the time of day and who is likely to be about.
A 40 year old walking along in work clothes at lunch time is pretty low risk unless Greggs is on the other side of the road.
Kids could do anything anytime.
Old people are slow but sometime deaf, can't see well and don't judge speed so well.
Drunks and bag ladies are likely to step out and wave a digit at you
Pub and nightclub turnout is best crawled past.
So we make a calculation, adapt speed, and maybe cover the brake, and maybe give a wide berth, and maybe even put the hazards on to warn people behind us.
We're so good at it that people who really want to kill themselves generally choose a train rather than a car or a truck. We might stop or slow down enough to only injure them. In future suicidal people will be able to choose any vehicle which will at least give train drivers a less traumatic time.
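The kind of judgement described above amounts to a lookup plus a response rule. The categories below paraphrase the post; the scores and responses are invented purely for illustration, and a real system would obviously be vastly more nuanced.

```python
# Toy sketch of the pedestrian-risk heuristic described in the post:
# map a pedestrian category to a caution level, then pick a response.

RISK = {
    "adult_commuter": 1,   # low risk, unless Greggs is over the road
    "child": 3,            # could do anything, anytime
    "elderly": 2,          # slow, may not see or judge speed well
    "intoxicated": 3,      # likely to step out
    "pub_turnout": 3,      # best crawled past
}

def response(category):
    """Return a driving response for a pedestrian category."""
    level = RISK.get(category, 2)  # unknown pedestrians get mid caution
    if level >= 3:
        return "slow to crawl, wide berth, cover the brake"
    if level == 2:
        return "reduce speed, cover the brake"
    return "monitor"
```

The hard part, as the thread notes, is not this lookup but reliably producing the category in the first place from raw sensor data.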
[i]uponthedowns wrote:[/i]
I don’t understand why we think making road transport autonomous is just round the corner when we haven’t even been able to make railways autonomous which should be several orders of magnitude easier than making cars and tracks autonomous.
Already done that one. DLR. I also know some people who work on systems for mainline trains, and those could be fully autonomous now on some lines, the driver is pretty much redundant in reality.
So what's stopping it happening? ASLEF?
Martinhutch
My view is that there is no way an autonomous vehicle can interpret the road and its surroundings as well as a good, experienced, driver.
What about mathematics? Can a machine be better at calculations than a human? What about chess? Can a machine be better than the best human at chess? What about Go?
There are superhuman AIs which are better than humans in many ways. Driving will just be the next thing on the list machines will be better at.
In almost any critical life or death decision, I'll take an algorithm over human "judgement". There will come a time when human will look back and think it reckless that we let them behind the wheel of non-autonomous vehicles.
aracer
Are you saying that it's possible, both in terms of hardware and software, to create a system that can resolve...
a = "Child distracted by social media"
b = "Child looking carefully at traffic"
c = "Any one of an infinite number of similar looking scenarios"
...and be able to prove compliance 100% of the time in 100% of these scenarios with 100% reliability (because that is the level of verification required to get new technology on to a production vehicle)
I'll go back to the trains... automation would be easy - almost trivial. The fact that it hasn't happened should be a big alarm bell, considering the problem and the risk are many orders of magnitude bigger with cars.
As for what I do - can't say, as a lot of it is classified - but I've worked on autonomous vehicles that use LIDAR, machine vision and many other sensors. Would I still cycle to work if these vehicles were driving on the road - hahahaha no.
I've also spent a lot of time in the automotive sector, which is safety-critical - so I know how much verification and testing is required to change one line of code to modify the colour of the handbrake warning light!
Driverless cars...
You're living in a dream world, Neo 🙂