
Pedestrian killed by self-driving car


Recommended Posts

FT Breaking News

"Uber has pulled all its self-driving cars off the road as police in Arizona investigate a fatal collision involving one of its autonomous vehicles.  The incident is believed to be the first time a pedestrian has been killed by a self-driving car, after a Tesla driver died in a crash while using its Autopilot system in 2016."

Sad, but almost inevitable given the rush to deploy the new technology when, at most, it has been developed to the point where it can "just about work". That is well ahead of the huge further development needed to make it robust, degrade gracefully, have fallback modes, "care about itself and its own car", anticipate things that might happen though nothing is yet apparent, and judge possibilities from the driving styles and types of other vehicles, etc. In short, "human-equivalent abilities", despite the errors we all make. Masses more work is needed (and many further constraints on human drivers).

My greatest concern is that the "intelligent system" is not afraid of death or serious injury itself. We are 101% careful and cautious when pulling out of a minor road into fast traffic on a main one. Get it wrong and we will suffer far more from a side impact than the main-road vehicle will from a frontal one - will the system show the same caution?

(I write having held a leading role in AI for some years.)

(Mods, sure you will move to an appropriate topic if one already exists.)

 




Buddsy, I should have been clear that my AI work ended a few years ago (retired). But I still keep an interest in the practical side of the field.

I am aware of Geordie Rose, but his quantum world is way, way ahead of anything practical at present! Yes, the potential prizes are such that major organisations are trying to build practical quantum computers - but "trying" is the word. Decoherence (the collapse of quantum states) and error rates are such that many physical qubits are needed to get a reliable output from just one or two logical ones.
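
(For the curious, the redundancy idea can be sketched in purely classical terms - a loose analogy only, nothing like real quantum error correction, and the figures below are made up for illustration: spend many noisy "physical" readings to recover one reliable "logical" answer.)

```python
import random

def noisy_read(true_bit: int, error_rate: float) -> int:
    """Read the bit, flipping it with probability error_rate (a 'physical' reading)."""
    return true_bit ^ (random.random() < error_rate)

def majority_vote(true_bit: int, copies: int, error_rate: float) -> int:
    """Recover one 'logical' bit from several noisy readings by majority vote."""
    ones = sum(noisy_read(true_bit, error_rate) for _ in range(copies))
    return 1 if ones > copies / 2 else 0

trials = 10_000
err_single = sum(majority_vote(1, 1, 0.1) != 1 for _ in range(trials)) / trials
err_voted = sum(majority_vote(1, 9, 0.1) != 1 for _ in range(trials)) / trials
# A single 10%-noisy reading is wrong ~10% of the time;
# nine readings with a majority vote are wrong well under 1% of the time.
print(f"single reading error ≈ {err_single:.3f}, 9-way majority error ≈ {err_voted:.3f}")
```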

If self-driving cars had any operation dependent on quantum calcs, I'd be even more scared!  (But Geordie Rose speaks of interesting things - Google will tell you far more than I can!)

 


Not the place for this (and I am not the person!) but: yes Buddsy, the D-Wave 2000Q is a computer, but not as we know them.

Highly, highly specialised and (in this model) aimed at a particular type of mathematical problem. The "annealing" technique - essentially trying many candidate solutions of a problem with several variable factors to find "the best" by some measure - is a well-known approach. It is not generally very efficient, though, and on conventional machinery it is the sort of thing used where a precise calculated analysis just isn't practicable (the problem is too complex, some relationships between variables are ill-defined, or it would simply take too long).

Note that the glossy leaflet (below) speaks, in terms, of "problems in areas such as", not of "solving the area", and of "multiple solutions". Indeed, with a qubit approach and the massive surrounding infrastructure needed to link the qubits to the real world - the one we are ordinarily aware of - I expect some central element of many problem areas can be presented as, and addressed by, annealing approaches: in effect, parallel qubit analysis of many candidate solutions. (There are other, more direct applications too, in things like cryptography and the many-body interactions between sub-atomic particles produced by the LHC, for instance.) But it is not "computing" in today's terms.
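
(For anyone wondering what the classical cousin of the idea looks like, here is a minimal sketch of simulated annealing on a toy problem. It is purely illustrative - it says nothing about how D-Wave's hardware works internally, and the cost function and parameters are invented for the example.)

```python
import math
import random

def cost(x: float) -> float:
    """Toy cost function with several local minima."""
    return 0.1 * x * x + math.sin(3 * x)

def simulated_annealing(start: float, temp: float = 10.0,
                        cooling: float = 0.95, steps: int = 2000) -> float:
    """Classical simulated annealing: accept worse moves with a probability
    that shrinks as the 'temperature' falls, so the search can escape local minima."""
    current = best = start
    for _ in range(steps):
        candidate = current + random.uniform(-1.0, 1.0)   # try a nearby solution
        delta = cost(candidate) - cost(current)
        # Always accept improvements; occasionally accept worse moves while hot.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = candidate
            if cost(current) < cost(best):
                best = current
        temp *= cooling                                    # cool the system down
    return best

print("approximate minimum near x =", round(simulated_annealing(start=8.0), 3))
```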

It's an exciting area and potentially heralds a revolution in computing, but not one I would suggest for self-driving cars!

https://www.dwavesys.com/sites/default/files/D-Wave 2000Q Tech Collateral_0117F_0.pdf

PS: Apart from superficial quantum principles, my D-Wave knowledge doesn't go beyond that leaflet! No more here, please!


The headlines really do have to be put in context. Google, Uber, Tesla and the rest have driven millions of kilometres in self-driving and fully autonomous modes, either in the real world or under test. All accidents thus far have been down to human error.

This is the first pedestrian fatality. In the US alone, over 100 people are killed on the roads every day by human drivers, around 16 of them pedestrians. Globally there are over 1.2 million road fatalities a year.

Whilst Uber have done the right thing, it's doubtful that autonomous mode was at fault. Given there was a driver on board who is supposed to intervene, it's more than likely to be human error, perhaps even the pedestrian, sad though that is.

Autonomy is going to happen and if that means human error can be virtually eliminated, I'm all for it.

As long as I can still drive the Esprit 😊


hedgerley, yes, I think the general situation re self-driving cars is broadly understood. But generalities are not what we experience from behind a steering wheel - matters are highly personal! I am primarily concerned at the premature introduction of SDCs. I entirely share the principle of your last sentence, but how will you feel when your precious Esprit is hit by an SDC and its human occupants shrug and refer you to some US techy outfit - particularly in circumstances that no human "other driver" would have entered?

Who will you exchange eye contact/nods/waves with at tricky junctions, or in congested circumstances? (Absolutely essential for safely managing a crossroads near me.) How will you tell that an SDC joining your high-speed road from a minor one has registered your 60 mph approach when you can't see anyone look at you? How confident will you be in risking overtakes of one or two of a string of SDCs (and they will string, on their identical systems) as they accelerate at a quarter of the rate you can safely manage on a safe overtaking road? How will you feel about heavy restrictions introduced on roads for all traffic to accommodate SDCs' limited capabilities?

And it's just not so that all SDC accidents have been due to human error. They have shot red lights, driven into the back of vehicles, simply not seen things, and of course control has had to be taken over by human monitors.

Yes, I expect SDCs to happen - but I'm concerned about sharing the road with them. In any car, not only my Lotus. They have been developed to operate in a formalised, specified, routine, definable, and substantially predictable world. I don't find the human driving world to be quite like that, and certainly not free of exceptional happenings. (Though I expect it to become rigorously more so, to accommodate SDCs.) My first post above mentioned some of the "soft characteristics" that work I was involved in found to be essential in a hugely simpler domain - air traffic.

I hope I've made my position clear! No more about this!



 

Utterly tragic, and whilst not privy to the full details, it's obvious to anyone with even a little intelligence that this was going to happen.

 

... autonomous vehicles are not needed, full stop...

trams, rail cars, airport parking pods... yes, but not on public roads where the variables are too huge..

An example where an autonomous vehicle would fail every time:

I took my son to work on Friday in a heavy snow flurry. Some bits of road were wet, some were tracked snow. When I approached a roundabout I could see tracks in the snow where cars had locked up approaching the junction and had gone straight on a bit... I was going quite slowly but slowed even more. It was basically packed hard snow; snow tyres were ineffective, you just had no friction...

I made adjustments based on my experience - seeing the road surface ahead and the packed snow, and knowing how near-zero traction behaves - and made it round the roundabout..

5 mins later I returned after dropping the lad off... there was a car into the railings, a glancing blow - he was lucky...

Two hours later, on the return journey, the railings were obliterated, with two cars stacked well and truly into them....

I picked the lad up and took him to show him the crash site (he's having lessons at the mo, so it was a good experience). I then took him to an empty industrial car park and made him drive round to experience how a small amount of snow changes the grip level... He couldn't believe how the car just did not want to stop, it just slid and slid.. I turned off traction control and he tried to accelerate.... It was a great lesson.. one I've taken my wife on and will my daughter..

Upon leaving and driving home, the lad says to me out of the blue:

"I'd like to see an autonomous vehicle do what you did at that roundabout, dad."

Despite being quite proud that the lad recognised that experience - reading a number of factors - had prevented us from hitting the railings... he has a point..

 

There is no way that change in surface grip could be programmed for an autonomous car..

I really have a dislike for autonomous cars; they have no purpose.. if you're in it you may as well drive it!.... It's like someone reinventing the wheel and making it square..

 

 


Your points are well made Mel and I wholeheartedly agree with them. I have direct experience of autonomous driving and of convoys of linked cars travelling at speed a metre or two apart, as well as being a passenger in a car driven at 60mph towards a large concrete block, relying on radar, lasers and cameras to pull us up in time. It's scary stuff, even in the controlled environment of a test track without other traffic and drivers to worry about! There is much to overcome before autonomous vehicles become ubiquitous, as you rightly point out.

But let me be clear, I was not responding to your post; my comments were aimed squarely at the sensationalist reporting of the pedestrian death and the need for context, as I clearly mention, that's all.


Mark, thanks for putting me right on your angle. Sorry, a very busy day and I didn't pay enough attention (good job no driving was involved....)

Yes, context essential, and perhaps this sad accident was unavoidable by any driver at all. Expect/hope it will be well reported in due course. I bow to your related practical experience - fascinating!  Can't quite say envious! You are better qualified to comment than most - any more details you can give?

Buddsy, coincidentally with our earlier posts, from another area entirely I've just been looking at something near the leading edge of work on practical management of quantum bits. The sort of thing that could lead to more computing "but not as we know it". (Was it Einstein that said the quantum world was stranger than we could know?  Except for Geordie Rose, I guess.)  Nice pictures anyway.

https://phys.org/news/2018-03-quantum-bits-dimensions.html

 

 

 


Even with the current state of technology it seems fair to say autonomous vehicles will reduce the chance of accidents and will, all in all, be safer than today's eating, drinking, phoning and texting drivers. However, there will remain situations where autonomous cars are equally incapable of avoiding accidents, including fatal ones.

How society reacts to these accidents is a whole other matter. We are fully accustomed to fatal car accidents where one human's fault leads to the potential death of another, just as we are trained to accept that one human's fault can lead to his own death because another human could not avoid the accident. But how does society react if a computer's fault leads to a human's death, or if a human's fault leads to a fatal accident that even the computer couldn't avoid (which seems to be the case here)?

Share on other sites

There's a name for the paradox (the trolley problem), but what choice does an autonomous car make when it has the option of crashing into a line of people standing in the road or driving off the road into a river? Does it harm the people on the road or its occupant(s)?



The only autonomous car that I have seen was the 'Johnny Cab' and I think that went pretty badly for the car in the end.


14 hours ago, mdavies said:

Yes, context essential, and perhaps this sad accident was unavoidable by any driver at all. Expect/hope it will be well reported in due course. I bow to your related practical experience - fascinating!  Can't quite say envious! You are better qualified to comment than most - any more details you can give?

In a previous life Mel I was fortunate, amongst other things, to be responsible for strategic relationships with vehicle manufacturers, transport authorities, mobile device vendors, traffic information suppliers and other motoring organisations. I spent time consulting with Ford, GM, Rover, Toyota etc., even Rolls-Royce/Bentley, about the integration of mobile devices and services into their products, several years before they actually appeared. I was trialling GM's OnStar in a wintry Detroit a couple of years before it made production. I was into traffic telematics, integration of in-vehicle comms with positioning systems (GPS and cellular), traffic information, road-tolling trials etc. Interesting times given the immaturity of the technology back then. Now with in-vehicle Wi-Fi, Bluetooth, 5G etc. we don't give it a second thought, notwithstanding the comments above!

I happened to be delivering a paper at a telematics conference in Yokohama when Toyota hosted a session at their test track where the aforementioned buttock-clenching demos took place. I was in the front passenger seat surrounded by laptops, loose wiring and assorted devices as we headed towards a very solid object when the driver took his feet off the pedals and hands off the wheel. I survived, obviously. Then we did the convoy thing round the high-speed bowl a few metres apart. Spooky. More scary though was the small earthquake we had in the middle of the night - I was on the 20th floor of the hotel by the harbour......:blush:

Earlier in my career I struck gold when McLaren happened to call the office seeking advice on mobile performance, external antenna (or not - Murray wouldn't allow one.....) and data comms - for the F1 road car. I visited the Woking production facility several times in 93/94. For a petrolhead it was nirvana ;).


Err..................................................................I've a bit of a problem with my interior light switch, Mark.  What do you recommend?!

Great stuff - and I suppose you were paid as well!   My being able to choose a few interesting company cars doesn't really compare does it!

One perhaps relevant thought (re TBD's mention above of phoning and texting drivers): in your professional discussions, was there any consideration of avoiding driver distraction in serious ways? Such as greatly limiting screen displays and operations when the engine is running? Perhaps only the simplest physical touch operations on very simple screen displays, with any further options by voice only? Jamming phone handsets inside the car? Very limited satnav operation by steering-wheel buttons only? Those are top-of-the-head illustrations, not proposals. I'm only asking whether such things were ever considered - on the basis that driving itself is quite enough of a challenge.

 

 

 


Having just watched the footage of the accident several times, firstly I wouldn't have been pushing a bike across what looks to be a dual-carriageway, without any lights, and in the dark, and secondly I doubt whether I'd have been able to stop in time from that speed anyway. In my opinion, it's highly likely the accident would still have happened with a human driver behind the wheel.


When I was involved Mel in-vehicle screens simply weren't around and it was before the era of the smartphone, apps, detailed on-board navigation systems etc. We were playing with the basic technology, seeing if it could work and it was mostly done in a controlled environment. It would be a few years before issues around the HCI, usability etc would need to be factored in. By then I'd moved on. But I agree, I think when on the move screen access should be limited, probably to basic voice commands truth be told. 

By the way, I've now seen the video of the Arizona fatality. The car is travelling at 38 mph. From the pedestrian appearing in the headlights to impact is about one second. Apparently the sensors did not detect the woman, so yes, there are questions to be asked. Having said that, there is no way an autonomous vehicle could have seen her, reacted, applied the brakes and stopped in time: the laws of physics still apply. Equally, there is no way the driver could have pulled up, and she (the safety driver) has in fact been released by the police. Tragic though this is, the fault lies with the pedestrian. The fact the car was in autonomous mode is irrelevant in my view, although there does appear to be an issue with the technology in this case. It will be interesting to see what the investigation throws up.
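
(To put rough numbers on the "laws of physics" point, here is a back-of-the-envelope stopping-distance sum. The speed is the 38 mph from the video; the reaction time and braking rate are typical textbook assumptions, not figures from the investigation.)

```python
MPH_TO_MS = 0.44704          # metres per second in one mph

speed = 38 * MPH_TO_MS       # ~17.0 m/s, the reported speed of the Uber car
reaction_time = 1.0          # s, a typical perception-reaction time (assumed)
deceleration = 7.0           # m/s^2, roughly full braking on dry asphalt (assumed)

thinking_distance = speed * reaction_time            # distance covered before braking starts
braking_distance = speed ** 2 / (2 * deceleration)   # v^2 / 2a under constant deceleration
total = thinking_distance + braking_distance

print(f"speed            : {speed:.1f} m/s")
print(f"thinking distance: {thinking_distance:.1f} m")
print(f"braking distance : {braking_distance:.1f} m")
print(f"total            : {total:.1f} m")
# With the pedestrian visible for only ~1 s (~17 m) before impact, the total
# stopping distance on these assumptions (~38 m) comfortably exceeds that.
```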


Tragic as this is, there is NO WAY that this kind of accident will ever be 'programmed' out of this technology.

The only way to reduce the chances of failure is to make EVERY car on the road have this technology AND to permanently remove pedestrians/deer/dogs etc from the driving environment.

That might (?) just be possible on motorways but certainly not on the roads around where I live. Do I drive to a motorway 'transit hub' in my own car and then get in one of these vehicles? What happens at the other end when I want to end up in another rural area? (isn't this what park and ride already achieves?)

This is technological know-how determining the future and I just don't see it making sense. I agree that tech can and probably will help save the planet but why, in all honesty, do we need this?

Does that make me a dinosaur or simply very, very intelligent? B-)

Is the price for that bit in Yen or £?


Actually the video just shows the limits of the current technology, nothing else. The pedestrian was obviously already more than halfway across the street when the Uber car hit her; she just wasn't visible in the darkness. She had been there quite some time - the car just didn't see her.

As such the accident had little to do with the fundamental issues concerning autonomous cars; it was an unacceptable limitation of Uber's implementation. Seeing through darkness may be a technological challenge, but it is surely a solvable one.

 


Just seen the video. As I kicked off this topic I must comment, but the key point is that, IMO, that video is no basis for making a judgement. Just three observations on what can be seen in the video as presented.

1. The forward lighting projected from the car is very poor. Very, very poor. (I have never driven with lighting apparently that weak.) I wonder whether that is an effect of the video, or of the driver not going to main beam - as he might imagine there is no need, given the auto sensing/control?

2. The unlit woman (crazy behaviour indeed, but not the point) is indeed apparent only a very short time before impact. With decent lighting, would that have been longer?

3. The question is not whether the car could have stopped - unlikely verging on the impossible - but whether a human driver would have instinctively (no thought process) swerved away. (And perhaps had already put the lights up.) Hands on the wheel, of course, as opposed to that "monitor". (What a job!) No judgement, but I see that as a valid and open question. And that is the question: not who was at fault, but would a human have swerved (whether effective or not)? And why was there no trace of a swerve from the auto system? I think the answer to the second, of course, is that the system had no such instinctive reaction or ability.

 

 

 


I don't know the specs of the camera used for the footage, but there are several images of the same area from other dashcams which suggest it is far better lit. That might explain why the headlights in the official vid seem to illuminate only a couple of metres ahead of the car?

If these images more accurately reflect what you or I could see, then I'd expect the human should have been able to see the pedestrian well in advance - had they not been playing with their phone or whatever. Being America, I am sure she will be sued and deemed liable. Horrendous job - sit there and do nothing for a full shift, but be ready to intervene should anything go wrong....

[Dashcam image of the same stretch of road, showing it far better lit]
 


Informative - the whole article is very relevant.

"Tempe Police Chief Sylvia Moir has said the SUV would probably not be found at fault, but two experts who viewed the video said the SUV’s sensors should have spotted the 49-year-old woman and her bicycle in time to brake.

Reviewing the video footage, the experts concluded that it appears there was enough time and distance to avoid the collision."

https://eandt.theiet.org/content/articles/2018/03/us-grants-100m-for-driverless-car-research-days-after-uber-death-in-arizona/


  • Gold FFM

What about the elephant in the room? The human observer was quite obviously distracted by a mobile device. You don't look away from the road and grin at times for no apparent reason.

If the human observer is in the car to pay attention, that was clearly not what was happening. Others have said above that the sooner cars are automated, the better. I beg to disagree. Here is an automated car with a human observer who thought it was OK to take their attention away from the road because the car is 'automated'.

I would be reluctant to hop in one.


  • 2 weeks later...

This post is not "because there has been another fatal auto-driving crash".  (Tesla this time by the way.) Issues surrounding them have been fairly well covered already.

No, it is because I am amazed by the apparently trivially simplistic mechanism the Tesla lane-following process was using, as indicated by the amateur "reconstruction" covered by the link. That the system was, it seems, merely trying to find a lane-edge line, and when it failed (line obscured, missing for some distance, whatever - such failures are "normal") it simply kept veering further - into a wall - is incredible.

As a bare minimum I would expect the system, at any and every point, to know how many lanes there are, their widths, and that the road does not extend indefinitely, etc. - and hence be able to adjust its guidance and its seek process and criteria dynamically, as a human does.
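
(To illustrate the sort of fallback I mean, here is a minimal sketch of lane-keeping logic that stops chasing a stale cue when the lane-edge line disappears. It bears no relation to Tesla's actual Autopilot code; the thresholds, names and map interface are invented purely for illustration.)

```python
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    TRACK_MARKINGS = auto()        # confident lane-edge detection
    HOLD_FROM_MAP = auto()         # markings lost: fall back to stored lane geometry
    ALERT_AND_DISENGAGE = auto()   # nothing trustworthy left: hand control back

@dataclass
class LaneEstimate:
    marking_confidence: float   # 0..1, how sure the vision system is about the lane edge
    camera_offset_m: float      # lateral offset from lane centre, per the camera
    map_offset_m: float         # lateral offset per stored map geometry + localisation

def choose_steering(est: LaneEstimate) -> tuple[Mode, float]:
    """Return (mode, lateral correction in metres).

    The point of the sketch: when the lane-edge line disappears, the controller
    does NOT keep steering on a stale or absent cue - it falls back to map
    geometry, and if that too is implausible it alerts the driver rather than veer."""
    if est.marking_confidence > 0.7:
        return Mode.TRACK_MARKINGS, -est.camera_offset_m
    if abs(est.map_offset_m) < 2.0:            # localisation against the map still plausible
        return Mode.HOLD_FROM_MAP, -est.map_offset_m
    return Mode.ALERT_AND_DISENGAGE, 0.0       # stop steering on bad data

# Example: paint fades out at a gore area, but the map still knows the lane,
# so the car holds its line instead of drifting toward the barrier.
mode, correction = choose_steering(LaneEstimate(0.2, 0.0, 0.3))
print(mode.name, f"- correct by {correction:+.1f} m")
```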

That, again, some human "tester" was not alert is (a) irrelevant for such a non-externally-imposed fault, and (b) only to be expected of anyone in - and having faith in - such a vehicle.

https://electrek.co/2018/04/02/tesla-fatal-autopilot-crash-recreation/

 

 

