
Pedestrian killed by self-driving car



  • 1 month later...


I need to add a little more regarding my belief that current so-called self-driving vehicles are far from safe enough to be allowed on public roads. Whatever the sensitivity of their sensors, their interpretive and reactive capabilities are primitive compared with those of human drivers. The term "false positive" summarises the need, in doubtful cases, to bring to bear a very large amount of background knowledge, experience, and weighing of probabilities and of the consequences of error that the automatic systems just do not have. The same applies in many safety-critical systems.

From Telegraph 8th May. (A full report expected soon.)

________________

The self-driving car which killed 49-year-old Elaine Herzberg in Phoenix, Arizona in March saw the pedestrian as a "false positive", causing its on-board system to decide to ignore her rather than swerve to avoid the crash, according to The Information.

The car's sensors detected the pedestrian, but according to Uber's internal investigation into the crash, the self-driving car had been tuned to ignore obstacles it didn't deem a risk.

Self-driving cars have been having problems with so-called "false positives" - avoiding small objects that human drivers would normally ignore or drive over.

The car had been programmed to ignore more of these warnings, so that the car can drive over certain obstacles, such as a plastic bag floating in front of it, rather than seeing them as a risk and swerving to avoid them.

However, the car's software was programmed in such a way that it ignored Herzberg crossing the road despite seeing her with its sensors, causing the deadly crash.
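
To make concrete what that "tuning" means, here is a toy sketch - entirely my own illustration in Python, not Uber's actual software - of how a single confidence threshold trades nuisance braking against missed real hazards:

```python
# Toy illustration only -- my own sketch, not Uber's software.
# A perception system tags each detected object with a confidence
# that it is a genuine hazard. Raising the threshold suppresses
# "false positives" (plastic bags, exhaust steam) but also risks
# discarding ambiguous real hazards, such as a pedestrian.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # classifier's best guess at what the object is
    confidence: float   # 0.0 .. 1.0: how sure it is this is a hazard

def hazards_to_act_on(detections, threshold):
    """Keep only detections the planner will brake or swerve for."""
    return [d.label for d in detections if d.confidence >= threshold]

frame = [
    Detection("plastic bag", 0.15),
    Detection("pedestrian with bicycle", 0.55),   # ambiguous silhouette
    Detection("car", 0.95),
]

print(hazards_to_act_on(frame, threshold=0.40))  # cautious: pedestrian kept
print(hazards_to_act_on(frame, threshold=0.70))  # "tuned": pedestrian ignored
```

The numbers are invented, but the shape of the trade-off is the point: one dial moved to stop the car flinching at litter also moves the point at which a person is dismissed.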

 

 


Never mind what I think - it has been some years since I had something of a leading role in earlier stages of AI - but see what a current leading expert has said at the Royal Society. (The Royal Society!  Not an honour I ever came near enjoying!)

From the Telegraph article:

“The 41-year-old has made several major breakthroughs in AI, most famously leading the development of AlphaGo, software that defeated the best human players of the Chinese board game Go.”

“Speaking at the Royal Society in London, Dr Hassabis said: ‘If we are going to have self-driving cars, well maybe we should test them before putting them on the road rather than beta testing them live on the road like what we have now. Is that responsible, really?’”

https://www.telegraph.co.uk/technology/2018/05/13/ai-expert-warns-self-driving-car-dangers/?WT.mc_id=e_DM755070&WT.tsrc=email&etype=Edi_Cit_New_AEM_Daily&utm_source=email&utm_medium=Edi_Cit_New_AEM_Daily_2018_05_14&utm_campaign=DM755070


  • 2 weeks later...

Today's FT carries a piece reporting some details of the operation of the Uber "self-driving" car that resulted in the pedestrian death. I only spotted it as a library was closing and have no record of, or access to, it now. Hence I can give only the key basic facts I recall, all in my own words.

Both the car's radar and lidar systems "saw something" ahead 6 seconds before impact, when travelling at 43 mph. It braked 1.3 seconds before impact, having taken the interval to interpret, analyse, recognise - call it what you will - that it was not something to ignore. No alert at any time was given to the "supervising" human driver. It had been programmed not to brake - at all, as I read it - on initial detection, to avoid irregular/unpredictable behaviour.
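
To put those figures in perspective, some quick arithmetic (mine; the braking rate of 7 m/s² is my own assumption for a dry road, the rest comes from the report):

```python
# Arithmetic on the reported figures: 43 mph, detection 6 s out,
# braking 1.3 s out. The 7 m/s^2 deceleration is my assumption
# (roughly full emergency braking on dry tarmac), not from the report.

MPH_TO_MS = 0.44704
v = 43 * MPH_TO_MS                        # ~19.2 m/s

range_at_detection = 6.0 * v              # ~115 m of warning
range_when_braking = 1.3 * v              # ~25 m left when it acted

decel = 7.0                               # m/s^2, assumed
stopping_distance = v ** 2 / (2 * decel)  # ~26 m needed for a full stop
stopping_time = v / decel                 # ~2.7 s

print(f"range at first detection:  {range_at_detection:.0f} m")
print(f"range when brakes applied: {range_when_braking:.0f} m")
print(f"distance needed to stop:   {stopping_distance:.0f} m "
      f"({stopping_time:.1f} s)")
```

On those numbers the car had roughly 115 m of warning and needed only about 26 m to stop, yet began braking with about 25 m left.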

How that car's operation differed from what any attentive human driver would have done in the, admittedly difficult, circumstances needs no comment. Beyond that, it is difficult to fathom how a system could have been approved that gave no prompts to the "supervisor" while there was ample time to slow and stop, or steer away from impact.

In my opinion the responsible design-approving engineer should appear in court on a charge amounting to culpable homicide. Whatever the verdict, he should be sacked and stripped of whatever professional status he holds. If his status is not at least that of Chartered Engineer or the US equivalent, he should not have been appointed. If the Uber technical team was not organised within a professionally qualified design-approval hierarchy, they should be closed down.

As well as the obviously necessary changes to the general operating logic, as a basis for technical redesign the computing hardware needs to be 10 times more powerful, and correspondingly faster, as a minimum. That car was disgracefully unfit to be on the road.

 


It's behind the FT paywall - however, if you search for "Self-driving car crash report highlights Uber safety shortcomings", you will get a link to the FT page which you can then read.

34 minutes ago, mdavies said:

It had been programmed not to brake

Sounds a bit worse than that: "The top US accident investigators also found that Uber had disabled the car’s emergency braking system but did not install a warning system to alert the human safety driver who would take control in an emergency."
 

 


Thank you for digging out that way of accessing the FT report, Chris, and for quoting that important sentence.  I had about 30 seconds to scan the piece and concentrated on the key data.

In fact though, the internet version does not correspond very closely to the one printed in the paper.  It gives all the same messages, but the printed one is more detailed on specifics (43 mph, not "about 40") and gives the timings of 6 and 1.3 seconds. On the other hand it does not contain as many quotations and comments. I want to mention that in case anyone concludes that my version was not well founded.

Adding to my comments about the qualifications and organisation of the design and approval teams - I do trust they were not one and the same - I wonder too about the State and Highway authorities that gave permission for the vehicle to be on the public road. What inspections and technical analysis happened before permission was given?

I suspect the rapid close-down of the programme and cessation of testing was to pre-empt the likely conclusions of any enquiry. Too much that would not stand the light of day, perhaps - as a purely personal and uninformed opinion, of course.



Just read there's been another accident with a Tesla on Autopilot. It totaled a parked police car in California. The article says it's the third accident with an emergency vehicle, with two fire trucks being hit by autonomous Teslas earlier this year.


Escape, re that case: I don't want to bore on about this topic, so I've copied below only the extracts that made me gasp when reading a report on that Tesla - travelling on what Tesla calls Autopilot, but which is really an "assistance system" - that drove into the parked police car. The general facts have been widely reported.

Autopilot covers lane centring, lane changing [Really??!!], adaptive cruise control and parking things. Extracts:

------

Tesla vehicles using the partially autonomous system have been involved in several accidents caused by difficulty detecting stationary objects, with two previous instances already this year. Earlier this month, a Tesla vehicle using Autopilot accelerated into a stationary fire engine near Salt Lake City, Utah.

------

Tesla has stated in its instruction manual that Autopilot is not designed to avoid collisions, and may not brake or decelerate for stationary vehicles.   [!!!!!!!!]

-------

......before a driver can use autopilot, they must accept a dialogue box which states that ‘autopilot is designed for use on highways that have a centre divider and clear lane markings’.

------

 The reason I am incredulous is that from the above official Tesla information, Autopilot, in appropriate use on clearly marked roads, may not stop behind a stationary vehicle.   So if you or I are brought to a halt by traffic, better not have a Tesla coming up behind!

And Tesla has the gall to state that they have not addressed what is surely a primary design requirement! How are the vehicles allowed on the road?
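
For what it may be worth, my guess at the mechanism - an assumption on my part, since the manual states the behaviour but not the reason: radar measures closing speed directly, and cruise systems built on it commonly discard returns whose computed ground speed is near zero, as "clutter" from signs, bridges and parked cars. A toy sketch of that filtering:

```python
# Toy sketch of stationary-target filtering -- my assumption about the
# mechanism, not Tesla's documented design. Radar gives the closing
# speed of each return (Doppler); subtracting it from our own speed
# gives the target's speed over the ground. Returns near 0 ground speed
# are often discarded as roadside clutter (signs, bridges, parked cars)
# -- which would explain a system that "may not brake or decelerate
# for stationary vehicles".

def ground_speed(own_speed_ms, closing_speed_ms):
    # A target closing at exactly our own speed is standing still.
    return own_speed_ms - closing_speed_ms

def keep_target(own_speed_ms, closing_speed_ms, clutter_band_ms=1.0):
    """Discard targets whose ground speed sits inside the clutter band."""
    return abs(ground_speed(own_speed_ms, closing_speed_ms)) > clutter_band_ms

own = 27.0  # ~60 mph

print(keep_target(own, closing_speed_ms=10.0))  # True: slower car ahead, tracked
print(keep_target(own, closing_speed_ms=27.0))  # False: stationary car dropped!
```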

 

(I have left out some references to other Tesla crashes.)

 

https://eandt.theiet.org/content/articles/2018/05/tesla-car-running-on-autopilot-crashes-into-parked-police-car/?utm_source=Adestra&utm_campaign=New EandT News - Automation FINAL - MEMBER&utm_medium=Newsletters - E%26T News&utm_content=E%26T News - Members&utm_term=https%3A%2F%2Feandt.theiet.org%2Fcontent%2Farticles%2F2018%2F05%2Ftesla-car-running-on-autopilot-crashes-into-parked-police-car%2F


As it happens, just an hour ago I had an "our cars" chat (very amicable!) with the driver of a new and shiny-looking Tesla at a garage where I was filling up the Evora. (Shiny, if not new.) Of course, after we had paid the usual compliments about each other's vehicles, compared acceleration figures etc. - he would beat me to 60, but I would leave him after that - I enquired about his Autopilot use.

He uses it a great deal, on motorways and similar roads, and generally it works well, maintaining distance and speed in heavy traffic. However, it was "too logical and precise" in following the centre of lanes - because of course we humans take account of the type and width of the vehicles in the lanes beside us, and of their lane positioning. Fiat 500 or a giant artic? Dead centre is often not what we choose, and so he would find himself fighting the Autopilot - at which point it disengages automatically. [I did not say "fingers crossed".]
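
Purely to illustrate the human habit he described - this is my own toy sketch, not anything Tesla does - bias the lane position away from the wider neighbour rather than holding the geometric centre:

```python
# Toy sketch (mine, not Tesla's logic) of the human lane-position habit
# described above: edge away from a wide vehicle alongside, rather than
# holding the exact geometric centre of the lane.

def lateral_offset_m(left_vehicle_width_m, right_vehicle_width_m,
                     gain=0.15, max_offset_m=0.4):
    """Positive result = shift right; shift away from the wider neighbour."""
    bias = gain * (left_vehicle_width_m - right_vehicle_width_m)
    return max(-max_offset_m, min(max_offset_m, bias))

print(lateral_offset_m(2.55, 1.63))  # giant artic on the left -> shift right
print(lateral_offset_m(1.63, 1.63))  # similar cars both sides -> hold centre
```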

Of course I left the "stationary vehicles" topic until the end of our talk. Interesting, and highly relevant to my post above, is that though he was a sharp youngish chap - not an old fogey, as he may now be reporting about me - he was not aware that Tesla has "stated in its instruction manual that Autopilot is not designed to avoid collisions, and may not brake or decelerate for stationary vehicles". In his experience it did work in that respect.

If the case is that it usually does, but may not, "brake or decelerate for stationary vehicles", I see that as a far worse situation: not only is it liable to fail at any time, but it fails unexpectedly, having previously lulled the driver into excessive dependence. Of course that may no longer be so - but why then is Tesla not announcing loudly that the cause of the previous smashes - I will not call them accidents - has been rectified?

Could it be that in the litigious US culture, explicitly acknowledging a fault could clear the way to multi-, multi-million dollar claims? Some people have been hurt, and in one case, with cause not yet attributed, two youngsters killed.

 

 



Good info, Mel! That does put the Autopilot in a totally different light. I agree a system that usually works but can't be trusted 100% is by far the worst-case scenario, especially if it is something used specifically to lighten the attention level needed. If the Autopilot doesn't take into account the width of vehicles in adjacent lanes, I wonder how it would react if a nearby vehicle crossed the line just a bit. I saw this happen when an artic took its turn a bit wide and the trailer used a good chunk of the next lane. The human driver there quickly moved to the other edge and all was fine, luckily.


  • 2 weeks later...

I have previously made clear my views about the current state of development of any “self-driving” cars and the, IMO, remaining gulf to anything that should be allowed into general traffic.  This post is regarding only the Tesla (in “autopilot” mode) that crashed into a concrete barrier on 23rd March, killing the “driver”. It is extracted from a Guardian piece on 8th June. I have made the final phrase bold. I believe it requires no further comment. (Underlines are for Guardian links.)

 “......four seconds before ....... the car stopped following the path of a vehicle in front of it. Three seconds before the impact, it sped up from 62mph to 70.8mph. The car did not brake or steer away, the National Transportation Safety Board said.  ..........  The Tesla battery was breached, causing the car to be engulfed in flames” .

The Guardian goes on to say: “......the company has repeatedly sought to deflect blame on to the driver and the local highway conditions............ Musk has also aggressively attacked journalists writing about this crash and other recent autopilot collisions, complaining that the negative attention would discourage people from using his technology.”

https://www.theguardian.com/technology/2018/jun/07/tesla-fatal-crash-silicon-valley-autopilot-mode-report


Tesla's branding of Autopilot is poor and misleading. Plus they are beta testing with lives. The only one worse is Uber. Tesla's camera supplier (Mobileye) disowned them and gave up the business rather than be associated with the "car crash" that was coming, stating that the capabilities were being over-hyped.


Andy, and others: something potentially of interest to anyone who drives in the UK. I do, and we have Teslas on the roads here, hence I highlight such information.

This is about Tesla testing here by Thatcham, currently in the BBC News Technology section. It is of special interest to Tesla owners such as the chap I spoke with a few days ago, who regularly uses the Autopilot on motorways; it seemed he was quite unaware of any such issues. The BBC piece covers more, but here is an important extract.

"To demonstrate the dangers of partial automation, Thatcham took a Tesla out on its test track at Upper Heyford, in Oxfordshire.

With the Autopilot system switched on, the Model S kept in lane and slowed to a halt when a car it was following encountered standing traffic.

But on a second run the car in front switched lanes at the last moment, and the Tesla was unable to brake in time, running into a stationary vehicle."

Just the sort of lane switch that happens. An unfortunate stationary driver would likely not even have been aware there was a Tesla a couple of cars back. A dangerous vehicle lurking in attack range, I will allow myself to say!   

 

https://www.bbc.co.uk/news/technology-44439523


The above Thatcham experiment was properly videoed - dramatically!

https://www.bbc.co.uk/news/av/business-44460980/this-car-is-on-autopilot-what-happens-next

At the end Tesla basically says "not a problem".  Thatcham disagrees. I know who I believe, both re the validity of the experiment and the conclusion.



Our Honda Accord has Honda's version of this. The car can maintain its central position in a lane. It has adaptive cruise control, which can maintain a set distance from the car in front even if the car in front slows down and then speeds back up again. The car can also brake automatically if need be. It always reminds you to keep your hands on the wheel. I presume the servo motor that drives the steering wheel also has a feedback loop that senses a human making slight steering inputs.

Is it described as autopilot in the manual? No. Does it do a decent job? Yes. Would I use it in any sort of really busy traffic? Not on your life. The car can sometimes brake on freeways if you are going into a right- or left-hand sweeping bend. If you can imagine, a vehicle that enters the radar's range as you slowly turn is registered by the radar as not doing 100 km/h, for example. The car at that speed knows the distance will be closed quite quickly, and it brakes - quite severely in some cases. Enough so that, depending on the road, my wife and I don't use the adaptive cruise control, as the car can brake so heavily that the following car could easily run into you, since they, invariably, are driving too close anyway.
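
A toy sketch of why that happens, as I understand it (my own illustration, not Honda's algorithm): on a bend the forward radar can momentarily treat a car in the adjacent, curving lane as in-path; the time gap to it then looks to have collapsed, and a simple brake law reacts hard:

```python
# Toy sketch (mine, not Honda's algorithm) of ACC false braking on a
# sweeping bend. The radar beam points straight ahead, so on a curve it
# can pick up a car in the adjacent lane and treat it as in-path. The
# apparent time gap collapses and a simple brake law reacts severely.

def acc_brake_command(own_speed_ms, target_range_m, closing_speed_ms,
                      min_time_gap_s=1.8):
    """Return braking severity in 0..1 (0 = none, 1 = maximum)."""
    if closing_speed_ms <= 0:
        return 0.0                          # target pulling away: no action
    time_gap_s = target_range_m / own_speed_ms
    if time_gap_s >= min_time_gap_s:
        return 0.0                          # comfortable gap
    # Severity grows as the gap collapses -- hence the harsh braking
    # when a bend makes a side-lane car look like a slow car dead ahead.
    return min(1.0, (min_time_gap_s - time_gap_s) / min_time_gap_s)

own = 27.8  # 100 km/h
print(acc_brake_command(own, target_range_m=25, closing_speed_ms=15))  # ~0.5: hard brake
print(acc_brake_command(own, target_range_m=60, closing_speed_ms=-2))  # 0.0: ignored
```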

The Lane Keep Assist System (LKAS - maybe it sounds better in Japanese) also cannot account for a vehicle just in front of or beside you wandering into your lane.

I don't consider these systems automated by a long shot.

 


  • 2 weeks later...
  • 2 months later...

Fortunately no injuries from this one (several similar are mentioned in the report), but it deserves a place on this thread for illustrating one of the fundamental issues addressed by the initial posts: incompatibility between automated and human drivers. I believe that issue will be a major one when, and if, automated cars are introduced in significant numbers, and that it could lead to far more serious collisions. (I will not use the term "accident" for such circumstances.)

Part of report below: see https://www.bbc.co.uk/news/technology-45380373 for the full version.

_________________

“An Apple test vehicle in autonomous mode was rear-ended while preparing to merge onto Lawrence Expressway South from Kifer Road,” the incident description reads.

“The Apple test vehicle was travelling less than 1 mph waiting for a safe gap to complete the merge when a 2016 Nissan Leaf contacted the Apple test vehicle at approximately 15mph.

"Both vehicles sustained damage and no injuries were reported by either party.”

The DMV does not attribute blame in its reports. Self-driving cars being rear-ended, however, might be considered a trend. A recent report by investigative technology news site The Information revealed teething problems at Waymo, the self-driving car company spun out of Google, where there have been headaches caused by what humans might consider over-cautious driving.

The self-driving cars would stop abruptly in scenarios where humans might zip through, such as turning across a line of traffic.

"As a result, human drivers from time to time have rear-ended the Waymo vans,” the report noted.


  • 4 weeks later...

No comment, other than I expect they will be seeking Elise and Caterham owners to take part in the “platooning for merging and de-merging scenarios”, just to check they will be able to cut into the platoons in order to get to their exit slip road.

 

... construction of a 1.2 million square foot test track for driverless cars .....

.....Horiba Mira also rejected calls to locate the facility elsewhere, saying that if it was built further south of the field it would be shortened in length which would prevent many of the tests being carried out, including autonomous truck platooning for merging and de-merging scenarios with fully laden vehicles.

https://eandt.theiet.org/content/articles/2018/09/battle-of-bosworth-site-to-be-covered-by-driverless-test-track-despite-local-opposition

 


 

44 minutes ago, mdavies said:

.....Horiba Mira also rejected calls to locate the facility elsewhere, saying that if it was built further south of the field it would be shortened in length which would prevent many of the tests being carried out, including autonomous truck platooning for merging and de-merging scenarios with fully laden vehicles.

https://eandt.theiet.org/content/articles/2018/09/battle-of-bosworth-site-to-be-covered-by-driverless-test-track-despite-local-opposition

Bloody hell - paving over the Battle of Bosworth: just how much did the councillors get to approve this?! This is one of the very few things I remember being taught in my school history lessons - one of the real landmark events in British history.


Indeed a shame, March, but looking to the future rather than the past, I am more worried by the prospect of trying to live with those platoons on our major roads. "Live with" in both senses of course.


  • 5 weeks later...

That's all very well, but, but, but...........

https://www.telegraph.co.uk/technology/2018/10/25/paramedics-told-watch-driverless-cars-driving-accidents

What about the circumstances where a no-driver car (mis)behaves in a way that forces a driven vehicle to swerve to avoid impact? Where, as a consequence, the driven car hits another vehicle, perhaps scraping past a parked one? Or hits a lamp post or a tree and its occupants are injured? Or even hits a person? There are a myriad potential circumstances where the no-driver car suffers no, or negligible, impact, its sensors - whatever they are - are not triggered, and it drives on "obliviously". Or perhaps it is slightly damaged and stops, and the occupants of the victim vehicle get out and, "untrained", do "the wrong thing" and are injured by the no-driver car, perhaps now with damaged sensors?

What about no-driver cars and cyclists? By (mis)behaviour or by “not seeing” causing a “brush-by”, or passing just too close, causing the cyclist, even if not entirely blameless, to fall to the ground. And the no-driver car, oblivious, simply carries on? Witnesses? What witnesses?

It is irrelevant that such things happen with driven cars. Of course they do - but driven cars stop. Their drivers investigate and do whatever is needed; it is a serious offence not to.

Oh, they will be insured, that solves everything.  Indeed..................


 


Remarkable, IMO! Two extracts from the link below; note the second. I trust Waymo will be sued to blazes for every injury and collision they cause.

 

Published by Business Insider.

 

“Waymo received a permit from California's Department of Motor Vehicles (DMV) on Tuesday to test autonomous vehicles without human backup drivers on public roads, the DMV said in a statement.”

 

“But an August report from The Information suggested Waymo's self-driving technology struggles with more driving tasks than the company has indicated. The publication said Waymo vehicles have difficulty making unprotected left turns, distinguishing between individuals in a large group, and merging into turn lanes and highway traffic, among other trouble areas.”

 

https://www.hl.co.uk/news/2018/11/1/waymo-is-the-first-company-thats-allowed-to-test-autonomous-cars-without-a-backup-driver-in-california


  • 11 months later...

Despite its slightly historic title I've reused this thread rather than creating a new one because, IMO, it would be a pity to lose track of the material on it.

The topic has been quiescent here for nearly a year, but certainly not so the attempts by manufacturers to develop realities that approach the hype. This article from today's Telegraph Technology is worth a read in relation to the UK. Despite pointing out several big issues (just five - Ha! Only 5?!), it completely overlooks, for instance, the importance of overtaking: a huge subject covering the nature and driving style of the relevant vehicles, the likelihood of overtaking or being overtaken, and actions before and after. Each is a complex subject in its own right.

https://www.telegraph.co.uk/technology/2019/10/02/roundabouts-country-lanes-will-driverless-cars-cope-british/

Then again today, I see the following. Good luck with that. It's not "vision" itself, it's the interpretation that matters. A sign of worry and of seeking help, I suggest. 

https://www.hl.co.uk/news/2019/10/2/tesla-is-reportedly-buying-a-computer-vision-company-to-help-it-make-the-technology-for-self-driving-cars-tsla?cid=halDM89070&bid=357711102&e_cti=481268&e_ct=T&utm_source=AdobeCampaign&utm_medium=email&utm_campaign=EONRM_Daily Newsroom_Opt in_02.10.19&theSource=EONRM&Override=1

