Philip Koopman’s Post

Autonomous Vehicle Safety, Embedded Software, UL 4600, Consulting, (He/him.) Personal account; likes/shares are interest and not endorsements; lack of response does not imply agreement.

Zoox had an injury crash in Menlo Park on August 23rd while in autonomous operation. (Said to be minor injuries.) Briefly, the Zoox vehicle made a right and then slowed down for an almost immediate left turn instead of maintaining traffic flow speed (looks like under 100 feet before slowing for a turn -- only one house lot of distance). Brake and signal lights were on. Another vehicle hit the Zoox from behind at about 24 mph with airbag deployment. The two Zoox operators sustained minor injuries; the other driver's injury status is unknown, but the other vehicle hit a tree.

In sharp contrast to the recent blame-ridden Cruise injury report, this report gives a factual description.

From a crash investigation point of view it will be important to understand how much time margin there was when the Zoox vehicle pulled out in front of the other vehicle. Did the Zoox vehicle basically cut the other driver off with insufficient reaction time given the other vehicle's speed (whatever it might have been)? Or did the other driver have ample time and simply not pay attention?

Given the incredibly adversarial way other companies have been playing these reports, this is a chance for Zoox to show they are serious about safety transparency. What was the other driver's reaction time? What could have been done differently (what will you do differently) next time to avoid similar crashes given what happened this time?

The promise of autonomous vehicle safety will only be reached if lessons are actually learned. Here is Zoox's chance to show their safety culture and transparency. https://lnkd.in/e5n4sHAs
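As a rough illustration of the kind of margin analysis an investigation would need, here is a minimal kinematic sketch of how much reaction time a following driver has when a vehicle pulls out ahead and then slows. Every number and the function name below are hypothetical placeholders for illustration, not values from the Zoox report.

```python
# Hedged sketch: back-of-the-envelope reaction-time margin for a vehicle that
# pulls out ahead of approaching traffic and then slows for a turn.
# All inputs are made-up placeholders, NOT data from the crash report.

MPH_TO_FPS = 5280.0 / 3600.0  # 1 mph ~= 1.467 ft/s

def available_reaction_time(gap_ft, follower_mph, lead_mph):
    """Time until the follower closes the initial gap if neither vehicle changes
    speed -- a crude upper bound on the follower's reaction margin."""
    closing_fps = (follower_mph - lead_mph) * MPH_TO_FPS
    if closing_fps <= 0:
        return float("inf")  # follower is not closing; no conflict
    return gap_ft / closing_fps

# Example with invented numbers: follower at 35 mph, lead car slowed to 10 mph
# for the left turn, initial gap of 80 ft (roughly "one house lot").
t = available_reaction_time(gap_ft=80, follower_mph=35, lead_mph=10)
print(f"available reaction time ~ {t:.1f} s")  # ~2.2 s with these inputs
```

Whether the actual margin was comfortable or hopeless depends entirely on the real gap and speeds, which is exactly the data the report would need to disclose.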

Philip Koopman

Autonomous Vehicle Safety, Embedded Software, UL 4600, Consulting, (He/him.) Personal account; likes/shares are interest and not endorsements; lack of response does not imply agreement.

1y

Hypothetical Zoox failure mode that would explain this crash (assuming that indeed the Zoox vehicle left an unreasonably short reaction time for the other vehicle by "cutting in front of it" -- which is not yet known).

- Step 1: The Zoox planner is making a right turn. It sees it has enough time to get onto Santa Cruz Ave and accelerate. By the time it has accelerated, following distance for the oncoming vehicle will be, say, 2 seconds (or similar). This plan is deemed safe, but does not account for the imminent left turn.
- Step 2: Having made the turn, the Zoox planner now decides to go slow instead of speeding up because of the left turn. The following human-driven vehicle is now presented with the sudden appearance of a slow-moving vehicle and has insufficient reaction time to avoid a crash.

If this speculative scenario is what happened, a contributing cause would be a tactical planner that is just checking the safety of the next maneuver without considering that a tightly packed series of maneuvers will invalidate the usual assumptions for the first maneuver. Do we know this is what happened -- no. But it is the type of explanation I would expect if indeed the Zoox vehicle went slow after cutting out in front of traffic.
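To make the speculative failure mode concrete, here is a toy sketch of the planner pattern being described: each maneuver is checked in isolation against default assumptions, so the gap accepted for the right turn assumes the AV will accelerate to traffic speed, which the imminent left turn then invalidates. This is purely illustrative pseudologic with invented names and thresholds, not Zoox's planner.

```python
# Illustrative-only sketch of the hypothesized planner flaw: the gap-acceptance
# check for the right turn assumes the AV will reach traffic-flow speed, an
# assumption broken by the left turn ~100 ft later. Names/thresholds are made up.

def gap_is_acceptable(gap_s, assumed_exit_speed_mph, traffic_speed_mph):
    """Toy acceptance test: the gap is 'safe' only if the AV will roughly match
    traffic speed after merging. Thresholds here are arbitrary."""
    return gap_s >= 2.0 and assumed_exit_speed_mph >= 0.8 * traffic_speed_mph

# Myopic check (the hypothesized bug): right turn evaluated with the default
# "accelerate to traffic flow" assumption, ignoring the imminent left turn.
print(gap_is_acceptable(gap_s=2.0, assumed_exit_speed_mph=30, traffic_speed_mph=30))  # True

# Coupled check: the left turn caps the achievable exit speed, so the same gap
# should be rejected (or the maneuver sequence re-planned).
print(gap_is_acceptable(gap_s=2.0, assumed_exit_speed_mph=10, traffic_speed_mph=30))  # False
```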

Phil Benzel

Mixed System Architecture Definition and Design

1y

Referring back to the title of your book "How Safe Is Safe Enough?": while there will always be areas that can and most certainly will be improved over time, I would say that the answer is when the accident/injury rate for autonomous (or nearly autonomous) vehicles is less than that for conventional vehicles per million miles driven. The simple statistics show that this was achieved years ago. As a Tesla owner running FSD beta during a drive on the freeway yesterday, I -- or more correctly stated, my car -- avoided an accident at 70 mph that I most likely would not have been able to avoid if I were driving. By the time I even noticed a problem, my car was responding... My extra 0.5-0.7 sec (or ~60 ft) of response time would have resulted in me being involved as well. Generally speaking, rear-end incidents are nearly always the fault of the person in back, and I would think that this was probably the case here as well: a human not paying 100% attention to 100% of the external situations 100% of the time.
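For what it's worth, the ~60 ft figure is consistent with the quoted reaction-time window at freeway speed. A quick check, assuming the 70 mph the commenter mentions:

```python
# Quick arithmetic check of the commenter's numbers: distance covered at 70 mph
# during a 0.5-0.7 s human reaction delay.
speed_fps = 70 * 5280 / 3600          # 70 mph ~= 102.7 ft/s
for t in (0.5, 0.6, 0.7):
    print(f"{t:.1f} s -> {speed_fps * t:.0f} ft")  # ~51, ~62, ~72 ft
```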

Doug Macdonald

Lifelong innovation enthusiast, recovering advocate for PLM, seeking value at the intersection of markets and technology

1y

This doesn’t even seem to qualify as a “corner case”. This is a typical driving situation where a human driver would be aware of the risks and, with a little planning ahead, could easily navigate it.

Joseph Munaretto, PhD

Algorithm Developer & Data Scientist

1y

I go for runs on this exact route. It is essentially like crossing over a street -- you need to wait for both directions to be clear before entering and then “turning”.

Puneet Jindal

Top Voice | Improve your data for your Generative AI and CNN, RNNs as well

1y

Does that mean Zoox didn't wait for both sides to be clear?

Behrouz G

Functional Safety Manager

1y

Getting rear-ended is the leading cause of AV accidents in California. Good and observant human drivers are always mindful of getting rear-ended and take actions to avoid these types of accidents. I have never been rear-ended (touch wood :) ), not by luck but by doing the following:

1. When braking, watching the vehicle approaching from behind, and removing my foot from the brake and moving forward a few feet to create more distance/room for the other vehicle to stop without hitting my car.
2. When slowing down/braking, estimating the approach of the vehicle from behind and moving to the shoulder of the road when it seemed likely that a collision would have happened if I didn't get out of the way.
3. When slowing down to make a left or right turn, abandoning the turn and hitting the accelerator when estimating that the following vehicle might hit my car.
4. Giving an indication of a turn well in advance and never hitting the brake hard.

To me it doesn't matter if legally the other driver is at fault; what matters to me is doing all I can to avoid an accident even if it is going to be the other driver's fault. Today's AVs are not capable of doing that; they are just sitting ducks.

I haven't checked this source but it appears the stretch has inherent challenges. Sharing as information only. https://univpark.org/content/safe-issue-3-crosswalk-sharonsanta-cruz
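A minimal sketch of what the defensive behavior described in the list above could look like as decision logic. This is illustrative only: the function names, thresholds, and the constant-deceleration model are assumptions for the sketch, not anything taken from an actual AV stack.

```python
# Illustrative decision sketch of rear-end-threat mitigation while slowing for a
# turn, loosely following the commenter's habits. All thresholds are arbitrary.

MPH_TO_FPS = 5280.0 / 3600.0

def follower_decel_needed(gap_ft, follower_mph, own_mph):
    """Constant-deceleration estimate (ft/s^2) the follower needs to avoid
    closing the gap, ignoring its reaction delay -- a deliberately crude model."""
    closing_fps = (follower_mph - own_mph) * MPH_TO_FPS
    if closing_fps <= 0:
        return 0.0
    return closing_fps ** 2 / (2.0 * gap_ft)

def mitigation(gap_ft, follower_mph, own_mph, comfortable_decel=10.0, max_decel=20.0):
    """Pick a response: keep slowing, abandon the turn, or get out of the way."""
    need = follower_decel_needed(gap_ft, follower_mph, own_mph)
    if need <= comfortable_decel:
        return "continue turn"              # follower can stop comfortably
    if need <= max_decel:
        return "abandon turn, accelerate"   # shrink the speed differential
    return "move to shoulder if clear"      # collision likely otherwise

print(mitigation(gap_ft=80, follower_mph=35, own_mph=10))  # "continue turn"
```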

Rick Comiskey

ANSYS Enterprise Automotive Group

1y

Back in the old days, auto companies were forced to buy expensive test facilities for proving-ground validation of their products. Now you just need a pass from your local and state government to use the public roads as your test course. Think of the money saved.

Dr. Ruben Strenzke

Autonomous Vehicle Third-Party Safety Assessments by TÜV SÜD

1y

Although technically very different from the Cruise crash, the underlying issue might be the same. Who defines where and how an AV must behave w.r.t. that specific lane type, and how much reaction time for cutting in is enough to be safe/legal? In Singapore we have the TR68, where we could define this. What is the national or international standard that will answer these (and many more) questions?
