Breaking: Self-Driving Cars Avoid Accident, Do Exactly What They Were Programmed To Do

from the I-can-and-will-do-that,-Dave dept

We just got done talking about how, after logging 1,011,338 autonomous miles since 2009, Google's automated cars have had just thirteen accidents -- none of which were the fault of the Google vehicles. By and large the technology appears to be working incredibly well, with most of the accidents the fault of inattentive human drivers rear-ending Google's specially-equipped Lexus SUVs at stop lights. But apparently, the fact that this technology is working well isn't quite interesting enough for the nation's technology press.

A Reuters report making the rounds earlier today proclaimed that two self-driving cars from Google and Delphi Automotive almost got into an accident this week in California. According to the Reuters report, Google's self-driving Lexus "cut off" Delphi's self-driving Audi, forcing the Audi to take "appropriate action." This apparently got the nation's technology media in a bit of a heated lather, with countless headlines detailing the "almost crash." The Washington Post was even quick to inform readers that the almost-crash "is now raising concerns over the technology."

Except it's not. Because not only did the cars not crash, it apparently wasn't even a close call. Both Delphi and Google spokespeople told Ars Technica that both cars did exactly what they were programmed to do and Reuters apparently made an automated mountain out of a molehill:
"I was there for the discussion with Reuters about automated vehicles," she told Ars by e-mail. "The story was taken completely out of context when describing a type of complex driving scenario that can occur in the real world. Our expert provided an example of a lane change scenario that our car recently experienced which, coincidentally, was with one of the Google cars also on the road at that time. It wasn’t a 'near miss' as described in the Reuters story."

Instead, she explained how this was a normal scenario, and the Delphi car performed admirably.

"Our car did exactly what it was supposed to," she wrote. "Our car saw the Google car move into the same lane as our car was planning to move into, but upon detecting that the lane was no longer open it decided to terminate the move and wait until it was clear again."
In other words, as Twitter's Nu Wexler observed, the two cars did exactly what they were programmed to do, though that's obviously a notably less sexy story than Reuters' apparently hallucinated tale of automated automotive incompetence.

Filed Under: accidents, autonomous vehicles, cars, driving, near miss, self-driving
Companies: delphi, google


Reader Comments



    TasMot (profile), 26 Jun 2015 @ 11:50am

    You got that new title wrong.

    Breaking: NOBODY GOT HURT IN SELF-DRIVING CAR NON-ACCIDENT. Self-driving cars avoid accident, doing exactly what they are programmed to do.

      David (profile), 26 Jun 2015 @ 12:19pm

      Re:

Actually I would make it:

      Braking: NOBODY GOT HURT IN SELF-DRIVING CAR NON-ACCIDENT. Self-driving cars avoid accident, doing exactly what they are programmed to do.

    Ninja (profile), 26 Jun 2015 @ 11:53am

    Breaking: Self-driving cars avoid accident, doing exactly what they are programmed to do

    This doesn't generate as much hype as

BREAKING: Skynet is upon us, self-driving cars nearly cause apocalypse. John Connor has been called in to help.

As much as I like driving, I look forward to a future where humans won't touch the wheel. Or a word processor, it seems, judging from the media's idiocy in this case :)

      Anonymous Coward, 29 Jun 2015 @ 11:37am

      Re:

      Agree with Ninja. I guaran-dang-tee you that self-driving cars will use their turn signals, will stop at stop signs and red lights, and will run with headlights operating.

      If people cannot be trusted to operate a complex piece of machinery, perhaps the machine can run itself better. No question in the case of cars. I doubt more than one person out of four driving today actually knows how to drive in compliance with traffic laws.

    Roger Strong (profile), 26 Jun 2015 @ 11:56am

    Keep in mind that 1,011,338 autonomous miles was almost always in the same small area, over and over and over.

    That area is mapped out electronically in far more detail than you find elsewhere. That includes mapping the location of traffic lights so that the car will notice them. Temporary traffic lights are a problem, and the car will not notice a police officer signaling the car to stop.

    And with much of the processing done in the cloud, be sure to limit your driving to areas with decent cellular coverage.

    The average person won't be able to operate a self-driving car in the next few years, but it IS wonderful progress.

    Taxi services will soon be able to use them. The dispatcher can check to see if the pick-up location, destination and points in between are suitable for a self-driving car. If not, they can send one of their remaining human drivers.

    I assume that this is Uber's business model. Use humans to build up their business using their own cars. They'll be established in many cities just in time for a self-driving fleet to become practical.

      jupiterkansas (profile), 26 Jun 2015 @ 12:16pm

      Re:

      The processing is not handled in the cloud. Each car is capable of driving using only its sensors and onboard computers. The cloud helps with maps and directions but isn't required.

      They can also recognize stoplights, but maybe not a police officer's hand signal - but that's a problem that's easily solved.

      And the areas may be mapped out - but they're mapped out by the cars themselves. Any area where self-driving cars spend any considerable time would also get mapped out in great detail. It's one of those situations where the more it's implemented, the better it works.

        Roger Strong (profile), 26 Jun 2015 @ 12:37pm

        Re: Re:

        Cloud computing is a big part of the system.

        Technology Review: Impressive progress hides major limitations of Google’s quest for automated driving.
        Maps have so far been prepared for only a few thousand miles of roadway, but achieving Google’s vision will require maintaining a constantly updating map of the nation’s millions of miles of roads and driveways.
        [...]
        If a new stop light appeared overnight, for example, the car wouldn’t know to obey it.
        [...]
        Among other unsolved problems, Google has yet to drive in snow, and Urmson says safety concerns preclude testing during heavy rains. Nor has it tackled big, open parking lots or multilevel garages.

          Anonymous Coward, 26 Jun 2015 @ 1:01pm

          Re: Re: Re:

Yeah, snow would be my major concern. A little loss of traction would throw off any navigation aids based on how far the car thinks it has traveled, heavy cloud cover could ruin GPS reception, and snow cover hides the road lines... Is the car capable of picking up the more subtle cues of where the road is?

          If they're scared to even test the thing in heavy rains, then it's nowhere near ready for actual deployment.

            Anonymous Coward, 26 Jun 2015 @ 1:07pm

            Re: Re: Re: Re:

            Baby steps, my dear sir. These are tests and proof of concept vehicles.

              Anonymous Coward, 26 Jun 2015 @ 2:04pm

              Re: Re: Re: Re: Re:

              ...These are tests and proof of concept vehicles...

              Which NOW need testing on ACTUAL roads, not controlled test tracks.

Supposedly Nevada has approved some of its highways for such testing, albeit with caveats, such as requiring that a licensed driver be in the vehicle even when not controlling it. And the 'tests' have their own caveats: IIRC the vehicles currently to be tested cannot enter or exit the highway, nor can they perform a passing maneuver.

          jupiterkansas (profile), 26 Jun 2015 @ 2:02pm

          Re: Re: Re:

          When a Google car sees a new permanent structure such as a light pole or sign that it wasn’t expecting it sends an alert and some data to a team at Google in charge of maintaining the map.
          [...]
          The car’s video cameras detect the color of a traffic light.


As I said, the cars themselves help map the terrain, so it's not as if they can't see what's around them and operate independently of maps in the cloud. And luckily, stoplights don't appear overnight. It would be easy for a city to work with Google to incorporate road changes and construction zones (or for the city to maintain the maps itself).

          Snow and ice are definitely serious issues though. Right now that means in certain types of weather, you'll just have to drive yourself.

          And these are today's cars - not cars 10 years from now. Go back and look at where they were in 2004, and I'm not too worried that these issues won't be overcome.

          Anonymous Coward, 26 Jun 2015 @ 2:42pm

          Re: Re: Re:

The car has to be able to deal with local conditions and changes all on its own, as it is never guaranteed to have an up-to-date map, especially when dealing with temporary lights and lane closures due to road works etc. Also, there is nothing to stop a passenger giving the car directions, like "take the next road," "drive on the right," or "stop here."
Spotting a stop light is no harder than spotting a cyclist, pedestrian, or other unmapped obstruction. All such processing has to be done on-board, as otherwise an Internet dropout could cause an accident.

          John Fenderson (profile), 29 Jun 2015 @ 3:07pm

          Re: Re: Re:

          "Cloud computing is a big part of the system."

          Yes, just as the commenter said, for maps and such. Not for processing. If processing is done in the cloud, then the whole system would be impractical for the foreseeable future.

      Anonymous Coward, 26 Jun 2015 @ 12:25pm

      Re:

      An Uber Johnny Cab would be great!
      https://youtu.be/IjRXyWFLkEY?t=109

      Anonymous Coward, 27 Jun 2015 @ 12:06am

      Re:

      the car will not notice a police officer signaling the car to stop.


      I've seen a Google car recognize hand signals from bicyclists. Unless you mean flashing lights and whatnot, then I'm sure they'll have a kill switch to tell the car to pull over.

    Anonymous Coward, 26 Jun 2015 @ 11:57am

Or "Automated Audi and Automated Lexus BETTER at driving than human Audi and Lexus drivers!!!" This comes with a clickbait-style all-caps word and exclamation points in a ready-to-use package. You're welcome, Weather.com and others. I hereby release this creation under CC with no restrictions.

    Hideous, 26 Jun 2015 @ 12:10pm

    robocars to use time-tested CSMA/CD "collision detection" protocol...

Autonomous vehicles will adopt the simple, well-understood CSMA/CD with BEB protocol from computer networking. When two autonomous vehicles see an open lane or intersection at the same time, they will both enter it. If they collide, they will sense the collision and back off. Each will choose a pseudorandom delay time and proceed after it expires. Each car that collides again will back off and delay for twice the previous delay time before attempting to go again.
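
    (Tongue firmly in cheek, the binary exponential backoff being parodied above can be sketched in a few lines of Python. The slot time and retry cap are made-up parameters for illustration, not anything from a real robocar or from the Ethernet spec's exact values.)

    ```python
    import random

    def beb_delay(collisions, slot_time=1.0, max_exp=10):
        """Binary exponential backoff: after the n-th collision, wait a
        random number of slots drawn from [0, 2^n - 1], doubling the
        window each time. max_exp caps the window, as Ethernet does."""
        exp = min(collisions, max_exp)
        return random.randint(0, 2 ** exp - 1) * slot_time

    # Before any collision the window is [0, 0], so there is no delay;
    # after three collisions the window has grown to [0, 7] slots, etc.
    ```

    The doubling window is what keeps two colliding cars from picking the same delay forever; each retry halves the chance of a repeat collision.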

      Roger Strong (profile), 26 Jun 2015 @ 12:25pm

      Re: robocars to use time-tested CSMA/CD "collision detection" protocol...

      Another analogy is lossy compression. They can greatly increase efficiency during rush hour by simply accepting that they're going to lose a few vehicles.

    Anonymous Coward, 26 Jun 2015 @ 12:17pm

    This is IT though

Nobody gives a fuck until something either actually goes wrong, or someone just thinks something went wrong despite it going right.

    Anonymous Coward, 26 Jun 2015 @ 12:30pm

    It's called "click bait", people!

    Anonymous Coward, 26 Jun 2015 @ 12:32pm

    I can't decide whether or not "breaking" in the title is an intentional pun.

    TDR, 26 Jun 2015 @ 12:47pm

    So where's my flying DeLorean?

    TDR, 26 Jun 2015 @ 12:56pm

    "Drivers? Where we're going we don't need... drivers."

    Uriel-238 (profile), 26 Jun 2015 @ 2:04pm

Someone is going to make a Johnny Cab. I can feel it.

    Of course, there are already (manned) taxi services named Johnny Cab, so it's only a couple of steps for them to automate their cars and install an animatronic half-mannequin chatbot into the driver position.

    Digger, 26 Jun 2015 @ 2:05pm

    Wait til the autonomous vehicles get into the mix...

    S.C.A.V. will end Taxi and Limo companies...
    Self
    Contained
    Autonomous
    Vehicle

    Monsur, 26 Jun 2015 @ 2:13pm

    A road Test?

Was this supposed to be a road test or something? And have they really been perfected that far?

    StrongStyleFiction, 26 Jun 2015 @ 2:35pm

    Automatic autos actively avoid actual accident.

    A. Nnoyed (profile), 27 Jun 2015 @ 8:54am

    Programming makes bad decision for driver!

1. Push-button ignition switches have killed people: with the throttle stuck, drivers could not shut off the engine, resulting in fatal crashes. There was a way to stop the engine in an emergency, but drivers were never told how.

2. A driver was killed when railroad crossing gates trapped their vehicle and they could not exit in time. A news story showed that a driver can easily drive through and break off a crossing arm, because the arms are designed to break away. If, when a driver is trapped between crossing arms, the vehicle's programming won't let them drive through an arm to escape a collision with a train without a series of steps to override the collision-avoidance system, that programming is dangerous.

      Uriel-238 (profile), 27 Jun 2015 @ 9:12am

      The bear rule.

      Are these actual incidents or hypothetical ones?

The push-button ignition only indicates that the system needs to check that the transmission is out of gear before engaging the engine. That's a fixable bug.

      The second one sounds avoidable by the guidance system detecting the position of the train and not proceeding onto the tracks if it is too close.

I expect that, because people are ambitious and creative in their idiocy, self-driven cars will sometimes be involved in accidents, some fatal. But they don't have to be casualty-free, just safer than human-driven cars.

      Granted this may create a liability problem, but that's a different issue. There are similar liability problems with public transit.

        A. Nnoyed (profile), 27 Jun 2015 @ 2:21pm

        Re: The bear rule.

        1) Read this driver could not shut engine off:
http://abcnews.go.com/Blotter/toyota-pay-12b-hiding-deadly-unintended-acceleration/story?id=22972214
        ABC News first reported the potential dangers of unintended acceleration in an investigation broadcast in November 2009. The report said hundreds of Toyota customers were in “rebellion” after a series of accidents were apparently caused by the unintended acceleration. Two months before, Highway Patrolman Mark Saylor and three members of his family had been killed after the accelerator in his Lexus had become stuck on an incompatible floor mat. Saylor was able to call 911 while his car was speeding over 100 miles per hour and explain his harrowing ordeal right up until the crash that ended his life.
        EXCLUSIVE INVESTIGATION: Runaway Toyotas

        2) Driver killed when vehicle became trapped between crossing arms Page 1 to 4 here:
        http://www.cbsnews.com/pictures/metronorth-train-accidents/

          Uriel-238 (profile), 27 Jun 2015 @ 4:12pm

          Re: Re: The bear rule.

Yeah, regarding the second one, I can't find an analysis attributing the Jeep SUV's failure to collision detection.

The first one is a problem with controls. It sounds like it's not a problem with power acceleration (fly-by-wire) but with a foot pedal getting stuck. That's actually a situation in which a smart automatic driving system could help: if the acceleration control was fly-by-wire, it could override the stuck control and slow the car down regardless.

          Anonymous Coward, 14 Jul 2015 @ 6:18pm

          Re: Re: The bear rule.

If only he hadn't been talking on the phone...

      MrTroy (profile), 28 Jun 2015 @ 8:16pm

      Re: Programming makes bad decision for driver!

      Re 2, why have a gate crossing arm on the exit side of the crossing?

      Question asked and answered, sigh: https://www.azatrax.com/controller.html

      "To reduce collisions at grade crossings, railroads are installing four quadrant gate systems on high speed rail corridors, commuter lines, light rail systems and in areas with high concentrations of foolish drivers."

      At least once people stop driving their own cars, we can return to more sane two quadrant crossing gates.

    Derek Kerton (profile), 29 Jun 2015 @ 11:52am

    Warning! You Could Be In Danger...stay tuned

Later, in the news: drone-like, repetitive, clickbaity and formulaic news articles basically write themselves. How the formula for creating a moral panic may threaten the human element and editorial touch we used to value in news reporting.

    Click to read more...but first, Is something in your garage planning to kill you?


