Self-Driving Cars Have Twice The Accidents, But Only Because Humans Aren't Used To Vehicles Following The Rules

from the I'm-sorry-you-hit-me,-Dave dept

When Google discusses its latest self-driving car statistics (provided monthly on the company’s website), it is quick to highlight that across two million miles of combined autonomous and manual driving, its self-driving cars have been involved in only 17 minor accidents, none of them technically Google’s fault. More specifically, these accidents almost always involve Google’s cars being rear-ended by human drivers. But what Google’s updates usually don’t discuss is the fact that quite often, self-driving cars are being rear-ended because they’re being too cautious and not human enough.
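For rough scale, here’s what Google’s own figures work out to (a back-of-the-envelope calculation using only the numbers above):

# Rough accident rate from the figures Google cites above.
miles_driven = 2_000_000      # autonomous + manual miles combined
minor_accidents = 17

per_million_miles = minor_accidents / (miles_driven / 1_000_000)
print(per_million_miles)      # 8.5 minor accidents per million miles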

And that’s proven to be one of the key obstacles in programming self-driving cars: getting them to drive more like flawed humans. That is, occasionally aggressive when necessary, and sometimes flexible when it comes to the rules. That’s at least been the finding of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab, which says getting self-driving cars onto the highway can still be a challenge:

“Last year, Rajkumar offered test drives to members of Congress in his lab’s self-driving Cadillac SRX sport utility vehicle. The Caddy performed perfectly, except when it had to merge onto I-395 South and swing across three lanes of traffic in 150 yards (137 meters) to head toward the Pentagon. The car’s cameras and laser sensors detected traffic in a 360-degree view but didn’t know how to trust that drivers would make room in the ceaseless flow, so the human minder had to take control to complete the maneuver.”

And while Google may crow that none of the accidents its cars get into are technically Google’s fault, accident rates for self-driving cars are still twice those of traditional vehicles, thanks in part to humans not being used to a vehicle that fully adheres to the rules:

“Turns out, though, their accident rates are twice as high as for regular cars, according to a study by the University of Michigan’s Transportation Research Institute in Ann Arbor, Michigan. Driverless vehicles have never been at fault, the study found: They’re usually hit from behind in slow-speed crashes by inattentive or aggressive humans unaccustomed to machine motorists that always follow the rules and proceed with caution.”

But with a sometimes-technophobic public quick to cry foul over the slightest self-driving car mishap, car programmers are proceeding cautiously when it comes to programming in an extra dose of rush-hour aggression. And regulators are being more cautious still. California last week proposed new regulations that would require all self-driving cars to have full working human controls and a driver in the driver’s seat at all times, ready to take control (which should ultimately do a wonderful job of pushing the self-driving car industry to other states like Texas).

The self-driving car future is coming up quickly whether the technology’s philosophical dilemmas (should a car be programmed to kill its driver if doing so would save a dozen schoolchildren?) are settled or not. Google and Ford will announce a new joint venture at CES that may accelerate self-driving vehicle construction. And with roughly 33,000 fatalities caused by human highway drivers each year, it still seems likely that, overly cautious rear-enders aside, an automated auto industry will save a significant number of lives over the long haul.


Comments on “Self-Driving Cars Have Twice The Accidents, But Only Because Humans Aren't Used To Vehicles Following The Rules”

82 Comments
TechDescartes (profile) says:

Zoned Out

But what Google’s updates usually don’t discuss is the fact that quite often, self-driving cars are being rear-ended because they’re being too cautious and not human enough.

Google deliberately limits them to 25 mph, which has resulted in at least one instance of getting pulled over for impeding traffic. In other words, Google’s cars are rolling school zones.

Mike Masnick (profile) says:

Re: Zoned Out

Google deliberately limits them to 25 mph, which has resulted in at least one instance of getting pulled over for impeding traffic. In other words, Google’s cars are rolling school zones.

There are two different self-driving car programs by Google (Alphabet). One of them is the one you describe, limited to 25 mph, just around Mountain View (the “bubble” cars). The other is the regular cars, which go well above 25mph. Most of the data comes from the faster cars, not the slow ones.

Anonymous Coward says:

The funny thing is that this points out the problems with the human drivers who “follow the rules” too much – the ones who treat every driving rule as sacrosanct, or who are at least paranoid about getting tickets for any infraction. Other drivers exceed posted speed limits and stop at stop signs for fewer seconds than the official rule books require. When you get these strict rule-adherents on the road, other drivers are thrown off, and that causes problems. I’ve had cops tailgate me because they wanted to go faster than the speed limit while I was paranoid they’d pull me over for speeding. Later I got into law enforcement as a dispatcher, and the cops I went on ride-alongs with said they didn’t pull anyone over for less than 15 mph above the speed limit unless there was erratic (i.e. possibly drunk) driving involved. Even some cops expect drivers to break the official rules. I’m not saying everyone should embrace breaking the rules all the time, and I’m not saying you won’t encounter a dick cop who wants to give you a ticket for 1 mph over the limit. I’m just saying it’s messed up that the rules of the road are like copyright – people don’t respect them because they aren’t always practical.

Anonymous Coward says:

Re: Re:

If I’m going the speed limit and you rear-end me, the problem is not that I was going the speed limit. The problem is that you were both exceeding it and not paying attention. If I stop at a stop sign and you rear-end me because you weren’t expecting me to stop, that’s also your fault.

On the other hand, the linked article notes that a Google car was warned by police for going 24 in a busy 35 zone. Now, crashes in that sort of situation are not going to be listed as Google’s fault… but at some point, you’re increasing rather than decreasing the risk of an accident by driving so far below the posted limit for no discernible reason.

Anonymous Coward says:

Re: Re: Re:

“If I’m going the speed limit and you rear-end me, the problem is not that I was going the speed limit. The problem is that you were both exceeding it and not paying attention. If I stop at a stop sign and you rear-end me because you weren’t expecting me to stop, that’s also your fault.”

I agree that, from an insurance standpoint, the driver hitting the rule-adherent would be at fault. But you’re describing something other than what I was describing. I pay attention and don’t rear-end people, and I do expect them to stop at stop signs. I don’t hit them because they follow the rules; I don’t hit them at all, because I don’t get into accidents. I’m an alert driver and I know when to drive cautiously.

But when you have five cars lined up behind you, all wanting to go 10 mph over the speed limit in a zone that is poorly rated for speed by the local government planners, the rule-adherent becomes the annoying anomaly who may in fact cause an accident through their deviation from the average, albeit rule-bending, behavior of most drivers.

Anonymous Coward says:

Re: Re: Re: Re:

Sure, going with the flow of traffic avoids many accidents and can be safer than sticking strictly to the posted rules. But what happens as the roads shift from human drivers who don’t follow the rules to driverless cars that do? Then the average car will be following the set guidelines, and the erratic humans will be the problem-causing agents. We cannot just look at today; we must look forward too. Maybe the best way to look at this is that we need to re-evaluate some of these rules as we take drivers out and replace them with autos.

Anonymous Coward says:

Re: Re:

The issue is that Google cars generally follow the rules (they have to, by law) and most everyone else doesn’t. If everyone followed the rules there would be no safety issues with Google cars. But because most everyone doesn’t, it’s actually safer not to follow the rules than to follow them. In many situations that’s because the rules are overly restrictive, yet the law makes it very clear that they must be strictly followed, so Google can’t really legally make its cars break them.

If Google makes its cars follow the rules uniformly, it’s more dangerous because everyone else is breaking them; any accidents would then be the fault of the rule-breakers, but accidents would occur more frequently. If Google makes its cars break the rules occasionally, it’s safer, but any accident in which Google broke a rule to increase safety would be at least partly Google’s fault by law, because it broke a rule.

A partial solution is for Google cars to determine how many people are breaking the rules at any given time (though Google can’t predict whether someone will break a rule in the near future). If no one is breaking the rules, the car should follow them, because that is safest for everyone. If other vehicles are breaking the rules, the car should determine whether breaking a rule itself would increase safety, and if so, break it for safety’s sake. The problem is liability: if there is an accident, Google will be at least partially to blame, legally, for breaking a rule. It would be partially at fault if both it and the other driver broke the rules, and entirely at fault if it broke a rule and the other driver didn’t, even if, technically speaking, the car was just trying to avoid accidents with other rule-breaking vehicles.
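To make that proposal concrete, here is a minimal sketch of the decision logic this comment describes. It is purely hypothetical: the function, its inputs, and the risk estimates are invented for illustration and have nothing to do with Google’s actual software.

def choose_policy(nearby_vehicles, risk_if_strict, risk_if_flexible):
    """Pick a driving policy from observed traffic.

    nearby_vehicles: list of dicts like {"breaking_rules": bool}
    risk_if_strict / risk_if_flexible: estimated collision risk (0..1)
    under rule-following vs. rule-bending behavior (estimating these
    is exactly the hard, unsolved part).
    """
    if not any(v["breaking_rules"] for v in nearby_vehicles):
        return "follow_rules"           # compliant traffic: rules are safest
    if risk_if_flexible < risk_if_strict:
        return "bend_rules_for_safety"  # e.g., match prevailing speed
    return "follow_rules"

# Example: one speeder nearby, and strict driving is estimated to be riskier.
print(choose_policy([{"breaking_rules": True}],
                    risk_if_strict=0.03, risk_if_flexible=0.01))
# -> bend_rules_for_safety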

I believe the rules are partially to blame here. If so many people are breaking them, perhaps it’s because they are unreasonably restrictive. The rules should be written so that breaking one means actually doing something dangerous, not merely violating an arbitrary restriction. Just as humans aren’t perfect, the humans who design the rules aren’t perfect either.

Another interesting question, which often comes up at freeway on-ramps (among other situations), is hand gestures. People communicate with hand gestures in all sorts of driving situations: a driver signaling another driver, a passenger in one vehicle signaling the driver of another, or a driver and passenger in the same vehicle gesturing to each other about something entirely unrelated to what’s going on on the road. Approaching an on-ramp, a common gesture is the driver already on the freeway waving the merging driver in ahead of them. Gestures are also useful when a driver who legally has the right of way wants to yield it to another driver first for whatever reason (say, a driver who wants to make a U-turn at an intersection waving another car through first). Humans also honk their horns at each other to warn of potential danger. Usually other humans on the road are good at determining whether a hand gesture or a honk is directed at them. Are Google cars good at this?

Anonymous Coward says:

Re: Re: Re:

Also, what about when people change their minds? Hand gestures can be hesitant: a driver may first gesture for another driver to do one thing (or signal that they themselves are about to do something, like make a U-turn), then change their mind and gesture for something else, and the hesitation itself may be the signal that they’ve changed their mind. Are Google cars good at determining whether a gesture describes the gesturer’s own intended action (I’m going to make a U-turn; I’m taking that parking spot) or directs the other car to do something (you may park here, I’ll find another spot; you may pass first)?

Usually humans are good at reading hand gestures, gesturing back, and understanding each other. Humans understand each other much better than computers do. Take, for instance, speech recognition. Talking into a phone or a computer is much more difficult than talking to another person: you must speak more slowly, distinctly, and clearly, and even then you often have to make corrections. Compared with talking to another person, the difference is night and day. As humans we generally get each other, but computers suck at understanding humans in all sorts of situations.

Anonymous Coward says:

Re: Re: Re: Re:

And none of this even gets into how lip reading, facial expressions, and being able to read each other’s eyes (where are they looking? can they see me? did they see me? are they aware of my presence?) play into driving, alongside hand gestures. How will Google’s cars handle these things?

Deniable Sources says:

Figures.

Add this to the myriad of reasons that self-driving cars are hopeless for a long time to come. Driving is a social behavior. The rules really are guidelines, not fixed inviolable laws. Yes, you can be ticketed for going over the speed limit on the highway. You can be ticketed for going under or exactly at the limit too, if you’re impeding traffic. Good luck figuring that one out using simple algorithms.

Let’s simply say that under ideal conditions, with weather, terrain, traffic, mechanical failures, and serious law-breaking not involved, highly paid engineers have managed to achieve an accident rate only twice the baseline set by “typical” drivers with their “typical” education, training, distractions, and vehicle maintenance. Spiffy.

PaulT (profile) says:

Re: Figures.

“Add this to the myriad of reasons that self-driving cars are hopeless for a long time to come”

I’m curious as to what you base this on.

“highly paid engineers have managed to achieve an accident rate only twice the baseline set by “typical” drivers with their “typical” education, training, distractions, and vehicle maintenance. Spiffy.”

On a brand new technology that’s still very much in its testing phase and not claimed to be ready for the mass market. Yet the biggest problem you seem to have is that the cars don’t act like human beings.

Not bad. This far into the early life of the motorcar, people were still arguing about steam vs. petrol and whether or not to have men with flags warning people about them, or even ban them completely to not risk scaring the horses. It wasn’t long until those arguments looked dated and the technology improved exponentially. I look forward to future developments from all the companies working on this.

Alien Rebel (profile) says:

Re: Re: Re: Figures.

That said, there’s no guarantee that real-world adaptations will be positive. For example, what happens when human drivers discover ways to exploit the tendencies of self-driving vehicles for their own selfish advantage? “Ah, a self-driving car– I cut in, they brake; I make my exit, they don’t; SWEET!” I wonder if there could be a messy evolution of behavior, with unpredictable twists and turns.

PaulT (profile) says:

Re: Re: Re: Figures.

…which would be a much, much higher figure if safety technology was not at the state that it is currently.

But, yeah, the huge number of cars on the road aren’t the reason for the large number of deaths, nor is the fact that the poor design of many American cities make owning at least one mandatory for a great many people. It’s the technology, which means that we should abandon development of a new technology that will improve safety further. You go with that.

INOC | Network Monitoring (user link) says:

It doesn’t come as a shock that cars are now operable even without a driver present. Although the technology is nowhere near perfect and problems can arise, the blame is not entirely on the car. Some incidents aren’t really because of the car but because the driver doesn’t know how to use it. Self-driving cars are potent and resilient, which makes them another revolutionary innovation, but of course, like everything else, the driver must know how to operate one or it will do more harm than good.

Anonymous Coward says:

Years ago, a Mary Tyler Moore episode had all the gang ready for a big Xmas meal. Most were already seated around the long table, and the food was soon to be brought out.

A knock at the door was Georgette, arriving slightly late. As she stepped in, the gang all welcomed her, and she responded to the men, “oh, don’t get up.” Whereupon, of course, all the guys dutifully arose.

In her plaintive voice: “why do they always do that?”

When machines understand why that was amusing, they’ll be ready to share the road with us.

Glenn says:

“Driving is social behavior…”?

Yeah, it’s real “social” to go crashing into other drivers, pedestrians, trees and what not… to the tune of tens of thousands of deaths each year. It’s real “social” for butt-holes who drive around with their radios/stereos blasting loud enough for everyone within a mile or more in any direction to hear.

Some people really shouldn’t be allowed into “society”… at all.

Anonymous Coward says:

I agree with the premise of the article: self driving cars are making a number of bottom-feeding lawyers salivate with anticipation. This is one of the biggest impediments to the furthering of the technology, which is why I think that California’s proposed regulations are at least a step in the right direction.

As I see it right now, there are three major blocks to the widespread adoption of self driving vehicles.

1. The question of who is legally responsible for the safe operation of the vehicle is currently up in the air. This represents a huge potential liability, particularly when you throw in aftermarket parts, repairs, and modifications. It will require law (preferably legislation; less preferably, regulation or litigation) to determine who is legally responsible for a self driving vehicle.

2. Self driving vehicles must be interoperable by design. The major companies working on them are well known for interoperating inside their own domain, but do not tend to play well with others (I’m looking at you, apple and google). This will also likely require a lead integrator outside of the companies developing the vehicles, either in the form of an industry alliance or a government entity.

3. The technology will require approximately 10-20 years to penetrate the used market after self driving vehicles become widespread in new-build cars. Until there is a healthy used market, the adoption rate will lag what is necessary to pull the trigger on fully autonomous driving on the nation’s highways.

This is why I think that the most likely technology to be fielded is some sort of semi-automatic driving aid, similar to an autopilot in an aircraft. It would be up to the driver to make lane changes, turns, and merges, but the vehicle could keep in the lane and maintain following distance. I also think the more likely field for adoption is the highway/interstate, because it is a simpler problem than trying to automate in-city driving. Human drivers can already effectively do in-city driving, and the majority of the benefit in preventing fatalities would come from high-speed driving on the highways anyway.
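For a sense of what such a driving aid amounts to, here is a toy sketch of a gap-keeping speed controller of the kind being described. Every constant and gain is made up for illustration; real adaptive cruise systems are far more sophisticated.

# Toy adaptive-cruise sketch: hold roughly a two-second gap to the car ahead.
# All gains and numbers are illustrative, not from any real system.
def cruise_target_speed(own_speed, lead_distance, lead_speed,
                        set_speed, time_gap=2.0, k_gap=0.2, k_speed=0.8):
    """Return a target speed in m/s from (assumed) radar readings."""
    desired_gap = own_speed * time_gap        # two-second rule, in meters
    gap_error = lead_distance - desired_gap   # positive = too far back
    closing = lead_speed - own_speed          # negative = we're gaining
    target = own_speed + k_gap * gap_error + k_speed * closing
    return max(0.0, min(target, set_speed))   # never exceed the set speed

# Example: doing 30 m/s, lead car 45 m ahead doing 28 m/s, set speed 33 m/s.
print(round(cruise_target_speed(30.0, 45.0, 28.0, 33.0), 1))  # eases off: 25.4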

PaulT (profile) says:

Re: Re:

I disagree. The cars would be far less useful if they weren’t able to perform lane changes, etc. and limiting in this way could encourage more dangerous driving (e.g. drunk or distracted drivers who forget they have to control the car for a particular manoeuvre, people who forget to switch the car back to auto before taking care of something in the back seat, etc). A driver should be able to take over if necessary, but people are bad enough at driving when they have constant control, let alone if they have to switch back and forth.

I agree that the legal situation will be messy to begin with, but that’s true of any new technology. Standards, liability, etc. are part of the acceptance process of anything not previously encountered by current law.

“Human drivers can already effectively do in city driving”

You must have visited different cities to the ones I’ve been to.

Anonymous Coward says:

Re: Re: Re:

I have driven in LA, San Diego, DC and the surrounding cities, and some in New York. City driving is frustrating, I agree. However, human drivers can anticipate what people are going to do (experience! reasoning!), which a computer cannot do at this time. This makes it easier for human drivers to handle difficult tasks such as driving in congestion. Driving in a straight line is much easier to implement.

I will admit, what I am thinking of is more of an 80% solution: something that lets us start building the technology base while allowing the fully automated car the time it needs to gestate, and giving the law and society time to adjust to a self-driving automobile. I don’t think we should go from canvas and wire straight to a 777 (to analogize from autopilot technology on airplanes).

As for people driving drunk and general incompetence… seriously? In the worst case, we’re no worse off than we are now. The only difference is that we would have a mechanism that reduces a great deal of the driving stress associated with traffic jams and highway driving, which is where the majority of traffic fatalities occur (according to the Insurance Institute for Highway Safety).

Were I dictator for a day, I would not choose to invest money in self driving cars. I would take the 80% solution and use the rest of the money to improve the infrastructure (e.g. widening roads, adding more lanes). I would also look into algorithmically controlled traffic lights, which I think could be made to reduce in-city congestion (at least in my town, where all the damned lights seem to be perfectly out of phase with each other).

Anonymous Coward says:

bad road design

I see too many people driving without paying proper attention, but unless traffic is moving extremely slowly, swinging across three lanes in 150 yards is asking for trouble even for a human driver with added “social skills”, since it relies on levels of alertness from fellow drivers that experience shows cannot be relied upon.

In the cities where I regularly drive there are a few sections of road like that. Only visitors tend to attempt the across-many-lanes-in-a-few-yards manoeuvres; the locals use different routes around the road network, avoiding those dangerous areas of road “design” in favour of far safer junction transits involving fewer lane changes or a far longer distance in which to change lanes.

BTW, to all the (I’m guessing US) commenters on routinely exceeding speed limits: think yourselves lucky you are not in the UK, where there are huge numbers of vehicle speed cameras and ignoring speed limits can rapidly mean loss of your driving licence via a Gatso.

Anonymous Coward says:

Re: bad road design

Indeed. On my way home from work, I tend to cruise in the far left lane, but my exit is eventually three lanes over on the right. I usually start getting over a mile or two before my exit so I don’t have to veer across three lanes in busy traffic like I’ve seen a few people do.

If traffic was actually moving at a reasonable speed, there was heavy traffic, and I had only 150 yards to get across three lanes, I’d probably just shrug, slowly start getting over, and catch the exit after the one I’d obviously already missed.

AndrewLee says:

Need better testing.

Getting my CDL taught me that driver’s ed taught me nothing.

Drivers Ed –
Day 1. This is a video of a human head under a tire.
Day 2. This is a video of two kids burning alive in a ditch.
Day 3. This is a video of drunk driver under a semi.
Day 4. This is a video of a torso thrown from the window.
Day 5. Congrats, you finished watching faces of death!!

Here’s your drivers license, bro! Now get the fuck out, you reek of sawdust and vomit.

Anonymous Coward says:

Control

At the moment I know of no way one can easily or reliably go from the Atlantic to the Pacific without having one’s travels reported to numerous police departments.

Are we now going to go beyond reporting one’s location, and give the police the authority to determine whether we can travel at all, whether the car will even work, and, if it does, the route it will take?

Anonymous Noel Coward says:

Question

Supposedly Google are going to be trialing self-driving cars in the UK soon. That got me wondering:

What happens when lane markings haven’t been repainted for ten years and they’re almost invisible?

What happens when a stretch of road has been resurfaced but the lines haven’t been painted on yet?

How well do these cars cope with motorway contraflows?

Can they do roundabouts?

How good are they at detecting potholes and speed bumps?

What do they do when traffic lights are defective?

What happens if there’s been flooding and the road surface is under, say, 6 inches of water? It would be passable by a driver in a normal car, how about these?

Snow? Can it do snow?

How does it deal with cyclists? Especially filtering / lane splitting, running red lights, etc?

Sorry if this sounds cynical, but all of these (and more) are real-world problems that autonomous vehicles will need to be able to cope with if they are going to become standard over here on Surveillance Island.

Anonymous Coward says:

Re: Question

Supposedly Google are going to be trialing self-driving cars in the UK soon. That got me wondering;

What happens when…

As I understand it, Google cars only self-drive on specially pre-mapped roadways. Kind of like an amusement ride, and not really useful for general transportation.

jupiterkansas (profile) says:

Re: Re: Re: Question

There are two types of Google cars. The ones you are talking about are more for demonstration than actual road usage; they’re limited to 25 mph because they aren’t real cars.

And then there are the real cars, which are navigated by software but still require a human driver. They are not restricted to pre-mapped roadways. Most car manufacturers are currently working on these types of cars.

PaulT (profile) says:

Re: Question

I understand your concerns, but honestly? I don’t think you’ve thought of anything that someone at Google UK wouldn’t have thought of. I also don’t think Google would risk a high-profile test on road conditions nobody had documented fully before shipping the cars over. I certainly don’t think the existence of roundabouts or rain and flooding hasn’t been considered.

Also, the road markings are probably irrelevant. If you search for documentation, what’s available certainly shows that they have multiple systems working on positioning, etc. While public roads in California et al. may be somewhat straighter than in the UK in many areas, they’re certainly not better maintained, in my experience.

In fact, the company seem to have been over-cautious with other tests (such as capping the speed at 25 mph), so they may not even be testing on motorways. I certainly don’t see them being asked to navigate the M25, Spaghetti Junction or the Magic Roundabout in a heavy rainstorm just yet, but I will guarantee they’ve considered roundabouts and driving on the left in designing their tests.

jupiterkansas (profile) says:

Re: Question

The solutions are simple.

When the Google car can’t determine what to do, it hands things over to a human driver.

Then the programmers go in and study the situation and program a way so that it doesn’t happen again – if that’s possible. This is how they’ve been developing the car for the last 10 years – trial and error. From what I’ve seen, Google cars can do fine with most of the situations you’ve described.

Anonymous Coward says:

Re: Re: Question

…When the Google car can’t determine what to do, it hands things over to a human driver…

When I first read California’s and Nevada’s regulations for self driving vehicles (Nevada is supposedly allowing testing of self driving commercial trucks), I noted that both states require a licensed driver to be in the vehicle. Based on some of the comments here and elsewhere, I’m wondering if the ultimate goal of these self driving car experiments might be cars with no humans in them at all. If that’s the case, how would the car react with no humans aboard? And would there be a “safety shutdown” in which the car stops exactly where it is, even if it’s still on a highway?

Chronno S. Trigger (profile) says:

Re: Re: Re: Question

Self driving cars are not going to be out there without a licensed driver until it can be proven that they will be able to handle themselves in all situations that are likely to happen on the road. And when the unlikely happens, the fallback will be exactly the same as it is for humans, pull off to the right (or left) side of the road, stop, and call for help.

Figuring out what the likely situations are and accounting for them is exactly what Google and others have been doing for the past several million miles.

jupiterkansas (profile) says:

Re: Re: Re: Question

That is the ultimate goal, but we’re a long long way from that point, and when we do get to that point, it’s likely that all the vehicles on the road will be automated and communicating with each other so if there is a malfunction, other vehicles can avoid it.

In the near future, you’ll be sitting at the wheel ready to take over.

Not Suprised At All says:

This tidbit is actually really interesting. It reminds me of a girl I dated in college. She was brought up to drive super carefully, in a small town. We went to college in a major metropolitan city known for speed limits being… suggestions more than limits. She was confused as to why people constantly honked at her, and had several near-collisions when turning or going through lights. I had to explain to her about driver expectations of behavior and how that relates to road safety.

So she started taking turns much faster, speeding up on yellow lights, doing rolling stops, and so on, and suddenly the problem went away.

Also reminds me of the day we got a metro rail above ground. First day, like three hits, all side hits from people who didn’t look where they were going.

Andrew D. Todd (user link) says:

So Install a Warning Sign

Well, the simplest thing for Google to do would be to flash a conspicuous “I’m a Robot and You’re Tailgating Me” sign when the rear-facing radar indicates someone coming up too close behind. The next sign is “I am Filming You, and Reporting You to the Police.” Once you make it clear to a tailgating driver that the machine will not accelerate to get away from him, he will either back off, or commit an overt act, such as deliberate ramming. Of course, I don’t see why a conventional car could not be fitted with a “You’re Tailgating Me” sign and radar-coupled brakes. I believe the official rule is that one should not follow closer than two seconds behind the preceding car, that is, ninety feet at thirty miles per hour.
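The two-second figure checks out, as the quick arithmetic below shows; the sign-triggering logic after it is just a sketch of what this comment proposes, with an invented threshold and message.

# Sanity-check the two-second rule, then sketch the proposed tailgating sign.
FPS_PER_MPH = 5280 / 3600        # 1 mph = 1.4667 feet per second

print(round(2.0 * 30 * FPS_PER_MPH))   # 88 ft at 30 mph: the "ninety feet" above

def tailgating_sign(rear_gap_ft, follower_speed_mph, threshold_s=2.0):
    """Light the warning sign when the rear radar sees a sub-threshold time gap."""
    fps = follower_speed_mph * FPS_PER_MPH
    time_gap = rear_gap_ft / fps if fps > 0 else float("inf")
    return "I'M A ROBOT AND YOU'RE TAILGATING ME" if time_gap < threshold_s else None

print(tailgating_sign(50, 30))   # 50 ft at 30 mph is about a 1.1-second gap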

Imagine a scenario in which a drunk becomes convinced that a telephone pole is insulting him, and he breaks his hands trying to beat up the telephone pole. That is not an argument for removing telephone poles so that the drunk can be sure of only hitting people.

Safe speed is a function of traffic density. I have sometimes seen fender-benders in traffic so jammed that, going down the sidewalk, I was out-walking the cars. A few weeks ago, I saw a five-way pile-up at speeds which cannot have been much more than ten miles per hour. I heard collisions at approximately one-second intervals and, looking up, saw a row of wrecks, twenty or thirty feet apart. My impression was that the traffic light had just turned green, and the drivers had all accelerated with such vigor that they could not stop when it became apparent there was nowhere to go. The fire department carried one man away to the hospital. At a certain traffic density, it may be the case that the safe speed is two miles per hour. If one were to borrow from the science of hydraulics, one might introduce the concept of the Reynolds number as the boundary between orderly flow and chaotic flow.

The speed limit is not just pulled out of someone’s hat. It is based on calculations of things like stopping distances, and it is premised on the observation of the two-second rule. You can find the details in a textbook of highway engineering. Braking distance increases as the square of speed; that is elementary physics. People who insist that they can cope with traffic by going faster are really banging their heads against telephone poles. The textbook solution, of course, is to build more lanes, and build lane-dividers (for different directions, and local vs. long-distance), overpasses, cloverleafs (sometimes stacked four or six or eight layers high), etc. However… in an urban area, one tends to run out of land. I understand that Reason Magazine has put forward a proposal to dig multiple layers of tunnels under Los Angeles, like the Boston “Big Dig,” only much bigger and more elaborate, likely to cost something like a hundred thousand dollars per capita. Some of the tunnels would be the longest road tunnels in the world.
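The square law is easy to verify. Here is the standard kinematics, with an illustrative friction coefficient of 0.7 (the exact value varies with tires and road surface):

# Braking distance: d = v^2 / (2 * mu * g), so doubling speed quadruples d.
MU, G = 0.7, 9.81        # illustrative friction coefficient; gravity, m/s^2
MPS_PER_MPH = 0.44704

def braking_distance_m(speed_mph):
    v = speed_mph * MPS_PER_MPH
    return v ** 2 / (2 * MU * G)

for mph in (30, 60):
    print(mph, "mph ->", round(braking_distance_m(mph), 1), "m")
# 30 mph -> 13.1 m, 60 mph -> 52.4 m: twice the speed, four times the distance.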

Dingledore the Flabberghaster says:

The problem is the human drivers

The limits are actually set on the assumption that human drivers will break them. If people actually drove within the rules, the authorities would be able to raise speed limits to reflect realistic and efficient road conditions.

Some drivers will always drive faster than the law allows, irrespective of whether the limit is appropriate or not. In order to deter those drivers from driving at dangerously fast or inefficient speeds all the time, the limits are kept lower than they need to be. Peer pressure does the rest and results in the entire traffic flow moving close to or above the prescribed limit. The more humans drive outside the laws, the more automated vehicles will be at a disadvantage.

The solution is for more humans to reliably follow the prescribed laws. At which point the prescriptions can be changed with the result of reducing the gap between humans and machines.
