Uber's Video Shows The Arizona Crash Victim Probably Didn't Cause Crash, Human Behind The Wheel Not Paying Attention
from the everyone-error dept
In the wake of a Tempe, Arizona woman being struck and killed by an Uber autonomous vehicle, there has been a flurry of information coming out about the incident. Despite that death being one of eleven in the Phoenix area alone, and the only one involving an AV, the headlines were far closer to the “Killer Car Kills Woman” sort than they should have been. Shortly after the crash, the Tempe Police Chief went on the record suggesting that the victim bore at least some culpability in the incident, having walked outside the designated crosswalk, and that the entire thing would have been difficult for either a human or an AI to avoid.
Strangely, now that the video from Uber’s onboard cameras has been released, the Tempe police are trying to walk that back, suggesting that reports of the Police Chief’s comments were taken out of context. That is likely because the footage shows that claims the victim “darted out” in front of the car are completely incorrect.
Contrary to earlier reports from Tempe’s police chief that Herzberg “abruptly” darted out in front of the car, the video shows her positioned in the middle of the road lane before the crash.
Based on the exterior video clip, Herzberg comes into view—walking a bicycle across the two-lane road—at least two seconds before the collision.
Analysis from Bryan Walker Smith, a professor at the University of South Carolina who has studied autonomous vehicle technology, indicates that this likely represents a failure of the AV’s detection systems, and that there may indeed have been enough time for the collision to be avoided if everything had worked properly.
Walker Smith pointed out that Uber’s LIDAR and radar equipment “absolutely” should’ve detected Herzberg on the road “and classified her as something other than a stationary object.”
“If I pay close attention, I notice the victim about 2 seconds before the video stops,” he said. “This is similar to the average reaction time for a driver. That means an alert driver may have at least attempted to swerve or brake.”
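For a rough sense of the distances that two-second window implies, here is a back-of-envelope sketch. The ~38 mph speed is the figure reported elsewhere for the car; the 2 s reaction time comes from the quote above; the 7 m/s² deceleration is a generic dry-pavement assumption, not anything from the investigation:

```python
# Back-of-envelope distances for the ~2 s window cited above.
# Assumptions (not crash-report figures): the car was reportedly
# doing about 38 mph; 7 m/s^2 is typical hard braking on dry asphalt.
MPH_TO_MS = 0.44704

speed = 38 * MPH_TO_MS        # ~17 m/s
reaction_time = 2.0           # s, the average reaction time quoted above
decel = 7.0                   # m/s^2, assumed hard-braking deceleration

reaction_distance = speed * reaction_time       # distance covered before braking
braking_distance = speed ** 2 / (2 * decel)     # v^2 / (2a) once brakes engage
total = reaction_distance + braking_distance

print(f"reaction: {reaction_distance:.1f} m, "
      f"braking: {braking_distance:.1f} m, total: {total:.1f} m")
```

With these numbers the car covers roughly 34 m before the brakes even engage, and needs about 55 m to stop fully, which is why the quote hedges at “at least attempted to swerve or brake” rather than “stopped in time.”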
The problem, of course, is that AVs are in part attractive because drivers far too often are not alert. They are texting, playing with their phones, fiddling with the radio, or looking around absently. We are human, after all, and we fail to remain attentive with stunning regularity.
So predictable is this failure, in fact, that it shouldn’t surprise you all that much that the safety operator behind the wheel of this particular Uber vehicle appears in the video to have been distracted by any number of things.
A safety operator was behind the wheel, something customary in most self-driving car tests conducted on public roads, in the event the autonomous tech fails. Prior to the crash, footage shows the driver—identified as 44-year-old Rafaela Vasquez—repeatedly glancing downward, and is seen looking away from the road right before the car strikes Herzberg.
So the machine might have failed. The human behind the wheel might have failed. The pedestrian may have been outside the crosswalk. These situations are as messy and complicated as we should all expect them to be. Even if the LIDAR system did not operate as expected, the human driver that critics of AVs want behind the wheel instead was there, and that didn’t prevent the unfortunate death of this woman.
So, do we have our first pedestrian death by AV? Kinda? Maybe?
Should this one incident turn us completely off to AVs in general? Hell no.
Filed Under: ai, arizona, autonomous vehicles, driverless cars, pedestrians
Companies: uber
Comments on “Uber's Video Shows The Arizona Crash Victim Probably Didn't Cause Crash, Human Behind The Wheel Not Paying Attention”
So wait humans have LIDAR?
Maybe the human “safety” person was not paying attention.. Didn’t the sheriff already say it would have been unavoidable because she “came out of the shadows”? Maybe that’s grounds for firing her for not doing her job, but SELF DRIVING CAR. That’s what they are TESTING, right?..?? So they learn nothing and try to blame everyone except their software. Active sensors don’t have the limitations that people with only passive vision do. Don’t make excuses or try to shift blame, do an analysis FFS. There is presumably (much, much, much) more data than for air crashes, and they can figure those out, IF you give it to the NTSB.
Re: So wait humans have LIDAR?
Obviously the car’s programming also failed to avoid the accident, and the people who develop it are learning from that result, but that is 100% beside the point since the car isn’t cleared to go live autonomously. You know, because it isn’t finished testing yet? No one is shifting blame. The whole point of the human “safety” driver is to be able to get those test results without people dying. Otherwise it’s not really testing, is it?
Re: Re: So wait humans have LIDAR?
wrong the saftey driver is there because automated vehicles are illegal, not to provide backup after all if you need a back up you may was well require a driver in any case it does not go to my point which is that the ACTIVE sensor systems don’t have a problem seeing in the dark so unlike the human that the sheriff concluded would have been unable to avoid the collision because “she came out of the shadows” the LIDAR and IR camera computer vision systems don’t have that excuse, so it is blame shifting. if you think it’s good enough to be live off a test track your saying your taking responsibility… except they are not.
Re: Re: Re: So wait humans have LIDAR?
“automated vehicles are illegal”
Citation needed.
“after all if you need a back up you may was well require a driver”
Yeah, when I was learning to drive, the instructor had his own controls in case of any incident. There was obviously no point in me learning, the instructor should have just taken my test for me.
( /s in case anyone needs it, but hopefully the dense among us will get the point)
Re: Re: Re: So wait humans have LIDAR?
How do you get from manual driving to automatic driving without gradually introducing systems to make the process more automatic? What is your definition of “automated vehicle”? The responsibility is complex, and both the technician/driver and the car’s system failed here.
Legally you need a guilty party with “personhood” to blame, and that is normally the manual driver. You cannot put the car in jail. The company Uber and their executives, though… I know economic penalties can befall them, but how would jailable offenses play out?
Re: Re: Re:2 So wait humans have LIDAR?
“How do you get from manual driving to automatic driving without gradually introducing systems to make the process more automatic?”
Google did it. They started their self-driving program with the philosophy that the vehicles would be completely autonomous from the start and limited them to the speeds they could safely operate. The first model could only travel 2 miles per hour (walking speed is around 3.5). I’ve been seeing them on the roads around here for over a decade and they’re now tooling around in the high 30’s, possibly over 40 mph. A friend of mine who often hits the sleep alarm too many times and is usually hurrying to work still gets annoyed by their extremely slow turns at intersections where they detect pedestrians, but he’s no longer hurling profanity at them for going slowly on the straight bits.
I think the ‘add more assistance until the driver is no longer necessary’ approach is fatally flawed (literally) because of the dangerous valley between mostly don’t need a driver and really don’t need a driver. A human who’s not actually needed 90%+ of the time will not be paying attention.
Re: So wait humans have LIDAR?
Re: Re: So wait humans have LIDAR?
it would have to be infrared to blind the sensor. Possible, but a waste of electricity. The LIDAR’s meant to operate in bright sunlight which would be much worse.
Uber cuts corners...
I’m guessing that Uber’s lidar and radar sensing systems are complete shit…
No surprise really, they’re new to this autonomous vehicle game, and they have a track record for being a shitty company with poor decision-making from the top-down.
Uber thinks they can get in front of everyone else with their fleet of autonomous cars, and in order to do so, I bet they’ve cut every corner they can. I wouldn’t be surprised if we learn next week that Uber’s Lidar system is simply for looks… and it’s non-functional.
Waymo (Google) has been at this game for nearly a decade now… and they still seem cautious about their approach compared to Uber and other companies trying to get in on the action recently.
Dear cheerleader for Uber & Google: we don't want your shit.
Take your inexorable death tech and FUCK OFF.
Re: Dear cheerleader for Uber & Google: we don't want your shit.
I’m confused as to how a Luddite is commenting on an internet website.
Re: Re: Dear cheerleader for Uber & Google: we don't want your shit.
Maybe an AV will get him, huh?
Re: Re: Dear cheerleader for Uber & Google: we don't want your shit.
Re: Re: Re: Dear cheerleader for Uber & Google: we don't want your shit.
You forgot to say (((globalists))) you fucking clown.
Re: Re: Re:2 Dear cheerleader for Uber & Google: we don't want your shit.
Um.. what do you think the British empire was? Granted, the Luddites were in an era that was still mostly serfdom and aristocracy; a small progressive working class was beginning to develop and allow people to have lives that were not mostly disconnected and miserable, as tradesmen making everything from clothes and fabrics, to block and tackle, to guns (the prototype assembly line was an American invention to make guns, which the British navy adapted to make block and tackle). If you want to pretend that we have not seen this before and that people who are upset about it are crazy, just remember: those that do not learn from history are doomed to repeat it. Only this time it won’t just be the skilled trades, it will be everyone. A Masters is now required for most stable jobs; that is at least 16 years of confirming you are loyal to the state. That is really not OK, and to get angry at people because they don’t like the direction their society is going is unpatriotic and anti-human.
Re: Re: Re: Dear cheerleader for Uber & Google: we don't want your shit.
a person opposed to increased industrialization or new technology.
“a small-minded Luddite resisting progress”
Re: Re: Re:2 Dear cheerleader for Uber & Google: we don't want your shit.
NO YOU!
While I agree that the car likely should have noticed, I keep hearing how, after repeated viewings, people see the victim as many as 2 seconds before the crash… First time I saw it, it was a second, maybe less. I certainly hadn’t fully registered her presence before the impact. It’s only after repeated viewings, knowing what size object I’m looking for, that I’ve begun to see shadows shifting earlier. And despite some other reports’ suggestions, I certainly wouldn’t have expected a pedestrian to be crossing in front of a car like that. The safety driver probably wouldn’t have seen her in time to prevent her death.
That said, again, the lidar/radar should have noticed the woman.
Also, it is important to note that the ex-Waymo employee who designed Uber’s systems left Google because he thought Google was playing it way too safe. Google didn’t think the tech was ready; he wanted it out on the streets *now* and damn the consequences. It’s also why Uber moved to Phoenix – lax testing regulations.
Re: Re:
Maybe, but the question is at what range is the vehicle relying on cameras, and what is the maximum range of any Lidar/radar?
Re: Re: Re:
Yes, yes this is indeed a question and one that is not answered.. TESTING right learning Taking responsibility for your in testing unknown product..
Re: Re: Re: Re:
English is hard.
Re: Re: Re:2 Re:
Your right it a pity they did not teach you to with complete sentences.
Re: Re: Re:3 Re:
“English is hard” is a complete sentence.
“Your (sic) right it (sic) a pity they did not teach you to (sic) with complete sentences” is a run-on sentence with missing punctuation, a misspelling, and two missing words.
Re: Re: Re:4 Re:
^^^^
THIS
Re: Re: Re:4 Re:
Or maybe just “innovations” in “new grammar”. Isn’t “innovation” supposed to be good?
Re: Re: Re:
Autonomous vehicles should never rely solely on cameras, at any range!
Lidar range is actually quite good, and should have seen this pedestrian.
To see how the combination of lidar + camera sensing works, you can get a decent overview here:
https://www.youtube.com/watch?v=tiwVMrTLUWg
There are some crazy scenarios in there, several in which the vehicle sensing equipment and software makes a better decision than most humans would.
Re: Re: Re: Re:
Looking at that video, the laser range is somewhat limited, which makes sense: at longer ranges it needs more angular resolution to identify a target and a slower scan rate, and the reflected power from the same target drops by a factor of 16 as the range is doubled. Given where the woman was, a human had little chance of spotting her (at least not without knowledge that she was there), the camera probably a poorer chance, and the lidar almost none when she came out of the shadows, with her dark clothing not helping. Indeed, the first parts of her visible in the film are just her feet, which could be small animals or birds in the road.
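That “drops by a 16th when range doubles” behavior is the standard point-target radar/lidar relation: the beam spreads as 1/R² on the way out and the return spreads as 1/R² on the way back, so received power scales as 1/R⁴. A quick sketch (the reference range is arbitrary; nothing here reflects Uber’s actual hardware):

```python
# Point-target radar/lidar return: received power ~ 1/R^4,
# so doubling the range cuts the return by a factor of 16.
def relative_return(r_m, r_ref_m=10.0):
    """Received power at range r_m, relative to the power at r_ref_m."""
    return (r_ref_m / r_m) ** 4

print(relative_return(20.0))  # double the reference range -> 0.0625 (1/16)
```

Quadrupling the range under this model cuts the return by a factor of 256, which is why long-range detection of small, dark targets is so demanding.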
Re: Re: Re:2 Re:
According to this breakdown, the lidar (and radar) should have picked up the pedestrian 3-5 seconds before impact, and braked immediately.
http://ideas.4brad.com/it-certainly-looks-bad-uber
Re: Re: Re:3 Re:
At 3 to 5 seconds away, the victim may have been masked by a car between her and the Uber car. There is a car beyond her when her feet show up; she was crossing between traffic at the time.
Re: Re: Re:
Legally, it doesn’t matter. If the car’s maximum stopping distance exceeds the sensor range, it’s going dangerously fast.
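That rule can be turned into arithmetic: reaction distance plus braking distance must fit inside the range the sensors reliably cover. A sketch with illustrative numbers (the 0.5 s machine reaction time and 7 m/s² deceleration are assumptions, not figures from any AV spec):

```python
import math

# Sketch of the rule above: reaction distance plus braking distance
# must fit inside the sensors' reliable range. The 0.5 s system
# reaction time and 7 m/s^2 deceleration are illustrative assumptions.
def max_safe_speed(sensor_range_m, reaction_s=0.5, decel=7.0):
    """Largest v (m/s) satisfying v*reaction_s + v**2/(2*decel) <= range."""
    # Positive root of (1/(2*decel)) * v^2 + reaction_s * v - range = 0
    a, b, c = 1.0 / (2.0 * decel), reaction_s, -float(sensor_range_m)
    return (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

for rng in (30, 60, 120):
    v = max_safe_speed(rng)
    print(f"{rng:>4} m of sensor range -> about {v * 2.237:.0f} mph max")
```

Under these assumptions, roughly 30 m of dependable detection range is what it takes to justify the high-30s mph the car was reportedly travelling; halve the range and the safe speed drops sharply, since braking distance grows with the square of speed.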
Re: Re: Re: Re:
Yes, but which sensor determines that, the camera or the lidar?
Re: Re: Re: Re:
Agreed – on my advanced riding course the clear instruction was that you moderate your speed such that you can stop in the distance that you can see.
Re: Re:
Ars has a story too, stating the feet are visible 1.4 seconds before the crash. Based on that picture—dark road, no oncoming traffic—I’d say the driving conditions called for highbeams. They’re not just brighter, they literally point higher and could have illuminated the victim’s body. Or the body of a large wild animal that could have killed the safety driver.
Had you been the driver, you’d have been "overdriving your headlights". It’s dangerous and illegal. "Overdriving your headlights means not being able to stop inside the illuminated area ahead. It is difficult to judge other vehicles’ speeds and distances at night. Do not overdrive your headlights—it creates a blind "crash area" in front of your vehicle. You should be able to stop inside the illuminated area ahead."
Yes. Scotchlite is a magical substance that could have saved her life—sometimes I’ll see a 1x1cm strip on the back of a shoe, seconds before I see anything else—but this car was equipped with the actual magic of LIDAR and other sensors. Think Terminator 2 here, and notice that the safety driver was fully visible to the camera.
Re: Re: Re:
LIDAR does not care about visible light; it is an active system (IR, though I’m not sure of the exact wavelength). It’s shining a flashlight on you. Can you see me now? 🙂
Re: Re: Re: Re:
Wired says infrared, and I’d be very interested to see the LIDAR view of the crash.
Re: Re: Re:2 Re:
I would too, but unfortunately I suspect we won’t.
Re: Re: Re:3 Re:
People were saying that about the video. Funny thing, the people who claim they will never see evidence never seem to return to the conversation after the evidence is released…
Re: Re: Re:
If as prior posters have noted there was another car travelling roughly 3 seconds in front of the Uber vehicle then driving with the highbeams on would have been illegal.
Your statement about “overdriving your headlights” would mean that on any road or any time where an individual cannot drive with their highbeams they should drop their speed to below 38 mph (the speed at which the Uber car was travelling). This is patently ridiculous. Speed limits on roads are chosen based on how dangerous the area is and how likely crashes will occur at those speeds — the choice takes into consideration all normal adverse effects such as (but not limited to) light rain, night, and slight fog. If the speed limit was 35 mph which changed to 45 mph slightly before the area where the accident occurred then it is safe to assume that 45 mph is a safe driving speed with or without a car’s highbeams activated.
It may be that the city planners re-visit this speed limit posting and change it down to 35 mph due to the (prohibited) use of the “crosswalk.” But given the posted speed limit it is eminently reasonable for the driver to choose to go 45 mph.
Re: Re:
TOO MANY QUESTIONS..
not enough answers yet..
1. BIKE SAFETY RULES??? Where are the reflectors??
2. Lidar, radar, IR, UV… whatever… and HOW WIDE A BEAM?? If it’s focused on the front… it WOULDN’T SEE ANYTHING to the side..
3. DID THE WALKER SEE THE CAR??? She could HAVE STOPPED BEFORE SHE CROSSED IN FRONT..
4. She is about 100 feet away.. Stopping range?? NOT going to miss it. Even with a 30mm lens, SHE IS TOO CLOSE..
Re: Re: Re:
Not sure about Arizona…. bike safety rules normally apply to bikes-as-vehicles, not bikes-as-cargo. It was being pushed not ridden. Still, reflectors are easy to get; I’ve tried to buy them but multiple shops just gave them for free. Maybe they were there but not visible because the headlight was aimed too low.
Re: Re: Re: Re:
Don’t forget that the bike was at 90 degrees or so to the headlights, which negates front and back reflectors.
Re: Re: Re:2 Re:
That’s what the spoke reflectors are for (and, though rarely mandatory, reflective tape on the forks/frame). Even pedal reflectors should be visible if you’re 5-10 degrees off 90.
Re: lidar/radar should have noticed
…detection & analysis of bicycles has been a BIG problem for all these automated vehicles. LIDAR has had great difficulty with the various bike shapes, reflectivity, and human bodies clinging to them.
The Uber AV was going 40mph in a 35mph speed limit zone. How does that happen with computers fully controlling the vehicle?
The brief accident video being reviewed has much less clarity & field of view than was available to a normal human driver — it is not a definitive record of the event.
Re: Re: lidar/radar should have noticed
I thought it was indicated that radar is used also? I would think a bike would have a fairly sizeable cross-section. Maybe I am wrong about both.
Re: Re: Re: lidar/radar should have noticed
Not huge–there’s a lot of empty space between spokes, inside the frame’s diamond… a person probably has a larger cross-section.
Re: Re:
Isn’t there a significant difference between electronics identifying a person off the road and predicting what that person will do next?
Some scenarios I see possible are:
1) The electronics identify a person, not on the road, and the person remains off the road. Should the vehicle stop because that person might wander onto the road?
2) The electronics identifies the person off the road, takes multiple readings of their approximate location and direction of travel (how many seconds?) and then makes a determination as to slowing down or ignoring the person who is still off the road.
3) Take that second scenario up to the point where at the last second the person off the road changes direction and moves into the road rendering all the calculations previously accomplished null and void and no time left to maneuver.
Are we really gonna ask these AVs to predict the behavior of every human within range, or have some expectation that those others will follow the rules/laws? Sure, detectors can help, but just as with humans, processors need time to process information and react. And things that happen faster than either a human or a processor can react to are still gonna happen sometimes.
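The track-then-predict loop those scenarios describe can be sketched in toy form: take a few sensor fixes on the pedestrian's lateral position, estimate their velocity, and check whether the extrapolated path reaches the lane before the car does. All names, geometry, and thresholds here are hypothetical, and a real perception stack is enormously more sophisticated:

```python
# Toy version of scenarios 2 and 3 above: estimate a pedestrian's
# lateral velocity from a few sensor fixes, then check whether their
# extrapolated path reaches the traffic lane before the car arrives.
def crossing_risk(lateral_positions, dt, lane_edge, car_eta):
    """lateral_positions: offsets (m) from lane_edge, one per tick of dt
    seconds. Returns True if the extrapolated pedestrian reaches the lane
    within car_eta seconds (i.e. the car should slow down)."""
    if len(lateral_positions) < 2:
        return True  # not enough data to rule anything out: assume risk
    v = (lateral_positions[-1] - lateral_positions[0]) / (dt * (len(lateral_positions) - 1))
    if v >= 0:
        return False  # stationary or moving away from the lane
    time_to_lane = (lateral_positions[-1] - lane_edge) / -v
    return time_to_lane <= car_eta

# Scenario 2: a steady ~1.4 m/s walk toward the lane from 4 m out,
# with the car about 2 s away.
fixes = [4.0, 3.3, 2.6, 1.9]  # one fix every 0.5 s
print(crossing_risk(fixes, dt=0.5, lane_edge=0.0, car_eta=2.0))  # True
```

Scenario 3 is exactly the case where the last fix breaks the straight-line assumption, which is why extrapolation alone can never remove the risk entirely.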
Re: A second?
“First time I saw it, it was a second, maybe less. I certainly hadn’t fully registered her presence before the impact. It’s only after repeated viewings, knowing what size object I’m looking for, that I’ve begun to see shadows shifting earlier.”
Do you always drive by looking through a potato cam?
Yeah, as I UPDATED. I was RIGHT and CENSORED for it as usual.
at:
https://www.techdirt.com/articles/20180320/10323539458/tempe-police-chief-indicates-uber-self-driving-car-probably-isnt-fault-pedestrian-death.shtml#c85
and:
https://www.techdirt.com/articles/20180320/10323539458/tempe-police-chief-indicates-uber-self-driving-car-probably-isnt-fault-pedestrian-death.shtml#c101
and:
https://www.techdirt.com/articles/20180320/10323539458/tempe-police-chief-indicates-uber-self-driving-car-probably-isnt-fault-pedestrian-death.shtml#c111
The full text that was CENSORED by Techdirt’s pack of know-it-all-but-WRONG kids plus Administrator follows:
You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.
First, you haven’t seen the video, just rely on likely slanted characterization by cop who wants this experimentation to proceed. We’ll need to dig into any payoffs made by Uber to council members or city to allow this menace on the roads.
"On the median" doesn’t fit with "stepped from the shadows", nor with the superior low-light ability of a video camera, nor right front of SUV dented.
This ain’t over until lawyers have argued to jury that the heartless giant corporation loosed this faulty robot to attack unsuspecting humans.
Also, my take that a human would expect person on the median to be unpredictable and slow down or turn to clear is still accurate (minion makes a point of stating that the SUV didn’t slow or swerve…): if she’s in video for more than half a second, then the car with its "superior" reaction time is faulty.
So why was that censored? — The more right you are here, the more censored.
Re: You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.
Probably because you repeatedly and admittedly spam nonsense all over every single post.
Re: Re: You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.
Re: Re: Re: You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.
Spam all you want in protest. The flag will keep being clicked.
Re: Re: Re:
It’s always fun watching your persecution complex war with your desperate need for attention.
Re: Re: Re: You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.
I suggest lithium
Re: Re: Re: You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.
I also reported you. You have a right to protest and I have a right to report your protest.
As for why: it is because your posts are egotistical and condescending, not because of the views you express. Also, at least learn what censorship is before you claim to be censored. Your view isn’t suppressed, only made easy to ignore. Kinda makes your whole censorship crusade seem pointless.
Re: Re: Re: You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.
“A) It’s tactical. I have a right to protest the unfair treatment.”
It’s a private website. Fuck off with your “rights.” TD has the right to ban you the same way a private homeowner can kick you out of their house. That you can still post and your posts can be read by anyone who is curious by clicking a single button is a testament to how generous TD is with its comment moderation. You’d be instantly banned on other sites for the same behavior.
“D) Techdirt is violating its form contract (see CRFA) and its stated principle of free speech.”
Please oh please find a greedy lawyer who will take your money and sue TD over this contract violation. I would love to read a TD article about how you outed yourself as a troll and wasted money on a lawyer with dubious legal claims. You wouldn’t even have to give me a Xmas present for the next five years. That would be so awesome.
Re: Re: Re: You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.
Blocked and reported.
I think “Why didn’t the car recognize the pedestrian?” is probably the best question to ask right now. It’s not a matter of assigning blame, it’s a matter of fixing the problem that’s easiest to fix.
It’s always easier to make cars safer than people. That’s why we have seatbelts.
Re: Re:
Absolutely.
What we have here is an indisputable system error and a lot of data that should indicate what went wrong. Despite the tragedy, we have an opportunity to improve the system.
Re: Re: Re:
…and that is going to be exactly what the Uber staff are doing right now. Nobody on that team will want to think that a failure on their system is leading to people getting killed, and the management don’t want this kind of publicity getting in the way of them selling the tech to the end market. It’s a tragic event, but this is the entire point of public testing, learning lessons to improve the product.
Re: Re: Re: Re:
Right, accidents are bug reports. And as immoral as that sounds, it’s actually huge progress. If an accident can remove a class of fatalities from an entire system, it’s better than the human controlled “repeat the same mistakes thousands of times” standard we have now.
Re: Re: Re:2 Re:
Exactly. It sounds heartless but the fact is that this appears to be a generally safer technology than standard road vehicles. Flaws in those that cause deaths have to occur many times before action is taken, be they problems with the road system or the vehicles themselves, and corporations are noted for holding back on improvements because it costs them more money than letting people die. The fallout from this incident is impossible for them to ignore, so they have to make things safer for everybody else.
Re: Re: Re:2 Re:
Agreed, but… is "pedestrian in dark clothing crossing a dark street" not already part of their test suite? If you asked me what to check before sending your autonomous car into the night, it would be in my top 10 (more generally, person or deer).
Re: Re: Re:3 Re:
It’s the “suddenly wandering in front of the car” part that wasn’t in the test data. The lack of visible clothing just means that the cameras didn’t fail and the result would likely have been the same if a human was driving.
Now we move on to the next set of details – did the car simply fail to react to something the other sensors picked up, or was the whole incident unavoidable? Can something be improved with those sensors, or teach the AI something about how to deal with people doing things like this?
You’re fixating on the “dark clothes” part, but missing the actual argument. That’s only mentioned to reinforce the fact that a human driver would not have made a difference.
Re: Re: RED ALERT! RECORD-BREAKING ZOMBIE!
BAlbrecht or Bruce A.: SEVEN AND HALF YEAR GAP! LAST SEEN IN 2010! https://www.techdirt.com/user/balbrecht
HA, HA! — OH, NO, NO ASTRO-TURFING ON THIS SITE!
This one is unusual because makes several comments, but below those here, right on first page, back to 2010!
Only real question is whether I’m the only human here!
Re: Re: Re: Re:
Well you’re certainly not a bot. Nobody would deliberately program something so idiotic.
I say we need more AV's
To get the jaywalkers off the streets. Consider it a side benefit.
Re: I say we need more AV's
3 points? skynet needs to know citizen, pick up that can while your at it.
Re: Re: I say we need more AV's
What is it about ACs thinking they’re making clever points that makes them forget basic English?
Anyway, been done.
https://www.youtube.com/watch?v=o2x7gHxQYYE
Re: I say we need more AV's
Maybe they can add facial recognition so that AV’s can run over terrorists too when they see them.
if everything had worked properly
it’s this kind of talk that gives me the creeps. how many human guinea pigs are we going to wave off?
Re: Re:
Misleading title
I think the title is a bit misleading. The victim ’caused’ the accident by wearing dark clothes at night and jaywalking.
The uber car ’caused’ the accident by not correctly identifying the potential obstacle using its non-visual sensors.
The uber driver ’caused’ the accident by being distracted rather than doing their job.
In other words, this thing was pretty much a mutual fault all the way around. People need to start taking responsibility for their actions. If you dress in dark clothes and jaywalk late at night in a dark area rather than crossing under the street lights, you are gambling that oncoming cars will see you. That’s called taking an unnecessary risk, and it caught up with this woman. It’s a shame, and we can and should question why the sensors didn’t catch it, but a human being wouldn’t have done any better than the uber car.
Yes, the uber car failed at what it should have been, which is better than a human, but if it had been a human driver, the fault would have been **entirely** the pedestrian’s. Only the fact that this is a self-driving car makes it a mutual all-around fault.
Re: Misleading title
So what do you think the headline should have been?
Re: Re: Misleading title
Video Indicates Uber Car’s instruments or software failed to detect Victim
Re: Re: Re: Misleading title
That should replace the first part, the victim did contribute to the crash. The second half of the title can remain as is.
Re: Re: Re: Misleading title
You know what? That is a better headline. Touche.
Re: Misleading title
See the invention of jaywalking. Do we know whether the person was actually jaywalking? In Arizona, the law says:
I’d call C jaywalking, and A and B failure to yield. Maybe the victim failed to yield, or maybe the car wasn’t visible when they started crossing. It wouldn’t be jaywalking unless the adjacent intersections were both signalled.
Also:
Re: Re: Misleading title
The portion of road she was on had lit crosswalks, but not where she crossed. See the street view of the area.
Re: Misleading title
If you look at this from an engineering point of view you have reason.
If you look at this from a legal point of view you are nuts.
The car has a driver.
The driver was not paying attention and was actively engaged in other activity such as playing video games.
A pedestrian was killed.
If the driver had been driving drunk, he would be charged with vehicular homicide.
I bet before this is over the driver is charged just like he would be if he had been drunk and not in control.
Re: Re: Misleading title
“was actively engaged in other activity such as playing video games”
…or doing something else, such as checking the centre console display that some of these cars have, but which wouldn’t be visible on the video. We’ll wait for an investigation to find out what she was actually doing, but lying about it because you fantasised about what she might have been doing is not going to help your case.
“I bet before this is over the driver is charged just like he would be if he”
Your grasp of the gender of the driver is as accurate as your grasp of what was happening in the car.
It’s a funny trend during these discussions, actually. A lot of the people spewing bile over what the driver or car should have been doing based on the video can’t seem to get the most basic facts about the identity of either the driver or victim correct.
If someone refers to either of them as “him” I know I can safely ignore their other observations.
Re: Re: Re: Misleading title
re PaulT
Arguably someone could refer to the driver as “he” if they are going on chromosomes rather than the gender the driver currently identifies as. I saw the driver’s face in a media article and thought the driver a “he” until I read the article describing the driver as trans and now identifying as female.
Re: Re: Re:2 Misleading title
Exactly. If someone’s jumping to conclusions without knowing all the details of the incident, their assumptions about what’s happening in the video can probably be ignored. A quick reading of the case background will reveal the truth.
It’s not just the driver; I’ve seen lots of people talking about the cyclist as if she were male too. From my observation, those are usually the people with little of worth in their other observations.
Re: Re: Re: Misleading title
If the driver was looking at the center display that presumably showed what the car was “seeing”, then it’s really telling that the car certainly didn’t “see” the lady, as the driver clearly hadn’t noticed anything until she looked up and spotted the lady with the bike, judging by how she reacts.
Re: Re: Re:2 Misleading title
The centre console has other information, from what I’m aware of, but that’s a possible scenario if the video stream is mixed with other information that the human support needed to take into account.
The point is, people are attacking this person with the claim that she was not doing her job, but as far as we know she could have been doing exactly what was required. She just happened to be doing something other than looking through the windscreen during the last couple of seconds before the first-ever pedestrian collision, during which time it’s unclear whether she could have made a difference even if she had been.
Uncharacteristically poor slanting by techdirt.
Techdirt:
Check your own previous story. The chief didn’t say darted.
She said “based on how she came from the shadows right into the roadway.”
That is a direct quote from the previous Techdirt story.
What the chief said in that previous story is a fair characterization of what happened in the video. She does appear right out of the shadows when the headlights hit her.
First: the victim certainly did cause this collision. She is casually crossing a 45 MPH road, in the dark, while wearing dark clothes. It’s fatally reckless to do that. You don’t count on a car to stop for you in broad daylight on a 25 MPH road; it’s suicidal on a 45 MPH road in the dark.
Second: Uber’s vehicle shouldn’t be on the road if it can’t pick out one person crossing an empty road. All the blame belongs to the pedestrian and Uber.
Third: the safety driver would have had no chance to react in time, even if she had been paying attention. She still isn’t driving. Her reaction time would be longer than the typical 1.5 seconds of someone actually driving instead of watching. Also, I strongly dispute that anyone would have seen the victim a full 2 seconds before impact. You don’t get to rewind the video multiple times in real life until you see what you are expecting. The first time I watched that video, it was “shit, boom”, and I knew it was coming.
The safety driver is in an impossible situation and should not be a scapegoat for a fatally reckless pedestrian and poor Uber technology.
Re: Uncharacteristically poor slanting by techdirt.
That stretch of Mill is 35, though it used to be 45.
She also presumably didn’t know there was a car coming when she started across the street; the road curves there and it was 10 PM.
Re: Re: Uncharacteristically poor slanting by techdirt.
I saw a video shot after the accident. The posted limit is 45 MPH.
Re: Re: Re: tl;dr
I don’t know what video you saw, but northbound on Mill before Curry it’s 35.
The Arizona Republic:
Re: Re: Re:2 tl;dr
This appears to have happened because the road was recently changed from 45 to 35. The current supposition I’ve seen is that the Uber road specs in its computer didn’t have the change.
This brings up a different issue, which is: how do we ensure that AVs have the correct road information in a timely fashion? This is likely going to require changes to how states handle speed changes.
Almost certainly it’s going to require a standard for temporary speed limits, with devices that broadcast the temporarily lowered speed limit to the cars. This is, of course, an issue because you just know someone will hack it and put out Pi devices that broadcast 120 or 5 mph speed limits to mess with the cars.
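As a purely hypothetical sketch of how such broadcasts could resist that spoofing worry: if each beacon signed its message with a key shared with the vehicles, a hobbyist transmitter couldn’t inject bogus limits. None of this protocol, key handling, or naming comes from the thread; it’s only an illustration.

```python
# Hypothetical sketch: beacons sign each broadcast speed limit so a hobby
# transmitter can't inject a bogus one. No such protocol is described above.
import hmac
import hashlib

SHARED_KEY = b"road-authority-secret"  # assumed key provisioning, for illustration

def sign(limit_mph: int) -> bytes:
    """Authority side: tag a broadcast speed limit."""
    return hmac.new(SHARED_KEY, str(limit_mph).encode(), hashlib.sha256).digest()

def accept(limit_mph: int, tag: bytes) -> bool:
    """Vehicle side: only honor correctly signed limits."""
    return hmac.compare_digest(tag, sign(limit_mph))

print(accept(25, sign(25)))       # genuine construction-zone broadcast
print(accept(120, b"\x00" * 32))  # spoofed "120 mph" beacon is rejected
```

A shared symmetric key like this would let any one compromised car forge messages for all the others; a real deployment would need public-key signatures, but the basic idea is the same.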
Re: Re: Re:3 tl;dr
It’s 35 and the car was going 38. I think it’s more likely that it was operating in a “within 3 miles of the speed limit” range of the current, correct speed limit than going 7 miles under an outdated one.
There’s all kinds of mischief people can get into by painting weird shit on roads and signs, and that’s before we even start asking about network security. We’re just getting started finding out all the different ways autonomous cars can fail.
Which is not to say that I’m opposed to them on principle. Just that we’re going to have plenty of problems as they’re deployed, and some of them haven’t even been thought of yet.
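The two hypotheses about the 38 mph reading are easy to check numerically. This is a minimal sketch assuming a simple plus-or-minus tolerance model, not anything Uber’s software actually does:

```python
# Hypothetical check of the two hypotheses: the tolerance model and the
# 3 mph band are assumptions for illustration only.
def within_tolerance(observed_mph, limit_mph, tol_mph=3):
    """Is the observed speed within tol_mph of the posted limit?"""
    return abs(observed_mph - limit_mph) <= tol_mph

observed = 38
print(within_tolerance(observed, 35))  # current 35 limit: 38 fits a 3 mph band
print(within_tolerance(observed, 45))  # outdated 45 limit: 7 under, outside the band
```

Under these assumptions the reading is consistent with tracking a current 35 limit, and not with any tight band around an outdated 45 one.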
Re: Re: Re:4 tl;dr
When I say temporary reductions in speed, I’m talking about construction zones or emergency zones, where they’ve reduced the speed within the last hour. You can’t update the databases fast enough unless every car is mandated to have always-on wireless, and even then, it can still have issues with network congestion.
Re: Re: Re:5 tl;dr
Unless the construction is a fly-by-night operation, it is scheduled months in advance.
As for emergencies, traffic will have slowed and there is usually a large vehicle blocking a lane.
Re: Re: Re:3 tl;dr
They can read speed signs. The database fills the “gaps”, for when you’ve turned onto a street and haven’t yet seen a sign.
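The fallback described here is simple enough to sketch. The function name and values are illustrative assumptions, not any vendor’s actual API:

```python
# Sketch of the fallback above: trust the most recently read sign, and fall
# back to the map database only when no sign has been seen on this street yet.
from typing import Optional

def effective_limit(last_sign_mph: Optional[int], db_limit_mph: int) -> int:
    return last_sign_mph if last_sign_mph is not None else db_limit_mph

print(effective_limit(35, 45))    # a 35 sign was read: it overrides the database
print(effective_limit(None, 45))  # just turned onto the street: database fills the gap
```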
Re: Re: Re:2 tl;dr
The paper is wrong.
Here is the streetview, with an X in green to show the accident location. You can see the accident location from the 45 MPH limit sign:
https://i.imgur.com/oN57tu2.jpg
In the bottom right corner of the image is a vidcap from a video uploaded yesterday, showing that the limit sign is still 45 MPH.
And here is the video uploaded yesterday; the sign is visible around 26 seconds in.
https://youtu.be/1XOVxSCG8u0?t=24s
I don’t see how that can be a 35 MPH zone with a clearly marked 45 MPH limit sign right in front of it.
Re: Re: Re:3 tl;dr
So it does. My mistake; thanks for the link.
Though there is a 35 MPH sign earlier (about 5 seconds in). So it’s likely that the reason the car was going 38 is that it was increasing speed from 35 to 45.
Re: Uncharacteristically poor slanting by techdirt.
You sound like someone that has had positive income. That’s not Uber.
Re: Uncharacteristically poor slanting by techdirt.
First, Uber should be on the road if the car is at least as safe as the human drivers that we allow on the road.
Second, the human driver isn’t a scapegoat, he/she is there to make sure the car is at least as safe as a human driver (which is the standard the law holds us to). If the car was less safe than that standard, it’s the human driver’s fault.
To say that the human driver would have no blame, but Uber does is to say that automated vehicles should be automatically more at fault than human drivers.
Re: Re: Uncharacteristically poor slanting by techdirt.
There is no evidence that Uber AVs are safe. You fail to grasp the “watcher” problem, which makes a human observer much less effective than a normal human driver.
As someone already posted. The human backup is NOT required under Arizona law. So you can’t claim her role is to intervene and make it safer.
The role of such backup/safety drivers is mostly to provide the illusion of safety. A real driver who is already operating the controls of a car will usually take ~1.5 seconds to react to an emergency. Someone who isn’t actually driving the car is obviously going to take MUCH longer to react to an emergency, so there is no way this backup driver can be as safe as a normal driver.
If an AV isn’t considered safe enough to operate on its own without a backup/safety driver, then it should not be considered safe enough to operate with one, because the difference between those two is mostly illusory.
Re: Re: tl;dr
No it isn’t.
No "automatically" about it; he’s saying that the automated vehicle is more at fault in this specific instance, given the facts we know.
I think that’s a fair read. His argument, as I understand it, is that the pedestrian appeared too quickly for a human driver to react, but the autonomous car should have done something — it should have recognized the presence of an obstacle and swerved or braked before impact.
The automated vehicle is not automatically more at fault than the human behind the wheel; it’s more at fault because this appears to be a clear case where the LIDAR or the image recognition malfunctioned.
Let's Pretend Autonomous Vehicles Aren't Involved...
Driver A and Pedestrian 1 aren’t named and we know nothing about them. They are both fully insured with similar tiers of coverage. Which insurer pays?
That’s who we say is at fault. Not because it is true, but because that choice determines how road-legal AVs can be in the future and how insurable they are.
They are legally responsible for the vehicle. That’s the whole point. They can try to blame the pedestrian, but they can’t blame Uber’s technology.
Re: Re:
That is a problem. If Uber can pawn this off, then, like the military doing test flights, they can go out into the uninhabited desert where only cacti can be killed.
Re: Re:
The whole point is to be a scapegoat when something goes wrong?
Because if you are watching a car drive, you are really in no position to take over fast enough if something like this goes wrong.
Re: Re: Re:
This may be true; companies want it both ways. The seed post argues that the woman in the car is likely the only one legally responsible, as stupid as that will be, but bet on it happening.
Re: Re: Re:
Uh… to prevent things from going wrong. But it turns out:
Even if the driver did have some legal responsibility, it doesn’t mean Uber wouldn’t.
It’s common for Driver’s Ed cars to have a brake pedal in the passenger seat, and these do get used.
Re: Re: Re: This is a watcher problem
The difference there is that the person with the brake pedal is actively planning to use it during a run with a student driver. The observer in the driver’s position is not. (The passenger of your Driver’s Ed car isn’t either, when the driver is known to be experienced.)
The person in the driver’s position in the car has been watching the car do its thing for however long, uneventfully. There is little or no traffic, no sign of people. While it isn’t time to break out a novel, there’s nothing to warn a person that a second’s inattention is going to be crucial.
And even if there were, "the car should handle it". You have to go through the recognition that "oops, the car isn’t handling it".
It isn’t humanly possible to be prepared to do something "instantly" for hours on end. And there’s no guarantee that braking by itself would have been enough.
Re: Re: Re:2 This is a watcher problem
So… we don’t know that the state of Arizona expected the “driver” to do anything. What was Uber’s view of her responsibility? Was she expected to keep her eyes on the road, or was she looking down at some Uber software? If the former, Uber should have employed gaze detection to make sure she’s paying attention. That camera was aimed at her for a reason, right? They have much more video than us and can look for patterns of attention over her history.
Re: Re: Re:2 This is a watcher problem
Thanks for getting it. The reaction time of an actual engaged driver to a surprise event is something like 1.5 seconds.
It will obviously be worse from someone who is essentially monitoring the car. So it would be completely unfair to expect them to react as a normal driver would.
Re: Re: Re:3 This is a watcher problem
Why are we calling them the “safety driver” then? It seems to me that’s exactly what the public expects of them. “Normal drivers” aren’t necessarily “engaged” either. You don’t get away with killing someone because you had cruise control on, or Tesla’s autopilot.
Re: Re: Re:4 This is a watcher problem
Misnomers and bad assumptions don’t change the realities of human reaction.
The person in the seat is NOT the driver. The car is doing all the driving.
The person is monitoring the car. We should call them Autonomous Vehicle Monitors.
AV Monitors are effectively passengers. If an emergency happens, the car is supposed to handle it. The monitor is not pressing the brakes every time the AV gets close to something. They are really assuming the AV will hit the brakes because the AV is driving.
If an emergency happens an AV monitor would have to:
1: Recognize it. But since they are in relaxed passenger mode, this would almost certainly take longer than for an actual driver, who would be more engaged.
2: Recognize that the car isn’t going to handle it. Because they are used to the car doing all the driving and handling every situation, this could be a long pause.
3: Shift into driving mode, grab the controls, and take action. This will take time because, unlike a normal driver, they aren’t already using the controls.
Regardless of how you try to legislate this, there are many more cognitive and physical steps for an AV monitor to take than for a regular driver, and it will take them a multiple of the time a regular driver would need to react.
Thus considering them a safety element in any fast developing emergency is absurd.
As I said in a previous post:
If an AV isn’t considered safe enough to operate on its own without a backup/safety driver, then it should not be considered safe enough to operate with one, because the difference between those two is mostly illusory.
IMO, given the failure by this Uber platform, it does NOT meet that safe-enough standard, and shouldn’t be permitted on the road with or without a backup/safety/monitor in the seat.
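The staged reaction the comment describes can be put into rough numbers. The per-stage times below are assumptions for illustration; only the 1.5-second engaged-driver figure comes from the thread.

```python
# Rough arithmetic for the three AV-monitor steps. Stage times are assumed,
# not measured; the 1.5 s engaged-driver figure is the one cited in the thread.
MPH_TO_MPS = 0.44704
speed = 38 * MPH_TO_MPS  # ~17 m/s, the speed reported elsewhere in the thread

engaged_driver_reaction = 1.5  # seconds, for an attentive driver at the controls
monitor_stages = {
    "recognize the event": 1.0,              # assumed: relaxed passenger mode
    "realize the car won't handle it": 1.0,  # assumed: the "long pause"
    "grab the controls and act": 0.5,        # assumed: hands not on the wheel
}
monitor_reaction = sum(monitor_stages.values())

print(f"engaged driver: {speed * engaged_driver_reaction:.0f} m travelled before acting")
print(f"AV monitor:     {speed * monitor_reaction:.0f} m travelled before acting")
```

Even with generous assumptions, the monitor covers tens of metres more than an engaged driver before anything happens, which is the whole point of the “watcher problem”.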
Safety
I want autonomous vehicles to be safer than human drivers, preferably a lot safer, before they’re allowed in the wild.
Re: Safety
Technically they are already far safer than a human driver.
Re: Re: Safety
Wait 100-odd years, then we can talk. Or even wait until they are not confined to limited-traffic suburbs; try again with 10,000 of them all over a major city. The count is going up very fast if this one is typical of the state of the art.
Re: Re: Re: Safety
Do you also want a guy with a red flag walking in front of the car, just in case?
Re: Re: Re:2 Safety
Don’t forget, you must stop and disassemble said vehicle at all intersections and reassemble on the other side.
Re: Re: Safety
No, there is no singularity with AI used to perform safe passage of a vehicle, because the AI knows not why it is going, where it is going, or what it will do once it arrives at any particular location. AI has no conscious awareness of who it is; it is only a mechanised system reaching conclusions.
Re: Re: Safety
Ars Technica have done the math and claimed that Uber’s cars kill people at 25 times the rate of other drivers.
What I don’t get is how people are partially blaming the victim for wearing dark clothes. Shouldn’t the LIDAR be able to pick up people in dark clothes, even at night? How many other automated cars are running similar tech that can’t pick out someone in dark clothes like that? That seems like a problem that should’ve been worked on and fixed before letting these cars loose on the open roads.
When you ask if we’ve had our first pedestrian death by AV, the answer isn’t “Kinda? Maybe?”, but simply “Yes.” I don’t understand the need to be so wishy-washy about it.
And honestly, I’m sure that quite a few people will justifiably be turned off of AVs completely if clothing choices factor into your chances of getting run over.
Re: Dark clothing is still dark to infrared LIDAR…
…and RADAR is little help when it’s tuned for vehicles and solid obstacles. They should look into military antipersonnel RADAR such as these:
https://en.wikipedia.org/wiki/Man-portable_radar
No doubt they can add a simple stripped-down version of these to detect pedestrians and wildlife at low cost.
All people should be careful crossing streets in dark clothing, because even good drivers and the best AVs will still hit them. ;]
Re: Re:
“What I don’t get is how people are partially blaming the victim for wearing dark clothes.”
That’s as much for the people claiming that it performed worse than a human driver, I think. While you can talk about whether or not she should have been visible with the other sensors, the fact that a human driver would also not have seen her until it was too late is an important thing to consider. That changes the discussion from “why didn’t the AI perform properly” to “why didn’t the AI perform better than we expect people to perform”, which is a different question.
“When you ask if we’ve had our first pedestrian death by AV, the answer isn’t “Kinda? Maybe?”, but simply “Yes.” “
Define “by AV” first. If you mean “an AV was involved in a fatal collision”, then yes. If you mean “an AV caused a fatal collision”, then no. That’s an extremely important distinction for this discussion.
“I’m sure that quite a few people will justifiably be turned off of AVs completely if clothing choices factors into your chances of getting run over.”
Do they have the same reaction to the other deaths across the country that week where people say the victim should have been more visible, or just the one where a scary new technology was involved? There’s a reason why high visibility clothing is being considered as mandatory for cyclists in some countries.
Re: Re: Re:
This accident is looking more and more easily avoidable every day.
https://arstechnica.com/cars/2018/03/police-chief-said-uber-victim-came-from-the-shadows-dont-believe-it/
The sensors should’ve handily seen the pedestrian in these conditions, but they didn’t. What the hell went wrong?
Re: Re: Re: Re:
We’ll find out, and maybe people will stop trying to second guess everything before all the information is known. We don’t actually know what the sensors on the car picked up yet, nor the full timescale of the car’s reactions to it.
But, it’s notable that the narrative has now changed to the other sensors now that it’s clear that a human driver wouldn’t have reacted much differently to the footage in the video. Meaning that we’re now criticising the car for not being superhuman enough, rather than saying it was worse than a human would have been.
RED ALERT! RECORD-BREAKING ZOMBIE! "BAlbrecht" or "Bruce A."
SEVEN AND HALF YEAR GAP! LAST SEEN IN 2010! https://www.techdirt.com/user/balbrecht
HA, HA! — OH, NO, NO ASTRO-TURFING ON THIS SITE!
This one is unusual because it makes several comments, but below those here, right on the first page, it goes back to 2010!
Only real question is whether I’m the only human here!
Re: RED ALERT! RECORD-BREAKING ZOMBIE! "BAlbrecht" or "Bruce A."
Whoops. Laughing so hard confused “Bruce” with “PeterScott”, which has only a 15 month gap, though many on this topic so MAY be human.
Anyhoo “Bruce” is all the more remarkable for ONE comment after 7.5 years!
And again, is most likely that these zombies will show up on Timmy’s pieces, and they all seem to have his views. Weird, huh?
Disingenuous expert?
“If I pay close attention, I notice the victim about 2 seconds before the video stops,”
I have checked this, and that claim seems very disingenuous. I used a stopwatch and timed it on multiple runs.
I get 1.0 seconds on average from the time I see the first visible indication to when the video freezes.
1.0 and “about 2 seconds” are significantly different. Different enough that I question the motivations of an “expert” who is that far off from the truth when making public statements.
Check it yourself.
Normal driver reaction time to a surprise event is 1.5+ seconds. There is no way in hell an average normal driver would even touch the brakes before hitting her.
I am not trying to exonerate Uber. The car should NOT be limited to the visible beam of the headlights. LIDAR should have picked her out of the shadows. The car failed. The fleet should be grounded indefinitely.
Hopefully the official investigators will ignore the media circus around this.
It probably would have been best to say nothing, and not release video until after the investigation was concluded.
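The stopwatch claim can be turned into distances. This sketch assumes 38 mph (the speed reported elsewhere in the thread) and the 1.5-second surprise-reaction figure cited in the comment:

```python
# Why ~1.0 s of visibility leaves no room for a human to brake. The 38 mph
# speed and 1.5 s reaction time are the figures cited in the thread.
MPH_TO_MPS = 0.44704
v = 38 * MPH_TO_MPS   # ~17 m/s
visible_for = 1.0     # seconds, this commenter's stopwatch estimate
reaction = 1.5        # seconds, typical surprise reaction time

gap = v * visible_for  # metres covered while the victim is visible
print(f"victim visible for roughly {gap:.0f} m of travel")
print("driver brakes before impact:", visible_for > reaction)
```

If the victim is visible for less time than the reaction time itself, the brakes are never touched, regardless of braking performance.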
BMW/Mercedes/Audi infrared would have spotted this cyclist
https://www.caranddriver.com/comparisons/night-vision-systems-compared-bmw-vs-mercedes-benz-vs-audi-comparison-test
Re: BMW/Mercedes/Audi infrared would have spotted this cyclist
It’s not really the seeing her that was the problem, it was predicting her actions and reacting to them before a collision. Again, let’s stop moving the goalposts and see what the data says about what actually happened.
Re: BMW/Mercedes/Audi infrared would have spotted this cyclist
None of the thermal systems apply brakes for collision avoidance.
They just put a video image on a screen. Thermal cameras tend to be too low resolution to provide meaningful data for self driving cars.
The only way to stop bad programming with a car is good programming with a car.
Pro-tips for Techdirt Zombie Killers (TM):
#1: This gets latest, starting on ALL over a million comments:
https://www.techdirt.com/comments.php?start=0
#2: The lite mode lets see all comments, though apparently loses useful gravatar so can’t tell among the many ACs, ALSO, sets a cookie so you’re stuck in that mode:
https://www.techdirt.com/?_format=lite
#3 Most important: don’t take this site seriously! It’s just entertainment, like pro-wrestling.
Fascinating how so many people seem to imagine that technology just appears fully formed, with no mistakes during its creation.
Don’t get me wrong, this is tragic. Somebody died. As did thousands of other people over the course of human history to give us everything we take for granted today.
Trial and error sometimes means big fucking errors, which ideally result in big fucking leaps forward.
Re: Re:
“Fascinating how so many people seem to imagine that technology just appears fully formed, with no mistakes during its creation.”
Beta testing shouldn’t happen in a production environment.
Re: Re:
Think of how much quicker new drugs could be introduced if we used public testing. There might be some “big fucking leaps”!
I’m comforted by the apparent fact (from reading the pearl-clutching diatribes from so many) that pedestrians are never hit or killed by human drivers, that it is never the pedestrian’s fault even though they’re wearing black clothes, crossing outside the crosswalks in an unlit area on a pitch-black night, on a 45mph stretch of road. Every human driver would have avoided that poor woman, she’d be alive today! Right? Right??? Yes, technology likely should be able to detect every ignorant fool but it didn’t in this case. That doesn’t mean the technology should be abandoned, just modified.
Re: Re:
…do you have any examples of anyone in this comment section actually saying any of those things?
Where did all these self-driving car apologists come from?
Yes, cars driven by people kill pedestrians & cyclists all the time. And sadly, juries filled with other car drivers won’t convict even really bad drivers.
But this situation in no way excuses the self-driving car companies and their lobbyist-paid politicians who removed their liability.
I’m not interested in self-driving cars that also want to kill 30,000 people each year; I’d like to see self-driving cars do a heck of a lot better than human drivers.
But I don’t want them on the public roads while they’re still practicing with their “L” (“learner”) tags.
Roads are for cars
Look both ways before you cross a road.
Re: Roads are for cars
Look for pedestrians when you drive down a road.
Re: Re: Roads are for cars
The driver will not be alert nor paying attention in any half-usable setup, so it is just another point of failure. After all, when suddenly needing to make a snap judgement, he is at least as likely to make the wrong one as the right. This seems like an obvious failure in the AI or sensors. The bike should have been obvious to radar or LIDAR, and the car should have reacted accordingly. A failure in the sensor design, the sensors, or the AI? We likely will never know, as the DMCA makes examination of the code illegal!
Weren’t self-driving cars supposed to eliminate, or at least reduce, stupid deaths?
Re: Re:
That’s one of the aims. Given that this is literally the first death related to such cars, this doesn’t change that aim.
Again, I’ll note that nobody knows the details of the 10 other deaths in that city over the same week, because they’re so ordinary and haven’t really been reported on. This death is international news because it hasn’t happened before.
Cars And Roads
About five hundred years ago, Leonardo da Vinci, the universal genius of the Italian Renaissance, drew up a plan for a city with two sets of streets at different levels: one for walking, and one for vehicles. He was presumably aware of Venice with its canals, but this was a dryland version, suitable for employment in, say, Milan.
An automobile is part of a transportation system, together with the roads it runs on, the gas stations, the drive-throughs, the parking lots, etc. Changes in one part of the system inevitably dictate changes in other parts of the system. This means that the ultimate owner of land, the government, is necessarily a key partner, in fact, the leading partner. You cannot do an electric car or a self-driving car as a tech product. You must have government sponsorship and large sums of government money to pick up the loose ends. Almost anything built under the tech-entrepreneur system tends to develop a critical shortage of physical public goods, because the greed of the tech-entrepreneur is incompatible with the spending of public money.

The major limiting constraint on computers has become the internet, and Comcast has become one of the most hated companies. I think that most Techdirt people would agree that we cannot put up with Comcast much longer, that the telecommunications system, which undergirds the internet, needs to be socialized and placed under the control of public water authorities and the Post Office. The internet is too important to be run by businessmen who want to control everyone. However, considered as civil engineering projects, telecommunications networks are kid stuff compared to roads. There are lots of ways you can snake a cable which is only half an inch in diameter and can easily be bent to a six-inch radius. Anything involving automobiles is much messier.
A smart car is going to require a smart road, with lots of built-in electronics. An electric car is going to require an electric road, with built-in power supply wires. These things are not particularly impossible. It’s not usually a major issue for public railroads, such as American commuter railroads or European national railroads, where everything belongs to one government entity. However, such things cannot be done with Elon Musk or Travis Kalanick as sole beneficiary and proprietor. The State of Arizona allowed just about anyone to start running self-driving cars on the public streets, without regulation, but the state was not prepared to meet the expenses of building pedestrian over-passes on a large scale, say for every road where the speed limit is at least two-thirds of that on an urban freeway. The state swallowed the tech-entrepreneur kool-aid, and abdicated its responsibility to organize transportation for the public good. You cannot build public works as private luxury goods. They have to be for everyone, because everyone has a vote, and people driving two-thousand-dollar “beaters” cannot be expected to vote for roads which require an eighty-thousand dollar car to use.
Often, change of technological systems involves military sponsorship. The first jet airliner, the Boeing 707, was also known as the KC-135. It was an air-refueling tanker to go with the Boeing B-52 bomber. The United States Air Force acted as the general patron and sponsor of jet aircraft. The United States Army was the institutional sponsor of the Interstate Highway system.
Unlike an airplane, an automobile, or a train, does not usually have a very good field of view for the path it is traveling on. This means that sensors need to be located where they do have a good field of view. This might mean mounting cameras/radar/lidars on telephone poles or lamp poles, where they can see things in a much more unambiguous way than sensors mounted in a vehicle three or four hundred feet away. The basic means by which railroads avoid collisions at grade crossings is that a track detector located half a mile or a mile from the grade crossing identifies a train, and then a signal is transmitted, which causes gates to swing down, blocking automobile traffic. In the case of high-speed trains, various additional refinements are necessary, but the basic principle remains the same.
Smart cars work fairly well on freeways. The lanes in opposite directions are isolated by barriers, and the intersections by overpasses. It is rare for the relative speed between vehicles in the same roadway to be more than 20 mph. Freeways are already much safer than ordinary streets.
What can work, given sufficient money, is “slow-fast,” that is, slow off the freeway, and fast on the freeway. Set an off-freeway speed limit of, say, 20 mph for arterial streets; 10 mph for secondary streets; and 5 mph for parking lots, campuses, etc. However, on a controlled interstate, let the speed limit be 80-90 mph. Extend the freeways far enough that slow speed, off the freeways, becomes acceptable. You can build a 20-mph interstate, that is, a limited-access road with tighter curves, steeper ramp grades, and shorter merging zones, so that it fits within the space allocated for an arterial street. Because cars do not have to stop, the 20 mph freeway will have the same net speed as a 40 mph conventional road. Such a road would typically be dug into a trench below grade level, in much the same spirit that subways are put underground.
Alternatively, you can provide a car with an information system which permits it to “time” traffic lights, that is, to arrive at a light just when the light is turning green. A system like this would probably still require pedestrian overpasses, because people simply don’t move fast enough to clear an intersection within a couple of seconds.
The basic problem with self-driving cars is that they have been mistakenly edited into the framework of what a Silicon Valley company can do. They are forced to pretend to human-level artificial intelligence which they do not have. Even now, a computer with the power of a human brain would approximately fill a warehouse. The “brain” in a self-driving car is more on the level of a cockroach. Insects aren’t real bright. You can manipulate them with a flashlight.
Remember how good human-assisted AI was going to be? The best of both worlds /s
Re: Re:
Did anyone say it would be perfect? If not, then what you’re being sarcastic about is demonstrably happening. This is still the only AV related death. How many involving human drivers have happened in the time since this one?