I don't know what the law says about this in the US, but when I was taught to drive (in the UK), I was told not to assume that someone signalling would actually do what they were indicating, and that I would be responsible for whatever happened if I took action in reliance on that signal.
Not sure if that's the law in the UK, or just my instructor teaching me to drive defensively.
It's basically the same here. You'll often see people with a right signal on who look like they're about to turn into a shopping center, and then they don't: either it's the wrong place or they forgot their signal was on. I wait until I'm sure they're actually turning first.
I used to get people trying to trick me into turning in front of them all the time when I drove the company car for my last employer.
It usually went like this:
Me: making a right turn out of a parking lot, I see a car with its blinker on, presumably turning into the place I'm leaving, so I creep forward a few feet.
Other car: sees me creeping forward, thinks I'm going to turn in front of them, and floors it.
It happened so many times that I don't even move until they're already halfway through the turn.
My favorite one:
Me: making a left turn onto a main street
Other car: making a left turn onto the side street I'm on
Other car: waves me out
Me: just looks at him
Other car: backs up a bit and waves me out
Me: just looks at him
Other car: backs up a bit more and waves me out
Me: "I see where this is going..."
Other car: puts it into reverse to continue the dance
Me: floors it as soon as they shift into reverse
Other car: puts it into drive as fast as they can and tries to ram me
A Prius is a lot quicker than people think... dude's lucky I was in the company car. If someone did that to me in my personal vehicle, I'd have gotten out at the next light and slashed their tires -- can't really do that in a car with the company phone number on the side in big lettering.
While it sounds like the human driver was likely at fault in this situation, I wonder if she made eye contact with the self-driving car's "safety driver" before the collision. If she assumed he was operating the vehicle and could clearly see her, she would have expected him to slow down. This is the cultural birthplace of the Pittsburgh Left, after all.
After reading up on the Pittsburgh Left, that needs to be eliminated from driving culture ASAP. If there are intersections that require it, simply add a protected left.
In most of the regions where the Pittsburgh Left is common (I grew up in Massachusetts, and thought it was just how you drive until I moved out to California), adding a protected left is not feasible. The roads are single-lane, barely the width of a single car, and property rights extend right up to the side of the road. (Similarly, in many cases sidewalks are infeasible because there's always some homeowner along the road who refuses to cede their land.) The properties themselves are usually residential, and there will be hundreds of them alongside the road.
It's very different from the 3-lane boulevards with wide setbacks, protected turn lanes, and commercial zones all owned by the same developer that you'd find in the West or South. The roads in the Northeast U.S. are former cow paths, and they are about as wide as you would expect a horse-drawn wagon to be. (They also don't form any sort of grid system and frequently meet at 5- or 6-way intersections, which is a whole other issue.)
Yep, if you make a left turn in front of oncoming traffic it's pretty much always your fault even if the robot was driving the other car.
Though, in this case, it's probably cheaper for Uber to fix the other car than pay all the lawyer fees, because you know there are going to be lawyers involved.
There will be legal fees, but if Uber is confident they can win it, it might be a good investment just to defend the reputation of self-driving cars.
If they settle out of court, it leaves room for people to wonder if they settled because they had to. If they proceed to court and win, then they can say, "Our algorithm didn't cause this crash, and an impartial jury confirmed that."
The damage was on the front-right bumper of the left-turning car. The only way I can see the left-turning car not being at fault is if the left-turning car was in the leftmost lane and the right-turning car tried to merge into the leftmost lane as well (rather than the closest lane, as required).
There was no right-turning car. The Uber car was going straight. The woman in the other car (making the left turn) argued the Uber car had its right signal on, but it's not apparent whether there's any evidence to corroborate that. Uber has stated that the car's left turn signal was on to indicate an upcoming lane change.
According to the article this was a 4 lane road, and she was in the furthest lane to the left of the Uber driver. The Uber driver was in furthest right lane, with a left turn signal on, preparing to change lanes.
So, she cut across two lanes to crash into oncoming traffic. I'm not sure of the Pittsburgh driving laws, but generally you're supposed to get into your left-most lane before taking a left.
I don't think she has a hope in court, especially since Uber definitely has high quality records of everything that occurred.
> According to the article this was a 4 lane road, and she was in the furthest lane to the left of the Uber driver.
No, she was turning left from a perpendicular street, from the leftmost lane in her original direction of travel into the rightmost lane in the new direction of travel.
> The Uber driver was in furthest right lane, with a left turn signal on, preparing to change lanes.
The last part is a disputed claim, per the article.
> So, she cut across two lanes to crash into oncoming traffic
I'm not sure how you figure that; she ran into crossing (not oncoming) traffic, and I don't see any way of counting where it works out to cutting across two lanes.
> I'm not sure of the Pittsburgh driving laws, but generally you're supposed to get into your left-most lane before taking a left.
She was making a left turn from the leftmost lane; that's not in dispute.
> I don't think she has a hope in court
I don't either, but that's because a car making a left into crossing traffic, unless the crossing traffic is violating a control (stop sign, signal, etc.) or speeding, is almost always at fault because of right-of-way.
You might be right, I was only going off what I could glean from the article, which may have been misleading.
From the article:
"McLemore was on a four-lane road called Liberty Avenue, heading Northeast, while the Uber vehicle was approaching from the opposite direction."
This description led me to believe they were on the same road, approaching each other from opposite directions, since you can't really be going in opposite directions unless you're on the same road or a parallel one.
If that's not the case, I guess the direction they're referring to is the direction in which she was turning to go?
Also from the article:
"'I was not expecting someone to turn from the far left lane into my lane,' the Uber safety driver said."
Based on my understanding of the cars heading in opposite directions, I assumed he was referring to the lane furthest to his left. And he's claiming he was about to change lanes to the left, which would place him in the furthest right lane and her in the furthest left lane.
I had assumed that she tried to turn left at the intersection, from the wrong lane, and hit the car in the process.
I don't want to get into the particulars of this single wreck.
But it does make me think that we're going to need some new laws as we adapt to a mixed use on the roads:
1. The self-driving car & its operating company must retain all sensor and video data. No "oops we lost it" when going to court.
2. There has to be some way for the human driver to know what legal entity is operating the car. It's pretty easy to imagine driverless hit-and-run accidents.
3. Human drivers have to be able to get insurance information from the automated vehicle. (I have no idea how! But humans are able to work out an ad-hoc protocol to exchange insurance info on the spot. Automated vehicles would need to have something predefined.)
Basically, _when_ an accident occurs, the human and the human's insurance company have to have some way to connect with the insurance company of the automated car and its operator.
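To make #3 concrete: the predefined exchange could be as simple as a standard machine-readable payload the vehicle presents at the scene (on a screen, as a QR code, over short-range radio, whatever). Here's a rough sketch in TypeScript; the CollisionNotice shape, field names, and example values are all hypothetical, not any real or proposed standard:

    // Hypothetical post-collision payload an automated vehicle could
    // present at the scene. Nothing here is a real standard.
    interface CollisionNotice {
      vin: string;           // identifies the vehicle (point #2)
      operator: string;      // the legal entity operating it (point #2)
      insurer: string;       // insurance carrier (point #3)
      policyNumber: string;  // policy to claim against (point #3)
      timestampUtc: string;  // when the collision was detected
      claimsUrl: string;     // where the other party files a claim
    }

    // Serialize the notice so it can be shown on an in-car screen,
    // encoded in a QR code, or broadcast at the scene.
    function encodeNotice(notice: CollisionNotice): string {
      return JSON.stringify(notice, null, 2);
    }

    // Example payload a stopped vehicle might present to the other driver.
    const example: CollisionNotice = {
      vin: "1HGCM82633A004352",
      operator: "Example AV Operations LLC",
      insurer: "Example Mutual Insurance Co.",
      policyNumber: "POL-0000000",
      timestampUtc: new Date().toISOString(),
      claimsUrl: "https://claims.example.com",
    };

    console.log(encodeNotice(example));

The exact transport matters less than the fields being standardized, so the human side can file a claim without first having to track down who operates the car.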
#2 and #3 can be resolved, in the worst case, through existing public ownership and insurance registries: any collision involving a self-driven vehicle without a human representative of the owner who can provide registration and insurance info would require a police response, with that information identified from the license plate and/or VIN. (Since this potentially increases public expense for that mode of operation, licensing for it may be more expensive.)
Yeah, I've been around enough self-driving cars to know that they behave a little weirdly sometimes, in a way you'd never expect a human to. We can keep throwing humans under the bus on some technicality every time this happens, because we're engineers and beep boop, everything is black and white, period. But in reality there's a social aspect to driving, with a lot of unwritten rules, that people do not seem to be worrying about as much as they should.