It appears these robotaxis need an emergency button similar to the ones on industrial robots. But instead of completely stopping, a remote operator would immediately take over and drive the car out of the way. Meanwhile, the AI could take a back seat and observe how to respond in a situation like this, because they obviously don't have enough training for emergency situations.
Kia and Hyundai have already made that trivial over the past few years. At least with the tether to their provider and GPS, the robotaxis should be recoverable and disableable.
Maybe instead of adding even more complexity to solve the issues our solutions are creating, we could wait until we have solutions that don't create huge issues. What happens if the thing locks all the doors and manual controls and then drives off a bridge? Or if it catches fire? Personally, I'm barely okay with taking planes; if there were a car that could, at any moment, turn into a prison cell, I'm not going near one of those things as long as I live. Even if they say they've disabled the "feature". Do we really trust these corporations with our lives so much that we're willing to get perma-locked into their test vehicles?
Idk I think I agree with the chief here that they're not ready for primetime if we're suggesting we put a halter and lead rope on an autonomous vehicle.
The closest I've come to this scenario is playing games (e.g. CS:GO) where you can take over a bot. So often, I take over a bot (without spectating its play first) and immediately die because I have no context for its situation and it's about to be shot at; in many of those situations I feel the bot would have done better than I did.
This makes me feel a human operator probably would be no good at taking control... however, a computer that has access to more data (e.g. data from other vehicles), or that has more control (of other vehicles, of traffic lights, of pedestrian crossings, etc.), or more capabilities, might be far more successful than a person?!
I don't see how the situation is similar. The robotaxi is parked in front of you, you connect to it and move it away. The entire context is right there.
well honestly, if we're saying that the person being inconvenienced is the one who gets to take control, there's probably gonna be a lot of angry people driving these cars off of bridges or into ditches or something. Giving an incredibly frustrated person control of a separate car, though better for context, is probably worse for desired outcome.
How about the robotaxi company driving in accordance with regulations and respecting others' property? Then we wouldn't have this discussion. Oh, it's technically not possible? Well, then don't drive them.
A family member was a NY fire fighter many years ago. The Soviets had a lot of people at the UN, and they had diplomatic plates. They would park wherever they wanted, often blocking things. There was nothing the police could do.
The fire chief decided to have fire drills. Whenever a cop saw a car with diplomatic plates parked in front of a hydrant, they would call the fire department, who would come out and perform a drill. Upon seeing the vehicle blocking the hydrant, they would practice breaching through the vehicle to attain access to the hydrant. The vehicle would not survive in a drivable state.
The Soviets complained, but stopped blocking fire accesses.
EDIT: the firechief could solve this by putting truck or police cruiser style bullbars on their vehicles and "carefully nudging" the miscreant vehicles out of the way.
Heheheheh. Bare-knuckled, but I like that approach.
Beyond instructing emergency responders "if it doesn't stay away, just smash its windows" (mentioned in the article, at n=1 scale)... I wonder if robotaxis have some convenient central depot, which might find its driveways blocked by emergency street repairs, or some such.
Damn that's rough. I saw a car parked in front of a fire hydrant at a fire in Queens once. FDNY simply smashed the windows to open the doors, and then ran a hose through it.
That has some "how much pressure can be lost to friction and still provide the necessary water rate at the other end" energy. Bends in the hose increase the amount of friction and thus reduce the flow rate.
One of the challenges with this is that often city/urban fire engines have front bumper discharge lines so that hose can be hooked up to the front of the engine, not just the sides (this is done to improve access options - 1 3/4" hose is fairly flexible, but when you get to 2 1/2" lines or 3"+ supply lines, they need space, and are not overly flexible.)
>EDIT: the firechief could solve this by putting truck or police cruiser style bullbars on their vehicles and "carefully nudging" the miscreant vehicles out of the way.
I know this was tongue in cheek, but it underscores a problem I have with this article title.
The problems, despite the article's title, usually don't involve robotaxis vs trucks, but robotaxis vs active firefighting scenes. The article I read earlier this year referenced cars running over hoses and inching towards firefighters after being blocked by an active firefighting scene.
Most countries have penalty points on driving licenses. If a driver violates traffic rules too often, they lose their driving license! Self-driving cars should not have an exception!
Are you suggesting this on a per-car basis, or all cars working off of the same point system? Accruing enough points to unlock the "suspended license" achievement would seem appropriate to apply to the entire fleet, since it's the same "AI".
Per driving license. It already works like that with truck drivers and other professional drivers. They may have 5x more accidents than a normal person, but they drive 20x more miles.
Truck drivers are individual people making discrete decisions. A robocar company all uses the same training data to make the same results.
Put 10 different human drivers in the same situation, you'd expect a variance of decisions being made. Put 10 robocars in the same situation, you'd expect 1 decision being made. Is this not how they are being tested? Am I crazy to assume this?
Absolutely, however holding the entire fleet responsible for the number of faults that would disqualify a single driver also seems wrong. If a taxi driver drives 8 hours a day, then 10 robotaxis will drive 80 hours in that same day and statistically be subject to perform 10x the number of faults, if we're holding them to the same standard.
Now, if 12 points loses your license, you have 10 cars, and rack up 120 points across the fleet, it seems obvious that on the whole the fleet is performing worse than a fleet of human drivers we would otherwise ban.
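The fleet-normalization arithmetic above can be sketched in a few lines (the function name and the normalization approach are my own illustration, not anything proposed by regulators):

```python
# Hypothetical sketch: scale a fleet's total penalty points down to one
# human driver's exposure, so it can be compared against the per-driver
# threshold (12 points in the example above). Names/numbers are illustrative.

def points_per_driver_equivalent(fleet_points: int,
                                 fleet_hours: float,
                                 human_hours: float = 8.0) -> float:
    """Normalize fleet points to a single human driver's daily hours."""
    return fleet_points * (human_hours / fleet_hours)

# 10 cars x 8 h/day = 80 fleet-hours; 120 points accrued across the fleet.
rate = points_per_driver_equivalent(fleet_points=120, fleet_hours=80.0)
print(rate)  # 12.0 -> exactly at the per-driver ban threshold
```

Under this framing, the fleet in the example is performing exactly as badly, per hour driven, as a single human driver we would ban.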
Not an easy problem, to be sure. On one hand, companies can't be allowed free rein to clog up the roads and cause havoc without consequence. On the other, if we NIMBY the development too much, we will be stuck behind countries that don't. In my opinion, we're near the right balance but probably need to take some short term action to make it clear to companies that a remedy for the current behavior around emergency vehicles needs to be priority 0.
If you take 10 exact same calculators and perform the same operation, you're going to get the same result on all 10. Why? It's a program running on the same hardware. If this hypothetical fleet of robocars is all on the same hardware using the same software, they are essentially the same machine. This is one of the key things we've been told about how much safer robocars will be compared to humans. These robocars are not sentient beings making unique decisions. They all have the same set of logic in them. Your logic just does not compute with me.
Edit: also, it's the same as a recall to me. If one part is deployed to thousands of cars, all of them are recalled. Same applies here for me.
It should be per software. Maybe per software version, if the company can conclusively prove that the prior problems that led to points cannot happen again.
The specific examples listed out in the article are egregious, causing real harm. The fire chief is right to be angry, if anything her response is too measured.
From the article:
- Running through yellow emergency tape and ignoring warning signs to enter a street strewn with storm-damaged electrical wires, then driving past emergency vehicles with some of those wires snarled around rooftop lidar sensors.
- Twice blocking firehouse driveways, requiring another firehouse to dispatch an ambulance to a medical emergency.
- Sitting motionless on a one-way street and forcing a firetruck to back up and take another route to a blazing building.
- Pulling up behind a firetruck that was flashing its emergency lights and parking there, interfering with firefighters unloading ladders.
- Entering an active fire scene, then parking with one of its tires on top of a fire hose.
It would be funnier without the life endangerment, but it is funny regardless. Practical AI can be so smart and so clueless at the same time, it's like watching a kid grow up.
I don't know if massive fines are the way to go, because they could cripple that technology, but firefighters should have free rein to shove those vehicles around when they misbehave, or some way to signal them to get the hell out of the way.
Just lose your license? If I did one of those, I might get sued. If I did more than one, I might wind up in jail. (What charge? Reckless endangerment, if nothing greater.)
Are you joking? Name any jurisdiction in America where you can lose your license for parking in front of a firehouse. There are no points for that in California.
LOL no. Personally I look upon the Swiss "via sicura" regime wistfully. I want people who speed through central cities to go to jail and have their cars crushed.
If the rest of traffic is driving 15 mph faster, clearly it isn't difficult to ignore the people driving slowly. So if anything, surely it's the people who want them imprisoned who should be punished. Their crime? Poor priority management.
You won't immediately lose your license for speeding, even in Switzerland. Afaik it'll just be expensive up to 20km/h over; only after that will the really nasty things like criminal prosecution and car impoundment start (for first-time offenders; repeat offenders are punished more harshly).
40km/h over in a built-up area in Switzerland is not only an automatic fine and loss of license, it is automatic jail time with no judicial discretion. This is the equivalent of 45MPH in a city, for the American readers.
Some states have "super speeder" punishments similar to this, with mandatory jail time if you're over a certain speed (>100mph usually) or just over a certain amount over the speed limit (double the limit, or something).
Either way, none of those address the new problem of robotaxis. If a normal human did this type of thing, you could go scream at them or honk your horn enough for them to figure it out. If it isn't programmed into the robotaxi's logic, it won't do anything, as shown. Or it will possibly do the wrong thing.
You mean parking in front of a firehouse, having it open its doors, having a firetruck come out with lights and sirens, and then continuing to sit there blocking its way while twiddling your thumbs? That would most certainly warrant some kind of criminal charges.
We need to stop allowing computational agents (and thus those who deploy them) to escape blame as if they don't exist, or as if their behavior is just nobody's fault. Computational agents need to be viewed as bona fide actors (as people are), with the actions of the computational agent being considered the willful actions of whomever deployed it.
None of those will lose you your license in California. I'm not sure how to lose your license in California, as I don't know of anyone personally who has lost their license.
There was a case in the news where someone was exceeding the posted speed-limit by 150kph on an undivided highway (i.e. just a stripe between you and the traffic going the opposite direction) and got a 6 month suspension.
I was on the jury for a case where the defendant was involved in his third DUI, fled the scene of the accident, and still had his license.
There's that scene from a TV show where the fire fighters are at a fire, and a car is in front of the fire hydrant. One of the characters calls out "Car!" and they then proceed to smash the windows, and route the fire hose through the car and connect it to the fire hydrant. Throughout the course of the incident, the hydrant leaks and fills the cars interior with water.
If only there were, perhaps, a similarly effective and cathartic response to these things.
Yeah, but imagine the car starts trying to drive away while the fire hose is attached to the fire hydrant. They'd have to disable the vehicle completely in this case, right?
That’s pretty much SOP (there are videos on YouTube), however it still wastes time, and damages lines, and if the car parks on a line you have to waste time shifting the car itself.
When I was a kid back in the late '70s, I saw a handful of firefighters and a few other burly dudes overturn a car that was blocking a hydrant near an active fire. They got the car bouncing, then completely turned it over onto its roof. Then they sorta scooted it out of the way of the hydrant. It's one of my oldest memories, and it was awesome!
Time for Zero Tolerance, if one of these do it seize it, and levy the robotaxi company and whomever built it with Huge Fines. Not "oh that will just be another expense", make it that those fines pierce the corporate veil and get to the executives and the millionaire shareholders. And now that I think of it, if there are any passengers, arrest them. That way they can sue the RoboTaxi company too.
I would hesitate to arrest passengers who can't override the taxi. There are knock-on effects that can't be remediated by suing a robotaxi company, such as loss of a job, violation of parole, or simply not having any money to begin the suit with. [I know it's unlikely that someone wealthy enough to hire a robotaxi for a trip wouldn't be wealthy enough to hire a lawyer, but you never know, and it would be a supremely nasty edge case.]
I understand your point, but I can see the reasoning to arrest the passengers. In summary, the robotaxi company's business model and how its cars work is none of the emergency service's business. It's a car, in the way, with people in it.
If an ostensibly normal car were to block an emergency response, would its occupants be off the hook if they all denied having driven the vehicle to its current location, and said they have no ability to move it out of the way? I doubt it. They would be at the very least charged with something.
The same thing should go for alleged robotaxis—“But we don't control the car we're sitting inside of!”... not the city's problem. The car you were in was blocking an emergency response.
Would this be unfair? Yes it would. Would this make people fearful of using robotaxis? Absolutely yes it would. And would that put economic pressure on the robotaxi operators to fix these issues as fast as possible? Damn right it would.
These robotaxis are operating legally with permits from the city are they not? It's nonsensical to suggest arresting the passengers sitting in a licensed robotaxi for something they have no control over. This is entirely on the city (and state?) for authorizing something that clearly isn't ready for use on public streets.
So by your logic, if I order a human-driven taxi, and the driver does something illegal, I should be arrested along with him? Obviously assuming I didn't threaten and/or bribe him to do that illegal action.
No, this is more akin to hiring an uber, the uber driver does something illegal and the cops arrest you for it with the idea that you can sue uber for the injustice. That's obviously wrongheaded.
I agree with the general concept: scale up standard punishments to be in line with corporations.
* Robotaxi violates traffic laws like blocking a fire engine? Fine of $10,000
* Robotaxi drives into an active crime/fire scene? $100,000 fine
* Robotaxi actively hinders emergency services (e.g. parking on a hose)? Confiscate the vehicle.
Corporations are the same anywhere: until a problem becomes too expensive, they don't do anything about it. So sure, let them drive anytime, anywhere, but if they make egregious mistakes the cost is not "oh darn, a minimum-wage 'automation engagement specialist' will have to drive out to the car and move it", it will be "We were fined $1 million last night due to interactions with emergency services".
> * Robotaxi violates traffic laws like blocking a fire engine? Fine of $10,000
$10,000 seems fair. A blocked fire engine could only delay firefighters getting where they're going. What's the worst that could happen? A few buildings burn down? Somebody has to wait to receive medical intervention?
In the scale of things that’s a small price to pay to uh, eventually avoid small talk with Uber drivers. /s
I don't think these actions take place enough times for your numbers to have any impact on the robo taxis. At that point it wouldn't even be a slap on the wrist.
You'd have to add at least 2-3 zeros before any action would be taken, and the action would likely be something else than what you are expecting. These things just don't happen often enough for this to ever be addressed without the original point of piercing the corporate veil and making management personally responsible for these crimes.
But at that point you'd likely still end up with the dynamic of people getting paid to become the fall guy
> Since Jan. 1, the Fire Department has logged at least 39 robotaxi incident reports.
Assuming they are all the lowest "class" of incident with a $10k fine each, that's $390,000. A not-insubstantial fine, but maybe not a huge impact. Though we know several of these incidents (because they were in the news) were of a more severe class, so let's assume that's half a million in fines at least.
So a million dollars a year in fines just to operate at limited times and in limited locations. Open that up to a lot more cars in a lot more places? Those fines will add up to be quite substantial.
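As a back-of-the-envelope check on the totals above (the incident count is from the article; the fine tiers are the hypothetical amounts proposed upthread, and the severe/low split is my own illustrative guess):

```python
# Incident count from the article: 39 reports since Jan. 1.
# Fine tiers are the hypothetical amounts proposed in this thread.
LOW_TIER = 10_000       # e.g. blocking a fire engine
HIGH_TIER = 100_000     # e.g. entering an active crime/fire scene

incidents = 39
floor_total = incidents * LOW_TIER           # every incident at the lowest tier
mixed_total = 34 * LOW_TIER + 5 * HIGH_TIER  # guess: a handful at the severe tier

print(floor_total)  # 390000
print(mixed_total)  # 840000
```

Even a small number of severe-tier incidents roughly doubles the floor figure, which is consistent with the "half a million at least" estimate above.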
> Time for Zero Tolerance, if one of these do it seize it, and levy the robotaxi company and whomever built it with Huge Fines
This is certainly in San Francisco’s tradition. Given the economic depression those policies are putting it in, one might wonder if chasing away this industry, too, is smart in the long run.
So, by your argument, if people are riding a bus and the driver decides to just cartwheel out the door and leaves the bus parked in front of a fire hydrant, everyone on the bus should be arrested?
Uber has been and continues to be just as egregious in SF. I’ve been hit by two Ubers, both when I was in the crosswalk, and one incident caused me a $500 loss. There are Uber Eats double-parkers who stand in queues all night long making key roads unsafe or impassable. I’ve been in an Uber where the driver was literally trying to brake-check and cause rear-enders and he wouldn’t let me get out until I opened the door while the car was in motion. I’ve messaged Uber support dozens of times and they of course do nothing.
Uber is clearly a different problem, but a much bigger one in the macro.
That isn't a solution. Any time spent dealing with the car is a failure.
> The fire chief said each robotaxi company offers training to help deal with “bricked” vehicles.
> “We have 160,000 calls a year. We don’t have the time to personally take care of a car that’s in the way when we’re on the way to an emergency,” she said.
It’s absolutely insane that automated driving is allowed on public streets.
It should be completely banned until such time as there exists a comprehensive testing regime for validating that the automated system functions in all the scenarios it might be expected to face including emergency vehicles, construction, pedestrians, bad weather, and damaged sensors.
This may not work out, as the fines may be lower than the costs of development to address the underlying issues, while it's at the same time all about basic safety of the community. (The fines would have to be higher than the cost of "oops, we have to employ human safety drivers and put them into every car.")
push the robotaxi out of the way with the engine (in the one way street example; this doesn't work with all of them) and bill the company for damage to the engine's bumper. If you park your (human) car in front of a hydrant they'll break your windows with an axe or a halligan and run the hoses through it and you'll be on the hook for fixing your car.
Simplest thing to do is to confiscate the vehicles and levy fines large enough that its in the financial best interest of these companies to ensure that basic road safety situations are at the top of their priority lists.
The state should confiscate or remove the vehicle, bill them for storage or for the removal operation and also fine the company and take it to court if something goes wrong because of them.
Honestly human drivers in SF do this quite often. How many Door Dash drivers are double parked picking up a food delivery and block an emergency vehicle? SFFD doesn't keep a record of it.
When roads are closed, I see people trying to maneuver around cones and get through as if the rules don't apply to them.
The self driving cars are much safer drivers than a very high percentage of SF drivers.
But the GP was about pre-certification. We do that to some extent in the form of driver’s licenses, but as far as I know getting one is more or less a formality in many parts of the US in particular.
Excuse me? I had to pass a written exam and pass a driving test and maneuverability test in order to get my license. It's insane we're allowing these vehicles on public streets without a licensure process in place. We do that for aircraft, why aren't we doing it for self-driving cars?
Individual humans can be arrested. The exact people found in violation can be held accountable on the spot in a manner that is proportionate and scales to exactly the responsible parties.
What are you supposed to do with a car with no humans around causing a problem for people? Key it? Break the windshield? Wait around for a tow truck to move it? Revoke an entire fleet’s license to drive in the State of California over a specific incident in San Francisco? Or wait until the fleet has tallied up enough incidents for the State to say “that’s enough”?
We probably do have to validate driverless vehicles to a higher standard than the average driver, and maybe a higher standard than the best drivers. We want correctness and accountability, because driverless or manned, an automobile mishandled is a death trap.
EDIT: How about this? The State can pass a law that gives immunity to anyone who breaks into a misbehaving unmanned autonomous car in order to disable the autonomy and take control of it and move it out of the way. No, I wouldn’t limit this to emergency personnel either. Anyone with a Driver’s License can do it. Cruise and Waymo can then decide whether to make this process easier or they can replace a lot of broken windows.
> The State can pass a law that gives immunity to anyone who breaks into a misbehaving unmanned autonomous car in order to disable the autonomy and take control of it
What do you do with vehicles without human controls? Mandate human controls?
A human operator can be yelled at and made to leave. They don't become an unresponsive lump of metal blocking the road when stymied by emergency vehicle activity.
it is clear to me that most if not all of Human Progress will cease to advance as we have gone to the extreme with safety culture.
if we were still using Horse drawn carriages for travel, and someone invented the first Horseless Buggies today it would be banned and never allowed to advance at all. The amount of death and injury from the inception of the automobile would never be allowed in a new industry today. We have rationalized and assimilated the everyday human automobile into our lives, but refuse to accept any risk for something new
If you were experiencing a severe emergency and received delayed care as a result of a self-driving car misbehaving - would you feel the same way?
As you lay there bleeding out would you think "I have contributed to human progress - this is good"
I believe there are ways to advance self-driving without these problems. Why can't they have a 1800 number staffed 24/7 for fire departments to contact?
Internally, your logic is flawless, but I don't see how you can ignore the current delays caused by human driver traffic, not to mention the ambulance traffic created by human driver errors.
Please mind that this is about firefighting. There is a reason why they have the right of way, as this exceeds any individual concerns: would you really rather have another Great Fire because a single car blocks critical efforts, in the name of progress? Also, when it comes to emergencies, there's a certain difference between a human idiot, who can always be coerced into doing the right thing, and a dumb but inflexible mechanism.
I guess I am not seeing your point. We know that human drivers severely impede fire response. This has only intensified in the Uber/Lyft era in central cities like SF and NY. The only known solution is to simply take road space away from human drivers, as we have seen with the lowered fire response times in Paris as bike lanes are being expanded. You can't count on human mass action to get out of the way of a fire truck. Anyone who lives in Manhattan has watched an FDNY truck blast its horn for minutes while drivers just stand there.
Well, it may well be that, as a European, I really don't understand how far that kind of mindset may go. My point was really that, when it comes to an individual blocking communal efforts to preserve a city as an expression of personal freedom and/or commerce, there may always be means to take over control of that car, while that may not be the case with automated systems. (There may not even be accessible and standardized controls for manual emergency override.) If so, this is an entirely different "game", with — so far — no known winning strategies.
(You may want to sort this out before inserting potential random road blocks into a city.)
Oh BS. I'm sure you're aware that when trains were first introduced some jurisdictions required that the train be preceded by someone waving a red flag. 'Safety culture' is not always correct by any means, but any tort law book will supply you with abundant nightmare fuel about what happens when safety isn't prioritized.
This is hilariously ignorant, because that's actually what happened with cars when they were first introduced: amazing tech that caused so many societal issues that all kinds of laws and regulations came into effect.
I'm surprised no one has mentioned the US legal system as part of a solution. If a robo-taxi causes material harm, sue the operator for some multiple of the cost of the harm.
I suppose the drawback to this strategy is that real harm has to happen first and that could easily involve loss of life or limb, but perhaps the threat of that would be enough to motivate the robo-taxi providers to fix the problem.
Yes, but it seems here that the regulators are bending over backwards to dole out permits for these vehicles without bothering to think about the externalities. Still, one would think that 1) the operators could be charged with obstruction of emergency services and/or 2) they could be sued for damages in the civil court.
If the US legal system is so effective, why are there so many personal injury cases? Growing up abroad, US litigiousness was a joke when I was in elementary school (>40 years ago). Other developed countries seem to have significantly lower problems with avoidable mortality; maybe the American approach is just not that great.
> If a robo-taxi causes material harm, sue the operator for some multiple of the cost of the harm.
There haven’t been many cases of actual harm. This article pretty much lists out situations which could cause harm but didn’t, due to factors out of the operator’s or fire department’s control.
These articles always conflate all industry participants but when you dig into it, it's always Cruise that is causing the problem. SFMTA's complaint to the state about Waymo cites 13 incidents in Appendix B involving Cruise cars. The very best thing Waymo could do is lobby the state government to establish strict rules so their own reputation doesn't get diluted by Cruise.
There are a quarter million vehicle crashes in the DataSF Fire Department Calls For Service. Self-driving cars will prevent these. It's the systematically better way to go.
After riding in both Cruise and Waymo cars in SF, I think that Waymo cars are much more road-ready. While my Waymo rides all seemed pretty smooth, albeit with a timid driver, my Cruise rides featured extremely skittish behavior around other cars, missing several turns to avoid being near others, and stopping in odd places, especially for dropoff and pickup.
> There are a quarter million vehicle crashes in the DataSF Fire Department Calls For Service. Self-driving cars will prevent these.
That's a bold claim. Do you have evidence to support it? I'm not talking about results from controlled or ideal driving conditions, I mean evidence showing that self driving will definitely be better at the conditions those quarter million accidents occur in.
Perhaps there's some research I'm unaware of but as far as I know the best we can say is that self driving might end up being safer for general usecase driving, but that there are wide disagreements about how likely that chance is.
It's weird, I'm a developer, and I really want AI to work, but it just doesn't, so I'm generally against it. And it seems it's far, far from working properly.
It's difficult to see domains where AI can really improve productivity without having major drawbacks.
I'm not against research on AI, but as long as science cannot define what intelligence really is, I guess AI will not make major advances.
I do think that, as long as no one has yet invented the sufficiently smart self-driving car, robotaxis will be required to have some sort of override mode, either cell or satellite based; something where a remote human operator can take over, using the sensor data to drive the car until it can get back to situations it can actually handle.
I understand that lag can be an issue, but if speeds are limited to 5-10 mph, that should be less of a problem.
This past weekend in SF, I saw a driver slow and stop to reverse parallel park on Bush Street at Fillmore. An automated Cruise car stopped right behind him, blocking the driver from reversing into the parking spot. The guy got out to yell at the driver to back up, but then saw that it was driverless.
In practice humans frequently neglect to signal in instances where it is legally required (I don’t know if it is strictly legally required in California when parallel parking, but it’s certainly good common sense). Human drivers know that people don’t signal and account for that.
In this particular scenario the reverse lights would be a crystal clear indication of the driver’s intent, even if they failed to signal.
I don’t want to assume the intent of your comment, I don’t know what you’re getting at..
Just wondering because it's not always apparent that a car is slowing to park parallel, especially if they don't signal. The only way I know to make it really clear is to stop alongside the open spot, signal, move ahead slowly and put it in reverse, but half the time humans still don't get the message.
I have never run into this where the human driver didn’t get the hint when I remained in front of them with my reverse lights on. They have somewhere to be so sooner or later they go around me (backing up if necessary). From my understanding, the driverless car in the OP just sat there forever.
>Under the agency’s own rules, issues such as traffic flow and interference with emergency workers can’t be used to deny expansion permits. The resolutions list four “goals” to be considered:
inclusion of people with disabilities
improved transportation options for the disadvantaged
That seems like a solution in an emergency, but these kinds of interactions must happen all the time. And it is part of a wider problem: the baseline data about the world changes and is not communicated in a standardised way. It could be a fire truck, a road closure for a cycling event, a tornado ripping through a town, a terrorist attack, a secret service motorcade, oil on the road, anything. Dealing with those situations is much easier with that data. Making that data explicit could help everyone, not just self-driving cars.
It would be useful if vehicles with sirens/emergency lights/hazard lights could broadcast position and identity in a similar way to ADS-B. Make data open and allow third parties to forward, store and re-broadcast that data. Make the consumer of the data (such as a self driving car) responsible for how it uses that data.
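To make the idea concrete, here is a minimal sketch of what such an open broadcast record might look like. The field names and the `EmergencyBeacon` type are hypothetical, loosely modeled on ADS-B position reports; the point is that the encoding is plain JSON so any third party can store and re-broadcast it, and the consumer decides what to do with it.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class EmergencyBeacon:
    """Hypothetical open broadcast record, ADS-B-style."""
    vehicle_id: str     # stable identifier, like an ICAO address
    vehicle_type: str   # e.g. "fire_truck", "ambulance"
    lat: float
    lon: float
    heading_deg: float
    speed_mps: float
    lights_active: bool
    timestamp: float

def encode_beacon(b: EmergencyBeacon) -> str:
    # Plain JSON so anyone can forward, store, or re-broadcast it.
    return json.dumps(asdict(b))

def decode_beacon(msg: str) -> EmergencyBeacon:
    return EmergencyBeacon(**json.loads(msg))

beacon = EmergencyBeacon("SFFD-E38", "fire_truck", 37.7884, -122.4338,
                         270.0, 0.0, True, time.time())
# Round-trips losslessly through the open wire format.
assert decode_beacon(encode_beacon(beacon)) == beacon
```

The responsibility split matters: the broadcaster only asserts "I am here, with lights on"; a self-driving car consuming the feed is responsible for how it reacts.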
> It could be a fire truck, road closure for cycling event, tornado ripping through a town, a terrorist attack, a secret service motorcade, oil on the road, anything.
Some of these are likely just a lack of training data.
However, I agree that there needs to be a better self awareness of "Cruise has insufficient training data about this exact situation, now we fall back into a safer discovery mode, where we make reasonable guesses and also ask for the help of an operator"
first of all it'd be a dedicated emergency line, like 911 to Operations center direct.
Second of all, if that is not fast enough, then humans are insufficient too. You can't just magically get people to act (a lot of the time they'll do the wrong thing and fail to even recognize what situation they're in). Notice how bumper-to-bumper traffic is pretty bad for EMS; people do not know how to react.
Honking your horn and getting the person to pull away is pretty fast. By comparison it takes several minutes to get through to 911 where I live. Not to mention most humans would know better than to park in front of a fire department driveway or drive through caution tape.
The incidents in the article are inexcusable. But to be fair, I manually reviewed a dozen accident reports on the CA state website and 100% were due to human error in other vehicles or due to issues with an operator of an AI vehicle in manual mode:
https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...
Why are the people operating the trucks not getting ticketed for all this then? These various programs should be accruing points on their license just like a real driver. Cruise would probably be off the road if they did this though.
Some simple infrastructure could help with this. A no-fly zone for robo-stuff that travels with fire engines for instance. Require all robo companies to respect these zones. Radio or something.
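A keep-out zone that travels with the fire engine reduces, on the robotaxi side, to a distance check against the engine's last broadcast position. This is a minimal sketch, assuming a simple circular zone and (lat, lon) tuples; the 150 m radius and the function names are made up for illustration.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_exclusion_zone(car, engine, radius_m=150.0):
    """True if the robotaxi is inside the moving keep-out circle around the engine."""
    return haversine_m(car[0], car[1], engine[0], engine[1]) <= radius_m

# A car about a block from the engine is inside a 150 m zone...
assert in_exclusion_zone((37.7890, -122.4330), (37.7884, -122.4338))
# ...but a car many blocks away is not.
assert not in_exclusion_zone((37.8000, -122.4000), (37.7884, -122.4338))
```

Since the zone moves with the engine, the check just reruns against each fresh position broadcast; "respecting" the zone then means the planner treats the circle as impassable and pulls over or reroutes.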
How do these companies not have a simulator built to help train cars for simulated non-standard scenarios? Have dogs, kids, fire trucks, caution tape etc and run them with 100k variations.
And then do real world staged validation.
Actually why are extensive real world mock scenarios not running 24/7? If a car does something bad, do mock scenarios around it.
Please tell me they are doing these things because it’s crazy if they aren’t.
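The "100k variations" idea is essentially parameter jittering over a set of base scenarios. As a rough sketch (the scenario names, parameters, and ranges here are invented for illustration, not anyone's actual test suite):

```python
import random

BASE_SCENARIOS = ["fire_truck_blocking_lane", "caution_tape_across_road",
                  "dog_in_street", "child_chasing_ball", "downed_power_line"]

def make_variations(scenario, n, seed=0):
    """Generate n randomized variants of one base scenario by jittering
    the conditions the planner should be robust to."""
    rng = random.Random(seed)
    return [{
        "scenario": scenario,
        "variant": i,
        "time_of_day_h": rng.uniform(0, 24),
        "rain": rng.random() < 0.3,
        "obstacle_offset_m": rng.uniform(-2.0, 2.0),   # lateral placement jitter
        "oncoming_vehicles": rng.choice([0, 1, 2, 5]),
    } for i in range(n)]

# 100k variations spread evenly over the base scenarios.
suite = [v for s in BASE_SCENARIOS
         for v in make_variations(s, 100_000 // len(BASE_SCENARIOS))]
assert len(suite) == 100_000
```

The feedback loop described above fits the same shape: when a car fails in the field, add the failure as a new base scenario and regenerate variants around it.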
In any case, given that the cars don’t know how to approach fire trucks, fire hoses, and caution tape, it’s very apparent that whatever they are doing on the closed course is a failure or grossly inadequate.
Of course they are... I'm sure you've seen the dozens of marketing videos of Tesla detection cameras/sensors stopping short for a kid chasing a ball into the street.
Now, like anything else, does test/staging translate to production? Not even remotely. Autonomous vehicle manufacturers claim to be in a monitoring/production phase to compare against their internal testing/staging. And over time we'll normalize autonomous driving and the associated risks.
We used to drive all the nails in the world with a hammer, precariously aimed at our own hands.