
It seems like these cars will have to be operable in both the manual and self driving modes. Otherwise the car will become much less flexible.

Examples: 1) immediate unplanned stop at a yard sale 2) drive "off road" to get around an obstruction 3) dealing with unstructured parking situations 4) avoiding emergency stops in unsafe locations 5) driving through puddles (is it 2 inches or 2 feet deep?) 6) etc.

And for some significant transition period the road will be populated by both manual and computer driven cars.

How does the hybrid system work? Won't many people take advantage of "dumb" cars? How would you drive if you knew many cars were computer controlled? Would people figure out how to "game" the known computer driving rules? I don't pull out in front of cars that are too close, because the human might not stop in time. Maybe I wouldn't worry about it if I knew the computer was in control of the other car.

Long haul freeway driving does not seem too complicated. But what about high density suburban and city driving?




Self-driving cars will always have limitations. It's just not possible to make them ready for all kinds of road, or even off-road, surfaces. Fast-forward into the future: if human-driven cars get banned, there will be no reason for self-driving cars to have manual controls, and the car's only option will be to refuse to go into a situation it can't handle.

I very much believe in gradual improvements and that is what many large car manufacturers are doing. I can easily imagine not needing to control my car on a highway, in a city, or when parking it, but I expect the manual controls will stay there for a very very long time.


One way to roll out the tech is to do a hybrid system. Instead of selling self-driving tech as a convenience, the tech could be used to backup human driving. If the car sees I missed a stop sign, it could stop.

The goal shouldn't be a self driving car better than humans, but a self driving car better than a human driven car with automated safeguards.


The thing is, in the real world, you cannot rule out that there are cars on the road that don't have the required hardware or software. Maybe in some future there will be no such exceptions, but driverless-car technology is not going to wait for that. For the foreseeable future, the solution must unfortunately be a hybrid (human / old tech / new tech).

I don't expect self-driving cars to drive much more densely than current cars, though, because all, or almost all, of the reasons we drive with separation still apply to self-driving cars. Where you might think they gain an advantage, there is a matching disadvantage, like being able to compute exactly how risky their behavior is. The people making the cars and writing the software aren't going to want to stand in court and explain why their car joined a dense 50-car train, leaving no allowance for the possibility of the brakes going out at that exact moment (i.e., with no previous sign of brake failure), and caused a huge crash because none of the cars had any buffer for failure.

Personally I expect self-driving cars to drive a great deal more conservatively than humans as a result, and I think they are going to end up demanding a lot more space and buffer on the freeways. And if they don't at first, one day you'll wake up and they suddenly all will, because somebody won a lawsuit and a software update doubling the buffers was pushed overnight.


Would I trust a driverless car with current technology to navigate a busy parking lot? Not a chance. Would I trust it with the 40 mile commute I had, 38 miles of which were 6 lane highway? Probably. Would I trust it to make that same commute at times of low traffic when I'm likely tired myself? Absolutely. Not all driving needs a human's full attention, and the situations where a human doesn't need to pay full attention can be handled better by a computer which is immune to many of the distractions and weaknesses of the human mind. It's all situational and difficult to figure out, but the proper response is to keep plugging away at the parts of the problem you can gain some traction on and take what you can get.

Sure, I think we're in complete agreement here. But the idea raised in the article that we're going to see driverless cabs on the Las Vegas Strip? I really don't think so. The day when we can trust a completely driverless car to wander round on its own without hitting anything is a helluva long way off.

In the meantime my main concern is that the hybrid human/computer driver might wind up with the worst features of both. It'd be much like an airliner, which can pretty much fly itself from origin to destination 99% of the time, but needs a carefully trained pilot for the occasions when something goes wrong... except that when an unexpected situation crops up on the road it requires a split-second decision, not the several minutes which airline pilots usually get.


Self-driving cars don't deal with either scenario very well yet.

Packing more self-driving cars onto roads is an appealing notion, but I don't think it holds up to scrutiny. Even if a computer detects hazards instantly and continuously negotiates with and anticipates the movement of other vehicles, the physics of moving cars remain the same. Safe stopping distances won't be dramatically different. Road surfaces, stray animals, mechanical failures, and any number of other hazards don't care who or what is driving a car.
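To put rough numbers on that: stopping distance is dominated by physics, not reaction time. A quick sketch (the friction coefficient and reaction delay are illustrative assumptions, not measured values):

```python
# Rough stopping-distance estimate: reaction distance + braking distance.
# Assumes constant deceleration of mu*g on dry asphalt; mu and the
# 0.2 s "computer" reaction delay are illustrative numbers.
def stopping_distance_m(speed_mps, reaction_s=0.2, mu=0.7, g=9.81):
    reaction = speed_mps * reaction_s          # distance covered before braking starts
    braking = speed_mps ** 2 / (2 * mu * g)    # v^2 / (2*mu*g) from basic kinematics
    return reaction + braking

# Even with a near-instant reaction time, a car at highway speed
# (30 m/s, roughly 108 km/h) still needs on the order of 70 m to stop.
print(round(stopping_distance_m(30.0), 1))
```

The braking term doesn't shrink no matter how fast the computer reacts, which is the point: faster perception only removes the first, smaller term.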

Driving itself is straightforward, but the inputs are messy and noisy. Computers aren't good at that.

Consider the wide variety of different road surfaces and cambers, the ever-changing appearances of obstacles according to conditions and the seasons, the limited accuracy of road maps, and the constant changing of the road network in minor ways. I expect a lot of driverless cars to be flummoxed by potholes, confused by temporary roadworks, utterly bamboozled by temporary diversions - and they won't be able to find my road in the first place. (I don't live in the middle of nowhere.)

As a simple matter - how will the car reliably know how fast to go? You can't rely on map data, as the legal limit can change, and nobody will think to tell the map people. You can't rely on the car spotting speed limit signs, as people can (and do) graffiti over them, or twist them so they're not straight any more. I don't think people will be so keen on "driverless" cars after they're held up by a whole train of them going 30mph on a 60mph limit road, or after they're in an accident with one going 60mph in a 20mph residential area.
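For what it's worth, one conservative policy would be to fuse map data with sign detection and default to the lower limit whenever the two sources disagree or the detection is shaky. A sketch, with a made-up confidence threshold:

```python
# Toy speed-limit fusion: trust a clearly read sign over possibly stale
# map data; otherwise take the more cautious of the two sources.
# The 0.9 confidence threshold is an illustrative assumption.
def choose_speed_limit(map_limit_mph, sign_limit_mph, sign_confidence,
                       min_confidence=0.9):
    if sign_limit_mph is not None and sign_confidence >= min_confidence:
        return sign_limit_mph          # clear sign wins over the map
    if sign_limit_mph is None:
        return map_limit_mph           # no sign seen: fall back to the map
    return min(map_limit_mph, sign_limit_mph)  # unsure: err on the slow side
```

Note this bakes in exactly the failure mode described above: a defaced or twisted sign degrades the car to the cautious (slow) choice, which is how you end up with a train of them doing 30 on a 60 road.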

Perhaps I'm being overly cautious, but I just don't think this will work very well. I can think of two outcomes. The first will never happen, because it involves simply letting the computers kill and maim and cause accidents, on the assumption that the overall accident rate will be lower. But then who will be to blame for each accident? People need somebody to blame, so they can be taken to court and maybe sent to prison.

The second option is that you require a human to be in attendance all the time, ready to take over the controls when the computer gets confused. Which means it's not driverless, and that makes the whole exercise a pointless waste of money. If you need a driver, you might as well class the whole thing as an amazing high-tech set of driving aids. That is probably what we'll end up with, I suspect.


They've made a self-driving car. The merely annoying parts are going to be the fun bits! Everything else is going to be difficult, if not bordering-on-impossible. I find it very difficult to believe that there is any technical barrier to computer-operated indicators.

I don't think machines currently have enough contextual information to be able to drive fully automated. Case in point: the limitations mentioned in the article.

I don't think you can compare human cognition and driving abilities to current AI/Hardware, since the roads and all the information to drive are geared towards humans.

This being said: I do believe fully automated cars will come, but the 2 conditions I mentioned are prerequisites (in my opinion).


I think there is a potential for them to exceed human performance in all driving conditions. Machines can perceive and react much faster than humans, and once the perceptual and control problems are solved, all of them will become uniformly "good" at driving in difficult conditions.

BUT (and there's always a "but"), the problem is exponentially hard, and it can't be solved with today's technology.

I agree with you that widespread adoption will require modification of roadways, and potentially also segregation of autonomous and human-controlled traffic streams.


It isn't the normal driving that makes self-driving cars so difficult. It's the edge cases, which will be multiplied significantly when a large fraction of the cars are self-driving. It will take decades, not "5 years from now" (which I have been hearing for years), to get these systems to work.

A better analogy is anti-lock brakes. When they were first introduced, many drivers were absolutely adamant that no computer could possibly have the nuance and feel to outbrake a human driver on difficult surfaces. Now, it seems utterly obvious an algorithm sensing wheel slip at 1000Hz has a better ability to maintain traction than a human driver with reaction times measured in the hundreds of milliseconds.
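To make the comparison concrete, the heart of ABS is a fast loop that modulates brake pressure to keep wheel slip in the band where tyre grip peaks. A toy version of that loop (the slip band and pressure step are illustrative, not real calibration values):

```python
# Toy ABS-style slip controller, meant to run at ~1 kHz.
# Slip = (vehicle speed - wheel speed) / vehicle speed; real ABS logic
# is far more involved, and the 10-20% target band is illustrative.
def abs_step(vehicle_speed, wheel_speed, brake_pressure,
             slip_lo=0.10, slip_hi=0.20, step=0.05):
    slip = (vehicle_speed - wheel_speed) / max(vehicle_speed, 0.1)
    if slip > slip_hi:
        return max(0.0, brake_pressure - step)  # wheel locking up: release
    if slip < slip_lo:
        return min(1.0, brake_pressure + step)  # grip to spare: reapply
    return brake_pressure                       # in the sweet spot: hold
```

Even this crude loop, run a thousand times a second, reacts two to three orders of magnitude faster than a human pumping the pedal, which is why the "no computer can outbrake me" argument collapsed.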

Self-driving cars won't suddenly arrive on the market fully-formed. The technology will be introduced incrementally as a series of driver aids, to supplement the range of driver aids that are already standard (ABS, TC, ESC). We're simply seeing the acceleration of a trend that has been happening for years - computers taking over control from the human driver.

Several manufacturers now offer collision avoidance systems that can automatically apply the brakes based on RADAR sensing. Mercedes offer an adaptive cruise control system that can match speed with the car in front and steer through corners to stay in lane.

The self-driving car will be preceded by the uncrashable car.


I think that's the key. If self-driving cars ever reach a level of ubiquity, it's going to be on large, open roads and highways. Urban areas will still require a human driving.

This is the thing that scares me the most: if self-driving cars need to cooperate for maximum efficiency, what happens if some cars don't cooperate and exploit that?

I'm as incredulous as the next person that a mostly self-driving car will be available by 2017-8 and that this will be it.

That said, on the way to full autonomy I can foresee a car that can handle 99% of the driving itself but calls on the driver to help out during the other 1%. In such a scenario the car may know roughly where it is and isn't safe to drive, but not exactly where, or when.

Examples include the 1-lane, 2-directions-of-traffic rural roads. Unmapped car parks and private premises. Navigating around roadworks. Basically all the 'edge-case' scenarios people often cite.

The car could guess the route and you use a joystick to control the forward speed. You can use the joystick to change the proposed route as you drive. And if you wanted to leave the detected 'safe area' ("no, I really do want to drive into that car - it's a tow truck and I need to get on it!") you acknowledge a bunch of ominous warnings and control the car directly, albeit at a very low speed.


If the self-driving vehicles are also tied into a common communication and planning system, so that all the self-driving vehicles in an area coordinate with each other, such a system could be designed to accommodate people who want to take manual control.

Suppose, say, you and a friend would like to do some street racing against each other. You could tell the system, and it could make some minor adjustments to traffic timing to clear a course for you and your friend to race on, and then switch your cars to manual control. (Actually, it wouldn't have to clear your whole course. It really just needs a buffer around your cars).

During the race, the self-driving system in your cars could monitor your driving, and take over if you are starting to lose control, where "lose control" is defined as put the car in a state that the self-driving system cannot safely stop from.

If the self-driving systems are good enough that they are better at extreme driving than you are, then you'd be able to push your limits in manual mode, because the self-driving system would only have to intervene if you've pushed beyond your skill level and would be heading for a crash.
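That intervention rule could be phrased as a "safety envelope" check: the system takes over the moment it can no longer guarantee a safe stop from the current state. A toy version based only on stopping distance (mu, g and the safety margin are illustrative assumptions):

```python
# Toy "safety envelope" check for the takeover idea above: intervene
# when the braking distance, padded by a margin, exceeds the clear
# distance ahead. mu and the 1.5x margin are illustrative, not tuned.
def should_intervene(speed_mps, clear_distance_m, mu=0.7, g=9.81, margin=1.5):
    braking = speed_mps ** 2 / (2 * mu * g)  # v^2 / (2*mu*g)
    return braking * margin > clear_distance_m
```

A real monitor would also have to reason about steering authority, surface grip, and other traffic, but the shape is the same: manual control is allowed only while a recovery maneuver still exists.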


Let's not forget that all of the things you list are things that humans are pretty bad at. A self-driving car doesn't need to perfectly handle strange situations, it just needs to handle them better than most people, which is a much lower bar.

Not saying we'll get there anytime soon, I don't know well enough to say. But I find that people in these discussions tend to overestimate the skill of most human drivers on the road.


I have been saying this for years. Even if self-driving cars handle 99% of situations correctly, they don't become feasible until they can do the last 1%, which is 100x more difficult than the previous 99%. The reality is, if the car drives itself, people will be doing other things in the car and will be unable to respond to emergencies.
