I think people who are born into it will be fine with it. You probably get inside elevators without second thoughts. If you're the age of the typical HN reader, elevators have been Level 4 autonomous since you were born, so you don't think about it.
With planes I think most people realize that the trained pilot is a better pilot than they are, so they feel more at ease giving up control to someone that is better than them. People just need to see that similarly, software drives cars better than humans.
I can't imagine anyone stepping onto an autonomous passenger aircraft for a long time, not until long after cars and smaller commercial aircraft are regularly driven by computers only. And even that seems like a stretch. I don't see how pilots' use or non-use of some automated systems is going to influence popular opinion on that one way or another. But maybe there's a perception among pilots that it will.
The public aren't really aware of how much of a flight's control is handled by a computer and how much by the pilots themselves. Behind closed doors, literally.
I'm keen to see self-driving cars, but there will be a lot of opposition. Lobbying from trucker unions and the like will push negative stories and pressure decision makers, for example.
The temptation for abuse seems too great to me. You have to tell a third party (or multiple - the government, law enforcement, your employer, your insurer possibly, or anyone who signs up for the inevitable Google app which plots your driving history) where you want to go and now they not only know who you are, but where you're going and where you've been. Someone else has complete control over the software which determines whether or not you get there (assuming these cars don't have some sort of manual override).
Yes, currently large aircraft operate more or less the same way (and everything is fine until something happens and the pilot realizes they don't actually know how to fly a plane because they've never had to). But people won't be taking planes to the store, or to work (for the most part) or the voting booth or to the hospital or police station or an AA meeting. The intersection of surveillance with physical restriction seems too much. I've got nothing against the technology, I just don't want to see it in widespread adoption. I don't want people in general to become trained to ask permission from black boxes to take them where they want to go.
Though I readily concede I probably seem like someone railing against the evils of the Web in the 90s, since a lot of people on HN think they're brilliant and probably a few actually work on them. It just seems very Orwellian to me.
I think for the most part, we do. The pilots are part of the "theater of security" - the computer does most of the work. Many people wouldn't ride in planes without a human behind the wheel. And this fear of machines is the biggest hurdle to auto-autos.
I find myself in the strange situation where I both agree and disagree with this sentiment.
The source of disagreement can be described as "time" - I consider not only the past, but the delay between when a technology is first described and when it actually changes our lives. It all takes time, and in the time to come, there is a chance that autopilots will take more responsibility from the pilot. Would you really be confident that in 20 or 30 years' time, pilots will have the same job description as they do today?
Of course, a pilot in a passenger plane has a lot more responsibility than simply piloting the plane. For example, dealing with passengers or talking to ground control.
Autonomous trucks will not take the driver's seat overnight - the truck in question will be in a decade-long testing phase, according to the article. What happens afterwards is anyone's guess.
I'm actually in complete agreement! What sticks out to me is your assessment that the flight environment is "embarrassingly simple from an automation perspective", which I agree with as well (as compared to cars). And yet despite that simplicity and decades at it, we still run it with an incredibly robust infrastructure to have a human oversee the tech. We have super robust procedures for checking and cross-checking the automation, defined minimums and tolerances for when the automation needs to cease operating the aircraft, training solely focused on operating the automation, etc. But with cars, we are somehow super comfortable with automation that severely alters behavior in a split-second, super poor driver insight or feedback on the automation, no training at all, and a human behind the wheel who in every marketing material known to man has been encouraged to trust the system far more than the tech (or the law) would ever prudently have you do.
I'm with you that they are super different, and that the auto case is likely much, much harder. But I see that and can't help but think that the path we should be following here is one with a much greater and healthy skepticism (and far greater human agency) in this automation journey than we are currently thinking is needed.
As someone who isn't afraid of flying (except in the case where the pilot has to suddenly drop the airplane due to turbulence), I still wouldn't trust a pilotless airplane until it's safer than one with a pilot, which will be very difficult to prove. I feel also that I'm not alone, and many people who fly would be extremely wary of a pilotless system. On the other hand, I would be much more likely to trust a self-driving car. The main reason is that, if a self-driving car malfunctions, the risk of my death is present but not absolutely guaranteed, and with all of the safety features in a car it's likely I would still be alive after a crash, even at high speeds. A plane that has an autopilot malfunction, with no human present and no changes to crash safety, would be essentially guaranteed to kill everyone on board. Until a plane can fall from the sky without killing all passengers in most scenarios, I wouldn't fly in a pilotless airplane.
I'm infinitely more comfortable with an AI-flown plane than an AI-driven vehicle. The issues the plane has to deal with are going to be much more predictable than the issues a vehicle has to deal with.
Certainly things like the Gimli Glider are better for having a human at the helm, but those sorts of things shouldn't happen.
I largely agree with you, however auto-pilot has existed for decades. I don't foresee AI taking over the cockpit anytime soon, but a compromise is likely. Perhaps it'll allow for copilots with less overall experience than copilots today.
It's not. I got my pilot's license and fly in a technically advanced aircraft, and all that it means is that there's quite a bit of automation there to help you out. The lessons imbued in it, the issues you learn exist, actually using it in very critical phases of flight, etc., builds an appreciation for both the wonders and dangers of automation.
Going through that experience has 1000% made me more wary of autonomous vehicles.
How about no control? The only way people are going to adopt the idea of mainstream flying vehicles is if they are fully automatic. In a way, they should be easier and safer than self-driving cars, because there are far fewer obstacles to care about.
I see fully automated planes as a kind of "proof of stake" problem. Right now the person that makes the fly/no fly decision has his butt on the plane, he has a stake in a successful flight. I would have a problem with that call made by someone sitting on the ground.
I know it will eventually become the norm, and that there are incentives for airlines to make sure flying is safe, but a part of me can't give up the comfort of knowing I'm on a plane with the person who made the call to fly it.
The political problem is that people still trust humans a lot more than they trust machines. So passengers would be nervous about being flown by an AI, even if the AI was safer.
There's some weird psychology behind this: human pilots/drivers/etc are seen by passengers as a personal proxy, with agency over any situation.
If you take away the proxy you take away the illusion of agency. Humans really do not like being put into situations where they believe they have no agency at all.
As AI gets smarter, this will become more and more of a problem, until there is some kind of cultural shift - once AI is so obviously safer that it's not even a question any more.
I could not feel comfortable as a passenger on a commercial airliner without a human on board that could take over and fly the plane manually. There are all kinds of failure scenarios where a computer, AI or not, would get confused. Even just the specter of malware is enough for me to expect a human being, that values their own life, is able to take over.
At this point I think it is fair to say that pilots are only kept around for passenger comfort. Automated aircraft have been in the skies since the 1940s, and as computers became more powerful we have reached a level at which flying without computer aids is not only difficult, but literally impossible for many new models (aircraft as far back as the F-16 in the '80s are impossible to fly straight by hand).
Given the strict and well-defined procedures governing commercial air travel, I see it as only a matter of time until air traffic controllers are giving commands with keyboards instead of their mouths. That is, until automated air traffic control completes the circle.
Automated air travel is, by almost all accounts, a solved problem. None of the ambiguity that plagues self-driving vehicles exists in the air. Not only does the plane have basic location (INS + GPS) and attitude knowledge, but it also has access to synthetic terrain databases and live weather radar, combined with hundreds of ground-based navaids offering NDB and VOR/DME services, and to top it all off most commercial runways have ILS installations that can bring the airplane within inches of the centerline and perfect glideslope.
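To make the ILS point concrete, here's a minimal sketch (not real avionics code) of how an automated approach monitor might turn localizer and glideslope angular deviations into a go/no-go check. The full-scale deflection figures (roughly 2.5 degrees for the localizer, 0.7 degrees for the glideslope) are typical published ILS values, used here as assumptions:

```python
# Illustrative sketch only: converting ILS angular deviations into
# cockpit "dots" and a crude stabilized-approach check.

def ils_deviation_dots(angle_off_deg: float, full_scale_deg: float) -> float:
    """Deviation in 'dots': 2 dots corresponds to full-scale deflection."""
    return 2.0 * angle_off_deg / full_scale_deg

def within_tolerance(loc_off_deg: float, gs_off_deg: float) -> bool:
    """Go/no-go: both needles within half-scale (1 dot) deflection.

    Full-scale values (2.5 deg localizer, 0.7 deg glideslope) are
    assumed typical figures, not taken from any specific installation.
    """
    loc_dots = ils_deviation_dots(loc_off_deg, 2.5)
    gs_dots = ils_deviation_dots(gs_off_deg, 0.7)
    return abs(loc_dots) <= 1.0 and abs(gs_dots) <= 1.0

print(within_tolerance(0.5, 0.1))  # small offsets: stabilized -> True
print(within_tolerance(2.0, 0.1))  # well off centerline -> False
```

The point is that the "inputs" to automated flight are already clean, quantified signals like these, which is exactly what self-driving cars lack.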
It is a bit of a shame as someone who has wanted to be a pilot for a long time. I still want to get my PPL, and I wholeheartedly believe GA will be alive for decades to come even after the Boeing drones come out, but at this point I can't see it as being a valid career option.
I will concede the point that I am not qualified in aircraft engineering, just in autonomous systems, which is a kind of meeting ground here. I have seen way too many people getting over enthusiastic about the capabilities of autonomous systems of all kinds, which I think is a real danger here.
It is strange you quote Air France 447 as an example of some kind of superiority of automation, when in fact its problem was caused by eliminating physical force feedback between the pilots by (over) automation.
Genuinely would love for you to say more. As a (relatively newly minted) pilot, my head goes the opposite ways. All the negative externalities and issues around self-driving cars make sense to me, but I end up thinking that these are all simpler and easier dealt with in the flying case. What am I missing?