Technology always outpaces legislation. It's been that way since the telegraph and earlier. Legislators can't write rules for technology that doesn't exist yet, and the tech changes shape all the time. Once there actually are fully autonomous vehicles on public roads, the rules will come together. For now it's humans behind the wheel, just like every other vehicle.
I think it would be hard to argue that we are a long way from fully autonomous vehicles, given that millions of miles have already been driven autonomously. What remains is mostly regulatory hurdles, psychological hurdles, and fine-tuning.
From one perspective we already have fully autonomous cars; it's just making them safe for humans and fitting them into a strict legal framework for their behavior that needs finishing before they're released to the general public (comma.ai being a publicly available exception).
Look, if self-driving cars can't drive like a human then it's already game over. Of course it's possible to completely rearchitect our transportation systems to make autonomous transportation possible. Hell, we could install a series of movable tracks along the roads that send signals to vehicles that would remove virtually all of the difficulty. If autonomous vehicles aren't feasible without sweeping changes to traffic law then they're not feasible yet.
The thing I've always wondered about is: if governments can put street signs for humans on every road, why can't they do the same for autonomous vehicles? Sure, you could vandalize the signs and wreak havoc, but you can also remove a stop sign or hack a traffic light and do the same now. If a standards committee was formed to develop a spec for autonomous vehicle guides it seems like we could get to full autonomy far faster than waiting for AGI. Maybe you still have to drive on backcountry dirt roads, but wouldn't automating 90% of traffic be an enormous win for society?
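As a rough sketch of what such a standards-committee spec might look like, here is a toy machine-readable "sign" format with an integrity check so a tampered beacon is at least detectable. All field names and the encoding itself are hypothetical, not drawn from any real standard; a real spec would need cryptographic signatures, not just a checksum, to resist deliberate forgery:

```python
import json
import hashlib
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class RoadBeacon:
    """One machine-readable 'sign'. Hypothetical format for illustration."""
    sign_type: str          # e.g. "STOP", "SPEED_LIMIT"
    value: Optional[int]    # e.g. speed in km/h; None if not applicable
    lat: float
    lon: float

    def encode(self) -> str:
        """Serialize with a short checksum so corruption is detectable."""
        payload = json.dumps(asdict(self), sort_keys=True)
        digest = hashlib.sha256(payload.encode()).hexdigest()[:8]
        return f"{payload}|{digest}"

    @staticmethod
    def decode(message: str) -> "RoadBeacon":
        """Reject any beacon whose payload doesn't match its checksum."""
        payload, digest = message.rsplit("|", 1)
        if hashlib.sha256(payload.encode()).hexdigest()[:8] != digest:
            raise ValueError("beacon payload failed integrity check")
        return RoadBeacon(**json.loads(payload))
```

The point of the sketch is the comment's own argument: the vandalism risk already exists with physical signs, and a digital format can at least make tampering machine-detectable in a way a stolen stop sign is not.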
Perhaps an unpopular opinion, but I think the tech is more than good enough that most cars should be autonomous already. The reason they aren't comes down to public perception, regulation, the difficulty of changing tradition, and people's threshold for acceptable safety.
It seems like most people would rather wait until autonomous cars are far better than human drivers, while not truly acknowledging that most human drivers are awful. Sure, I don't want people hurt or killed, but I think the field could have made more progress "in prod", so to speak.
I think the closer we get to the point where all vehicles are autonomous, the easier it will get. The interim is the hard part. Right now autonomous vehicles have to be able to operate alongside human operators whose ability to communicate their intent is limited to brake lights and turn signals, and there is basically no way to cooperate. Now imagine a world where it's all autonomous vehicles who are constantly communicating intent and can cooperate in marvelous ways, like with orderly zipper merging or intersection management without the need for traffic signals. There will still be a need for computer vision to detect pedestrians, obstacles, and wildlife, but it will be simpler than having to deal with other human operators.
The whole purpose of the new autonomous vehicle law is to allow development of the systems in real-world conditions, before they can get to the next level of driving without a driver behind the wheel as a backup.
Starting to think that autonomous driving should be limited to roads people are already restricted from being near (e.g. freeways) until the technology is rigorously proven safe. It doesn't seem like that would take much to implement on top of the current technologies in the wild.
I can't see fully autonomous vehicles in my lifetime. Everyone will chime in to tell me about advances in robotics, ML, sensor technology and fusion, how many miles Google cars have driven, so on and so forth. I have approximately 30 more years before departing this mortal coil.
Roads are tough. Sometimes they're covered in snow, ice, or rain with poor visibility. Sometimes they're well constructed; other times they're in disrepair. And that's just the technical side of the issue.
The social side (at least in the US) is much more difficult. In the foreseeable future, any AV will have to share the road with "classic" cars, driven by humans of modest computing power. Trying to anticipate how they will interact should be interesting to say the least. And in our hyper-litigious and hyper-protective society, the clash between AVs and traditional cars will be difficult to resolve.
It's really quite sad that the regulation of autonomous vehicles has been so slow to come along. Public roads are filled with other drivers, passengers, and pedestrians who did not consent to be part of a large-scale beta test of partial driving automation that could fail at any time. I believe this is a case where self-driving software should be illegal by default until proven safe. Most companies in this industry, thankfully, seem to be moving carefully and rolling out their products conservatively; Tesla seems to think "move fast and break things" is an appropriate motto for 5,000 lb projectiles on public roads.
If autonomous cars become more common, this seems like a problem that would solve itself. If more cars on the road are following the letter of the law, then people will stop being caught off guard when they encounter a car actually doing so.
So maybe we need high-definition maps, more than just cameras for sensors, and tight, clear government regulation in order to get to fully autonomous driving?
Are you suggesting any developer of autonomous driving systems can straight up ignore this law, as long as they make sure the system requires the presence of a human?