I don't disagree with your point, but I do at least slightly disagree with your perspective. This isn't something like an unavoidable feature of the future that is going to affect a subset of society. It is that we are fundamentally doing the wrong things.
Even in your own situation, and in many of ours, there are people struggling with education, housing and stability. So even if you are keeping up with the future in the positive sense, you aren't protected from the negatives. Something like UBI isn't going to fix the fact that stability has increasingly become a valuable commodity to be traded.
This isn't unique, however. It happens to some degree every time there is a shift: people start capitalizing on the positives, and the negatives eventually catch up. The greatest shift is in how to handle those negatives. That is really how to keep up with the future.
Once there is a solution to how people can get, for example, education, housing and stability, you can automate everything you want. But it often requires giving up the former for the latter. You can't have a high cost of living and automate away all the bullshit jobs without consequences, because the bullshit jobs are paying for that cost of living.
I could never subscribe to such a dystopian outlook on the future. Sure, we could augment ourselves with tech, but that will not ultimately stop things like automation in transport. Millions of jobs are set to be made redundant and there is no clear replacement for them. No amount of tech augmentation is going to fix that. More importantly, we are still trying to understand what the majority of humans would do if their basic needs were cared for. That's the purpose of studies like this. Humanity has never truly broken free of its evolutionary bond to fighting for the resources it needs to survive; it has just changed what resources it fights over. This is truly uncharted territory, and UBI is currently the only viable idea on the table to deal with it.
Is it me or does it feel like the door to economic advancement is closing faster and faster every day?
In this thread people are suggesting UBI, but I can't see that as anything more than an economic uncertainty (trying to predict how market forces will respond and adapt), which will be exploited by the oligarchy that is already exploiting the systems we have now (as you might expect anyone to). To think we could successfully create a good UBI system despite the fact that we can't get other very important pieces of governance right in many parts of the world seems ludicrous.
People are suggesting that new jobs will be created as jobs are automated but there's no proof of that except history and this time seems different -- humanity has never had the tools it has now, the step change in ability/utility over the last few decades is insane. In a world with high levels of automation, where does advancement come from? Up until now, it's mostly been differentiation through some form of work. Do creative endeavors become the new work/only way to differentiate if we spend most of our time on leisure? But then what if someone automates that away too?
It seems like once we evolve past work, the economic system and the big and small players in it will freeze in place. If you had enough capital to be on the right side of the split, you get to stay on that side; if you didn't, you seem all but condemned. Dystopia seems inevitable. I'm not saying that I think work is good in and of itself, but it's disconcerting to think about what happens when that particular music stops. It's run so much of the world up until now.
Another thought: the good ol' revolution/riot path is going to get closed down in a few decades, I think. Weaponization of autonomous machinery will happen, and the efficiency of oppression will reach untold heights. Security will be a highly sought-after skill, and those most able to pay for it (with resources/money/whatever) will be able to purchase it.
AI is here and its use will increase no matter what you desire. It's really no different than any other technology. As a species, we've consistently been getting more and more efficient in most aspects of our lives. How that translates into the well being of the many is a completely different topic. Being more efficient is ultimately only a good thing. But that doesn't mean it will lead to a utopia.
Our problems always have been, and always will be, each other. You point to AI as a problem, but we all should really be looking in the mirror instead. It's up to all of us to solve the social troubles that will arise from continual increases in efficiency.
UBI is one idea that might help prevent a severe imbalance of resources in society. But it alone won't be enough to solve our social problems, and I don't think most people claim it will. It's a start.
If this is really something that keeps you awake at night, be a part of the solution and help us think of realistic ways to address these issues. Going backwards in time isn't an option.
The problem with envisaging how current technological trends will shape society in the future is that we try to work it out rationally, when people mostly take what they're given without thinking about it too much. The ability to listen to and watch recorded artistic performances - music, plays - ought to have decreased the value of live performances, but tickets to those performances are now more expensive than ever.
The comment on AI parenting isn't as revolutionary as it sounds: wealthy people raising their own children (rather than employing a dedicated nanny) is a 20th-century innovation, at least in the West. But I think people will feel unsettled about their children being raised by robots, regardless of the quality of the tech - Norlands College has little to worry about, I think.
But the central issue of this article is the issue I have with Universal Basic Income. If a person's labour is worthless, then the State has to provide some sort of income for them by taxing the business for which they would otherwise have worked. If this becomes ubiquitous throughout the economy, then the State becomes a dominating force in the economy, and the principles of the market economy start to break down. The idea that nothing can be produced without labour is a fundamental assumption to all economic models, and without it, we're in very dangerous territory.
Currently, the economics don't stack up. How does society handle it when the best jobs are automated? When all jobs are automated?
The Star Trek post-scarcity utopia scenario feels very unlikely; Mad Max-style scrapping for leftovers while Musk, Bezos, et al. live behind walls feels infinitely more probable.
How do you implement UBI when a huge proportion of the political class is vehemently against it? Maybe we need AI politicians, so they start to figure it out?
Things look pretty dystopian as it stands, and if concentration of wealth continues, we'll speed down that dark road even faster. If not UBI, what other solution is there? Especially as automation sweeps through cognitive work?
Every political faction has an "idea" about UBI. That's the core of what's wrong with UBI. People don't want their income to be subject to political whims, and change every election cycle.
The problem is that, before automation, people had something to trade for their daily necessities: their work. Work was tied to each individual, and that empowered people. Now, with automation, people will have nothing else to offer, but will still have the same needs as before. So people lose their influence and become subject to the whims of whoever decides the quantum of the UBI. What if they set it too low? What can a person do against the state in that case?
I have been thinking about this problem, and came to the conclusion that the only way to assure people's future is agriculture and self-reliant industry. If corporations won't hire people, people need to be "hired by the land". I see cooperatives being formed where people buy land and cultivate their own food, possibly using technology, even robotics, that is in the public domain and can be used freely.
In the long term it will be essential that AI and robotics be implemented in the public domain; otherwise only the big corporations will reap the benefits of automation. Remember what happens when a concentrated source of wealth appears: with operating systems, the Windows monopoly; with search, Google; with social, FB; with oil, the Arab countries (where huge social problems appeared as a result). People need to be in control of their sources of income. UBI is just a promise from the state and "the 1%" that we will not be left to starve. But can we trust them? We need to become self-reliant.
As a side note, a number of technologies will be essential for self-reliance, such as: solar, water filtration, 3D printing, robotics & AI, agriculture (including the right to create seeds, which has recently been usurped by big corporations), open & free education, generic drugs and, of course, open source software.
I didn't say it would be undesirable! Just that it would completely alter the world in ways that we cannot prepare for on an individual basis.
Ideally, we move to UBI and, as you say, everyone lives like an aristocrat. But we don't get there by trying to hedge against AI taking over our individual jobs.
If I truly believed that automation, especially in the short term, would include UBI or some other social safety nets, I'd be far more optimistic about this.
I'm nervous that in the short term, fewer people will continue to get very rich, and more people will fall out of the middle class.
Yes, it is easier to find the dead-end roads when speaking of future societal changes. It is easy to panic about massive global unemployment in the face of automation. The fact that a conclusion is easy to come to doesn't make it any less likely. The patterns we see now are based on recent, past trends. What we can't see, or extrapolate from what we've experienced already, are the effects of accelerating exponential change in the tech sector. What I get from this article is the beginnings of the political rationalizations for denying basic guaranteed income. When your spreadsheets say there isn't a problem, when your economists tell you that work for humans is actually increasing, it is much easier to ignore the needs of the people on the ground.
I have been wracking my brain about this problem for a few years now, and I still can't seem to find the essential leverage point human labor would have against automated, robotic labor. I once thought it was our ability to synthesize and generate novel solutions to problems that would keep us in a dominant position within the economy, but that is quickly turning out to be a naive delusion. I don't fully understand the direction neural networks and machine learning are moving, but it looks like they will match us in those "uniquely human" areas of cognition very soon.
I have always thought there is an optimistic version and a pessimistic version of this.
The optimistic version is that technology creates enough wealth to provide a basic income for everyone when the value of most human labour is diminished. The interesting part here is who 'everyone' is, because in our imagination it rarely includes impoverished countries.
The pessimistic version is that technology could already create enough value for everyone, but what it really does is it increases inequality.
The value of automation might be captured by a few large companies and a technological elite in a few countries, which have no incentive to support the rest of humanity other than to avoid a bloodbath. Combine this with vastly increasing populations in developing countries, plus resource scarcity and climate change causing mass migration movements in various regions, and I just do not see how this can really work out peacefully.
For those who have seen the movie Elysium: I always feel that this is already happening, just that Europe and the US are Elysium, and it is only about to get worse.
That doesn't make it inevitable. We could create regulations about when and what automation is acceptable.
That reality seems more likely to me in the present climate than a maybe more ideal solution like basic income - though I think the most likely outcome is huge (and increasing) swaths of the population languishing in permanent poverty.
I see a shallow analogy here, one that doesn't hold up on close inspection.
To me, human activities from which we can earn a living wage feel like nomadism at the edge of an ever-expanding region of agriculture (technological automation, in this case). When we lose some activities to automation, we've always found new ones, until now. In the end, though, there were no more pastures for the nomads to move to, and there will be no more new activities from which humans can earn a wage (not to mention the satisfaction of accomplishing something hard). And while there might be a future with UBI for everyone, the transition seems rough and exploitative.
I think the people that presume their intelligence will inform how automation plays out will be sorely disappointed when the future is nothing like they expected. To be honest, I don’t want to make any predictions about automation but I’m highly skeptical of the pessimism that dominates the discussion and fatalism around things like UBI.
I agree with your assessment. It was this realization that encouraged me to start learning about alternative economies; binary economics is one alternative I've explored. In the end, we live in an era of increasing scarcity (especially with regard to food, water, and land), and automation will continue to erode the job prospects of humans moving forward. In the meantime, medicine is working towards increasing our longevity. Personally, I hope we find a peaceful resolution to this mess, but it does seem to be a recipe for disaster.
People bemoan Kurzweil's vision of the future, but we may be forced to virtualize ourselves to survive peacefully. Otherwise, I want to be in the country with the strongest military.
I don't think this will be a likely outcome. No change happens instantaneously worldwide, so the very period of deployment (or even its possibility) will, I believe, turn this system into robots working for the rich while most of humanity starves.
So, while I fear a different future, I'm also firmly in the basic income camp. The right to live should not, in principle, depend on slaving your life away, so let's deal with that when the technology allows us to.
Certainly, but that's not the difficulty. The difficulty is not in stating the problem, but imagining trying to solve it.
Problem: some will suffer emotionally and psychologically by being underemployed or unemployed.
Solution: There isn't one. Welcome to reality.
At one extreme, we have a negative utopia run by a techno-elite that designs, builds and installs an army of robots to relieve everyone of dull, repetitive, boring jobs, and thereby produces an unequal distribution of wealth even more extreme than what is true at present.
At the other, we have another negative utopia in which no one goes wanting and resources are distributed according to Marxist ideals (from each according to his abilities, to each according to his needs). I think the second has already been tried, and failed, and I don't think the automation of labor will change the outcome.
Between the negative utopias, we have some version of reality, without the ability or the will to force any extreme solution onto the public. We'll just muddle along and adjust to the future, to the degree that we can foresee, and deal with, its effects.