This is also a great argument for closed-source platforms like Apple. When quality standards are high, bugs are impermissible. When open source lets anyone patch their own bugs, it's easier to let them slip through the cracks. It reminds me of the Israeli day care center that had far more tardy parents after instituting fines for late pick-ups: the fine communicated "being late is OK, you just have to pay a fee" rather than enforcing behavior through an unspoken social contract (here, enforcing quality through the reputation of a high-caliber, premium platform).
"If that were a problem in reality, the markets would be punishing companies where that happens."
Quite the opposite: markets have been rewarding it for some time. The richest companies mostly shipped buggy software. What got them revenue was everything but flawless quality. Then, once their customers were locked in via other tactics, the customers kept paying as long as the software continued to work and switching cost too much. They also often patented anything that could block competitors.
Even quality-focused customers often want specific features even if they lead to occasional downtime, along with fast releases that improve on those features. I think Design-by-Contract with automated testing can help plenty there while keeping the pace necessary for competitiveness in a lot of product areas. The markets don't care about perfection, though, and a company's priorities had better reflect that.
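(For readers unfamiliar with Design-by-Contract, here is a minimal sketch of the idea: hypothetical Python, with the contracts written as plain assertions rather than via a dedicated library.)

```python
# Minimal Design-by-Contract sketch (hypothetical example; a real project
# might use a library such as icontract, but plain assertions convey the
# idea). The contract is checked on every call, so any automated test
# that exercises withdraw() also exercises the contract.

def withdraw(balance: int, amount: int) -> int:
    # Preconditions: the caller must pass a sane request.
    assert amount > 0, "precondition violated: amount must be positive"
    assert amount <= balance, "precondition violated: insufficient balance"

    new_balance = balance - amount

    # Postconditions: the result must be consistent with the inputs.
    assert new_balance >= 0, "postcondition violated: negative balance"
    assert new_balance == balance - amount, "postcondition violated: conservation"
    return new_balance

# Usage: a unit test only has to feed in inputs; contract violations
# surface as assertion failures rather than silent corruption.
assert withdraw(100, 30) == 70
```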
This is not a software quality problem, this is a business efficiency problem. We could write commercial software with no bugs, but it would be very expensive (slow process, complicated tooling, etc). It's far cheaper to accept some rate of bugs and pay some schmucks to be on call nights and weekends.
Everything exists in an economic context. Errors in plumbing, dentistry, and electrical work can result in thousands or hundreds of thousands of dollars in damage, and those trades often can't roll out a fix in a matter of hours. Some software bugs can also result in lots of damage (financial/health data security, etc.), and we do take those bugs seriously via compliance (although compliance isn't a guarantee of good software, just as it's not a guarantee of good dentistry). But no one is going to pay for the effort required to perfect every software release, nor should they.
Some industries have decided that software really does matter, and go to greater lengths to make sure it works.
It'd be annoying if Things for iOS crashed and lost all of my data. It'd be horrifying if flight control software crashed and all aboard a plane were killed. It stands to reason that some software is, and should be, held to higher standards than other software. But it probably doesn't make sense to hold all software to the same high standard, since shipping avionics-grade software is extremely time- and resource-consuming. Do folks really want to shell out a few thousand dollars for a copy of Things for iOS?
And some companies do already take responsibility for open source software. In aerospace development, we routinely use GNU software that has been thoroughly inspected and certified as good by companies that accept many thousands of dollars to stand behind it. (Of course, if we were to upgrade from their GNU Foo 2.1.0 to the FSF's copy of GNU Foo 2.2.0, then all bets are off.)
There is simply not a culture of quality in software, and trying to graft one on to software controlling robots does not get to the root of the problem.
To take just one example, software makers are uniquely able to shield themselves from liability via EULA. I can sell you software for millions of dollars, and it can say “reliable, helps route 911 calls” on the tin, but if I make you click an EULA, as everyone does, you can’t sue me even for the most basic negligent defect in the software.
Indeed, we tend not to even use words like “defect” when discussing software. We use words with more forgiving associations like “bug.”
We do not have any widely used metric of software quality. Even hypothesizing such a measure feels a little absurd in today's climate.
It’s basically still the Wild West in software, even as the systems controlled by software become more critical and those critical systems become more widely deployed.
Economics, economics, economics. Companies only put the amount of effort into their software that's necessary to optimize profits.
Startups are incentivized to move fast and break things, and then to keep adding to their broken prototype instead of rebuilding the product, because it's more affordable.
OS vendors benefit from lock-in and hardware is fast enough that the vast majority of consumers don't notice. If something breaks, you just take it to the Apple store and they reset it. It's cheaper for everybody involved.
Online ad vendors have no incentive to create a less-than-terrible web experience because it's not their site that's being trashed.
On top of it all, there's no regulatory or institutional quality standard. It's left to be a race to the bottom.
I don't know what the fix is, or whether there is one at all, but we should at least stop being surprised. We shouldn't really be blaming it on "kids these days" either, which is an all-too-common refrain. There simply isn't a business incentive to invest in quality.
It is a solvable problem, but most companies' incentives don't line up with solving it. We accept good-enough software, so there are few people who know how to do this, so it continues to take a long time, so companies don't think it's worth solving, and the cycle continues.
Some software companies realize that quality is important for their business. And they do this right.
System software providers do this well. Amazon's S3, the leading relational databases, and even embedded software such as the Arista EOS that Arista runs in its networking gear all have very high quality.
The reason we don't see it all around us is that most well-written software is invisible to us. We don't really think about the iOS software on our phones; its crash rate is extremely low.
I think companies like Google, Apple, and Microsoft have realized that QA departments and software quality aren't worth it. People have gotten used to buggy software. At Apple, there's no Steve Jobs who cares about whether things actually work. Releasing new features to get media attention is more profitable than making sure the features actually work. We'll never see something like Snow Leopard again, with its "no new features". Internally at these companies, there's also no reason for developers to care about quality. It's not rewarded by the managers.
Additionally, we as developers keep building software using more and more complicated tools that seem fancy and new to us, but are brittle and don't deliver good software in the end. We keep adding more and more layers of abstraction, both on the frontend and backend. Why? To put it on our CV. Things are moving so fast that we're afraid to get left behind. We're at a point where things just keep getting more and more complicated – actually keeping something alive (let alone building new features or making those features work) takes more and more man hours.
Buggy upgrades? I think it depends. If there's an updated iteration of a surgical tool or drug, it goes through a third-party vetting process, a significant portion of which is publicly transparent. There are a lot of shenanigans going on in OS upgrades under the guise of "it's proprietary and no one has the right to see the code," backed by the EULA and by the DMCA, which prevents some forms of reverse engineering to find out what the changes entail.
That doesn't necessarily mean open sourcing all software. But it does mean having some kind of transparent testability of inputs vs. outputs: a statement of expected behavior, tolerances, and so on. What we seem to tolerate in software these days instead is "oh fuck, that upgrade just fucked up everything," with very heavyweight recourse like rollbacks or reverting to backups.
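Even a simple published spec would go a long way. A hypothetical sketch of what testable inputs vs. outputs with tolerances could look like (the spec values and function names here are made up for illustration):

```python
import math

# Hypothetical published behavior spec for a vendor's exp() routine:
# (input, expected output, tolerance). Anyone could run this against the
# old and the new release of a library to see exactly what an upgrade
# changed, without needing access to the source.
SPEC = [
    (0.0, 1.0,          1e-9),
    (1.0, 2.718281828,  1e-6),
    (2.5, 12.182493961, 1e-6),
]

def meets_spec(implementation) -> bool:
    """Return True if `implementation` stays within the stated tolerances."""
    ok = True
    for x, expected, tol in SPEC:
        actual = implementation(x)
        if abs(actual - expected) > tol:
            print(f"FAIL: f({x}) = {actual}, expected {expected} +/- {tol}")
            ok = False
    return ok

# Run the same spec against any release of the implementation:
print("release OK:", meets_spec(math.exp))
```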
The problem with such regulation isn't the concept of regulating minimum quality standards in software, but rather the competency of, and delays in, getting effective, coherent legislation produced. It basically requires many people getting pissed off to get a law passed, and that's not really the best way to set a standard, is it?
And even though the points raised are mostly valid I think there is a lot of nuance to this.
If you write control software for medical devices, this is simply not true. The same goes if you handle incredibly sensitive data, critical infrastructure, or assistance systems for air travel, and you can probably come up with a bunch more cases where bugs are not OK.
And also I think this sentence
> Where high quality is nice to have, but is not the be-all-and-end-all.
while being mostly true in all the other cases, doesn't do justice to what mediocrity actually means for everyone involved. I have seen widely varying levels of quality requirements/enforcement, testing, delivery speed, and eventual bugs, at different costs, at different companies.
And in general I found that higher-quality software and fewer bugs were associated with much happier and more productive developers. I would make the argument that less happy developers => more turnover => more second-order costs. So while it might not make or break the company, it certainly has a negative impact on profits downstream. And for customers too. Vendor lock-in is a great thing for a company to profit from, but if you really make a shitty product, it opens up a lot of avenues for competitors to eventually cannibalize your market, or for users to jump ship the next time they can. Take MS Teams, for example. Lots of lock-in, but trust me, the second better competitors are on the table I'll vigorously fight to switch. It's a slow burn, but a burn nonetheless.
What's the incentive to spend money to constantly improve, fix bugs and update the features on every OS release? If you consider the entire development lifecycle, quality costs developer money every bit just as much as new features.
This does not work. The economic incentives are biased towards fast releases of mediocre software. Whoever delivers faster and cheaper tends to win.
Quality is detrimental to the business. Good software does not require support. Paid support earns money.
An initial release that is stuffed with features takes longer to develop (the competition gets the customers) and has fewer incentives for later upgrades (less income after initial release).
Software cannot really improve unless quality standards become mandatory. The liability disclaimer should go. Engineers don't get a free pass if their products don't work. Why should software be treated differently?
In Peopleware, Tom DeMarco thinks it's because a business's customers will tolerate lower quality software, so there are diminishing marginal returns to revenue as investment in software quality continues. He predicts that while this management style works wonderfully for the bottom line in the short run, it causes long-term ailments such as team dissatisfaction, overly complex architectures, and other issues that may be more expensive overall.
Quality and security become increasingly important as we depend even more on software systems for essential functions such as cars, power grid management, agriculture, etc. Unfortunately, this situation is all too similar to how many opt for the emergency room over preventative care.
We should also consider that many businesses wouldn't exist if not for lax quality requirements for software products. How many product V1s are chock full of bugs and exploits, and to what extent is that okay? What about open source? As usual, it's pretty complicated.
I think cars are a poor example. Cheaping out on quality can cost companies 6 years of 12 developers' salaries, market share, reputation, customers and bankruptcy procedures instead of... 3 years of 9 developers' salaries.
I've seen it firsthand: Major update to our only software product. Years of work. Everything is moving too slow. Managers hire more people. Performance tanks due to training the new folks. Our publicly available software is abandonware, all hands on the major update to get it done. A few customers switch to competitors, but hype is high for the new update.
After 5 total rewrites, they cut their losses and launch the half-baked current iteration. The launch feels earth-shatteringly disappointing and insulting to all stakeholders, even including the developers of the major update. Developers receive death threats. Many developers leave the company. Remaining customers flee to competitors.
Lessons learned: Don't cheap out on quality. It can quite literally cost you your company. Nobody needs absolute quality, and that's impossible to achieve, but at least try to care about quality.
Very much the same in software. Now and cheap. I don't really mind, though. While there are certainly superstar hackers who can provide quality and still make a buck under those restrictions, the majority are delivering crap (I blame the consumer most of the time) that eventually requires someone to rewrite it under more realistic constraints due to lost revenue and increasing maintenance costs.
Sometimes anyways. Some people need to be beaten up by their bad decisions many times before the lesson really sticks.
I appreciate the long response, and I kind of feel bad because I meant my question in a rather different way. And (assuming you even notice my response at this point) perhaps I'm going to take us down a road of talking past each other.
My question is about incentives. An unchallenged organization, I would think, would have the tendency to perpetuate itself, and sing its own praises, and deem itself very important.
Let me put it this way: The analogous question isn't what Microsoft does to maintain quality. The question is, how does the world at large maintain Microsoft's quality (and efficiency?). A decade or so of security embarrassments and OSX's and Linux's better reputation got them to step up their security game. Recently, Apple's design got them to step up their design game. If you asked a manager for a canceled project inside the company they might have a great process plan, a great argument for why they need more time and resources, and a great explanation for why they and their team are important. But they may have no sense of the needs of the outside world. This exchange between Steve Jobs and an (apparent) Apple employee highlights this mentality, vs the mentality of an entrepreneur who does respond to the outside world: https://www.youtube.com/watch?v=FF-tKLISfPE
It very well may be that the FDA does a fantastic job of filtering out bad stuff. And, as (I may as well come out and say it) a libertarian, perhaps I am guilty of not appreciating all the work that has gone into setting it up. But it doesn't mean that they have a sense of the correct amount of rigor to apply, money to spend on various aspects of their operation, etc. That's why we like to see competition in the field of certification.
"That a business should go bankrupt before 1 bad drug is approved." and "some of the largest fines against businesses in history" are great examples of perhaps being a bit too rigorous or punitive. You can always apply more and get at least marginally safer drugs. But you may also get fewer drugs through the process, or more expensive drugs, or companies that decide not to even bother starting the process (which you'll never hear about), and (maybe) more people die on net as a result.
I will grant to you, taking your claims at face value, that you make a convincing case that the FDA at least hasn't gotten lax for the lack of outside corrective influence. I guess the mechanism for improvement there is simply human dedication, and I will buy that argument.
For people in the software biz I think being allowed to make quality products is one of the most important aspects of our working conditions, particularly for our social standing in the long term.
This is like asking why entropy exists. Because that's the way it works. Add economic incentives that don't correlate with maintaining quality and the pattern accelerates. Look at healthcare in the States for an example, or Apple software quality for shining examples of expansive software efforts. I've sat having a beer with folks working on major projects we use every day, and when I mention their product has a problem they defend it; they really couldn't care less about the truth or their customer. Or maybe I'm just missing something. Silicon Valley is a brand, and software development is an arts-and-crafts effort, not engineering. This is a generalized comment.