So you want many broken implementations with quirks of their own? (HTML5 + WHATWG's output).
There needs to be a single dictator with a single standard. The problem so far is that the dictator and standard need to be hit with the clue stick many times beforehand, rather than pushing corporate features and agendas. POSIX, HTML5, Java are all design-by-committee crapfests.
Golang is about right on this. One core standard and reference implementation which is opinionated and built by people who know their shit.
I firmly believe that if every fucker designing those standards had to provide 2 different reference implementations (say, in 2 different languages so idiosyncrasies don't leak into the design) plus a full test suite for every feature, we'd end up with far simpler and better-working standards, because maybe someone would realize "fuck, this is hard to implement right, maybe we should rethink it?"
Instead it's yet another standard that looks like it fell out of the ivory tower outhouse.
> as many open standards for interoperability as possible
I don't think that's a good idea. There's the xkcd 927 concern, since there are already long-established standards for a lot of things; and that brings me to what I think we really need, and that is stable standards. The longer a standard has been around unchanged, the more implementations will arise. On the other hand, the browser situation is a great example of what happens when some entity takes control of an "open" standard and then churns it endlessly. I think this is particularly true of programming languages, since they are foundational for everything else.
That's a bold statement. I don't think I've ever seen a program / library which implements any standard completely and without issues. It's not uncommon to see a list of things that are not done, incomplete, or just called out as wrong and rejected from the implementation.
It's not perfect, but no implementation will ever be imho.
It's possibly an unpopular take, but it's not completely clear to me what community-driven programming languages gain from a standards process anyway. (At least if the intention is to really standardize the complete thing, bells and whistles included, not just some tiny core, as R5RS did.)
If you have multiple adversarial commercial implementers for something, a standard makes perfect sense. It's still a lot of work, but necessary because otherwise, you wouldn't get things to interoperate at all. That's true for things like HTTP, or Bluetooth, or JavaScript, or C++.
But for a community project, having one blessed implementation and iterating quickly has proven to be the most successful model, in my opinion. I can point towards Python, Rust, even PHP for examples. Even Haskell, which started out with a written specification, has at this point de facto been forked into the much richer language called "whatever GHC understands".
This is IMO the core problem with proprietary-based "standards": even if the formats are opened, people rarely make independent implementations of them and bugs often persist for years before being found.
The pace of change is too quick and the permutations of how to do X are necessarily too numerous.
As far as I know there is no “standards body” that governs these things across organisations.
I.e.: if you’re developing a web app that needs to poll other systems for data every X minutes, there should be a standard that governs the best way of doing this in the 5 major languages.
Taking into account SRE principles like logging, scaling, security, etc., and providing some clear code examples using the simplest and least OO/functional/prototypical/new-age code possible.
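To make it concrete, a minimal sketch of the kind of thing I mean might look like this in Go (the URL, interval, and timeout are made up for illustration; a real service would also need auth, metrics, and retry/backoff policy):

    package main

    import (
    	"context"
    	"log"
    	"net/http"
    	"time"
    )

    // pollOnce fetches one URL with a hard timeout and logs the outcome.
    func pollOnce(ctx context.Context, client *http.Client, url string) error {
    	ctx, cancel := context.WithTimeout(ctx, 10*time.Second)
    	defer cancel()

    	req, err := http.NewRequestWithContext(ctx, http.MethodGet, url, nil)
    	if err != nil {
    		return err
    	}
    	resp, err := client.Do(req)
    	if err != nil {
    		return err
    	}
    	defer resp.Body.Close()

    	log.Printf("poll ok: url=%s status=%d", url, resp.StatusCode)
    	return nil
    }

    func main() {
    	const url = "https://example.com/api/data" // hypothetical endpoint
    	client := &http.Client{}

    	ticker := time.NewTicker(5 * time.Minute) // "every X minutes"
    	defer ticker.Stop()

    	for {
    		if err := pollOnce(context.Background(), client, url); err != nil {
    			log.Printf("poll failed: url=%s err=%v", url, err)
    		}
    		<-ticker.C
    	}
    }

Even something this small forces decisions about where the timeout, the logging, and the error handling live, which is exactly the kind of thing a shared convention could nail down.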
Everyone is still free to implement their own version. But it seems like many of the people behind the biggest implementations would rather agree on a standard.
> On the other hand: if someone would do the equivalent to their browser, people would call it fragmentation.
This happens all of the time in browsers. See: Dart, vendor prefixes, JavaScript, etc, etc.
This is also absolutely nothing new in compilers. GCC has had a bucket load of its own C extensions for years (decades), as have many other compilers from many vendors.
Vendor-specific extensions are par for the course. In fact, they're a good thing! The first step in moving a standardized language forward is to have the vendors designing and adding non-standard extensions so that they can experiment with ways to "scratch the itch" they're feeling. Good extensions get taken up in committee, and if they can be made palatable to all involved, they get standardized. Bad extensions die on the table.
Having tried-and-tested features drive standardization allows hindsight and experience to strengthen resulting standards. We call the opposite, where the standards body invents a feature out of whole cloth with no example implementation having been tested in the real world, "Design by Committee". This strategy does not have a high reputation for quality and success.
Speaking as a member of one of the many W3C working groups, I would say that it isn't desirable. When designing a standard, we want to have multiple competing implementations of the standard to ensure it is well defined and understood by users.
Should there be only one implementation, we wouldn't be able to say with confidence that we have succeeded in that goal.
I'm not sure what you are getting at, an implementation of a standard is not the standard itself. I buy lots of software that implements various open standards.
It's as official as, say, the HTML 5 spec, or the HTTP spec.
The implementations can be plenty different as long as they conform.
It's not that I'm not getting your point. But part of the reason I think this rant is better than others is that it's not harping on the point of "omg! someone dared to write a standard that has multiple implementations! the audacity!", but instead has criticisms of the standard itself.
I understand you don't like the standard-and-multiple-impls model, but it's worth noting that the most popular Wayland implementations share plenty of auxiliary and utility code in the form of libraries (e.g. libinput, libdisplay-info, etc). It's not as fragmented as it may seem, people are pretty reasonable.
Personally I think this is a healthy way to run things, for example because changes to the standard can easily be validated in multiple implementations, which tends to promote better quality for all over time.
The problem is that these large standards bodies are always controlled by the players with the most money, making their "standards" useless to me IMO. A public specification and an open source reference implementation is good enough for me.