I'm still building new frontend UI with TypeScript 2.x, Bootstrap 4 alpha, and PixiJS 3.x. I don't particularly care about flags in code dependencies, and if they're a problem to upgrade, I'll use the old version too. Does it matter? A UI just needs to work, ideally forever, and it's all JavaScript in the end anyway. It'll work exactly the same 10 years from now. Server-side code is a different beast: you can't avoid upgrading Node or PHP, or migrating to MySQL 8. But only very occasionally does that present the opportunity, or sufficient reason, to upgrade frontend code.
Backwards compatibility in the JavaScript world isn't great. If you stop updating for a couple of years, half your libraries have incompatible API changes. Then something like a Node or UI-framework update comes along and forces you to update them all at once to work on the new version, and suddenly you're rewriting half your application just to move to a non-vulnerable version of your core dependency.
In my experience, upgrade everything all the time only works if you can keep your dependencies to a minimum, which can be harder in JS/TS land, but not impossible. I used to think every line of code you write is a liability, but have come to realize that every dependency is also a liability. So it’s about balancing the two.
No one maintains anything in the JS world. You just upgrade or hop to the next framework. The ES spec does, I believe, address backwards compatibility, but that only applies at the language level.
It wouldn’t be the cool hip framework if it had to commit resources to older versions.
I think there's some ambiguity here around LTS for the project itself vs its dependencies, but as one concrete example, I support node 4 for an open source project I maintain, and I would defend that decision.
In my case, the only additional effort needed for node 4 is using Array.prototype.indexOf instead of Array.prototype.includes, which is mildly annoying; sometimes I forget and it fails the node 4 CI build, but really it's not that big of a deal. TypeScript handles the rest of the JS differences, and the project doesn't use enough of the node API for those differences to matter. Other projects may differ, but I think the right call is to drop support for old node versions when maintaining them is hard or when nobody is using them, not simply because they're old.
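For the curious, the difference is trivial; something like this (a hypothetical whitelist check, not from the actual project):

```js
var allowed = ['major', 'minor', 'patch'];
var bumpType = 'minor';

// Works on node 4 (and anything ES5+):
var okOld = allowed.indexOf(bumpType) !== -1;

// Array.prototype.includes is ES2016, so this needs node 6+ or a polyfill:
var okNew = allowed.includes(bumpType);

console.log(okOld, okNew); // true true
```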
I want to use open source projects at work, not just in side projects, and at work we're on node 6 and always lag behind the latest node release by some amount, partly because upgrading node is a real project. We're a startup, not a massive enterprise, but it's still pragmatic to focus on real work instead of upgrading our dependencies all the time. If all of the open source projects out there only supported node 8, I think it would make them much less useful in the real world.
For someone who hasn't opened the app in three years, going through the upgrade process for each major version (and each significant minor version) would be harder than just rewriting most of the smaller components.
To the parent's parent's point, JavaScript has evolved since 2015, and comparing what a 2015 app looks like to a 2018 app is night and day.
Good example. IMHO it would be better to stop adding stuff to JS, declare it legacy, and leave support in browsers so old pages render fine. Then add a new language without web-compat compromises like the naming of `flat` or `includes`.
I have an app that was written for Node 0.8 three years ago. It is running on 4.2 with no modification (it doesn't use any binary add-ons). JavaScript is almost entirely backwards compatible, because they can't break the web.
Other apps that require binary add-ons required me to install the latest versions from npm -- only one of them required ANY changes to how the API was called, and it was extremely minor.
Waiting two years to upgrade the engine your code runs on is not horrible and involves much less pain.
The lack of static typing (in base JS, at least) also makes it hard for tools to automatically spot very basic brokenness in dependencies without (repeatedly) running and testing the code. This makes even "safe" version bumps less trustworthy and harder to audit, and it makes it easier for developers to miss that they've accidentally changed an interface in a library release they marked as a minor patch. The errors are both harder to check for and, precisely because they're harder to check for, more likely to occur, so it's tempting to stick with old versions longer.
Add to that everything else (the fast pace of change, JavaScript "culture", the weak standard library, the tendency to patch in things that ought either to be basic language features or to be avoided in favor of more-vanilla idioms, often in competing and incompatible ways) and that is how you end up with 20 slightly-different copies of the same damn library in your dependency tree, and then 20 more copies of another library that does the same thing.
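To make the typing point concrete, here's a contrived sketch (hypothetical library and function, not a real package): a "minor" release changes a signature, plain JS keeps running with garbage output, and a type checker catches it at the call site.

```ts
// v1.2.0 of a hypothetical dependency exported:
//   export function formatPrice(amount: number): string
// v1.3.0, published as a "safe" minor bump, changed it to:
export function formatPrice(amount: number, currency: string): string {
  return `${currency}${amount.toFixed(2)}`;
}

// A call site still written against v1.2.0:
//   formatPrice(9.99);
// Plain JS: runs fine, `currency` is undefined, output is "undefined9.99".
// With type declarations: tsc fails with "Expected 2 arguments, but got 1."
console.log(formatPrice(9.99, '$')); // the updated call: "$9.99"
```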
> JS frameworks are so short-lived that one can't be sure if their code will still work in one year.
Versioning your dependencies is good practice regardless of language and has been for decades. JavaScript has enough real cons; you don't have to resort to imaginary ones.
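In practice that's just exact versions (plus a lockfile) in package.json, instead of `^` ranges, so nothing moves until you choose to move it. Package names and versions here are hypothetical:

```json
{
  "dependencies": {
    "some-ui-lib": "2.3.1",
    "some-date-lib": "1.0.4"
  }
}
```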
Good point. I hadn't thought of compiling old versions of JavaScript to new versions; that keeps everyone writing the same language and seems like a better solution to me.
Yes, but... I don't envy those who would have to maintain and publish several subsets of the JavaScript language.
And you would almost certainly have to specify in the source code which version of the language you wanted to adhere to. You couldn't assume it would be the latest, because that would break your code in the future, when the latest version has deprecated whatever feature you relied upon way back.
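Interestingly, compile-to-JS toolchains already do a version of this: TypeScript, for one, makes you declare the language level you're emitting for. A sketch of the relevant tsconfig.json bit (settings are illustrative):

```json
{
  "compilerOptions": {
    "target": "ES5",
    "lib": ["ES5", "DOM"]
  }
}
```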
It's not necessarily a problem. It actually cuts both ways.
The code is not going to benefit from updates, but it is also not going to be harmed by them; think API or behavior changes. Additionally, if the use case driving a dependency changes, sometimes the "new" solution no longer matches yours well.
Ex:
I work with JScript in classic ASP sometimes. A JavaScript library changed so that instead of processing nested items iteratively (pushing an item onto an array, looping, popping the item off and processing it, then repeating), it used Node's process.nextTick, with reasoning along the lines of "we want to move to nested function calls for processing, and doing that naively would blow the stack, so we'll use nextTick so the stack doesn't keep growing". Well, JScript in ASP doesn't have nextTick or any equivalent timer. The original code itself worked flawlessly, but the authors' use case moved on, and we had pulled that code in as an external dependency.
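Roughly the difference, reconstructed from memory (hypothetical code, not the actual library):

```js
function handle(item) { /* hypothetical per-item work */ }

// Before: an explicit work list and a plain loop, so the call stack
// stays flat. ES3-era syntax, so even JScript in classic ASP runs it.
function processAll(items) {
  var stack = items.slice();
  while (stack.length > 0) {
    var item = stack.pop();
    handle(item);
    if (item.children) {
      stack.push.apply(stack, item.children);
    }
  }
}

// After: nested calls broken up with process.nextTick so the native call
// stack can't grow without bound. Fine on Node; fatal on JScript in ASP,
// which has no nextTick or equivalent timer.
function processAllAsync(item, done) {
  handle(item);
  var children = item.children || [];
  var i = 0;
  (function next() {
    if (i >= children.length) { return done(); }
    var child = children[i++];
    process.nextTick(function () {
      processAllAsync(child, next);
    });
  })();
}
```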
That's obviously an extreme case, but I can't really count the number of times that an API change in an NPM package for node has meant modifying code, without a change in functionality.
So yes, you get updates, and sometimes those are security updates and real bug fixes. Other times, though, that update is going to add new functionality (and possibly new attack vectors), or drop support for your use case, etc.
Like many things in our field, it's wisest to look at the risks in all cases, evaluate them for the specific situation at hand, and then choose the appropriate one, rather than cargo-culting one 'best practice'.
That's correct -- because we compile to "standard" JavaScript, we're at least somewhat comfortable introducing significant changes (even to the syntax) where desirable. Even if you never get around to updating older pieces of code, all of the compiled JS continues to be fully compatible and interoperable with the newer stuff.
“Most frontend developers still remember the dark days of JavaScript dependency management.”
As opposed to now, where even the smallest "app" has hundreds if not thousands of dependencies for the most basic functionality ("need to iterate an array, better bring in lodash"), often with multiple copies of the same dependency present due to conflicting versions?
This is the direct effect of high levels of backwards compatibility: you can't change APIs; you have to create new APIs and then deprecate the old ones.
I don't see how that follows at all. You can't make breaking changes to APIs, but it's fine to extend them or even expose different levels of abstraction as long as the underlying model you're using stays consistent and you provide suitable defaults where necessary.
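For example (a hypothetical API, sketched in TypeScript): extend a function with an optional options bag whose defaults preserve the old behavior, and no existing caller breaks:

```ts
interface User { id: number; name: string; active: boolean; }

const allUsers: User[] = [
  { id: 1, name: 'Ada', active: true },
  { id: 2, name: 'Bob', active: false },
];

// v1 shipped as: getUsers(): User[]
// v2 adds capability without breaking any v1 call site.
function getUsers(options: { limit?: number; activeOnly?: boolean } = {}): User[] {
  const { limit = 50, activeOnly = false } = options; // defaults preserve v1 behavior
  const pool = activeOnly ? allUsers.filter(u => u.active) : allUsers;
  return pool.slice(0, limit);
}

getUsers();                                // v1-era call: unchanged behavior
getUsers({ limit: 10, activeOnly: true }); // new, opt-in behavior
```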
If you don't need to do any work on something, it has zero cost.
This is tautological, but if we're talking about a typical web application then you are working with that code, every time you make a new build. It might sit there as an untouched black box for a while, but if it has any kind of external dependencies or it relies on anything that could change about the language, platform or other parts of your own code, then there is always a risk that it will later break, and then you'd better hope you have someone around who still knows how to fix it.
This isn't mainstream yet. Until very recently, the majority of companies were targeting IE 11.
You've made several claims like that, but in the absence of data there's no way to know whether you're correct. For example, preset-env has been in production for 3+ years now and had an extended beta/RC phase before that. It's true that it defaults to ES5 if you don't configure it, but the Babel docs explicitly recommend that you do, and it is literally a one-line change in your package.json to do something more modern and efficient based on Browserslist unless you genuinely need that degree of backward compatibility.
Of course, in some cases, even quite recently, you did need that. A relatively small proportion of business customers stuck in the IE11 era could still represent significant revenue at risk if you dropped support. But it will take more than an unsupported assertion to convince me that this has been the norm for most web development for quite some time. At one of my businesses, we've had literally no-one using IE11 visit a B2C web app for years. At another that mostly does outsourced development work, we've excluded IE support contractually for a long time and no client of any size has pushed back in years. There must be many thousands of other small businesses out there like mine, and I have to assume that they have senior devs who sometimes read the docs for the tools they use too.
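For what it's worth, the one-line package.json change mentioned above is along these lines (the exact Browserslist query depends on your audience):

```json
{
  "browserslist": ["defaults", "not IE 11"]
}
```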
OK, yeah, I'm not seeing this at all.
I think that's because you persist in interpreting my comments as being about JS itself, when I've tried repeatedly to emphasize that it is the culture and surrounding ecosystem that are the bigger problems. I'm not talking about the technical committees defining the language. I'm talking about the developer experience of using it in production.
On that score, I stand by my earlier comments. After many years of programming professionally, using many different languages, I have yet to find any more flaky ecosystem than JavaScript's other than the deliberately fast-paced world of Haskell. Nothing else even comes close for how much wasted effort I've seen over the years just keeping all the plates spinning so nothing falls off and breaks.
But most of the products these people create are also being replaced every 2 years so it's not as big an issue as you think it is.
Yes, I understand the commercial argument. The bizarre economics of the web dev industry sometimes make throwing entire projects out and rewriting every two years affordable. That doesn't mean it isn't a horribly unstable environment, which was the original point of contention, nor that ignoring the traditional wisdom about the dangers of big rewrites is suddenly a good idea. It just means parts of the industry are so rich that they can afford a toxic culture where quality is a second-class citizen and the only answer to producing so much unmaintainable junk is exactly what you said. It's much nicer working in other parts of the software industry, where what you're producing is expected to work and to last, but sadly it doesn't always pay as well.
It has testing frameworks, it has automated code formatters, it has linters...
And these kinds of things are advanced tooling, in your view? We had those in our IDEs for building desktop software back in the last century. Except they were automating rewriting millions of lines of code on 2000-era PCs, not 10,000 lines of code on 2020-era PCs.
I'm not sure which other popular programming language ecosystems you think lack these kinds of tools today, or have significantly worse ones. But in web development, TS is the current hotness because developers finally realised that explicit typing is helpful for building and maintaining anything larger than a small system. Not so long ago, build tools adopting tree shaking was heralded as a great advance in reducing build sizes. Before that, we got the radical new concept of a module import/export system, so you could write code in more than one file without jumping through crazy hoops. Before that, we discovered that immediate-mode UIs and declarative specs are a thing. All of these ideas were well known and widely used elsewhere in the programming world much, much earlier.
Are you trying to tell me that a breaking change so bad that it took the community almost a decade to complete the migration is something we should aspire to?
It took that long because that backward compatibility and stability we've been talking about meant a lot of older projects didn't need to move any sooner. And yes, having a platform where you can write code and it keeps working and the ecosystem still positively supports it for another decade is definitely something we should aspire to.
Generally not in the major libraries (React, etc).
Don't tell that to anyone who uses React Router. Or who builds their app with Webpack. Or Babel, as you mentioned. And again, it's important to note that you only said backward compatibility here. If we're talking about the broader concept of stability then you also have to include high profile libraries like Moment giving way to more modern alternatives because although they still work, they don't lend themselves to modern programming styles or play so nicely with other tools.
These comments are getting long and I suspect we may have to agree to disagree on some of this, but if you haven't worked much on software in other parts of the industry, I encourage you to look at what is considered the norm and acceptable practice in fields where better quality and greater longevity are more highly valued. If we built security libraries or the embedded software controlling industrial machinery the way a lot of online companies build their web apps, we'd be lucky to survive the night.
Okay, those cases seem like situations where you'd need to be explicitly targeting those devices. Like if you're shipping new code to users that is being run on an old runtime, then sure, you should make sure that code is legacy JS. I don't see how that is unreasonable?