
Even web devs learned this a long time ago. And no, we are not staying in our own bubble. As you probably know, the web is huge and it's hard to know all the best practices. Even in the language you program in, there are probably things you don't know. People learn all the time, and people start learning all the time.

It feels like people have a superiority complex over front-end devs, and lately over JavaScript specifically. It bothers me. What if you told me your language of choice, and every time I saw it mentioned somewhere I posted some random "intellectual question" combined with a stereotype that made you feel stupid and demotivated you a bit? Maybe I am missing something, but your comment made me feel that way.

To answer your question fully: in my opinion, it's because there is no single entity that owns the web. No one controls which things should be deprecated or continued, which is why change happens much more slowly; the big players have to come to an agreement. Apple provides best practices for iOS, Google for Android,... MDN is a good resource, but there is no official "BEST WEB PRACTICES" entity.




Because JavaScript rules web dev and the JS community has pushed so many mainstream ideas that have turned out to be duds. It’s a major echo chamber due to its size relative to other web dev communities.

I also partly blame the careerist obsession around always learning new technology, because it’s becoming obvious that a lot of the “new stuff” over the last decade created more problems than solutions. Don’t get me wrong, learning is an important part of the job, but we’ve created a culture of constantly pushing new technology to pad resumes and feed our insecurities around irrelevance instead of solving problems.


Many bad things are popular. Java, C++, JavaScript. Popularity doesn't preclude something from being shit.

The front end UI isn't a collection of evolving standards. It's the same standard with more and more cruft added on top.

HTML wasn't designed for what we are doing with it now. Each additional layer... CSS, JavaScript, the DOM, TypeScript, React... is a new layer over an old thing. It's like never buying a new car, just modifying your old one from the '50s to stay up to date. That's why the entire front-end API feels so bad. It feels disorganized, with complexity that exceeds necessity.

The complexity of the spec even took years for browsers to get right. This is not a sign of good design. It's a sign of endless designs layered on top of each other, all in a vain effort to modernize everything and keep it backwards compatible.


My two cents: It's a generation of programmers raised on the Web who think web technology is good because that's what they know. They believe in "open standards" and "design by committee" and nonsense like "semantic HTML" being "best practice".

To everyone outside of that bubble, the following is obvious: Web technology sucks. HTML sucks. CSS sucks. Javascript sucks. They're inadequate solutions, which is why people keep reinventing the way to do web frontend every two years.

> It may be hard to answer as we are all biased in favor of HTML.

Not me.


You are certainly not the only one, and as you point out this is nothing new. Web development is incredibly fad-driven, and the vast majority of web developers can't even tell you why they are jumping from fad to fad.

If you are writing a mail client, or chat app or something, then a single page javascript app makes sense. But I don't think those apps are the majority now, and I don't think they will become the majority any time soon.


Honestly it's a bit disheartening to see a community with so much insight readily jump to bashing every single "hip new thing" in the browser/JS world, heck, even npm/bower.

I think part of it is the proliferation of JS in recent years, while many people still have this notion of copy/paste JavaScript "skiddies". Even among new grads with significantly less experience, I often hear some voicing opinions that "web developers are not real programmers", or that there's "nothing hard about front-end work", "CSS is only for designers", or "only people who can't do CS become web developers".

On the other hand, yes, new frameworks are appearing every day, and it's obvious this is a huge source of frustration; but do we need to chase every new framework? Is it actually true that an application written in framework X HAS to be rewritten every 2 years, unlike a pure jQuery-based application? What is preventing you from continuing to use Angular v1, such that you have no choice but to port everything to v2? (It's also worth noting that you can port only parts of your app to v2 and run them in tandem, but I digress.)

I don't want to start another flame war here (which seems to be a common trend these past few days), but how many people do you know who have a deep understanding of the cost of DOM reflows and how to mitigate/minimize them? As a full-stack developer who has done quite a bit of front-end work over the past few years, I can tell you: not many.
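
To make that concrete, here's a rough sketch of the classic layout-thrashing mistake and one way to mitigate it (the selector and sizes are made up for illustration):

    // Assume a page with a bunch of .item elements.
    var items = document.querySelectorAll('.item');

    // Bad: every iteration writes to layout, so the next read of
    // offsetHeight forces the browser to reflow again. N reflows.
    items.forEach(function (el) {
      el.style.height = (el.offsetHeight + 10) + 'px';
    });

    // Better: batch all the reads, then all the writes. One reflow.
    var heights = Array.prototype.map.call(items, function (el) {
      return el.offsetHeight;
    });
    items.forEach(function (el, i) {
      el.style.height = (heights[i] + 10) + 'px';
    });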

Okay, so maybe you say: "For applications X and Y, that knowledge isn't important." That very well may be the case, but what about those who build richer and more complex interfaces that (maybe) involve thousands or even hundreds of thousands of elements on the page? What then? Has no one ever seen the absolutely abysmal jank and (lack of) performance some JS apps have?

Is everything really so horrible that we have to bash and nitpick every single thing someone makes in the web community? Is it wrong for your kid to come home one day, exclaim "look what I drew today!", and show you a horrific picture of you looking like a gorilla? Or maybe it's okay that a lot of people are working very hard to do away with the hairy pain points we have currently, make it harder for you to shoot yourself in the foot, and maybe inadvertently introduce some (small) pain points of their own? Maybe it's possible that we're not all perfect humans with perfect ideas?

I'll give a quick example of what my experience (or the experience of people I've worked with) has been like with jQuery/Angular/React.

jQuery: I want to update this element, okay I'll add a class or id as an identifier and query the DOM. Then I'll change the text in here. Oh but now I want to add another element to this list here (but I haven't cached the selector because I haven't learned that yet), let me query that element, create a new element, then append it.
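
Roughly, in code (the IDs and markup are made up for illustration):

    // jQuery: find the node by selector and mutate it directly.
    $('#status').text('Saved!');

    // Adding an item means building the DOM node by hand and appending it.
    var $list = $('#todo-list');   // querying again, since I never cached it
    $list.append($('<li>').text('Another thing to do'));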

Angular: I want to update the text on this element, okay, I'll change it in $scope, or the template directly. I also need to add a list item, okay, let's just .push() the new record in our $scope variable and it'll appear since we already told Angular to iterate on this model.

As for DOM updates? Well, I may or may not know it, but Angular is going to update it for me on the next $digest cycle which happens in intervals or can be triggered manually, after doing a dirty check for $scope/$watch.
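
A minimal sketch of the same flow in AngularJS (the module, controller, and model names are invented):

    var app = angular.module('demo', []);

    // Mutate the model on $scope; ng-repeat in the template picks it up
    // on the next $digest cycle, no manual DOM work required.
    app.controller('TodoCtrl', function ($scope) {
      $scope.statusText = 'Saved!';
      $scope.todos = ['Buy milk'];

      $scope.addTodo = function (text) {
        $scope.todos.push(text);
      };
    });

    // template: <li ng-repeat="todo in todos">{{todo}}</li>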

React:

Similar to my thought process with Angular, I update the list item in our model with setState, and React will do its virtual DOM diffs and update for me, or maybe it doesn't update because shouldComponentUpdate told it not to. And it only touches the elements necessary on the actual DOM.
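
And a sketch of the React version, using the class API the thought process above assumes (the component and state names are invented):

    // Describe the UI from state; setState triggers the virtual DOM diff.
    class TodoList extends React.Component {
      constructor(props) {
        super(props);
        this.state = { todos: ['Buy milk'] };
      }

      addTodo(text) {
        // New array rather than mutation, so the diff can see the change.
        this.setState({ todos: this.state.todos.concat(text) });
      }

      shouldComponentUpdate(nextProps, nextState) {
        // Optional escape hatch: skip the re-render if nothing relevant changed.
        return nextState.todos !== this.state.todos;
      }

      render() {
        return React.createElement('ul', null,
          this.state.todos.map(function (todo) {
            return React.createElement('li', { key: todo }, todo);
          })
        );
      }
    }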

Perhaps it doesn't seem like a big difference, but after working with all 3, speaking from personal experience: once you have to return to doing pure jQuery, you realize how forced and unnatural everything now feels.

Give it five minutes: https://signalvnoise.com/posts/3124-give-it-five-minutes


This is my experience of web development, every time I return to it.

I’m really not an expert in web dev, but it _feels_ like web dev has been badly engineered.

JavaScript is a very badly designed programming language (yes, even in its modern form), which I think has caused a lot of problems. Many JS developers don’t know much about programming language design, so they can’t see how bad it is, and they make matters worse by not trying to abstract or use JS in a way that reinforces good PL design methods.

If you recognise JS as a terribly designed language, then the last thing you would imagine doing is widening its use. But that is exactly what node.js does. You should instead isolate JS and minimise its use. So red flag - something has gone wrong here.

It also feels that many web ecosystems have an over-inflated sense of self-importance. This has led to many competing frameworks for the same tasks, that try to do everything. Things are not pythonic. There appear to be an excessive number of package managers involved. There were already plenty of package managers out there - inventing new versions of existing tech makes it harder for everyone and is usually a red flag that someone somewhere is making bad decisions.

I need to build a website soon and I’m dreading the front end. On the back end I can use a range of solid and well-designed languages and technologies, but on the front end I feel like I’ve got to choose the least poor of a bunch of crazy options. If anyone has advice on how someone who prefers rigour, simplicity, elegance, and decent PL design over the latest trend should approach web dev, that would be massively helpful. “Sane web dev to minimise the impact of the Web Dev CSS/JS disaster on your life” would be my favourite O’Reilly of all time.


Strong opinions follow

Flatly, this is my observation and I hadn’t as of the time of this post really seen it mentioned:

JS is overused not just because of tracking and ads, though those are a big part of it on every popular website I’ve visited in the last 30 days (side note: thanks, uBlock Origin!).

It’s also in large part because, I strongly believe, front-end developers are reinforced to think this way. I see a lot of blogs, community meetups, and conferences organized around leveraging JS and specifically the large frameworks, which is okay! However, it only reinforces the idea that JavaScript-first solutions are implicitly better, rather than an emphasis on leveraging the whole tech stack correctly. I have friends I respect very much who have largely gotten by with just a passing knowledge of CSS and HTML and haven’t yet gained a deep understanding of when it’s more appropriate to leverage those technologies over JS, let alone the trend of pushing so much work to the client (such as not even bothering to scope APIs correctly: how many times have you had to sort the results of an API request because it is not sent over the wire sorted for its use case, even though your team controls the API?). The industry does not enforce holistic thinking when it comes to this. That is the real problem to me.
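
(To make that last example concrete, a hedged sketch; the endpoint and field names are made up:)

    // The client re-sorts because the API didn't send results sorted for
    // this use case, even though the same team controls that API.
    fetch('/api/orders')
      .then(function (res) { return res.json(); })
      .then(function (orders) {
        orders.sort(function (a, b) {
          return b.createdAt.localeCompare(a.createdAt);   // newest first
        });
        console.log(orders);   // hand off to whatever renders the list
      });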

Web components are somewhat of an exception to this: as far as API considerations go, they do attempt to strike a balance between server-side rendered content and dynamic client-side content; our industry just isn’t heading in a direction where that balance gets struck.


It's like there's a culture of cultivated ignorance around web apps.

The pop culture of web development strikes me as a spastic, young programmer who flits from framework to framework, always in search of some magic elixir to make application development more pleasant. They know it isn't quite right, but they don't want to give up the hard-earned, arcane JS knowledge they've acquired. Nor are they willing to admit that maybe they need to sit down and learn a thing or two about how to structure programs better (as we all do)!

Instead, it's emotional blog posts and bikeshedding over inane topics (Vim, CoffeeScript), rather than looking at the deeper issue: the utter mediocrity of the tools they're using.


The issue is exactly this. More and more people are coming into the development space using HTML/CSS/JavaScript and the information on the other technologies is harder to grasp and get a hold of.

From my personal experience what the parent comment stated is actually very true.

As someone coming from the back-end side who is actively trying to get into the JS world, it is extremely hard to find best practices that don't change every other week. At the beginning of the year, tutorials were written with Grunt, Yarn, Yeoman, and the like; then they were written to use Gulp and Bower; and now it's webpack and whatnot. I can't even catch up on what the tutorials have used. To be extremely fair, every other week there seems to be a new tool I need to know, and I can't even catch up on why I should learn it or stop learning/using the previous one. Every tutorial/piece of documentation does something differently; there is no single set of tools and instructions that doesn't change over time. I cannot find the best-practices docs, because everyone has their own opinion of something and bashes the other guy for having an opinion in the first place.

I know it might be a bad analogy, but I usually compare the state of the principles of RESTful APIs to the state of the principles of the front-end stack. They are design principles, and you have a set of instructions/best practices you should follow regardless of the language, framework, etc. Thus I can make RESTful APIs in Python, Go, Ruby, etc. On the other hand, the JavaScript ecosystem is very opinionated. I can't seem to find the principles wrapped up in a talk, book, or docs. Offer me a choice, let me make it, don't make it for me. Let me make a mistake, let me learn how it doesn't add up to the whole picture.

Every piece of JS documentation assumes something or fast-forwards through an important part of the process: the setup. Why the heck do I need to use a given tool, and why does it work well with the other tools in this stack? Why shouldn't I use something else? Don't just have me copy the gulpfile or any other JS file I need; explain it.

You said it is a good thing to try a new thing every month. How do you have enough time to observe it and test it properly in production or wherever? I don't see a new framework every month as a good thing: the frameworks don't mature in your environment, and neither does my deep understanding of them. That makes for a constant struggle to settle on an optimal way of working. People in the JS world, at least to me, seem to go: hmm, this doesn't sound cool anymore, let's swap it out for this.

I hope I don't come off as a hater; I'm just speaking as a front-end newbie.


Thank you for this post. We have a really bad habit of chasing "cutting edge" fads as an industry. It must come from the tendency of programmers to burn out and bow out, which exacerbates the tendency of young bucks to lack any respect for their elders. Software dev is still a very immature industry.

I respect Brendan Eich a great deal, and I truly appreciate what he did for Netscape in 1995 by introducing JavaScript. What I don't appreciate is that we decided to extrapolate that madly rushed, designed-and-implemented-in-less-than-a-week-of-all-nighters language not only into the default client-side language for the web for the last two decades, but have also allowed very badly informed people to start using it for server-side applications.

There has to be a way to bring order to this chaos. Any ideas?


This is because front-end web development has been taken over by back-end developers.

Few developers complained about JavaScript until back-end developers became 'full-stack', had to start working with it, and tried to apply the same mental model they had used with Java or Python etc.

CSS was always something you had to dedicate time to in order to master it and learn the quirks of current and past browsers, until back-end developers decided it was 'terrible', a race to the bottom of CSS frameworks began, and nobody remembered proper selectors or specificity as BEM was adopted.

HTML was a tool for layout in which developers made good, thoughtful choices about semantic code so that it was easily readable by both man and machine, even though it was a manual process. Then back-end developers didn't have the time to learn something new, and now everything is a div, and if you're lucky it will have a role.


I think JS has this problem because of the constantly high inflow of new developers who readily jump on new boats. They cannot differentiate between old and new tools (except by hype and CV-drivenness), and new tools don't kill their productivity as much. Personally, I never made it into front-end because learning the basics of the HTML/CSS/JS trinity makes me sick, and learning the advanced tools is a never-ending war. Advanced JS programmers sense that and try to create more tools, shorting the circuit.

(On the other hand, programming has changed since our youth. Back then we programmed for one local machine; now we do it for a thousand remote ones. Latency penalties have literally turned upside down: storage is faster than SIMM memory used to be, and the UI is slower than an FDD used to be.)

Since web 1.98a* we stopped looking at the program as a whole and separated it into front end and back end. That was the key point that allowed so much divergence and frustration. It may sound crazy, but we need a new, more low-level browser (don't even call it a browser; it's a network layer) that could seamlessly connect a single heterogeneous app across two network endpoints, since no one really cares about the underlying protocols anymore. Leave the WWW to rot in the browser and make internet apps great again. All the good tools from '95 are still here.

* you cannot call something 2.0 until it has at most 2-3 generic and mature ways to do everything.


Exactly! So many people assume that the emergence of a new "best" way of writing front-end software renders all previous architectural decisions wrong.

It even came to the point that some people reject using any frameworks or even a helper like jQuery and "take everything under their control" because they don't want to deal with "deprecated" libraries.

Meanwhile, a small development team I know keeps using Knockout.js successfully in huge projects.


Are you arguing in favor of everyone sticking to the lowest common denominator technology just because the pace of change is too quick? I would wager that things are changing so much exactly because of this reluctance to adopt new developments at scale. It's a vicious cycle, to be sure, but plain vanilla JavaScript everywhere can't possibly be the savior!

Is it the culture? For years, the people (Microsoft, Apple) who would naturally have pushed standard ways of doing things in the JavaScript front end actively neglected it. So lots of much smaller entities had to invent their own ways of doing things. That culture persists on the back end.

What a grumpy comment.

- Two of the best developers I know are completely self-taught.

- I'd argue there have been so many front-end frameworks not for the sake of novelty, but because the web platform itself was incomplete and stagnant. (And it also blew a huge opportunity with the APIs for web components.)

- Now that the platform is evolving and innovating again, more and more things are moving out of the frameworks and back into the platform. Javascript has massively improved.

- Some amazing and complex engineering happens in the front-end space because of the constraints and the need to eke performance out of everywhere you can. You can't just throw more servers at the problem.

- Experiences vary, but I've been on more projects delayed by over-engineering the backend than the frontend.


Again, you are attributing to the web what a pop-culture is doing with it.

While willfully ignoring all the people doing better.

Maybe we are- as you fear- stuck, forever, in thick JS-to-JS transpilers & massive bundles & heavy frameworks. Maybe. I don't think so.


If you stop thinking like a front-end web dev and take a step back from the whole thing, the status quo is truly insane and embarrassing. A many-billion-dollar, world-changing industry has been exclusively constrained to one dynamically typed, weird language cooked up by one guy in a hurry over 20 years ago.

How can such a phenomenon even exist, without programmers falling over themselves to create a development ecosystem with compilers and multiple language paradigms? The best answer I can come up with is that the history of web development is rooted in a culture that cares only about product design, user experience, and making a million dollars fast. It took a long time for systems programmers to get interested enough to provide the perspective of, well, a systems programmer.

It could be argued that not rocking the boat, and just accepting JS as the standard unconditionally, helped the web succeed. I don't agree with that. I don't see how having a good execution environment in the browser in 2004 could possibly have hurt the web.

