I'm afraid to mention (since many might feel offended) that IMO most "web 2.0 startups" are full of incompetent, self-proclaimed "software developers" who have three months of experience in PHP and JS, build a "cool app", and are then considered "technically talented" just because that app got some media exposure and brought in some users.
There's much more to actual technical talent in building software than what you get from a year or two of experience, let alone three months. Though in an era when you can build a usable software product in weeks, if not days (as the many interesting "weekend projects" mentioned here on HN show), this doesn't matter as much as it did in the past. Game development, for example, has never been as easy as it is today; there are thousands, if not millions, of young people creating games for mobile devices. Some of them succeed in making money, but that is most certainly not due to technical talent, as many might easily assume.
Back in the '90s, kids (under 20) literally wrote their own 3D renderers in software because there was no hardware acceleration available. Now, 20 years later, how many teens could write their own 3D rendering engine even with graphics APIs, let alone entirely in software? Although the number of programmers has gone up, I think the amount of competence has gone down, simply because technical competence is far less needed these days.
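To give a sense of what that actually involved, here's a minimal sketch (illustrative only; the field of view, screen size, and cube coordinates are made up) of the very core of a software renderer: rotating a 3D point and projecting it to 2D screen coordinates with nothing but arithmetic, no graphics API in sight. A real renderer from that era layered rasterization, depth sorting, shading, and per-pixel work on top of this.

    // Minimal software-rendering sketch in TypeScript: rotate a point, then project it.
    // Everything here is plain arithmetic; no graphics API is used.

    type Vec3 = { x: number; y: number; z: number };

    // Rotate a point around the Y axis by `angle` radians.
    function rotateY(p: Vec3, angle: number): Vec3 {
      const c = Math.cos(angle), s = Math.sin(angle);
      return { x: p.x * c + p.z * s, y: p.y, z: -p.x * s + p.z * c };
    }

    // Perspective projection: scale by distance, then map to pixel coordinates.
    function project(p: Vec3, width: number, height: number, focal: number): [number, number] {
      const scale = focal / (focal + p.z); // points farther away shrink toward the center
      return [p.x * scale + width / 2, p.y * scale + height / 2];
    }

    // The 8 corners of a cube centered on the origin.
    const cube: Vec3[] = [];
    for (const x of [-1, 1]) for (const y of [-1, 1]) for (const z of [-1, 1])
      cube.push({ x: x * 50, y: y * 50, z: z * 50 });

    // Rotate the cube, push it in front of the "camera", and project each corner.
    for (const v of cube) {
      const r = rotateY(v, Math.PI / 6);
      const [sx, sy] = project({ x: r.x, y: r.y, z: r.z + 200 }, 320, 200, 256);
      console.log(`corner at screen (${sx.toFixed(1)}, ${sy.toFixed(1)})`);
    }

And that's before you draw a single pixel; the point is that those kids worked out this kind of math themselves, per vertex, per frame.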
These days "building web apps" demands far less technical talent than creating even a simple client-side desktop program did, say, 10 years ago. Yet now, more than ever, there's talk about technical talent. Strange.
Exactly. Having started building web apps (back when they were just called CGI) in 1996 and seen all of the app servers and platforms come and go, I'd say there has never been a better time than now to be a web developer, thanks to products like Heroku, New Relic, SauceLabs, CloudBees, etc. I'm constantly blown away by the improvements in developer efficiency being made seemingly every day right now. This is why you are seeing a ton of startups (even the well-funded ones) hiring only a small, focused technical team. What is rather funny is watching how fragmented the mobile development space has become, with some exceptions (PhoneGap, SpacePort, Mobify, etc.), especially considering that's where people are so rapidly moving their computing. But I'm no mobile expert, so my perception is probably skewed.
(1) Have you considered looking at it through the lens of "engineering" vs. "web development"? I think there's much more of a methodology involved if you do, since the primary question becomes "how do I solve challenging problems with computers, programming, and math?" rather than "how do I master some domain-specific technology so I can write JavaScript and CSS to make a dynamic website?"
(2) It's an over-generalization to believe that the best developers are awkward and socially inept. Do you have enough data points? The best developers are actually great collaborators who care most about the problem they're solving and the value it provides to users. Often that makes them stronger communicators and peers, because they're working with a team of other skilled people (designers, product managers, customer-facing teams) to find the best, most impactful solution. I'd rather hire a 10-20% less skilled developer who has great communication and collaboration skills. Have you considered working at an organization where a larger team is already in place?
(3) It's easier to learn to write code than ever before, but that doesn't mean the value of those skills is plateauing. In fact, I believe the bar is rising: machine learning/AI, scaling large distributed systems, rethinking how we build software as hardware improves and gets cheaper, etc. Great engineering talent is still extremely scarce. I won't really comment on how you feel about one's appearance (I don't know why that matters) or communication style (pick a culture you like).
I mean, sure, one can still build sane web projects; just don't expect to get a job doing that, and do expect to be mocked as some sort of Luddite.
When we build web apps for clients, they generally don't care what tools we use as long as the results are good. There's still plenty of work out there like this and there are still plenty of success stories.
You just have to avoid the parts of the industry, particularly around the startup scene, where most developers have just a few years of experience and are heavily influenced by high-profile online commentary. That type of business tends to hire buzzword-compliant people, because even their own "senior" developers don't have enough experience to know any better.
It seems to be more of an Internet culture thing than a software developer thing. There's a huge industry of C/C++/Ada/Assembly/.NET/JVM programmers out there, behind the scenes. Many of them work for big companies, and big ol' companies generally aren't cool with their employees even talking about what they do online. You know that old "the opinions expressed in this blog are my own and do not represent the company I work for or their investors or their advertisers" spiel? That's considered lenient.
But the Internet likes to talk about itself, so here it seems the only world is JavaScript. And this corner of the Internet likes to talk about "tech startups", a very broad-sounding term that actually refers to a very narrow set of almost exclusively web-technology companies. Not much hardware comes out of YC.
Additionally, this is somewhat of a "get rich quick" crowd, and nothing gets you a bigger market faster than a "web app", whatever you define that to mean. It was always hard to get the average computer user to install desktop software, let alone pay for it, and that hasn't changed much.
I really don't agree with this. I'm a 27-year-old JS-focused web dev at $150k base.
I understand that bootcamps and the proliferation of web-teaching schools have made web dev seem like the easy option, and I agree that what I do is less complex than what those $800k-earning data engineers do, but web development in 2019 is no joke. The same goes for iOS/Android.
The types of things these companies are trying to build are not "slap it together in a week and done" projects, and it takes experience to architect and engineer them effectively. Your average bootcamp grad can't do that yet.
iOS, Android, and the web are the three primary portals to a company's clients, and as long as that's true, the value of those skill sets will remain high.
I think people forget just how new the web industry is. The web itself has literally only existed since 1992. It's only been commercially viable to charge people for websites since around 1995. Web development in the modern "building apps that are accessed in a browser" sense has only been done for about 20 years. If you're talking about "single-page apps", that drops to about 10 years.
Someone with two years' experience in web development has been working in the industry for 10% of the entire time the industry has existed. If they've got a three-year degree and a few years building things as a kid before that, they might have been making web stuff for 50% of the time anyone in the industry has been working in it.
"Time served" as a measure of experience doesn't really work in an industry that's as new as the web, nor does it work in an industry where the barrier to making a website is knowing how to open Notepad.exe.
Really good people could definitely make senior level by the time they've been working professionally for 2 years.
The reality is that lightweight or even hand-coded apps and sites are faster, more semantic, and easier to maintain, and often look less cookie-cutter. But you don't get hired for knowing how to create a great web app with HTML, CSS, a server-side language, and a little JS or jQuery for flavor. You get hired for becoming an "expert" in something that has only been around for 2.5 years and is at the inflection point of the hype curve. What you're seeing is the result of resume-driven development.
The problem with web development isn't that there is a resurgence of bad development practices. It's that the average web developer isn't as skilled as they need to be.
The average U.S. worker spends about 4.6 years in a given job, yet it takes ~5 years to master a framework. The average computer science grad earns much more than a web developer, so the skilled people tend to go elsewhere and the skill set required for proper development is often lacking. Add to the mix cheap offshore labour, poorly made "out of the box" web packages aimed at small and medium businesses, and inexperienced "geeks" who build poor websites on the cheap.
Given that the skill set required for professional-level web development is on par with that of software development, it's no surprise the role isn't attracting the skilled people the career requires.
In my opinion, a large percentage of web developers really are unskilled. I blame the prevalence of those three-week development bootcamps; there are a lot more of those for web development than for backend/application work.
A quick glance at TechCrunch's job board shows lots of employers looking for talented web developers. Their requirements usually include all the latest Web 2.0 technologies like jQuery, YUI, Dojo, Prototype, and Rails, plus PHP, SQL, etc.
I don't work in the industry and am not sure what it's actually like out there: is it hard to find people who possess all of these skills?
I don't think it takes more skill, just different skill. I currently work on a DAW. I left for a while because I thought my skills were becoming obsolete and I needed to get in on this 'web' thing. I was useless, and I lasted 9 months. Just jumping in and learning HTML/JS/CSS plus everything else you need to be a web developer is a huge task. And a lot of stuff just doesn't work -- and the only way to know how to get it to work is experience.
By contrast, I think C++ and realtime audio programming are simple, but then I've slowly picked them up over almost 20 years. I retreated back to desktop apps and firmware. I'll take another stab at learning web technologies, but not at a startup where I need to deliver ASAP and learn at the same time.
As much as developers hate sandboxes, the lack of one (with the recent exception of Mac OS) is why I think the desktop is failing. Users just don't feel safe installing random software the way they do visiting websites or installing mobile apps. Users have been conditioned to install software only from sources they trust, and they trust no one.
If I post a link to my new web project, most people will click on it. If I post a link to an .exe, almost nobody will download it. I think that's the main issue killing the desktop. Big downloads, compatibility issues, and slow installs are also issues, but I think they're secondary.
The real problem is that to compete in today's world, your website:
1. Needs to have great design
2. Needs to be rock-solid in terms of stability
3. Needs to be snappy to use
4. Needs to be able to handle a Reddit traffic swarm
5. Needs to support personalization and customization
6. Needs to play nicely on computers, tablets, and phones
The person (or, more likely, people) who is good at all of this just isn't available at the rates companies want to pay. That's the important distinction. Hiring somebody at $100k/yr is pretty much the minimum ante for a good developer in a tech city. But these "entrepreneurs" and "founders" think that's too much money, and instead claim there's a shortage.
In reality, if you're competing for top talent, $150k/yr should be your ante, and be prepared to go well into the $200s. This shouldn't be a surprise. To get even average talent from a consulting/contracting company, you're paying about $125/hr (of which $60-70 goes to the developer); at roughly 2,000 billable hours a year, that's $250k/yr. So why are companies freaking out about paying over $100k/yr for a full-time person they can invest in? Hell, cut out the contractors, pay your people $200k/yr, and watch the good people come talk to you.
This is what Amazon and the like are doing now. I'm hearing about 22-year-old kids who haven't coded a line in a professional capacity getting offers of $120k/yr with a $20k signing bonus. Netflix pays a significant chunk of its engineers over $200k.
The only real problem is that it's getting more expensive to do business in the tech world. But compare the skill sets that used to draw $100k/yr with the skill sets now required to earn the equivalent (roughly $144k in 2014 dollars for $100k in 1998 dollars). Before, you could get by knowing HTML and some shell scripting. Now you need to know multiple frameworks, how to deal with scalability, multiple data stores, "full stack", etc.
If anything, the good talent in this industry is still dramatically underpaid. A common metric is that the company should make about 7x your salary from you to justify your position. So if a company is making $2.1 million off your work, you should be getting $300k. But we all know that's not what's happening. Instead, our wages are being capped and forced down through illegal collusion among the top tech employers.
This all comes down to the people who own the businesses wanting to treat tech people as interchangeable cogs, despite all the evidence to the contrary.
Sorry to hear about these experiences; you've devoted years to building up experience in a field, and that deserves recognition.
But I wonder if this is mostly a US-centric phenomenon. Or maybe it's a tribal mentality: I wonder if web devs from 20 years ago face the same problem with younger web devs today. Or perhaps it's that a large number of web devs consider everything and everyone that doesn't work in the browser to be stone-age.
As an anecdote, I once worked with a gentleman twice my age who cut his teeth writing assembly for an obscure HP platform, and I had huge respect for him. Even though we were developing a modern full-stack web app in Python and React, it was obvious he had an excellent mind for the tough engineering questions.
The bar's certainly a lot higher. When I was in high school in the late '90s, a bunch of us kids got jobs as web monkeys writing HTML for the dot-com boom. It was something you could easily teach yourself as a student.
Nowadays, Google's having trouble hiring enough people with sufficient web skills. The reason is that the web has basically "grown up". In addition to HTML, you also need to know CSS, JavaScript, how to write maintainable JavaScript, DOM APIs, browser quirks, performance tricks, emerging web standards, and at least one server-side programming language. The definition of a "web developer" has shifted so that it's really a full-fledged programmer with specific domain knowledge, akin to the skill level needed to do compilers or embedded systems. And the barriers to entry have become correspondingly higher; you don't see as many high school kids writing major websites any more.
Five years ago is a lifetime in tech; the blog clearly mentions the difference. The web dev stack is a dead end professionally? That doesn't line up with reality at all, does it?
You attracted better-qualified colleagues? How so exactly? You have a smaller pool of candidates to choose from. Are you implying native developers are just smarter?
Outdated opinions, favouritism and elitism all wrapped in one.
The problem is "web developer" is a junior role now. Nobody hires web developers with 10 years of experience unless they know the underlying "software engineer" stuff to go along with it. The market is flooded with bootcamp devs that are now doing what was considered 'software engineer only' territory 10 years ago. Some designers are also getting wise and realizing that HTML/CSS and javascript aren't beyond their ability to learn.