
I agree with this. Most companies don't want to spend the time and money training a candidate. This doesn't apply just to new college grads but also to more experienced candidates trying to find work where newer technologies are used. Someone who is a very good C programmer will never be hired as a Ruby on Rails developer, though they could probably learn RoR in 6 months. This person is stuck finding C jobs or ends up getting forced out of the market. The reason companies don't want to provide training is that they don't have anyone to provide it. In the good old days, a senior engineer would take the time to teach a new hire the code base, best practices in a language, and the tools involved, which would help the new hire ramp up fast. With the whole 'agile' process, everyone is focused on completing their sprints, and training your co-worker gets penalized. Some blame must be placed on experienced engineers who coast along at a Big Co until that company starts going under or starts layoffs. Suddenly they realize that they don't have any skills that the market needs.



There's so much they don't teach at coding school :( .

This is why entry level devs have a difficult time getting a job and devs with 1-2 years of experience are recruited. (In a more normal market; I understand that right now with all the layoffs, everyone is having a difficult time.)

There's so much on the job training that companies want someone else to do.


While this might sound like a good question on the surface, it does not get to the "root" of the problem. Let me explain ...

Yes, most companies prefer to hire people who already have the skills & experience rather than train "junior". This is not because companies don't want to develop the skills of their employees; it's simple economics. The biggest bottleneck in any company is experienced people. The senior engineers who already understand all the systems, have been to all the product meetings and solved many critical bugs in production. These people are the "goose that lays the golden egg". Most companies are looking for more of these "golden geese" who can be effective & contribute to the product immediately because the "ROI" on these people is 10x (or more!).

Training someone up from scratch in a key tech and all the company's systems usually has a negative "ROI" for the first 1-6 months and distracts senior people, so it's a "lose-lose" in the short run! Add to that the fact that most companies have a "LIFO" pattern with hiring (the most recent hires are usually the first people to exit!), and many hiring managers (HR) are put off by the idea of hiring people who do not already have the required skills.

Consider the following often repeated quote/saying:

CFO: What happens if we train them and they leave? CEO: What happens if we don’t and they stay?

A lot of people have the mindset that training people costs too much time, money & effort and it distracts the key people in the company away from their focus (building the product).

This is not the fault of the company or the people working there. It's a "systems problem"; most companies simply don't have an effective system for "on-boarding & training" new people.

I've worked for several companies over the past 20 years (including starting my own twice) and have been responsible for hiring & training thousands of people.

Training people in tech skills, company culture & workflow simultaneously is a "hard problem". If you can get a "head start" on at least one of these areas the chance of successfully integrating someone is much higher. HR people know this so they want to "check" as many of the skills boxes as possible up-front. You as the "junior dev" can use this information to your advantage and invest a few hours up-front to demonstrate the necessary skills and make the HR/hiring manager's job much easier!

My advice to any "junior" person reading this:

1. Focus on your own learning/skills for at least an hour every day (preferably first thing in the morning).

2. Share your learning somewhere public e.g: GitHub or a Blog. That way the hiring manager reviewing your "CV" has a clear indication that you are a "fast learner" and a "team player" who shares what they learn to help others "level up".

3. Pick the skill/tech/tool that is most valuable in your chosen industry/sector, or even target it to a specific company you want to work for. e.g: if you know that AirBnB uses React.js https://stackshare.io/airbnb/airbnb, find and devour all the best tutorials for learning React.

4. Consolidate your learning into a tutorial of your own to show that you have understood the tech/tool.

5. Link to it directly from your CV/LinkedIn.

Seriously, this will take you 20h at most. You could get it done in a week and it will transform your "hireability" from "no thanks" to "when can you start?".

I know this because I have used this strategy to get jobs & contract work in the past to excellent effect. Investing in your skills and sharing your knowledge is the single best time-investment you can make. It's a 1000x ROI! Put in 20h of focussed effort and you will get an extra $200K in the next 2-5 years. Guaranteed.

My advice to any company wanting to solve the "problem" of hiring "junior" people and making them effective as fast as possible is:

1. Commit to becoming a "learning organisation" where everyone in the company shares as much of what they learn as possible.

2. Establish metrics for learning in your company! "What gets measured gets done". If there is no actively tracked & visible metric for each person's learning, people will stagnate and default to using their existing "hammer". https://en.wikipedia.org/wiki/Law_of_the_instrument What you want is people who are proactively learning new skills/tech/tools and then bringing those skills into the company to improve effectiveness or build features that your current tech does not allow!

3. Systematically share anything that is not "sensitive" or "secret sauce" in public. Having private wikis with lots of learning is fine for internal use, but what if people could learn your "stack" before they join your company/team?

4. Hire the people who proactively contribute to the learning materials without being prompted. This is the mindset you are looking for: people who want to learn and share what they know regardless of getting paid.

Anyone looking for a proven example of any of this, see: https://github.com/dwyl?q=learn dwyl is a bootstrapped, profitable Open Source software company that shares all of its learning in public. [disclaimer: I co-founded it!]


Everyone claims in a job interview they're willing to learn as needed. You hire programmers because they know how to program and will be valuable from the start. Not some novice who did some rails tutorial.

I have only seen software training twice in my career.

At Travelocity I became a developer because I was involuntarily reclassified into development. If I wanted to keep my job I had to learn it. Any time tough challenges came up I was expected to train myself and figure out how to do it the right way. Copy/pasting from an external source or hoping some framework/tool would do my job for me was unacceptable. This line of thinking evaporated the moment I separated for a military deployment.

At Bank of America they will pay for you to attend some boot camp class about using some framework.

In absolutely every other case you were expected to know what you needed the moment you walked through the door, and there is never training or any other kind of professional growth once you get there. If you want to get better or learn some new skill, you are writing code on your own outside office hours.

That sets false expectations and bias in hiring and candidate selection. If you look at almost any posted developer open req, the listed qualifications are usually a list of tools, frameworks, and languages. Rarely is the list of qualifications specific to skills or performance.

The expectation is to hire a tool monkey that can turn a figurative wrench instead of an engineer who can create a more powerful/efficient system. The bias kicks in during candidate selection. The company knows it needs a capable candidate to be selected for the given set of requirements, but there is no formal industry guidance to define any kind of baseline.


Not that I disagree with you, but sadly, most companies don't seem to want to train people anymore. They'd rather push that responsibility to either universities or the applicants themselves.

In fairness, I do think that someone who is very comfortable with a language or framework will be able to do things much faster than someone who isn't. Considering how quickly people change jobs nowadays, I can also see how companies don't want to train people only so they will leave shortly after or even before they're useful.

Obviously this is just based on my own experience in the US.


>>, many employers would rather hire a couple of inexperienced computer programmer and spend a few months training them [...] In addition, many employers aren’t interested in providing training to engineers or programmers

>That directly contradicts the preceding paragraph,

The "many employers" can be 2 different subsets of employers and/or 2 different tech stack situations. Examples...

Subset (1): FAANG or "tech" companies will train on specific in-house technology stacks for younger new hires. The "inexperienced" was referencing "young". E.g. Apple hires fresh young college graduates who only did Scheme and Python in school but will train them on Objective-C and Swift so they can work on macOS and iOS. However, Apple typically doesn't hire older experienced COBOL programmers to re-train them in Swift.

Subset (2): companies that don't train new hires (many non-tech companies where IT/dev is a cost center). They usually don't recruit from college campuses and prefer job candidates to have existing skills that already match the job listing. E.g. a hospital IT department has a job listing for a Java programmer to help maintain their billing system. The hospital is not interested in taking a candidate whose skill set is only Turbo Pascal and Macromedia ColdFusion and retraining them on Java.


Part of the issue is that programming is largely seen today as a process of problem solving using an "agile" hack it 'till it works approach. Quick thinking to put out a fire or implement a quick feature using some new technology easily learned is most valued, along with the ability to be "flexible" when it comes to quality of work and workplace demands.

Given this, one might as well look to recruit some smart, cheap and enthusiastic graduate. Sadly industry experience and lessons learned don't seem so important.


That's a good point. The question is, is the return from training up someone worth it? Do they spend more time at the company than someone hired in with more experience? Do they get the company more in income than they cost in training?

The cost is in the near term and fixed, whereas the benefit is in the future and uncertain. That is why a company can't answer those questions, and often goes with the safe route and tries to poach or hire a senior person (which, honestly, is just a junior person who has made mistakes and learned on someone else's dime).

However that has its own risks because it's not like interviews are cost free.

I just think there's a tremendous market flaw that someone is going to take advantage of by finding talented entry level folks and hiring them for less and then making money somehow. That or eventually entry level folks will leave the profession of software development.


You're trying very hard to dismiss experience specific to a company and industry, but I'm not buying it. Someone who has intimate experience with an established code-base and the company will be vastly more proficient than a new employee of similar intellectual capability. It's why companies often try to hire people within similar industries and tech stacks to help reduce this gap.

It's like trying to argue that a mechanic who has only worked on Ford vehicles for 20 years should be able to quickly have the same proficiency as a mechanic who has 20 years of experience on only Porsches. Nonsense.

Remember, a company can typically expect to hold onto a developer for maybe 3-5 years, so even half a year of training is a significant portion of their tenure.


Right! What we are trying so hard to hire are not programmers, but good engineers. For candidates who don't have much industry experience, we are looking for potential. Some crash courses won't grant potential to candidates. Speaking frankly, we expect new hires to learn fast, to make a great jump after working with good senior engineers for one year. This one year helps the new hires connect the dots of their knowledge. That requires a solid background, which can't be acquired through crash courses.

There are genius programmers who don't need a CS or related degree. They are exceptions, though, not the norm.


More and more I think it's because it's difficult for companies to identify the capacity to be successful at this type of work early on. Partly because it's actually difficult, and partly because companies are really bad at identifying skills as a rule.

Instead they resort to the simplistic. Senior developers have made it through several filters. They've proven they can do the work, they've done it at several companies, and they've survived in the role enough to be promoted a couple times. That's just signal that they're less likely to run into problems, and that there's less risk the hire won't work out.

So, companies are lazy. Not intentionally—but because they're literally squeezing every ounce of time and focus into the very complex and wildly difficult task of managing a company and staying solvent and productive and going in the right direction.

They shouldn't be—they should have room for training and developing a talent pipeline—but this is the real world, and most companies can't even do basic company things really well, so the prospect of also doing on-the-job training really well to de-risk the hiring of junior engineers seems pretty far out.

In addition, many hiring managers have learned the hard way what '5 years of experience' means, because they've had 5 years of experience themselves, and only on the 5th year of that figured out how to do their job in the way they now know it needs to be done.

It's not an arbitrary set of skills they know they need, but an experience working in an ecosystem in specific ways that they know are crucial. Experience is not necessarily just skills, but a sequence of realizations, hardships, events, and successes that teach you things you can only learn by going through them.

Hiring managers hiring for a Senior role and pointing to a checklist of technical skills are probably not being fully forthcoming—they're likely looking for someone who does not see their own value as a checklist of technical skills.

Not saying that's you necessarily, and they could certainly be wrong, but personally I've learned that when the team needs someone with 5 years of experience, it takes 5 years to develop it, and there are no shortcuts to gaining that experience. The one thing I've found that speeds it up is experience in smaller companies or starting your own business—you'll experience a lot more much faster.

That said, there are companies that truly invest in on-the-job training, and with a holistic business model centered around that as a core value, it can be successful. It takes a philosophy, though, that most businesses will never mature enough to reach, dare I say, especially tech businesses.

The one I know of is the Greyston Bakery in NY, which was started by a Zen monk named Bernie Glassman (who passed away a couple weeks ago, sadly). His book "Instructions to the Cook" is really interesting, and outlines how they made a policy of hiring anyone who wanted a job, and made it work very well. They now run the Center for Open Hiring, which helps other companies do the same thing.

It would be pretty incredible to see a tech company embrace that kind of thing and really invest in hiring and development as a strategy. I'd almost imagine it as a merger of something like General Assembly with an actual product and long-term business. It would be interesting to see if the significantly greater investment would be worth it over the traditional model.


I can't disagree with this person more.

More than anything, I have learned that education and training are hugely important, and hiring to train leads to mediocre staff who think their two years of development work stack up to your 4 years of college and 6 years of professional experience.

They take forever to start writing productive code, if they ever bother learning at all.

I will never hire someone without a degree or equivalent experience again, even for Jr. roles.


I agree with your points.

It's like no company wants to train anyone (internal or external candidates), and they expect you to start working on their production system/code almost immediately.

Self-training and a really outstanding portfolio are the way to go.


I was an inexperienced candidate when I got my first dev job. I had literally some very basic front end skills, in that I could use jquery to change font colors. I'm now a fairly seasoned backend developer - if nobody had taken a risk on me (for low pay at first) or mentored me, I would not be anywhere near where I am today. We treat training like a cost center, and despite the huge amount of money being made out of the tech industry, there are very few companies, organizations, or even groups of people that prioritize taking the time to make people productive. I don't know the solution, but it seems that there is a huge untapped amount of potential skill that will never get used.

Why not hire someone you think is smart but inexperienced into a low level programming job?

There's lots of chores that don't exactly take a lot of deep knowledge.

(Of course the answer is that companies have discarded the idea of having a training pipeline)


I don't think this matters in big software nearly as much as in other fields. Every new hire needs to be trained up - even the most experienced are going to require at least a few months before they are useful to the company.

That’s totally fair (and there indeed was a time when the typical response to my resume was, "we are not looking for juniors") :-)

But this argument makes sense only from the collective standpoint (as in, the industry in general). Of course developers have to be trained by someone, because otherwise there will be no dissemination of knowledge, no passing down of experience, and no qualified workforce. So of course that should be done.

The question is whether you (as a programmer on a team looking for new members, or as a company looking for new employees) need to be doing this, or whether it's all right to let someone else do this while you look for already well-trained and experienced professionals.


Serious question: Why did it never occur that you might need to take in a less experienced developer and train them? Wouldn't it make sense that there are no experienced developers available because they are currently fully employed?

I agree that companies need to spend more time developing talent than chasing a dwindling pool of people who already have it. My mom had a similar career path to your father - she was hired out of college in the late 60s with zero programming experience by one of the railways to work on scheduling systems.

There is one difference, though - the opportunity to learn programming (actually, to learn most things) has exploded in the past few decades. In 1969, I think you needed access to expensive technology (like a mainframe) to have the opportunity to learn. Now, a few hundred bucks and a web connection gets you started. So if someone wants to be hired into a programming position with no experience in 2013, that does say something about the candidate that it didn't say in 1969.

