And at the end of the day, students should be trained to teach themselves (including things like Git and Unix).
I'm curious if GP (or anyone else) has been in the position of hiring entry-level and finding someone who they felt didn't work out because the new hire was incapable of learning tools. Or did the new hire eventually teach themselves?
Can they do the job they were hired for? Because if someone has a good attitude, can write good enough code, and is willing to learn, I can quickly teach them the Linux basics.
Like, if someone doesn't know how to chmod or how to set up a .zshrc, I can get them up and running in an afternoon.
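In my experience that afternoon is mostly a short checklist like this (file names hypothetical):

    chmod u+x build.sh        # make a script executable for its owner
    ./build.sh                # ...and run it
    chmod 644 notes.txt       # owner read/write, everyone else read-only

    # plus a starter ~/.zshrc:
    export PATH="$HOME/bin:$PATH"   # pick up your own scripts
    export EDITOR=vim               # default editor for git and friends
    alias ll='ls -lah'              # the alias everyone ends up with

The rest is pointing them at the relevant man pages.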
These folks aren't clueless, they've just learned at a higher level of abstraction than you did. That's fine.
Nobody ever had to teach me these things, I taught them to myself because I have a little thing called motivation. Do we really want all of these students who couldn't even be bothered to learn about something as simple and ubiquitous as version control wasting everybody's time?
This sort of ignorance isn't even caused simply by a lack of interest in open source. Without these very basic skills, you're pretty much unhireable.
I agree. I'm a self-taught programmer, and I think the learning curve was reduced quite a bit because I had been using Linux on the desktop for years, and I was already familiar with the shell and many of the system concepts.
My new-grad coworkers have more trouble with the build systems, version control systems, deploy systems, and servers than they do with writing code. I think it would be beneficial if more CS programs had a course focused on real-world tooling as it is used in the workplace.
Shell stuff, basic coding, tooling. I don't have a problem with people needing to learn. I have a problem with someone claiming 10 years of experience and me having to do their job for them while paired up (literally telling them what to type, and them not understanding what or why).
If someone is joining a company to work full time, I can see how teaching them to use the tools "correctly" is good, and new skills are always a plus. But I frequently work on projects with freelancers whom I'll only speak to for 4/6/8 weeks before they move on. Particularly with JavaScript/frontend devs and designers, it's tough enough to get them to use version tracking correctly with something 'point-and-click' (GitHub for Mac etc.); if I told them they had to learn the command line too, they'd either tell me to go away or increase their fee.
People /should/ know how to use the command line, even if they prefer to use other tools daily, but often it's not practical to force people to use it.
So you basically sat on your ass and said "somebody teach me"? No wonder your colleagues from university got so much better.
I never got much help from other people. In the beginning, I didn't have anybody interested in computers around me at all, yet I managed to learn programming. Then, when I started studying IT and switched to Linux, I didn't have anybody to systematically learn Unix from. This happened when I didn't have access to the internet, and we had very little learning material published (first half of the '00s), so it wasn't even possible for me to ask a question on StackOverflow or something.
And then the pattern repeats a number of times.
It's not mentoring that helps people grow. It's learning. It can be done with a teacher/mentor (and in fact it's easier in a number of fields), but it doesn't require one. Don't blame others for your own idleness.
Spend money to get your system into schools, because a huge proportion of workers will adamantly HATE anybody who tries to make them learn something new after their first few years working.
Apple figured it out. Microsoft figured it out. Google is figuring it out. GNU hasn't quite gotten there.
1. There are a decent number of software engineers or programmers who literally came of age at the perfect time to organically learn these tools that later became fundamental. If you touched a computer from the '60s to the late '90s in any engineering capacity at all, you were bound to have worked in a terminal, worked on computers with a single core, worked on computers with very little memory, had to either get comfortable with some lower-level tooling or build your own, at some point had to mess with networking or at least understand how packets were being sent and received, seen and gone through iterations of version control and saving your work, and automated tasks using shell scripts.
2. While there is a plethora of knowledge, videos, tutorials, and flavors of ways to learn these things, the sheer volume and breadth presented to newcomers is daunting. Yes, you can learn Git, but there are so many ways to do it that it can cause analysis paralysis. Building on point (1): if you learned it early, there were only a few ways it was being done, often shared in smaller communities. Too many choices or paths can lead to just using the way someone showed you, without digging one layer deeper, because you might not know better.
All of those things you ‘caught’ by being at the right place at the right time are a privilege. Please don’t look down on people trying to aspire to learn or want to enter into this field that haven’t got there yet.
Coming from a family of immigrants and being the first person in my family to graduate college and enter SWE, I cannot count how many times other engineers were rude, made condescending remarks, or discouraged me by shoving these expectations in my face as failures instead of taking the opportunity to teach (I was always open to learning).
I've watched a number of new grad hires pick up bash, vim, and version control from scratch in a month or two and go on to be very successful. For better or worse, some good schools don't cover those sorts of ancillary skills, and not every good candidate will tinker with Linux as a hobby.
It should be the job of the environment. While I agree about learning general skills, let's face it: Unix commands just aren't discoverable outside of a curated list from a teacher or other resource.
Yes, and then they learn all about source control. It's really not that hard and is pretty easy to learn, which is why you don't need to pay someone to teach it to you.
All of this seems like stuff I would expect someone studying CS to pick up on their own. I learned to use git in college, but not because anyone taught me: I just figured it out because I had group projects and needed version control. On the other hand, there's no way I could have possibly learned how something like ray-tracing worked without learning it in school or doing some really intense self-study. I feel like it's a waste of class-time to teach stuff like git, shell scripting, etc. If you're smart enough to understand graphics or compilers, you should be able to learn those tools on your own without any trouble.
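For what it's worth, the part a group project forces you to figure out really is just a handful of commands (repo URL hypothetical):

    git clone https://example.com/team/project.git   # get the shared repo
    git checkout -b my-feature      # do your work on a branch
    git add -A                      # stage your changes
    git commit -m "Implement X"     # snapshot them
    git pull --rebase origin main   # pick up teammates' work
    git push origin my-feature      # publish the branch

Everything past that (rebase surgery, bisect, reflog) can be picked up lazily the first time you need it.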
While teaching yourself tools is obviously a useful skill, it's also nice to have someone experienced give you a full overview of how things work. For example, I learned how to use a shell on my own, but picked up a lot of features I didn't know by taking a system programming course.
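A few concrete examples of the kind of features I mean, in bash/zsh (illustrative, not from the course itself):

    sudo !!            # rerun the previous command with sudo
    cd -               # jump back to the previous directory
    # ctrl-r           # reverse-search your shell history
    jobs               # list suspended processes; resume with fg or bg
    set -euo pipefail  # make scripts fail loudly instead of silently

You can use a shell for years without stumbling into any of these on your own.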
UIUC has a similar course which teaches a variety of commonly used tools and is graded based on open source contributions.
Although I commend the idea of practical learning, I wouldn't want to be graded on learning bash or makefiles, sorry. Those things should be self-taught.
I was shocked when I ran across a relatively decent “full stack” developer who ran Linux but had almost no working knowledge of the command line. All they knew was copying commands from Google and pasting them into a terminal.
I never thought I'd have to check for basic command line proficiency in job interviews.
I don't think you're wrong that you can learn this stuff on the job, but a "primer" kind of class like this, which surveys several useful tools, helps new engineers develop pattern-matching skills for continuing to self-teach these kinds of things. Shell/scripty/vim-ey/Linux-ey stuff can be really challenging to learn how to learn for some people.
>I'd respectfully disagree with this one unless the steep learning curve is a requirement to the task at hand.
The task at hand is to learn basic Linux. You shouldn't be editing server configs or performing other higher-level tasks until you have at least basic competency with the system, and learning text editing with vi is one of the most basic and useful skills to have, since it's ubiquitous.
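The survival subset of vi fits on an index card, which is part of why it's worth teaching first:

    vi config.txt    # open a file (name hypothetical)
    i                # enter insert mode and type normally
    Esc              # return to normal mode
    :w               # save
    :q!              # quit without saving
    :wq              # save and quit
    /error           # search forward for "error"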
The reason I'm so opposed to this teaching philosophy is that "just get started developing"-first approaches (instead of admin-first) invariably seem to lead to sloppy and/or insecure products. You end up with yeehaw "chmod -R 777 to make it work / SELinux is a hassle, always turn it off" developers who have to be painfully retrained once they get into the real world (or worse, are left to their own devices, and you end up with horror stories like unsecured databases exposed to the internet with PII or PCI data stored in them).
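Concretely, the retraining usually amounts to replacing the 777 hammer with scoped ownership and modes, something like this (paths and users hypothetical):

    # instead of: chmod -R 777 /var/www/app   (world-writable; never do this)
    sudo chown -R deploy:www-data /var/www/app           # app owner + web server group
    sudo find /var/www/app -type d -exec chmod 750 {} +  # dirs: group can traverse, world gets nothing
    sudo find /var/www/app -type f -exec chmod 640 {} +  # files: group read-only, world gets nothing
    sudo chmod -R g+w /var/www/app/uploads               # grant write only where it's actually needed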
It is impossible to develop good quality software if you don't understand the platform you're building on top of, and the concerns of the people who'll end up using your software.
>I read this as "throw em in the lake, they'll learn to swim" vs. being honored to help them start
I see your approach the same way - skipping the basics and throwing them in at the deep end. So many projects suffer because of lack of knowledge about systems administration, basic networking, security, etc.