All of this seems like stuff I would expect someone studying CS to pick up on their own. I learned to use git in college, but not because anyone taught me: I just figured it out because I had group projects and needed version control. On the other hand, there's no way I could have possibly learned how something like ray-tracing worked without learning it in school or doing some really intense self-study. I feel like it's a waste of class-time to teach stuff like git, shell scripting, etc. If you're smart enough to understand graphics or compilers, you should be able to learn those tools on your own without any trouble.
I agree that knowing how to use tools like editors is super important.
But it's not "left out" of CS courses. Rather, you're expected to figure that out yourself using any of the millions of existing resources available.
In other words, instead of spending a week teaching you Git, you are shown the concept of version control and the course moves on. You are _expected_ to Google "how to use git". Precious CS time should be spent teaching concepts, not products.
This has the advantage of making you flexible. You learnt Git at college but got a job at a shop using Mercurial? No worries, you did it once, you can do it again; just Google "how to use Mercurial".
It's the same with languages. It really doesn't matter which language you learn; what matters is that you learn how to program. You can learn another language in a matter of days.
At college we once did 10 languages in 10 days, with an assignment in each, just to prove the point. Clearly we didn't turn into experts, but it showed that the concepts of a language matter more than its syntax. Switching between functional (Lisp-like) and imperative (C-like) styles is more interesting, but by that point we already had exposure to both.
You can learn it yourself, but when I took a similar course, it opened my eyes to which tools were useful for what and how to apply them productively. (I work with people now - software engineers - who basically see any command-line fu as magic they couldn't possibly understand.)
While learning tools on your own is obviously a useful skill, it's also nice to have someone experienced give you a full overview of how things work. For example, I learned how to use a shell on my own, but picked up a lot of features I didn't know about by taking a system programming course.
UIUC has a similar course which teaches a variety of commonly used tools, and is graded based on open source contributions.
I agree. I'm a self-taught programmer, and I think the learning curve was reduced quite a bit because I had been using Linux on the desktop for years, and I was already familiar with the shell and many of the system concepts.
My new-grad coworkers have more trouble with the build systems, version control systems, deploy systems, and servers than they do with writing code. I think it would be beneficial if more CS programs had a course focused on real-world tooling as it is used in the workplace.
It highly depends on where and what exactly you study, but in many cases you've probably had shell basics explained to you once and can get by knowing how to copy-paste commands to install packages, plus basic git usage.
Of course, many students are interested in more and learn it, but it isn't a requirement, so students with other interests have no real need for it.
Terminal (bash shell) and git are comparatively easy to learn. You can learn very basic functionality with those in a weekend... so no real downside to picking those up -- http://gitimmersion.com is good for git. CS fundamentals like algorithms and data structures are obviously another can of worms. Luckily, while knowing them will give you a far higher ceiling as a developer, they are not always required to contribute to a project in an employable way (depending on the company). So you could definitely get hired as an intern and then dedicate yourself to going back and learning the CS fundamentals over time. I'd say you would want to make it one of your primary personal goals over the first couple of years if you want to really get the most out of your potential in the industry. But it's not a barrier to entry.
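For reference, the "very basic functionality" I mean is roughly the sketch below: a minimal local git workflow (the repository and file names are just placeholders, not anything from a particular tutorial):

```bash
# create a repository, commit, branch, and merge -- the weekend basics
git init myproject && cd myproject
echo "hello" > notes.txt
git add notes.txt
git commit -m "first commit"

git checkout -b experiment        # do throwaway work on a branch
echo "more" >> notes.txt
git commit -am "try something"

git checkout -                    # back to the original branch
git merge experiment              # fold the experiment back in
```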
Or even better, why should the university waste time teaching me petty things like version control instead of focusing on things that I cannot learn by myself?
The next step would be an IPython notebook versus showing the student the debugger console with prepopulated state in JavaScript. In fact, later lessons as modules to interact with in a notebook would be kind of cool.
I really don't buy the "you have to own a car to drive one" argument, and the whole point of modern software engineering is to pull you away from the assumption of full system control and the ability to make problems go away with shell skills.
But I would like more immediate source code management integration. That is the essential thing I always see lacking.
A non-programmer who understands git basics would be more helpful to me as a colleague than a competent programmer who doesn't.
Depends on the person and their specifics. A CS student should absolutely take the time to learn tools like vim, awk, gdb, etc. For a self-taught dev who is already working and already has their habits, I don't think it's worth the time.
It makes me wonder how this sort of knowledge could be acquired in the first place. Many students studying CS haven't seen a command line before entering university, and intro classes usually encourage students to use some IDE. I've known classmates who had no idea they could run `cmake . && make && ./main` on the command line instead of clicking a button.
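For anyone in that boat, here's roughly what that one-liner does, step by step (assuming a project with a CMakeLists.txt that builds an executable called `main`):

```bash
cmake .      # generate the build files -- the "configure" step the IDE hides
make         # compile and link, i.e. the IDE's build button
./main       # run the resulting binary
```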
And at the end of the day, students should be trained to teach themselves (including things like Git, and Unix).
I'm curious if GP (or anyone else) has been in the position of hiring entry-level and finding someone who they felt didn't work out because the new hire was incapable of learning tools. Or did the new hire eventually teach themselves?
In general, I agree with you; I’m not suggesting that the whole degree be built around these tools, just that they should be taught, preferably early on, alongside the traditional intro to CS course.
But it’s not just about the tools; there are some seriously timeless concepts and skills under the surface: regexps, environment variables, users and user groups, documentation comprehension, piping, filesystems, streams, processes, to name a handful. These apply to any operating system or programming environment, and they give some concrete foundation to the theories you learn elsewhere.
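To make that concrete, here's a throwaway example (hypothetical log path) that touches several of those at once: an environment variable, a regexp, and a pipeline of small processes connected by streams:

```bash
export LOG_DIR=/var/log/myapp                # environment variable, inherited by child processes
grep -E 'ERROR|WARN' "$LOG_DIR/app.log" \
  | sort | uniq -c | sort -rn | head         # regexp filter feeding a pipeline of processes
```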
Plus, some of these tools really do merit a “timeless” label at this point. Vim, git, bash (and bash-inspired shells), and most of the GNU utilities have been around for a very long time and have not been conclusively supplanted by more powerful tools.
If you divide the sticker price of my college education by the number of class hours, the implication is that one hour of instruction costs about $80. In the harsh light of that fact, I would still have paid a few hundred bucks to learn CVS or SVN in college rather than learning bad habits. My first two jobs programming (academic and quasi-academic) didn't use source control, and I kept my bad habits until I got into industry and was dragged kicking and screaming into professionalism. I think source control should be taught starting the first day of CS101. If the exact tool changes in 5 years, oh well, you can learn the new tool. But it should be an automatic, instantaneous, ingrained part of your process from day one. (Ditto IDEs and basic Unix system administration.)
Four courses that have been worth substantially more than $80 an hour to me: Japanese, Technical Writing, AI (mostly because it really should have been called Introduction To Scripting Languages), and (weirdly enough) my single course on assembly. That was entirely due to a twenty minute discussion with my professor that had an effect on me, the general gist of which was "Any performance problem can be solved by caching, if you do it right." I haven't programmed a single line of assembly in my professional career but every time a performance problem comes up I cache and the problem goes away. (And is replaced by a cache expiration problem.)
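In shell terms the pattern really is that dumb, something like the sketch below (hypothetical `expensive_report` command; `stat -c %Y` is the GNU flavor): reuse a saved result until it goes stale, and accept that choosing MAX_AGE is now your cache expiration problem.

```bash
CACHE=/tmp/report.cache
MAX_AGE=300   # seconds before the cached result counts as expired

# only rerun the slow step when the cache is missing or stale
if [ ! -f "$CACHE" ] || [ $(( $(date +%s) - $(stat -c %Y "$CACHE") )) -gt "$MAX_AGE" ]; then
    expensive_report > "$CACHE"
fi
cat "$CACHE"
```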
Although I commend the idea of practical learning, I wouldn't want to be graded on learning about bash or makefiles, sorry. Those things should be self-taught.
I realize your question is rhetorical, but there are tons of people. Someone new to programming, in a CS course that uses git, for example, would be familiar with basic git, but many would be unfamiliar with the path (or on Windows).
I agree with this. I find that a lot of enthusiasts get caught up in learning to use tools, instead of understanding fundamentals.
Vim, git and Linux are great, but they are tools for getting a job done. Once the fundamentals of computer science are learned, these tools can be picked up rather quickly.
I'm all for this approach, as I'm a huge Linux/vim/git fan, but all of this must come after the fundamentals are in place. It's very easy to overwhelm someone with too much information about tools and terms.
Now if you're the next Linus Torvalds, then this is an excellent approach. Us mere mortals shall marvel at your talents from afar.
1. There are a decent number of software engineers or programmers who literally came of age at the perfect time to organically learn these tools that later became fundamental. If you touched a computer from the '60s to the late '90s in an engineering capacity at all, you were bound to have worked in a terminal, worked on computers with a single core and very little memory, gotten comfortable with some lower-level tooling or built your own, at some point messed with networking or at least understood how packets were sent and received, gone through iterations of version control and of saving your work, and automated tasks using shell scripts.
2. While there is a plethora of knowledge, videos, tutorials, and flavors of ways to learn these things, the sheer volume and breadth presented to newcomers is daunting. Yes, you can learn Git, but there are so many ways to do it that it can cause analysis paralysis. Building on point (1), if you learned it early there were only a few ways it was being done, often shared in smaller communities. Too many choices or paths can lead to just using the way someone showed you, without digging one layer deeper, simply because you might not know better.
All of those things you ‘caught’ by being at the right place at the right time are a privilege. Please don’t look down on people trying to aspire to learn or want to enter into this field that haven’t got there yet.
Coming from a family of immigrants and being the first person in my family to graduate college and enter SWE, I cannot count how many times other engineers were rude, made condescending remarks, or discouraged me by shoving these expectations in my face as failures instead of taking the opportunity to teach (I was always open to learning).
I do not agree. If you follow a course about algorithms or data structures, or Java, or parallel programming, you should focus on that. Learning an operating system or the command line at the same time would introduce unnecessary complexity.
Anyway, when I was a student at university, most of my colleagues knew the command line and had a strong understanding of operating systems. By the time we did the OS courses, the only things we had to learn were the more advanced architecture and the system calls.
As a working professional, the IDE is a huge time saver. From quick refactoring to debugging, stepping through and into code, watching memory, renaming, jumping to definitions, exploring methods and variables, looking up function definitions, writing common patterns, and running unit tests, everything sits one keyboard shortcut away. And the IDE is smart enough to correct my mistakes as I am typing, so by the time I hit build, the software builds without errors.
Without an IDE, my development would simply take longer.
And why shouldn't someone who is aspiring to become a professional use the tools a professional uses?
If anything, the IDE showing his mistakes might make it a more pleasurable experience, since he doesn't have to search Google or Stack Overflow as often. And by having a good experience, he will be encouraged to learn.
Nobody ever had to teach me these things; I taught them to myself because I have a little thing called motivation. Do we really want all of these students who couldn't even be bothered to learn about something as simple and ubiquitous as version control wasting everybody's time?
This sort of ignorance isn't even caused simply by a lack of interest in open source. Without the very basic skills a course like this covers, you're pretty unhireable.