> This brings me to another point. Programmers are necessarily autodidacts because shit changes incredibly fast. You can't be a one-trick pony. So if we're pushing kids through these incredibly formulaic methods of learning, what are we really preparing them for? A life of being an enterprise code monkey? The student who taught themselves anything, even if they are missing a few core details, is better equipped to fix those gaps in their understanding.
depends where you go and what their objective is. some schools focus on teaching real-world tools. which is great for getting a job, but less useful because there are completely new tools every few years.
other places focus on the theory of it all so when you see "new" things, you already have the knowledge to understand the basics.
chances are your professor is cherry-picking the parts to implement because while you could fully implement all parts, the learning probably bleeds off quickly. Also, based on your calculations, assuming you take a full course-load (5 courses), 3 weeks * 5 = 15 weeks = ~3.5 months, which is roughly a semester.
sure, you could get an education without a school. nothing stopping you. but, ultimately, people put a value on the piece of paper, the connections can be valuable, and school does a good job of educating/exposing you to a lot of things in a fair amount of rigor in a decent amount of time.
> the thing I keep coming back to is something along the lines of guided self-learning.
As a university lecturer in CS, I agree this would be a great thing for a lot of students. The thing about CS though is a lot of students hear that programming is a path to a high paying job, and they are really not motivated to learn the subject in the way that people who frequent HN might be. They don't want to self-learn, they want to be told what they need to learn to achieve an end-goal of earning a high salary when they graduate in 4 years. If you sit them down and ask them about their curiosity, interests, or ambitions related to CS, they give you a blank stare. They just want to get paid.
This perception also means that our program is the biggest at my institution. Students from every college want to take our classes. Our department is not so big (in terms of faculty, fewer than 20), so our class sizes are huge. My PL class last year was 200 students. My systems course last semester was 160. What this means is that I can't offer the kind of guidance that self-learning requires. Maybe if my class sizes were 30-40 students, but not for classes of 100+ students.
And then there's the issue of what students imagine a self-guided education looks like. They want to do things like mobile app development and AI. Most students don't self-guide themselves into fields like compilers and operating systems in my experience. They just aren't interested. Hell, I wasn't interested in these topics, until I was forced to take these classes as part of the standard curriculum. Now compilers are pretty much the only thing I'm interested in! I guess that's where the "guided" part comes in, but the point is that if students are left to their own devices, I worry we'll end up with a generation of programmers who are experts at making predictive AI models and iPhone apps, but have no idea how an OS or compiler works. Then who is going to teach the next generation how to make an OS? Already we have problems hiring people in these fields. 90% of the tenure track applications from our last round of hiring were from AI/ML type researchers, with only a few systems people (2-3 if we're lucky). I even have trouble getting TAs for my PL class, because all of our available grad students only know Python and C++ for their ML research. I see this only getting worse in the future.
>> If you already have prior programming knowledge and know at least one programming language, joining this camp is silly and it doesn't prove anything one way or another
I have to disagree here. If you taught yourself enough for what you needed at the time and never knew any other programmers, then you face a huge impediment to learning. Every time you hit a wall where you know how to describe the problem to a human but not to Google or Stack Overflow, you can spend hours (notice I didn't say waste) hunting down the proper way to describe the problem. While this can be extremely educational in the broad sense, it can be slow in terms of actual progress toward being a better programmer, since the overhead associated with troubleshooting/debugging is multiplied when it is most costly.
Also, if you're spending 12+ hours a day writing code on average (in my program we did) for 12 weeks, then you're putting in a hair over a thousand hours of coding time. Assuming a 12-week semester, 2 semesters a year, and 4 years, spreading that same total over a degree comes out to about 10 hours per week of actual coding. That clearly isn't the schedule of the best student in a 4-year program, but it's not unrealistic for an undergrad. It takes time to get situated with a given problem and into writing code, but if you're essentially on a 12-week coding binge where you take breaks only to sleep and eat, then that context-switching overhead is reduced. The productive hours of a 12-week bootcamp may seriously outpace those of the hypothetical student who spends that raw time over 4 years.
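A quick back-of-the-envelope sketch of that comparison, using the same assumed figures (12 hours a day for a 12-week bootcamp vs. 12-week semesters, 2 per year, over 4 years):

```python
# Bootcamp total: 12 hours/day, 7 days/week, 12 weeks
bootcamp_hours = 12 * 7 * 12
print(bootcamp_hours)  # 1008 -- "a hair over a thousand hours"

# Spread the same total over a 4-year degree:
# 12-week semesters, 2 semesters/year, 4 years = 96 weeks of classes
degree_weeks = 12 * 2 * 4
print(bootcamp_hours / degree_weeks)  # 10.5 hours/week
```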
> Not everyone can learn coding without externally imposed structure.
That's actually not what a CS degree is primarily about. You learn to code in maybe the first 2 or 3 classes. After that, it's assumed that you can translate ideas into code and you start learning about different areas of computer science.
If you just need to learn to code for a job, there are bootcamps that can teach you that in a fraction of the time and cost.
> But those who can't probably have an ongoing problem in keeping up with the state of the art.
Personally, I wasn't able to learn to code in any meaningful way before college. I had tried to learn from books and online sources but never got beyond basic scripting. After working my way through a bachelor's degree and PhD, I don't have much trouble keeping up with the state of the art now.
> Anything you learn in school, you can in principle learn on your own, but it can take significantly more time and there's no guarantee of consistency (compared to someone learning with a teacher).
So, honest question, did you guys really learn anything directly from a teacher? I've gone through high school (obviously) and university but everything I've learned has been at home by reading about it (or simply practicing to become fast enough). The lectures tended to only touch the absolute basic concepts, and the actual learning you had to do at home. Maybe it had to do with the lectures being in giant halls with little to no interaction with the prof in most cases, and that probably changes if you're doing a PhD (or just go to a different university), but even during high school I pretty much never learned anything of note directly from a teacher. So if anything, it would have been significantly faster and more efficient for me to just get a list of topics to learn instead of sitting in class.
As for consistency, my friends from a different university learned in some cases completely different things. If we compare strictly what was discussed in class or required to pass tests, there would be surprisingly little overlap. (And neither overlapped very much with actual programming.) Even courses with essentially the same topic would often differ greatly in content, as the professor usually decided which particular things to focus on. Which is completely fine, I mean, you can't go in-depth into everything, but this notion of consistency is kind of funny to me when the same degree from different universities (sometimes even the same university just a few years apart with different professors) can mean completely different skill sets. And then of course if you have a CS degree and want to work as a developer, from my experience you have to learn the actual programming pretty much 95% on your own anyway, as most courses focus on purely theoretical topics, and programming simply requires a lot of practice.
> -Did you learn to program in school or teach yourself?
Well, I guess a little bit of both.
I wrote my first program in the early 80s. It was a game I copied directly out of a book. It was my first exposure to programming.
I also took a programming course in High School in the early 90s. It was crap, and thanks to my 'self-taughtness' I was too advanced for that class.
Then I went to college and got a Computer Science degree in the mid 90s. This degree taught me a lot about programming that would have been harder to pick up on my own. As I understand it, college curricula in programming are a crap shoot. Some are good; some are bad. Some are focused on teaching specific languages and some are focused on programming concepts.
I would classify my education as winning the lottery, because I was taught a lot of the concepts behind programming theory and how to apply them. This has put me in a very good position in my professional career, in that I am often able to pick up new languages / technologies / approaches easily.
I'm not saying that an education is necessary to be able to pick up new things quickly; I'm only saying it helped me.
I feel I meet a lot of programmers who know a language [or framework] while missing some of the other underlying concepts and they struggle when it comes time to learn something new.
> -Did you do unpaid work to establish yourself?
Nope! My first client was in college. I was writing some data processing code for some type of research they were doing. I think I got paid $20 per program, each of which took me a few hours to write.
My first programming job was a co-op at a business to business advertising firm. A co-op is like an internship; but mine was a paid internship. I made a lot more on the internship than I did at my job at Waldenbooks.
However, I'll add that a lot of what I've done is writing: blog writing, book writing, and article writing. The blog writing is unpaid. I'm cautious to call the book writing or article writing unpaid, although it paid very, very low. These activities have helped me convince clients I know my stuff and have helped me keep an "independent" career as a small business owner with consistent work for many years.
> -Roughly how long did it take you from day 1 of learning to day 1 of being paid?
Where do you count day 1 of learning? If it was when I was copying stuff out of a book in the early 80s, then probably around 15 years. If you count when I first started college, then probably about three years.
> -What was your first gig?
My first 'real' job was as the 'tech guy' at a business to business consulting firm. They did a lot of marketing. I did a lot of Lotus Notes work, some work with Perl, web development stuff (JavaScript/HTML), some iCat (a now-defunct e-commerce technology), and some ColdFusion. This was the same company I co-oped with. They gave me a full time offer before I graduated [and I started the week after I graduated].
A few years later; I left there and 'accidentally' started my own consulting company which I still do today.
> they compensate by being able to learn quickly and efficiently for the job at hand.
We ALL need to be able to learn quickly or we simply don't last. If you've been doing this for a decade or more, you're guaranteed to have that skill in abundance, regardless of your origins.
What the OP was getting at is that you get exposed to a host of different ideas and paradigms in a proper CS program. Assembly, lisp/ada/scheme/etc., C/C++/Java, graphics, computer learning, computer architecture, etc.
I learned how to code when I was ten, but I would have never exposed myself to any of those things if I didn't focus on CS in college. What does that add? Exactly what the OP said. I know what I don't know. And it's a lot.
>This doesn't need to last for long, perhaps just the first few weeks of instruction, but it should be present.
This assumes you're learning in an academic environment. I taught myself programming just because I wanted to make some video games. I've done very, very well career-wise. I will admit I do lack some fundamentals, but I can still get things done.
I'm absolutely unashamed to rely upon modern conveniences. Yes, I need the strong type system C# provides for any bigger project. I practically need autocomplete, particularly with C#, to get anything done.
If it's small I can hack it out in JavaScript. I still remember the three lines of code you need to stand up a server with NodeJS.
One of my friends needed a small app done last year, and I was able to build it for him in Flutter in about a month. In fact, modern programming languages are so much easier that I reckon this friend, if he put the time into it, could have built his own application in a few months. The old guard of Computer Science tends to have a very strange gatekeeping streak.
Overall, I'd like programming to be less of a foreign thing to the general public. If you learn a little bit of Python to reformat some old phone contacts you have, you're very much a programmer. Just as not every person with a driver's license can drive a big rig, not every programmer will be able to build their own operating system from scratch.
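The phone-contacts task is about the size of script meant here. A minimal sketch, assuming US-style numbers (the input formats are made up for illustration):

```python
import re

def normalize(number):
    """Strip punctuation from a US-style phone number and re-dash it."""
    digits = re.sub(r"\D", "", number)  # keep only the digits
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"

# e.g. entries copied out of an old phone's contacts export
old_contacts = ["(555) 123-4567", "555.987.6543"]
print([normalize(n) for n in old_contacts])  # ['555-123-4567', '555-987-6543']
```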
> Do you believe that you need the environment, structure, and pace provided by college/university to actually learn CS, and that you're incapable of doing so on your own?
For me, it was not a case of not being able to do it on my own, it was not knowing what I should know.
You could probably spend years becoming an expert on C# or Java or C++ or Go or whatever just by coding in it at your own pace (or at a job) and slowly learning by osmosis/experience/mistakes along the way etc. That's fine. But would you learn the useful theory along the way as well? And if you did, would you bother if someone hadn't created a nice structured syllabus for you? I know that I almost certainly would not had I just stuck to churning out fairly clunky (as I know it was now) code without the formal education in it.
As you said, I am sure some people don't need this though, and somehow just have limitless time, or already somehow know exactly what they should learn next, when, and in what order, and never need to ask any experienced people any questions to clear up misunderstandings or have their knowledge checked. Lucky them.
> Honestly, I think programming is at the point where apprenticeships are a good idea.
I strongly agree with that. There's still a lot of important "tactile" experience that is easier to transfer via mentoring than to learn from written material.
> Thirty years ago the tools were changing too fast
My impression is that the pace of change of tools is only increasing (to pretty ridiculous levels if you look at the web ecosystem).
> and eventually we can codify the correct body of knowledge to have useful formal education.
This should be the goal, yes. But I don't feel like we have identified much of the knowledge that's worth codifying. Instead, the industry seems to be running in circles, each iteration less efficient and more bloated than the previous one. I wonder what the way out for us is?
> It's also important to remember highschool teachers aren't college professors. You may have had awesome and passionate HS teachers, but from what I saw, there don't seem to be very many.
I had some -- none in my programming classes, though -- but that wasn't all that different from my college experience. If anything, the high school teachers seemed, on average, better teachers, though perhaps in a worse teaching environment.
The college professors obviously had more formal education in their subject area.
> I think the quickest way to destroy a future programmer is by forcing them to learn dumb ideas from half-interested teachers during their formative years.
I think the quickest way to destroy a potential future programmer is to have them learn nothing relevant from anyone during their formative years.
> Many of the best programmers I know never even went to college...they are just interested in the subject and taught themselves.
You must not know that many great programmers. I work in Silicon Valley and most of the great programmers I know absolutely crushed college. That doesn't necessarily mean they went to top schools (which it turns out, is not a great predictor of programming skills), but they at least went to college, and most of them majored in a STEM field and performed well academically also.
I have met one or two who didn't go to college and were great programmers also, but they're by far the exception and not the norm.
> However, I didn't know how to write a for loop. If you had asked me what "for...in" was, I wouldn't have had a clue. And forget about asking me what setTimeout has to do with the call stack.
I'm self taught, I started with the C64 in the early 80s. By the time I was 15 I was working in assembly, had an excellent knowledge of what we would now call embedded software development and a working knowledge of algorithms.
I'm glad that you took the step to get a proper education. With the modern internet it's far easier, and the quality is incomparably better than my experience of the UK educational system. Between Coursera, MIT, and the vast number of excellent books, it's a dream.
It's an eye-opener to me that someone can be self-taught and not understand something as basic as loops. It has been my experience hiring developers that the self-taught type are considerably stronger than those who are purely college educated. I've also noticed that the self-taught type who consider themselves software engineers all went through a two year period of intense studying of computer science and software engineering in their own time. The combination of a real working understanding of software creation with passion is an extremely strong combination.
What I've noticed is that your type don't get scared of technical changes because you're basically able to learn anything. The "college career" type tend to scare easily and also jump into management the second they get a sniff of it.
All of this said, if I was 18 now I would strongly advise myself to do 3/4 STEM A-levels (Math, Physics and something else) and then go study Computer Science at a university which has a good program. By good program - one where I'd write a compiler from scratch, learn the theory (finite automata etc), machine learning, linear algebra and hopefully something fun like building a 3d engine or game engine or something. You can do it all yourself but a three-year program certainly makes life a lot easier, plus you get to make friends and network. AND you have a nice piece of paper.
> Seeing how little pre-college programming experience matters vs. coming in with no programming knowledge is an interesting lesson of doing college.
Wait, what do you mean by this? I started programming in 6th grade (after begging my dad for a month to spend $50 on a Java textbook), and so the CS portions of college were incredibly easy. When I didn't understand algorithms I just implemented them and learned by doing (and I instantly understood how they could be applied to past projects). I would show up to classes only on the midterm and final days because I had such a head start. I mostly focused on the humanities because there's nothing the computer science department could teach me that I couldn't teach myself online.
I was lucky enough to do my undergraduate at a university where this was sort-of true, we didn't do any math apart from some very basic things that I was taught early in high school, and most of the classes were hands-on programming. Most of the assessment was coursework-only as well, only about 20% of the modules had a final exam.
I barely had to do anything as I taught myself most of what they were teaching long before university, but for some of my peers that were only starting to code it was amazing. From my limited experience the quality of developers (at least the ones who actually put in some work instead of just trying to pass) coming from my university was miles ahead of the ones that studied at a 'regular' university where some of my friends went. They taught them theory, math, or even physics, but somehow forgot to teach them how to code.
> I wish there was during my teens (in the nineties) an option at school to learn programming. That was in France and I don't think it has changed since then.
That's funny, because I learned programming basics (and was self-taught from there) at school at a time when there was no formal curriculum and our teacher was a CS student who just figured he'd do a few lessons on that because what the heck.
I'm pretty sure that nowadays IT classes in my country are standardized enough that this wouldn't fly. Also, today nobody would dare hire a CS student with no formal teaching qualifications instead of a professional teacher whose IT competence doesn't extend beyond MS Office. Meh.
> So does this mean that programming education is broken? That companies should invest more in training? That bootcamps should revamp what they teach? That there should be industry standards for what programmers at different levels should be expected to know?
Programming education is broken. I did one year of computer science at one of the top universities in the world (switched into mathematics after that), and I'd see that course as useless if not actively negative. The only way of learning that I've seen really work for anyone (myself included) is more like a craft apprenticeship, working closely with someone more experienced. We shouldn't be surprised that that produces widely different approaches.
Frankly the field isn't mature enough to have standards. If you tried to set a professional exam based on today's best practices, in five or ten years the answers would mostly be wrong. We still don't know the right way to write software. Million-dollar systems still come with laughably simple bugs.
What does the interview process look like for a craftsperson? That's probably the best we can expect from a field as unsystematic as ours. The one thing that strikes me is that in creative fields it's normal for people to show a portfolio of past (client) projects, whereas in software past employers usually keep all copies of your code. I have no idea how we'd go about changing that norm though.
> Just because programming is simple enough that you can learn how to do enough to get a well-paying job before hitting 16 doesn't mean that people who go through a college degree aren't also doing a ton of self-learning alongside lectures.
If you read my statements carefully, I said exactly the same thing.
>Like university is the only place you can go to learn things.
I get that the self-reliance learning angle is way overstated, but that's going a bit too far.
I learned programming mostly on my own, albeit while working in QA and having exposure to code and programmers. But it's not like I got mentored or needed someone to hold my hand learning this stuff. If you can read and have the initiative to play around with code and figure it out, you'll do a lot better than half the learned fools coming out of university. I consistently outperform my peers with CS degrees when it comes to tracking down a hard bug. I swear some of these people hit the first wall and just have no idea what to do.
> the idea of devoting so much of my free time to studying boggles my mind. Programming 9-5 everyday makes me much less likely to want to do any coding when I get home.
I can sympathize, but I suggest long term you figure out a way to get over this sentiment. If you plan to work in the industry for any length of time you absolutely must study to stay employable.
With few exceptions, almost every segment of the industry changes how it does things about every six years. What you do now will be pretty out of date by mid 2020. I've been working as a programmer since the late 80s. If I step back in six-year increments and look at the technologies I was using, each snapshot is a completely different stack.
At every stage of my career, people I've known said there was no need to go learn new stuff; what they were doing would always be in demand. I'm talking about people doing things like ANSI 77 COBOL, RPG/3, and System/36 assembler. They were probably right; I'm guessing there are some orgs out there still using that stuff. But the options get smaller every year.
If you want to have your pick of the best opportunities available and be in control of your own destiny, and want to work in the industry more than 5 years, you're going to have to train yourself.
It'll help if you figure out a way to enjoy the experience and have fun with it.