The article's main point is that learning C first makes it easier to learn the language a programmer will actually use, be it Javascript or something else.
In my experience, the best course of action is to learn whatever you need first.
For those considering programming as a hobby or occupation, being told to learn a relatively complex and impractical language would raise the bar to entry.
C, with its unsafe pointers, unsafe memory management, and unsafe types, is the worst language to learn at the beginning. It's good to know and learn as a second language, but at the start you need to learn proper concepts, not broken ones.
Lisp is still the best language to learn, but any proper scripting language will do also.
PS: "proper scripting language" of course excludes PHP and JavaScript. Dart is fine.
It's hard to create complex programs in C, but for learning simple things, pointers are much simpler to understand than, say, Rust's abstractions; eventually a programmer will have to understand what happens under the abstractions anyway.
I understand the point that C is not the best choice these days for big projects, but for learning it seems a good idea, even just to learn what segfaults and buffer overflows are.
C for utter beginners is probably like being tossed into a tsunami for swimming lessons.
"Okay, now this is how you float on your back; but watch your head: there's a protrusion of iron rebar sticking out of a broken concrete bridge support, and you're heading straight for it! We call that 'uncomfortable behavior'."
C is also a terrible learning language, much harder to reach the "get things done" stage than most other languages.
You can spend weeks segfaulting in C before you get things actually working, and for a beginner, especially one self-teaching or unsupervised, it's extremely easy to come to rely on compiler-specific behaviour in the face of UB.
When learning, if the only choice were C or assembly I'd actually recommend assembly.
> You can spend weeks segfaulting in C before you get things actually working
This seems like an overstatement. The first programming class I ever took was "Introduction to Programming with C" where the textbook was K&R and we were writing and compiling programs from day one.
If you can't figure out how to manage memory (a one-dimensional data structure) in C, jumping into a language that immediately lets you abstractly manipulate complex multi-dimensional data structures will certainly not make your life easier.
Depends on the type of thinker someone is. Everyone thinks best at a particular level. While a certain data structure might be complex, its use might be simple.
If I show someone a list of lists in C# the semantics of that will be very easy for them to grasp. The same data structure in C might defeat them altogether.
If you don't get a sense of achievement and progress when you're learning you'll quickly get frustrated. Starting with a "simple" language like Scratch might teach you things more slowly but you're less likely to quit so it's a much better choice.
I learnt the most about computers programming in C and assembly. It's very interesting and helps me every day, not only in programming but with any computer-related issue or question. It was a tip someone gave me a long time ago, and I am still grateful they did. It has also taught me a lot about debugging and memory inspection, thanks to the issues you will surely encounter :D (fun fun!)
The fact that learning C is good isn't the same as "C is good" or "C is better". I do realise it has its applications, and that certain programming patterns are just much easier and safer to implement in languages better optimized for that pattern.
That being said, you can implement any pattern in C, and C will force you to understand these patterns thoroughly. (Your code literally won't run if you don't, and if it does, it's probably not doing what you expect!)
C is definitely a language that also teaches a lot (but not everything) about how computers work. I wonder whether languages like Pascal or Modula-2/Modula-3 would be equally suited, but with fewer quirks and pitfalls.
Aren't the quirks and pitfalls what you learn from? :D As soon as you learn these things exist, you have learnt a lot about the tediousness of programming and computers =D
I've mostly learned programming in Turbo Pascal (I programmed a little on the C64 and in Turbo Basic before, but I hadn't really understood programming at that point yet).
I think Pascal is a good compromise between Python and C - it's compiled, low-level, and statically typed, but has a reasonable string type and operations included out of the box, and far fewer quirks than C.
I agree with meatbundragon's comment "learn whatever you need first", but for a computer science degree program, I've held the opinion for a decade that students should learn Python first to get a feel of how humans should think and then C in their second year to learn how computers think.
Assembly is too niche for all students to learn it, and Java/C# are each too isolated from other environments, whereas C is behind most everything we use and teaches you about linking/compiling more than any other environment. After students are comfortable with Python and C, they or their program can choose whatever they find fit for their goals.
Assembly is not that difficult if you pick a suitable target and environment - possibly a virtualised one for early learning. Make sure you can single-step cleanly. And this is really how computers "think", while C is full of traps that fall between the language abstraction and the compiled realisation.
(The book "C traps and pitfalls" remains one of the most instructive programming books I've ever read, and more languages could do with one)
C is a good enough approximation to the level students need. Many will never use or even think about assembly again, especially on the architecture they happen to be taught, but most will use some derivative of C syntax and semantics. For the 25-50% that will, sure, that'll be a good third language, but remember that most will develop web frontends or industrial logic for the rest of their life.
> Many will never use or even think about assembly again, especially on the architecture they happen to be taught, but most will use some derivative of C syntax and semantics.
This seems true in my case. At my community college, I took a lower division Computer Architecture course, also known as Computer Organization and Assembly Language, a year or two ago and unfortunately recall little from it. Yet I still fall back on the practices I learned from taking a course in C.
I use and enjoy dozens of scripting languages including Smalltalk and Python, but colleges need objective reasons to teach a certain language, and the popularity and applications of Python can't be beat.
Is that really true though? Looking at performance optimizations that people do, understanding C is a minimal part of understanding how computers work. It's sort of a codified fiction masking a much more complex reality.
C ignores the complexities of the cache hierarchy and register allocation and IPC extraction. If I code in a garbage collected language I can ignore one more complexity. But C is closer to Java than to the ALU.
Maybe learning LLVM IR would be more useful than either, as it would allow me to debug and reason about the performance impact of the abstractions of the many languages backed by LLVM.
For the 25-50% that need it, sure, those would make good third or fourth languages. My point is that Python and C should be the first two, not the only two, languages they learn in college. My reason for recommending C is to give an overview of what "writing for the computer" feels like with complex data structures, not memorizing the Intel x86_64 manual. Would you expect 100% of CS students to be prepared to learn cache hierarchy and register allocation in their second year? I know a few CS graduates who somehow didn't manage to learn a single programming language (to fizz-buzz level), even in state universities, not to mention community colleges. I think we should step back and look at what all students need, not only the advanced ones.
I'm sorry, but if someone graduated with a CS degree and can't write an if-else, the teaching was not the problem. I knew a few people like this, but they had to cheat their way through and do zero work on team assignments.
Students need curiosity and motivation to get to the fizz-buzz level, not a specific intro language.
And then there are those that legitimately try really hard but still have difficulty with control flow logic and algorithms in their second year, so this is what they should be taught. You can't just ignore them. Skipping ahead to cache access, instructions, and registers in just their second year leaves them behind. There's tons to learn about programming in two years for even the best students that doesn't require intricate knowledge of the machine. Assembly is just not realistic for the majority of sophomores. Additionally, it would be a disservice to teach languages that don't align with industry attention. C is used perhaps 4 orders of magnitude more than LLVM IR.
I never argued for learning LLVM; the comment I replied to mentioned CS grads who could not write fizz buzz in ANY language. I would never suggest going 0 to 100 like that.
I think a good exercise is to build a CPU/Micro simulator in a more modern language. Followed by writing a compiler (for an uncomplicated language) to target that CPU. Should teach you most of what you need to know.
I remain reasonably agnostic to what language people should learn first, whether it be C / Java / C# / Python / <some other language>.
I definitely agree. I took C first, then Python. Where I took the courses, the C course was an intermediate programming course while the Python course was an introduction to programming course (I think this is the case at many colleges, though I'm not entirely certain). Python personally seems like such a good introductory language because the syntax is so English-like. Otherwise, at my community college, after one takes the C course they usually go on to take a C++ course and then a Data Structures course, where the majority, if not all, of the students use C++. All of these are lower division courses.
I find it's a good skillset to have: Python for scripting, backend, and frontend with Django. C/C++ for other purposes I'm not entirely aware of, maybe if you like working with video games, browsers, operating systems, etc. Then a proficient understanding of HTML/CSS doesn't hurt, and anyways web development is pretty darn fun since designing and playing with things is cool. Those are my tools, or at least I like to think those are my tools: Python, C/C++, HTML/CSS.
Given what I mentioned above and the courses I have taken, I still don't feel like a programmer though. I mean you could give me a basic problem and I can probably write a simple, elegant solution in five hours or so but I'm going to have to look at the documentation or Google how to do some specific task. It might be that I've only taken lower division courses or only built stuff following tutorials online (Treehouse, etc), but I still don't feel competent and because of this, I've been thinking about finishing my degree in computer science but moving into product management as a professional job. I'm definitely going to try to program daily though.
> [...] I still don't feel like a programmer though. I mean you could give me a basic problem and I can probably write a simple, elegant solution in five hours or so but I'm going to have to look at the documentation or Google how to do some specific task. It might be that I've only taken lower division courses or only built stuff following tutorials online (Treehouse, etc), but I still don't feel competent [...]
What's wrong with that? If you are able to develop working stuff by yourself, with the help of documentation and the web, it's perfectly alright - those are the tools you will usually have in the real world anyway. You can't keep all the details of various topics and different library APIs in your head.
You're right. It's just, from the perspective of an outsider who has only worked blue-collar jobs (think meat clerk, greenskeeper), the technology industry's interviews, and working in technology as a programmer, seem a bit daunting, especially as a minority. I'm still an undergraduate (spending a fourth year taking interesting lower division courses at my community college), but it might be a combination of imposter syndrome on my part and not feeling I know enough. I mean, again for instance, I can tell you off the top of my head how six sorting algorithms work and their respective big-O complexities. But if I had to write them? I would probably need the entire day and no one looking over my shoulder. I can write a fizz buzz program in ten minutes or so, but I can't tell you off the top of my head how pointers work or how polymorphism works; I would have to look it up.
I just feel that if I ever get a software engineering internship one of these summers, I will ask myself: what am I doing here? After all, I take a long time to write a good piece of code. I'll think I'm holding my team back. If I'm building stuff on my own, in my own time, I'm fine and dandy. But otherwise, I'll be stressed and nervous.
Everyone starts somewhere. Take into account that you are just studying it from time to time. At work you will be actively doing things 5-8 hours a day, almost every day. Take your time; people generally expect even experienced programmers to take a month to get up to speed at a new job. After a month or two of such intensive training, you will have learned a lot too. Make sure your learning is directed, i.e. don't just solve problems, but figure out how and why they occurred, what approaches people took before you, etc. In general, read a lot of relevant literature, especially articles and books. And after a year or two you will start hitting the point of diminishing returns. There is really only so much that people need to know to do day-to-day work.
Then there is the elusive "experience", but that will come with time as you observe first-hand how silver bullets turn into legacy ;).
I'd say learn C if you need to think on that abstraction level. For a lot of things, you don't need to think of a computer as much more than a bunch of names that are either data or functions. Some GC will clean up after you.
If you're working on a problem where it actually matters where the data is and when it's deleted, start looking at C.
For me this is a pretty sharp distinction. If you're not used to messing about with pointers, you'll need a bit of time to learn this.
C has a relatively simple (and lacking) syntax, but a language is much more than its syntax. Compare it to Python, Go, or Rust and you'll see the following:
1. Infrastructure - C has no package repository and no standard build system. When you receive a C program, you need to know a dozen build systems (autoconf + automake? scons? qmake?) and ways of getting packages (git? distro packages? some obscure website?) that sometimes get in each other's way. As a side effect, the build process often isn't properly documented. Probably the best we have are CI files explaining the environment, and maybe Dockerfiles if you're lucky.
2. Standardization - yeah, there are standards, but in practice they're ignored. There are lots of comments and stories on HN about how the C committee specified things that are unrealistic or outright impossible. There are lots of important subjects that aren't standardized at all.
3. Intuitiveness - one trivial example is undefined behavior. Due to the existence of legacy code, pretty much all compilers default to letting through expressions that end up invoking UB. Ultimately you can't really reason about your code without also knowing how it was compiled.
4. Safety - it's the selling point of Rust, and hate it as much as you want, but C makes it way too easy to shoot yourself in the foot for no apparent reason. Do you really expect your student to learn all the secrets of OS/C memory allocation, plus spend their first half-dozen hours of debugging because they misused a pointer? In university, it's sometimes easier to rewrite the program than fix the bug if you're only getting started with pointers. Backtraces are useless if your stack is corrupt.
There are probably more reasons why it's a terrible idea, but those are just a few off the top of my head. Consider some of those posters as examples of C design failures.
> yeah, there are standards, but in practice they're ignored. There are lots of comments and stories on HN about how the C committee specified things that are unrealistic or outright impossible.
Any examples? It sounds ridiculous; perhaps you were thinking of C++? But then again, I can't think of anything but "export templates", which were finally dropped from the standard.
> Should every programmer learn C as their first programming language?
> ...
> It depend on what are your position in Web development, are you a front-end web developer? Or you are a back-end?
If this is your first programming language, why are you worried about being a front-end or back-end developer at all?
If you're learning programming for the sake of learning programming, we can have a conversation about what your first language should be, and we'll worry about precise career choices later. If you're learning to fill a position, it's not your choice to make.
I'm self-teaching in my late 20s after dropping out of a philosophy degree several times in a row. I hope you'll permit me to rant a bit, it's related to your comment and the article: the sheer amount of jank is absolutely unbelievable. The vast majority of it is written with selfish careerism in mind, not generous pedagogy. Much of it is vapourware advertisement for a paid course. Plenty of it ends up on the front page here nowadays which is a bit of a shame. Related: https://meaningness.com/geeks-mops-sociopaths
I've wasted about 6 months on distractions from following fashions and persuasive writing/videos, and now know a little bit of a bunch of popular languages and some CS and IT and already feel a bit burned out (as I'm stretched in so many directions - sorry for the mixed metaphor!).
In the past couple of weeks I've nuked my windows installation and have an Arch+i3 setup where I try to use CLI wherever I can. I've limited myself to following CS61A (the famous SICP course), HtDP, and I leaf through a book I have on Rust, typing out examples and steeping my brain in healthy confusion. The difference in my rate of learning is astounding: not because I'm suddenly better at research, but because I'm better at filtering. What an age to live in!
As soon as the word "should" is used, you know the answer to the question. There is no one true path.
Is C worth learning? Sure. Is it worth learning as a first programming language? Maybe... what do you want to do?
I've done 30+ years of C programming (interleaved with other languages). I've used it mainly for embedded systems, and it works fine. It's easy to go wrong with it, and you need to employ a lot of defensive techniques, but with experience it's good. However, it's not a very good language to express abstractions in; you have to be very deliberate when trying to build a modular system. I virtually never use it outside of embedded systems, as I think other languages work out better. But that's my opinion, based on the tradeoffs I want to make. Other people make different choices.
The only advice I'd give is try different languages, spend the time getting good at a few, and be very good/productive at at least one.
Yes, I think so. As a long-term hobbyist who has been programming since the 80s, I happened to stumble on better languages very early and for this reason never really learned the intricacies of C.
It turns out, in retrospect, that my own snobbishness in preferring Pascal, Modula, Ada, Realbasic, CommonLisp, Racket, Xlisp, and whatnot to C probably got more in my way in the long run than it helped. Even though I understand low-level details like memory management and C structures, I'm still not proficient at reading arbitrary C code, e.g. with a lot of bit-twiddling in it, or at header files, understanding the preprocessor, and so on. However, no matter which language you use daily, you really need these skills for writing glue code to C libraries, glue code to system APIs, and so forth.
There is just no way around C. So yes, learn C first or at least early, and then whatever high level language you need.
When I started programming, in the 80s, it was first Basic, then assembler. Not because we wanted to, but because we had no choice. My first computer was a ZX81 and, believe me, without asm everything was toooooo slow.
So when it came to C, it was an ‘easy’ step.
But now, you have the choice not to program on low level.
I asked my business associate not so long ago what he thought a good starting language would be for giving programming lessons to my 8-year-old kids.
He also told me "C".
And that brings us to the state of development nowadays.
With all the high-level languages, there are a lot of people starting to program stuff, and often successfully.
But does assembling libraries/bricks make you a good developer? Does not understanding how stuff works underneath make you a bad developer?
I seriously have no idea. Did the fact that we, the "old" folks, had to work at a low level help us understand better?
My first language was Basic on a Commodore 64 at 6 years old. Second was Visual Basic, with some HTML done in FrontPage. After that came high school, where we had assembler, C, C++, Prolog, Java, Pascal, and Delphi. At college there was more of the same, plus JavaScript, a bit of Lisp, and some obscure, domain-specific ones like CLIPS. On my own, I worked with several others, for example Python, C#, and a bit of Elixir, Ruby, PHP, Kotlin, and Objective-C.
The point being that the first language is largely irrelevant: if you are really interested in the craft, you will eventually be exposed to many languages and paradigms and learn to choose what best suits the problem and where you want to build the most expertise. It's not like you shoot yourself in the foot if you learn C first instead of Python, or the other way around.
C was the first language I learnt when I was at university. Honestly the teacher was bad and I found it difficult. It almost put me off programming.
Fortunately, I later took a mandatory course in Java and Haskell. We learnt both languages in parallel to learn the difference between OO and FP. We were set tasks where we had to write programs in both languages. I loved this course and the lecturers were amazing.
I am against beginners learning C, as it's too low-level and you can get stuck easily. I would recommend Python, as it's widely used for web dev and data analytics. It's linear [edit: correction, as I had originally written "async"], unlike JS, which makes what you learn more transferable to other languages.
Maybe it's just that the Java+Haskell course simply had good teachers. Imagine you'd had them switched, with a bad teacher showing you how to program in Haskell and Java at the same time. To this day, your worst nightmares would be slow to load and expressed in terms of monads.
However, I still feel C may not be the best first language, as it's low-level. That means there's a lot going on and more to learn for the average user, so they end up getting stuck more often than if they used a language with memory management. This is based on the premise that the user is of average intelligence and you don't want to put them off programming.
I agree. The first few months of my CS degree were my first exposure to programming languages ever, and it was Pascal. Then they quickly moved to C, but at least that first impression was made with a leaner language that let us concentrate on what programming is, what the common imperative language constructs are, etc., without worrying much about the fine details of the machine. I don't think I'd have grasped all that new information too well if I'd also been distracted by all the subtle gotchas and details of the C language.
C has quirks that can make beginners conflate separate concepts into one: for example, pointers == arrays, expressions can be used as statements, and there's non-obvious automatic type conversion going on. It's not a big deal, but why complicate things more than necessary?
I think Pascal is still a better first language than C, even if it's less useful in real life.
I would say: every programmer should learn to talk to low level devices first.
For example, when you use an Arduino you program in a stripped-down C++ dialect, but you will learn a ton about how computers work because you learn about I/O. You will learn about bit masking, shifting, and what not.
I don't think programming is about a language. It's about how you think about computers.
Nah. I started with Flash and would never have had a programming career if I had to start with C, gcc, make and all that bullshit.
Learning C by itself is interesting because a professional programmer obviously needs to know how memory is managed and laid out at a lower level in a computer, and how higher-level programming languages themselves are implemented.
But you don't need to start with C to actually teach programming, what is a function, what is a loop or a variable, or to teach algorithms, data structure, or problem solving.
People often complain about "the lack of diversity" in tech, well I can tell you people who think C should be the first programming language of any CS student are a severe hindrance to bring that diversity to tech.
Teach Processing or Python; make students create interactive apps on mobile phones with visual or audio feedback, using a simple environment or language. They will be no less of a programmer if they succeed in creating a multiplayer mobile game in the space of a few months because you didn't burden them with malloc, pointers, null-terminated strings, or macros.
I have asked HN this very pertinent question, albeit in a different manner, after doing some homework myself[0].
This was me 8 months back-
Freshman Year | Autodidact | Non-CS Major | Knew no programming language.
In my experience, C is best learnt by the textbook approach, i.e. modules of theory + problem sets, where the chapter-end problem sets test accumulated knowledge of the previous chapters. This boosted my problem-solving skills, algorithmic thinking, and understanding of deep internals, how they work, and memory layout.
I recommend postponing the learn-by-building-fun-and-real-projects approach, as I hold it responsible for many people getting discouraged or, worse, turning away forever. That approach is for brash, confident, and experienced people (like messing with the Brainf*ck language, haha!), not innocent amateur kids like me who feel the immense pain and frustration of shooting themselves in the foot. I doubted my creativity in building projects more than my ability to think abstractly through the language (or... I am not able to express myself clearly on this, but then again, it's better to suppress depressing memories). There is frustration in both approaches, but the immensity of the latter is nowhere comparable to the former.
I cut through 80% of Cormen's problems with C like a hot knife through butter. I prefer C for the immense pleasure it gives, which is why I don't prefer Python for Cormen. I am now planning to build something in C with the computer-networking theory I am learning. I am still wary of touching Linux, though I can understand and take pleasure in reading its highly optimized code. Linus and his worldwide team are geniuses.
Coming to Python, I started learning it once I completed the linked lists and files chapters in Noel Kalicharan's book.
From there, I took advantage of MOOCs and started building projects (GUI apps on Linux) at blazing speed. Quickly I turned to web development. It wasn't even two weeks before I could handle everything swung at me.
Now the comment pertinent to the article,
I do recommend learning C first, even now that I have budding experience with Java, JavaScript, and Python. I don't like being a magician without knowing the hows and whys of my own tricks. This gave me the power to tweak, and god did I twerk every time I tweaked!
Had I started out with any other language, I would have been left hungry and dissatisfied: short on mental exertion, incomplete in my command of that language, and with no appreciation of the power of jumping miles (without knowing how to run). C hits a sweet spot that neither assembly nor Java can. I can see why C is still recommended and relevant today.
I am one of the few interns who is actually wanted at the company I am interning with, because of my in-depth understanding. Many fellow interns come to me whenever they are stuck in all those higher-level abstracted languages (thank you for letting me brag about this humble achievement).
My routine during these 8 months: 6 to 8 hours a day of programming with a lot of intermittent breaks. Lots of StackOverflow, rubber-duck talking, writing code and test cases out on paper, hair pulling, banging the keyboard, and coffee. My relationship with programming was the push-pull/love-hate kind, with a stronger desire to pull/love than push/hate.
Should everyone learn C? Yes. Should it be taught as the first language? Nope.
First of all, the quality of C learning material is abysmal. Most of the literature was written before the C99 era and encourages bad practices such as useless casts, single-letter variables, variable declarations separate from assignment, and so on. At least some of this bad literature will inevitably find its way into the hands of newcomers.
That said, too many books also teach C as if it were the reader's first language. Weren't we looking for a book for newcomers? Great! Now our glorious ANSI C book will be mostly about basic programming concepts and language constructs. The last chapter is most probably named "Advanced features", and this is where pointers are first introduced. In reality, C is all about pointers, and they should be taught as soon as the programmer knows how to write and call functions.
Learning C is also hard and requires shaolin monk-like self discipline. The Zen of C is satisfying but the path there is paved with segmentation faults, bus errors and endless hours of watching Valgrind logs. If anyone claims they've never had similar problems, they haven't really grasped the essentials of the language - see the previous paragraph.
> That said, too many books also teach C as if it was the reader's first language.
I feel that's quite a general problem with programming languages. Many languages could actually profit from books that assume the reader knows a couple of mainstream languages and explain the respective language's intricacies from that perspective -- in a much smaller format, too.
I'd buy a small book (say the size of K&R or "the AWK programming language") for many interesting languages in a heartbeat.
An introduction should be about making people feel that they are progressing, learning something useful, and enjoying what they are doing. C is not good for that, unless you're doing an introduction to robotics. Something like Python, JavaScript, or even PHP is better. Go straight into interesting, useful applications like scripting, games, and websites. Don't make people think they need to memorize how to implement stacks, queues, or heaps in C, and actually implement them each time, before they get to resize a photo collection from vacation. That can be done in a 5-line Python script and feels good.