
Sounds like you're an algorithms guy, and the other coders didn't have the background needed for understanding their use?

I'm self taught, and for many years didn't know about algorithms, big-O notation, and similar. You can do a lot of stuff without that knowledge, but there are definitely some areas that require it.




"Maybe it has to do with the problem domain, but I have a hard time accepting that."

Your profile says that you are a web application hacker. I think that explains why you may not feel that data structures or algorithms are important. I was the same way myself (started my programming life by building Perl cgi scripts, made a nice little cms, definitely didn't use any fancy stuff to do that).

You should take it from me though. You will suddenly develop a deep and profound appreciation for understanding algorithms when you have a problem that requires having to comb through over a hundred thousand records, perform some type of operation on them, and sort by the results of that operation - all within a fairly narrow time period.
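For illustration, that kind of task might be sketched like this in Python (the record fields and the scoring rule here are invented, not from any real system):

```python
# Score ~100k records and rank by the result: O(n) to compute the
# metric, O(n log n) to sort, comfortably inside a tight time budget.
records = [{"id": i, "views": (i * 37) % 1000, "age_days": i % 365}
           for i in range(100_000)]

def score(rec):
    # Some derived metric per record; O(1) per row.
    return rec["views"] / (1 + rec["age_days"])

# sorted() is Timsort: O(n log n) comparisons, and the key function
# is evaluated only once per record.
ranked = sorted(records, key=score, reverse=True)
top_ten = ranked[:10]
```

The naive alternative, say a hand-rolled quadratic sort that recomputes the metric on every comparison, is exactly where the "fairly narrow time period" bites.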

I think that it actually is a question of problem domains. If you never encounter this type of problem, then not knowing about things like big o notation and the various sorting algorithms will never impact you. However, when you do, that knowledge quickly pays for the time invested in learning it.


Are you sure? I've never worked with professional devs that aren't familiar with the usual algorithms and data structures.

Whenever I interview I'm always asked the usual leetcode bullshit. Are there programming jobs that don't ask/require this stuff?

I have met a lot of analysts and researchers that code Python for data science that know nothing of big O, but that's to be expected.


There are plenty of people who don't know their algorithms and data structures writing production code for large companies.

On that last part: it's a junior CS question that you may never even use in some CS jobs. People forget that not everyone is writing awesome ML things or other awesome things you read about in newspapers. Most people are writing CRUD code, doing some data transformation, automating some simple office process, etc. I never had much use for Big-O knowledge there...

That doesn't mean I never used this knowledge; I have needed it at some points. But by that time I had to refresh it again...


How can anyone applying for a programming job not know Big-O notation? That's CS101. I don't care if you're self-taught, it's audacious to even call yourself a programmer if you don't know the very basics of algorithms and data structures.

If I were interviewing a candidate, and it was revealed he didn’t know Big-O notation, my next questions would be about data structures, because now I’m suspecting he wouldn’t even know how to implement the simplest of structures. What next, simple pointer arithmetic, is he unable to even walk an array?

We are no longer talking about a programmer then.

Not knowing these things, I usually chalk it up to either laziness, or lack of interest in understanding how things work.

Either way, that’s not someone I’d want on my team. Programmers should be enthusiastic about fundamentals. Good programmers have a “hacker” mentality, a need to know and understand inner workings, a craving to dig deep. I’d say this mentality is what you want from anyone in STEM.


It's hard to see how you can be a great coder without understanding big-O. The better example to use is logic puzzles.

You also don't need to know much about many algorithms to write workaday programs. Most of the work is done by others. /head nod to Timsort.

You need to know how to avoid common pitfalls (i.e., yeah, maybe nested for-loops with millions of elements per level of iteration, maybe some alarm bells should go off)
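A sketch of that pitfall and its usual fix (the data here is arbitrary):

```python
big_a = list(range(50_000))
big_b = list(range(25_000, 75_000))

# O(n*m) pitfall: `x in big_b` is a linear scan of a list,
# performed once per element of big_a. Alarm bells.
# common = [x for x in big_a if x in big_b]

# O(n + m): build a set once; membership tests become O(1) on average.
b_set = set(big_b)
common = [x for x in big_a if x in b_set]
```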

Advanced algorithm knowledge is not needed for developing typical business software.

Bad security is a disaster for any public-facing software project. A product you didn't ship because you hired the wrong people can't be reverted, but neither can a data leak.

Or the company that goes bankrupt because you leaked highly sensitive data.

Etc. etc. ad nauseam.


I know almost no programmers who analyze algorithms in a formal manner.

Most coding doesn't require it - certainly not the kind of thing that most large companies want coders for.


This is probably an unpopular opinion here, but only a very small subset of developers need to design algorithms or even know any of them by heart.

Disregarding a few years that was mostly WordPress consulting, my experience is largely enterprise C#. Lots of line of business applications, glorified CRUD apps, and some client work. Zero need for any ability to write a BST or radix sort.

If you're working for SpaceX, or Twitter, or a Big 4, of course you should know those things. But most developers don't work for one of those companies. The vast majority of programming is done to further a business other than programming.


In my "real world" we normally don't care about things like big-O complexity. We worry about doing dumb things and not taking more time than we have available. I'm not saying big-O is useless or CS wizards are never helpful. It's just that you need one or two of them for a large team of normies, IME.

I have a problem with this notion that knowledge of algorithms is required to be a good engineer though. Case in point: Senior algorithm nerd on my project is going nuts over algorithmic complexity and delaying an early delivery to another team so they can begin using our new feature. In our case, n is fairly small no matter what, so 2n vs n^2 really doesn't matter to us. The code he's optimizing isn't even the long pole in the tent. It calls other routines which take 2/3 of the total time anyway. We could just deliver now and improve later when we have real world data on how people want to use our feature, but nope, we're wasting time on endless rewrites in search of perfection which may not matter if the other team can't deploy our stuff in time.


How can anyone applying for a programming job not know Big-O notation? That's CS101.

You pointed out exactly how someone would not know -- they may have never taken computer science classes and taught themselves programming.

Or they may have taken some CS classes, not enough to have covered that particular concept.

I'd guess that, especially for older programmers, anecdotally, it's not uncommon to not really understand or use it.

I don't care if you're self-taught, it's audacious to even call yourself a programmer if you don't know the very basics of algorithms and data structures.

Just because someone doesn't use the same terminology to describe a set of concepts doesn't mean they don't understand those concepts.

Also, I wasn't aware there were formal qualifications for someone adopting the title of "Programmer"; pretty sure big-O notation isn't part of the dictionary definition.


I agree. However I think it is not the distinction we’re talking about here, which is about not having a firm grasp of fundamental algorithms and data structures and big O scalability.

In school terms it would be more like a technician or vocational training around coding for particular types of projects.


Do you teach 'hardcore' CS? Like the master method for working out big-O? That's what I think he is trying to say people might be missing.

There are plenty of tutorials on the internet about data structures, relatively few about algorithm analysis.


Algorithms? Yes: basic algorithms and sorting and Big-O, none of which I've needed in 10 years of professional and hobby dev work. But to say "well, you learned data structures and you use hashmaps and arrays, so obviously it helped" is a very strange defense of academia. As a new dev I could also learn "datastructures" online in a few days of studying, so that's not really a good defense of 4 years of study.

The basics are essential.

If e.g. you don't know what a hashmap or a balanced search tree is, or that you can find common items between two sequences or find a shortest path in a graph with non-negative edges in O(n log n), or that you can't (as far as we know) solve SAT, max-clique or subset-sum in polynomial time, then you should not be programming software to be used by others.
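As a sketch of one of those basics, shortest paths over non-negative edges, here is a compact heap-based Dijkstra; the graph below is purely illustrative, and with a binary heap this runs in O((V + E) log V):

```python
import heapq

def dijkstra(graph, src):
    """graph: {node: [(neighbor, weight), ...]} with non-negative weights."""
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
```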

Advanced algorithm design skill however is generally only useful if you are doing specifically algorithmic work.


They don't teach you the sorts in CS because you'll ever need to do them, but merely as a means to teach you Big-O by example. Being able to tell what's O(n) and what's O(n^2) is a very valuable real-life skill for a programmer.
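A classic everyday case of that skill, sketched in Python: repeated string concatenation is quadratic in the worst case, while join is linear.

```python
def join_quadratic(parts):
    s = ""
    for p in parts:
        s += p        # may copy everything accumulated so far: O(n^2) worst case
    return s

def join_linear(parts):
    return "".join(parts)   # single pass over the total length: O(n)
```

(CPython happens to optimize some += cases in place, but the join version is the one whose cost you can rely on.)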

I agree that there's a balance between the two. My ideal team is probably a quarter who know core CS concepts and how/when to use them, and the other three quarters devs who can write good code whether or not they know core CS concepts.

The part that makes me frustrated enough to comment on this: I've found it a lot easier to convince a dev who doesn't know Big O that they need to learn it than to convince a dev who does know Big O that it's not always necessary. Convincing is significantly harder if the dev graduated from college in the past three years.


I majored in computer engineering and skipped algorithms, so it was pretty intense studying for a couple of months before I felt comfortable. Overall I really enjoyed learning it, but I made it through so many successful years of software engineering without it that I don’t think it’s relevant to most people. Even now, I write an O(n) algorithm to merge sorted lists and people comment on my PR just telling me to throw everything in an array and use the sort function.
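For reference, a minimal version of that O(n) two-pointer merge might look like this (a sketch, not the actual PR code):

```python
def merge_sorted(a, b):
    """Merge two already-sorted lists in O(len(a) + len(b))."""
    out = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])   # at most one of these tails is non-empty
    out.extend(b[j:])
    return out
```

Concatenate-and-sort is O(n log n) but shorter to write, which is presumably the reviewers' point; the stdlib's heapq.merge gives the lazy k-way version.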

You know, in years and years and years of programming, of applications and web and back-end and front-end etc etc, I have never had a real-life use case for Big-O notation, except in interviews. Nor had a real-life use case for caring about sort algorithms. I mean, not once.

I'd say the true foundation needed is entirely different. Avoiding spaghetti code, good refactoring practices, understanding when more architecture is needed, avoiding premature optimization or over-architecture, concepts like technical debt, writing code designed to be readable, how to establish practices and standards and make sure they're communicated properly... These are the kinds of things that actually matter in most projects.

(Obviously, if you're writing video codecs or kernel code, it's a whole different story...)

