
Quite true.

Apparently many programmers keep forgetting that the syntax is just a small part of:

- language itself

- toolchains (compilers, interpreters, AOT, JIT, hybrid)

- differences between implementations and specific behaviour

- libraries, both standard and the most well known third party ones

- IDEs

- build systems and deployment options

- extra tooling for correctness like static analysers

- culture of the community that usually forms around the language

Hence it is easy to dabble in and grasp concepts from multiple languages, but very hard to become truly proficient in more than a few of them.




I'd say the problem is many programmers don't learn enough languages to know what they're missing, or if they do they only get as far as languages with the same semantics but slightly different syntax.

That is true for every programming language and platform pair. The syntax is always the easy part, but unfortunately the only thing most people fixate on.

That's probably because those languages are completely middle-of-the-road, with a load of inconsistent APIs that one has to learn along with the syntax itself, and that don't really expand one's programming ability.

I’m with you. I think people that treat syntax as some completely unimportant detail are forgetting that reading code is a more important use case than writing code.

No matter how much you internalize the syntax of language X, as the sheer number of syntactic structures in the language increases, the higher the likelihood you’ll misread something.


I can't agree with this article more. It's like the author read my mind. I have programmed in Perl, C, C++ on Linux, VC++ & COM, Python, Java, JS, VB, assembly language, Prolog, Lex, Yacc. I am sure I missed a few in this list. I think of problems in abstractions, and when I think of syntax/semantics it's almost a vague haze, a mix of the libraries of various languages. Don't get me wrong: once I start programming in a particular language, I am in the "zone", so to speak. But I think I need Google too often.

I think the primary drawback of languages with complex syntax is the additional barrier to entry it creates for developers who want to write more tools for the language. Many more people have started writing a Scheme interpreter than their own C++ compiler.

If we had better tools, we wouldn’t necessarily be limited to just one user-facing syntax for a language. The syntax could simply be a preference in the tools. My point is that as long as we are stuck in text-file-oriented thinking, no one is even thinking about these possibilities.


>easy to keep the whole language in your head

Isn't that true of all programming languages? They have practically identical syntax for the problems they solve. What differs is the “standard” libraries, which are not part of the language itself but a side effect of it.


I agree. I don't understand the programmers who debate syntax all the time. Good syntax is important, but appropriate semantics are more important. Only after you've selected the typing, execution environment, memory model, paradigm support, available platforms, and libraries you need, should you worry about syntax. At that point there are usually 0 or 1 satisfactory languages.

Because it is easier to reason about languages with formal syntax and semantics. Most languages used in industry are a hodgepodge of syntax and semantics with so many edge cases that even seasoned programmers often end up writing bugs that take them days to track down.

There is much much more to programming languages than syntax. If syntax was the main difference between programming languages it would be very easy to write translators from one language to another. Lots of people like their pet language more than JavaScript, so why don't we see tons of compilers like Python->JavaScript, Lua->JavaScript, Ruby->JavaScript etc?

The answer is that syntax is just the tip of the iceberg. Even ignoring major language differences (garbage collection, static vs. dynamic typing, lexical vs dynamic scoping, eager vs. lazy, threads vs. coroutines) there is an incredibly long list of very detailed semantics that you probably don't even realize you're dealing with when you move from one language to another. For example:

  - in a complex inheritance hierarchy, which method is selected for a.b?
  - how much precision do numeric types have?
  - what happens when numeric types overflow?
  - what order are arguments evaluated in?
  - what happens if fewer arguments are passed than were declared?
  - are keyword arguments supported?
  - are default arguments supported?
  - when are default arguments evaluated?
  - are simple types like integer and string classes?
  - can they be subclassed?
  - can they be monkey-patched?
  - will types be implicitly converted (e.g. 1 + "2")?
  - does 0 evaluate to true or false?  what about "0"?
  - are function arguments passed by value or by reference?
  - are values hashed by their identity or by their logical value?
  - what hooks can you define to customize the behavior of your object?

Nothing in this list is about syntax. Syntax is just the tip of the iceberg when it comes to what makes languages different.
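The questions in that list are easy to check empirically. Here is a minimal sketch in Python (the thread names no particular language, so Python is an assumption) of three of them: when default arguments are evaluated, whether types are implicitly converted, and what counts as false. The JavaScript and Perl behaviors noted in the comments are included only for contrast.

```python
# When are default arguments evaluated? In Python: once, at definition time,
# so a mutable default is shared across every call.
def append_item(item, bucket=[]):  # this [] is created exactly once
    bucket.append(item)
    return bucket

print(append_item(1))  # [1]
print(append_item(2))  # [1, 2] -- the same list object is reused

# Will types be implicitly converted (e.g. 1 + "2")?
# Python refuses; JavaScript would coerce and produce the string "12".
try:
    1 + "2"
except TypeError:
    print("Python: TypeError, no implicit conversion")

# Does 0 evaluate to true or false? What about "0"?
print(bool(0), bool("0"))  # False True -- whereas "0" is falsy in Perl and PHP
```

Every line of this prints something different in some other mainstream language, which is exactly why line-by-line translators between languages are so much harder than they look.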

Syntax is very important for a programming language. It's in your face all the time. A serious downside to many programming languages is their awful or inconsistent syntax, and that results in code that is hard to read and comprehend even before trying to understand what the code is doing.

I just can't understand how even experienced programmers can fixate on syntax. In my experience - repeated tens of times by this point - syntax matters for a few months (tops!) at the beginning of using a language, and then almost completely stops mattering. What experiences would convince someone otherwise? There's an argument against overly complex syntax, but most general-purpose languages out there are very similar in this regard...

Syntax matters. But...

1. Other things matter more. How a language scales to 100 programmers working on 10 million lines over 20 years, say, matters more (in some environments) than the syntax does. Syntax contributes to that. But syntax contributes to that precisely by being pretty vanilla, uninteresting syntax. More, sexier syntax makes a language worse for that environment. (I'm talking about go here. But I could make a similar argument for other languages in other environments.) Syntax matters as a means to an end; the end matters. Syntax where the end is syntax doesn't matter so much.

2. I suspect (and assert without proof) that peoples' brains work in different ways, and that a person finds languages easier or harder as those languages conform or conflict with the way the person thinks. Ruby syntax gives you a headache? And the problem isn't that you just need to learn Ruby better. But for every you, there's (at least one) someone who has the same issue with J/K/APL. And that's fine. People whose way of thinking matches APL should program in APL, and those whose way of thinking matches Ruby should program in Ruby. We don't need one language to rule them all. They each have their target niche and their target audience.


The programming world is like this too, only worse.

Much worse. I think the problem is magnified by our obsession with languages - overlapping subsets of syntax features that have highly intricate relationships with programming techniques (making certain techniques easier to implement and others harder, regardless of problem domain).


Syntax is really not what makes most programming difficult at all. We have languages with really good syntax now, at least for the fairly low level most programming languages operate in (storing and retrieving variables, calling functions, etc).

The only way I could see what you're proposing making sense is if you got the linguist to design a much higher level language that operated on more concrete concepts. Some kind of domain specific language.


I've never had trouble keeping multiple programming languages straight in my head. To me, it's really just a matter of "context switching", much in the same way that a multilingual person can switch between various languages on the spot. If the context calls for a specific language, then it's that language that naturally arises.

It goes even further than that. When I code I instinctively take on the "mannerisms" of the surrounding code. Brace style, indentation, alignment, etc. This is all part of the same mechanism that helps one use the appropriate idioms in the right context.


More than syntax, methinks it's semantics and culture. For those who went off road toward ML, Lisps, Prolog etc, syntax becomes a mere detail. We care more about what the linguistic constructs give us, whatever the clothing. But for others, it's a god damn slap in the face.

In some important cases, languages already have common syntax (such as mathematical operators, regular expressions, and string interpolation such as "\n"). And in those cases, the simplest explanation seems to be: "there was no other sane way to do it".

Some differences really are arbitrary, and could theoretically be merged. Some are historical, e.g. someone worked on Unix for years and thinks in terms of verbs like "echo" because that's what shells used.

But usually, there are very good reasons for any differences.

One reason is the inherent complexity of a parser for a full programming language. It turns out that it's fairly easy to confuse a parser, because they're just not as good as human beings at correctly interpreting the "intent" of a statement. Most languages don't have a reason for being unless they introduce a bunch of unique constructs; and they're lucky if their own constructs can be parsed unambiguously, much less after adding support for extra "standard" constructs from other languages.

Yet another reason is that languages aren't as universal as you may think. For example, what would 'print "hello world"' mean in a makefile...when would the print-out occur? Would languages be allowed to ignore "standard" expressions that they can't logically use, or be forced to invent some interpretation of them?

A final reason is what would happen when languages are embedded in each other, which is even summed up by your example, PHP. If one language is embedded in another, it's a plus that they have different syntax: it clearly separates one from the other. This reduces the risk that you'll have to (for instance) escape or namespace every single name used by the embedded code, to avoid collisions with the surrounding file.


Why do people even care about languages? I've written code in dozens of imperative, object oriented and functional languages and really can't see what the fuss is about. It takes a few days to get comfortable with a new syntax and then you're back to looking up library documentation and gluing pieces together. Learning a new syntax is a fraction of the time spent writing an application. Are people just lazy, complacent or stupid?
