Did it? People seem to use Clojure a lot, other dialects are popular with hobbyists, and some of the most enduring computer science books of all time use it. How many other languages from the 50s can claim that kind of wide use?
Aside from that, as the author points out, ideas from Lisp have made their way into almost every widely-used language today. Its fingerprints are everywhere. That doesn't sound like a failure to me.
I often daydream about an alternate universe where the entire Lisp community unified behind Clojure, built other runtimes for it, etc. I wish we could do that, because it would become a force to be reckoned with. Clojure is perfectly designed and positioned for success in the industry, the real problem for lisp is that the community is so fragmented.
I often dream of a world where you and I ride bicycles as we canvas the neighborhood with our Clojure and Rust pamphlets explaining the path to perfect computation.
GraalVM can compile a native binary for all major OSes that does not contain the JVM but does run SubstrateVM. It generates binaries on the order of tens of megabytes (2 in my case) and eradicates JVM startup time, so things like scripting are now viable without using JS.
Runtimes for Clojure include the JVM, JavaScript engines, and .NET; then there are less popular runtimes targeting Go, Erlang, and Python. Realistically I'd use one of the first three till the community is bigger.
I don't know another language with as much practical reach as Clojure has
It is also a success, on the other hand. It's a good alternative to Javascript.
We also have a subset of Common Lisp that can be statically compiled to javascript: Parenscript. And a subset of CL that runs on the browser as a lisp implementation: JSCL.
Every now and then people write these silly articles about lisp failing, yet it persists as an elegant solution that doesn't need constant new releases and picking up the latest fad.
I have been using Clojure(+Script) full-time for 6 or so years. I hope I never have to use a less elegant or more verbose language. The trick/tradeoff is to work at early stage startups that leave you with enough control to choose your tech stack. If I did have to go back to, say... JS (which thankfully has improved in the past several years), I will at least bring with me a fresh clarity; Clojure certainly made me a better programmer.
I think Lisp failed because it had no killer app. Most developers don't pick a language, they pick a project and then select the most appropriate language.
Web frontend -> Javascript
Unix / Linux -> C
Wordpress plugins -> PHP
Windows apps -> .Net
iOS -> Objective C / Swift
Android -> Java
In my entire career (25 years), I've never had a project that directed me towards learning Lisp. This pretty much leaves Lisp to the type of developer that seeks out new languages and is willing to spend the extra effort integrating, and that's a pretty small number of developers.
If Lisp was in the browser instead of Javascript, it would be popular no matter the complaints about the language.
That's interesting. The author of Javascript wanted something lisp like. That must be why Javascript feels a bit like lisp to me with functions and closures being primary building blocks, notwithstanding the recent misguided attempts at making Javascript look more imperative with promises and the like.
I think there is something to the argument in the article about Lisp being too expressive and allowing developers to have too different styles making it difficult to read other peoples code. I wonder if another part of the problem with Lisp is that newbie developers struggle to follow continuation-passing style programs. This would explain why the promise crutch is so popular in Javascript.
I don't think it's the brackets that people dislike, it's having to understand the layers of scoping and closures that the brackets imply. Once you get used to it though it becomes very powerful.
Could you recommend a resource that explains why these features are misguided? I've found them to be much nicer than callbacks but it's possible that I'm misguided too. :~)
With CPS it seems like you have to build and then instrument a lot of additional machinery (CPSMath.pow, seriously?) and occasionally turn your code inside out (the loop completion example) in order to rarely get a more convenient way to extend certain types of computations.
The author also seems to use the fact that Javascript is often implemented in an odd way in browsers (the iframe complaint) to find fault with the entire concept of Promises in general, which doesn't seem appropriate to the discussion here.
Finally... he suggests that CPS shows its power in a "probabilistic programming language" that executes blocks of code in a random order. Perhaps I shouldn't judge an author by their contents, but I think this author is so in love with the idea of CPS that he can't see how ugly and mostly mismatched its implementations are.
Don't use CPSMath.pow. It's just in there to demonstrate that it's easy to turn direct style into CPS style. When a function doesn't need to be asynchronous, just use direct style.
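For readers who haven't seen the contrast being discussed, here is a minimal sketch in Python (the names `pow_direct` and `pow_cps` are made up for illustration, not taken from the article):

```python
# Direct style: a function computes a value and returns it to its caller.
def pow_direct(base, exp):
    return base ** exp

# Continuation-passing style: instead of returning, the function passes
# its result to a callback (the "continuation") supplied by the caller.
def pow_cps(base, exp, k):
    k(base ** exp)

# Direct style composes with ordinary expressions...
total = pow_direct(2, 3) + 1          # 9

# ...while in CPS the rest of the computation moves into the continuation.
result = []
pow_cps(2, 3, lambda v: result.append(v + 1))   # result == [9]
```

Nothing here is asynchronous; the point, matching the advice above, is that CPS only pays off when a computation genuinely needs to be suspended or resumed, and direct style stays simpler otherwise.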
Friendly recommendation: when writing examples it's much nicer to use a real use-case rather than abstract functions like `cps4()` or `CPSadditionalProcessing()`.
>I think there is something to the argument in the article about Lisp being too expressive and allowing developers to have too different styles making it difficult to read other peoples code.
Rest assured this isn't much of a problem in Common Lisp, because what usually happens is that each part of the program is written in the style that makes the most sense for it.
For example, if the program lends itself to be easily done using OOP, then it is written using CLOS (the lisp OOP system).
If, for example, there is a part that is a state machine, it might be written in the old "goto <label>" style.
If it lends to functional programming, well, lisp was the first FP language, so fine.
If a part of the program requires generation of HTML, the source will resemble HTML.
IIRC, the only actual browser scripting language competitor to Javascript at the time was Microsoft's VBScript, and that probably would have won. There was never really a third option on the table, just "what Microsoft did" and "something else."
It would probably have been FutureSplash Animator, which became Macromedia Flash and then Adobe Flash. For a long time Flash was the de-facto web scripting language.
Flash is scripted with a language eerily similar to JavaScript -- so similar that I assumed it was derived directly from JavaScript. In the hypothetical world where JavaScript was lisp... wouldn't Flash use that?
I would say cross platform apps, and an 'alternative to C/C++' was Java's killer app. Popularity as a backend language comes as a side effect of its general popularity (a la Node w/ Javascript, though there's something to be said about Node's 'different' approach)
This. Also the libraries - in 1994 c++ toolkits were very poor - I was there, it was terrible - the Java libraries and in particular AWT were just fantastic. Basically features and deliveries that required weeks of heartache and grind could be delivered in hours.
In very practical terms, dependably obtaining a stack trace that says "NullPointerException" or the like instead of silent and intermittent memory corruption was more than enough to attract shell-shocked C and C++ programmers.
Objective C didn't have a major corporation behind it until 2006-2009. Apple wasn't really relevant (outside the iPod/iTunes) until OS X gained traction and the iPhone (and app store) came out.
Apple was not a major corporation in 2000? (But it's true that would be post-Java.) Then neither was Sun. (Or maybe FigBug meant IBM which pushed Java heavily a few years later? They also pushed Smalltalk before, by the way.)
In 1994 Apple was about to go insolvent after failing multiple times to create a next-generation OS, and spending the remaining money acquiring NeXT, which happened to be a reverse acquisition in the end.
What do you consider a major corporation? In (fiscal year) 1997 when it merged with Next and started to push objective-c the company had $7bn in revenue (down from $11bn a couple of years before, that’s true) and more than 8000 employees. The next couple of years sales were around $6bn but increased to $8bn in 2000.
Java features that have yet to show up in other languages:
Scalability - both horizontal and vertical.
Memory utilization (you can have humongous heaps without having to worry about managing all that memory).
Manageability - easy to "divide and conquer": you can plop in a jar and start using it right away. Something that's practically impossible in other languages.
Third party ecosystem (both free/oss and paid).
Speed. Java is one of the fastest managed languages.
Yes it is verbose, yes inexperienced developers do the wrong things, but it does a great job preventing you shooting yourself in the foot, and most importantly, once you have a stable codebase, it can keep running for months/years.
> you can plop a jar and start using it right away
I'm mostly a Java developer, so I think this explains why Docker always looked a little silly to me. I also write some Python, and once I have to set up a virtualenv, that's when Docker's existence makes more sense.
Yes, applets looked promising, but server-side, you might say Java was the Node.js of its time. Back in the early dot-com era, we were looking for a way for inexperienced programmers (us) to do Internet programming easily, and with its built-in threads and easy-to-use network libraries, Java looked pretty good. (And where it wasn't good yet, we thought Sun would improve it.)
This is around the time when Apache was popular with its original process-based concurrency model. You could waste a lot of memory running mod_perl with a pool of a hundred processes.
Java performance was pretty bad compared to C, but writing a select-based C server looked convoluted, and using threads in C looked obscure, non-portable, and bug-prone. We wanted a friendlier language than that.
Also, to add to what everyone else is saying, you have to remember (or try to imagine) what it was like back then. When Java first came out it was hella cool! (Like MTV.)
Yeah, we geeks like to think the concepts and technical design of a language in itself is what makes or breaks it. But JavaScript could have been similar to BASIC or it could have been similar to Scheme - people would have grumbled but used it regardless. Objective-C was considered a pretty obscure niche language until the iPhone made it mainstream.
The article cites fragmentation as the reason Lisp didn't become mainstream. But consider SQL - a severely fragmented language. Each implementation has major incompatibilities with every other. But it is still ubiquitous and rules its niche.
Yeah, that's a great way to look at it. Just to add to your point, why in hell would someone use Emacs Lisp? Well, it has a killer app (Emacs / org-mode / etc.) and people learned and used it.
I wonder if ClojureScript -> [Node, Web-js] is how lisp will be saved? Or maybe lisp doesn't need to be saved and it'll always be this non-mainstream language. (To be clear: I say non-mainstream but I don't mean it negatively.)
Was Pascal a failure? Lots of programs were written in it. It had killer apps. Lots of programmers learned it. It made its way into the education system.
Lisp might be viewed similarly. It’s not a fad and big companies don’t use it widely, but does that make it a failure?
Google uses Common Lisp. (They even host a style guide!) So does Rigetti Computing making quantum computers. So does/did DWave, another quantum company. There are a handful of companies that have existed between 15–30 years developing Lisp compilers (LispWorks, Franz Inc, etc). Lisp also seems to be making HN front page at least once a week.
Go is a direct successor of a long line of alternative C evolution (Bell Labs didn't stop evolving the language or experimenting with new ones, and Go initially was pretty much syntactic sugar for Plan 9 C and its runtime and style)
Then it's a case of convergent evolution, because coroutines were even a part of Modula-2, and the type of class / interface approach that Go sports looks very much like Oberon's.
I wouldn't be surprised if the ideas were circulating between the Oberon and Plan 9 communities back in the day.
Griesemer, one of the three original Golang developers, did his dissertation on extending Oberon to, I think, massively parallel computers. Golang has a bunch of constructs that weren't in Plan 9 C; some of the syntactic sugar comes arguably from Oberon, but most of the semantics come from Newsqueak and Alef.
IIRC after Alef went "bust" due to disagreements about GC, its thread semantics got ported to C as libthread, as well as used in Limbo. Then Go kinda resurrected it but with GC?
First, it was used for teaching, like Java today. So you had a wide base of programmers knowing it.
Second, there was Turbo Pascal which made it an extremely popular language on the PC platform (it also used to have a significant presence on the Mac, but I'm less familiar there).
I guess, although in the case of Java I'm not too sure. It seems like what you're really pointing out isn't so much a killer app but a captive market. If programmers are forced to use a language, they'll use it, is basically what you're pointing out. But think about C++, or Python or Java(!) or any number of other languages that didn't actually have a captive audience or a killer app and still became popular--- interestingly, in one of the other comments they say that the "killer app" for Java is safety and GC. That doesn't sound like a killer app, that's a language feature.
What I'm getting at here is that it's pretty clear that language features are an integral part of what makes a language successful. In fact, apart from some extreme, extreme outliers like JS which just happen to be the most visible, features are the only deciding factors outside of luck (and marketing). So if you ask why a language isn't successful, saying that it doesn't have a captive audience isn't a very descriptive or helpful metric. Yes, without a captive audience a language maybe won't be as successful as JS, but that's not the real deciding factor at all.
Aside from a captive audience based killer app, projects don't lead to language choice necessarily. Why did the person who created Numpy choose Python? Python didn't have a killer app for that. They chose it because of language features.
So we're back to the OG question: in the arena where languages normally compete (besides outliers), LISP looks amazing, so why isn't it as successful as we'd expect? I'd honestly suggest it's just bad luck, no marketing, and a fragmented community.
Now, we could talk about why the LISP community is fragmented, and in this case I think it's due to too much of an emphasis on extending the language. DSL-building based programming paradigms are actually really effective, they're essentially what FP and OOP are all about. In OOP you build a custom type system and language to represent the problem. For FP you do the same, just with less internal state. LISP's only difference is it has more powerful abstractions for formalizing the process. The problem is they confused DSL building with language building and so multiple different general purpose dialects proliferated.
It was bundled by C compiler vendors early on during the 90s, and it was on the rise as all desktop vendors were jumping into it as the way to write GUIs.
Mac OS, BeOS, OS/2, Windows, they were all moving into it.
Had it not been for the rise of C-based FOSS and the respective free UNIX clones, it would have spread even more.
I don't think anyone has "chosen" to implement Numpy (or any of the hundreds of C and C++ modules) in Python because of its language features. On the contrary, they specifically avoided using pure Python because of its low performance and opted for more performant languages used through FFI.
The reason people have gone through the effort to implement those modules is absolutely because of "captive audience based killer apps" which for Python were ML and scientific computing. And while Python is decent in its role as a glue language, it succeeded mostly by being easy to pick up and very forgiving, which shouldn't be the main factors for choosing a language for anyone who is a software engineer and not a scientist.
But what made the people who chose it for ML and scientific computing choose it? What made it better for that? There has to be a reason to choose it in the first place, and before Numpy it wasn't awesome at ML and scientific computing that I know of.
Also, being easy to learn is very much a language feature that put it in good standing for non-software-engineers, and I'm not sure why you brought up if it's good for software engineers.
The reason to choose it in the first place is, as I said, ease and convenience of use when used as a glue language. It's easier than in say, C or C++, to call some functions operating on Numpy arrays and make a graph based on that.
I just think saying "the person who created Numpy chose Python" is a bit weird, when all the heavy-lifting code is in C. Following this logic that person also chose all the other languages that have bindings to Numpy. Yes, the creators of Numpy probably had Python in mind when creating the library. But that just means they chose Python as their glue language of choice, not as their platform for implementing algorithms used in ML or scientific computing.
I did say that being easy to learn is a great language feature that made Python succeed in scientific circles. It's just that using Python for science exposes its shortcomings much less often than using it more generally, which is in most cases done by "full-blown" software engineers, who spend more time programming than, for example, creating a scientific model.
Yeah I mean I agree with you on basically all of this, I just think this proves my point that to whatever extent languages are used, it's more about features of the language, not killer apps.
And yeah, saying the author of Numpy chose Python is a little weird, since it's mostly written in C and Fortran (IIRC) but I'm pretty sure it was made with bindings to Python in mind, as you say, so the point still stands.
Actually, Matlab pulled educational licenses from research institutions like Fraunhofer. This created a need for an alternative to Matlab, which made Python popular in the ML and scientific communities.
>Now, we could talk about why the LISP community is fragmented (...) The problem is they confused DSL building with language building and so multiple different general purpose dialects proliferated.
The Lisp community isn't fragmented. There are two main lisp dialects (Common Lisp and Scheme), three if you want to add Clojure, and each of them has a faithful community.
Well, yes, but look at how fragmented the Scheme community is. And then also, fragmentation between 3 dialects is still a good bit more fragmentation than most language communities have.
"He later calls Scheme “that beautiful research language I was tempted with.” But by the time he’d joined Netscape, they had a deal with Sun Microsystems, which was now pushing their newly-minted language Java. “And suddenly the story was, ‘Well, we don’t know if we want Scheme. We don’t know if we even need a little language like we wanted you to do. Maybe Java’s enough.'”"
Android -> Java? Java dominated well before mobile.
Also JVM languages are surging even though none have a specific killer app.
But Clojure is still niche. And is the most powerful/simple of them all. I think there’s something else going on, too much power & simplicity in a language doesn’t yield to mass appeal - why?
Also, Java is not c++. I was a c++ programmer in 1994; in 1996 I was a Java programmer. I cannot explain how much my quality of life changed. I know c++ today isn't the way it was then, but holy heaven, working with it then was like shaving with a hover mower.
The fact that you could get roughly the same order of magnitude performance as C in an environment that didn't need tons of #ifdefs to work on various platforms was amazing.
> I think Lisp failed because it had no killer app.
As a vim user I love to think of all of the emacs peoples' heads exploding when they read lisp has no killer app. However it should be noted that lisp still has strong use cases in education and language design and even if not commonly used in industry its principles and design ideas come through in many modern languages.
They retired public access and private access for "small clients". The engine is still there, Google just decided it's not going to keep the vendor-client relationships.
Either that, or people know a language, get given a new project which is generally not written in that language, and find some compiler/bridge/other tool that allows them to write what they need in the language they are accustomed to writing in.
For what it's worth, the "Android -> Java" part is now obsolete. Especially as you do mention Swift in the context of iOS development. Kotlin is now the default language for Android, officially recommended ahead of Java.
I'd argue that none of those languages are the most appropriate for any problem at all. As long as the language itself is somewhat usable it's mostly the ecosystem (and hype) that counts: Libraries, editor support, build tools, etc.
I think you’re on to something, but maybe missing a historical detail: Lisp had a killer app: AI.
Unfortunately, the massive investment of time, money, attention, etc that went into Lisp and AI came too early, faltered, and left both high and dry for a few decades. It’s called “AI Winter”.
Thank you. This is basically the economic theory of PLs that I have been trying to explain to many people.
An ecosystem forces you to program in a language. As long as the ecosystem grows in market value, it will require more developers, more people learning it, and more jobs for it. A positive, growing circle - and once it reaches a certain threshold it should become self-sustaining, with interest from many parties in trying to improve it.
On the other hand, if a language's ecosystem does not reach that threshold, there will be doubt about whether new projects should be built on it, and there will be fewer openings for developers. Less demand means people are not learning it; the market wants experienced devs, but experienced devs will have moved on to other ecosystems. The lack of hireable devs means PMs should choose something else that makes hiring easier. And the ecosystem shrinks - what we call a language dying. It really isn't dying; it would be more accurately described as shrinking. The negative circle.
And in most cases languages live on in another form. People call Ruby Matz's Lisp.
I wonder what is the most widely used, popular program, that has been written in Lisp? Is it Emacs?
Every time I sit down to try to learn Lisp, I end up wondering what the hell I'll use it for. Functional programming languages seem to have flipped a bit in my mind that predisposes me to be prejudiced against Lisp... I find it very hard to do anything actually useful with it, whereas I can pick up C++ or Lua or Python and immediately get something running.
I'm not saying this is Lisp's fault, but I've been programming for 30 years and have tried many times to become a Lisp programmer... it's just never been effective.
Lisp is magic. You use it to conceptualize your problem. This is most naturally done by creating a DSL for it. That is, imagine that you had a programming language that was built to do exactly what you want to do. You write that down just as you imagine you would like to see it. Then all you have to do is patch in the underlying machinery. Magic!
Macros. You can actually add new syntax and paradigms to the language in Lisp. You can't do that in Python or Javascript. Yes, you might argue that "I don't need to change my language! It's fine the way it is". And it might be -- for now. But consider -- whatever you think about the merits of object-orientation, most languages like Javascript and Python had to be especially written to support it. Lisp could add object orientation to itself without creating a new language, a Lisp++, as it were. Lisp is future-proof the way other languages aren't.
The question with that though is, is creating a unique DSL for every problem you solve actually a sustainable way to program? Once you do that, every single person who works on the code needs to understand all the implicit behavior of the DSL before they can use it. And any misunderstanding they have about it comes out as bugs. Yes, maybe OO had to be specifically added to Python, Java etc., but it's one OO that everyone uses, so when I encounter it in the wild I know exactly how it works.
>What can you do in a few lines of Lisp that can't be done similarly in Python or Javascript?
To pick just an example: Anybody can create a Brainfuck (programming language) interpreter in a few lines of Python or Js code.
With Common Lisp you could create a Brainfuck compiler (that compiles down to native code and thus has native code speed) in the same amount of lines of code. Or maybe shorter.
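To make the interpreter half of that comparison concrete, here is roughly what "a few lines of Python" looks like (a quick sketch with no error handling; the Common Lisp compiler version is the parent's claim, not shown here):

```python
def bf(program, inp=""):
    """Minimal Brainfuck interpreter: 8 commands over a byte tape."""
    tape, ptr, pc, out, it = [0] * 30000, 0, 0, [], iter(inp)
    jumps, stack = {}, []
    # Precompute matching bracket positions so loops can jump directly.
    for i, c in enumerate(program):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(program):
        c = program[pc]
        if c == ">": ptr += 1
        elif c == "<": ptr -= 1
        elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".": out.append(chr(tape[ptr]))
        elif c == ",": tape[ptr] = ord(next(it, "\0"))
        elif c == "[" and tape[ptr] == 0: pc = jumps[pc]   # skip loop
        elif c == "]" and tape[ptr] != 0: pc = jumps[pc]   # repeat loop
        pc += 1
    return "".join(out)
```

For example, `bf("++++++++[>++++++++<-]>+.")` computes 8*8+1 = 65 and prints "A".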
A different example: The circle-ellipse problem. It is solved easily in Common Lisp (few lines of code), because the CLOS OOP system is more powerful than the OOP systems in Java, C++, Python, Javascript and many other languages.
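For readers unfamiliar with the circle-ellipse problem the parent mentions, here is a sketch of it in Python (the class names are illustrative; this shows the problem, not the CLOS solution):

```python
class Ellipse:
    def __init__(self, width, height):
        self.width, self.height = width, height

    def stretch(self, factor):
        # Perfectly reasonable for a general ellipse: scale one axis only.
        self.width *= factor


class Circle(Ellipse):
    # "A circle is an ellipse" seems natural, so we inherit...
    def __init__(self, radius):
        super().__init__(radius, radius)


c = Circle(2)
c.stretch(3)
# ...but the inherited method silently broke the invariant width == height:
# c.width is now 6 while c.height is still 2 - the "circle" is not circular.
```

CLOS offers extra room to maneuver here with features like multiple dispatch and `change-class` (which lets a live instance migrate to another class); the parent doesn't spell out which mechanism they had in mind, so treat this only as an illustration of the problem.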
That's just because you're used to writing code in those languages, and if they work for you for the things you want to do, that's fine. What Lisp gives me that other high level languages don't is that it's interactive and has a built-in garbage collector (like Python but unlike C++), it has symbols with values and properties, it has closures, I don't need to write methods to read or print objects, and I can add to the language. If you don't need these, you probably don't need Lisp.
I use C for low level stuff (like building a virtual machine for Lisp), and C++ for machine vision (and nothing else). Their compiled code might run faster than Lisp's but for most things Lisp's fast enough, and its faster development cycle makes it easier to find more efficient algorithms.
I could, I suppose, do many of the things I currently do in other high level languages, but I don't have to. The only other high level language I do need - which gives me functionality Lisp doesn't - is Prolog, which I need to implement the type system of my visual programming language. I wrote the Prolog in Lisp and can run it on the Lisp REPL, so it's really just a Lisp library.
Speaking to why a LISP newbie finds it tough to get things done... I think most people think in imperative terms. "I want to make a type. Now, given a thing of that type, change a thing here, change a thing here, and return the thing I want."
And most languages delineate a programmer's tasks clearly. For instance, in Python and Ruby, logical structure is expressed in blocks, control (mostly) transfers between statements, while calculation is performed in expressions, and these are all visually distinct.
LISP's idea that all structures can be expressed as lists doesn't visually demarcate the distinct "things" a programmer is trying to compose. You can absolutely learn it, but it makes it harder.
Now, since LISP does basically use an AST directly, it makes metaprogramming trivial. The beauty of extensive metaprogramming is that you can construct new languages cheaply. The curse is that you get a lot of cheaply constructed new languages.
If Python wants to add an "async" keyword, they go through weeks of debate and do a long writeup on it.[1] In LISP, someone writes a few macros and, boom, new syntax. The LISP community are smart people, they recognized this and I think the standardization efforts tried to mitigate it, and a ton of thought clearly went into Scheme.
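To make that contrast concrete: transforming code in Python means going through a dedicated `ast` layer rather than the plain list manipulation Lisp macros use. A small sketch (the `Doubler` transform is invented purely for illustration):

```python
import ast

# In Lisp, source code *is* a list you rewrite with ordinary list functions.
# In Python, you must round-trip through the ast module instead:
tree = ast.parse("x + 1", mode="eval")

class Doubler(ast.NodeTransformer):
    """Rewrite every integer literal n into 2n - a toy 'macro'."""
    def visit_Constant(self, node):
        if isinstance(node.value, int):
            return ast.Constant(node.value * 2)
        return node

new_tree = ast.fix_missing_locations(Doubler().visit(tree))
code = compile(new_tree, "<macro>", "eval")
result = eval(code, {"x": 10})   # the literal 1 became 2, so x + 2 == 12
```

This works, but it is clearly a separate meta-level bolted onto the language, which is the point the comment is making about why new syntax in Python goes through a PEP process rather than a library.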
But even this article doesn't get the problem. It's not "there are too many parens" but more that everything in LISP looks the same when you're new, it's a jumble of parens. Your brain isn't getting distinct markers to help learn the structures.
And LISP, shares a problem with most dynamic languages that anything can kind of go anywhere. It's not as bad for others, though, because if you look at an example of Python code, you can generally see the structure and that narrows down what can fit there. Whereas LISP is always a mess of parens and keywords; yes, it's usually obvious from looking at the docs, but it's just more research you have to do to get something done.
>But even this article doesn't get the problem. It's not "there are too many parens" but more that everything in LISP looks the same when you're new, it's a jumble of parens. Your brain isn't getting distinct markers to help learn the structures.
In a typical language you have the "structures" you mention, like "if", "for", "switch". Lisp also has them, but they look more like function calls. The syntax has a lot less noise.
Automatic indentation helps you visualize such "structures" in an easier way. It's not any more difficult than using a language like Python or C.
>And LISP, shares a problem with most dynamic languages that anything can kind of go anywhere.
Not really, because Common Lisp -the major Lisp dialect- is a strongly typed language, unlike the most popular dynamic languages: Javascript, PHP and Python and Ruby (Python & Ruby being 'duck typed' for the most part). And unlike the famous weakly-typed statically-typed language: C.
So no, "anything" can't go anywhere, because a type mismatch will cause a runtime error. And in CL you can correct runtime errors by modifying the source code and resuming program execution (without having to restart the complete program), so no big deal either.
The most common data construct, the Lisp list, accepts any kind of data type inside, but Lisp also has structs, objects and arrays, all of which can be defined for specific data types. Note that the ANSI Common Lisp standard includes extensive support for type declarations.
Not to mention that Lisp, unlike other languages (Java, C++ -- I'm looking at you guys), doesn't erase types at runtime. Lisp works with values (not variables), and values are typed. Any value in Lisp carries its type information at runtime.
Note here how you unconsciously use "has" instead of "have", that kind of grammatical noise exists because those small redundancies help us understand each other.
I think the reason very few computer languages use parens is because they're harder to read by virtue of having less grammatical redundancy.
(At the other end of the spectrum, languages like COBOL are hard to read because they're swamped in redundancy.)
> So no, "anything" can't go anywhere, because a type mismatch will cause a runtime error.
I should have been clearer: it can go anywhere while you're writing it, so you have to run it to find out. That's the problem people also have with other dynamic languages when they're working on non-trivial codebases. Also, because types are rarely declared up front, you're often paging through code to figure out what type should go some place.
> And in CL you can correct runtime errors by modifying the source code and resuming program execution (without having to restart the complete program), so no big deal either.
I agree that's an amazing feature few languages have been able to copy, but it seems like it wasn't enough.
> So what is the problem with creating domain-specific languages as a problem solving technique? The results are very efficient. [...] It results in many sub-languages all slightly different. This is the true reason why Lisp code is unreadable to others.
> The reason Lisp failed was because it fragmented, and it fragmented because that was the nature of the language and its domain-specific solution style.
The main points above can be solved with two things:
1. A more opinionated base language with batteries (e.g. Clojure)
2. A package manager shared by users of the language
Ruby is expressive and has the same DSL failure modes as Lisp (though usually not as low-level), but it starts with a more batteries-included base. If it weren't for Rails and gems leading the way to normalized usage, Ruby would not be as successful as it is today.
Sorry but that's just wrong. He is complaining about DSLs making things hard to read. That's not fixed by switching to another language or a package manager. That's purely a software engineering problem - nobody forces you to write code in that style.
And LISP code being unreadable to others - sorry, that's bull. Seen e.g. some "modern" JavaScript stuff recently? Good luck making heads or tails of some of the frameworks - it often doesn't even look like JavaScript anymore! And nobody seems to be claiming that it leads to the demise of the language. Or numerical calculations in Fortran - still a gold standard for scientific stuff.
To me the article very much reads like the output of someone who couldn't be bothered to learn the language, griping about things he doesn't really understand well and making huge generalizations based on that.
Why LISP isn't popular today is simply that, unlike other languages, it went without a free/cheap compiler and IDE for the PC for a very long time (except for the crippled LispWorks). LISP has always been a university/research thing running either on specialized machines or, later, on Unix, not something "mere mortals" had access to.
Also most programmers have been taught languages like Pascal/C/Java, maybe Scheme in their introductory courses and have never been exposed to LISP, so they have no way to know about it unless they are themselves curious about it.
> Sorry but that's just wrong. He is complaining about DSLs making things hard to read. That's not fixed by switching to another language or a package manager.
I wouldn't put the formatting of the ad hoc DSLs used in a program as the hard part. It's the chosen abstractions and decomposition/composition patterns used. This is true in any language with metaprogramming, but why has Lisp suffered more than others? I doubt reader macros are to blame. If anything, they would help put things in logical rather than implementation-required form.
I agree that CCL and SBCL are superb on Linux and Mac. But until at least a few years ago, they were both poorly supported on Windows (requiring something like Cygwin). That may have changed now. Also, AFAIK, CCL is dying now. :(
CCL's development is relatively slow compared to sbcl, but I'm not sure that it's "dying". And, there's been a brand new compiler on LLVM, clasp, as well as the SICL project.
>there hasn't been a free/cheap compiler and IDE for a PC for a very long time
One of the first Common Lisp implementations (80s), Kyoto Common Lisp (KCL), was also free.
One of the highest performing Common Lisp implementations, CMU CL (CMU Common Lisp), has been also free for decades. It is still available and SBCL (the most popular implementation today) is partly based on it.
>LISP has always been an university/research thing running on either on specialized machines or later on Unix
This was only true in the 70s, and only true for the most advanced/modern Lisp dialects like ZetaLisp.
The 80s is a different story.
You can check out the TIOBE index for 1989 -- the 3rd most popular programming language was Lisp, after C and C++. It's also very interesting that ANSI Common Lisp was the first object-oriented language to get an ANSI standard.
Today you don't need a Unix/linux machine, there are many implementations that run just fine on Windows: ECL, ABCL, SBCL, CCL, and CLISP to name a few. SBCL, ECL and CCL produce native code; SBCL is particularly fast and CCL, while relatively slower, is still a fast implementation.
LISP doesn't really have a hugely complex conceptual level. It gives you lists, which can be combined to form complex structures, and a list can also be code.
MLs took ideas in LISP much further and imposed powerful type systems on them, so they appeal to a smaller audience than LISP does.
And the multi-paradigm languages simplified ideas from LISP to be usable for a broader audience, and brought in new ideas from elsewhere.
So I don't think SE has failed to reach LISP, if anything, it's surpassed it.
> It gives you a list, which can be combined to form complex structures
In Common Lisp, using lists for most data structures is a mark of a poor programmer. If you think Lisp is about list processing, you perhaps need to become familiar with how Lisp has evolved since the 1970s.
Honest question: is the following the way you do hash maps in Lisp?
```
(setq a (make-hash-table))
(setf (gethash 'color a) 'brown)
(setf (gethash 'name a) 'fred)
(gethash 'color a) => brown
(gethash 'name a) => fred
(gethash 'pointy a) => nil
```
In principle I agree that reader macros are handy. In practice I mostly avoid them because reader macros don't have separate namespaces in CL, so it's likely your reader macro will step on one from an external library.
(Yes, you can carefully bind `*readtable*` before compiling code, but methods for doing so are ad hoc and not standardized. This is a perfect example of a convention that Common Lisp programmers need to establish formally when using CL for software engineering.)
This works, but it's a plist and thus O(n) for lookup. I assumed the questioner wanted an O(1) data structure; thus a hash table is the better answer. (Granted, it makes little difference when n is small.)
> This works, but it's a plist and thus O(n) for lookup. I assumed the questioner wanted an O(1) data structure; thus a hash table is the better answer. (Granted, it makes little difference when n is small.)
There was a test on some Common Lisp implementations, of hashtables versus plists on access speeds.
On small (<30?) datasets, plists were much faster than hashtables. Bigger than that, the constant access speed of hashtables makes HTs a more sane choice.
The other thing is memory consumption -- on CL, plists are very frugal.
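The plist-versus-hashtable tradeoff can be sketched in Python (illustrative only; a list of alternating keys and values stands in for a CL plist, a dict for a hash table, and `plist_get` is a hypothetical name):

```python
# Illustrative Python analogue of the plist-vs-hashtable tradeoff:
# a plist is a flat sequence of alternating keys and values (linear scan,
# very frugal in memory); a hash table buys O(1) lookup with more overhead.

def plist_get(plist, key):
    # Scan key/value pairs left to right -- O(n), like CL's GETF.
    for i in range(0, len(plist), 2):
        if plist[i] == key:
            return plist[i + 1]
    return None  # like CL returning NIL for a missing key

plist = ["color", "brown", "name", "fred"]
table = {"color": "brown", "name": "fred"}

print(plist_get(plist, "name"))    # fred  (O(n) scan)
print(table.get("name"))           # fred  (O(1) hash lookup)
print(plist_get(plist, "pointy"))  # None
```

For a handful of keys the scan is cheap and often beats the hashing overhead, which matches the "small datasets favor plists" result described above.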
As far as representing code the same way it represents data, that's still done with lists. And the fact that code and data have an identical representation is the most unique feature of LISP.
Recall that the original claim was that software engineering hasn't caught up to LISP. Making metaprogramming trivial by using an identical representation for code and data is the key insight of LISP, and we do a lot of metaprogramming in many languages, so yes, we've caught up with LISP.
It's true that "code and data have an identical representation" but the important point is that neither of those things is a string. Strings are simply one presentation of code and data, but they're not what Lisp uses internally. In most languages, this distinction is not important, but it's critical for Lisp. If SWE had caught up with Lisp, git for example would be able to produce differences in function definitions, rather than merely in the strings in files that represent those function definitions. That would make git a lot more useful to a Lisp programmer.
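The git point can be sketched in Python (illustrative; `sexp_diff` and the sample definitions are hypothetical): once a function definition is a tree of symbols rather than a string, a tool can localize a change structurally instead of diffing lines of text.

```python
# Code-as-data sketch: a function definition as a nested list ("S-expression"),
# not a string. A tool can then compare structure directly.

def sexp_diff(a, b, path=()):
    """Return the tree paths at which two code trees differ -- a toy structural diff."""
    if type(a) is not type(b) or (not isinstance(a, list) and a != b):
        return [path]
    if isinstance(a, list):
        if len(a) != len(b):
            return [path]
        diffs = []
        for i, (x, y) in enumerate(zip(a, b)):
            diffs.extend(sexp_diff(x, y, path + (i,)))
        return diffs
    return []

old = ["defun", "area", ["r"], ["*", "pi", ["*", "r", "r"]]]
new = ["defun", "area", ["r"], ["*", "pi", ["expt", "r", 2]]]

# The diff pinpoints the changed subtree inside the body, untouched by
# formatting; a string diff would report whole changed lines instead.
print(sexp_diff(old, new))
```

This is only a sketch of the idea, but it shows why a Lisp programmer might want version control to operate on definitions rather than on the strings that happen to represent them.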
That is also the reason why, for me, it's essential to use an editor (usually an IDE) that works with the AST when using other languages. But it's a good point you make. If Lisp (or Lua tables, or any kind of tree) were the base structure for everything instead of bytes, it really would be a huge jump up. (There were probably many attempts at this, including Charles Simonyi's language workbench, the Eclipse world's EMF as the basic tree, urbit/nock, etc.)
This is precisely correct IMHO. Lisp was designed for single programmers managing their entire stack (including the compiler itself) rather than for teams mostly gluing libraries together. Modern software engineering practices are focused on lower-level languages that don't give programmers the power of Lisp.
[Herein I mostly mean Common Lisp when I talk about Lisp. I have no experience with large projects in Clojure or Scheme.]
Most modern software -- Lisp programs included -- is written by multiple programmers in both coordinated and uncoordinated teams. Lisp can easily adapt to team programming but SWE needs to be augmented to handle programming projects that operate at a meta-level, which is natural with Lisp. If you write code at a meta level with ordinary SWE it can lead to disaster.
SWE needs tooling, procedures, and policies to deal with macros, package naming, bootstrapping, multiple-inheritance classes and first-class methods, etc. before it can really support Lisp. Lisp projects that have added such SWE features manually tend to succeed; those that don't tend to have difficulty with Lisp. My point is the team has to do the SWE work; SWE tools operating in default mode tend not to be good enough for Lisp.
So did Symbolics. And most of those programmers had full-stack mastery. That's not the case with modern software engineering projects. Nor did McCarthy or Symbolics have to deal with the manager-friendly buzzword salad that permeates most software projects today.
I use Common Lisp for most of my development now that I am (mostly) retired, so I admit some bias. Given that:
Common Lisp tooling is very good and there is an active community of users. I argue that Haskell is really a Lisp language, at least it seems so to me. Also, so many good Lisp-y things are in Ruby, Python, modern JavaScript, etc.
I think language selection for a project is a process of making trade offs. I am happy that it is a free world to develop using whatever languages and tools we want. I try very hard to never judge anyone on their politics and I try to do the same in programming language wars.
Lisp didn’t fail. There are several successful modern Lisps, like Clojure.
But, moreover†, there are also successful modern languages that don’t look like Lisps at first glance, but which certainly are Lisps by most definitions—the best example I know of being Elixir.
Rather than a distinct compiler that does codegen to a distinct architecture target, languages like Elixir (and Clojure!) consist of two distinct components:
1. A plain language grammar parser, which parses “syntactic literals” of the language into a Lisp-alike AST. (Lisp had one of these as well, even in its first incarnation—Lisp’s “syntactic literals” were called M-expressions: https://en.m.wikipedia.org/wiki/M-expression)
2. A homoiconic runtime interpreter of this AST, with hygienic macro support, which evaluates macros at runtime with the side-effect of producing a program. This “runtime” is called “compile time”, but it’s really just the same runtime that the compiled program runs in (and both have equal access to the machinery that produces programs.)
In any Lisp, the definition of a function or a module isn’t a special form that the compiler does something with; rather, `defn`-like clauses are just references to an ordinary macro, which the S-expression interpreter invokes when attempting to reduce the AST that has `defn` as its head. The `defn` macro consumes a parameter list and an expression body, and has the side effect of producing a compiled function in some way; and then the `defclass`-alike macro has the side-effect of collecting those compiled functions generated within its scope, and producing a compiled module from them.
And, in any Lisp, the M-expressions (like those of Clojure, or those of Elixir) are a convenient syntax to write code literals in, but you don’t have to use them; you can just as well write “raw AST” as the M-expression representation of the AST’s data-structure literals (since, in a Lisp, these data-structures are always ordinary stdlib data structures like lists or tuples, rather than compiler-specific data-structures); or, equally well, you can generate and build these AST data structures by writing functions to produce them, and then call these functions inside a macro body in place of where you’d write a quoted M-expression literal or a plain S-expression ADT literal. In essence, there’s no difference between the macros that a user of such a language writes, and the macros that define the language; both are just homoiconic AST->AST mapping functions. There are a few that do fancy side-effects involving invoking an object-code compiler... but you can write your own functions that do that as well. (This is why it’s so easy to build a new language [like HN’s runtime Arc language] on top of an existing Lisp—there’s nothing stopping you from writing macros that call out to your own codegen machinery, and at that point you’ve bootstrapped your way out of the original language.)
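The "defn is just a macro" claim can be sketched with a toy S-expression evaluator in Python (everything here is hypothetical and illustrative): the evaluator core knows nothing about function definition; `defn` is just an entry in a macro table whose side effect is registering a callable.

```python
# Toy sketch: the evaluator treats any head found in MACROS as a macro call.
# "defn" is not special to the core; its expander's side effect is producing
# a callable in the environment, exactly as the comment above describes.

ENV = {}      # runtime environment: name -> value/callable
MACROS = {}   # macro table: head symbol -> expander function

def expand_defn(form):
    # ("defn", name, params, body) -> side effect: register a function.
    _, name, params, body = form
    def fn(*args):
        local = dict(ENV)
        local.update(zip(params, args))
        return evaluate(body, local)
    ENV[name] = fn
    return name

MACROS["defn"] = expand_defn

def evaluate(form, env=None):
    env = ENV if env is None else env
    if isinstance(form, (int, float)):
        return form                        # literal
    if isinstance(form, str):
        return env[form]                   # symbol lookup
    head = form[0]
    if head in MACROS:
        return MACROS[head](form)          # macro call, may have side effects
    fn = evaluate(head, env)               # ordinary application
    return fn(*(evaluate(x, env) for x in form[1:]))

ENV["+"] = lambda a, b: a + b
evaluate(["defn", "double", ["x"], ["+", "x", "x"]])
print(evaluate(["double", 21]))   # 42
```

A user-written macro would go into the same `MACROS` table as `defn` does, which is the point: there is no privileged layer separating the language's macros from yours.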
——
† Note that, by this definition, there really isn’t any difference between Clojure and Elixir; I didn’t need to distinguish “actual Lisps” and “languages that act like Lisps.” Clojure and Elixir are both Lisps—both M-expression glosses on top of a “compile time”-runtime S-expression evaluation engine with compilation as a side-effect of certain “compile time”-runtime macro invocations.
Certainly, Elixir has special M-expression syntax for calling various macros (`if`, `for`, etc.), but adding such a thing to Clojure wouldn’t make it any less Clojure.
The real contrast, if there is one, is that Elixir has no homoiconic M-expression syntax for S-expressions (i.e. no equivalent to Lisp’s parentheses), so you can’t just write a low-level S-expression equivalent of an M-expression inline in a function in order to have that AST subtree be part of the function. Instead, you have to define a separate macro to declare your intent to inject a low-level S-expression, and call it at that point in the function body; and you have to write such ASTs in an alternate M-expression form (as tuple data-structure literals.)
This is less of a distinction between the languages as languages, though, and more a distinction between the languages as M-expression syntaxes. You could define an alternate M-expression syntax for Elixir, which did have direct inline S-expression support, and the resulting language—at least in my opinion—would still be Elixir.
Unpopular opinion, but I find Lisp projects really hard to read. Everything has to be read backward, and just because you can nest 10 lambdas doesn't mean that you need to nest 10 lambdas all over the place. So the code you end up with is write-only, nested-over-nested-over-nested, backward-reading code.
Can clean Lisp be written? For sure, and it's magnificent. But in practice, it's a total shit show. Basically, what I'm saying is when it's great, it's amazing, and when it sucks it's the worst... whereas other languages like Java/JavaScript/Python have less variance in code quality and range from "pretty-meh" to "good-enough".
And what's happening these days is the great features of lisp are integrated with "good-enough" languages, and then they become "pretty-good" languages.
I agree. Sure, I can understand lists and parens and all of the operators being out of order and the stupid abbreviated syntaxes and weird names for basic operations and the infinite nesting, but why would I want to?
So four hours later it's buried under the "didn't fail" and "killer app" explanations.
When a question is raised over and over again through the ages, it's because the answer (in this case, usability) is being dismissed every single time.
This is a really loose definition of “failed”. His definition of failed seems to be “didn’t take over the world”... but most things don’t. It’d be like saying “John Smith failed at life because he wasn’t a billionaire by age 25”. As a language it was (and is) used in a lot of places, had a lot of offspring, and brought a lot of influential ideas into the world. That doesn’t seem like a failure!
Considering the number of people who have bemoaned over the years how Lisp never became mainstream, I think it's fair to say that it failed compared to what its enthusiasts wanted.
It's difficult to agree with the idea that VB displaced COBOL. Even if this were the case, it certainly wasn't because COBOL compilers were expensive compared to "a cheap interpreter" that came with the machine. COBOL compilers surely were expensive, but they were also intended for mainframes rather than PCs. Also, I don't remember VB ever coming packaged with Windows for free. VB has been a part of Visual Studio for as long as I can remember, and I don't believe there was a free version of it until around VS2005.
Lisp, it turns out, is a weird and challenging language. It takes a little extra before you start to see the exciting bits. Unfortunately, until you reach that point, it's a matter of struggling with the parts around it - namely the editor. Lisp and Emacs go hand in hand, and while today you can rely on other editors to competently work with Lisp code, 20 years ago I don't think this was the case.
So now a user has to pick up Lisp (challenging when viewed through the lenses of either VB and COBOL) and Emacs (positively baroque if your only experience is within a GUI and IDE). I'd venture to guess that getting a functional Lisp environment running on a 286/386 era PC was probably a challenge in itself for most people.
Now consider VB or Java, significantly more familiar languages that didn't effectively depend (at the time) on the features of their editor or IDE. Both were a mouse-click away as far as installation, and Java didn't cost anything. Both were backed by large organizations that had a lot of incentive to invest and convince people to use their particular thing.
Lisp was this weird, mostly academic, language intended to tackle relatively esoteric rather than business problems, that was Unix only at a time when PCs and Windows were taking off. It doesn't seem unusual to me that it didn't take over the world. Frankly, I'm surprised and thankful that it's still around, actively used, and even sporting a significant community.
> while today you can rely on other editors to competently work with Lisp code, 20 years ago I don't think this was the case.
I've been doing Lisp for just about that many years and have used nothing but Vim. Its Lisp support is built off :set lisp mode which is in the original Unix vi. It does a good job of syntax coloring, indentation and all that.
To reindent a block of code in Vim, I just put the cursor on one of the parentheses, then hit =%. Once in a while there's a stylistic disagreement (like Vim wanting to align the arguments of something as if it were a function, rather than let-like).
My Lisp implementation, TXR Lisp, comes with a comprehensive Vim syntax file.
When you browse the sources using CGIT, all the syntax coloring you see is performed by Vim, being used on-the-fly as a back end for HTML generation.
Yikes. I've been using vi forever and that never occurred to me. I just tried it on my own config and it works just as you say. I hope it is configurable, 'cos it doesn't fully agree with my habits (although maybe that's a fault of my own, and not vi(m)'s!)
Best thing about HN comment threads is that these gems keep happening.
I used that on C code before Lisp: jump from { to } or vice versa, while reindenting.
Whether you get this alignment:

```
(oper args
  blah blah)
```

or this one:

```
(fun arg arg
     arg arg)
```

depends on one very simple configuration: the lispwords variable. All the identifiers listed in lispwords are given the former treatment.
Then there are minor squabbles like what happens with the t clause in cond.
A lot of the time I just use visual select and =.
Also, in Vim, when you reindent multiple lines, it basically assumes that the first one is indented right; it will not move it relative to its predecessors.
You often have to hit == on the first line to get it into alignment, and then %= (or whatever) to reindent the range below it. E.g.:
```
(let ((x y))
(blah          ;; here we type ==
blah blah
(foo bar
boo))
```

```
(let ((x y))
  (blah        ;; now %=
blah blah
(foo bar
boo))
```

```
(let ((x y))
  (blah
    blah blah
    (foo bar
         boo))
```
I have a lot of gripes with the first part of this article regarding old languages that "failed" (which I detail below), but what follows is a pretty reasonable assessment of why Lisp never really "clicked" as an enterprise language. My opinion on the subject overlaps, but I would say that the biggest issue is not programmer understanding (it is easy to write unreadable code in any language), but problems with the language from a business owner or manager's perspective. Lisp programmers are less replaceable, and the trade off between the benefits in power of Lisp compared to what large businesses lose in terms of ability to replace employees quickly was not worth it to them and I honestly couldn't say whether their judgement was good or bad in regards to this.
Starting at the top:
> Other languages of similar heritage (to Lisp) are still widely used.
Which ones? Unless I'm very mistaken, there were virtually no languages which fit into a similar role as Lisps for at least 1-2 decades after Lisp (1958) came onto the scene. The only language I can think of that would accurately match this description is Prolog (1972), because although Basic (1964) was also a dynamic, symbolic language, it was really meant to be an easier alternative to Fortran II with instant feedback and simpler syntax for people with little to no computer science or math background.
> Some of the above languages are no longer quite as popular as they once were. This will be the definition of "failure" we will use here. The question is why did they fail? The first stand-out is COBOL.
COBOL failed?!? It is possibly one of the most successful languages ever. From a 2018 Reuters report on active COBOL use that continues today (and was even more prevalent in 2009 when this article was written):
- 43% of banking systems
- 80% of in-person transactions
- 95% of ATM swipes
- 220 Billion LOC
> as time progressed fixed sized arrays as data structures became obsolete
According to who? Vast amounts of numerical computation software from weather and climate models to modern AI algorithms make use of fixed length arrays.
> The ALGOL language family succeeded. The reason for this is that these were the languages written by programmers for programmers.
There is no doubt that the Algol (C, really) family of languages have absolutely dominated, but whether that was because they were well designed seems a questionable notion. K+R were essentially looking for a portable assembler with which to rewrite Unix, and many would argue that C's popularity stemmed primarily from its use in Unixes, which spread to all mainstream operating systems.
> active COBOL use that continues today [...] 220 Billion LOC
That 220b LOC is because most newly-written COBOL programs today are copied and pasted from some other large existing program, then some small parts of it changed. I know in one large corp I once worked at, a manager ordered that instead of a new code being created for a new customer in the program suite as was usually done, every program in the suite was cloned and the new customer's name was hard-coded into the strings in the program text. That manager got the new system up and running in record time and was well regarded by his peers. The maintenance programmers and computer operators got some extra job security in the years ahead too.
Most of my day-to-day work is in Python, but I have a number of projects in Racket and have been dabbling with Common Lisp. I think the author is simply incorrect here. Other Algol-like languages offer the illusion of understanding, but then you get things like 'patterns', where even though in principle you know what the atomic elements of the language mean, you are confronted with an undocumented 'pattern' that you have to puzzle out to understand what the code is actually trying to do. Lisp provides structure for these patterns in the form of DSLs, and if that means someone has to confront that they don't know something, at least it whacks them in the face rather than giving them the illusion of understanding. More importantly, Lisps provide a way to formalize, specify, check, and document the pattern.
So if lisp failed for some internal reason then I don't think it was for the reason the author specifies. What should worry people more is that a language that is better in 99% of cases can fail for reasons that have nothing to do with its technical merits, but purely by accident of history or as a result of politics unrelated to those technical merits.
Trilobites were the most diverse clade in the history of the earth. Trying to rationalize why they went extinct by studying their anatomy is going to be fruitless. Same with LISP.
For me Lisp was a lot of fun to learn and play with, but I dont think it had a chance to survive in modern production environment because:
1. Not typed
2. Absence of a proper package mechanism and package managers like cargo or npm
I was just thinking that far from "not surviving", one of Common Lisp's great advantages is that it's difficult to see how it could die. The free implementations have achieved a remarkable level of maturity and polish, particularly SBCL.
Your first point there seems weird, given the growing market penetration of Python.
CL has some type-checking features in some of its compilers, but this doesn't mean it is a typed language.
Quicklisp is still in beta and is in no way even close to package managers like cargo or npm.
Are there examples of languages that were not popular for a long while before suddenly going mainstream (the definition of success from the article)? I feel like even a potential killer app is not enough, there were some interesting web frameworks for smalltalk (like seaside) but it wasn't enough to shake in any way the feeling that the language was a dead end for most people (and they can just wait for a reimplementation in a popular or new hyped language if it's really a good idea).
Phoenix/Ecto for example would probably not get the attention it gets (and deserves) if it focused on Erlang instead of a new language with some momentum. And Lisp seems to get it even worse when all of them (old and new, with as many programming paradigms as there can be) get grouped together as one 60 year old language family. Hopefully clojure, racket and other newcomers can turn this around.
Yes, I was thinking about that case. While Ruby was comparatively still a modern language at that point, Rails used the metaprogramming of the language to effectively create a new unique language on top of it with its opinionated design. It might even be a point in favor of Lisp, since even in an older language like Common Lisp it is still possible to build a new language on top that feels modern, maybe some kind of tidyverse environment for handling the next AI wave paradigm, whatever it may be.
It was used in teaching (I used it as such, at the time), and by hobbyists.
VB allowed creating whole business systems in conjunction with Microsoft applications. Which to my recollection was quite a step beyond. Earlier business office products might have been based on WP or Lotus macros, possibly DBase. VB unified much of that.
> The reason Lisp failed was that it was too successful at what it was designed for. Lisp, alone amongst the early languages was flexible enough that the language itself could be remade into whatever the user required. [...] However, the process causes Balkanization. It results in many sub-languages all slightly different. This is the true reason why Lisp code is unreadable to others. In most other languages it is relatively simple to work out what a given line of code does. Lisp, with its extreme expressibility, causes problems as a given symbol could be a variable, function or operator, and a large amount of code may need to be read to find out which.
And I perfectly agree. You can see it in tutorials too. You can even see it in any Lisp-lover comments showing you a proud snippet of how "it can be solved better with lisp": it's too clever.
A little bit of clever is fun, it's good for the soul, it can even be productive. But Lisp is just a big pile of cleverness.
Being clever is so appealing to my geek nature. It's so cool. Yet experience (in coding and life) taught me it is not a good property to build a community around. Or a project for that matter.
At best, it's something that can emerge from a battle against long and arduous stream of problems. And then you can rest, you try to factor away the cleverness. But not as a goal. Not as a basic proposition.
Do you have examples of otherwise well regarded Common Lisp software that has the criteria you state?
People have used Lisp cleverly, but most modern libraries are a bunch of functions and classes. Sometimes more foreign features are used, like macros or even reader macros, but they do so to help one express him/herself. Even the open source Common Lisp compilers, written by arguably the lispiest of Lispers, don’t have a lot of “cleverness”. Check out SBCL, CCL, or ECL.
Some of the prolific library writers—Weitz, Fukamachi, Hafner (aka Shinmera), Strandh, Beane, etc.—all produce good libraries and good code, and aren’t at all as you describe. Check out Hunchentoot (a web server), Clasp (a new CL compiler), CL-PPCRE (regex library), Radiance (a web framework), Quicklisp (package manager), ... These are all easy to jump into and relatively easy to understand if you know the domain.
Even if you go back and look at the Symbolics sources, there really isn’t a lot of cleverness.
If the complaint is that some lone wolf programmers write obscure code, then I’m not sure what’s to worry about. Somebody’s lone wolf code isn’t what you’re going to be importing or depending on.
I hear this opinion you state somewhat often by drive-by comments, but I never see examples, just anecdotes.
I think Forth has a similar issue: despite the attempt to ANSI it, Forth is a family of idiosyncratic individuals. Chuck Moore even says that standardizing Forth is kind of doing it wrong, you're supposed to custom build your own, for each application even. (Like if a Jedi knight built a lightsaber for each battle. (^_^) )
So on the one hand, Forth "failed", while on the other it's still used a lot.
(This article is full of errors; the author doesn't seem to know much about FORTRAN, BASIC, or LISP.)
Paul Graham wrote an article about this in 2001, "What Made Lisp Different": http://www.paulgraham.com/diff.html. He lists nine features: conditionals, first-class functions (though not, at first, closures), recursion, dynamic typing (and what I called the object-graph memory model in http://canonical.org/~kragen/memory-models/), garbage collection, no distinction between functions and expressions, a symbol type, a notation for code using trees of symbols (and thus the ability to add macros), and the whole language always available (no strong distinction between compile-time and runtime, dramatically simplifying macros and other forms of metaprogramming).
As Paul points out, these features got adopted by other languages gradually over time, but in 1960 or 1970 or 1980 or even 1990, if you needed garbage collection and dynamic typing, or to pass around functions as values, or to do metaprogramming, your non-Lisp options were very limited. Prolog or Smalltalk might be a possibility, but usually they weren't. So Lisp was extremely popular. It was really the only reasonable candidate for an embedded scripting language in the 1980s, so that's what Emacs and AutoCAD used.
By contrast, consider the currently-popular crop of languages: Java, C, Python, C++, C#, VB.NET, JS, PHP, SQL, Objective-C, Ruby, assembly, Swift, Matlab, and Groovy, say. Let's omit SQL and assembly from what follows. All of them have conditionals; all of them have first-class functions (though in C, closures are a nonstandard GNU extension); all of them have recursion; all except C have some form of dynamic typing, and half of them are purely dynamically typed (except C, C++, Objective-C, C#, Java, VB.NET, and Swift), and even more of them use the Lisp object-graph memory model; all of them are garbage-collected (except C, C++, and sometimes Objective-C); many of them have a symbol or "atom" type (Python has intern(), Ruby has symbols, Objective-C has SEL, Swift has Selector, and JS just acquired Symbols in ES6); and most of them support Turing-complete metaprogramming in one way or another: templates in C++, "eval" in Python and JS and PHP and Ruby and Matlab, "Eval" in Groovy, and loading dynamically generated bytecode with a fresh ClassLoader in Java.
Metaprogramming merits special attention here; fully a third of Paul's items (symbols, representing source code as a tree of symbols, and the lack of compile-time–run-time distinction) are about metaprogramming, and those are the items that are not widespread today. The main use of metaprogramming is implementing embedded domain-specific languages, which you could reasonably argue is the most important part of the Lisp approach. (Certainly the article claims that it's what sunk Lisp.) But there are ways to implement EDSLs other than compile-time code evaluation to modify your source code while represented as a tree of symbols, and indeed the immense difficulty experienced in solving the hygienic macro problem in Scheme (getting to Macros That Work) suggests that it may not even be the best way. You can get a long way by using reflection instead of macros, and in Python you can override __dunder__ methods, implement iterators, and write metaclasses; in Java, in addition to firing up OW2 Asm and generating new classes, you have @annotations; in Ruby and Objective-C, you have method_missing and -doesNotUnderstand:; in object-oriented languages in general, you have virtual method dispatch (including but not limited to the Interpreter pattern); and in Ruby you have block arguments, and the ES6 => syntax is lightweight enough to be used in the same way. (I don't know several of these languages well enough to comment on their metaprogramming facilities.)
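To make the reflection-instead-of-macros point concrete, here is a minimal sketch in Python of an embedded DSL built purely from __dunder__ operator overloading; `Field` and `where` are hypothetical names invented for this example, not from any real library.

```python
class Field:
    """Symbolic column: comparisons build predicates instead of evaluating."""
    def __init__(self, name):
        self.name = name

    def __gt__(self, other):
        # field > value  ->  a predicate closure over `other`
        return lambda row: row[self.name] > other

def where(rows, *predicates):
    """Keep only the rows that satisfy every predicate."""
    return [r for r in rows if all(p(r) for p in predicates)]

age = Field("age")
rows = [{"age": 31}, {"age": 17}]
adults = where(rows, age > 18)   # reads like a tiny query language
print(adults)                    # → [{'age': 31}]
```

No source-tree rewriting happens here; `age > 18` is ordinary runtime dispatch that merely records a computation, which is exactly the reflection-based EDSL style described above.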
So the real story is that most of Lisp's features went mainstream, and every popular language has them, so they are no longer a reason to choose Lisp stricto sensu. They do differ in how to implement metaprogramming, Lisp's most radical feature, as did Lisp — fexprs are nowhere to be found in Common Lisp (or in McCarthy's 1959 Ur-Lisp), and Scheme hygienic macros are another game again, one which also doesn't provide an S-expression API to the macro-writer.
(The expression–statement dichotomy is an exception here. It's true that Lisp doesn't have it and most modern languages do. I think this is an example of the tradeoff between error detection and succinctness I described in http://www.paulgraham.com/redund.html — the expression–statement dichotomy improves the reporting of parsing errors considerably, and the compensating expansion of your code is almost insignificant.)
FigBug argues in https://news.ycombinator.com/item?id=20375596 that Lisp stricto sensu failed because it had no killer app (other than, I suppose, Emacs and AutoCAD), because most developers don't pick a language, but are rather constrained to use the language demanded by their environment: JS in the browser, C for Unix, Java for Android. But that just poses the question of why Android uses Java instead of a purer Lisp, why the browser uses JS instead of a purer Lisp, and so on. It just reduces the adoption decision to a smaller group of programmers.
There were a couple of other historically contingent things that happened, which don't have anything to do with the merits of the languages as such: around 1988 the AI Winter and the workstation revolution wiped out the Lisp companies; around 1995 the internet went mainstream and for a while all the interesting development was in Perl 5, partly because of its Lispy qualities but also because its attitude toward Unix was the extreme opposite of Lisp's; and the microcomputer world developed its own programming traditions, despite the noble efforts of magazines like BYTE to bridge the gap. Presumably something similar is happening right now in Shenzhen.
> the currently-popular crop of languages: Java, C, Python, C++, C#, VB.NET, JS, PHP, SQL, Objective-C, Ruby, assembly, Swift, Matlab, and Groovy, say
Did you get this list of 15 languages from TIOBE's July 2019 top 15 rankings at https://www.tiobe.com/tiobe-index/ ? It also says Apache Groovy has risen from #81 in July 2018 to #15 today. I do believe the #81 ranking but the rise to #15 only 12 months later is ludicrous. Someone's obviously spamming a search engine to get that ranking up. I know someone does the same thing with Groovy downloads from Bintray.
Now if Groovy's ranking has been fabricated over the past year, then surely some others of those languages have also been similarly fabricated for much longer, and their popularity has been exaggerated.
> Did you get this list of 15 languages from TIOBE
Yes, I thought it would be better if it reflected TIOBE's biases instead of my own. However, I did succumb to the temptation of including the first 15 instead of the first 10, because leaving out Ruby at #11 seemed too extreme.
> Groovy … ludicrous
I was surprised to see Groovy on the list myself, because I thought it was pretty much dead. It would have displaced Golang, which is notable for its lack of metaprogramming and its departure from the Lisp memory model into something much more C-like than Java, C#, PHP, Python, or Ruby.
> surely some others of those languages have also been similarly fabricated
Yeah, likely. There's a whole SEO industry of liars trying to fake buzz. I think they're broadly correct, though.
> Groovy has risen from #81 in July 2018 to #15 today
The rise to #15 is probably not correct, but #81 is probably as inaccurate in the other direction as #15 is today, given how ubiquitous Gradle has become and that Groovy is the basis of it. Especially when you consider the nature of TIOBE: it is about search results, so even people migrating away from Groovy will generate "groovy" traffic as they try to figure out how to do equivalent things in, say, Kotlin.
> 81 is probably as inaccurate the other way as 15 is today
I can understand both how and why someone would push Apache Groovy's ranking higher up the Tiobe results, but I wouldn't know how you could push it down, let alone why anyone would. #81 is probably as accurate nowadays as the mid-40's has been for most of Groovy's life on Tiobe since 2006. Groovy's seen a significant drop in use (outside Gradle) over the past few years.
RC is part of the C++ standard library (although optional), strings and vectors might use COW, and since C++11 there is a tracing-GC API. Windows COM/UWP relies on RC, and Unreal uses a C++ tracing GC for its Blueprint integration components.
Typically what ends up in wide use is historical in nature. C became dominant because it closely maps to the hardware that was available when personal computing went mainstream. A whole generation of programmers learned it, and that influenced language design for decades to come. When everybody is used to doing things a certain way, it can take a long time for new ideas to gain popularity. People often get set in their ways and have a hard time adapting to new ideas.
A lot of the popular languages are fungible because they come from the same family. For example, Ruby, Perl and Python are all extremely similar in nature and offer no fundamental advantages over each other. Then we have system languages like C, C++, and Rust which offer better control over resource utilization.
As a long-term trend, what's going to drive language popularity is how well a language is suited to tackle popular problems. Functional languages are a perfect example of this: they have been around for many years, but they simply didn't fit well with the available hardware.
Back in the 70s you had single-core CPUs coupled with very limited memory. Things like GC were simply not an option, and you needed languages that were good at squeezing out every bit of performance possible. Imperative languages addressed this problem extremely well.
However, today we have different kinds of problems. In many cases raw performance is not the most important goal. We want to have things like reliability, scalability, and maintainability for large code bases. We want to be able to take advantage of multicore processors and network clusters.
The imperative paradigm is a very bad fit for these problems because it makes it very difficult to localize change. The functional paradigm, on the other hand, is a great fit thanks to its focus on immutability and composition. Today we're seeing a resurgence of interest in functional programming, and Clojure is a perfect example of a Lisp that's quite successful.
I learned Lisp in the 80’s and 90’s because AutoCAD used it as its internal scripting language. It was fun to play around with, and I wrote code for a paying project to do election redistricting after the 1990 census. I’m out of that world now and have no idea what AutoCAD uses these days for scripting.
- In the past, it was better at abstraction, but slow and niche and you needed to shell out for hardware.
- In the present it is not better. Common Lisp is about on a par with Python in what it has built in, inferior in its ecosystem, and for some modern stuff (threads, sockets) you will have to go outside the language standard into the wild west of implementation extensions or third party libraries.
Lisp's one big selling point is macros. Macros are magic that lets you reprogram the language. Reprogrammed languages break the ability of a programmer to drop in and read the code. Languages that need reprogramming to get basic shit done are unfinished or academic (Scheme). Languages that can get stuff done, but tempt you to reprogram them are attractive nuisances (Common Lisp, Ruby). In use, they create incompatible tangles that don't mix well with other people's incompatible tangles. I have been bitten by this repeatedly in Ruby. But Ruby is still easier to get work done in than Common Lisp.
Basically it died of being meh with a side of footgun.
For reference, I'm not holding up Python as a good language, I'm holding it up as a comparable one. Python is warty and creaky, but more mainstream than CL, so it's a fair comparison.
>Python is warty and creaky, but more mainstream than CL, so it's a fair comparison.
How can it be a fair comparison, when Python isn't even a real functional language? (lambdas have been intentionally crippled, there is a separation between statements and functions, not everything is an expression, and so on...)
As a Common Lisper (among others) interested in language design, I've just started reading up on Go, after I've been told it's "simple, robust, safe, well-designed".
Purely subjectively, a lot of what I'm seeing in Go so far looks to me like a "2nd system design" for C programmers. I'm hoping that the concurrency model is where it shines through with some good new ideas and implementations, because I've been relatively unimpressed with the rest of it so far. That's not to say it's bad per se, but I've seen nothing in it that makes it stand out from other languages design-wise, and it feels rather warty to me so far...
> looks to me like a "2nd system design" for C programmers
That seems fairly accurate.
> I'm hoping that the concurrency model is where it shines
I'd say that channels aren't quite the magic thing they are made out to be and goroutines aren't that amazing. The real win of go for me is three things:
1. Evented runtime with a synchronous interface. "Go is my favorite epoll library"
2. A standard lib that has great http support (things like keepalive and connection pooling and http2)
3. A build and test toolchain that makes coverage and race detection and memory and CPU and lock contention profiling trivial.
Go was designed by strong programmers as a “good enough” language aimed at the combination of two goals:
1.) Letting good programmers get things done without feeling too many constraints, and without hating the output of #2.
2.) Enabling decent programmers to understand the output of #1, while still getting things done themselves.
It is an extremely practical language ‘project’, in that it doesn’t pretend that all programmers are great, or average, and it tries to help the two subsets work together.
Nothing about go the language is objectively good. The language design is spartan and procedural. The type system is a train wreck. The headline "goroutines" usually have to give way to the arcane arts of locks and mutexes. Google has poured hundreds of millions of dollars into the language and compiler. The ecosystem is objectively good, but the language is not.
For a point of comparison, let's look at StandardML as a language that I believe is objectively good, but with an ecosystem that is objectively not good. The type system is amazing. The syntax is as simple as go (designed to be easy for students to implement), but more powerful. Unlike "goroutines", CML (concurrent extensions to SML) are sound and actually work. Most of the headliner features debated in go 2 are already present (with ones like parametric polymorphism having extremely elegant solutions baked in). The compilers (eg, polyML or mlton) even spit out extremely fast code.
Without a Google-level backer, SML just sits there kept alive by a small group of academics who work on the parts that interest them and their research. If even a fraction of Google's resources dedicated to Go were put into the SML ecosystem, golang would have never come to be.
My point is that why a language succeeds or fails is seldom related to the actual merits of the language itself. Almost every one of the most popular languages has one or more corporate backers with deep pockets who made it what it was then maintained it going forward. Consider that Rust at its core is a complex, C-like syntax on top of SML, but had a company willing to pour millions into developing the language.
Lisp lost what backers it had due to the AI winter killing them off. Nobody else stepped up, so the language languishes.
> It's the other way around. Lisp macros allow to eliminate boilerplate and produce clearer code. This is obvious to any lisper.
I don't have any particular experience with Lisp macros, but my experience with the closest equivalent in every language I've experienced them in is very different from your experience. For the person who wrote the system, it's a clean and expressive way that eliminates boilerplate. For everybody else, though, it's an underdocumented mess of magic incantations with some mental notes of "don't try to combine these things together, it will blow up," and if you ever have to debug a problem in the macro itself, you're going to find it to be a royal pain.
>I don't have any particular experience with Lisp macros, but my experience with the closest equivalent in every language I've experienced them in is very different from your experience.
Yes, because the way metaprogramming is achieved in other languages is very different than the way it is done in Lisp, thanks to s-expressions.
>and if you ever have to debug a problem in the macro itself, you're going to find it to be a royal pain.
Yes, because you're not dealing with Lisp macros.
If I'm looking at Lisp source code and there's a macro invocation, and I want to see what the macro is going to do to the code, I just press a key and can immediately visualize, in place, how the source code gets transformed.
A big part of the standard library of Common Lisp itself is implemented as macros too, it's no big deal. Dealing with others' macros has no particular difficulty in Lisp; or in any case, no additional difficulty to dealing with others' functions.
From what I understand, the Rust procedural macro system should basically be similar in structure to Lisp macros, much closer than any other language I've used. It still has all of the aforementioned problems, in my experience.
Racket (which admittedly is a particular Lisp) has a very useful tool that lets you debug macro expansion by stepping through each code transformation. This tool is even built into DrRacket, so it doesn’t need any complex setup.
I’ve written both procedural Rust macros and Racket. Racket is far easier to debug and write macros compared to Rust despite their macros having similar features (in particular source locations and doing macro evaluation as a compilation step rather than at runtime).
Macros are just another form of abstraction, like functions. And like functions, they can be poorly named or poorly designed. Somehow the possibility of suboptimal use is frequently paraded around as an argument against macros, but seldom heard as an objection to functions.
(Also, macro systems like Racket's syntax-parse facilitate the creation of macros that fail gracefully with nice error messages.)
When the problem domain needs a novel control structure and I can't write a macro, I have to expand the would-be macro in my head and write out the boilerplate, and then you have to read that boilerplate and infer what I meant.
> about 20x+ faster than Python, too; speed in the same order of magnitude than C.
I think this speaks exactly to the parent's point: yes, the performance can be great, but you need to tailor the code for the hardware, which is niche, and the beautiful abstraction might no longer hold.
>but you need to tailor the code for the hardware, which is niche
Not really.
You just need to declare the data types (Common Lisp supports type declarations) and tell the compiler you want the code to be fast, which reads just like that:
(declare (optimize (speed 3)))
You also need to use the same tactics you would use in C to gain speed: Use arrays instead of lists, use threads. All of this is easily doable in Lisp and IMO while it takes more effort than writing typical Lisp code, it is easier than using C and the end result is more elegant and debuggable.
This really depends on your view, but the moment you step outside of lists and use arrays, you are keeping modern CPU architecture in mind. And all of a sudden all those wonderful car/cdr/cons concepts, as well as the things built upon them, no longer hold, which (again depending on your view) actually harms the abstractions in Lisp, and we are back on the same page as all the other programming languages.
Also, when you are promoting something, it might not be a good idea to compare against Python when it comes to speed but take only C as the target when talking about maintainability.
Maybe not quite as big of a selling point, but lisp has also always been the best environment for writing programs that need to modify themselves at runtime. Homoiconicity removes the limitation that code must be known and constructed at compile-time.
Yeah, it is definitely niche.
However it is super cool to be able to do optimizations like constant propagation and loop unrolling using information that would never be available at compile time.
For instance if you have a function that evaluates an argument and branches based on the value, and if for some other reason your program knows that after a certain condition the argument will always have the same value, it can just optimize away that branch while the program runs.
Now that is kind of contrived, but one real world example I ran into happened when I was writing a program to render 2d plots of complex functions.
If you hardcode the equation of the function into the source code, the compiler can optimize it (important, since this function gets called for every pixel in the image being rendered) but the program is limited to rendering that one function.
If you let the user enter that function at runtime in the form of a text string, your program has to essentially act as an interpreter by creating an AST from the string then walking the AST repeatedly for every single pixel, so obviously much more costly.
The ideal solution is to have an environment where code can be seamlessly compiled at runtime and executed without extra ceremony or semantics that would be out of place in compile-time known code.
In C you would have to attach a JIT to your program and manually write to an executable page etc, whereas in lisp the system handles all of that for you.
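The same idea can be sketched outside Lisp: in Python, for instance, the built-in `compile` and `eval` let you pay the parsing cost once for a runtime-supplied formula. The formula string below is a stand-in for user input; this is an illustration of the technique, not the plot-rendering program described above.

```python
formula = "x * x + 1"   # stand-in for a string the user typed at runtime

# Interpreter-style: re-evaluate the source string on every call (slow,
# because parsing happens per pixel).
def interpreted(x):
    return eval(formula, {"x": x})

# Runtime-compilation style: compile once, then call ordinary bytecode.
compiled = eval(compile(f"lambda x: {formula}", "<user>", "eval"))

assert interpreted(3) == compiled(3) == 10
```

Lisp's advantage is that this "compile at runtime" path produces code of the same quality as compile-time code, with no extra ceremony, whereas Python's `eval` is still bytecode and C would need an embedded JIT.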
Another example happens with config files:
Config-file data is used for conditional branching, but since it only becomes known at runtime, your program ends up with a lot of branches that would get optimized away if the config info had been known at compile time.
You could make all the config data compile-time known by having something like a massive config.h full of #defines, but this is obviously not user-friendly at all and makes it necessary to recompile every time you want to change a setting.
If your code is able to introspect and optimize itself based on data that only becomes available at runtime, this tradeoff is eliminated.
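A hedged sketch of that tradeoff in Python: short of full runtime recompilation, a closure can specialize the hot path once at startup so the per-call code carries no config branches. All names here are illustrative.

```python
config = {"log": False, "scale": 2}   # pretend this was parsed at startup

def make_process(cfg):
    """Test the config once, here, and return a branch-free worker."""
    scale = cfg["scale"]
    if cfg["log"]:
        def process(x):
            print("processing", x)
            return x * scale
    else:
        def process(x):
            return x * scale
    return process

process = make_process(config)
print(process(21))   # → 42, with no config test on the call path
```

This is a weaker form of the self-optimizing-program idea: the branch is hoisted out once, while a Lisp system could go further and recompile `process` whenever the config changes.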
It is inaccurate to call things like threading a Wild West of implementation-defined stuff. It is true these things aren’t de jure standard, but most important stuff has de facto standard portability libraries in Common Lisp. For threading, use BORDEAUX-THREADS. For filesystem operations, use UIOP. For module definitions, use ASDF.
Elephant in the room: the hostility of the LISP community. Is there any language community with quite the reputation for unwelcoming unpleasantness, combined with the opposite - no counterbalancing reputation for welcoming friendliness?
That people feel bad when they try to enter LISP world is much documented[1]. That Python and, say, Julia, managed to get reputations as welcoming communities long after Eternal September says the problem isn't "everyone" invading Usenet.
I'm expecting immediate downvotes for this comment, but what I really hope for is "yes I have that feeling about LISP world compared to other languages" or "no I feel LISP world is represented approximately the same as other languages".
> "And yes, I think it still represents the comp.lang.lisp group, even in the post-Naggum area. Just look up the latest debate of the shortcomings of Common Lisp that was incited by Steve Yegge. Lisp is as close to a perfect programming language as any, and its zealots guard it rabidly. Criticism isn't tolerated. Ignorance isn't tolerated - which drives many newbies away quite quickly. Lisp has been attacked so much before, that any post by a newbie meekly seeking an advice on some arcane feature gets immediately interpreted as criticism and turns into a flame war. True, Erik Naggum was the main cause of such flame wars in the past, but he wasn't the only one. Wake up, Common Lisp gurus - your language is so unpopular because of you - there is no one else to blame. If you ever asked yourself why is CL much less popular than the obviously "brain dead" Perl (paraphrasing Naggum's favorite Perl quote), the answer is the community, that is you."
That "Hexstream" guy is a known mentally ill person who harasses other Lisp community members in various ways (github, twitter, his personal website, exposing other lispers' personal information, etc.)
He is alone and doesn't represent the community in any way.
>Elephant in the room: the hostility of the LISP community. Is there any language community with quite the reputation for unwelcoming unpleasantness, combined with the opposite - no counterbalancing reputation for welcoming friendliness?
I entered the Common Lisp community 2.5 years ago and found nothing else than good, helpful, friendly people.
Yes, they (we) don't tolerate stubborn stupidity, but if newcomers are truly interested they are treated very well.
Languages that have achieved any degree of success, as Lisp did, never fail for just one reason, or two, or three. Lisp failed for many reasons; some are listed in the other comments. The article suggests one, and discounts others, but all contributed.
The hardest lesson to learn seems to be that GC is poison. Other languages that have avoided Lisp's other faults succumb to that one, and lock themselves out of the most demanding problems.
When embarking on a project, you do not often know everywhere it will take you. If you start with a language that won't go some places, your project can't go there. If it needs to go there to succeed, you have doomed it at the outset.
How many times do you need the lesson? Some need it more than others. Many never learn it.
I wouldn't call GC "poison", but in a lot of domains it is a severe limitation. I don't see any GC language unseating C/C++ very soon, and perhaps never.
Clearly GC has achieved a ton. However, I am starting to wonder whether it will hang on to that success, or whether new language designs that don't need it will gain a lot of ground back.
GC, in a way, seems like a shortcut to avoid reasoning about lifetimes, and sometimes taking that shortcut comes back to bite. If it becomes easy enough to get help from the compiler, maybe we just don't need GC any more?
Obviously far-fetched, but interesting to think about.
Yet the C++ community has adopted RC in the standard library, COM/UWP uses RC, and the most successful AAA game-engine middleware uses a C++ tracing GC.
And yes, RC is a GC implementation algorithm as per CS reference papers and notable books.
C might never come close to any form of automatic memory management, but in a couple of decades it will be relegated to surviving UNIX clone deployments.
GC is a way to let the computer do something so we don't have to think about it.
This is a GOOD THING. It's what using computers properly is all about. The idea that there's some sort of virtue in doing this reasoning manually just baffles me. It's like objecting to compilers because it's better if human string together the opcodes.
Rust either restricts what you can do with pointers that don't own their referents, or it opens itself to use-after-free bugs. A system with a proper GC would not have this problem.
>When embarking on a project, you do not often know everywhere it will take you. If you start with a language that won't go some places, your project can't go there. If it needs to go there to succeed, you have doomed it at the outset.
Yes. That's why an extremely flexible language, like Lisp, is an asset in those cases.
>The hardest lesson to learn seems to be that GC is poison.
Yes, the poison that kills a wide spectrum of memory bugs. And makes difficult data structures reliable.
Lisp failed because its community failed to accommodate the beginner trying to do something useful.
Lisp has a zillion useful ways of doing things--that nobody ever taught.
Try finding a beginner book on Lisp (even today!) that actually teaches about vectors and hash tables instead of trying to do everything with recursion, lambda and singly-linked lists.
You don't attract a beginner trying to write a game, for example, that way.
> Try finding a beginner book on Lisp (even today!) that actually teaches about vectors and hash tables instead of trying to do everything with recursion, lambda and singly-linked lists.
"Practical Common Lisp"--Copyright 2012--gets it right and deserves to be called out for that. Hashes and Vectors appear BEFORE lists in the book. They don't get the emphasis they deserve, but they are on par with the list coverage.
"COMMON LISP: A Gentle Introduction to Symbolic Computation, from 1990"--Um, wow, a whopping 17 PAGES out of 500+ (5 for hash tables) on nominally the two most useful data structures in all computer programming. (Side note: I had an actual dead tree edition of Touretzky from CMU in 1986?--I don't remember hashes in it--I think that was an add-on later). And, even 1990 is after Lisp starts getting displaced (Tcl, Perl, Python, etc. are all coming online).
The Perl4 Camel Book cookbook section was practically an existence proof for "Show people how to do useful things in your language and they will flock to you."
In addition, "Hash all the things!" was practically Perl's motto, and it wiped the floor with everything else for a VERY long time (something like 1994-2005--for 10+ years Perl was dominant).
> "Practical Common Lisp"--Copyright 2012--gets it right and deserves to be called out for that. Hashes and Vectors appear BEFORE lists in the book
Did you read the book?
The book is from 2005. Lists are introduced in chapter 3 with a practical example, a simple database.
The Touretzky edition I mentioned is from 1990.
Actually, lists are the fundamental data type in Lisp (the name is an abbreviation of 'LISt Processor'), which sets it apart from many other programming languages and is the reason for its existence. The basic idea of Lisp is that one can do many practical things with lists.
Let's look further in your claim that it would be hard to find introductory Lisp books with mentioning arrays.
Paul Graham published the book ANSI Common Lisp in 1996. Arrays appear in chapter 4 together with strings (1-d arrays / vectors), sequences (a data type joining lists and vectors), structures (records), and hash tables.
Winston&Horn, 3rd Edition of 1989. Arrays are in chapter 11. But from that book we get a good idea through practical examples that one can do a lot of interesting things with lists.
Another book, the 'Common Lisp Companion' from 1990. Vectors, strings, arrays, sequences, hash-tables appear in chapter 4.
> Perl was dominant
for text processing and early web server scripting...
For example, I can't remember much in NeXTStep or MacOS that made use of Perl (or Lisp) in any crucial way. The OS and its applications were written largely in C, C++, and Objective-C.
Right now I'm using Lisp for a hobby project. I find it actually really enjoyable. Lisp's metaprogramming support is super powerful. I'm the kind of developer who likes to explore different styles of programming. If you're like me, Lisp will not fail you. It really is a "programmable programming language".
For example, Lisp is not purely functional, but if you want to force yourself to program in a bit more of a functional style, you can make a macro that disallows the use of 'setf' (Lisp's assignment operator) in a function body, and it's actually quite easy to do that!
Maybe Lisp 'failed' because it's too flexible. You can solve your problem in a million different ways. Arguably, that is not a good thing in industry where it is preferred to have a language that enforces a single style on your code base for consistency across developers. However, I'd argue that it is very good thing for a creative individual. You want to have the freedom to do things your way.
From what I've experienced so far, Common Lisp offers a unique combination of customizability, utility, and interactivity. It's been around long enough to have accumulated a significant number of libraries, and it has a multi-threaded REPL (so you can run your main loop and hot-reload your code updates at the same time; how cool is that?!).
As for the parentheses... yes, I can see why Lisp code would be hard to read. On the other hand, with the right editor extensions (parinfer) it's really easy to edit, and you don't even need to think about the parenthesis when doing it. The documentation is alright I'd say, you can find plenty of stuff online.
Not sure what else I can say. I'm having a good time with CL... I'll try not turning into a 'smug lisp weenie'?
1. The more I dig into Lisps, the more I find Lisp is not a language. 'Lisp' is more like the term 'C-like'.
Common Lisp, Scheme, and Emacs Lisp are way too different from each other. It's more like comparing C++, JavaScript, and Bash. Common Lisp is hardly a functional programming language. Emacs Lisp is dynamically scoped by default (this really made me freak out). Clojure is a more qualified functional programming language.
2. Lisps, Scheme for example, were superior to other languages decades ago. But now, as you can see, JavaScript is basically a Lisp without macros (yeah, I know HN folks don't like JavaScript, but I chose JavaScript because it's a footgun language like Common Lisp and Emacs Lisp). Or just Elixir, basically a Lisp without the paren syntax.
3. Lisp never dies, there will be more. It will march forward as the industry moves. The Lisp syntax means the ease of parsing and macro design, and people will combine this feature with other features. There are already:
a) Functional Lisps like Clojure
b) Carp, a Lisp without GC, there were also Linear Lisp
c) WASM, an assembly language with an S-expression text format
d) I have already seen other prototypes with static types/dependent types/monadic macro systems etc.
If we're indulging in idle speculation, then here's my contribution.
I think Lisp failed for the same reason that ORMs exist, and to be even more provocative, for the same reason that women are underrepresented in programming: the Apple II.
Yes, yes, of course that isn't "the reason." "The reason", singular, is nonsense, as any outcome is a consequence of many interrelated causal factors. However, often a few or even one dominate, and in my view the Apple II[1] is a dominant factor for these outcomes.
Hear me out.
Evidently, there once was a time when women had a healthier representation within programming, and this roughly coincided both with an efflorescence of different programming models and with a time when "computers" was more a profession for adults. Then the personal computer revolution occurred in the 1970s, which introduced inexpensive, under-powered computers into (upper-)middle-class homes, roughly in tandem with the introduction of video games, first in arcades and then in homes. In my unscientific observation, girls were shoved aside by their brothers, who swiftly moved those video game consoles and computers into their bedrooms, where just as swiftly we set about adapting games and writing our own using the only language offered: BASIC. And so a baby boom of baby programmers occurred. Programming expanded beyond an adult profession and became also a province of adolescent hobbyists, whose first and (for a long time) only view of what it meant to program was: an imperative language, with subroutines, standard control-flow operations, and mutable state. BASIC isn't usually put in the Algol family of languages, but it's a better candidate for a foster child of that family than the never-learned models of before: Prolog, SQL, Forth, Lisp, etc. It wasn't difficult to make the transition from BASIC to C, then to C++, and then to Java (picking up Perl and maybe Python along the way). Without even trying, that generation of boys became a generation of young men primed to program in a non-Lispy way, right around the time when the internet was creating a demand for programmers. What fruit did this yield?
* Boys and then men became overrepresented in programming, relative to decades before.
* The database was held at arm's length, and SQL was tolerated only by creating ORMs that permitted retreating to the comforting familiarity of imperative, mutable state, Algol-like languages.
* Languages like Lisp and Prolog were forgotten about, until being "rediscovered" decades later.
Of course, if any of this is true, it raises other questions:
* Why did the early, cheap computer makers choose BASIC instead of Lisp?
* Why did later, more powerful computer makers choose C?
* Why was there a gender disparity in affinity toward games in kids and adolescents?
I can speculate on some of the reasons for these, but I've indulged in enough for now.
[1] "Apple II" is just a handy placeholder. Substitute Atari, Commodore 64, "personal computers", whatever, if you like.
https://locklessinc.com/articles/