> The official Clojure CLI, for example, is just plain confusing, and that's most people's first impression to the entire ecosystem. The config files, while they use the wonderful `.edn` format, are also not intuitive.
I completely agree; it's unfortunate they don't spend more time officially recommending leiningen.org. A beginner attempting to use the built-in CLI is in for a poor first 3 months.
> While I like the language very well, tooling does have issues, even simple things like not distributing Leiningen with Clojure when it is almost essential for any serious development. Version incompatibilities between Leiningen and Clojure create subtle problems. I also wonder what the experience of using lein on Windows is like; it seems to install using a mysterious batch file.
Clojure has bundled tools.deps since 1.9, and it works great. cljs and figwheel already support it, as do many others. I prefer it to Leiningen, but YMMV.
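For anyone curious, a minimal `deps.edn` really is small; a sketch (the version coordinate is just an example):

```clojure
;; deps.edn: the whole config for a small project
{:deps  {org.clojure/clojure {:mvn/version "1.11.1"}}
 :paths ["src"]}
```

From there, running `clj` in the project directory drops you into a REPL with those deps on the classpath.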
I use emacs to develop clojure, never had any problems with it :/
> Usage of local jars is extremely difficult, you need to setup a local maven repository and then add dependencies through it, which is a very painful process, esp for hobby projects and prototyping. I should be working on my problem not wrangling with maven.
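(Worth noting: with tools.deps a local jar no longer needs a Maven install step; you can point at it directly. A sketch, with a hypothetical path and library name:)

```clojure
;; deps.edn: depend on a jar straight off disk via :local/root
{:deps {mylib/mylib {:local/root "/home/me/jars/mylib.jar"}}}
```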
> Well, people do use these tools, but honestly, do you really want a language with multiple build tools / package managers?
I want multiple workflows. A single build tool can accommodate that, but Leiningen can't/doesn't.
For example, let's say I have a compiler called mycc. I can build something like this:
    mycc main.c -o main
I might then combine it with a file-watcher utility like entr:
    find . -name '*.c' | entr mycc main.c -o main
With Leiningen, it has to do all of that itself. AND you pretty much have to use the auto-rebuilder capability (which means your workflow starts with turning the auto-rebuilder on).
> As far as editors go, I'm perplexed. You can use nearly any editor, why do you perceive the need to use one of these 2?
Then why do nearly all (probably 95% of) Clojure devs use either Emacs or LightTable?
> Also, the maintainers have a super weird aversion to everything that is "too simple". I've seen packages rejected because "the build process was too simple and there are no dependencies". The excuse was that "users can build it themselves". This is not for silly stuff random people wrote on weekend, this is for 20 year old tools used in production written in C.
wtf? What could be the possible motivation for this? Is their build farm really straining or something?
Can you point me to this example?
> I'm also using download links straight from language websites (Rust, Ruby, Python, DotNet, Haskell).
This sounds downright medieval to me. Why even use macOS for development if the software management tools are so bad that that's what you're doing?
> Edit: I meant to note that the reason the specification hasn't been updated, afaik, is because it doesn't have to be updated. All the tools you need to add the features you require are there in the spec. You can make it into the language you need.
That's a big part of the reason I haven't used CL for anything bigger than some toy projects.
It's a crusty, old language, and instead of keeping up with the times, everybody has to build the features they need. I can do that, or I can use a language that has the features I need and that I don't have to assemble myself.
And to make it worse, it's hard to even use libraries that solve the problem, because nobody in the CL community can agree on anything. There's always a dozen incompatible solutions, and it's never clear which one should be used.
> I'm pretty sure that numerous Quicklisp users can't program either, but Quicklisp is manually curated while NPM (AFAIK?) isn't so their code won't end up in Quicklisp.
This interpretation of GP’s comments strains credulity.
As for the rest, I edited and expanded my unnecessarily glib response. The technical ability you mentioned is nice, but I will remain unimpressed until it sees wider use. This is a shortcoming in the CL community, not a shortcoming in the CL language.
I’ll change my opinion once I see evidence of the CL community coming together to value shared, reusable code to accomplish common tasks.
> I think a modern lisp based editor, with concurrency/threading built in. With Racket/Clojure/CL backend with modern controls builtin would be all we need.
Sure, but it's the same as the "replace (La)TeX" problem: part of the value of the ecosystem is the huge collection of packages/extensions.
But if there was a reasonable way of porting packages over to a CL/Scheme new editor base, I agree that would be good.
> Including having to do less configuration and tweaking work.
Sure, but a bad, unconfigurable tool is more work.
> It's ridiculous to expect newcomers to a language and/or programming in general to slog through a 20 or 30-step environment setup process.
It's ridiculous for anyone really. If you're not using something outside of the standard libraries, it's ludicrous to expect someone to slog through the manual setup process when they can double click an IDE icon and be ready to program in under a minute.
Not to mention the fact that a lot of those setup guides are for specific OSes, and the author always seems to leave that part out. So you get to step fifteen and get a weird Makefile error, only to find out that that particular library is for another OS with no alternative for your own. This forces you to restart at square one with more potential added steps (e.g. virtual machine setup, OS setup, etc.).
> as though setting up or using Reason is trivial or well documented
It's both trivial and well documented. There is an installation page and a tutorial on the Reason website. The installation itself is a single npm command.
> For another example, you altogether dodged the issues associated with multiple "standard" libraries.
There is only one standard library: the one shipped with the compiler. It is also significantly more batteries-included now than it used to be.
> It would be great if the OCaml community were as committed to making their language useful for production applications as they were on convincing everyone else that it already was.
The OCaml community as a whole is one of the least vocal on the internet. It does very little outreach. I never see any of the people I consider relevant to the community on HN for example.
No one cares about convincing you that OCaml is ready for production use. It doesn't need to be argued; it can just be shown. OCaml is not an up-and-coming language, it's a 25-year-old one. It powers a lot of Jane Street infrastructure. It is used to develop Coq and Frama-C. It is transpiled to JavaScript at Facebook and Bloomberg. Heck, Rust, which you apparently hold dear, was originally written in it.
> [...] I can't think of a single situation where "understanding several compilers" would have helped me design/maintain/troubleshoot infrastructure I'm responsible for.
Oh, sure, you don't need to understand how ELF binaries work, until you try to do anything non-trivial with them (building a chroot image, anyone?). You also don't need to know how Ruby or Python work with modules, but I'll want to stay away from any system of yours where you happen to install random recently developed software, because it will be a mess.
> But hey, looks like you're not in my target market, and that's ok!
Of course I'm not. What you proposed is a list for novice sysadmins, except it doesn't touch the essence of the craft, focusing instead on shiny bells and whistles of limited applicability that will be obsolete five years from now.
> but it seemed to me that the complaint was about using BoringSSL at all.
From the article: "but it raises the bigger question of why are we compiling BoringSSL in the first place?" (emphasis mine). I think the "compiled" part of that is doing the heavy lifting.
> On Rust-land, I've noticed that some libraries like to have a fallback or compile time config flag to use the vendored version, but still provide a way of using the system's version.
This is probably a good middle ground, but it might be more harmful than helpful compared to simply linking to directions for building it yourself (at the simplest level it's almost always `./configure; make; make install`), because then at least it's obvious what happened and which libs are being used, and you can find them (in /usr/local or wherever).
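The rust-openssl crate is probably the best-known instance of this pattern: by default its build script looks for a system OpenSSL, and an opt-in Cargo feature switches to a bundled copy:

```toml
# Cargo.toml: opt into the vendored (bundled) OpenSSL build;
# drop the feature to link against the system library instead
[dependencies]
openssl = { version = "0.10", features = ["vendored"] }
```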
> I appreciate when both options are available because I've encountered some C libs that were very difficult to set-up on first run and the vendored version usually works out of the box.
Anything that the install process can automate should be easy enough to provide directions and maybe some simple helper scripts for. If it's too complicated for someone to do, how is it able to be done in an automated fashion by a dependent in a way that seems to work for most people? (and if it doesn't work for most people, why even offer it)
> it took me several hours to get llama.cpp working as a server
Mm... Running a llama.cpp server is annoying; which model to use? Is it in the right format? What should I set `ngl` to? However, perhaps it would be fairer and more accurate to say that installing llama.cpp and installing ollama have slightly different effort levels (one taking about 3 minutes to clone and run `make` and the other taking about 20 seconds to download).
Once you have them installed, just typing: `ollama run llama3` is quite convenient, compared to finding the right arguments for the llama.cpp `server`.
Sensible defaults. Installs llama.cpp. Downloads the model for you. Runs the server for you. Nice.
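For contrast, the manual llama.cpp route looks roughly like this; a sketch only, since the model file and flag values are things you have to pick yourself, and the server binary has been renamed across llama.cpp versions:

```shell
# build llama.cpp from source and start its HTTP server by hand
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make
# you supply your own GGUF model; the filename and flag values here are examples
./server -m models/llama-3-8b.Q4_K_M.gguf -c 2048 -ngl 33 --port 8080
```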
> it took me 2 minutes to get ollama working
So, you know, I think it's broadly speaking a fair sentiment, even if it probably isn't quite true.
...
However, when you look at it from that perspective, some things stand out:
- ollama is basically just a wrapper around llama.cpp
- ollama doesn't let you do all the things llama.cpp does
- ollama offers absolutely zero way, or even the hint of a suggestion of a way, to move from using ollama to using llama.cpp if you need anything more.
Here's some interesting questions:
- Why can't I just run llama.cpp's server with the defaults from ollama?
- Why can't I get a simple dump of the 'sensible' defaults from ollama that it uses?
- Why can't I get a simple dump of the GGUF (or whatever) model file ollama uses?
- Why isn't 'a list of sensible defaults' just a github repository with download link and a list of params to use?
- Who's paying for the enormous cost of hosting all those ollama model files and converting them into usable formats?
The project is convenient, and if you need an easy way to get started, absolutely use it.
...but, I guess, I recommend you learn how to use llama.cpp itself at some point, because most free things are only free while someone else is paying for them.
Consider this:
If ollama's free hosted models were no longer free and you had to manually find and download your own model files, would you still use it? Could you still use it?
If not... maybe, don't base your business / anything important around it.
It's a SaaS with an open source client, and you're using the free plan.
2. Make it reasonable to figure out what the "type" of a package dependency is so we can figure out how to use it and/or find its source code
3. Document package definitions. We document source code in typed languages; Nix expression language is untyped and generally less readable--why not document it?
4. Nix tools have a `--help` flag that only ever errors with "cannot find manpage". This is just user-hostile.
5. Using almost-JSON for the derivation syntax, but then providing a "pretty printer" that keeps everything on one line but with a few more space characters.
6. Horrible build output--everything for every build step (megabytes of useless gcc warnings) gets dumped to the screen. Contrast that with Bazel and company which only print errors.
> The author is lazy, and is channeling his energy into complaining rather than learning.
I don't know about that. The author is certainly confusing programming with setting up a development environment. I can understand why they would be frustrated with the latter, since it takes away from what they are trying to do.
When I started out, the latter was fairly simple: computers booted into BASIC. If you needed something more sophisticated, you could buy an IDE where everything would run right off the boot disk or after an automated installation. You can still find those tools, designed to serve everyone from home users to professional developers, but they seem to be much less common.
In some cases, like web development, that makes sense: the target is a standard rather than an operating system or hardware architecture. In other cases it is self-inflicted. When you choose to work with third-party libraries, there is a much higher probability that some work will need to be done to integrate them. Then there is the WTF category.
Take one of the easier cases: Java. Even though the development model is much closer to traditional languages like C++, the most common use case will involve installing an IDE and the language separately (possibly with some tweaking of the environment). That's not too bad, but it's likely more than people want to see when they want to get their tools up and running. Contrast that to C++ on a commercial operating system: you install Visual Studio or Xcode and are ready to focus upon programming for the chosen platform.
Of course, C++ isn't always like that and it comes close to representing the other extreme. Setting up development tools for microcontrollers can be quite the task, particularly if you choose the "wrong" one. That's a good part of the reason why novices like Arduino and Platform IO, it makes going from nothing to a functional IDE fairly straightforward.
Now I'm not going to say that the author is right, but I will admit that they have a point. Presenting Unix-isms in a macOS installer is going to rub some people the wrong way. That should have been addressed in a better manner. On the other hand, they were also attempting to accomplish something that is non-trivial to start with.
> This isn't the kind of a job that can be solved by a simple, beautiful 50 line file.
As a big CMake fan, I don't think CMake's success has anything to do with their super ugly custom language. What CMake does could be a lot more ergonomic, especially from a typical user's standpoint.
Have you taken a look to see if you find what you need?
For a parallel example: at first I was reluctant to develop in Common Lisp due to what appeared to be "too few libraries". Again, it's just an appearance when you compare it to the number of libraries for a more mainstream platform like Java or C++. However, for CL I found all the libraries I needed (and then some); I wouldn't be surprised if for Racket you can find everything you need.