
I think Julia really dropped the ball on the execution model. Just-ahead-of-time compilation ends up being the worst of both worlds - you can't compile a small fast binary for deployment, and you can't quickly run a script or REPL for development. It turns out that this really matters for adoption.

And now Julia has competition from Mojo. Mojo makes some compromises for backward compatibility with the Python world, but it's really solving the problems that hurt AI most. And the folks behind Mojo have a lot of real-world experience migrating a community from one language to another.

I think Julia will remain a niche language, confined to science and statistical computing outside of mainstream data science and machine learning.




> it's really solving the problems that hurt AI most

Isn't this a bit premature? Mojo doesn't tangibly exist for most people (we can't run it ourselves), and I am unaware of any ML/AI applications built with Mojo.


I have tried it. IIRC, the company developed it, or its compiler, specifically for ML/"""AI""".

If they successfully import Python's libraries without a bunch of wrapping, its user base will likely follow.

It was faster than Python and easy to do parallelization, like Go. But it's not quite fully baked yet.


Yeah, definitely early days. But they're saying all the right things: fast code off the bat, easy to vectorize and parallelize, efficient use of memory, easy to create abstractions that execute efficiently on various architectures. The demos are impressive and they have the track record to be credible. I've played with it and the stuff that's been implemented so far works well.

Swift for TensorFlow was such a success.

Oh boy, I remember reading about this for the first time and thinking it was an April Fools' joke.

IMO Mojo isn't competing with Julia. It's competing with Rust. Manual memory management, type annotations on every variable (if you want good performance), and no overloading don't sound to me like a language aimed at the same space as Julia.

Right now, Mojo is as successful as Swift for TensorFlow.

> and you can't quickly run a script or REPL for development.

You know that the entire blogpost you're ostensibly commenting under is about how the JIT overhead has been greatly reduced via more static AOT compilation, right?


And that's good, but not enough. 1.7 seconds just to load libraries is too much.

Good news: 1.10 is already shaping up to be about 2x faster.

So ~0.85"?

Python is not compiled, and it starts and loads pandas (which is comparable to the libraries loaded in the article) in ~0.4" on my computer — and that's a notoriously slow language.
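For reference, here is a rough way to measure that cold-start cost yourself (a sketch; `cold_import_time` is a name I made up, and it uses a stdlib module so it runs anywhere — substitute "pandas" to reproduce the ~0.4" figure, which will vary by machine and disk cache):

```python
import subprocess
import sys
import time

def cold_import_time(module: str) -> float:
    """Time a fresh interpreter start plus one import of `module`."""
    start = time.perf_counter()
    subprocess.run([sys.executable, "-c", f"import {module}"], check=True)
    return time.perf_counter() - start

# Stdlib module so the sketch runs anywhere; try "pandas" on your own machine.
print(f"json: {cold_import_time('json'):.2f}s")
```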

If I were to use e.g. Rust with polars, load time would be virtually none. And when I have to process ~50k different datasets, I can't afford 0.85" per file, which would translate to ~11 hours of overhead.
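A back-of-envelope check of that figure (a sketch; the 0.85" per launch is the assumed startup cost from the comment above):

```python
files = 50_000       # datasets, each processed by a fresh process launch
startup_s = 0.85     # assumed Julia startup + library-load cost per launch
total_hours = files * startup_s / 3600
print(f"{total_hours:.1f} hours of pure startup overhead")  # ≈ 11.8 hours
```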


It's not 0.85 per file. It's 0.85 for the 50k files combined.

No, because the program has to be run by Slurm over several compute nodes, so it can't process them all at once.

As long as you don't have 50k computers, it should still be 0.85 seconds per node, which is tiny.

> it still should be 0.85 seconds per node

No it's not, because I will not architect my whole pipeline and program around Julia's inability to start quickly (maybe a second in a year, 1.7" now). I will just use another language.


You're in good company; that is what Python folks do all the time.

Python does not take 1.7" to load pandas.

> I will just use another language.

Hello C, C++ and Fortran.


> If I were to use e.g. Rust with polars, load time would be virtually none.

Because you're compiling...

And if you need to do the same in Julia, you can also precompile, or use some other method like https://github.com/dmolina/DaemonMode.jl (their demo shows loading a database, with subsequent loads taking roughly ~0.2% of the first one's time)


When is 1.10 expected? Just wondering: a quick Google suggests a lot of the 1.9 improvements were backported from 1.10? Or is that really 2x on top of the current 1.9?

1.10 feature freeze is going to be in the next few weeks. After that it will be a few months, depending on how much is required to fix all the bugs that have probably been introduced. 1.9 mostly doesn't shorten loading times (although weak dependencies end up helping a bit). 1.10 has had a bunch of load-time optimization, which became a lot more obvious once 1.9 got rid of all the stupid stuff. The exact speedups are package-dependent, but 2x is a good estimate: some packages get a lot more, some are about the same.

Thanks. Exciting times for Julia!

Then precompile or use https://github.com/dmolina/DaemonMode.jl.

But 1.7 seconds at first startup isn't even enough time to articulate a serious thought, much less write any good code.

I struggle to believe it's a dent in anyone's workflow.


I described my use case in the comment chain below; DaemonMode won't help there.

> I struggle to believe it's a dent in anyones workflow.

This is exactly what I dislike in the Julia community: the "if I don't have this issue, then the people hitting it are holding it wrong" attitude.


If Julia can make good strides in the science/engineering/statistics world outside of DS/ML (and mostly NNs at that), I'm sure the Julia folks would see that as a major win. Basically, a unified next-generation language taking over the R/MATLAB/Fortran space. I think that field is going to grow dramatically (in adoption of tools) over the next decade, as opposed to DS/ML, which are already overhyped.

If Mojo finds its groove and can convince people that static typing is a price worth paying for performance, I think Julia will have to adopt some sort of pure AOT compilation mode and forgo the dynamism that keeps that "just-" portion lying around, which does seem like an issue for DS/ML folks.


I'll preface by saying that by far my biggest gripe with Julia is the inability to deploy binaries easily. The only hope for easy deployment, I suppose, is that Julia becomes commonly enough installed that you can rely on it being on systems. But some of your complaints are a bit off (in my own opinion):

>and you can't quickly run a script

What is wrong with the following to run a script?

$ julia myscript.jl

If you have a specific use case where the few seconds of delay after hitting return is actually a problem (it isn't for the vast majority of scripts), you can precompile it ahead of time or simply use something like https://github.com/dmolina/DaemonMode.jl
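For example, one common precompilation route is PackageCompiler.jl's system images (a sketch based on its `create_sysimage` API; the package list and output filename here are illustrative, not from the article):

```julia
# Build a custom system image with some heavy packages baked in.
# Run this once; it can take a while, but it's a one-time cost.
using PackageCompiler
create_sysimage(["CSV", "DataFrames"]; sysimage_path="fastsys.so")
```

After that, `julia --sysimage fastsys.so myscript.jl` starts with those libraries already compiled.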

Julia has issues, as all languages do, but "not being able to quickly run a script" is by far one of the easiest to work around.

> and you can't quickly run a script or REPL for development.

REPL: I disagree. Of course you can - that's how many of us use Julia.

> And now Julia has competition from Mojo.

...maybe. The code samples we've seen from Mojo look very similar to Python, obviously. And that is specifically why a lot of people love Julia.

The problems people are more and more interested in (machine learning, etc.) are at their base mathematical problems. These languages are all tools to translate mathematics into computer instructions, so the code should ideally look as close to that math as possible. Spamming np.linalg, sp.sparse, and so forth over and over again is just ugly, and the entire Python workflow overly encourages object-oriented design for concepts that are mathematically functions. And, well, should be functions. If you're working in these or related fields at all and write any code with Julia, it's hard not to fall in love.
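To make the contrast concrete, here is a small sketch of what that looks like on the Python side (assumes NumPy; the values are made up for illustration):

```python
import numpy as np

# Solve A x = b. In Julia this is just `x = A \ b`, which reads like the math;
# in Python it goes through a library call path instead.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = np.linalg.solve(A, b)
print(x)  # ≈ [0.2 0.6]
```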

Mojo may make Python faster. But it will still be, roughly, Python.

And considering that the Python 2 -> Python 3 transition is still not complete, I think it's premature to conclude that Mojo's Python upgrades will get anywhere.

