Deep Dive into Ownership in Mojo (www.modular.com)
2 points by matt_d | 2024-06-09 20:57:15 | 24 comments




Most of the video describes a system similar to Swift's, where function arguments can be borrowed or inout, but borrowed references are not first-class types, so you don't have lifetimes like in Rust. In Swift, this design definitely makes the language more approachable than Rust, but it results in a ton of unnecessary refcounting and forces sub-objects to be their own heap allocations, which contributes to Swift's reputation for poor performance overall.
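To make the distinction concrete, here's a rough sketch of the two argument conventions in Rust terms (my own analogy, not from the video): a `borrowed` argument corresponds to a shared borrow `&T`, and an `inout` argument to an exclusive borrow `&mut T`.

```rust
// "borrowed": read-only access to the caller's value, no copy made
fn sum(v: &[i64]) -> i64 {
    v.iter().sum()
}

// "inout": the caller's value is mutated in place
fn push_double(v: &mut Vec<i64>) {
    let s: i64 = v.iter().sum();
    v.push(s * 2);
}

fn main() {
    let mut v = vec![1i64, 2, 3];
    assert_eq!(sum(&v), 6);           // shared borrow, like `borrowed`
    push_double(&mut v);              // exclusive borrow, like `inout`
    assert_eq!(v, vec![1, 2, 3, 12]);
}
```

The difference the comment is pointing at: in Rust these borrows are first-class types with lifetimes you can return and store, while in the Swift-like design they exist only as argument conventions.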

But then, at the end of the video, Chris says that Mojo has lifetimes! …But leaves an explanation for a future video.

So I'm looking forward to that.


Until that video comes out, you can read the proposal[1]. This appears to be quite recent (the timestamp says 4 days ago).

I personally found the writing style of the proposal clearer than the tutorial flavor of the original blog post (I haven't watched the video). But that's probably because I have a programming language background and am intimately familiar with Rust.

[1]: https://github.com/modularml/mojo/blob/main/proposals/lifeti...


Swift's ownership features are still new and evolving, so their current state isn't necessarily the vision, nor where they will be in a year. E.g., here's a proposal that allows one to create a lifetime dependency between a function's return value and its parameters: https://github.com/apple/swift-evolution/blob/d9aa90bae13e35...
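For anyone unfamiliar with what "a lifetime dependency between a function's return value and its parameters" buys you, this is exactly what Rust's lifetime annotations express today (a minimal sketch, not taken from the Swift proposal):

```rust
// The returned slice borrows from `buf`, so the compiler guarantees it
// cannot outlive the buffer it points into.
fn first_half<'a>(buf: &'a [u8]) -> &'a [u8] {
    &buf[..buf.len() / 2]
}

fn main() {
    let data = vec![1u8, 2, 3, 4];
    let half = first_half(&data); // `half`'s lifetime is tied to `data`
    assert_eq!(half, &[1, 2]);
    // Dropping `data` while `half` is still live would be a compile error.
}
```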

Well, that's very interesting indeed. Maybe there's hope for Swift in the future.

Does anyone actually use Mojo for anything? Or is it like Deno where the founder raised a bunch of money but the language will never ever find market fit?

The last time I looked at Mojo, it required a sign-up to install, which killed my interest pretty fast. Though now it looks like you can just install it?

Yup, you can just install it now.

I don't use it for anything, but as it gets stable, I'd love to. I don't expect to even try it out for years, but don't think that means it's failed.

Given how new it is, it isn't widely used yet. But there are always new packages being posted in the Discord, and Mojo's CPU performance is already competitive with PyTorch in a lot of benchmarks. GPU support is supposed to be coming sometime in the summer.

I think the main problem for Mojo is that it really doesn't make the hard part of writing HPC code that much easier. Maybe a bit better than C++ (e.g., better compile-time metaprogramming, a built-in SIMD type) - but you still have to do the dirty work of writing the optimized SIMD kernels, which isn't going to be sexy given the nature of the problem. (Also... CUDA support when?)

I think the main advantage of Mojo would be its tight integration with the Python ecosystem (you can conveniently call Python from Mojo)... Given the atrocities of existing C/C++ build systems, I'd guess this could be a much lower barrier for deep learning people to delve into HPC stuff?


Writing SIMD routines can be quite easy, at least much more pleasant than using intrinsic mnemonics.

My understanding is that Mojo's portable SIMD builds on top of the MLIR/LLVM vector machinery, which is good. If this is indeed true, it would make it competitive with C#'s cross-platform SIMD, which usually means just creating variables typed as Vector128/256/512<T> and using either the regular arithmetic operators on them, or VectorXXX.Shuffle/pairwise operations/etc. for SIMD-specific operations, without ever touching mnemonics.
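To illustrate the programming model being described (vector-typed values with ordinary operators instead of intrinsic mnemonics), here's a scalar sketch in Rust. The `F32x4` type below is hypothetical and stands in for a real SIMD type like C#'s `Vector128<float>` or Mojo's `SIMD` type; an actual implementation would lower to hardware vector instructions rather than a scalar loop.

```rust
use std::ops::Add;

#[derive(Clone, Copy, Debug, PartialEq)]
struct F32x4([f32; 4]); // stand-in for a 128-bit float vector

// Element-wise addition via the ordinary `+` operator - no mnemonics.
impl Add for F32x4 {
    type Output = F32x4;
    fn add(self, rhs: F32x4) -> F32x4 {
        let mut out = [0.0f32; 4];
        for i in 0..4 {
            out[i] = self.0[i] + rhs.0[i];
        }
        F32x4(out)
    }
}

impl F32x4 {
    // A shuffle exposed as a named method, the way C# exposes
    // Vector128.Shuffle, instead of an architecture-specific intrinsic.
    fn reverse(self) -> F32x4 {
        F32x4([self.0[3], self.0[2], self.0[1], self.0[0]])
    }
}

fn main() {
    let a = F32x4([1.0, 2.0, 3.0, 4.0]);
    let b = F32x4([10.0, 20.0, 30.0, 40.0]);
    assert_eq!(a + b, F32x4([11.0, 22.0, 33.0, 44.0]));
    assert_eq!(a.reverse(), F32x4([4.0, 3.0, 2.0, 1.0]));
}
```

The appeal of this model is that the same source compiles to SSE, NEON, or wider registers depending on the target, which is exactly what an MLIR/LLVM vector lowering gives you.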


Mojo doesn't use most of MLIR's built-in dialects; it currently uses only the LLVM and Index dialects, plus many custom dialects.

I would (as in, I'd use it tomorrow, turn-key), but it's not open source yet.

The stdlib is, as of a few weeks ago. The rest, I presume, will follow once they are sure of their MLIR moat.

This is an unfair side jab at Deno. It's not being used as widely as Node (yet), but it's the go-to edge runtime besides Cloudflare Workers, and the underlying system powering Supabase functions.

Fair, fair... Deno is actually used far more than Mojo. However, the amount raised makes it very hard to pull through.

It's too new; they're still working on it. But if its promised features materialize, it's hard to imagine it won't gain an audience. And in general I wouldn't bet against Chris Lattner.

Is this important for Mojo or is this scope creep?

Lifetime is already a fundamental part of Mojo. At the moment it is not exposed to devs, but the proposal is to flesh out the ergonomics and also bring conceptual clarity.

It's good to have a story for this (I worry it's become cool slightly before we've hit the jackpot theory-wise, but still), but it does feel a little odd having it in "Python" / a language which both idiomatically and syntactically isn't particularly interested in references and values and so on.

I'm honestly more excited for Mojo as a competitor in the systems programming space (Rust) than the ML space.

Where Rust's default way of doing things gives you optimal performance (i.e., no copying), you pay for that with code complexity and symbol-soup syntax. If you want to move fast, you copy things, which is both ugly and wasteful.
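The tradeoff in miniature (my own toy example): the borrowing version avoids any copy but carries lifetime annotations in its signature, while the "move fast" version clones and stays visually simple at the cost of an allocation per call.

```rust
// Zero-copy: returns a reference into one of the inputs, so the
// signature must spell out the lifetime relationship.
fn longest_borrowed<'a>(a: &'a str, b: &'a str) -> &'a str {
    if a.len() >= b.len() { a } else { b }
}

// Copying: no lifetimes to think about, but allocates a new String.
fn longest_cloned(a: &str, b: &str) -> String {
    if a.len() >= b.len() { a.to_string() } else { b.to_string() }
}

fn main() {
    assert_eq!(longest_borrowed("hi", "hello"), "hello");
    assert_eq!(longest_cloned("hi", "hello"), "hello");
}
```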

Mojo, on the other hand, based on the ownership presentation, by default makes some assumptions on your behalf for the most common situations, which you can then override if you need every ounce of performance out of the program.

I appreciate this approach more because I can hit the ground running and build something with great performance and then optimize it later when needed.

I'm looking forward to building something with it once they finish their work on the networking and asynchronous libraries.


Some more generalized information to get started with:

MLIR was spearheaded by the same person who created LLVM, to add additional features that Google needed to build compilers for its in-house hardware AI accelerators.

Mojo is a new language that takes advantage of the new features enabled by MLIR, in the same way that Swift took advantage of the features previously enabled by LLVM.

Here's an interview with Chris Lattner discussing Mojo in much more generalized terms:

https://www.youtube.com/watch?v=JRcXUuQYR90


I'm wondering: since Mojo doesn't use most of MLIR's built-in dialects (only LLVM + Index), will upstream improvements to the optimization passes for those dialects not be immediately applicable? (Mojo uses its own custom MLIR dialects.)
