It's not Julia's dynamic typing, but Julia's multimethod approach. Instead of one set of arguments per function that must be generic enough to cover all cases (for example f(number, number)), you have one visible implementation for each combination of types (f(integer, complex), f(complex, integer), f(T, T) where T <: Number, ...), which makes it easier to know what the function actually covers. And the REPL (and editor plugins) have good tools for searching available methods, documentation and source code.
Though while Julia even has sum and product types, it's not common for people to go to the level of detail of an ML language (plus you can't dispatch on the values of a struct), and there is no formal way to annotate an implicit interface for now, so each method will usually not be as clear from its types alone as in OCaml.
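To make that concrete, here's a hypothetical sketch (the names `Point`, `IntOrString`, `describe` and `area` are invented for illustration): structs are product types and `Union`s act as sum types, but dispatch only ever sees types, never field values.

```julia
# A struct is a product type: a Point holds an x AND a y.
struct Point
    x::Float64
    y::Float64
end

# A Union acts as a sum type: an Int OR a String.
const IntOrString = Union{Int, String}

describe(v::Int)    = "an Int: $v"
describe(v::String) = "a String: $v"

# Dispatch distinguishes the types inside the Union...
describe(1)       # "an Int: 1"
describe("hi")    # "a String: hi"

# ...but you cannot dispatch on field *values*: area(p::Point)
# matches every Point, regardless of what p.x and p.y hold.
area(p::Point) = p.x * p.y
```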
Julia is dynamic. I've described this before, but the type annotations are used for multiple dispatch, not for static type checking.
The quick example of why this is powerful is multiplication:
# Assume you only have addition defined.
# (Base already defines * and Rational; you'd `import Base: *` to extend it.
# Also, Int is a concrete type, so the constraint should be the abstract Integer.)
# Quick and dirty baseline: multiplication as repeated addition
function *(l::T, r::T) where {T<:Integer}
    out = zero(T)
    while l > 0
        out += r
        l -= 1
    end
    out
end
# Great, now let's say you have a data type, rational numbers:
struct Rational{T<:Integer}
    numerator::T
    denominator::T
end
# And you want to define multiplication on rational numbers
*(l::Rational, r::Rational) =
    Rational(l.numerator * r.numerator, l.denominator * r.denominator)
# Cool, but multiplying integers by rational numbers should also be defined
*(l::Rational, r::Integer) = Rational(l.numerator * r, l.denominator)
*(l::Integer, r::Rational) = r * l
The reason this is so cool, and why, yes, it's still dynamic, is that I can call x * y on two variables, x and y, about which I know nothing. The most specific method defined that matches their types will be called to handle it. If there isn't one defined, that's a MethodError, just like in your favorite other language!
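A minimal sketch of that "most specific method wins" behaviour (`combine` is a hypothetical function invented here):

```julia
combine(a::Number,  b::Number)  = "generic Number fallback"
combine(a::Integer, b::Integer) = "both Integer"
combine(a::Integer, b::Float64) = "Integer and Float64"

combine(1, 2)       # most specific match -> "both Integer"
combine(1, 2.0)     # -> "Integer and Float64"
combine(1.0, 2.0)   # only the fallback matches -> "generic Number fallback"
# combine("a", "b") # no matching method at all: throws a MethodError
```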
Hmm, that’s interesting - I enjoy REPL-based development, having spent a lot of time working in Python, and a little Clojure. How does the REPL experience play with the static typing (that I think) Julia has? If you change the type signature of a function in the REPL what happens?
Julia has a very nice type system, the nicest of any dynamically typed language I am familiar with. That is partly down to multiple dispatch, but more to do with designing a type system that lets the JIT unbox everything that has to be unboxed for high performance, without sacrificing the freedom of dynamic typing.
IIUC, Common Lisp is the giant on whose shoulders Julia built in this respect.
Julia is a dynamic language; it doesn't have static typing. Its dynamic nature is effectively a superposition of all possible static implementations of a function. So if you call a * b with integers it will JIT compile an integer-based version of the product, while if you do it with matrices it will compile a matrix-product version. Julia Base has 361 implementations of product, and any library or programmer can freely add more at any point; the compiler will always match, at compile time, the most specific version defined for the combination of all arguments (multiple dispatch). You also usually don't need type hints when calling methods; the compiler will infer the types by itself and choose the optimal implementation.
If you define one method with the exact same parameters as an older one, it will simply be redefined. As a side note, the JIT has safe points where new definitions become visible (usually the global namespace, in which the REPL runs each command); otherwise, already-compiled code can't see newly declared methods unless you force it with something like invokelatest or eval. The period between safe points is called the world age.
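A small sketch of the world-age rule (`h` and `newfn` are illustrative names made up here): a method defined via `eval` while a function is running isn't visible to that function's already-compiled code, but `Base.invokelatest` forces dispatch in the newest world.

```julia
function h()
    @eval newfn() = 42      # defines newfn in a newer world than h's
    try
        newfn()             # MethodError on first call: h runs in an older world age
    catch
        Base.invokelatest(newfn)   # dispatch in the latest world -> 42
    end
end

h()   # 42
```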
Julia relies heavily on type inference, similar to having every variable marked auto or auto& by default in C++.
Functions are templated on the argument types by default, in C++ terms. So if you call a function f(x) first with an integer and then with a floating point number, Julia will generate separate machine code optimized for each case. Annotating function arguments with types only serves as a filter for whether the function applies to certain types, and can be used for method overloading (function template specialization in C++). The fact that functions are generic by default lends itself well to forward-mode automatic differentiation using method overloading.
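A sketch of that default genericity (`square` and `twice` are names made up here): one untyped definition gets a fresh specialization per argument type, while annotations act only as applicability filters, analogous to template specialization.

```julia
# One generic definition; Julia compiles a specialization per argument type.
square(x) = x * x

square(3)      # triggers an Int64 specialization   -> 9
square(3.0)    # triggers a Float64 specialization  -> 9.0

# Annotations act as filters / overloads:
twice(x::Number) = 2x
twice(x::String) = x * x   # * is concatenation for strings; a separate method

twice(4)       # 8
twice("ab")    # "abab"
```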
Julia is still a dynamic language though. This means that if a variable is bound to an integer in one branch of an if-statement while a string is assigned to it in another branch depending on the value (not type) of a function input, then the variable's type cannot be inferred and needs to be 'boxed', i.e. the runtime type needs to be stored along with the data. In such cases, function dispatch can no longer be done at compile time, and needs to be done dynamically instead. While this is useful for rapid prototyping, it does result in reduced performance, so it should be avoided in code with high performance requirements. The type elision part of this is somewhat similar to boost::any.
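A sketch of such an unstable binding (`unstable` is a made-up name): the result type depends on a runtime value, so inference can only narrow it to a `Union`, and the value must carry its type at runtime.

```julia
# The return type depends on the *value* of the argument, so inference can
# only narrow it to Union{Int64, String}; dispatch on the result is dynamic.
unstable(flag) = flag ? 1 : "one"

typeof(unstable(true))    # Int64
typeof(unstable(false))   # String

# In the REPL, `@code_warntype unstable(true)` highlights the Union in red.
```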
We can trade anecdotes on this topic, but I've written numerical code in OCaml and also Julia. The strictness of OCaml's type system is painful in a numerical context but for virtually all other things it is awesome to pass code into the interpreter/compiler and catch structural problems at compile-time rather than maybe at runtime.
OCaml's type system is almost certainly not the right model for Julia, but the ad-hoc typing/interface system Julia currently employs is strongly at odds with compile-time correctness. There's almost certainly some middle ground to be discovered, one that might be unsound in a strict sense but pragmatically constrains code statically, so that you'd have to go out of your way to pull the footgun trigger.
You can see how rarely type annotations are used in practice in major Julia libraries. Specifying the traits/constraints that arguments must satisfy to be semantically valid should be integral to best practice in the language, but what you often see instead is a (potentially inscrutable) runtime error.
I'm using Julia the same way for math computation: starting as a scripting language, but creating new types and adding typing information on the fly (start with a tuple, then create a struct if that tuple is useful multiple times).
Julia is a language that you write more or less like Python, but in cases where Python would chase a million pointers to figure out what the type of a variable is, get an instance variable, call a method, etc., Julia (if written correctly) uses just-in-time compilation. The first time it has to call a function on a combination of types it hasn't seen before, it compiles the function for those specific types (into LLVM IR, and then native code) and stores the compiled code, so that subsequent dispatches of that function with those types call directly into the compiled code. You do not need to add type annotations for this to happen: at runtime, Julia checks the types of variables to determine which version of a function to call (the check may even be elided entirely when the caller was compiled in a context where the types were statically known; see below).
This is a strength, as the only thing you have to do to get the performance of fully compiled code (modulo things like GC and bounds checking) is make sure your functions are “type stable”, i.e., their type information can be determined statically. (In fact, in Julia you get better performance by writing more, smaller functions because you'll have more regions within which the types are statically inferrable.) But it's also a weakness because the first time you start up a REPL and call a bunch of functions, each of those functions has to be compiled from scratch, which takes a long time (google “Julia time to first plot”).
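One common sketch of that "more, smaller functions" advice is the function-barrier idiom (all names here are invented, and the sketch assumes the elements are all Float64): hand uninferable data to a small inner function, which then gets compiled for the concrete types.

```julia
# xs::Vector{Any} hides the element type, so this loop boxes every element
# and dispatches + dynamically on each iteration:
function slow_sum(xs::Vector{Any})
    s = 0.0
    for x in xs
        s += x
    end
    return s
end

# The barrier: once converted, the inner call sees a concrete Vector{Float64}
# and compiles a fast, unboxed loop.
kernel(ys) = sum(ys)
fast_sum(xs::Vector{Any}) = kernel(convert(Vector{Float64}, xs))

fast_sum(Any[1.0, 2.0, 3.0])   # 6.0
```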
Julia has other niceties such as very flexible math (promotion between every pair of numeric types) and a lisp-like macro system with homoiconic code, which make writing numerical code and scientific algorithms highly ergonomic.
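For example, mixed numeric types promote automatically before arithmetic:

```julia
1 + 2.5            # 3.5: the Int is promoted to Float64
promote(1, 2.5)    # (1.0, 2.5)
2//3 + 1//6        # 5//6: exact Rational arithmetic
(1 + 2im) + 0.5    # 1.5 + 2.0im: promoted to Complex{Float64}
```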
- It is functional (including lisp-like macro programming) but has a strong type system along with multiple dispatch, which effectively allows for OOP-style constructs.
- It allows for dynamic typing but since it is JIT compiled using LLVM, you can specify static types for variables and thus take advantage of lots of smart optimisation.
- There is garbage collection but also the ability to get right in there and reach into pointers and memory-allocation manually.
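A small sketch of that low-level escape hatch (use sparingly; `unsafe_load` bypasses bounds and type checks, and `GC.@preserve` keeps the array alive across the raw read):

```julia
v = [10, 20, 30]
p = pointer(v)                          # raw Ptr{Int} to the array's data
x = GC.@preserve v unsafe_load(p, 2)    # read element 2 while v is kept alive
x   # 20
```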
It's truly a pleasure to work with once you appreciate what's possible.
I'm sure this is too late to get much visibility, but I recently looked into using Julia (for my MS thesis) and found it sorely lacking in one major way that I found unforgivable.
Their type system is pretty interesting, and allows for some really cool abilities to parameterize things using types. I'd like to have seen more work done on, effectively, strong typedefs (or whatever $lang wants to call them). However that sort of thing is fairly uncommon so it's hard to hold it against them too much.
The biggest issue, and one they seem unwilling to really address, is that actually using the type system to do anything cool requires you to rely entirely on documentation which may or may not exist (or be up-to-date).
Each type has an entirely implicit interface which must be implemented. There is no syntax to mark which methods must be present for a type. No marker for the method definitions, no block where they must be named, or anything like that. You can't even assume you'll find all the interface methods defined in a single file because they can appear literally anywhere.
Whoever wrote the type has in mind some interface, a minimal set of methods that must be present for any derived type. There are only two ways to determine this. The first is to look at the documentation, and even for the basic types defined by Julia this documentation doesn't seem to exist across the board. I don't have high hopes that most libraries will provide, and keep up to date, this documentation either. The concern grows even greater considering the focus is largely on scientific computing.
Without up-to-date documentation, the only option is to manually review every file in a library and keep track of the methods defined for the type you're interested in. With multiple dispatch, you can't even get away with just checking the first parameter either. Then you need to look at the definitions for those methods to narrow your list down to the minimal set required. This is not an easy task.
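A concrete example of that pain is Julia's own (informal) iteration interface; nothing in this hypothetical `Countdown` struct declares what it must implement, and you typically discover the requirements (here `iterate`, plus `length` via the default `IteratorSize` trait so that `collect` works) one MethodError at a time:

```julia
struct Countdown
    from::Int
end

# Nothing marks these as "the iteration interface"; they could live anywhere.
Base.iterate(c::Countdown) = c.from <= 0 ? nothing : (c.from, c.from - 1)
Base.iterate(c::Countdown, state) = state <= 0 ? nothing : (state, state - 1)

# collect also needs length (an extra requirement you find out about at runtime):
Base.length(c::Countdown) = max(c.from, 0)

collect(Countdown(3))   # [3, 2, 1]
```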
This issue has been brought up before and discussed, but nobody seemed very interested in it. This is a fairly major issue in my view, as it cripples the otherwise very interesting type system. As it stands, it seems to be a fairly complex solution to the issue of getting good compiled code out of a dynamic language. It could be so much more.
Julia does it an order of magnitude better, though.
A generic `f` will probably be similar to the Common Lisp one. However, as soon as you use `f` in a context where types are known (e.g. if you use multimethods to overload a function that calls `f`), it will be specialized and JIT-compiled for the specific type (int, float, matrix, ...), with no extra work for the programmer.
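A sketch of that specialization, using a made-up `mysum`:

```julia
mysum(xs) = reduce(+, xs)    # one generic definition, no annotations

mysum([1, 2, 3])             # compiles an Int64 specialization  -> 6
mysum([1.0, 2.0])            # compiles a Float64 specialization -> 3.0
mysum([[1, 2], [3, 4]])      # even elementwise vector sums work -> [4, 6]
```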
The main thing Julia loses from this approach is function types. Since each function is a user-extendable blob of methods, there basically ceases to be a meaningful notion of the type of a function. IMO this is a worthwhile trade-off, but it can sometimes be annoying (and it's why Julia is hard to compile ahead of time: this pattern makes figuring out which methods you need to compile Turing-complete).
In Julia every function has a number of associated variants. We call them methods, not to be confused with methods in OOP.
So for every function there is a table of methods. Each method takes a different number of arguments or arguments with different types.
The REPL is able to introspect this table. So when you type a function name in Julia and hit tab, it will essentially dump this table. If you start filling in some arguments, it will use the types of those arguments to filter the table, showing you only the remaining choices.
You can even go in reverse. The function `methodswith` allows you to provide a type and Julia will search through the methods of every function to show every method accepting that type as one of its arguments.
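In a script, the same introspection looks like this (`methodswith` and `@which` live in the `InteractiveUtils` standard library, which the REPL loads automatically):

```julia
using InteractiveUtils

length(methods(+))                # how many methods + currently has
methods(+, (Rational, Rational))  # filter the method table by argument types
methodswith(Rational)             # every method accepting a Rational, anywhere
@which 1//2 + 1//3                # the exact method this call dispatches to
```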
Julia stores a lot of metadata with objects and functions. You'd be surprised how powerful this system is once you start using it. E.g. you can jump to the definition of a function that was generated on the fly in a for loop: when created, the function records the file and line number where it was defined, even if that happened through metaprogramming.