but beyond all the normal marketing words: in my opinion it is fast, expressive, and has particularly good APIs for array manipulation
more realistic examples of compiling a Julia package into .so: https://indico.cern.ch/event/1515852/contributions/6599313/a...
1.9 and 1.10 made huge gains in package precompilation and native code caching. then attention shifted, and there were some regressions in compile times due to unrelated changes in 1.11 and the upcoming 1.12. but at the same time, 1.12 will contain an experimental new feature, `--trim`, as well as some further standardization around entry points to run packages as programs, which is a big step towards generating small self-contained binaries. also, nearly all current tooling efforts are focused on providing static analysis and helping developers make their programs more easily compilable.
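for concreteness, here's a minimal sketch of the entry-point convention (available since 1.11); the juliac invocation in the trailing comment is illustrative only, since the exact script location and flags are experimental and may change:

```julia
# hello.jl: define the program's entry point via the (@main) convention
function (@main)(args)
    println("hello, world")
    return 0
end

# then, roughly (experimental, subject to change):
#   julia juliac.jl --output-exe hello --trim=safe hello.jl
```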
it's also worth distinguishing between a few similar but related needs. most of what I just described applies to generating binaries for arbitrary programs. but for the example you mentioned, "time to first plot" of existing packages, this is already much improved in 1.10: users (i.e. non-package-developers) should see sub-second TTFP, and sub-second TTFX for most packages they use that have been updated to take advantage of the precompilation goodies in recent versions.
    @time begin
        using Plots
        display(plot(rand(8)))
    end
    # 1.074321 seconds
On Julia 1.12 (currently at release candidate stage), a <1MB hello world is possible with juliac (although juliac in 1.12 is still marked experimental).

I'll warn you that Julia's ML ecosystem has the most competitive advantage on "weird" types of ML: lots of custom gradients and kernels, integration with other pieces of a simulation or diffeq, etc.
if you just want to throw some tensors around and train an MLP, you'll certainly find more rough edges than you might in PyTorch
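for reference, the basics do work; a minimal MLP training loop in Flux.jl looks something like this (assumes Flux.jl is installed; layer sizes, data, and hyperparameters are arbitrary):

```julia
using Flux  # assumes Flux.jl is installed

# a tiny MLP: 4 inputs -> 16 hidden units -> 2 outputs
model = Chain(Dense(4 => 16, relu), Dense(16 => 2))

# random "dataset": 32 samples, features along columns
x = rand(Float32, 4, 32)
y = rand(Float32, 2, 32)

opt_state = Flux.setup(Adam(1f-3), model)

for epoch in 1:100
    # gradient of the loss w.r.t. the model's parameters
    grads = Flux.gradient(m -> Flux.mse(m(x), y), model)
    Flux.update!(opt_state, model, grads[1])
end
```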
What Julia needs though: wayyyy more thorough tooling to support auto-generated docs, well integrated with the package management tooling and the web package ecosystem. Julia attracts really cutting-edge research and researchers writing code. They often don't have time to write docs, and that shouldn't really matter.
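to be fair, the raw ingredients exist: docstrings are first-class and tools like Documenter.jl can render them. a minimal docstring sketch (the function here is invented for illustration):

```julia
"""
    circle_area(r)

Area of a circle with radius `r`; throws a `DomainError` for negative `r`.
"""
function circle_area(r)
    r < 0 && throw(DomainError(r, "radius must be nonnegative"))
    return π * r^2
end
```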
Julia could definitely use some work in the areas discussed in this podcast, not so much the high level interfaces but the low level ones. That's really hard though!
Then again, I am also open to the possibility that I'm jammed up by the production use of dynamically typed languages, and maybe the "for ML" part means "I code in Jupyter notebooks" and thus give no shits about whether person #2 can understand what's happening
Combine that with all the cutting edge applied math packages often being automatically compatible with the autodiff and GPU array backends, even if the library authors didn't think about that... it's a recipe for a lot of interesting possibilities.
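as one concrete sketch of that composability (assumes Zygote.jl is installed; the function here is invented for illustration): a function written with no thought given to autodiff still differentiates, because it's just generic Julia code.

```julia
using Zygote  # assumes Zygote.jl is installed

# a "library" function whose author never thought about autodiff
squashed_sum(xs) = sum(x -> x^2 / (1 + abs(x)), xs)

# ...yet we can take its gradient anyway
g = Zygote.gradient(squashed_sum, [1.0, -2.0, 3.0])[1]
```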
explicit and strict types on function arguments are one way to effect that, but certainly not the only way, nor probably the best way
I readily admit that I am biased, in that I believe having a computer check that every reference to every relationship does what it promises, all the time, is enormously valuable
more generally, the most important bits of a particular function to understand are
* what should it be called with
* what should it return
* what side effects might it have
and the "what" here refers to properties in a general sense. types are a good shortcut to signify certain named collections of properties (e.g., the `Int` type has arithmetic properties). but there are other ways to express traits, preconditions, postconditions, etc. besides types
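for example, the common "Holy trait" pattern in Julia expresses a property of a type without putting it in the type hierarchy (names here are invented for illustration):

```julia
# "Holy trait" pattern: declare a property of a type without touching
# its supertype, then dispatch on the trait instead of the type itself
abstract type IterationStyle end
struct FastLinear <: IterationStyle end
struct Fallback   <: IterationStyle end

iterationstyle(::Type{<:Array}) = FastLinear()
iterationstyle(::Type)          = Fallback()

process(xs) = process(iterationstyle(typeof(xs)), xs)
process(::FastLinear, xs) = sum(xs)            # tight loop, no copy
process(::Fallback,   xs) = sum(collect(xs))   # materialize first
```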
they sure can...
As far as the executable size goes, it was only 85kb in my test, a bouncing balls simulation. However, it required 300MB of Julia libraries to be shipped with it. About 2/3 of that is in libjulia-codegen.dll and libLLVM-16jl.dll, so you're shipping this chunky runtime plus their LLVM backend. If you're willing to pay for that, you can ship a Julia executable. It's a better story than what Python offers, but it's not great if you want small, self-contained executables.
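for context, a typical PackageCompiler.jl app build looks roughly like this (the project name is a placeholder, not my actual project):

```julia
using PackageCompiler  # assumes PackageCompiler.jl is installed

# "BouncingBalls" stands in for a real project directory.
# the output bundle includes libjulia and the LLVM libraries, which is
# where most of the ~300MB footprint comes from.
create_app("BouncingBalls", "BouncingBallsCompiled"; incremental=false)
```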
In practice, the Julia package ecosystem is weak and generally correctness is not a high priority. But the language is great, if you're willing to do a lot of the work yourself.
I've been involved in a few programming language projects, so I'm sympathetic as to how much work goes into one and how long they can take.
At the same time, it makes me wary of Julia, because it highlights that progress is very slow. I think Julia is trying to be too much at once. It's hard enough to be a dynamic, interactive language, but they also want to claim to be performant and compiled. That's a lot of complexity for a small team to handle and deliver on.
> but there are other ways to express traits, preconditions, postconditions, etc. besides types
You can also put that in the type system, and expressive languages do. It's just a compiler limitation when we can't. I mean, even in C++ with concepts we can do most of that, and C++ doesn't have the most expressive type system.
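even in Julia you can push a precondition into the type domain; a sketch (types and names invented for illustration): an invariant checked once at construction becomes a property the type itself carries.

```julia
# a precondition moved into the type domain: emptiness is checked once,
# at construction, so downstream code can rely on it without rechecking
struct NonEmptyVec{T}
    data::Vector{T}
    function NonEmptyVec(v::Vector{T}) where {T}
        isempty(v) && throw(ArgumentError("must be non-empty"))
        new{T}(v)
    end
end

first_elem(v::NonEmptyVec) = v.data[1]  # safe: the invariant lives in the type
```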
That being said, I've always found the argument that types can be overly restrictive and prevent otherwise valid code from running unconvincing. I've yet to see dynamic code that benefits from this alleged advantage.
Nearly universally the properly typed code for the same thing is better, more reliable and easier for new people to understand and modify. So sure, you can avoid all of this if the types are really what bother you, but it feels a bit like saying "there are stunts I can pull off if I'm not wearing a seatbelt that I just can't physically manage if I am."
If doing stunts is your thing, knock yourself out, but I'd rather wear a seatbelt and be more confident I'm going to get to my destination in one piece.