311 points melodyogonna | 8 comments
postflopclarity ◴[] No.45138679[source]
Julia could be a great language for ML. It needs more mindshare and developer attention though
replies(4): >>45138887 #>>45138911 #>>45139421 #>>45140214 #
1. numbers_guy ◴[] No.45138911[source]
What makes Julia "great" for ML?
replies(3): >>45139000 #>>45139316 #>>45142213 #
2. postflopclarity ◴[] No.45139000[source]
I would use the term "potentially great" rather than plain "great"

but all the normal marketing words: in my opinion it is fast, expressive, and has particularly good APIs for array manipulation
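As one illustration of the array-manipulation point, here is a minimal sketch using only Julia's standard library (the variable names are mine, not from the thread); broadcasting with `.` applies any operation elementwise and fuses the whole expression into a single loop:

```julia
A = rand(3, 4)                      # a 3x4 matrix of random Float64s

col_means = sum(A; dims=1) ./ size(A, 1)   # 1x4 row of column means
centered  = A .- col_means                  # broadcast: subtract the row from every row of A
y         = 2 .* centered .^ 2 .+ 1         # fused elementwise expression, one pass over the data
```

The same dotted syntax works unchanged on GPU arrays or other custom array types, which is part of why the array APIs compose well.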

replies(1): >>45139131 #
3. numbers_guy ◴[] No.45139131[source]
Interesting. I am experimenting with different ML ecosystems and wasn't really considering Julia at all but I put it on the list now.
replies(2): >>45139169 #>>45139451 #
4. postflopclarity ◴[] No.45139169{3}[source]
Glad to hear. I've found it's a very welcoming community.

I'll warn you that Julia's ML ecosystem has the most competitive advantage on "weird" types of ML, involving lots of custom gradients and kernels, integration with other pieces of a simulation or diffeq, etc.

if you just want to throw some tensors around and train an MLP, you'll certainly end up finding more rough edges than you would in PyTorch
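To make the "custom gradients" point concrete, here is a hedged sketch of how a hand-written reverse-mode rule looks with the ChainRulesCore package (the function `mysquash` is a made-up example, not from the thread); once a rule like this exists, every autodiff library built on ChainRules picks it up automatically:

```julia
using ChainRulesCore

# A hypothetical scalar function we want a custom gradient for.
mysquash(x) = x / (1 + abs(x))

# rrule returns the primal value plus a pullback closure that maps an
# output cotangent to input cotangents. d/dx [x/(1+|x|)] = 1/(1+|x|)^2.
function ChainRulesCore.rrule(::typeof(mysquash), x)
    y = mysquash(x)
    mysquash_pullback(ȳ) = (NoTangent(), ȳ / (1 + abs(x))^2)
    return y, mysquash_pullback
end
```

This kind of rule is what lets Julia code blend hand-tuned kernels, diffeq solvers, and learned components inside one differentiable program.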

replies(1): >>45148775 #
5. macawfish ◴[] No.45139316[source]
Built-in autodifferentiation and amazing libraries built around it, plus tons of cutting-edge applied math libraries that interoperate automatically, thanks to Julia's well-conceived approach to the expression problem (multiple dispatch). Aside from that, the language itself is like a refined Python, so it should be pretty friendly off the bat to ML devs.
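For readers unfamiliar with the expression-problem point: multiple dispatch selects a method on the types of *all* arguments, so new types and new operations can each be added without touching existing code. A tiny sketch (types and names are illustrative):

```julia
abstract type Shape end
struct Circle <: Shape; r::Float64 end
struct Square <: Shape; s::Float64 end

# New operations over existing types: just add methods.
area(c::Circle) = pi * c.r^2
area(sq::Square) = sq.s^2

# Dispatch on *pairs* of types; a third party can add a more specific
# method for their own type without editing this fallback.
overlap(a::Shape, b::Shape)   = "generic fallback"
overlap(a::Circle, b::Circle) = "specialized circle-circle logic"
```

This is why, say, an ODE solver and a GPU array package written by different authors often compose with no glue code.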

What Julia needs, though: wayyyy more thorough tooling for auto-generated docs, well integrated with the package-management tooling and the web package ecosystem. Julia attracts really cutting-edge research and researchers writing code. They often don't have time to write docs, and with good enough tooling that shouldn't really matter.

Julia could definitely use some work in the areas discussed in this podcast, not so much the high level interfaces but the low level ones. That's really hard though!

6. macawfish ◴[] No.45139451{3}[source]
If I wanted to get into research ML, I'd pick Julia, no doubt. It allows both conventional ML techniques, where we throw tons of parameters at the problem, and a more nimble style where we can train over ordinary functions.

Combine that with all the cutting edge applied math packages often being automatically compatible with the autodiff and GPU array backends, even if the library authors didn't think about that... it's a recipe for a lot of interesting possibilities.
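"Training over ordinary functions" can be sketched with the Zygote autodiff package (the toy loss below is my own example): any plain Julia function is differentiable, with no tensor or layer abstractions required.

```julia
using Zygote

# An ordinary Julia function; no special types needed.
loss(a, b) = (a * 3.0 - b)^2

# gradient returns one partial derivative per argument.
g = gradient(loss, 2.0, 5.0)   # (∂loss/∂a, ∂loss/∂b) = (6.0, -2.0)
```

The same call works when `a` is a parameter buried inside a simulation step, which is where the "nimble style" pays off.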

7. bobbylarrybobby ◴[] No.45142213[source]
It's a low-level language with a high-level interface. In theory, GC aside, you should be able to write code as performant as C++ without having to actually write C++. It's also homoiconic and the compiler is part of the language's API, so you can do neat things with macros that more or less let you temporarily turn it into a different language.
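To illustrate the macro point, a small sketch (this essentially re-implements the built-in `@show`): a macro receives the code itself as an `Expr` data structure and returns new code, which is what makes embedded mini-languages possible.

```julia
# The macro sees `ex` as unevaluated code; QuoteNode keeps it as data
# for printing, while esc(ex) splices it back in to be evaluated.
macro show_expr(ex)
    return :(println($(QuoteNode(ex)), " = ", $(esc(ex))))
end

@show_expr 1 + 2 * 3   # prints: 1 + 2 * 3 = 7
```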

In practice, the Julia package ecosystem is weak and generally correctness is not a high priority. But the language is great, if you're willing to do a lot of the work yourself.

8. salty_biscuits ◴[] No.45148775{4}[source]
Yes, my experience has been that it is great if you need to do something particularly weird, but less smooth to do something ordinary.