
58 points by spearman | 1 comment
Lerc ◴[] No.41525765[source]
For me, these are the questions to answer for whether or not I should bother:

Will it try to bind me to other technologies?

Does it work out of the box on ${GPU}?

Is it well supported?

Will it continue to be supported?

replies(2): >>41526910 #>>41528035 #
crazygringo ◴[] No.41526910[source]
Playing with JAX on Google Colab (Nvidia T4), everything works great.
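
For concreteness, here's the kind of minimal sanity check I mean -- just a toy jitted function, but it exercises device discovery and compilation:

    import jax
    import jax.numpy as jnp

    # On Colab's T4 this lists the CUDA GPU
    print(jax.devices())

    @jax.jit
    def f(x):
        # trivial math, but enough to trigger XLA compilation
        return jnp.sin(x) ** 2 + jnp.cos(x) ** 2

    # first call compiles; subsequent calls run the cached executable
    print(f(jnp.arange(4.0)))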

Sadly, I cannot get JAX to work with the built-in GPU on my M1 MacBook Air. In theory it's supposed to work:

https://developer.apple.com/metal/jax/

But it crashes Python when I try to run a compiled function. And that's only after discovering I need a specific older version of jax-metal, because newer versions apparently don't work with M1 anymore (only M2/M3) -- they don't even report the GPU as existing. And even if you get it running, it's missing support for complex numbers.
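
In case anyone wants to reproduce: this is roughly the recipe. I'm deliberately not pinning an exact jax-metal version, since which one works seems to depend on your chip and your jax/jaxlib versions:

    # Install per Apple's instructions, pinning an older jax-metal
    # release on M1 (exact version omitted -- see above):
    #   python -m pip install jax-metal
    import jax
    import jax.numpy as jnp

    # On a working install this should report a Metal device;
    # on my M1 with recent jax-metal the GPU doesn't show up at all
    print(jax.devices())

    @jax.jit
    def f(x):
        return x * 2.0

    # This is the call that crashes Python for me on the M1
    print(f(jnp.ones(8)))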

I'm not clear on whether it's Google or Apple who's building/maintaining support for Apple M chips, though.

JAX works perfectly in CPU mode on my MBA, though, so at least I can use it for development and debugging.
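
If it's useful to anyone: you can pin JAX to the CPU explicitly (rather than relying on it falling back) with the JAX_PLATFORMS environment variable, which has to be set before jax is imported:

    import os
    os.environ["JAX_PLATFORMS"] = "cpu"  # must happen before importing jax

    import jax
    print(jax.devices())  # [CpuDevice(id=0)]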

replies(2): >>41528019 #>>41531485 #
spearman ◴[] No.41528019[source]
Pretty sure it's Apple building it, and they're using JAX in-house, so I imagine it will get better over time. Though they do love to drop support for old things, so maybe M1 will never work again...
replies(1): >>41529825 #
mccoyb ◴[] No.41529825[source]
I think they’re likely using MLX in-house now, no? (Probably not everyone, ofc - but seems likely that many will just use the native array framework designed explicitly for M-series chips)
replies(1): >>41534790 #
spearman ◴[] No.41534790[source]
From https://machinelearning.apple.com/research/introducing-apple...

> Our foundation models are trained on Apple's AXLearn framework, an open-source project we released in 2023. It builds on top of JAX

replies(1): >>41536458 #
mccoyb ◴[] No.41536458[source]
Wow! Thanks for the ref.