    311 points melodyogonna | 21 comments

    1. Cynddl ◴[] No.45137883[source]
    Does anyone know what Mojo is doing that Julia cannot do? I appreciate that Julia is currently limited by its ecosystem (although it does interface nicely with Python), but I don't see how Mojo is any better, then.
    replies(8): >>45137977 #>>45137999 #>>45138029 #>>45138039 #>>45138393 #>>45138420 #>>45138513 #>>45143187 #
    2. jakobnissen ◴[] No.45137977[source]
    Mojo to me looks significantly lower level, with a much higher degree of control.

    Also, it appears to be more robust. Julia is notoriously fickle in both semantics and performance, making it unsuitable for foundational software the way Mojo strives for.

    replies(1): >>45143011 #
    3. thetwentyone ◴[] No.45137999[source]
    Especially because Julia has pretty user-friendly and robust GPU capabilities such as JuliaGPU and Reactant[1], among other generic-Julia-code-to-GPU options.

    1: https://enzymead.github.io/Reactant.jl/dev/
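    For illustration, a minimal sketch of generic Julia array code running on the GPU (assuming CUDA.jl is installed and a CUDA-capable device is available):

        using CUDA

        # Move data to the GPU; the same broadcast expression that works on a
        # plain Array is compiled into a fused GPU kernel by Julia.
        x = CUDA.rand(Float32, 1024)
        y = CUDA.rand(Float32, 1024)
        z = @. sin(x) * y + 2f0

        # Copy the result back to the CPU for inspection.
        z_host = Array(z)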

    replies(1): >>45138080 #
    4. Alexander-Barth ◴[] No.45138029[source]
    I guess the interoperability with Python is a bit better. On the other hand, PythonCall.jl (which allows calling Python from Julia) is quite good and stable. In Julia you have quite good ML frameworks (Lux.jl and Flux.jl); I am not sure there are Mojo-native ML frameworks which are similarly usable.
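    For illustration, a minimal PythonCall.jl sketch (assuming the package is installed and has set up a Python environment):

        using PythonCall

        # Import a Python module and call into it from Julia.
        np = pyimport("numpy")
        a = np.arange(12).reshape(3, 4)

        # Convert the Python object into a native Julia array.
        b = pyconvert(Matrix{Int}, a)
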
    5. jb1991 ◴[] No.45138039[source]
    Isn't Mojo designed for writing kernels? That's what it says at the top of the article:

    > write state of the art kernels

    Julia and Python are high-level languages that call other languages where the kernels exist.

    replies(1): >>45138086 #
    6. jb1991 ◴[] No.45138080[source]
    I get the impression that most of the comments in this thread don't understand what a GPU kernel is. High-level languages like Python and Julia are not themselves running as kernels on the GPU; they call into kernels usually written in C++. The goal is different with Mojo, as it says at the top of the article:

    > write state of the art kernels

    You don't write kernels in Julia.

    replies(5): >>45138165 #>>45138182 #>>45138372 #>>45138558 #>>45138820 #
    7. Sukera ◴[] No.45138086[source]
    No, you can write the kernels directly in Julia using KernelAbstractions.jl [1].

    [1] https://juliagpu.github.io/KernelAbstractions.jl/stable/
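    For illustration, a minimal KernelAbstractions.jl sketch (assuming the package is installed; the same kernel runs on the CPU backend here and on CUDA/AMDGPU/Metal/oneAPI backends when given device arrays):

        using KernelAbstractions

        # A vendor-agnostic kernel written in plain Julia.
        @kernel function axpy_kernel!(y, a, x)
            i = @index(Global)
            @inbounds y[i] = a * x[i] + y[i]
        end

        x = rand(Float32, 1024)
        y = rand(Float32, 1024)

        backend = get_backend(y)   # CPU here; a GPU backend for GPU arrays
        axpy_kernel!(backend, 64)(y, 2f0, x; ndrange = length(y))
        KernelAbstractions.synchronize(backend)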

    replies(1): >>45138124 #
    8. ◴[] No.45138124{3}[source]
    9. jakobnissen ◴[] No.45138165{3}[source]
    I'm pretty sure Julia does JIT compilation of pure Julia to the GPU: https://github.com/JuliaGPU/GPUCompiler.jl
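    For illustration, a minimal sketch of a hand-written GPU kernel in pure Julia, JIT-compiled by CUDA.jl (which builds on GPUCompiler.jl; assumes a CUDA-capable GPU):

        using CUDA

        # A kernel written entirely in Julia; @cuda compiles it for the GPU.
        function vadd!(c, a, b)
            i = (blockIdx().x - 1) * blockDim().x + threadIdx().x
            if i <= length(c)
                @inbounds c[i] = a[i] + b[i]
            end
            return nothing
        end

        a = CUDA.rand(Float32, 1024)
        b = CUDA.rand(Float32, 1024)
        c = similar(a)

        @cuda threads=256 blocks=cld(length(c), 256) vadd!(c, a, b)
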
    replies(1): >>45138233 #
    10. ssfrr ◴[] No.45138182{3}[source]
    It doesn’t make sense to lump Python and Julia together in this high-level/low-level split. Julia is like Python if Numba were built in: your code gets JIT-compiled to native code, so you can (for example) write for loops to process an array without the interpreter overhead you get with Python.

    People have used the same infrastructure to allow you to compile Julia code (with restrictions) into GPU kernels.
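    For illustration, a minimal sketch of the kind of loop that gets JIT-compiled to native code in plain Julia (no extra packages needed):

        # A plain loop over an array compiles to native machine code,
        # so there is no per-iteration interpreter overhead.
        function mysum(xs)
            s = zero(eltype(xs))
            for x in xs
                s += x
            end
            return s
        end

        mysum(rand(10^6))  # runs at native speed after the first (compiling) call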

    11. actionfromafar ◴[] No.45138233{4}[source]
    "you should use one of the packages that builds on GPUCompiler.jl, such as CUDA.jl, AMDGPU.jl, Metal.jl, oneAPI.jl, or OpenCL.jl"

    Not sure how that organization compares to Mojo.

    12. arbitrandomuser ◴[] No.45138372{3}[source]
    > You don't write kernels in Julia.

    The package https://github.com/JuliaGPU/KernelAbstractions.jl was specifically designed so that Julia can be compiled down to kernels.

    Julia is high level, yes, but its semantics allow it to be compiled down to machine code without a runtime interpreter. This is a core differentiating feature from Python. Julia can be used to write GPU kernels.

    13. hansvm ◴[] No.45138393[source]
    [0] https://danluu.com/julialang/
    14. ubj ◴[] No.45138420[source]
    > Anyone knows what Mojo is doing that Julia cannot do?

    First-class support for AoT compilation.

    https://docs.modular.com/mojo/cli/build

    Yes, Julia has a few options for making executables but they feel like an afterthought.
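    For comparison, a hedged sketch of one of those Julia options, PackageCompiler.jl (assuming a package directory `MyApp` that defines a `julia_main()::Cint` entry point; the names here are illustrative):

        using PackageCompiler

        # Ahead-of-time compile the package in ./MyApp into a standalone
        # application directory that bundles the Julia runtime.
        create_app("MyApp", "MyAppCompiled"; force = true)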

    15. MohamedMabrouk ◴[] No.45138513[source]
    * Compiling arbitrary Julia code into a native standalone binary (à la Rust/C++), with all its consequences.
    16. pjmlp ◴[] No.45138558{3}[source]
    See the new cuTile architecture in CUDA, designed from the ground up with Python in mind.
    17. adgjlsfhk1 ◴[] No.45138820{3}[source]
    Julia's GPU stack doesn't compile to C++. It compiles Julia straight to GPU assembly.
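    For illustration, a hedged sketch of inspecting the GPU assembly that CUDA.jl emits for a pure-Julia kernel (assuming CUDA.jl and a CUDA-capable device):

        using CUDA

        # A trivial kernel written in Julia.
        function scale!(xs, a)
            i = threadIdx().x
            @inbounds xs[i] *= a
            return nothing
        end

        xs = CUDA.rand(Float32, 256)

        # Print the PTX (GPU assembly) generated for this kernel.
        CUDA.@device_code_ptx @cuda threads=length(xs) scale!(xs, 2f0)
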
    18. Archit3ch ◴[] No.45143011[source]
    > Also, it appears to be more robust.

    Sure, Mojo the language is more robust. Until its investors decide to 10x the licensing Danegeld.

    19. bobajeff ◴[] No.45143187[source]
    I've looked into making Python modules with Julia, and it doesn't look like that is very well supported right now, whereas it's a core feature of Mojo.
    replies(1): >>45143720 #
    20. dunefox ◴[] No.45143720[source]
    Shouldn't something like this work? https://github.com/JuliaPy/PythonCall.jl
    replies(1): >>45144282 #
    21. bobajeff ◴[] No.45144282{3}[source]
    That might work, but I can't seem to find much information on using it to create a pip-installable module.