
186 points darkolorin | 7 comments

We wrote our inference engine in Rust; it is faster than llama.cpp in all of our use cases. Your feedback is very welcome. It was written from scratch with the idea that you can add support for any kernel and platform.
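The "add support for any kernel and platform" claim suggests a backend abstraction behind a common interface. A minimal sketch of that idea in Rust, using a hypothetical `Kernel` trait and `CpuKernel` backend (none of these names come from the actual project):

```rust
// Hypothetical sketch of a pluggable-kernel abstraction; the real engine's
// trait names and operator signatures are not shown in this thread.
trait Kernel {
    fn name(&self) -> &str;
    // A single op (matrix-vector product) standing in for a full op set.
    fn matvec(&self, weights: &[f32], x: &[f32]) -> Vec<f32>;
}

// A naive CPU backend; a Metal or CUDA backend would implement the same trait.
struct CpuKernel;

impl Kernel for CpuKernel {
    fn name(&self) -> &str {
        "cpu"
    }

    fn matvec(&self, weights: &[f32], x: &[f32]) -> Vec<f32> {
        // Row-major layout: each chunk of x.len() weights is one matrix row.
        weights
            .chunks(x.len())
            .map(|row| row.iter().zip(x).map(|(w, v)| w * v).sum())
            .collect()
    }
}

fn main() {
    // The engine would dispatch through `dyn Kernel`, so adding a platform
    // means adding one impl, not touching the inference loop.
    let k: Box<dyn Kernel> = Box::new(CpuKernel);
    // 2x2 identity matrix times [3, 4] -> [3, 4].
    let y = k.matvec(&[1.0, 0.0, 0.0, 1.0], &[3.0, 4.0]);
    println!("{} -> {:?}", k.name(), y);
}
```

The design choice this illustrates: trait objects give runtime backend selection at the cost of dynamic dispatch per call, which is negligible when each call is a large kernel launch.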
1. cwlcwlcwlingg ◴[] No.44571918[source]
Wondering why they used Rust rather than C++.
replies(5): >>44572202 #>>44573216 #>>44574364 #>>44574476 #>>44576525 #
2. adastra22 ◴[] No.44572202[source]
Why use C++?
replies(1): >>44576559 #
3. bee_rider ◴[] No.44573216[source]
I wonder why they didn’t use Fortran.
4. giancarlostoro ◴[] No.44574364[source]
...or D? Or Go? Java? C#? Zig? etc. They chose what they were most comfortable with. Rust is fine; it's clearly not for everyone, but those who use it produce high-quality software. I would argue the same for Go, without all the unnecessary mental overhead of C or C++.
5. outworlder ◴[] No.44574476[source]
Why use C++ for greenfield projects?
6. khurs ◴[] No.44576525[source]
The recommendation from the security agencies is to prefer Rust over C++, as there is less risk of exploits.

Checked: llama.cpp uses C++ (obviously) and Ollama uses Go.

7. khurs ◴[] No.44576559[source]
So C++ users don't need to learn something new.