
I don't like NumPy

(dynomight.net)
480 points MinimalAction | 2 comments
ChrisRackauckas No.43998345
One of the reasons why I started using Julia was because the NumPy syntax was so difficult. Going from MATLAB to NumPy I felt like I suddenly became a mediocre programmer, spending less time on math and more time on "performance engineering" of just trying to figure out how to use NumPy right. Then when I went to Julia it made sense to vectorize when it felt good and write a loop when it felt good. Because both are fast, focus on what makes the code easiest to read and understand. This blog post encapsulates exactly that experience and feeling.
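To make the tradeoff concrete, here is a minimal sketch in NumPy of the same computation written both ways (the sizes and names are illustrative, not from the comment): in NumPy only the vectorized form runs at native speed, whereas in Julia or JIT-compiled MATLAB both forms would be fast.

```python
import numpy as np

x = np.arange(100_000, dtype=np.float64)

# Vectorized: one call into optimized C code.
vectorized = np.sum(x * x)

# Explicit loop: identical math, but each iteration pays
# Python interpreter overhead, so it runs orders of magnitude slower.
looped = 0.0
for v in x:
    looped += v * v
```

Both produce the same number; only the performance differs, which is why NumPy pushes you to contort code into the vectorized form.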

Also, treating things like `np.linalg.solve` as a black box that is the fastest thing in the world and you could never do any better, so please mangle your code to call it correctly... that's just wrong. There are many reasons to build problem-specific linear algebra kernels, and that's something that is inaccessible without going deeper. But that's a different story.
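A small sketch of what a problem-specific kernel can buy you (the setup is hypothetical): when the matrix is known to be lower-triangular, forward substitution solves the system in O(n²), while the generic `np.linalg.solve` runs a full O(n³) LU factorization because it cannot assume that structure.

```python
import numpy as np

def forward_substitution(L, b):
    """Solve L x = b for lower-triangular L in O(n^2)."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n):
        # Only the already-computed entries x[:i] are needed.
        x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
    return x

rng = np.random.default_rng(0)
n = 200
# A well-conditioned lower-triangular test matrix.
L = np.tril(rng.standard_normal((n, n))) + n * np.eye(n)
b = rng.standard_normal(n)

x_specific = forward_substitution(L, b)
x_generic = np.linalg.solve(L, b)   # same answer, more work
```

(In practice you would call an optimized triangular solver rather than a Python loop; the point is that exploiting structure beats the general-purpose black box.)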

1. amluto No.43999507
Is MATLAB materially different? Loops are slow (how slow depends on the version), and the fastest thing in the world is the utter perfection of the black box called '\'.
2. moregrist No.44000914
The last time I seriously used Matlab, over 10 years ago, they had implemented JIT compilation and often straightforward loops were a lot faster than trying to vectorize. And definitely less error-prone.

Iirc, ‘\’ was just shorthand for solving a system of equations, which you could mentally translate into the appropriate LAPACK function without loss of precision. You did have to be a little more careful about making extra copies than in Fortran or C (or even Python). But nothing was magic.
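That mental translation can be sketched in Python (a rough analogue of the common cases of MATLAB's `\`, not a full reimplementation): a square system maps to an LU-based solve (LAPACK `gesv` under `np.linalg.solve`), and an overdetermined one to least squares (`np.linalg.lstsq`).

```python
import numpy as np

def backslash(A, b):
    """Rough Python analogue of MATLAB's A \\ b (illustrative only)."""
    m, n = A.shape
    if m == n:
        # Square system: LU factorization via LAPACK gesv.
        return np.linalg.solve(A, b)
    # Overdetermined (or underdetermined): least-squares solution.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = backslash(A, b)
```

MATLAB's real `\` inspects more structure than this (triangular, symmetric, sparse, banded cases each dispatch differently), but none of it is magic, which is the commenter's point.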