Most active commenters
  • jamespropp(4)

Is Matrix Multiplication Ugly?

(mathenchant.wordpress.com)
23 points by jamespropp | 12 comments
2. jamespropp No.46009709
Do you disagree with my take or think I’m missing Witt’s point? I’d be happy to hear from people who disagree with me.
replies(5): >>46010312 #>>46010402 #>>46010578 #>>46010720 #>>46010913 #
3. sfpotter No.46010281
I think this sentence:

> But matrix multiplication, to which our civilization is now devoting so many of its marginal resources, has all the elegance of a man hammering a nail into a board.

is the most interesting one.

A man hammering a nail into a board can be both beautiful and elegant! If you've ever seen someone effortlessly hammer nail after nail into wood, hardly thinking about what they're doing, you've seen a master craftsman at work. Speaking as a numerical analyst, I'd say a well-multiplied matrix is much the same. A great deal goes into how deftly a matrix might be multiplied. And just as someone can hammer a nail poorly, so too can a matrix be multiplied poorly. I would say the matrices being multiplied in service of training LLMs are not a particularly beautiful example of what matrix multiplication has to offer. The fast Fourier transform, viewed as a sparse matrix factorization of the DFT, with its concomitant numerical stability, might be a better candidate.

replies(1): >>46010745 #
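To make the FFT point above concrete, here is a minimal sketch (mine, not from the thread, using NumPy): applying the DFT as one dense matrix-vector product costs O(n^2), while the Cooley-Tukey recursion applies the very same linear map as a product of O(log n) sparse factors, for O(n log n) total work.

```python
import numpy as np

def dft_matrix(n):
    """Dense n x n DFT matrix: applying it is an O(n^2) matmul."""
    k = np.arange(n)
    return np.exp(-2j * np.pi * np.outer(k, k) / n)

def fft_recursive(x):
    """Cooley-Tukey radix-2 FFT: the same linear map, applied as a
    product of sparse factors. Assumes len(x) is a power of 2."""
    n = len(x)
    if n == 1:
        return x.astype(complex)
    even = fft_recursive(x[0::2])
    odd = fft_recursive(x[1::2])
    twiddle = np.exp(-2j * np.pi * np.arange(n // 2) / n)
    return np.concatenate([even + twiddle * odd, even - twiddle * odd])

x = np.random.default_rng(0).standard_normal(16)
dense = dft_matrix(16) @ x   # one big dense multiply, O(n^2)
fast = fft_recursive(x)      # sparse-factored multiply, O(n log n)
print(np.allclose(dense, fast))  # True: identical linear map
```

Both paths compute the same transform; only the factorization of the matrix differs.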
4. amelius No.46010312
Maybe the problem is that matrices are too general.

You can have very beautiful algorithms when you assume the matrices involved have a certain structure. You can even have that A*B == B*A, if A and B have a certain structure.

5. LegionMammal978 No.46010402
If the O(n^3) schoolbook multiplication were the best that could be done, then I'd totally agree that "it's simply the nature of matrices to have a bulky multiplication process". Yet there's a whole series of algorithms (from the Strassen algorithm onward) that use ever-more-clever ways to recursively batch things up and decrease the asymptotic complexity, most of which aren't remotely practical. And for all I know, it could go on forever down to O(n^(2+ε)). Overall, I hate not being able to get a straight answer for "how hard is it, really".
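For reference, a sketch of the Strassen recursion mentioned above (this Python formulation is mine and assumes square matrices whose size is a power of two): each split does 7 recursive multiplications instead of the schoolbook 8, giving O(n^log2(7)) ≈ O(n^2.807) overall.

```python
import numpy as np

def strassen(A, B):
    """Strassen's algorithm: 7 multiplications per 2x2 block split.
    Assumes A and B are square with power-of-two size."""
    n = A.shape[0]
    if n == 1:
        return A * B
    h = n // 2
    a, b, c, d = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    e, f, g, i = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    p1 = strassen(a, f - i)
    p2 = strassen(a + b, i)
    p3 = strassen(c + d, e)
    p4 = strassen(d, g - e)
    p5 = strassen(a + d, e + i)
    p6 = strassen(b - d, g + i)
    p7 = strassen(a - c, e + f)
    top = np.hstack([p5 + p4 - p2 + p6, p1 + p2])
    bottom = np.hstack([p3 + p4, p1 + p5 - p3 - p7])
    return np.vstack([top, bottom])

A = np.random.default_rng(2).standard_normal((8, 8))
B = np.random.default_rng(3).standard_normal((8, 8))
print(np.allclose(strassen(A, B), A @ B))  # True
```

The later, asymptotically faster algorithms the comment alludes to batch things even more aggressively, at the cost of enormous constants.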
6. djmips No.46010578
Ignore me then, because I agree with you. :) He sounds like someone who, upon first hearing jazz, complains that it's ugly.
7. veqq No.46010720
> sends the pair (x, y) to the pair (−x, y)

I know linear algebra, but this part seems profoundly unclear. What does "send" mean? Following with different examples in 2 by 2 notation only makes it worse. It seems like you're changing referents constantly.

replies(1): >>46010763 #
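For anyone else puzzled by the same phrase: in linear-algebra usage, a map "sends" an input to an output. The map sending (x, y) to (−x, y) is reflection across the y-axis, and (assuming the usual column-vector convention) it corresponds to multiplying by a fixed 2x2 matrix:

```python
import numpy as np

# "Sends (x, y) to (-x, y)" = the linear map reflecting a point
# across the y-axis; applying it is multiplication by this matrix.
M = np.array([[-1, 0],
              [ 0, 1]])
v = np.array([3, 5])  # the pair (x, y) = (3, 5)
print(M @ v)          # [-3  5], i.e. the pair (-x, y)
```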
8. jamespropp No.46010745
Yes!
9. jamespropp No.46010763
Thanks for pointing this out. I’ll work on this passage tomorrow.
10. fracus No.46010904
I think it is just a matter of perspective. You can both be right. I don't think there is an objective answer to this question.
11. johngossman No.46010913
I think you're right that the inelegant part is how AI seems to consist of endless loops of multiplication. I say this as a graphics programmer who realized years ago that all those beautiful images were just lots of MxN multiplies, and AI takes this to a whole new level. When I was in college, we were told that most computing resources were spent on linear programming. I wonder when that crossed over to graphics or AI (or some networking operation like SSL)?