193 points ingve | 2 comments
michalsustr:
I’m not familiar with Haskell concurrency. The combination of green threads and large memory allocations due to immutable data structures sounds like it would make it hard to implement a web server handling 10k+ concurrent requests on commodity hardware?

Btw., too bad the author talks about microsecond-guarantee use cases but does not provide a link; that would be interesting reading.

whateveracct:
It doesn't actually have "large memory allocations" due to immutable data structures. That's a meme, and it isn't true. Immutable data structures, especially at small scale, do not carry huge performance penalties. You don't copy the entire structure over and over; you copy only the O(log n) spine.
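For what it's worth, here's a minimal sketch of that sharing using Data.Map from the containers package (a balanced binary tree; the keys and values are just illustrative):

    import qualified Data.Map.Strict as Map

    main :: IO ()
    main = do
      let m0 = Map.fromList [(k, k * 2) | k <- [1 .. 1000 :: Int]]
          -- "Updating" m0 does not copy the whole tree: m1 shares
          -- every untouched subtree with m0, and only the O(log n)
          -- spine from the root down to key 500 is freshly allocated.
          m1 = Map.insert 500 0 m0
      print (Map.lookup 500 m0)  -- Just 1000: the original is unchanged
      print (Map.lookup 500 m1)  -- Just 0

Both versions stay fully usable after the insert, which is the whole point of persistent data structures.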

Haskell's generational GC is also fast when you are mostly producing short-lived garbage (collection cost scales with live data, not with total allocation), which is inherently true for web server handlers.

butterisgood:
Deforestation helps with that.

Composing a catamorphism (a fold) with an anamorphism (an unfold) can eliminate a lot of the in-between allocations; the fused form is called a hylomorphism.

Basically, it looks like you're building a ton of intermediate structure and then consuming it, meaning much of the in-between stuff can be eliminated, as in the sketch below.
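A hand-rolled sketch of the idea (the names hylo and sumTo are mine, not from the thread): the unfold produces elements and the fold consumes them in the same pass, so the intermediate list is never allocated:

    -- A hylomorphism: unfold a seed into elements and fold them
    -- in a single pass, so the intermediate list that a separate
    -- unfold-then-fold would build is never materialized.
    hylo :: (b -> Maybe (a, b)) -> (a -> c -> c) -> c -> b -> c
    hylo coalg alg z = go
      where
        go seed = case coalg seed of
          Nothing         -> z
          Just (a, seed') -> alg a (go seed')

    -- Behaves like sum (unfoldr step n), minus the list.
    sumTo :: Int -> Int
    sumTo = hylo step (+) 0
      where
        step 0 = Nothing
        step k = Just (k, k - 1)

    main :: IO ()
    main = print (sumTo 100000)  -- 5000050000

GHC's foldr/build fusion does the same kind of thing automatically for many list pipelines, e.g. sum (map (*2) [1..n]) can compile down to a tight loop with no intermediate list.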

Interesting optimizations, and a little mind-blowing when you see them.