
Scala 3 slowed us down?

(kmaliszewski9.github.io)
261 points by kmaliszewski | 6 comments
munchler ◴[] No.46183417[source]
I’m not familiar with Scala’s macro system, but it seems like a big takeaway here is: Be careful with code that invokes the compiler (JIT) at runtime. That seems like it’s asking for trouble.
replies(1): >>46183886 #
dtech ◴[] No.46183886[source]
Macros are compile-time; there is no runtime codegen.

The problem was overly frequent inlining generating enormous expressions, which caused a lot of JIT work and slow execution.
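
(For a sense of the mechanics, here is a minimal Scala 3 sketch, hypothetical and not taken from the article: inline def copies its body into every call site at compile time, so layered inline calls multiply the generated code.)

    // Hypothetical sketch (not from the article): every call site of `decode`
    // gets a full copy of its body at compile time, so nested inline calls
    // multiply the generated bytecode.
    object InlineBlowup:
      inline def decode(inline field: String): String =
        if field.isEmpty then "<empty>" else field.trim.toLowerCase

      // After inlining, this method contains three expanded copies of `decode`
      // rather than three small calls. With large derived codecs the expansion
      // can grow past what the JIT will compile or inline further.
      def decodeRecord(a: String, b: String, c: String): (String, String, String) =
        (decode(a), decode(b), decode(c))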

replies(2): >>46183957 #>>46185245 #
1. munchler ◴[] No.46183957[source]
Thank you for the clarification. If I understand correctly, these large expressions are created at compile time, but the impact isn't felt until the JIT runs in the production environment. In that scenario, shouldn't the JIT just run once at startup, though? I'm still not quite seeing how the JIT can take so much time in production.
replies(1): >>46185558 #
2. hunterpayne ◴[] No.46185558[source]
Because the JIT will let the unoptimized code run a few (hundred) times to take measurements of what needs to be optimized and how. This is a good design and is what makes HotSpot so effective. The problem is that the compilation happens seconds or minutes into the operation of the service, at an unpredictable point. So every time you run the service there is a big pause and a performance hit at a random moment. The upside is that this only happens once, but you have to plan for a big performance hit to the requests that are unlucky enough to arrive at the wrong time.
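
(To make the warmup effect concrete, here is a hypothetical Scala sketch, not from the thread: the same work is timed in batches, and on HotSpot the early batches typically run slower until the method has been profiled and compiled.)

    // Hypothetical demo: time batches of calls to a hot method. On HotSpot the
    // first batches run interpreted/profiled and are noticeably slower; once
    // the optimizing compiler kicks in, later batches speed up.
    object WarmupDemo:
      def work(n: Int): Long =
        var acc = 0L
        var i = 0
        while i < n do
          acc += (acc ^ i) * 31
          i += 1
        acc

      def main(args: Array[String]): Unit =
        var sink = 0L
        for batch <- 1 to 10 do
          val start = System.nanoTime()
          var j = 0
          while j < 10000 do
            sink += work(1000)
            j += 1
          println(s"batch $batch: ${(System.nanoTime() - start) / 1000000} ms")
        println(sink) // keep the result live so the loop isn't eliminated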
replies(2): >>46185951 #>>46187101 #
3. pretzellogician ◴[] No.46185951[source]
And this can generally be avoided by doing a "warmup" when starting your service (effectively, mocking some calls) before accepting requests.
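
(A minimal sketch of that idea, with placeholder names; decodeRequest and startServer are not a real API: push synthetic requests through the hot path before the service starts listening, so the JIT has already compiled it.)

    // Hypothetical warmup routine: exercise the hot path with synthetic input
    // before accepting real traffic, so HotSpot has already profiled and
    // compiled it. `decodeRequest` and `startServer` are placeholders.
    object Warmup:
      def decodeRequest(payload: String): Int = payload.length // stand-in hot path

      def warmUp(iterations: Int = 20000): Unit =
        var sink = 0
        var i = 0
        while i < iterations do
          sink += decodeRequest(s"""{"id":$i,"name":"warmup"}""")
          i += 1
        if sink == -1 then println(sink) // keep the loop from being optimized away

      def main(args: Array[String]): Unit =
        warmUp()         // trigger JIT compilation before going live
        // startServer() // only then begin accepting real requests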
replies(1): >>46186185 #
4. hunterpayne ◴[] No.46186185{3}[source]
Of course, but then you have to actually do this; it's just another piece of complexity to add. Also, I was answering a question about the hows and whys of the JIT, not saying it was impossible to work around.
5. munchler ◴[] No.46187101[source]
Ah, that’s interesting. I wasn’t aware that the JIT does that sort of performance analysis first. Thank you for the explanation.
replies(1): >>46211204 #
6. still_grokking ◴[] No.46211204{3}[source]
It won't in general.

Doing so is a feature of high-end VM runtimes, such as state-of-the-art JVMs or JS runtimes.