
3883 points kuroguro | 3 comments
comboy No.26296735
Holy cow. I'm a very casual gamer; I was excited about the game, but when it came out I decided I didn't want to wait that long and would hold off until they sorted it out. Two years later it still sucked, so I abandoned it. But... this?! This is unbelievable. I'm certain that many people left this game because of the loading times, and that's man-years wasted (in a rather different way than intended).

Parsing JSON?! I thought it was some session-finding network magic in the game logic. If this is true, it's the biggest WTF I've seen in the last few years, and we've just finished 2020.

Stunning work, given only the binary at hand. But how could R* not catch this? GTAV is so full of great engineering, but if loading was CPU-bottlenecked, who works there that wouldn't be irked enough to nail it down? Trying to understand what's going on inside when something takes far longer than expected seems natural even when performance isn't crucial. Here it was crucial; it almost directly translates to profits. Unbelievable.

1. jiggawatts No.26298680
> Parsing JSON?!

Many developers I have spoken to in the wild in my role as a consultant have wildly distorted mental models of performance, often off by many orders of magnitude.

They hear somewhere that "JSON is slow", which it is, but you and I know it's not this slow. "Slow" can span something like ten orders of magnitude, depending on context. Is it slow relative to a non-validating linear binary format? Yes. Is it minutes-slow for a trivial amount of data? No. But in their minds... it is, and there's "nothing" that can be done about it.

Speaking of which: an HTTPS REST API call using JSON encoding between two PaaS web servers in Azure is about 3-10ms. A local function call is 3-10ns. In other words, a lightweight REST call is a million times slower than a local function call, yet many people assume that a distributed mesh microservices architecture has only "small overheads"! Nothing could be further from the truth.

Similarly, a read from a mechanical drive is hundreds of thousands of times slower than main memory, which in turn is roughly a hundred times slower than L1 cache.

With ratios like that being involved on a regular basis, it's no wonder that programmers make mistakes like this...

2. salawat No.26299393
The funny thing is, as a long-time SDET, I had to give up trying to get people to write or architect in a more "local first" manner.

Everyone thinks the network is free... until it isn't. Every bit moved in a computer has a time cost, and yes, it's small... but when you have processors as fast as today's, it seems a sin that we delegate so much functionality to some other machine across a network boundary when the same work could be done locally. The reason why?

Monetizability and trust. All trivial computation must be done on my services so they can be metered and charged for.

We're hamstringing the programs we run for the sole reason that we don't want to make tools. We want to make invoices.

3. gridspy No.26308164
And like so many things, we're blind to how our economic systems are throwing sand in the gears of our technical ones.

I love your point that shipping a library (code executed locally) with a good API would outperform an online HTTPS API for almost all tasks.