
3883 points by kuroguro | 2 comments
breakingcups No.26296724
It is absolutely unbelievable (and unforgivable) that a cash cow such as GTA V has had a problem like this for over 6 years, and that it turns out to be something so absolutely simple.

I do not agree with the sibling comment saying that this problem only looks simple and that we are missing context.

This online game mode made $1 billion in 2017 alone.

Tweaking two functions to go from a load time of six minutes to less than two is something any developer worth their salt should be able to do in a codebase like this, equipped with a good profiler.

Instead, someone without access to the source code managed to do this to an obfuscated executable loaded with anti-cheat measures.

The fact that this problem is caused by Rockstar's excessive microtransaction policy (the 10MB of JSON causing this bottleneck is the catalogue of all available microtransaction items) is the cherry on top.
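
For anyone who hasn't read the writeup: the two fixes were (roughly) caching the string length so sscanf stops re-scanning the whole 10MB buffer for every token, and replacing a linear duplicate check over the growing item array with a hash lookup. Here's a rough, self-contained sketch of the first problem -- not Rockstar's actual code, just the same pattern in plain C. Many C runtimes effectively strlen() the remaining input on every sscanf call, so a token-at-a-time sscanf loop over a big buffer goes quadratic:

    /* Illustrative only: parse a big whitespace-separated number list
       two ways. The data and sizes here are made up. */
    #include <stdio.h>
    #include <stdlib.h>

    /* Quadratic in practice: on common C runtimes, sscanf measures the
       whole remaining buffer on every call before parsing anything. */
    static long sum_slow(const char *buf)
    {
        long sum = 0;
        int value, consumed;
        while (sscanf(buf, "%d%n", &value, &consumed) == 1) {
            sum += value;
            buf += consumed;
        }
        return sum;
    }

    /* Linear: strtol walks the buffer once, no hidden length scan. */
    static long sum_fast(const char *buf)
    {
        long sum = 0;
        char *end;
        for (;;) {
            long value = strtol(buf, &end, 10);
            if (end == buf)
                break;
            sum += value;
            buf = end;
        }
        return sum;
    }

    int main(void)
    {
        /* Stand-in for the ~10MB item JSON: a few hundred thousand tokens. */
        const int n = 200000;
        char *buf = malloc((size_t)n * 8 + 1);
        if (!buf)
            return 1;
        size_t off = 0;
        for (int i = 0; i < n; i++)
            off += (size_t)sprintf(buf + off, "%d ", i % 1000);
        buf[off] = '\0';

        printf("slow: %ld\n", sum_slow(buf)); /* noticeably slower at this size */
        printf("fast: %ld\n", sum_fast(buf));
        free(buf);
        return 0;
    }

Same input, same output, wildly different running time once the buffer gets big -- which is the six-minutes-to-under-two difference in miniature. The second fix (duplicate detection) is the same lesson: a linear scan per insert over tens of thousands of entries is another accidental O(n^2).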

(And yes, I might also still be salty because their parent company unjustly DMCA'd re3 (https://github.com/GTAmodding/re3), the reverse engineered version of GTA III and Vice City. A twenty-year-old game. Which wasn't even playable without purchasing the original game.)

crazygringo No.26298624
I imagine the conversation between the programmer(s) and management went exactly like this:

Management: So, what can we do about the loading times?

Programmer(s): That's just how long it takes to load JSON. After all, the algorithm/function couldn't be more straightforward. Most of the complaints are probably coming from older hardware. And with new PC's and next-gen consoles it probably won't be noticeable at all.

Management: OK, guess that's that then. Sucks but nothing we can do.

Management had no way of knowing whether this was true or not -- they have to trust what their devs tell them. And every time over the years someone asked "hey, why is loading so slow?" they got told "yeah, they looked into it when it was built, turns out there was no way to speed it up, so it's not worth looking into again."

And I'm guessing that while Rockstar's best devs are put on the really complex in-game performance stuff... their least experienced ones are put on stuff like... loading a game's JSON config from servers.

I've personally seen cases in the past where the supposedly "easy" dev tasks are given to a separate team entirely, accountable directly to management instead of to the highly capable tech lead in charge of everything else. I've got to assume something like that was the root cause here.

But I agree, this is incredibly embarrassing and unforgivable. Whatever chain of accountability allowed this to happen... goddamn, there's got to be one hell of an internal postmortem on this one.

CountHackulus No.26298742
I can pretty much guarantee that there was no discussion with management like that. From experience, a live-ops game is essentially a perpetually broken codebase that was rushed into production, followed by a breakneck release schedule of new features and monetization. I've personally had this conversation a few times:

Programmer: Loading times are really slow, I want to look into it next sprint.

Management: Feature X is higher priority, put it in the backlog and we'll get to it.

Shish2k No.26299398
At my last job I had that conversation several times :( Our website would regularly take 30s+ to load a page, and we had an hour of scheduled downtime each week, because that’s how long it took the webapp to restart every time code was pushed. “Scheduled downtime doesn’t count as downtime, we still have the three nines our SLA requires, and there’s nothing in the SLA about page load times. Now get back to building that feature which Sales promised a client was already finished”...
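
(To put numbers on it: three nines allows roughly 8.8 hours of downtime per year, while an hour of restarts every week is about 52 hours a year -- the whole claim only held up because of that “scheduled doesn’t count” carve-out.)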

Aside from being generally shameful, the real kicker was that this was a "website performance & reliability" company x__x

StillBored No.26299882
Reminds me of working for a company in the 1990s that ran constant television ads convincing everyone its experts could fix their networking problems. OTOH, for those of us working there as engineers, the file share used for editing code, building, etc. died for an hour or two seemingly every day when the network took its daily vacation.

Many of us, after having run out of other crap to do, would sit around and wonder if the "B" grade network engineers were assigned to run the company LAN, or if the ones we sent onsite were just as incompetent.

dragonwriter No.26299929
> Many of us, after having run out of other crap to do, would sit around and wonder if the "B" grade network engineers were assigned to run the company LAN

Internal IT is almost invariably a cost center, while the technicians providing the service you sell to customers work in a profit center. So, yeah, probably that, plus being managed in a way that focused on minimizing cost rather than maximizing internal customer satisfaction or other quality metrics.

imtringued No.26313559
Just because you are minimizing costs doesn't mean you have to keep cutting until you no longer get the quality you need.