
3883 points kuroguro | 1 comments | source
breakingcups ◴[] No.26296724[source]
It is absolutely unbelievable (and unforgivable) that a cash cow such as GTA V has a problem like this present for over 6 years and it turns out to be something so absolutely simple.

I do not agree with the sibling comment saying that this problem only looks simple and that we are missing context.

This online game mode alone made $1 billion in 2017.

Tweaking two functions to go from a load time of six minutes to less than two is something any developer worth their salt should be able to do in a codebase like this, given a good profiler.

Instead, someone with no source code managed to do this to an obfuscated executable loaded with anti-cheat measures.

The fact that this problem is caused by Rockstar's excessive microtransaction policy (the 10MB of JSON causing this bottleneck are all available microtransaction items) is the cherry on top.

(And yes, I might also still be salty because their parent company unjustly DMCA'd re3 (https://github.com/GTAmodding/re3), the reverse engineered version of GTA III and Vice City. A twenty-year-old game. Which wasn't even playable without purchasing the original game.)

replies(40): >>26296812 #>>26296886 #>>26296970 #>>26297010 #>>26297087 #>>26297123 #>>26297141 #>>26297144 #>>26297184 #>>26297206 #>>26297323 #>>26297332 #>>26297379 #>>26297401 #>>26297448 #>>26297480 #>>26297806 #>>26297961 #>>26298056 #>>26298135 #>>26298179 #>>26298213 #>>26298234 #>>26298624 #>>26298682 #>>26298777 #>>26298860 #>>26298970 #>>26299369 #>>26299512 #>>26299520 #>>26300002 #>>26300046 #>>26301169 #>>26301475 #>>26301649 #>>26301961 #>>26304727 #>>26305016 #>>26311396 #
crazygringo ◴[] No.26298624[source]
I imagine the conversation between the programmer(s) and management went exactly like this:

Management: So, what can we do about the loading times?

Programmer(s): That's just how long it takes to load JSON. After all, the algorithm/function couldn't be more straightforward. Most of the complaints are probably coming from older hardware, and with new PCs and next-gen consoles it probably won't be noticeable at all.

Management: OK, guess that's that then. Sucks but nothing we can do.

Management had no way of knowing whether this was true -- they have to trust what their devs tell them. And every time over the years someone asked "hey, why is loading so slow?" they got told "yeah, they looked into it when it was built; turns out there was no way to speed it up, so it's not worth looking into again."

And I'm guessing that while Rockstar's best devs are put on the really complex in-game performance stuff... their least experienced ones are put on stuff like... loading a game's JSON config from servers.

I've personally seen cases where the supposedly "easy" dev tasks are given to a separate team entirely, accountable to management directly instead of to the highly capable tech lead in charge of all the rest. I've got to assume that was basically the root cause here.

But I agree, this is incredibly embarrassing and unforgivable. Whatever chain of accountability allowed this to happen... goddamn, there's got to be one hell of an internal postmortem on this one.

replies(2): >>26298742 #>>26300530 #
CountHackulus ◴[] No.26298742[source]
I can pretty much guarantee that there was no discussion with management like that. From experience, a live ops game is essentially a perpetually broken code base that was rushed into production, followed by a breakneck release schedule for new features and monetization. I've personally had this conversation a few times:

Programmer: Loading times are really slow, I want to look into it next sprint.

Management: Feature X is higher priority, put it in the backlog and we'll get to it.

replies(7): >>26298776 #>>26298779 #>>26299089 #>>26299398 #>>26299585 #>>26305946 #>>26307872 #
trissylegs ◴[] No.26299585[source]
I once did get to look at optimising a bad load time... only because by that point it was crashing due to running out of memory (32-bit process).

In this case it was also JSON, but using .NET, so Newtonsoft is at least efficient. The issues were many cases of:

* Converting strings to lowercase to compare them, as the keys were case-insensitive. (I replaced this with StringComparison.OrdinalIgnoreCase.)

* Redundant Dictionaries.

* Using Dictionary<Key, Key> as a set; replaced with HashSet<Key>.

The data wasn't meant to be that big, but when a big client was migrated it ended up with 200MB of JSON. (If their data had been organised differently it would've been split across many JSON blobs instead.)

It would also be nice to handle it all as UTF-8, like System.Text.Json does. That would halve all the strings, saving a fair bit. (The JSON blob it starts with gets converted to UTF-16, because .NET strings are UTF-16.)