
3883 points kuroguro | 39 comments
breakingcups ◴[] No.26296724[source]
It is absolutely unbelievable (and unforgivable) that a cash cow such as GTA V has a problem like this present for over 6 years and it turns out to be something so absolutely simple.

I do not agree with the sibling comment saying that this problem only looks simple and that we are missing context.

This online game mode alone made $1 billion in 2017.

Tweaking two functions to go from a load time of six minutes to less than two is something any developer worth their salt, equipped with a good profiler, should be able to do in a codebase like this.

Instead, someone with no access to the source code managed to do it against an obfuscated executable loaded with anti-cheat measures.

The fact that this problem is caused by Rockstar's excessive microtransaction policy (the 10MB of JSON causing this bottleneck are all available microtransaction items) is the cherry on top.

(And yes, I might also still be salty because their parent company unjustly DMCA'd re3 (https://github.com/GTAmodding/re3), the reverse engineered version of GTA III and Vice City. A twenty-year-old game. Which wasn't even playable without purchasing the original game.)
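For anyone wondering how 10MB of JSON can take minutes to parse: the reverse-engineering write-up attributed it to repeated `sscanf` calls, which on many C runtimes rescan the whole remaining buffer (an internal `strlen`) for every token. A minimal sketch of that quadratic pattern and a linear alternative — illustrative only, not Rockstar's actual code:

```cpp
#include <cstdio>
#include <string>

// Quadratic pattern: each sscanf call may rescan the remaining buffer
// (an internal strlen) before parsing one token, so reading n tokens
// out of an n-byte string can cost O(n^2).
size_t count_tokens_slow(const char* buf) {
    size_t count = 0;
    int consumed = 0;
    char token[64];
    while (std::sscanf(buf, "%63s%n", token, &consumed) == 1) {
        buf += consumed;  // advance past the token just read
        ++count;
    }
    return count;
}

// Linear fix: walk the buffer once, tracking position explicitly.
size_t count_tokens_fast(const std::string& buf) {
    size_t count = 0;
    bool in_token = false;
    for (char c : buf) {
        bool space = (c == ' ' || c == '\t' || c == '\n' || c == '\r');
        if (!space && !in_token) ++count;  // a new token starts here
        in_token = !space;
    }
    return count;
}
```

At 10MB of input, the difference between the two is roughly the difference between milliseconds and minutes.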

replies(40): >>26296812 #>>26296886 #>>26296970 #>>26297010 #>>26297087 #>>26297123 #>>26297141 #>>26297144 #>>26297184 #>>26297206 #>>26297323 #>>26297332 #>>26297379 #>>26297401 #>>26297448 #>>26297480 #>>26297806 #>>26297961 #>>26298056 #>>26298135 #>>26298179 #>>26298213 #>>26298234 #>>26298624 #>>26298682 #>>26298777 #>>26298860 #>>26298970 #>>26299369 #>>26299512 #>>26299520 #>>26300002 #>>26300046 #>>26301169 #>>26301475 #>>26301649 #>>26301961 #>>26304727 #>>26305016 #>>26311396 #
1. nikanj ◴[] No.26297332[source]
The old maxim of "Premature optimization is the root of all evil" has over time evolved to "If you care one iota about performance, you are not a good programmer".
replies(10): >>26297445 #>>26297456 #>>26297528 #>>26298013 #>>26298281 #>>26298654 #>>26299400 #>>26300250 #>>26304073 #>>26313590 #
2. ◴[] No.26297445[source]
3. vendiddy ◴[] No.26297456[source]
And fwiw, the full quote is:

We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

Yet we should not pass up our opportunities in that critical 3%.

replies(2): >>26297809 #>>26297927 #
4. ◴[] No.26297528[source]
5. blowski ◴[] No.26297809[source]
Also, it’s not “never optimise”. It’s “only optimise once you’ve identified a bottleneck”. I guess in a profit-making business you only care about bottlenecks that are costing money. Perhaps this one isn’t costing money.
replies(5): >>26297885 #>>26297926 #>>26298312 #>>26299305 #>>26299394 #
6. skzo ◴[] No.26297885{3}[source]
ding ding ding ding
7. saagarjha ◴[] No.26297926{3}[source]
This one isn’t measured to be costing money. Trying to figure out how much money you’re losing as a result of performance problems is extremely difficult to do.
8. nicbou ◴[] No.26297927[source]
Then again, using the correct data structure is not really optimisation. I usually think of premature optimisation as unnecessary effort, but using a hash map isn't it.
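For context, the write-up's second fix was exactly this kind of data-structure choice: the game deduplicated entries with a linear scan over everything inserted so far, where a hash-based membership test does the same job in linear total time. A generic sketch (names are mine, not from the game code):

```cpp
#include <string>
#include <unordered_set>
#include <vector>

// O(n^2): for each new entry, scan everything inserted so far.
std::vector<std::string> dedup_slow(const std::vector<std::string>& items) {
    std::vector<std::string> out;
    for (const auto& item : items) {
        bool seen = false;
        for (const auto& existing : out)
            if (existing == item) { seen = true; break; }
        if (!seen) out.push_back(item);
    }
    return out;
}

// O(n) expected: membership test via a hash set.
std::vector<std::string> dedup_fast(const std::vector<std::string>& items) {
    std::vector<std::string> out;
    std::unordered_set<std::string> seen;
    for (const auto& item : items)
        if (seen.insert(item).second)  // insert() reports whether item was new
            out.push_back(item);
    return out;
}
```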
replies(4): >>26298331 #>>26298348 #>>26298402 #>>26304254 #
9. smolder ◴[] No.26298013[source]
That belief is getting a bit outdated now that computing efficiency is hitting walls. Even when compute is cheaper than development, you're still making a morally suspect choice to pollute the environment rather than do useful work if you spend $100k/yr on servers instead of $120k/yr on coding. Of course, when the time and energy saved are insignificant compared to the development expense, you shouldn't be fussing with performance.

I don't think the anti-code-optimization dogma will go away, but good devs already know optimality is multi-dimensional and problem-specific, and performance implications are always worth considering. Picking your battles is important; never fighting them, and never learning how to, is not the same thing.

replies(1): >>26298542 #
10. ehsankia ◴[] No.26298281[source]
That doesn't really apply here. I don't even play GTA V, but the #1 complaint I've heard for the past 6 years is that the load times are the worst thing about the game. Once something is known to be the biggest bottleneck in the enjoyment of your game, fixing it is no longer "premature optimization". The whole point of that saying is that you should first make things, then optimize the things that bring the most value. The load time is one of the highest-value things you can cut down on. And the fact that these two low-hanging fruit made such a big difference tells me they never gave it a single try in the past 6 years.
replies(2): >>26298514 #>>26299480 #
11. CogitoCogito ◴[] No.26298312{3}[source]
Perhaps this one is costing them a lot of money.
12. CogitoCogito ◴[] No.26298331{3}[source]
My belief is that your first goal should be cognitive optimization, i.e. make it simple and clear. That includes using hash maps when a hash map is the proper data structure, since that's what the base design calls for.

The next step is to optimize away from the cognitively optimal, but only when necessary. So yeah it’s really crazy this was ever a problem at all.

replies(1): >>26300662 #
13. marcosdumay ◴[] No.26298348{3}[source]
If it doesn't change semantics, it's optimization.

It's just a zero-cost one, so if anybody complains that taking care to choose the correct data structure is premature (yeah, it once happened to me too), that person is just stupid. But not calling it an optimization will only add confusion and distrust.

14. rodgerd ◴[] No.26298402{3}[source]
This is the key point: premature optimization would be e.g. denormalising a database because you think you might have a performance problem at some point, and breaking the data model for no good reason.

Here the wrong data structure has been used in the first place.

15. djmips ◴[] No.26298514[source]
Sure it does apply. These complaints came out after the game was released. They should have optimized this before release, back when they designed the system. However, that's considered premature optimization, when in fact skipping it is just bad design.
replies(1): >>26300730 #
16. djmips ◴[] No.26298542[source]
I agree 100% - the whole cheery lack of care around optimization, to the point of it becoming 'wisdom', could only have happened against the backdrop of huge year-on-year gains in computing power.

Still, people applying optimizations that sacrifice maintainability for very little gain, or that increase bugs, are still doing a disservice. People who understand data flow and design optimal systems from the get-go are where it's at.

17. GhostVII ◴[] No.26298654[source]
The problem here isn't a lack of optimization, it's a lack of profiling. Premature optimization is a problem because you will waste time and create more complex code optimizing in places that don't actually need it, since it's not always intuitive what your biggest contributors to inefficiency are. Instead of optimizing right away, you should profile your code and figure out where you need to optimize. The problem is that they didn't do that.
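The "profile first" step doesn't have to be heavyweight, either; even crude wall-clock timing around suspect phases (load, parse, insert) narrows things down before reaching for a real profiler. A minimal helper, as a sketch:

```cpp
#include <chrono>
#include <utility>

// Crude wall-clock timing of a code region -- the "measure before you
// optimize" step when a full profiler isn't at hand.
template <typename F>
double time_ms(F&& fn) {
    auto start = std::chrono::steady_clock::now();
    std::forward<F>(fn)();
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(end - start).count();
}
```

Usage would be something like `double ms = time_ms([]{ /* phase you suspect */ });`. A sampling profiler is still the right tool for finding the hotspots you didn't suspect.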
replies(1): >>26302637 #
18. nmfisher ◴[] No.26299305{3}[source]
Precisely. Hasn't GTA Online done over a billion in revenue?

Given how incredibly successful it's been, it's conceivable the suits decided the opportunity cost of investing man-hours to fix the issue was too high, and that effort would be better spent elsewhere.

replies(2): >>26299338 #>>26384404 #
19. Dylan16807 ◴[] No.26299338{4}[source]
They're making lots of money but the ridiculous load times absolutely cost them money. It's not worth an unlimited amount of dev time to fix, but they definitely should have assigned a dev to spend one day estimating how hard fixes would be.
replies(4): >>26300074 #>>26300343 #>>26302375 #>>26302586 #
20. astrange ◴[] No.26299394{3}[source]
Not all performance problems have obvious "bottlenecks", some are entirely second-order effects.

For instance, everyone tells you not to optimize CPU time outside of hotspots, but with memory usage even a brief peak of memory usage in cold code is really bad for system performance because it'll kick everything else out of memory.

21. 29athrowaway ◴[] No.26299400[source]
I am not sure I would call that "evolution".
22. anotherfish ◴[] No.26299480[source]
We used to start up Quake while we waited, then we'd forget about GTAO. Later we'd discover GTA had kicked us out for being idle too long. Then we'd just close it.

That should be embarrassing for Rockstar but I don't think they would even notice.

23. nmfisher ◴[] No.26300074{5}[source]
That's also possible. I haven't played it myself, so I really can't comment.
24. branko_d ◴[] No.26300250[source]
I think this part of Knuth's quote is central:

> Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered.

And he is explicitly advocating for optimizing the critical part:

> Yet we should not pass up our opportunities in that critical 3%.

And somehow, people have latched onto the catchphrase about "early optimization" that was taken out of context.

25. blowski ◴[] No.26300343{5}[source]
Arguably, it could actually make them money, since it provides a window of free advertising. I have no data either way, but I wouldn’t assume long load screens are necessarily bad for business.
replies(1): >>26308674 #
26. yxhuvud ◴[] No.26300662{4}[source]
OTOH, at some point what is good from a cognitive viewpoint depends on what idioms you use. So it can be helpful to actively choose which idioms you use and make certain not to use those that tend to blow up performance-wise.
replies(1): >>26301746 #
27. ehsankia ◴[] No.26300730{3}[source]
> while they even designed the system

See, that's exactly why you're wrong. This wasn't a bad "design". If the fix had required rebuilding the whole thing from scratch, then you would have a point, and thinking about it "prematurely" would've been a good idea. In this case, both fixes were bugs that could be addressed after the game was finished without undoing much of the work done.

The whole point of the saying is that you don't know what's going to be a bottleneck until afterwards. Yes, by optimizing prematurely you would've caught those two bugs early, but you would've also spent time analyzing a bunch of other things that didn't need to be analyzed. Whereas if you analyze it at the end, once people complain about it being slow, you spend only the minimum time necessary on fixing the things that matter.

replies(1): >>26301852 #
28. CogitoCogito ◴[] No.26301746{5}[source]
Given the current code example, I don't even think idioms come into it. Here a hash map was called for regardless of whatever idioms appear in other parts of the design. This is just fundamental and has nothing to do with functional programming, OOP, etc.
29. jcelerier ◴[] No.26301852{4}[source]
> Whereas if you analyze it at the end once people complain about it being slow

I think that we should also stop doing crash tests in cars. Just release the car to the public and analyze human crashes afterwards.

replies(2): >>26307615 #>>26310025 #
30. rjmunro ◴[] No.26302375{5}[source]
It's costing them money just in the time wasted by their own internal QAs and devs loading the game to test it.
31. danlugo92 ◴[] No.26302586{5}[source]
N=1 but one of the reasons I don't partake in modern gaming is ridiculous loading times and mandatory update sizes.
32. Cthulhu_ ◴[] No.26302637[source]
I'd like to add: while GTA is running I'm really impressed by its performance, even / especially on the PS3; you could drive or fly at high speed through the whole level, see for miles and never hit a loading screen. It is a really well-optimized game, and that same level of optimization carries over into RDR2.

Which makes the persistent loading issue all the weirder.

33. bborud ◴[] No.26304073[source]
I went back and had a look at that maxim a few years ago and found that it actually doesn't say what many people claim it says. And it definitely doesn't support the blanket excuse for slow code that it has, to some degree, always been used as.

The reason for the misunderstanding is that the kinds of practices it actually talks about are uncommon today. People would often take code written in higher-level languages and reimplement it in assembler or machine code, which made it more time-consuming to change and evolve.

It also isn't hard to figure out which part of a piece of software is taking up your runtime these days. All worthwhile languages have profilers, mostly free ones, so there is zero excuse for not knowing what to optimize. Heck, it isn't at all uncommon for people to run profiling in production.

Also, it isn't as if you can't know ahead of time which bits need to be fast. Usually you have some idea, so you will know what to benchmark. Long startup times probably won't kill you, but when they are so long that they become a UX issue, it wouldn't have killed them to have a look.

34. bborud ◴[] No.26304254{3}[source]
Exactly. And not only being able to pick the correct data structure for the problem, (and possibly the correct implementation), but being able to know what is going to need attention even before a single line of code is written.

Most of my optimization work is still done with pencil, paper and pocket calculator. Well, actually, most of the time you won't even need a calculator.

35. handoflixue ◴[] No.26307615{5}[source]
You do understand that "my entertainment takes 6 minutes to load" is a very different problem from "my essential transportation kills people"? And that it therefore calls for a different approach?
36. Dylan16807 ◴[] No.26308674{6}[source]
They don't need more than a full minute of advertising every load.
37. ehsankia ◴[] No.26310025{5}[source]
Putting aside the bad analogy that the other comment covers, the overall point I was making also covers doing the optimization before game release, but after the game is more or less complete. Rockstar optimizing during the final few months of playtest wouldn't be premature optimization.

Premature optimization is about trying to optimize the micro (a given function), while you don't yet have the macro (the game itself). If the function accounts for 0.1% of the performance of the final game, it doesn't matter if you make it 5x faster since that will only make the game 0.08% faster. You could've spent that time optimizing a different function that accounts for 10% of the performance, and even optimizing that function by 5% would make the game 0.5% faster, which is more impactful.
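The percentages above are just the standard speedup arithmetic; as a one-line sketch (the function name is mine, not from the thread):

```cpp
// Fraction of total runtime saved when a component taking `share` of the
// total is made `speedup` times faster (Amdahl-style bookkeeping).
double saved_fraction(double share, double speedup) {
    return share * (1.0 - 1.0 / speedup);
}
```

`saved_fraction(0.001, 5.0)` gives roughly 0.0008, i.e. the 0.08% figure above, while a component with a 10% share made 5% faster (speedup 1/0.95) saves 0.5% overall.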

38. imtringued ◴[] No.26313590[source]
Back in the day, when people talked about premature optimization it was about trivial things, like people asking on Stack Overflow whether a++, ++a or a += 1 is faster. Worrying about that is obviously a net loss, since it ultimately, literally doesn't matter. And if it does matter to you, you are already an expert in the subject and should just benchmark your code.
39. ric2b ◴[] No.26384404{4}[source]
Where does the belief that corporations are somehow perfect at doing cost-benefit analysis come from? Has anyone worked in a place where that seemed to be the case?

We're talking about an issue that has been loudly complained about for 7 years (and I am evidence that it makes people play the game much less often than they would like to), one that a person without access to the source code was able to identify and fix relatively quickly. It would be surprising if it took a full-time engineer even a week to find this with access to the source code.

This is one of the most profitable games ever, they could hire an entire team to track this down for half a year and it wouldn't even be noticeable on their balance sheet. And I would bet money that it would have increased their active player base (which they care about because of the micro-transactions) in a noticeable way, as long as players were aware of the improvement so they would try it again.