while (this->m_fBladeAngle > 6.2831855) { this->m_fBladeAngle = this->m_fBladeAngle - 6.2831855; }
Like, "let's just write a while loop that could turn into an infinite loop coz I'm too lazy to do a division"
But knowing they managed to blow up GTA5's loading time by 5 minutes just by parsing JSON with sscanf, I don't have much hope.
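For anyone who missed the writeup: the sscanf issue was reportedly that each sscanf call rescans the remaining input from the start on common C runtimes (effectively a strlen per call), so parsing a big blob token by token goes quadratic. A rough sketch of that pattern, where the generated buffer is just a stand-in for the real JSON:

    #include <cstdio>
    #include <string>

    int main() {
        // Stand-in for the real multi-megabyte JSON: a long list of numbers.
        std::string data;
        for (int i = 0; i < 100000; ++i) data += "12345,";

        const char* p = data.c_str();
        long long sum = 0;
        int value = 0, consumed = 0;
        // Each call parses one token, but common sscanf implementations first
        // measure the whole remaining string, so total work grows quadratically.
        while (std::sscanf(p, "%d,%n", &value, &consumed) == 1) {
            sum += value;
            p += consumed;
        }
        std::printf("sum = %lld\n", sum);
    }

Swapping the per-token sscanf for something like strtol (which doesn't need to know the string length) makes the same loop linear.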
There is absolutely no way this could turn into an infinite loop. It could underflow, but for that to happen the angle would have to already be less than 2*pi, which is exactly when the loop exits.
When you subtract a small float from a very large float, the value doesn't change. This is because the "steps" between float values increase with the size of the value (i.e. floats have coarser resolution at larger magnitudes).
To see this in action, try running the following in a JavaScript interpreter:
console.log(1_000_000_000_000_000_000 - 1);
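Or, closer to the actual code, the same thing in C++ with a 32-bit float and the 2*pi constant from that loop (1e9f is just an arbitrarily large angle for illustration):

    #include <cstdio>

    int main() {
        float angle = 1e9f;              // arbitrary large value for illustration
        float next = angle - 6.2831855;  // one iteration of the loop body
        // At this magnitude adjacent floats are 64 apart, so subtracting 2*pi
        // rounds straight back to the same value: the loop would never advance.
        std::printf("%s\n", next == angle ? "unchanged -> loop never terminates"
                                          : "changed");
    }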
Writing simple code that works with the data you expect to have, without bothering with optimizations, is fine; if anything, this is one of the actual cases of "premature optimization": even with profiling, no real time is spent in that code, your data won't make it spend any time, and you should avoid wild guesses about future bottlenecks since chances are you'll be wrong (even if in this case the guess would have been right, that's a broken clock guessing the time is always 13:37).
The actual issue with that code was that, after they reused it for GTA Online and it gradually became a performance problem as more objects were added, nobody thought to dig in and see what was wrong.
The second error, deduplicating values by linearly scanning an array, was way more egregious.
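For reference, that one was the classic pattern of checking every new entry against everything inserted so far, which is O(n^2) overall. A hypothetical before/after sketch (Entry and its key field are made up) of why a hash set turns it into O(n):

    #include <cstdint>
    #include <cstdio>
    #include <unordered_set>
    #include <vector>

    struct Entry { uint64_t key; };  // hypothetical item record

    // The slow pattern: every insert rescans the whole array.
    void insert_linear(std::vector<Entry>& items, const Entry& e) {
        for (const Entry& it : items)
            if (it.key == e.key) return;  // duplicate, skip
        items.push_back(e);
    }

    // The fix: remember the keys seen so far, amortized O(1) per insert.
    void insert_hashed(std::vector<Entry>& items,
                       std::unordered_set<uint64_t>& seen, const Entry& e) {
        if (!seen.insert(e.key).second) return;  // duplicate, skip
        items.push_back(e);
    }

    int main() {
        std::vector<Entry> items;
        std::unordered_set<uint64_t> seen;
        uint64_t keys[] = {1, 2, 2, 3};
        for (uint64_t k : keys) insert_hashed(items, seen, Entry{k});
        std::printf("%zu unique entries\n", items.size());  // 3
    }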
I guess the most robust code, handling both performance and unexpected input, would be a single iteration of this (leveraging the assumption that angles are either always within bounds, or only went past them by a small amount in one frame), followed by an fmod if that assumption turns out to be completely off.
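Something along those lines (function name and signature are mine, the constant is the one from the original loop):

    #include <cmath>
    #include <cstdio>

    constexpr float kTwoPi = 6.2831855f;

    // One subtraction covers the expected "one frame past 2*pi" case; fmod only
    // runs when the angle is unexpectedly far out of range.
    float WrapAngle(float angle) {
        if (angle > kTwoPi) {
            angle -= kTwoPi;
            if (angle > kTwoPi)                   // assumption broken: take the real remainder
                angle = std::fmod(angle, kTwoPi);
        }
        return angle;
    }

    int main() {
        std::printf("%f\n", WrapAngle(7.0f));  // ~0.717, the expected case
        std::printf("%f\n", WrapAngle(1e9f));  // still finite instead of hanging
    }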
I think someone estimated that error cost them millions in revenue? I'm pretty sure a fraction of that could afford an engineer who knows how fast computers ought to be.
Like, even though it's pretty critical to the initial user experience, loading time is generally what gets disregarded the most.
> I'm pretty sure a fraction of that could afford an engineer who knows how fast computers ought to be.
It can, if someone cares enough or realises it's an issue, and then someone is motivated enough to dig into it, or has the time to.
The premature optimization would have been trying to optimize that piece of code when it wasn't necessary, given what the code was originally meant to do.