I would have expected libc to take that use case into consideration and switch to a different algorithm once the string exceeds a certain size. Even if the GTA parser is not optimal, I would blame libc here. The worst part is that some machines may have an optimized libc and others may not, making the problem apparent only in some configurations.
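For context, here is a minimal sketch (hypothetical names, not GTA's actual code) of the kind of loop under discussion: calling sscanf repeatedly on one large NUL-terminated buffer. If the libc implementation sets up each sscanf call with a strlen() over the whole buffer, every call costs O(len) even when it only consumes a few characters, so the loop as a whole is O(n^2). Tracking the position yourself with strtod keeps it linear:

    // Hypothetical sketch of the problematic pattern: repeated sscanf
    // over one big buffer. Each call may strlen() the whole remaining
    // string, so parsing n tokens is O(n^2) on some libc implementations.
    #include <cstddef>
    #include <cstdio>
    #include <cstdlib>

    std::size_t parse_all_sscanf(const char *buf) {
        std::size_t count = 0;
        int consumed = 0;
        double value = 0.0;
        while (std::sscanf(buf, " %lf%n", &value, &consumed) == 1) {
            buf += consumed;         // we advance only a few characters...
            if (*buf == ',') ++buf;  // ...but sscanf may have scanned it all
            ++count;
        }
        return count;
    }

    // Same parse with strtod: each call only touches the characters it
    // actually consumes, so one pass over the buffer stays O(n).
    std::size_t parse_all_strtod(const char *buf) {
        std::size_t count = 0;
        for (;;) {
            char *end = nullptr;
            std::strtod(buf, &end);
            if (end == buf) break;   // no conversion performed -> done
            ++count;
            buf = (*end == ',') ? end + 1 : end;
        }
        return count;
    }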
I believe standard libraries should always have a reasonable worst case by default. It doesn't have to be perfectly optimized, but I think it is important to be in the best reasonable complexity class, to avoid these kinds of problems. The best implementations usually combine several algorithms for different cases. For example, a sort function may dispatch between insertion sort (n^2, good for small n), quicksort (n log(n) on average, n^2 worst case, good overall), and heapsort (guaranteed n log(n), slower than quicksort except in edge cases). That way you never hit n^2, but not at the cost of slowing down the most common cases.
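That insertion sort -> quicksort -> heapsort combination is essentially introsort, which is what std::sort is commonly implemented as. A rough sketch of the dispatch (thresholds and names here are illustrative, not taken from any particular library):

    // Illustrative introsort-style dispatch; thresholds and names are
    // made up for the sketch, not any real standard library's values.
    #include <algorithm>
    #include <cmath>
    #include <cstddef>
    #include <vector>

    constexpr std::size_t kSmallRange = 16;  // below this, insertion sort wins

    void insertion_sort(std::vector<int> &a, std::size_t lo, std::size_t hi) {
        for (std::size_t i = lo + 1; i < hi; ++i)
            for (std::size_t j = i; j > lo && a[j - 1] > a[j]; --j)
                std::swap(a[j - 1], a[j]);
    }

    void introsort(std::vector<int> &a, std::size_t lo, std::size_t hi, int depth) {
        if (hi - lo <= kSmallRange) {
            insertion_sort(a, lo, hi);       // tiny range: n^2 is fine here
            return;
        }
        if (depth == 0) {
            // Quicksort recursion got too deep (degenerate/adversarial input):
            // fall back to heapsort for a guaranteed n log n finish.
            std::make_heap(a.begin() + lo, a.begin() + hi);
            std::sort_heap(a.begin() + lo, a.begin() + hi);
            return;
        }
        // Ordinary quicksort step: three-way partition around a middle pivot.
        int pivot = a[lo + (hi - lo) / 2];
        auto first = a.begin() + lo, last = a.begin() + hi;
        auto mid1 = std::partition(first, last, [pivot](int x) { return x < pivot; });
        auto mid2 = std::partition(mid1, last, [pivot](int x) { return x <= pivot; });
        introsort(a, lo, static_cast<std::size_t>(mid1 - a.begin()), depth - 1);
        introsort(a, static_cast<std::size_t>(mid2 - a.begin()), hi, depth - 1);
    }

    void hybrid_sort(std::vector<int> &a) {
        if (a.size() < 2) return;
        int max_depth = 2 * static_cast<int>(std::log2(a.size()));
        introsort(a, 0, a.size(), max_depth);
    }

The depth limit is what caps the worst case: quicksort handles typical inputs quickly, and only the rare degenerate cases get handed off to heapsort.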
The pseudo hash table is entirely the GTA devs' fault, though.