
3883 points by kuroguro | 1 comment
GuB-42 No.26298512
Even more surprising to me is that sscanf() relies on strlen().

I would have expected libc to take that use case into consideration and use a different algorithm when the string exceeds a certain size. Even if the GTA parser is not optimal, I would blame libc here. The worst part is that some machines may have an optimized libc and others don't, making the problem apparent only in some configurations.
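
A minimal sketch of the pathology, assuming a libc whose sscanf() wraps its input in a stream object and takes the length of the whole remaining buffer on every call (glibc behaves this way). The function and format string are illustrative, not GTA's actual code:

    #include <stdio.h>

    /* Hypothetical parse loop over one big whitespace-separated buffer.
     * If sscanf() computes strlen() of the remaining input on every
     * call, each iteration costs O(n) and the whole loop is O(n^2). */
    static void parse_all(const char *buf)
    {
        const char *p = buf;
        int value, consumed;
        while (sscanf(p, "%d%n", &value, &consumed) == 1) {
            /* ... use value ... */
            p += consumed;  /* the quadratic cost hides inside sscanf,
                               not in this pointer arithmetic */
        }
    }

strtol() is the usual escape hatch here: it parses in place and returns the end pointer itself, so nothing ever needs to scan for the terminating NUL.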

I believe standard libraries should always have a reasonable worst case by default. It doesn't have to be perfectly optimized, but I think it is important to have the best reasonable complexity class, to avoid these kinds of problems. The best implementations usually combine several algorithms for different cases. For example, a sort function may dispatch between insertion sort (n^2, good for small n), quicksort (avg. n log(n), worst n^2, good overall), and heapsort (guaranteed n log(n), slower than quicksort except in edge cases). This way you never hit n^2, but without paying for a slower algorithm in the most common cases. A sketch of that dispatch follows below.
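
Here is a minimal sketch of that dispatch, which is essentially introsort. The cutoff of 16 and the plain last-element pivot are illustrative assumptions, not what any particular library uses:

    #include <stddef.h>

    #define SMALL_CUTOFF 16  /* assumed; real libraries tune this */

    static void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

    /* O(n^2) but a very low constant factor: cheapest for tiny ranges. */
    static void insertion_sort(int *a, size_t n)
    {
        for (size_t i = 1; i < n; i++) {
            int v = a[i];
            size_t j = i;
            while (j > 0 && a[j - 1] > v) { a[j] = a[j - 1]; j--; }
            a[j] = v;
        }
    }

    /* Restore the max-heap property below index `root`. */
    static void sift_down(int *a, size_t root, size_t n)
    {
        while (2 * root + 1 < n) {
            size_t child = 2 * root + 1;
            if (child + 1 < n && a[child] < a[child + 1]) child++;
            if (a[root] >= a[child]) return;
            swap(&a[root], &a[child]);
            root = child;
        }
    }

    /* Guaranteed O(n log n): the fallback when quicksort degenerates. */
    static void heap_sort(int *a, size_t n)
    {
        for (size_t i = n / 2; i > 0; i--) sift_down(a, i - 1, n);
        for (size_t i = n; i > 1; i--) {
            swap(&a[0], &a[i - 1]);
            sift_down(a, 0, i - 1);
        }
    }

    static void introsort_rec(int *a, size_t n, int depth)
    {
        while (n > SMALL_CUTOFF) {
            if (depth-- == 0) {      /* recursion too deep: bail out */
                heap_sort(a, n);
                return;
            }
            /* Lomuto partition with the last element as pivot. */
            int pivot = a[n - 1];
            size_t i = 0;
            for (size_t j = 0; j + 1 < n; j++)
                if (a[j] < pivot) swap(&a[i++], &a[j]);
            swap(&a[i], &a[n - 1]);
            introsort_rec(a, i, depth);  /* recurse on the left part */
            a += i + 1;                  /* loop on the right part   */
            n -= i + 1;
        }
        insertion_sort(a, n);            /* finish small ranges cheaply */
    }

    void hybrid_sort(int *a, size_t n)
    {
        int depth = 0;
        for (size_t m = n; m > 1; m >>= 1)
            depth += 2;                  /* depth budget ~ 2 * log2(n) */
        introsort_rec(a, n, depth);
    }

The depth budget is what guarantees the worst case: once quicksort has recursed about 2*log2(n) levels without finishing, it is assumed to be degenerating and the range is handed to heapsort.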

The pseudo hash table is all the GTA devs' fault, though.

masklinn No.26300393
> For example, a sort function may dispatch between insertion sort […]

That’s generally called a “hybrid” sort; the scheme you describe is essentially introsort. A famous hybrid, which is also adaptive (it exploits already-sorted runs), is timsort.

Your version has an issue though: insertion sort is stable, but quicksort and heapsort are not, so the hybrid is stable only below the small-n cutoff. That can dangerously mislead users who only ever test stability on small inputs.
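
To make the trap concrete, here is a minimal sketch of the check that exposes it: tag each record with its original position, sort by key alone, then look for reordered equal keys. Against a hybrid like the one sketched above, such a check tends to pass below the insertion-sort cutoff and fail above it, which is exactly the misleading behavior described. The struct and helper are hypothetical:

    #include <stdbool.h>
    #include <stddef.h>

    struct rec { int key; size_t seq; };   /* seq = original position */

    /* True iff every run of equal keys is still in original order. */
    static bool is_stable(const struct rec *r, size_t n)
    {
        for (size_t i = 1; i < n; i++)
            if (r[i - 1].key == r[i].key && r[i - 1].seq > r[i].seq)
                return false;
        return true;
    }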