
Why is Windows so slow?

(games.greggman.com)
337 points by kristianp | 2 comments
hristov No.3368965
Interestingly enough Joel Spolsky mentioned something related to the directory listing problem more than 10 years ago. See:

http://www.joelonsoftware.com/articles/fog0000000319.html

In Joel's opinion it is an algorithm problem. He thinks there is an O(n^2) algorithm in there somewhere causing trouble, and since one does not notice the O(n^2) unless there are hundreds of files in a directory, it has not been fixed.
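
To give a made-up but concrete illustration of the kind of bug Joel means (nothing from the actual Explorer code, which is closed source): an innocent-looking duplicate check that rescans everything collected so far is already quadratic:

    import os

    def list_names(path):
        # Hypothetical sketch: the duplicate check scans a plain Python list,
        # so each of the n entries costs O(n) and the whole loop is O(n^2).
        # Invisible with dozens of files, painful with tens of thousands.
        seen = []
        for name in os.listdir(path):
            if name not in seen:
                seen.append(name)
        return seen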

I believe that is probably the problem with Windows in general. Perhaps there are a lot of bad algorithms hidden in the enormous and incredibly complex Windows code base and they are not getting fixed because Microsoft has not devoted resources to fixing them.

Linux, on the other hand, benefits from the "many eyes" phenomenon of open source: when anyone smart enough notices slowness in Linux, they can simply look in the code and find and remove any obviously slow algorithms. I am not sure all open source software benefits from this, but if any open source software does, it must certainly be Linux, as it is one of the most widely used and discussed pieces of OS software.

Now this is total guesswork on my part, but it seems the most logical conclusion. And by the way, I am dual booting Windows and Linux and keep noticing all kinds of weird slowness in Windows. Windows keeps writing to disk all the time even though my 6 GB of RAM should be sufficient, while in Linux I barely hear the sound of the hard drive.

replies(4): >>3369011 #>>3369062 #>>3369098 #>>3369928 #
barrkel No.3369011
I don't think there's an O(n^2) algorithm in there. I just created a directory with 100,000 entries. Listing it (from Cygwin, no less, using 'time ls | wc') takes 185 milliseconds. The directory is on a plain-jane 7.2k 1TB drive, though of course it's hot in cache from having been created. 'dir > nul', mind you, is quite a bit slower, at over a second.
replies(5): >>3369047 #>>3369054 #>>3369178 #>>3369402 #>>3369535 #
1. prewett No.3369054
You only did one test, so you have no idea what the complexity curve is. Do at least three tests, with 1,000, 10,000, and 100,000 entries, and graph the results. Three tests are still pretty skimpy for figuring out what the curve is, so do tests at ten different sizes.
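
Something along these lines would do; a rough sketch in Python for convenience, timing os.listdir on throwaway directories, so it exercises the filesystem API rather than the Explorer GUI:

    import os, shutil, tempfile, time

    def time_listing(n):
        # Create a scratch directory with n empty files and time one listing of it.
        d = tempfile.mkdtemp()
        try:
            for i in range(n):
                open(os.path.join(d, "file%06d" % i), "w").close()
            start = time.perf_counter()
            entries = os.listdir(d)
            return len(entries), time.perf_counter() - start
        finally:
            shutil.rmtree(d)

    # More sizes make the shape of the curve (linear vs. quadratic) visible.
    for n in (1000, 10000, 100000):
        count, secs = time_listing(n)
        print("%7d entries: %.1f ms" % (count, secs * 1000))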

Also, Joel's complaint was about the Windows Explorer GUI (specifically, opening a large recycle bin takes hours). Cygwin `ls` is using a completely different code path. Your experiment does suggest that Joel's problem is in the GUI code, though, and not the NTFS filesystem code.

replies(1): >>3369320 #
2. barrkel No.3369320
Oh, the OS treeview is dreadful; everyone who's seriously coded on Windows knows that.

As to the actual complexity curve (which, knowing what I do about NTFS, I'm fairly sure is O(n log n)), I don't really care about it; since it hasn't shown up in a serious way at n=100,000, it's unlikely to realistically affect anyone badly. Even if 1 million files (in a single directory!) took 18.5 seconds, it wouldn't be pathological. Other limits, like disk bandwidth and FS cache size, seem like they'd kick in sooner.
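
Back-of-the-envelope, taking my 185 ms at 100,000 entries as the baseline and assuming the constant factor stays the same, the usual curves predict quite different times at a million entries; the 18.5 seconds above is just the quadratic case:

    import math

    n0, t0 = 100000, 0.185    # 185 ms to list 100,000 entries (measured above)
    n1 = 1000000              # extrapolate to 1 million entries

    linear    = t0 * (n1 / n0)                                  # ~1.9 s
    n_log_n   = t0 * (n1 * math.log(n1)) / (n0 * math.log(n0))  # ~2.2 s
    quadratic = t0 * (n1 / n0) ** 2                             # 18.5 s

    print("O(n):       %5.1f s" % linear)
    print("O(n log n): %5.1f s" % n_log_n)
    print("O(n^2):     %5.1f s" % quadratic)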