
752 points by dceddia | 4 comments
1. Joeri (No.36448665)
600 MHz is quite the machine for NT. I remember running NT4 on a 233 MHz Pentium II with 128 MB RAM, and everything felt instant and limitless.

Windows 2000 was quite the hog compared to NT4, and all it added that I had a use for was USB support. I think by that point Dave Cutler was no longer running the show, and Windows performance slowly started degrading.

replies(3): >>36448757 >>36448879 >>36448927
2. tapoxi (No.36448757)
He shows Win 2000 later in that thread, still snappy.
3. jaclaz (No.36448879)
I remember an HP machine (at 600 MHz) that my company bought circa 2001. It came with an install CD that had both NT 4.0 and Windows 2000, and the user could decide to install one or the other. A few machines initially had NT 4.0 because of some accounting software that (for whatever reason) did not run on Windows 2000, while the rest had 2K installed.

Of course, NT 4.0 was a bit faster, but not by much with "common" programs (Office and similar).

The on-disk footprint of the OS was, however, more than 3x larger (NT 4.0 was around 180 MB, 2K around 650 MB).

4. masswerk (No.36448927)
I think the most important factor isn't so much the CPU but loading binaries from disk, and that (I/O) improved quite massively over the 1990s. (And anything written for spinning disks will start nearly instantly on solid state storage.)

To illustrate the CPU/disk-access ratio: there's a reason scripting languages became prevalent for web backends in the 1990s. Loading a script's source from disk and compiling it on the fly was still faster than loading a much bigger binary from disk, which had to be done on each CGI request. (E.g., with Perl you could run your normal script, but you could also produce an executable binary from a core; nobody did the latter, for exactly that reason.)
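
For anyone unfamiliar with that per-request model, here is a minimal sketch of a CGI handler. The 1990s originals were typically Perl; Python stands in here purely for illustration, and the only interface assumed is the standard CGI one (request metadata in environment variables, response written to stdout).

    #!/usr/bin/env python3
    # Minimal sketch of the CGI model described above (illustrative only;
    # real 1990s handlers were usually Perl or C).
    #
    # The web server spawns a fresh process for every request, so the whole
    # handler -- small script or large compiled binary -- is loaded from disk
    # each time it is hit.

    import os
    import sys

    def main() -> None:
        # CGI passes request metadata via environment variables.
        method = os.environ.get("REQUEST_METHOD", "GET")
        query = os.environ.get("QUERY_STRING", "")

        # The response goes to stdout: headers, a blank line, then the body.
        sys.stdout.write("Content-Type: text/plain\r\n\r\n")
        sys.stdout.write(f"Handled a {method} request with query: {query!r}\n")

    if __name__ == "__main__":
        main()

The point is that the full startup cost is paid on every single request, so getting a few kilobytes of script source off a 1990s disk and compiling it on the fly was often cheaper than reading a much larger compiled binary from the same disk.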