    1525 points saeedesmaili | 12 comments
    1. andai No.43653301
    Ironically the old horses were faster! Run XP on modern hardware (if you can get it running at all) and you'll see what I mean. Explorer opens fully rendered in the span of a single frame (0.016 seconds). And XP was very slow and bloated for its time!

    It'll do this even in VirtualBox, running about 20x snappier than the native host, which boggles my mind.

    replies(7): >>43653493 >>43653839 >>43655971 >>43656456 >>43658105 >>43664280 >>43666798
    2. svachalek No.43653493
    It's amazing how fast we can eat up new hardware capabilities. The old 1 MHz 6502 CPUs were capable of running much more sophisticated software than most people today imagine, with a thousandth or even a millionth of the hardware. And now we're asking LLMs to answer math questions, using billions of operations to perform something a single CPU instruction can handle.
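
    For a sense of scale, here's a rough back-of-envelope sketch in Python; the parameter count and per-token cost model are illustrative assumptions, not measurements:

        # Rough scale comparison; all numbers here are illustrative assumptions.
        params = 7e9                  # assume a 7B-parameter model
        ops_per_token = 2 * params    # common estimate: ~2*N ops per generated token
        answer_tokens = 5             # assume the answer "2 + 2 = 4" is ~5 tokens
        llm_ops = ops_per_token * answer_tokens
        print(f"LLM: ~{llm_ops:.0e} operations to answer 2 + 2")
        print("CPU: one ADD instruction")
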
    replies(1): >>43655738
    3. IshKebab No.43653839
    To be fair, even with modern software bloat, the overall experience is a lot better now than it was in the XP days. I think that's mainly due to SSDs. They were a huge step change in performance, and we fortunately haven't regressed back to the slowness of the HDD era.

    At least on most hardware. I have a shitty Dell laptop for work that's basically permanently thermally throttled... :(

    replies(2): >>43654415 >>43679418
    4. hedora No.43654415
    Are you running Linux or something? I installed Win 11 in a VM, and no one who's seen its first boot screen would claim the experience has improved since the days of shovelware-bloated XP desktops. It only gets worse and worse from there.
    5. TuringTest No.43655738
    The classic answer to why more hardware resources are needed for the same tasks is that the new system allows for much more flexibility. A problem domain can be thoroughly optimized for a single purpose, but then it can only be used for that purpose alone.

    This is quite true for LLMs. They can do basic arithmetic, but they can also read problem statements in many diverse mathematical areas and describe what they're about, or make (right or wrong) suggestions on how they can be solved.

    Classic AIs suffered from the frame problem, where some common-sense reasoning depended on facts not stated in the system's logic.

    Now, LLMs have largely solved the frame problem. It turns out the solution was to compress large swathes of human knowledge in a way that can be accessed quickly, so that the relevant parts of all that knowledge are activated when needed. Of course, this approach to flexibility needs lots of resources.

    6. noisy_boy No.43655971
    I think they were designed in the era of less powerful machines, so they had to be designed better. Nowadays there isn't as much push to eke out every last bit of performance: there's loads of power at everyone's disposal, and developers are pushed to focus on features first, without being given time to refine performance, because features mean adoption. So the bloat creeps up, and hardware makers keep designing more powerful machines, which further enables the bloatiness. It's a vicious cycle.
    7. washadjeffmad No.43656456
    This is part of why I still have a MacBook2,1 running Snow Leopard. Even with its 4GB of memory and Core 2 Duo, it's optimized to prioritize my input. It also never changes, which is a form of stability I've come to cherish.

    Another point is that you can train a horse, or even eat it if in dire straits. You own that horse. I can't disable things I want to disable, and names, locations, and features change (or are removed) with no notice between minor version updates. I can't tell you the last time I built something for a new Mac, or wanted to.

    I don't know MacOS today, and it certainly doesn't make me feel like I own my computer.

    I'm less harsh about modern Windows because I view it as amends for Microsoft causing the bot/ransomware crisis of the last 15 years. Still not for me, but at least I neuter it into usefulness.

    8. Gud No.43658105
    My setup (FreeBSD + XFCE) hasn't changed at all over the last 20 years and is just as fast as it's always been.

    I use virtualisation for the rest.

    9. greenie_beans No.43664280
    i'm off grid right now and the only fast websites are hacker news, old reddit, and my app https://bookhead.net, which is html + a little bit of htmx + a little vanilla javascript
    10. piperswe No.43666798
    Hell, my Windows XP system with a nearly 20-year-old processor (a Q6600, ~17 years old) still instantly does almost everything.
    replies(1): >>43670992
    11. DavidPiper No.43670992
    You just gave me a hell of a nostalgia hit with "Q6600". Remember when clock speed, cache size, and core count were all we cared about? AMD hadn't even bought ATI yet.

    Maybe I'll spin up an XP VirtualBox off the back of this thread just for old times' sake and see what happens.
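
    For anyone tempted to do the same, a minimal sketch driving VirtualBox's VBoxManage CLI from Python (the VM name, memory and disk sizes, and "xp.iso" path are placeholder assumptions; you need your own XP install media):

        # Minimal sketch: create and boot an XP VM via VBoxManage.
        # Assumes VBoxManage is on PATH; "xp.iso" is a placeholder path.
        import subprocess

        def run(*args):
            subprocess.run(["VBoxManage", *args], check=True)

        run("createvm", "--name", "xp-retro", "--ostype", "WindowsXP", "--register")
        run("modifyvm", "xp-retro", "--memory", "512", "--vram", "32")
        run("createmedium", "disk", "--filename", "xp.vdi", "--size", "10240")  # MB
        run("storagectl", "xp-retro", "--name", "IDE", "--add", "ide")
        run("storageattach", "xp-retro", "--storagectl", "IDE", "--port", "0",
            "--device", "0", "--type", "hdd", "--medium", "xp.vdi")
        run("storageattach", "xp-retro", "--storagectl", "IDE", "--port", "1",
            "--device", "0", "--type", "dvddrive", "--medium", "xp.iso")
        run("startvm", "xp-retro")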

    12. pas No.43679418
    it's ridiculously worse. getting things done on the computer has gotten more tiresome, in almost every part: booting is still slow, login is still slow, starting the browser after login is slow, rendering the start menu is slow (which should be in cache), and the browser itself starts fast but then takes a while to somehow start loading and rendering the last opened page (which, again, should be in cache), and so on. and yes, you can do a lot more things nowadays "on the computer", but it's mostly "online"