688 points samwho | 6 comments
1. wpollock No.45020535
Big-O isn't as relevant as it once was. Modern hardware includes multithreading, pipelining, NUMA, and complex caching. Some operations can take less than one CPU cycle and others can take hundreds, or exceptionally thousands, of cycles. Describing the behavior of an algorithm solely as a function of the number of "trips through the innermost loop" can be very misleading!
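
As a concrete illustration (a minimal Python sketch, not from the original discussion; timings are machine-dependent): both traversals below perform exactly n*n additions, so a pure operation count treats them as identical, but the strided one fights the cache and typically runs several times slower.

    import time
    import numpy as np

    n = 4000
    a = np.random.rand(n, n)  # C-order: rows are contiguous in memory

    t0 = time.perf_counter()
    row_sum = sum(a[i, :].sum() for i in range(n))  # walks contiguous memory
    t1 = time.perf_counter()
    col_sum = sum(a[:, j].sum() for j in range(n))  # strided: n*8 bytes per step
    t2 = time.perf_counter()

    # Same n*n additions either way; the cache decides the wall time.
    print(f"row-major: {t1 - t0:.3f}s  column-major: {t2 - t1:.3f}s")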

Besides that, other measures such as Big-Omega should be referenced in any discussion of Big-O.

(I did enjoy Big-O the anime series though! /grin)

replies(2): >>45020849 #>>45020868 #
2. jonahx No.45020849
> Big-O isn't as relevant as it once was. Modern hardware includes multithreading, pipelining, NUMA, and complex caching. Some operations can take less than one CPU cycle and others can take hundreds, or exceptionally thousands, of cycles.

Big-O theory was invented precisely to be a measure of computation that is independent of such low-level details. In that sense it is timeless. And any (decent) presentation always includes appropriate caveats like "the constant C may be very relevant for 'smaller' N".
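
A toy illustration of that caveat, with made-up constants: an O(n) algorithm that costs 100 steps per element loses to an O(n^2) algorithm with a unit constant until n passes the crossover point.

    # Hypothetical step counts: quadratic has a tiny constant, linear a big one.
    def quadratic(n):   # O(n^2), constant 1
        return n * n

    def linear(n):      # O(n), constant 100
        return 100 * n

    for n in (10, 50, 100, 200, 1000):
        winner = "O(n)" if linear(n) < quadratic(n) else "O(n^2)"
        print(f"n={n:5d}  O(n^2)={quadratic(n):9d}  O(n)={linear(n):7d}  cheaper: {winner}")

This is also why real sort implementations (CPython's Timsort, typical C++ std::sort introsorts) fall back to insertion sort for small runs despite its worse asymptotics.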

replies(1): >>45021522 #
4. wpollock No.45021522
You're right. I didn't mean to imply that Big-O is useless.

I'm reminded of an incident in the 1990s. I was an instructor teaching C at the local telco when a student showed me a (COBOL) program that had stumped his programming team: it would always hang their computer when run. What caught my eye was that the main loop was nested seven levels deep, and each loop ran 50,000 iterations. Their computer wasn't hung; they only needed to wait a decade or so for the computation to finish!
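
For scale, a sketch of what that structure implies (the original was COBOL; this is just the arithmetic, don't try running the loops themselves):

    ITERS = 50_000
    DEPTH = 7

    # Seven nested loops of 50,000 iterations each:
    #   for a in range(ITERS):
    #       for b in range(ITERS):
    #           ... five more levels ...
    total = ITERS ** DEPTH
    print(f"{total:.3e} trips through the innermost body")  # ~7.8e+32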

Big-O knowledge made the error obvious to me, and I added that to the curriculum for all programming courses I could.

replies(1): >>45026922 #
5. namibj No.45026922
On the order of 110 bits of brute force (50,000^7 ≈ 2^109) is nowhere close to "a decade", especially if you aren't throwing serious parallelism at it.

Edit: we're talking on the order of a million times the current age of the universe, even with a multi-GHz CPU spending only one cycle per inner iteration.
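
A rough check of both figures (a sketch; the 3 GHz clock and the age of the universe are assumed round numbers):

    import math

    total = 50_000 ** 7
    print(math.log2(total))        # ~109.3, i.e. on the order of 110 bits

    ops_per_second = 3e9           # assumed: multi-GHz CPU, one iteration per cycle
    seconds_per_year = 3.156e7
    years = total / ops_per_second / seconds_per_year
    print(years / 1.38e10)         # ~6e5 times the ~13.8-billion-year age of the universe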

replies(1): >>45031986 #
6. wpollock No.45031986
50000^7 iterations ÷ 10^8 iterations/second ≈ 7.8E24 seconds ≈ 2.5E17 years. That's a long time!
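
The same check in Python, assuming the 10^8 iterations per second above:

    years = 50_000 ** 7 / 1e8 / 3.156e7   # iterations / (iters per sec) / (secs per year)
    print(f"{years:.2e} years")            # ~2.48e+17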