Big O vs Hardware: Better Complexity ≠ Better Performance (blog.codingconfessions.com)
from abhi9u@lemmy.world to technology@lemmy.world on 08 Aug 07:08
https://lemmy.world/post/34125716

#technology


phutatorius@lemmy.zip on 08 Aug 09:42 next collapse

TL;DR: don’t just count operations when a weighted sum is more appropriate.

squaresinger@lemmy.world on 08 Aug 17:30 next collapse

It’s the old galactic algorithm problem. Imagine an algorithm that takes the age of the universe to compute its result, but always takes that same time regardless of the input. That makes it O(1) complexity.

Now imagine you want to sort a list with 100 elements. Pretty much any other sorting algorithm, no matter the complexity, will finish much faster than the galactic one from above.

Complexity matters only when n gets large enough, and in some cases n stays small enough that algorithmic complexity doesn’t matter.

(Just to make sure I’m not misunderstood: Algorithmic complexity is an incredibly important optimization tool. Most often it’s even the best one. But it’s not the only one and there are situations where it fails. As always, a good programmer takes everything into account, not just their favourite tool.)
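The point in the comment above can be sketched with a toy cost model (operation counts, not real timings; the "galactic" constant is an illustrative made-up number): an O(1) algorithm with an astronomically large constant loses to O(n²) insertion sort for any realistic list size.

```python
# Hypothetical fixed operation count for the "galactic" sort:
# constant, so O(1), but absurdly large.
GALACTIC_CONSTANT = 10**18

def galactic_sort_cost(n: int) -> int:
    """Cost of the imaginary galactic sort: the same regardless of n."""
    return GALACTIC_CONSTANT

def insertion_sort_cost(n: int) -> int:
    """Worst-case comparison count for insertion sort: n*(n-1)/2, i.e. O(n^2)."""
    return n * (n - 1) // 2

n = 100
print(insertion_sort_cost(n))  # 4950 operations for the "worse" O(n^2) algorithm
print(galactic_sort_cost(n))   # 10**18 operations for the "better" O(1) algorithm
```

Under this model the O(n²) algorithm stays cheaper until n reaches roughly 1.4 × 10⁹ elements, which is exactly the "n stays small enough" regime the comment describes.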

ftbd@feddit.org on 09 Aug 10:51 collapse

TL;DR: Big-O notation describes asymptotic behavior