While I get your point, you're still slightly misguided.

Sometimes, for smaller datasets, an algorithm with worse asymptotic complexity can be faster.

Some examples:

• Radix sort's complexity is linear in the number of elements. So why would most people still reach for e.g. quicksort? Because on smaller datasets, radix sort's per-element overhead outweighs the gain from being asymptotically faster.
• One of the most common and well-known optimizations for quicksort is to switch over to insertion sort once a subarray shrinks below a certain threshold, because for small inputs (e.g. around 10 elements) insertion sort is simply faster in practice; see the sketch right after this list.
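
Here's a minimal sketch of that hybrid in Python. The cutoff of 10, the Lomuto partition, and the random pivot are illustrative choices for this example, not what any particular library actually does:

```python
import random

CUTOFF = 10  # illustrative threshold; real libraries tune this empirically

def insertion_sort(a, lo, hi):
    # Sort a[lo..hi] in place; very fast for tiny ranges.
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def hybrid_quicksort(a, lo=0, hi=None):
    # Quicksort that hands small subarrays off to insertion sort.
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        if hi - lo + 1 <= CUTOFF:
            insertion_sort(a, lo, hi)
            return
        # Lomuto partition around a randomly chosen pivot
        p = random.randint(lo, hi)
        a[p], a[hi] = a[hi], a[p]
        pivot, store = a[hi], lo
        for i in range(lo, hi):
            if a[i] < pivot:
                a[i], a[store] = a[store], a[i]
                store += 1
        a[store], a[hi] = a[hi], a[store]
        # Recurse into the smaller half, loop on the larger one
        # (keeps recursion depth O(log n))
        if store - lo < hi - store:
            hybrid_quicksort(a, lo, store - 1)
            lo = store + 1
        else:
            hybrid_quicksort(a, store + 1, hi)
            hi = store - 1

data = random.sample(range(100), 30)
hybrid_quicksort(data)
print(data == sorted(data))  # True
```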

Big O notation only keeps the highest-order term, but in some cases the lower-order terms and constant factors still matter. Assume the exact running time of algorithm A is 2n log(n) + 999999999n and that of algorithm B is n^2 + 7n. For small n, B will be faster, even though B is O(n^2) and A is O(n log n) (see the quick check below).
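
To make that concrete, here's a quick check that plugs values into those two made-up cost formulas (log base 2 here; the base only changes a constant factor, not the conclusion):

```python
from math import log2

def cost_A(n):  # the hypothetical 2n·log(n) + 999999999n algorithm
    return 2 * n * log2(n) + 999_999_999 * n

def cost_B(n):  # the hypothetical n^2 + 7n algorithm
    return n ** 2 + 7 * n

for n in (10, 1_000, 1_000_000, 10 ** 9, 10 ** 10):
    faster = "B" if cost_B(n) < cost_A(n) else "A"
    print(f"n = {n:>14,}: {faster} is cheaper")

# B stays cheaper until n approaches ~10^9; only past that point does
# A's better asymptotic growth actually win.
```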

Sorting is actually a great example of why you should always consider what your data looks like before deciding which algorithm to use, which was one of my biggest takeaways from my data structures & algorithms class.

This YouTube channel also has a fairly nice three-part series on sorting algorithms: https://youtu.be/_KhZ7F-jOlI?si=7o0Ub7bn8Y9g1fDx