In the study of algorithms, what is the primary purpose of Big O notation?
💡 Explanation:
Big O notation is a mathematical notation that describes the limiting behavior of a function as its argument tends toward a particular value or infinity. In algorithm analysis, its primary purpose is to classify algorithms by how their running time or space requirements grow as the input size increases. Formally, f(n) = O(g(n)) means there exist constants c > 0 and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀; in other words, g(n) is an asymptotic upper bound on f(n). Because it bounds growth from above, Big O is conventionally used to express an algorithm's worst-case efficiency and scalability, independent of hardware or constant factors.
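As a minimal sketch (the function names and data below are illustrative assumptions, not part of the question), the two searches here belong to different Big O classes, which is exactly the kind of distinction the notation captures:

```python
def contains_linear(items, target):
    """O(n): in the worst case, scans every element once."""
    for item in items:
        if item == target:
            return True
    return False


def contains_binary(sorted_items, target):
    """O(log n): halves the search range each step (requires sorted input)."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return True
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False


if __name__ == "__main__":
    data = list(range(1_000_000))  # illustrative input
    print(contains_linear(data, 999_999))  # worst case: ~1,000,000 comparisons
    print(contains_binary(data, 999_999))  # worst case: ~20 comparisons
```

Both functions return the same answer, but as the input grows the linear scan's worst-case work grows proportionally with n while the binary search's grows only logarithmically; Big O notation lets us state that difference without timing any particular machine.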