Whenever something needs sorting, there are many ways to achieve it, but the baseline is O(n²). It works the same way most people sort their socks: pick one, find its match, and repeat (this is selection sort). How else would one compare all elements to establish their order? Better approaches, such as heap sort and merge sort, exhibit O(n log(n)) behavior in the worst case, which is the best possible (consistent) performance for comparison-based sorting algorithms. Additionally, since the best case for any sorting algorithm is O(n) (a single pass confirming everything is already in order), the average case matters most. We will get into strategies for that later in this book.
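To make the sock analogy concrete, here is a minimal sketch of selection sort in Rust (the function name and the example values are ours, purely for illustration). The nested loops are where the O(n²) comparisons come from: for each position, the inner loop scans the entire unsorted remainder.

```rust
// Selection sort: repeatedly find the smallest remaining element
// and swap it into place. Two nested passes over the data means
// roughly n * n / 2 comparisons, hence O(n²).
fn selection_sort<T: Ord>(items: &mut [T]) {
    for i in 0..items.len() {
        // Find the index of the smallest element in the unsorted tail.
        let mut min = i;
        for j in (i + 1)..items.len() {
            if items[j] < items[min] {
                min = j;
            }
        }
        // Move it to the front of the unsorted region.
        items.swap(i, min);
    }
}

fn main() {
    let mut socks = vec![4, 1, 3, 9, 7];
    selection_sort(&mut socks);
    println!("{:?}", socks); // prints [1, 3, 4, 7, 9]
}
```

Note that the swap happens only once per outer iteration, so selection sort does few writes, which is occasionally useful, but the number of comparisons stays quadratic no matter how the input is arranged.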
Search (or lookup) is another topic that we will get into in Chapter 10, Finding Stuff, but the associated runtime complexities are great examples. Searching an unsorted data structure will be O(n) most of the time, while sorted collections can utilize binary search (a tree's...