The meaning of O(3n) = O(n) is distinct from the importance of the information O(3n) is trying to convey |
I've made this very argument. In terms of asymptotic time complexity, 3n and n are equivalent as n goes to infinity - they land in the same class. To a programmer actually comparing algorithms, they are completely different.
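For reference, that equivalence is just the formal definition at work: f(n) is O(g(n)) if there exist constants c > 0 and n0 such that f(n) <= c * g(n) for all n >= n0. With f(n) = 3n and g(n) = n, choosing c = 3 satisfies that for every n, so O(3n) = O(n). The 3 gets absorbed into the choice of c, which is exactly why the notation itself can't carry that information.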
Some addition shows that v.1 does n(n - 1) comparisons while v.2 does n/2(n - 1) comparisons. We can compare those two expressions and get a meaningful answer without involving big-O at all |
That's the only argument I've ever made: that removing the constants to satisfy Big-O notation is not what's best for a programmer trying to compare algorithms.
There's nothing wrong with borrowing the notation for your own needs: O(n^2 - n) vs O((n^2)/2 - n/2). Those are the worst-case comparison counts for V1 and V2. The omega changes if a condition is added to check for a swap and exit early when a pass makes none, giving a best case of Ω(n) (a sketch of all three versions follows). Would writing the same expressions without the O() and Ω() wrappers suddenly make everyone happy?
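To make those counts concrete, here is a minimal sketch of the kind of versions I mean - my own illustration, not code from earlier in the thread: a naive bubble sort whose inner loop always scans the full range, the shrinking-inner-loop version, and the early-exit variant.

#include <cstddef>
#include <utility>
#include <vector>

// V1: the inner loop always scans the full range, so every one of the
// n passes does n - 1 comparisons: n(n - 1) comparisons in total.
void bubble_v1(std::vector<int>& a)
{
    std::size_t n = a.size();
    for (std::size_t pass = 0; pass < n; ++pass)
        for (std::size_t i = 0; i + 1 < n; ++i)
            if (a[i] > a[i + 1]) std::swap(a[i], a[i + 1]);
}

// V2: the inner loop shrinks by one each pass because the largest remaining
// element is already in place: (n - 1) + (n - 2) + ... + 1 = n(n - 1)/2 comparisons.
void bubble_v2(std::vector<int>& a)
{
    std::size_t n = a.size();
    for (std::size_t pass = 0; pass + 1 < n; ++pass)
        for (std::size_t i = 0; i + 1 < n - pass; ++i)
            if (a[i] > a[i + 1]) std::swap(a[i], a[i + 1]);
}

// Early-exit variant: stop as soon as a pass makes no swaps. Worst case is
// still n(n - 1)/2 comparisons, but already-sorted input is done after a
// single pass of n - 1 comparisons - the Ω(n) best case.
void bubble_early_exit(std::vector<int>& a)
{
    std::size_t n = a.size();
    for (std::size_t pass = 0; pass + 1 < n; ++pass)
    {
        bool swapped = false;
        for (std::size_t i = 0; i + 1 < n - pass; ++i)
            if (a[i] > a[i + 1]) { std::swap(a[i], a[i + 1]); swapped = true; }
        if (!swapped) break;
    }
}

All three sort correctly; the factor of two between V1 and V2, and the Ω(n) best case of the third, are exactly the details a bare O(n^2) erases.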
To say, "Well you're using Big-O notation which doesn't allow it.. so you're wrong!" is to be emotionally tied down. It's like saying you can't use a wrench as a hammer; you'd be right in some cases, but in general you're just trying to get a nail through.
Other than whether it's appropriate to use Big-O notation at all, our opinions don't seem to have diverged. However, others, like lastchance, from what I've read are claiming exactly the opposite - that there is NO information to be derived, that Bubble sort written as O(n^2 - n) vs O((n^2)/2 - n/2) cannot be differentiated based on those constants, and that both should simply be labeled O(n^2) and left at that.
In fact, my last debate with Peter certainly went down that road: he tried to give an example of an algorithm that was O(2n) by dividing the work of an O(n) algorithm between two loops, implying that the constant is meaningless since both programs do the same amount of work (setting aside loop overhead). I think that argument is flawed in several ways, but I'm not going to argue with myself here (my reconstruction of that kind of example is below).
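For anyone who didn't follow that exchange, here is my own reconstruction of the kind of example being described - not Peter's actual code: finding the min and max of an array with one loop doing two comparisons per element, versus two loops each doing one. Roughly 2n comparisons either way.

#include <algorithm>
#include <utility>
#include <vector>

// Single O(n) loop doing two comparisons per element: about 2n comparisons.
// Assumes a non-empty vector.
std::pair<int, int> min_max_one_loop(const std::vector<int>& a)
{
    int lo = a.front(), hi = a.front();
    for (int x : a)
    {
        lo = std::min(lo, x);
        hi = std::max(hi, x);
    }
    return { lo, hi };
}

// The same work split across two loops, each O(n) - the "O(2n)" framing.
// Assumes a non-empty vector.
std::pair<int, int> min_max_two_loops(const std::vector<int>& a)
{
    int lo = a.front(), hi = a.front();
    for (int x : a) lo = std::min(lo, x);
    for (int x : a) hi = std::max(hi, x);
    return { lo, hi };
}

The comparison count is the same in both; the split-loop version just pays the loop overhead twice, which is the part that was being set aside.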