In computer science, the exact, absolute runtime is typically not the main concern when implementing algorithms (you can always get a faster computer). Instead, the runtime complexity matters more: it is an overall measure of the work an algorithm performs, independent of machine-specific details, and it directly determines how performance scales.
Since this is not an exact measurement, and actual performance is influenced by many other factors, sticking with an asymptotic (read: rough) measure is the best strategy. In addition, algorithms have best and worst cases. Unless you are trying to improve on a particular case, the worst case is what's typically compared:
let my_vec = vec![5, 6, 10, 33, 53, 77];
for i in my_vec.iter() {
    // iter() yields references, so dereference before comparing
    if *i == 5 {
        break;
    }
    println!("{}", i);
}
Iterating over this Vec<T> has a runtime complexity of O(n), where n is the length of the Vec<T>, regardless of the fact that the loop breaks right away. Why? Because of pessimism. In reality, it is often...
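The same pessimism shows up in any search with an early exit: the best case finds the target on the first comparison, but the worst case visits every element, and Big-O reports the latter. A minimal sketch of this idea, assuming a hypothetical linear_search helper (not from the text above):

```rust
// Linear search over a slice: returns the index of `target`, if present.
// Best case: the target sits at index 0, one comparison, O(1).
// Worst case: the target is absent, all n elements are inspected, O(n).
// The worst case is what the O(n) classification describes.
fn linear_search(haystack: &[i32], target: i32) -> Option<usize> {
    for (i, &item) in haystack.iter().enumerate() {
        if item == target {
            return Some(i); // early exit: this is the best case
        }
    }
    None // reached only after n comparisons: the worst case
}

fn main() {
    let my_vec = vec![5, 6, 10, 33, 53, 77];
    assert_eq!(linear_search(&my_vec, 5), Some(0));  // best case: first element
    assert_eq!(linear_search(&my_vec, 77), Some(5)); // found, but only at the end
    assert_eq!(linear_search(&my_vec, 42), None);    // worst case: full scan
    println!("all searches behaved as expected");
}
```

Even though two of the three calls exit early, the function is still classified as O(n), because the classification has to hold for every possible input.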