But I don't understand the differences between mbits, MiB, and MB.
Well, it didn't get easier after the open-source world decided to go all SI-purist... Back in the olden days, in the context of computing, our 'k' meant 1024 (2^10), 'm' was 2^20, et cetera. People generally weren't confused - and we all knew the hard drive manufacturers used the normal-world (powers-of-ten) convention when stating drive sizes, those weaselly scumbags.
Today, pretty much all you can say for certain is this: if you see somebody using the terms KiB, MiB, kibibyte, mebibyte, then you're dealing with the old 1024-based numbers (and a basement-dwelling pedantic idealist you shouldn't ever try having a rational discussion with). Anywhere else, you'll have to study context and guesstimate. So much for standardization.
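To make the 1024-vs-1000 gap concrete, here's a quick sketch (the constant names are just my own shorthand, not any standard library's):

```python
# Binary (IEC) vs decimal (SI) sizes -- the gap grows with each prefix step.
KIB, MIB, GIB = 1024, 1024**2, 1024**3   # kibi-, mebi-, gibibyte: powers of two
KB,  MB,  GB  = 1000, 1000**2, 1000**3   # kilo-, mega-, gigabyte: powers of ten

print(MIB - MB)        # 48576 -- almost 5% difference already at "mega"
# Why a drive sold as "500 GB" (decimal) looks smaller in your OS,
# which typically reports binary units:
print(500 * GB / GIB)  # ~465.66
```

At "giga" the discrepancy is about 7%, which is exactly the drive-size shortfall people keep being surprised by.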
mbit/s or mbps ought to mean the same thing - megabits per second - but whether that's 2^20 or 10^6 bits depends on context. You'll also see the basement-dwellers insisting that zomg-it-should-be-a-capital-M, totally ignoring that outside extremely specific and rare situations, talking about millibits just doesn't make sense.
(I'm generally in favor of SI, because the units tend to simply make sense compared to legacy crap like pounds and yards and inches and whatnot; I'm just not convinced it belongs in computing. Especially since their 2^n unit names sound so ridiculous.)