Oh. Indeed, you're correct. I was thinking in computer terms instead of scientific terms. Personally, I see this as reinforcing that computing as a context wouldn't really benefit from using "proper" SI.
Note that no one is going to confuse mB for millibytes, because what would that even mean? And in practice MB versus MiB isn't ambiguous either, because except for mass storage no one counts bytes in powers of ten AFAIK.
And let's take a minute to appreciate the inconsistency of SI usage in practice: km is everywhere, but nobody writes Mm. KB through GB are used far more consistently.
> no one is going to confuse mB for millibytes because what would that even mean?
Data compression. For example, look at http://prize.hutter1.net/ , heading "Contestants and Winners for enwik8". On 23.May'09, Alex's program achieved 1.278 bits per character. On 4.Nov'17, Alex achieved 1.225 bits per character. That is an improvement of 0.053 b/char, or 53 millibits per character. Similarly, we can talk about how many millibits per pixel JPEG-XL is better than classic JPEG for the same perceptual visual quality. (I'm using bits as the example, but you can use bytes and reach the same conclusion.)
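To make the arithmetic concrete, here's a minimal sketch of the bits-per-character calculation. The compressed sizes are illustrative values back-calculated from the quoted ratios, not the actual archive sizes:

```python
ENWIK8_CHARS = 100_000_000  # enwik8 is the first 10^8 bytes of English Wikipedia

def bits_per_char(compressed_bytes: int, num_chars: int = ENWIK8_CHARS) -> float:
    """Compression ratio expressed as bits per input character."""
    return compressed_bytes * 8 / num_chars

size_2009 = 15_975_000  # illustrative: gives ~1.278 b/char
size_2017 = 15_312_500  # illustrative: gives ~1.225 b/char

improvement = bits_per_char(size_2009) - bits_per_char(size_2017)
print(f"{improvement * 1000:.0f} millibits per character")  # 53
```

The point being that sub-bit quantities fall out naturally once you divide by 10^8 characters, so milli- prefixes on bits (or bytes) are perfectly meaningful.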
Just because you don't see a use for mB doesn't mean it's open for use as a synonym of MB. Lowercase m means milli-, as already demonstrated in countless frequently used units - millilitre, millimetre, milliwatt, milliampere, and so on.
In case you're wondering, mHz is not a theoretical concept either. If you're generating a tone at, say, 440 Hz, you can express its frequency stability as millihertz of deviation.
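As a quick sketch of how that deviation works out, assuming a hypothetical oscillator spec of ±0.1 ppm (not a figure from this thread):

```python
nominal_hz = 440.0       # the tone frequency from the example above
stability_ppm = 0.1      # illustrative stability spec, assumed for this sketch

# ppm means parts per million, so scale by 1e-6 to get absolute deviation in Hz
deviation_hz = nominal_hz * stability_ppm * 1e-6
print(f"±{deviation_hz * 1000:.3f} mHz")  # ±0.044 mHz
```

Even a very good oscillator drifts by some tens of microhertz to millihertz at audio frequencies, which is exactly the regime where the mHz unit earns its keep.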