I still don't understand the difference. GB sometimes means 1024 MB, and sometimes it means 1000 MB. And sometimes it means either 1000 MiB or 1024 MiB, depending on who's saying it.
The whole "iB" thing was supposed to clear up confusion but it only makes it worse. I don't understand why we can't just make 1GB = 1024 MB, ditch the silly "iB" nonsense, and then just call it a day.
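To put actual numbers on how bad the ambiguity gets, here's a quick Python sketch (the labels are just mine, for illustration) of the four different byte counts "a gigabyte" can work out to depending on who's talking:

```python
MB = 1000 * 1000    # decimal megabyte
MiB = 1024 * 1024   # binary mebibyte

# The four things people might mean when they say "a gigabyte":
readings = {
    "1000 MB (strict SI)":            1000 * MB,
    "1024 MB (mixed usage)":          1024 * MB,
    "1000 MiB (another mixed usage)": 1000 * MiB,
    "1024 MiB (1 GiB, strict IEC)":   1024 * MiB,
}

for label, nbytes in readings.items():
    print(f"{label}: {nbytes:>13,} bytes")
```

That's a spread of over 7% between the smallest and largest reading, all hiding behind the same two letters.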
I blame hard drive manufacturers. They're the ones who started this whole 1GB = 1000MB bullshit.
I agree with you. A long time ago, those of us "in the know" techies could parse the difference like it was a native language. When talking about anything but computers, it was always the SI meaning of 1000. When talking about computers, it was always 1024.
I think the masses were confused and the SI purists felt their prefixes were being corrupted. So they made a standard that distinguishes binary prefixes from decimal prefixes.
I hate it. It feels wrong because I'm old and set in my ways. People like me still use the old nomenclature, and when someone else uses it (when talking about computers), it's ambiguous to us because we don't know which numbering system they mean, binary or decimal. I still have to ask, and about half say binary and half say decimal.
I suppose if they're teaching it in high school and college it'll become native soon enough, if it hasn't already with the next generations.
I wasn't talking about HDD sizes, but I know the person to whom I was replying was talking about HDDs specifically. I should have clarified I was talking in more general terms (CPU RAM, NVM sizes, etc.)
I remember being miffed about the advertising of the HDD sizes, so I think you are correct there. Wish I could go back to the mid-'80s and do some research on my old HDDs and floppies. I honestly just can't remember, so thank you.
I hate the new prefixes, not just because they aren't the older nomenclature, but because they feel ridiculous to say out loud. If a less silly-sounding prefix had been chosen, I probably wouldn't be so sour on it.
The problem is that each time you go up another unit, the binary and decimal units diverge further.
It rarely mattered much when you were talking about the difference between kibibytes and kilobytes. In the 1980s, with the amounts of memory and storage available, the difference was minor, so the decimal unit was a pretty good approximation for most things. But as we deal with larger amounts of data, the error becomes more significant.
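To put rough numbers on that divergence, here's a quick Python sketch (the prefix names are just labels I picked) showing how much larger each binary unit is than its decimal counterpart as you climb the scale:

```python
# How far apart the binary (IEC) and decimal (SI) prefixes drift
# as the units get bigger.
prefixes = ["kilo/kibi", "mega/mebi", "giga/gibi", "tera/tebi", "peta/pebi"]

for power, name in enumerate(prefixes, start=1):
    decimal = 1000 ** power   # SI prefix: powers of 1000
    binary = 1024 ** power    # IEC prefix: powers of 1024
    gap = (binary - decimal) / decimal * 100
    print(f"{name}: binary is {gap:.1f}% larger than decimal")
```

It comes out to about 2.4% at the kilo level, 7.4% at giga, and roughly 10% at tera, which is why a "1 TB" drive looks noticeably short once the OS reports it in binary units.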