I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.
This one is about why it was a mistake to call 1024 bytes a kilobyte. It's about a 20-minute read, so thank you very much in advance if you find the time to read it.
I was confused when I read just the headline. It should be "Why I (that would be you, not me) think a kilobyte should be 1000 instead of 1024". Unpopular opinion would be a better sub for it.
It totally is a matter of opinion. These are arbitrary rules, made up by us. We can make up whatever rules we want to.
I agree that it's weird that kilo means 1024 only in CS. It would be logical to change that, to keep consistency across different fields of science. But that does not make it any less a matter of opinion.
You can't store data in base 10, nor address memory or storage in base 10, given present computers. It's a bit more than a matter of opinion that computers are base 2.
Yes, computers are base 2, but we can still make up whatever rules we want about them. We could even make up rules saying that we consider everything a computer does to be in base 10, but that it can only use the lowest 2 values of any given digit. It would be a total mess and it would make no sense whatsoever, but we could define those rules.
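Purely to illustrate how arbitrary that would be, here's a toy Python sketch of that made-up rule (my own example, nothing standard): a bit string read as decimal digits restricted to 0 and 1, versus the same string read as binary.

```python
# Toy illustration of the made-up rule above: every digit is base 10,
# but only the two lowest values (0 and 1) are allowed.
def bits_as_decimal(bits: str) -> int:
    # Enforce the "lowest 2 values of any given digit" rule.
    assert set(bits) <= {"0", "1"}, "only digits 0 and 1 are allowed"
    return int(bits, 10)  # interpret the digits in base 10

def bits_as_binary(bits: str) -> int:
    return int(bits, 2)   # the normal interpretation

print(bits_as_decimal("1010"))  # 1010  (a total mess)
print(bits_as_binary("1010"))   # 10
```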
1024 is not the standard. The standard term for 1024 is "kibi" or "Ki" and the standard term for 1000 is "kilo" and has been since the year 1795.
There was a convention to use kilo for 1024 in the early days of computing, since the "kibi" term didn't exist until 1998 (and took a while to become commonly used), but that convention was always recognised as an incorrect use of the term. People just didn't care much, especially since kilobytes were commonly rounded anyway. A 30,424-byte file is 29.7109375 kibibytes or 30.424 kilobytes... both will likely be rounded to 30 either way, so who cares if it's slightly wrong? Just use bytes if you need to know the exact size.
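If you want to sanity-check that arithmetic, here's a quick Python one-off (just my own worked example of the numbers above):

```python
size_bytes = 30_424
print(size_bytes / 1000)  # 30.424 kilobytes (kB)
print(size_bytes / 1024)  # 29.7109375 kibibytes (KiB)

# Rounded for display, the distinction disappears:
print(round(size_bytes / 1000), round(size_bytes / 1024))  # 30 30
```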
Also - hard drives, floppy disks, etc have always referred to their size in base 1000 numbers, so if you were working with 30KB in the early days of computers it was very rarely RAM. A PDP-11 computer, for example, might have had only 8,192 bytes of RAM (that's 8 kibibytes).
There are some places where the old convention is still used, and it can be pretty misleading as you work with larger numbers. For example, 128 gigs equals 128,000,000,000 bytes (if using the correct 1000 unit) or 137,438,953,472 bytes (if kilo/mega/giga = 1024).
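The gap is easy to see in a few lines of Python (my own sketch; the only assumptions are the usual 1000-based and 1024-based unit definitions):

```python
GB  = 1000**3   # gigabyte (SI, base 1000)
GiB = 1024**3   # gibibyte (binary, base 1024)

print(128 * GB)                              # 128000000000
print(128 * GiB)                             # 137438953472
print(128 * (GiB - GB))                      # 9438953472 bytes of difference
print(128 * (GiB - GB) / (128 * GB) * 100)   # ~7.4% larger
```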
The "wrong" convention is commonly still used for RAM chips. So a 128GB RAM chip is significantly larger than a 128GB SSD.
Also - hard drives, floppy disks, etc have always referred to their size in base 1000 numbers
That is not true. For a long time everything (computer related) was in the base 2 variants. Then the HD manufacturers changed so their drives would appear larger than they actually were (according to everyone's notions of what kb/mb/gb meant). It was a marketing shrinkflation stunt.