this post was submitted on 27 Dec 2023
144 points (69.6% liked)

Technology

I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.

This one is about why it was a mistake to call 1024 bytes a kilobyte. It's about a 20-minute read, so thank you very much in advance if you find the time to read it.

Feedback is very much welcome. Thank you.

[–] abhibeckert@lemmy.world 0 points 11 months ago* (last edited 11 months ago) (1 children)

"Kilo" means 1000 under the official International System of Units.

With some computer hardware, it's more convenient to use 1024 for a kilobyte, and in the early days nobody really cared that it was slightly wrong. It comes down to the way memory is physically laid out on a memory chip.

These days, people do care, and the correct prefix for 1024 is "kibi" (kilo binary), as in kibibyte (KiB). There's also mebi, gibi, tebi, exbi, etc.

It's mostly CPUs that use 1024 - and also RAM, because it's tightly coupled to the CPU. The internet, hard drives, etc., usually use 1000 because they don't have any reason to use a weird numbering system.
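The practical consequence of the two conventions is easy to show with a short sketch. The helper below is illustrative (the function name and prefix lists are my own, not from the thread): it formats the same byte count using decimal (1000-based) SI prefixes and binary (1024-based) IEC prefixes, which is why a drive marketed as "500 GB" shows up as roughly 465 GiB in an operating system that counts in 1024s.

```python
# Illustrative helper: format a byte count with SI (1000) or binary (1024) prefixes.
SI_PREFIXES = ["B", "kB", "MB", "GB", "TB"]
BIN_PREFIXES = ["B", "KiB", "MiB", "GiB", "TiB"]

def format_size(n_bytes, base, prefixes):
    """Repeatedly scale n_bytes down by `base` until it fits under one unit step."""
    value = float(n_bytes)
    for prefix in prefixes:
        if value < base:
            return f"{value:.2f} {prefix}"
        value /= base
    return f"{value:.2f} {prefixes[-1]}"

drive = 500_000_000_000  # a "500 GB" drive, as marketed (decimal bytes)
print(format_size(drive, 1000, SI_PREFIXES))   # 500.00 GB
print(format_size(drive, 1024, BIN_PREFIXES))  # 465.66 GiB
```

The ~7% gap between the two readings is exactly the discrepancy the blog post is about, and it grows with each prefix step (kilo vs kibi differ by 2.4%, giga vs gibi by about 7.4%).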

[–] mb_@lemm.ee 1 points 11 months ago

Weird numbering system? Things are still stored in blocks of 8 bits in the end; it doesn't matter.

When it gets down to what matters on hard drives, every byte still uses 8 bits, and all the other numbers that matter to people actually working in computer science are multiples of 8, not 10.

And because everything internal is binary, base 10 is "slower" (not that it matters any longer).
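The reason powers of two fall out of the hardware can be made concrete: a memory chip with n address lines can select 2**n distinct locations, so natural capacities land on 1024, 1048576, and so on, never on a round power of ten. A minimal sketch (plain arithmetic, nothing from the thread):

```python
# An n-bit address bus selects 2**n distinct memory locations,
# so natural memory capacities are powers of two, not powers of ten.
for n_lines in (10, 20, 30):
    locations = 2 ** n_lines
    print(f"{n_lines} address lines -> {locations} locations")
# 10 lines give 1024 (1 KiB); 20 give 1048576 (1 MiB); 30 give 1 GiB.
```

Since 2**10 = 1024 is only 2.4% off from 1000, "kilo" was a tempting shorthand, which is the confusion the post's kibi/mebi/gibi prefixes were introduced to resolve.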