Programmer Humor
I know this isn't any kind of surprise, and yet, well...

[–] 018118055@sopuli.xyz 42 points 11 months ago (7 children)

2100 and 2400 will be a shitshow

[–] SkybreakerEngineer@lemmy.world 35 points 11 months ago (2 children)
[–] 018118055@sopuli.xyz 10 points 11 months ago

Yeah, that's a different shitshow, but agreed it's likely to be worse - like Y2K, the effects are smeared out before and after the date.

[–] borth@sh.itjust.works 2 points 11 months ago (2 children)
[–] leo@feddit.de 8 points 11 months ago
[–] Robmart@lemm.ee 6 points 11 months ago* (last edited 11 months ago) (1 children)

32-bit systems will stop working. The Unix timestamp, which increases by 1 every second and started counting at the first second of 1970, will hit the maximum of a signed 32-bit integer. Bad things will follow.
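
For anyone curious, here's a minimal sketch (assuming the classic signed 32-bit `time_t` found on older 32-bit systems) of where that counter tops out and where a wrapped value would land; the dates in the comments are the well-known 2038 rollover points:

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

int main(void) {
    /* A signed 32-bit time_t counts seconds since 1970-01-01 00:00:00 UTC
       and tops out at INT32_MAX. */
    time_t last = (time_t)INT32_MAX;
    printf("last 32-bit second: %s", asctime(gmtime(&last)));
    /* -> Tue Jan 19 03:14:07 2038 */

    /* One second later, a wrapped signed 32-bit counter would hold INT32_MIN. */
    time_t wrapped = (time_t)INT32_MIN;
    printf("after the wrap:     %s", asctime(gmtime(&wrapped)));
    /* -> Fri Dec 13 20:45:52 1901 */
    return 0;
}
```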

[–] bjorney@lemmy.ca 10 points 11 months ago

This has already been patched on all 64-bit OSes though - whatever 32-bit systems are still in existence in another 15 years will just roll their dates back 50 years and add another layer of duct tape to their jerry-rigged existence

[–] Frederic@beehaw.org 16 points 11 months ago

2038 will certainly be a shit show

[–] flameguy21@lemm.ee 8 points 11 months ago

Yeah but I'll be dead so not my problem lmao

[–] 0x4E4F@infosec.pub 3 points 11 months ago

Luckily, none of us will be there.

[–] deegeese@sopuli.xyz 3 points 11 months ago (1 children)

Nah.

Same thing happened in 2000 and it was a mouse’s fart.

[–] 018118055@sopuli.xyz 18 points 11 months ago (1 children)

Because of months of preparation. I know, I was doing it.

[–] deegeese@sopuli.xyz 3 points 11 months ago (1 children)

And now that every time library has been updated, we're safe until our grandchildren reimplement those bugs in a language that has not yet been invented.

[–] 018118055@sopuli.xyz 8 points 11 months ago (2 children)

I've already seen reimplementations of 2-digit dates here and there.

[–] deegeese@sopuli.xyz 5 points 11 months ago (1 children)
[–] 018118055@sopuli.xyz 4 points 11 months ago (1 children)

Fortunately I will not be involved. Hopefully I can make something from 2038 though.

[–] deegeese@sopuli.xyz 2 points 11 months ago

You’re not the only one foreseeing a nice consultant payday there.

[–] JustCopyingOthers@lemmy.ml 3 points 11 months ago

I went to uni in the mid-90s when Y2K prep was all the rage, then went back to do another degree 20 years later. It was interesting to see the graffiti in the CS toilets: two digits up to about 1996, four digits for a decade, then back to two.

[–] Midnight1938@reddthat.com 3 points 11 months ago (2 children)
[–] 018118055@sopuli.xyz 16 points 11 months ago (1 children)

2100 is not a leap year (divisible by 100 but not 400). 2400 is a leap year (divisible by 400). Developing for dates is a minefield.
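
A minimal sketch of the full Gregorian rule, since the naive divisible-by-4 shortcut is exactly what trips code up in 2100:

```c
#include <stdbool.h>
#include <stdio.h>

/* Gregorian rule: every 4th year is a leap year, except centuries,
   unless the century is also divisible by 400. */
static bool is_leap(int year) {
    return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
}

int main(void) {
    const int years[] = {2000, 2024, 2100, 2400};
    for (size_t i = 0; i < sizeof years / sizeof years[0]; i++)
        printf("%d: %s\n", years[i], is_leap(years[i]) ? "leap" : "not leap");
    /* 2000: leap, 2024: leap, 2100: not leap, 2400: leap */
    return 0;
}
```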

[–] Midnight1938@reddthat.com 1 points 11 months ago* (last edited 11 months ago)

Now imagine working with a non-Gregorian calendar, where the year is already 2060

[–] xmunk@sh.itjust.works 8 points 11 months ago

Because a naive `0 === year % 4` check says they're leap years, even though 2100 isn't one

[–] humorlessrepost@lemmy.world 1 points 11 months ago (1 children)

Won’t the computer’s clock reset every time you go to sleep and stop cranking the power generator?

[–] 018118055@sopuli.xyz 3 points 11 months ago

Yeah, who knows if our computers will be sticks by either date