this post was submitted on 24 Jan 2024
569 points (97.3% liked)

Programmer Humor

 
top 32 comments
[–] EisFrei@lemmy.world 89 points 10 months ago (1 children)
[–] tryptaminev@feddit.de 23 points 9 months ago

This is brilliant.

[–] bangupjobasusual@sh.itjust.works 83 points 10 months ago (2 children)

I think some compilers will just drop that in the optimization step.

[–] ryannathans@aussie.zone 28 points 9 months ago (1 children)

Real pain in the ass when you're in embedded and your carefully placed NOPs get stripped

[–] vrighter@discuss.tchncs.de 5 points 9 months ago

asm("nop");

[–] wreckedcarzz@lemmy.world 17 points 10 months ago (1 children)

Homer: "oh yeah speed ~~holes~~ sleep"

[–] jaybone@lemmy.world 3 points 9 months ago

Sleep holes

[–] Darkassassin07@lemmy.ca 71 points 10 months ago (2 children)

Tell the CPU to wait for you?

Na, keep the CPU busy with useless crap till you need it.

[–] jaybone@lemmy.world 27 points 9 months ago (1 children)

Fuck those other processes. I want to hear that fan.

[–] leclownfou@sh.itjust.works 4 points 9 months ago

I paid good money for my fan, I want to know it's working!

[–] kogasa@programming.dev 24 points 9 months ago

Have you considered a career in middle management

[–] aksdb@lemmy.world 26 points 9 months ago (2 children)

On microcontrollers that might be a valid approach.

[–] kevincox@lemmy.ml 11 points 9 months ago (1 children)

I've written these cycle-perfect sleep loops before.

It gets really complicated if you want to account for time spent in interrupt handlers.
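
Roughly what such a loop looks like, as a minimal C sketch: the 16 MHz clock and the 4-cycles-per-iteration figure are assumptions, and the real per-iteration cost has to come from the disassembly.

#include <stdint.h>

#define F_CPU 16000000UL      /* assumed clock: 16 MHz */
#define CYCLES_PER_ITER 4UL   /* assumed cost of one loop pass; check the disassembly */

static void delay_cycles_ms(uint32_t ms)
{
    uint32_t iterations = (F_CPU / 1000UL / CYCLES_PER_ITER) * ms;
    while (iterations--) {
        __asm__ volatile ("nop");   /* keeps the loop from being optimized away */
    }
    /* Any interrupt that fires mid-loop adds its handler's cycles on top,
       which is exactly where "cycle-perfect" stops being simple. */
}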

[–] aksdb@lemmy.world 2 points 9 months ago

Thankfully I didn't need high precision realtime. I just needed to wait a few seconds for serial comm.

[–] Darkassassin07@lemmy.ca 1 points 9 months ago (1 children)

But then I gotta buy a space heater too...

[–] YIj54yALOJxEsY20eU@lemm.ee 4 points 9 months ago (1 children)

Microcontrollers run 100% of the time even while sleeping.

[–] towerful@programming.dev 4 points 9 months ago (1 children)

Nah, some MCUs have low power modes.
The ESP32 has 5 of them, ranging from disabling fancy features and throttling the clock, to delegating to an ultra-low-power coprocessor, to just going to sleep until a pin wakes it up again. It can go from 240 mA down to 150 µA and still process things, or sleep at only 5 µA.
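
As a rough illustration, a minimal ESP-IDF sketch of a timer-based light sleep; the esp_sleep calls are the standard ESP-IDF API, but the 2-second wakeup is an arbitrary choice:

#include "esp_sleep.h"

void nap_for_two_seconds(void)
{
    /* Wake on a timer after 2,000,000 microseconds. */
    esp_sleep_enable_timer_wakeup(2 * 1000 * 1000);

    /* Light sleep keeps RAM powered and resumes here on wakeup;
       esp_deep_sleep_start() would reset the chip instead. */
    esp_light_sleep_start();
}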

[–] YIj54yALOJxEsY20eU@lemm.ee 3 points 9 months ago* (last edited 9 months ago)

Nah, sleeping != low power mode. The now-obsolete ATmega328 has a low power mode.
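
For comparison, the usual avr-libc pattern for putting an ATmega328 into power-down until an interrupt wakes it; this assumes the wake source (e.g. an external interrupt) is already configured:

#include <avr/sleep.h>
#include <avr/interrupt.h>

void power_down_until_interrupt(void)
{
    set_sleep_mode(SLEEP_MODE_PWR_DOWN);  /* deepest standard sleep mode */
    cli();
    sleep_enable();
    sei();
    sleep_cpu();        /* execution stops here until an interrupt fires */
    sleep_disable();
}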

[–] Matty_r@programming.dev 13 points 9 months ago (1 children)

This should be the new isEven()/isOdd(). Calculate the speed of the CPU and use that to determine how long it might take to achieve a 'sleep' of a required time.
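
Something along those lines in C, as a deliberately fragile sketch: spin() and fake_sleep_ms() are made-up names, it relies on the POSIX clock_gettime(), and the calibration drifts with frequency scaling and scheduling, which is rather the point.

#include <stdint.h>
#include <time.h>

/* Burn time in a loop; 'volatile' stops the compiler from deleting it. */
static void spin(uint64_t n)
{
    for (volatile uint64_t i = 0; i < n; i++) { }
}

static double seconds_now(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

static void fake_sleep_ms(unsigned ms)
{
    static double iters_per_ms = 0.0;
    if (iters_per_ms == 0.0) {
        /* One-time calibration: time a fixed number of iterations. */
        const uint64_t probe = 50u * 1000u * 1000u;
        double t0 = seconds_now();
        spin(probe);
        iters_per_ms = probe / ((seconds_now() - t0) * 1000.0);
    }
    spin((uint64_t)(iters_per_ms * ms));
}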

[–] henfredemars@infosec.pub 12 points 9 months ago

I took an embedded hardware class where we were specifically required to manually calculate our sleeps or use interrupts and timers, rather than using a library function to do it for us.
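
The interrupts-and-timers version might look like this on an ATmega328 at 16 MHz (both assumptions; the register names come from the AVR datasheet):

#include <avr/io.h>
#include <avr/interrupt.h>
#include <stdint.h>

volatile uint32_t g_ms_ticks = 0;

ISR(TIMER1_COMPA_vect)              /* fires once per millisecond */
{
    g_ms_ticks++;
}

void timer1_init(void)
{
    TCCR1A = 0;
    TCCR1B = (1 << WGM12) | (1 << CS11) | (1 << CS10);  /* CTC mode, clk/64 */
    OCR1A  = 249;                   /* 16 MHz / 64 / 250 = 1 kHz */
    TIMSK1 = (1 << OCIE1A);         /* enable the compare-match interrupt */
    sei();
}

void sleep_ms(uint32_t ms)
{
    uint32_t start = g_ms_ticks;    /* NB: a 32-bit read isn't atomic on AVR;
                                       a real version would guard it */
    while ((g_ms_ticks - start) < ms) {
        /* could drop into an MCU sleep mode here instead of spinning */
    }
}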

[–] vcmj@programming.dev 10 points 9 months ago* (last edited 9 months ago)
[–] drmoose@lemmy.world 10 points 9 months ago* (last edited 9 months ago) (2 children)

Javascript enters chat:

await new Promise(r => setTimeout(r, 2000));

Which is somehow even worse.

[–] sbv@sh.itjust.works 4 points 9 months ago

As someone who likes to use the CPU, I don't think it's worse.

[–] KairuByte@lemmy.dbzer0.com 2 points 9 months ago

I mean, it’s certainly better than pre-2015.

[–] ExtraMedicated@lemmy.world 5 points 9 months ago (1 children)

I actually remember the teacher having us do this in high school. I tried it again a few years later and it didn't really work anymore.

[–] snaggen@programming.dev 13 points 9 months ago (1 children)

In my first programming lesson, we were taught that a 1 second sleep was for i = 1 to 1000 😀; computers were not that fast back then...

[–] aBundleOfFerrets@sh.itjust.works 3 points 9 months ago (1 children)

I mean maybe in an early interpreted language like BASIC… even the Intel 8086 could count to 1000 in a fraction of a second

[–] snaggen@programming.dev 5 points 9 months ago

This was in 1985, on an ABC80, a Swedish computer with a 3 MHz CPU. So in theory it would be much faster, but I assume there were many performance losses (a slow BASIC interpreter and things like that), so that for loop got close enough to a second for us to use.

https://en.m.wikipedia.org/wiki/ABC_80

[–] lauha@lemmy.one 4 points 10 months ago

I can relate. We have breaks at work too.

[–] Socsa@sh.itjust.works 4 points 9 months ago

You gotta measure the latency of the first loop.

[–] Bronco1676@lemmy.ml 2 points 9 months ago

I just measured it, and this takes 0.17 seconds. And it's really reliable: I added another zero to that number and it was 1.7 seconds.