My problem with C/C++ is that the people behind the spec have sacrificed our sanity in the name of "compiler optimization". Signed overflow behaves the same on every CPU on the planet, so why is it undefined behaviour? Even more insane, they specify that intN_t must be implemented via 2's complement... but signed overflow is still undefined, because compilers want to pretend they run on pixie dust instead of real hardware.
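For a concrete example (a minimal sketch; exactly what happens depends on your compiler and flags), this is the classic way it bites: since signed overflow "can't happen", the optimizer is allowed to fold x + 1 > x to true, even though on real hardware INT_MAX + 1 wraps to INT_MIN:

#include <limits.h>
#include <stdio.h>

/* Because signed overflow is UB, the compiler may assume x + 1 never
 * wraps and fold this whole function to "return 1" at -O2. */
int plus_one_is_bigger(int x) {
    return x + 1 > x;
}

int main(void) {
    /* On the hardware, INT_MAX + 1 wraps to INT_MIN, so this "should"
     * print 0 -- but optimized builds commonly print 1. */
    printf("%d\n", plus_one_is_bigger(INT_MAX));
    return 0;
}

Compile the same file with -fwrapv and the comparison goes back to honest wrapping, which is exactly the knob the spec could have made the default.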
Now if only the people behind Angular held the same belief.
> I spent about 10 of those in roles where my primary function was to write code. The other 10 have involved managing programmers, coaching them, consulting with organizations about how to manage them, running a codebase assessment practice and these days, well, actually content marketing.
Therein lies the biggest lie in development. There is no career path. I've been programming professionally for 25 years, and in all 25 of those years my primary function was to write code, because I turned down any promotion that would put me in management and away from doing what I love.
I follow Lina on Mastodon, and it's just funny to me that cutting-edge Linux video driver development is live-streamed by a Japanese cat girl VTuber.
I have hooks that reformat on write, so I use :w constantly. So :wq is easier.
Think his tombstone has :q! on it? I love vim, I've used it daily for 25 years now.
I have a Sun Netra X1 sitting on the floor that I've been meaning to get rid of. One thing to keep in mind with the Netra and Sun Fire servers: they have a System Configuration Card in the back which holds their MAC address, NVRAM settings, etc. Don't buy one with it missing.
I remember the 90s, when both Mac and Windows crashed on a daily basis. When was the last time you saw a legitimate BSOD that didn't involve hardware failure? When was the last time you had to reset the PRAM on your Mac just to get it to boot?
Are you me? I do the exact same thing... only I also made a Makefile to run all the stow commands for me.
As comprehensive as that is, they don't explain why we use 2's complement for negatives, and that's a shame because it's a really intuitive answer when it's explained.
For the following, pretend we're dealing with 4-bit numbers... 0000b to 1111b. Just to keep the examples sane.
With 4 bits we can represent 0 to 15... but let's add negatives the dumbest way possible: just make the high bit mean "negative" (this is called sign-magnitude), so 0000b-0111b is 0 to 7, and 1000b-1111b is -0 to -7. We have 0 and negative 0, which is weird, but let's ignore that for now. Does this system even work with basic arithmetic? Let's do a sanity check: if we have a negative number and add 1 to it, it should get closer to 0. -4 + 1 = 1100b plus 0001b = 1101b, or -5. That's the opposite of what we want.
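If you want to watch that failure mode yourself, here's a quick sketch in C (the helper name is mine, and everything is masked to 4 bits like the examples):

#include <stdio.h>

/* Decode a 4-bit sign-magnitude pattern: high bit is the sign,
 * low 3 bits are the magnitude. */
int sm_decode(unsigned bits) {
    int magnitude = bits & 0x7;
    return (bits & 0x8) ? -magnitude : magnitude;
}

int main(void) {
    unsigned neg4 = 0xC;               /* 1100b, sign-magnitude -4 */
    unsigned sum  = (neg4 + 1) & 0xF;  /* plain binary add, 4-bit wrap */
    printf("-4 + 1 decodes to %d\n", sm_decode(sum)); /* prints -5 */
    return 0;
}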
So to fix our problem we just need to reverse the order of the negative numbers. We can easily do that by inverting all the bits. 0000b-0111b is still 0 to 7, but now 1000b-1111b is -7 to -0. Does the math work? Let's see: -4 + 1 = 1011b plus 0001b = 1100b or -3! It works!
You'll find 1's complement actually works quite well for arithmetic (real 1's complement machines did need an "end-around carry": when an add carries out of the top bit, that carry gets added back into the low bit). So why don't we use it? You'll recall we ignored the fact that there's a positive and a negative zero. Let's address that now. Two zeros means comparisons are a pain: 0 != -0 without a bunch of extra logic.
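Here's that 4-bit 1's complement add in C, end-around carry included (helper names are mine):

#include <stdio.h>

/* Decode a 4-bit 1's complement pattern. */
int oc_decode(unsigned bits) {
    return (bits & 0x8) ? -(int)(~bits & 0x7) : (int)bits;
}

/* 4-bit 1's complement add: if the sum carries out of bit 3,
 * wrap that carry back into the low bit (end-around carry). */
unsigned oc_add(unsigned a, unsigned b) {
    unsigned sum = a + b;
    if (sum > 0xF) sum = (sum & 0xF) + 1;
    return sum & 0xF;
}

int main(void) {
    unsigned neg4 = 0xB; /* 1011b = -4 in 1's complement */
    printf("-4 + 1 = %d\n", oc_decode(oc_add(neg4, 0x1))); /* -3 */
    printf(" 2 + -1 = %d\n", oc_decode(oc_add(0x2, 0xE))); /* 1, thanks to the carry */
    return 0;
}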
It's a simple fix. What if we shift the negative number line over so the two zeros are the same? Simply add 1 to the number after inverting. 1111b (which was -0 in 1's complement) rolls over to become 0000b, which is 0! 1110b (-1 in 1's complement) becomes 1111b, and we'll call it -1, and so on. It even gives us an extra number, because 1000b now represents -8, which we couldn't represent in 1's complement.
And that's why we use 2's complement, and now you'll always remember why and how it works.
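And the punchline in code: with 2's complement, negation is just "invert and add 1", and plain binary addition does the right thing with no special cases (again a 4-bit sketch, names mine):

#include <stdio.h>

/* Decode a 4-bit 2's complement pattern. */
int tc_decode(unsigned bits) {
    return (bits & 0x8) ? (int)bits - 16 : (int)bits;
}

/* 2's complement negation: invert, add 1, stay in 4 bits. */
unsigned tc_neg(unsigned bits) {
    return (~bits + 1) & 0xF;
}

int main(void) {
    printf("-4 encodes as %u (1100b)\n", tc_neg(0x4));      /* 12 */
    printf("-4 + 1 = %d\n", tc_decode((0xC + 0x1) & 0xF));  /* -3 */
    printf("-0 = %d\n", tc_decode(tc_neg(0x0)));            /* 0: only one zero */
    printf("min = %d\n", tc_decode(0x8));                   /* -8, the extra number */
    return 0;
}

Notice tc_neg(0x0) comes back as plain 0: the -0 problem is gone, which was the whole point.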
Agreed. While I've never used ChatGPT on an actual project, I've tested it on theoretical problems, and I've never seen it give an answer that didn't have a problem.
So I would treat it like any answer on Stack Overflow: use it as a start, but definitely customize it and fix the edge cases.
Rust is the only language I know of that is actively being used at the kernel level all the way through to the web app level. Compare that with Swift, which is not only mostly tied to a single ecosystem, but even its "cross platform" stuff like libdispatch is littered with code like:
if #available(macOS 10.12, iOS 10.0, tvOS 10.0, watchOS 3.0, *)