screaming about “theft!” and “hacking!!”
Sounds plausible. Or maybe they will go with a "don't use it, because privacy!" take. Funny thing is, I actually agree people shouldn't give them their data. But they shouldn't give it to OpenAI either...
Wow, that's awesome!
I often think about the many devices I own with closed firmware in them, and the many amazing things these devices could be used for if they were more open and documented. Consider the amazing things people accomplish on old 80s/90s home computers and games consoles, often going way beyond what was thought possible with them at the time... the same could be done with so many other devices. Of course, people already do such hacking - like the example in the blog post. But the barrier would be so much lower if it didn't require elaborate reverse engineering (how do people find the time and energy for that....). I have a little collection of 90s synth modules; I would love to modify their firmware, if it were available.
Sometimes I wish there was a law that forced companies to open up datasheets/internal documentation/etc. for a product when they stop making it... But yeah, can't have that, of course.
I had to read “Uber for AI code data” so now you do too.
Wow, what a fractal of cursed meaning. I don't even understand what it really means, but it feels like understanding it any further would cause considerable psychic damage.
This shows the US is falling behind China, so you gotta give OpenAI more money!
Fear of a "bullshit gap", I guess.
Oh, and: simply perfect choice of header image on that article.
As a fan of physical media, I recently bought another drive as a spare; right now is IMO a good time for that. They still make really good drives in large enough quantities that they're cheap, but that could end any time. Once production stops, they will vanish silently. Learned that lesson back when floppy drives were suddenly gone... kinda wish I had stocked up on a few new ones (for retro computing purposes) while they were still available.
Sooner or later the only remaining source of reliable digital information will be 1990s multimedia CD-ROM encyclopedias.
various topics (e.g., AI news, crypto, fitness, personal finance)
That sure is a specific selection of topics.
Same. They've been a staple in my RSS feed list for so long (and they're one of the few sites where the RSS feed isn't just the headlines). But recently I've thought several times about throwing them out.
"Shortly after 2027" is a fun phrasing. Means "not before 2028", but mentioning "2027" so it doesn't seem so far away.
I interpret it as "please bro, keep the bubble going bro, just 3 more years bro, this time for real bro"
Maybe this is common knowledge, but I had no idea before. What an absolutely horrible decision from Google to allow this. What are they thinking?? This is great for phishing and malware, but I don't know what else it could possibly be good for. (Yeah ok, the reason probably has something to do with "line must go up".)
So much wrong with this...
In a way, it reminds me of the wave of entirely fixed/premade loop-based music making tools from years ago. Where you just drag and drop a number of pre-made loops from a library onto some tracks, and then the software automatically makes them fit together musically and that's it, no further skill or effort required. I always found that fun to play around with for an evening or two, but then it quickly got boring. Because the more you optimize away the creative process, the less interesting it becomes.
Now the AI bros have made it even more streamlined, which means it's even more boring. Great. Also, they appear to think that they are the first people to ever have the idea "let's make music making simple". Not surprising they believe that, because a fundamental tech bro belief is that history is never interesting and can never teach anything, so they never even look at it.
Maybe people believe that all the AI stuff is just magic [insert sparkle emoji], and that can terminate further thought...
Edit: heh, turns out there's science about that notion