Locking this post because people are getting downright vicious to each other in the comments.
Not The Onion
Welcome
We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!
The Rules
Posts must be:
- Links to news stories from...
- ...credible sources, with...
- ...their original headlines, that...
- ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”
Please also avoid duplicates.
Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.
And that’s basically it!
So it takes ChatGPT 10 minutes to an hour of server time and the energy equivalent of a tank or two of gas to complete a simple task the user could have done in thirty seconds with their 40W brainmeats and a couple of pudgy fingers. That's just great. Good stuff, Altman. /s
But it will get better we promise please use AI. I swear you're going to love it we are going to replace all workers with agents, just use it, use AI. Please use AI. We are going to put AI into your taskbar use it. Just use AI. It will get better, might as well use AI now. Use AI now.
Are you saying cupcakes aren't high stakes?
I read this headline out loud, and the person I read it to literally said
"Oh, is this The Onion?"
You all like to be so negative about this because you're scared. Truth is, he's making the right move. Another company would have released the same, but with different ethics in mind. He also makes sure to note that people shouldn't trust it with anything important.
We all know that not everyone reads manuals and uses tech responsibly. That doesn't stop time, nor progress. Better that a non-profit company with at least somewhat positive ethics gets there first with stuff like this, instead of for-profit capitalist leeches.
Say all you want, there are arguments against OpenAI, for sure. But I'm not wrong.