This post was submitted on 14 Mar 2025
1118 points (98.9% liked)

Technology

[–] killeronthecorner@lemmy.world 3 points 14 minutes ago* (last edited 13 minutes ago)

This is basically a veiled admission that OpenAI are falling behind in the very arms race they started. Good, fuck Altman. We need less ultra-corpo tech bro bullshit in prevailing technology.

[–] kittenzrulz123@lemmy.blahaj.zone 4 points 19 minutes ago
[–] BlameTheAntifa@lemmy.world 14 points 1 hour ago* (last edited 1 hour ago) (1 children)

I have conflicting feelings about this whole thing. If you are selling the result of training like OpenAI does (and every other company), then I feel like it’s absolutely and clearly not fair use. It’s just theft with extra steps.

On the other hand, what about open source projects and individuals who aren’t selling or competing with the owners of the training material? I feel like that would be fair use.

What keeps me up at night is that if training is never fair use, the natural result is that AI becomes monopolized by big companies with deep pockets who can pay to license unlimited content, and then we are all forever at their mercy for this entire branch of technology.

The practical, socioeconomic, and ethical considerations are really complex, but all I ever see discussed are hard-line binary stances that would only have awful corporate-empowering consequences: either companies can steal content freely, or they are the only ones with the resources to control the technology.

[–] patrick@lemmy.bestiver.se 2 points 25 minutes ago

Japan already passed a law that explicitly allows training on copyrighted material. And many other countries just wouldn’t care. So if it becomes a real problem the companies will just move.

I think they need to figure out a middle ground where we can extract value from the for-profit AI companies without actually restricting competition.

[–] BostonSamurai@lemm.ee 8 points 2 hours ago* (last edited 2 hours ago)

Oh no, not the plagiarizing machine! How are rich hacks going to feign talent now? Pay an artist for it?! Crazy!

[–] Yerbouti@sh.itjust.works 2 points 2 hours ago

Open can suck some dick.

[–] frog_brawler@lemmy.world 1 points 2 hours ago
[–] geography082@lemm.ee 37 points 8 hours ago

Fuck these psychos. They should pay for the copyrighted works they stole with the billions they already made. Governments should protect people, MDF

[–] Rakonat@lemmy.world 27 points 8 hours ago (1 children)

At the end of the day, the fact that OpenAI lost their collective shit when a Chinese company used their data and model to make their own, more efficient model is all the proof I need that they don't care about being fair or equitable. They get mad at people doing the exact thing they did, and they would aggressively oppose others using their own work to advance their own.

[–] merdaverse@lemmy.world 37 points 8 hours ago

TLDR: "we should be able to steal other people's work, or we'll go crying to daddy Trump. But DeepSeek shouldn't be able to steal from the stuff we stole, because China and open source"

[–] futatorius@lemm.ee 20 points 8 hours ago

Sounds fair, shut it down.

[–] azalty@jlai.lu 5 points 6 hours ago (3 children)

To be fair, they’re not wrong. We need to find a legal compromise that satisfies everyone.

[–] Pyr_Pressure@lemmy.ca 16 points 6 hours ago (1 children)

It's called paying for the content

[–] EnthusiasticNature94@lemmy.blahaj.zone 4 points 2 hours ago* (last edited 2 hours ago)

This.

I support AI, but I don't understand why AI bros are complicating things or making things all-or-nothing.

OpenAI had enough money to hire a hitman on one of their whistleblowers. They can afford to pay for content, lol.

[–] whoisearth@lemmy.ca 7 points 6 hours ago

But how will corporations like Disney survive without copyrights?! Won't someone think about the poor corporations?!

/s

[–] lightnsfw@reddthat.com 7 points 6 hours ago* (last edited 6 hours ago) (7 children)

Why? Nothing they've shat out is good for anything anyway.

[–] faberyayo@lemm.ee 12 points 8 hours ago

Fuck OpenAI for stealing the hard work of millions of people

[–] Daerun@lemmy.world 25 points 9 hours ago* (last edited 9 hours ago) (2 children)

Why is training OpenAI on literally millions of copyrighted works fair use, but me downloading an episode of a series that isn't available on any platform means years of prison?

[–] Wiz@midwest.social 1 points 2 hours ago

Have you thought about incorporating yourself into a company? Apparently that solves all legal problems.

[–] Ferroto@lemmy.world 16 points 9 hours ago

Good. Fuck AI

[–] FreddyNO@lemmy.world 4 points 6 hours ago

Sounds good, fuck em

[–] Ensign_Crab@lemmy.world 59 points 12 hours ago* (last edited 12 hours ago) (3 children)

If giant megacorporations can benefit by ignoring copyright, us mortals should be able to as well.

Until then, you have the public domain to train on. If you don't want AI to talk like the 1920s, you shouldn't have extended copyright and robbed society of a robust public domain.
