this post was submitted on 29 Jan 2024
892 points (99.1% liked)

[–] 800XL@lemmy.world 74 points 8 months ago (4 children)

Start spinning up GitHub repos populated with broken code and incorrect processes for the training jobs to scrape, making the AI worse.

[–] agent_flounder@lemmy.world 83 points 8 months ago (2 children)

Ha, that's just my regular code.

[–] hikikoma@ani.social 22 points 8 months ago

Is self-harm like this allowed?

[–] 800XL@lemmy.world 3 points 8 months ago

I really appreciate this comment. I feel the same.

[–] perviouslyiner@lemmy.world 17 points 8 months ago* (last edited 8 months ago)

They've already trained on Stack Overflow, if you want an AI that recommends a complete change of technology stack in preference to solving the problem at hand.

I don't know if it can also insult you for wanting to solve the problem, though.

[–] Blackmist@feddit.uk 6 points 8 months ago (1 children)

But Microsoft already bought npm.

[–] 800XL@lemmy.world 2 points 8 months ago

Microsoft ruins everything good.

[–] Hamartiogonic@sopuli.xyz 3 points 8 months ago

Has anyone made a program for poisoning code? Sort of like what Nightshade does for pictures.