this post was submitted on 28 Jan 2024
997 points (97.7% liked)

Programmer Humor

[–] tatterdemalion@programming.dev 74 points 9 months ago* (last edited 9 months ago) (4 children)

It literally cannot come up with novel solutions, because its goal is to regurgitate the most likely response to a question based on training data scraped from the internet. Considering that the internet is often trash and getting trashier, I think LLMs will only get worse over time.

[–] space@lemmy.dbzer0.com 51 points 9 months ago (1 children)

AI has poisoned the well it was fed from. The only way to get a good AI going forward is to train it on curated data. That is going to be a lot of work.

On the other hand, this might be a business opportunity. Selling curated data to companies that want to make AIs.

[–] tatterdemalion@programming.dev 11 points 9 months ago (1 children)

I could see large companies paying to train the LLM on their own IP even just to maintain some level of consistency, but it obviously wouldn't be as valuable as hiring the talent that sets the bar and generates patent-worthy inventions.

[–] MagicShel@programming.dev 3 points 9 months ago

You can fine-tune a model with specific stuff today. OpenAI offers that right on their website, and big companies are already taking advantage. It doesn't take a whole new LLM, and the cost is a pittance in comparison.
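As a rough sketch of what that preparation looks like: OpenAI's fine-tuning for chat models expects training examples as JSON Lines, one chat transcript per line. The snippet below only builds such a file locally from hypothetical company Q&A pairs (the questions, answers, and filename are made up for illustration); actually uploading it and creating the fine-tuning job would additionally require the openai client and an API key.

```python
import json

# Hypothetical in-house Q&A pairs a company might fine-tune on.
examples = [
    {"question": "What is our internal code-review checklist?",
     "answer": "See CONTRIBUTING.md: tests, lint, and two approvals."},
    {"question": "Which branch are releases cut from?",
     "answer": "Releases are cut from the 'release' branch."},
]

def to_jsonl(pairs):
    """Convert Q&A pairs to OpenAI's chat fine-tuning format:
    one JSON object per line, each with a "messages" list."""
    lines = []
    for p in pairs:
        record = {"messages": [
            {"role": "system", "content": "You answer questions about company policy."},
            {"role": "user", "content": p["question"]},
            {"role": "assistant", "content": p["answer"]},
        ]}
        lines.append(json.dumps(record))
    return "\n".join(lines)

jsonl = to_jsonl(examples)
with open("training_data.jsonl", "w") as f:
    f.write(jsonl)
```

The curated data itself is the expensive part; the formatting is trivial by comparison.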

[–] cybersandwich@lemmy.world 45 points 9 months ago (2 children)

I said this a while ago but you know how we have "pre-atomic" steel? We are going to have pre-LLM data sets.

[–] Obi@sopuli.xyz 17 points 9 months ago

Low-background steel, also known as pre-war steel, is any steel produced prior to the detonation of the first nuclear bombs in the 1940s and 1950s. Typically sourced from ships (either as part of regular scrapping or shipwrecks) and other steel artifacts of this era, it is often used for modern particle detectors because more modern steel is contaminated with traces of nuclear fallout.

Very interesting, today I learned.

[–] DudeDudenson@lemmings.world 16 points 9 months ago

That's why ChatGPT 3.5 is still great for anything prior to its cutoff date: it's not constantly being updated with new garbage.

[–] ArrogantAnalyst@feddit.de 28 points 9 months ago (1 children)

Also the more the internet is swept with AI generated content, the more future datasets will be trained on old AI output rather than on new human input.

[–] tatterdemalion@programming.dev 15 points 9 months ago

Humans are also now incentivized to safeguard their intellectual property from AI to keep a competitive advantage.