this post was submitted on 26 Nov 2024
559 points (97.1% liked)
Microblog Memes
A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.
Created as an evolution of White People Twitter and other tweet-capture subreddits.
Rules:
- Please put at least one word relevant to the post in the post title.
- Be nice.
- No advertising, brand promotion, or guerrilla marketing.
- Posters are encouraged to link to the toot or tweet, etc., in the description of posts.
you are viewing a single comment's thread
Seems pretty smart to me. Copilot took all the data out there that says women earn 80% of what their male counterparts do on average, looked at the function, and inferred a reasonable guess as to the calculation you might be after.
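(For anyone who can't see the screenshot: based on this comment, the suggestion was presumably something along the lines of the sketch below. The function name, signature, and structure are a hypothetical reconstruction for illustration, not the actual code from the post.)

```python
# Hypothetical reconstruction of the kind of completion being discussed;
# the real snippet from the screenshot is not reproduced in this thread.

def estimate_salary(base_salary: float, gender: str) -> float:
    # The sort of suggestion Copilot allegedly made, mirroring the widely
    # cited statistic that women earn roughly 80% of what their male
    # counterparts do on average.
    if gender == "female":
        return base_salary * 0.8
    return base_salary
```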
I mean, what it's probably actually doing is recreating a similarly named method from its training data. If Copilot could do all of that reasoning, it might actually be worth using.
Yeah, LLMs are more suited to standardizing stuff, but they're fed low-quality, buggy, or insecure code rather than data sets someone took the time to curate, which would be more beneficial in the long run.
That's the whole thing about AI, LLMs, and the like: without specific tweaking or filters, their outputs reflect the existing biases of people as a whole, not an idealized version of where we would like the world to be. So they will be as biased as the generally available data is.
Turns out GIGO still applies, but nobody told the machines.
The machines know; they just don't understand what's garbage versus what's less common but more correct.
It applies, but we decided to ignore it and just hope things work out.
More likely it pulled that bit directly from other salary-calculating code.