[–] p03locke@lemmy.dbzer0.com 1 points 1 year ago (5 children)

Discrimination is the wrong word. Technology has no morals or sense of justice. It is bias in the data that developers should have accounted for.

[–] steltek@lemm.ee 3 points 1 year ago

It's totally accurate though. It's basically the definition of systemic racism. Think about housing or financial policies that disproportionately fail minorities. They aren't some Klan manifesto; they just include banal qualifications and exemptions that end up producing the same result.

[–] slumberlust@lemmy.world 2 points 1 year ago (1 children)

This seems shortsighted. You are basically asking people to police their own biases. That's a tall ask for something no one can claim immunity from.

[–] p03locke@lemmy.dbzer0.com 1 points 1 year ago (1 children)

I am asking a group of scientists, who should be well-versed in statistics and weighting (you know, one of the biggest components of a machine learning model), to account for how biased their data is when engineering their model.

It's really not a hard ask.
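
For illustration, here's a minimal sketch of one standard way to do that accounting: inverse-frequency sample weighting, the same formula scikit-learn uses for its "balanced" class weights. The groups and data here are hypothetical, just to show the idea.

```python
from collections import Counter

def inverse_frequency_weights(groups):
    """Weight each sample by the inverse of its group's frequency,
    so every group contributes equally to the training loss."""
    counts = Counter(groups)
    n, k = len(groups), len(counts)
    return [n / (k * counts[g]) for g in groups]

# A toy dataset where group "B" is badly underrepresented.
groups = ["A"] * 9 + ["B"]
weights = inverse_frequency_weights(groups)
print(weights)  # each "A" sample gets ~0.56, the lone "B" sample gets 5.0
```

Passing weights like these into the training loss is the kind of basic correction I'm talking about.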

[–] Cortell@lemmy.world 1 points 1 year ago

So in other words, technology is just as biased as the people who designed it.

[–] Smokeless7048@lemmy.world 1 points 1 year ago (1 children)

It can be an imported bias/discrimination. I still think that word's fair.

Do you have a more accurate word?

[–] p03locke@lemmy.dbzer0.com 2 points 1 year ago

I already said it: bias. It's a common problem with LLMs and other machine learning models that model engineers need to watch out for.
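
To make "watch out for" concrete, here's a hypothetical audit of the kind an engineer might run: comparing a model's favorable-outcome rate across groups (a demographic parity check). The predictions and group labels are made up for illustration.

```python
def positive_rate(preds, groups, target_group):
    """Fraction of favorable decisions (1s) the model gives a group."""
    selected = [p for p, g in zip(preds, groups) if g == target_group]
    return sum(selected) / len(selected)

preds  = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]   # 1 = favorable decision
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

rate_a = positive_rate(preds, groups, "A")
rate_b = positive_rate(preds, groups, "B")
print(f"A: {rate_a:.0%}, B: {rate_b:.0%}, gap: {rate_a - rate_b:.0%}")
# A: 80%, B: 20%, gap: 60% -- a gap that large is the red flag
```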

[–] Cortell@lemmy.world 1 points 1 year ago

Ask the people who create the datasets that machine learning models train on how they feel about racism, and get back to us.