this post was submitted on 03 Aug 2023
202 points (100.0% liked)

Technology


A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

Employees say they weren’t adequately warned about the brutality of some of the text and images they would be tasked with reviewing, and were offered no or inadequate psychological support. Workers were paid between $1.46 and $3.74 an hour, according to a Sama spokesperson.

[–] fred-kowalski@kbin.sh 2 points 1 year ago (1 children)

Your comment inspired this thought: the older I get, the less faith I have in psychological support making us whole. I still think it should be part of work like this, but the damage can be as permanent as losing a limb. What is that worth in money? (hypothetical)

[–] TwilightVulpine@kbin.social 2 points 1 year ago

That's definitely something to consider. Psychological support helps people cope, but it doesn't remove the trauma. Anyone willing to do this sort of work deserves to be very well compensated.

But all that said, the need for people to sift through the worst content imaginable to shield everyone else from it isn't unique to AI. All user-generated internet content has that problem.