this post was submitted on 22 Dec 2023
425 points (93.5% liked)

Technology
you are viewing a single comment's thread
[–] Steve@communick.news 47 points 11 months ago (4 children)

the organization documented and reviewed more than a thousand reported instances of Meta removing content and suspending or permanently banning accounts on Facebook and Instagram.

Doesn't 1,000 seem small for an intentional, global censorship campaign? It seems very small to me, like a rounding error on a single day's worth of reported posts.

[–] ethan@lemmy.world 12 points 11 months ago

Most of this report is patently ridiculous. They asked people who follow HRW's social media to please send them instances of censorship on social media, got about 1,500 examples from a self-selecting population, then published a big exposé about it.

There's no intensive comparative analysis (statistical or otherwise) against other topics discussed, other viewpoints discussed, or other times in the past. They allege, for example, that some people didn't have an option to request a review of the takedown: is that standard policy? Does it happen in other cases? Is it a bug? They don't seem to want to look into it further; they just allude to some nebulous wrongdoing, then move on to the next assertion. Rinse and repeat.

The one part of the report actually grounded in reality (and a discussion that should be had) is how to handle content that runs afoul of standards against positive portrayal of terrorist organizations with political wings, like the PFLP and Hamas. It's an interesting challenge to decide where to draw the line on what to allow, but cherry-picking a couple thousand taken-down posts doesn't make that discussion any more productive.

[–] morrowind@lemmy.ml 9 points 11 months ago (1 children)

Those are just the documented ones. They don't exactly have access to Meta's modlogs.

[–] abhibeckert@lemmy.world 3 points 11 months ago* (last edited 11 months ago)

We have access to Lemmy.ml’s modlogs. I wonder how many pro-Palestinian posts have been deleted? I bet it’s more than zero… and Facebook probably handles more posts per second than lemmy.ml handles in a full day.

[–] rockSlayer@lemmy.world 4 points 11 months ago (1 children)

It's not enough to prove a pattern of behavior, but it's enough to call out as a disturbing trend.

[–] Steve@communick.news 4 points 11 months ago (1 children)

Is it? We'd need to know a lot more about how often this happens to other random groups to determine that.

[–] rockSlayer@lemmy.world 2 points 11 months ago

Facebook has a history of extreme status quo bias on issues like this. A statistical analysis should be the next priority. However, a trend is still a trend, even if it's unintentional.

[–] eclectic_electron@sh.itjust.works 1 points 11 months ago (1 children)

Indeed. It would be interesting to run the same analysis for censorship of pro-Israel content and compare the differences between the two, though the data would likely still be noisy and inconclusive.
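The comparison being suggested here could, in principle, be a simple two-proportion test on removal rates for the two topics. A minimal sketch, assuming you somehow had both removal counts and total post counts per topic; every number below is made up for illustration, since neither HRW's report nor Meta publishes those denominators:

```python
import math

def two_proportion_z(removed_a, total_a, removed_b, total_b):
    """Two-sided two-proportion z-test for H0: equal removal rates."""
    p_a = removed_a / total_a
    p_b = removed_b / total_b
    # Pooled rate under the null hypothesis that both topics are
    # moderated at the same rate.
    pooled = (removed_a + removed_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 1,500 removals out of 100,000 posts on topic A
# vs. 900 removals out of 100,000 posts on topic B.
z, p = two_proportion_z(1500, 100_000, 900, 100_000)
print(f"z = {z:.2f}, p = {p:.3g}")
```

Of course, the hard part isn't the arithmetic; it's that only Meta has the denominators, and a self-selected sample of reported takedowns tells you nothing about either rate.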

[–] BlueBockser@programming.dev 3 points 11 months ago

The fact that you're being downvoted for calling for a more thorough and objective investigation really says it all.