this post was submitted on 25 Jul 2023
53 points (65.1% liked)

Technology

top 44 comments
[–] CookieJarObserver@sh.itjust.works 123 points 1 year ago (3 children)

Mastodon is software; the content is hosted by others. What an idiotic, clickbait, and plain wrong headline.

[–] deweydecibel@lemmy.world 39 points 1 year ago* (last edited 1 year ago) (3 children)

It's software that also serves as a means of distributing and accessing that content. But ultimately it doesn't matter; the resulting pushback will be the same.

The conclusion of the study was basically that the biggest players should enter the fediverse in order to use their capabilities to scan and police it.

Wherever this shit exists, unwanted attention and scrutiny will follow, and the platform's reputation will be harmed.

[–] DarkThoughts@kbin.social 29 points 1 year ago

That's like blaming vbulletin for Nazi forums or something.

[–] CookieJarObserver@sh.itjust.works 15 points 1 year ago (1 children)

This will literally do nothing; the people conducting the study obviously have absolutely no idea how federation and the fediverse in general work...

And the big players will be thrown out by everyone else; we are here because we hate them.

[–] HKayn@dormi.zone 2 points 1 year ago (1 children)

the people conducting the study obviously have absolutely no idea how federation and the fediverse in general work...

How do you know that?

[–] CookieJarObserver@sh.itjust.works 18 points 1 year ago (1 children)

Study:

Mastodon = server host

Mastodon = responsible for content

Mastodon = able to moderate

Reality:

Mastodon ≠ server host

Mastodon ≠ responsible for content

Mastodon ≠ able to moderate

[–] HKayn@dormi.zone 3 points 1 year ago (1 children)

Where does the study suggest that the Mastodon software is able to moderate, or that it's responsible for content?

[–] CookieJarObserver@sh.itjust.works 1 points 1 year ago (1 children)

Did you read the headline? Or the article?

[–] HKayn@dormi.zone 2 points 1 year ago* (last edited 1 year ago)

Did you read the actual study that this article refers to?

Going by your lack of further response, I'm going to assume you didn't; otherwise you'd have noticed that you're wrong. I recommend reading the sources of articles in the future before commenting on them.

The conclusion of the study was basically that the biggest players should enter the fediverse in order to use their capabilities to scan and police it.

Not sure that would work, since users are fleeing those big players precisely because they don't prioritize the safety and needs of their users.

The underlying contradiction is that the current major corporations prioritize money at all costs, even at the expense of their users, so their customer base flees to the next-best service or product.

People are currently abandoning Reddit and Twitter because their moderation systems either don't work or contradict what users are asking for.

Facebook launched Threads, and people only joined initially due to FOMO. With how transparent they are about harvesting user data at the expense of people's privacy, I think (and hope) that people are starting to realize this is probably not in their best interest.

I think what we're seeing is an evolutionary filtering of the web, similar to natural ecosystems, where the species best able to adapt is the one that survives.

Based on one metric, it seems that companies structured around proprietary software (zero-sum systems) are unsustainable. This is my untested observation, however, so it could hold true now but prove systemically wrong once examined and tested.

So the idea that

biggest players should enter the fediverse in order to use their capabilities to scan and police it.

doesn't seem to make much logical sense, as the foundation those companies are built on is untrustworthy and unsustainable.

[–] Synthead@lemmy.world 17 points 1 year ago

Yup. Might as well blame Nginx.

[–] pastermil@sh.itjust.works 2 points 1 year ago

Classic Verge

[–] ChaoticEntropy@feddit.uk 66 points 1 year ago

Windows is the No. 1 OS amongst cocaine, heroin, and meth dealers. Time for a crackdown.

[–] LexiconDexicon@lemmy.world 53 points 1 year ago

And here we go with the corporation funded clickbait folks...

[–] deweydecibel@lemmy.world 34 points 1 year ago (4 children)

And here we go.

This will be one of the Fediverse's biggest obstacles.

This needs to be brought under control somehow, or else in a few years tech companies, banks, and regulators will decide that a crackdown on the fediverse as a whole is needed.

[–] cerevant@lemmy.world 17 points 1 year ago

The fediverse is the name for services that use ActivityPub - a communication protocol. What you are saying is like saying “tech companies, banks and regulators need to crack down on http because there is CSAM on the web”.
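
For context, here is a minimal illustrative sketch (in Python) of the kind of ActivityStreams JSON that ActivityPub servers exchange; the actor and object values below are made up for the example.

```python
# Illustrative only: a made-up ActivityPub "Create" activity wrapping a "Note",
# the kind of JSON document federated servers POST to each other's inboxes.
import json

activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://example.social/users/alice",  # hypothetical actor
    "object": {
        "type": "Note",
        "content": "Hello, fediverse!",
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
    },
}

print(json.dumps(activity, indent=2))
```

The protocol only defines how servers exchange documents like this; it has no more say over what people put in them than HTTP has over what gets published on the web.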

[–] weedazz@lemmy.world 14 points 1 year ago

A few years? I bet Threads is doing this right now to shut down every private instance and take the fediverse for themselves. They'll argue they're the only ones who can moderate the content, due to their size and resources.

[–] Blamemeta@lemmy.world 4 points 1 year ago (1 children)

You act like the fediverse is one website, and it's not.

[–] LinkOpensChest_wav@lemmy.one 6 points 1 year ago

I think they're speaking from the point of view of an uneducated body of legislators and average people who will not understand this.

It doesn't matter what we know the nature of the fediverse to be -- it matters how they perceive it, and uninformed people are perfect targets for this type of FUD

For example, the linked article exists

[–] mbelcher@kbin.social 23 points 1 year ago

Far-right instances Gab and TruthSocial are also technically Mastodon. By this metric, Mastodon also has a Nazi problem.

Any software that allows people to communicate over the internet will be used by horrible people to do horrible things.

[–] demonsword@lemmy.world 18 points 1 year ago

100% of all child abusers have drunk water in the last two days. Clearly water is the problem here.

[–] starman@programming.dev 16 points 1 year ago

Well, then http[s] also has this problem

[–] Metal_Zealot@lemmy.world 11 points 1 year ago

That's like child molesters texting each other, and then saying "TELUS AND BELL ARE PERPETUATING A CHILD SEX TRAFFICKING RING"

[–] syntacticmistake@kbin.social 10 points 1 year ago

People need to read the actual report. It is reasonable in its findings and actually offers solutions: https://cyber.fsi.stanford.edu/io/news/addressing-child-exploitation-federated-social-media

[–] MarioBarisa@lemmy.ml 10 points 1 year ago (1 children)

Just added “Stanford researchers” to my list of stupid people

[–] such_fifty_bucks@lemmy.one 9 points 1 year ago (1 children)

https://www.aljazeera.com/news/2023/7/20/stanford-president-resigns-following-research-ethics-probe

The president of Stanford University has stepped down in the wake of an independent investigation that found “substandard practices” in research papers he was involved in.

Already plenty of support for your cause.

[–] Glarrf@midwest.social 9 points 1 year ago

We need more tools, more automation, in order to fight the trash

Anything that allows people to escape corporate control will be ostracized by these people. I laugh in LEMMY

[–] whenigrowup356@lemmy.world 6 points 1 year ago (1 children)

Shouldn't it be possible to create open-source bots that use the same databases as the researchers to automatically flag and block that kind of content?

[–] ozymandias117@lemmy.world 5 points 1 year ago

Those databases are highly regulated, as they are themselves CSAM.

Apple tried using fuzzy hashes so they could be downloaded to devices, and it wasn't able to reliably identify things at all.
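
For what it's worth, the matching side is simple enough to sketch; the hard part is exactly what the comment above describes: getting access to the hash lists in the first place. Here is a minimal Python sketch using the imagehash library, assuming a hypothetical local file of hex-encoded perceptual hashes (the real databases, e.g. PhotoDNA/NCMEC's, are access-restricted and use proprietary algorithms).

```python
# Minimal sketch of perceptual-hash matching for a moderation bot.
# "known_hashes.txt" is a hypothetical file of hex-encoded pHashes;
# real industry hash databases are access-restricted and proprietary.
from PIL import Image
import imagehash

HAMMING_THRESHOLD = 8  # max bit difference to count as a match (tunable)

def load_known_hashes(path: str) -> list[imagehash.ImageHash]:
    with open(path) as f:
        return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

def is_flagged(image_path: str, known: list[imagehash.ImageHash]) -> bool:
    candidate = imagehash.phash(Image.open(image_path))
    # Subtracting two ImageHash objects returns their Hamming distance.
    return any(candidate - h <= HAMMING_THRESHOLD for h in known)

if __name__ == "__main__":
    known = load_known_hashes("known_hashes.txt")
    print(is_flagged("upload.jpg", known))
```

Fuzzy (perceptual) hashes tolerate resizing and re-encoding but trade that robustness for false positives, which is the reliability problem mentioned above with Apple's approach.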

[–] woefkardoes@lemmy.world 5 points 1 year ago

Somewhere along the way we went from finding those who do wrong and punishing them to censoring everything because everyone is presumed bad.

[–] MyOpinion@lemm.ee 3 points 1 year ago

The Apache Foundation has a huge child sex abuse problem. It must be policed by Microsoft. /s

[–] JazzAlien@lemm.ee 2 points 1 year ago

Reposting because the comments on the earlier post were far too informative in explaining the truth.

[–] bender@insaneutopia.com 2 points 1 year ago

They’re basically saying the fediverse has a bunch of creeps lurking.

[–] Techmaster@lemmy.world -5 points 1 year ago (1 children)

So they went on mastodon and started searching for CP? WTF is wrong with these sick people? I hope they're on an FBI list now.

[–] deong@lemmy.world 8 points 1 year ago (1 children)

So your advice to any organization seeking to minimize illegal activity is to willfully ignore any trace of it?

[–] Techmaster@lemmy.world 1 points 1 year ago* (last edited 1 year ago) (1 children)

"I swear, officer, I was just searching for CP to catch OTHER people!"

It would be just as pathetic as that scene from There's Something About Mary. "Yeah, I was just going to pee, too!"

Maybe they had some kind of legal sanction to do it, but holy crap, I wouldn't want that in my search history. I would hope software like that has some mechanism where searching for certain words results in an automatic report to some FBI API somewhere. I actually know of a couple of people who got caught with that stuff. One got 25 years. The other jumped bail, and they eventually caught him. I'm not sure if he's been sentenced yet, but I bet he'll get double what the other guy who cooperated got. Those people are creepy AF and nobody in their right mind would want to be associated with any of it. Those people are ten times worse than neo-Nazis.

The funny thing is the first guy, everybody could kind of tell he was a creep. But the FBI caught him and he completely cooperated and admitted everything. The second guy, he really seemed like he was going to be the only person in his family who actually turned out to be a decent guy. He was a really sweet kid in a super trashy family. And then all of a sudden everything goes down and everybody is in shock. Then he jumps bail. Last I heard his dad was about to lose his house because he used it as collateral to bail his piece of shit son out of jail.

This open-source software needs to include code that reports certain search terms. There are ML algorithms out there that can automatically detect this stuff. Do not search for that kind of content thinking you're some sort of vigilante. There are ways to deal with this shit without putting yourself in serious legal peril.
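
For illustration only, server-side search-term flagging along the lines suggested could look something like the sketch below; the denylist and the report() hook are hypothetical placeholders (in practice, reports go through dedicated channels such as NCMEC's CyberTipline rather than a public FBI API, and production systems rely on ML classifiers and hash matching rather than keyword lists).

```python
# Minimal sketch of server-side search-term flagging.
# DENYLIST contents and the report() destination are placeholders; a real
# deployment would use vetted term lists, ML classifiers, and a proper
# reporting pipeline, not hard-coded keywords and print().
import re

DENYLIST = {"example-banned-term-1", "example-banned-term-2"}  # placeholder terms

def tokenize(query: str) -> set[str]:
    # Lowercase and split the query into simple word/number tokens.
    return set(re.findall(r"[a-z0-9-]+", query.lower()))

def check_search(user_id: str, query: str) -> bool:
    """Return True if the query was flagged (and a report dispatched)."""
    hits = tokenize(query) & DENYLIST
    if hits:
        report(user_id, query, sorted(hits))
        return True
    return False

def report(user_id: str, query: str, hits: list[str]) -> None:
    # Placeholder: a real implementation would queue the event for human
    # review and, where legally required, forward it to the proper authority.
    print(f"flagged user={user_id} terms={hits}")

if __name__ == "__main__":
    check_search("user-123", "example-banned-term-1 something else")
```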

[–] deong@lemmy.world 1 points 1 year ago

I don't think you understand how a research organization works. This isn't three guys in a basement searching for child porn. It's a research institute at Stanford University. They'll have gotten funding to do the work by applying for federal grants, getting approval from multiple Institutional Review Boards who are charged with, among other things, making sure that the people involved in the research are appropriately taken care of. They'll be required to have counselors on board. However "legit" you think such an outfit might possibly be, multiply that by three.

This is their job. It is the same as if they worked for a law enforcement agency. When someone gets arrested for child porn, we don't also charge the police, prosecutors, and judges who might have to look at the material as part of prosecuting a case. I promise you Stanford isn't paying a team of professors and postdocs to just diddle themselves to kiddie porn all day.
