this post was submitted on 24 Jul 2023
816 points (85.4% liked)

Showerthoughts


A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. A showerthought should offer a unique perspective on an ordinary part of life.

Rules

  1. All posts must be showerthoughts
  2. The entire showerthought must be in the title
  3. Avoid politics
    • 3.1) NEW RULE as of 5 Nov 2024, trying it out
    • 3.2) Political posts often end up being circle jerks (not offering a unique perspective) or inflaming (too much work for mods).
    • 3.3) Try c/politicaldiscussion, volunteer as a mod here, or start your own community.
  4. Posts must be original/unique
  5. Adhere to Lemmy's Code of Conduct

founded 2 years ago

Yet.

(page 2) 50 comments
[–] ZIRO@lemmy.world 11 points 1 year ago* (last edited 1 year ago) (1 children)

I think that the left-right dichotomy is inherently flawed. A lot of what I believe might be considered "right-leaning" or "left-leaning," but I cannot say that I subscribe to either sort of ideology fully or with any fidelity.

I will always be opposed to any view with a pervasive "moral" authority, and both the so-called left and right are obsessed with their own versions of this. The problem we run into is the false supposition that beliefs can be categorized on a spectrum spanning right to left (or, even more liberally, a spectrum spread across two dimensions). It has been a ridiculous notion from its inception, whenever that might have been.

Building one's identity (another silly notion, in general—identity itself being a frivolous construct that functions only as a fulcrum for the extortion of social power) upon a supposed spectrum is likewise ridiculous. You can be conservative or liberal, or anything, really. But those beliefs do not exist in a linear or planar dimension. They are so far removed from each other that one cannot fathom sliding incrementally from one to the next.

And to each respective party, "left" and "right," the other can be demonized as evil, even without full comprehension of the other. It's all just so damned tribalistic and silly.

load more comments (1 replies)
[–] wwaxwork@lemmy.world 11 points 1 year ago (1 children)

Algorithms and AI. Rage gets views, so it's what gets pushed to the top, so it gets even more views, so it gets pushed to the top.

[–] damnYouSun@sh.itjust.works 8 points 1 year ago* (last edited 1 year ago)

Yeah, Lemmy has addressed this by just having an incredibly glitchy algorithm (look at this post with five upvotes from four months ago, it deserves to be on the front page). No one can game it because no one understands it.
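For the curious: the "hot" ranking being joked about is roughly this shape. This is a sketch based on the hot-rank formula that appeared in the Lemmy source around this time; the exact constants and log base may have changed since, so treat the numbers as illustrative only:

```python
import math

def hot_rank(score: int, hours_old: float) -> float:
    """Sketch of Lemmy-style hot ranking: rank grows with the log of the
    score and decays with age^1.8, so old posts fade no matter the votes."""
    return 10000 * math.log10(max(1, score + 3)) / (hours_old + 2) ** 1.8

# Five upvotes four months ago vs. one upvote an hour ago:
old = hot_rank(5, 4 * 30 * 24)
new = hot_rank(1, 1)
print(old, new)  # the age decay dwarfs the vote count
```

Because the decay term dominates, a months-old post reaching the front page is more likely a quirk of a quiet instance (few competing new posts) than of the formula itself.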

[–] dmmeyournudes@lemmy.world 7 points 1 year ago (7 children)

Can someone explain to me why everyone on this site thinks that everything bad about other social media sites is somehow being forced upon the users to enslave them to "the algorithm"? It's like socialist Qanon.

[–] ArbiterXero@lemmy.world 9 points 1 year ago (14 children)

Sooooo, there’s a lot of truth to it.

Once a site is big enough that its owners want to cash in on it, they develop tools and AI and make choices that are designed to keep you on the site longer.

These tools and AI quickly discover that the way to keep you engaged is to keep you enraged. Content that angers you will hold your attention longer and keep you coming back.

This is well researched and I’ll cite sources if you need it.

So what happens is that the AI, while it isn't designed explicitly to show right-wing content, will end up learning that showing that content accomplishes its actual goal, which was simply: keep people on the site longer.

Right-wing content fits a nice niche where it engages a lot of people. Donald Trump claiming that the election was stolen enrages the right, because they believe his horseshit, and enrages the left, because it leads to unnecessary violence like Jan 6th. The AI loves that because it's fairly universally enraging, and engaging, to most people.
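The "learning" step above doesn't need anything fancy. A toy illustration, not any real recommender: an epsilon-greedy loop that only optimizes "did the viewer react?" ends up showing the enraging post almost exclusively. The reaction probabilities here are invented:

```python
import random

# Invented per-impression reaction probabilities for two posts.
react_prob = {"calm take": 0.02, "enraging take": 0.30}
scores = {name: 1 for name in react_prob}

random.seed(42)
for _ in range(5000):
    if random.random() < 0.2:                # occasionally explore a random post
        shown = random.choice(list(react_prob))
    else:                                    # otherwise exploit the current leader
        shown = max(scores, key=scores.get)
    if random.random() < react_prob[shown]:  # viewer reacted
        scores[shown] += 1

print(scores)  # the enraging post dominates the impressions and the score
```

Nothing in the loop mentions politics or outrage; the skew falls out of optimizing reactions alone.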

[–] DauntingFlamingo@lemmy.ml 6 points 1 year ago (1 children)

To build upon this, just getting into a petty online argument about nothing keeps users coming back. I enjoy reading the back-and-forth between two strangers.

load more comments (1 replies)
load more comments (13 replies)
[–] Stovetop@lemmy.world 9 points 1 year ago

I don't know exactly what angle you're looking to clarify in that regard, but to ELI5 it:

There are two factors: targeted ads and algorithm manipulation.

Mainstream social media sites earn money from ads they deliver. The more people stay on the site and view posts, the more ads they see. The algorithm is designed to promote content that users are likelier to view, not necessarily content that they would like more. In practice, this tends to be content that provides some sort of shock value. That combination of targeted ads with clickbait creates "doomscrolling".
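To make the "likelier to view, not necessarily like" distinction concrete, here is a hypothetical example with invented numbers: the same two posts rank in opposite orders depending on whether you sort by how much users like them or by how likely they are to click:

```python
# Invented posts with made-up "liked" and click-through scores.
posts = [
    {"title": "thoughtful essay", "liked": 0.9, "click_rate": 0.05},
    {"title": "shock headline",   "liked": 0.2, "click_rate": 0.40},
]

by_liking = sorted(posts, key=lambda p: p["liked"], reverse=True)
by_engagement = sorted(posts, key=lambda p: p["click_rate"], reverse=True)

print(by_liking[0]["title"])      # thoughtful essay
print(by_engagement[0]["title"])  # shock headline
```

An ad-funded feed is paid per impression, so it sorts by the second key, and the shock headline wins.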

Longer explanation below:

The value that social media sites give to advertisers is that they know everything about their users. They collect data based on posts and viewing habits to learn things like income, hobbies, location, sexual orientation, political affiliation, etc. When advertisers buy ads to show on social media sites, they get to target these ads at specific people that they are likely to leave the biggest impact on.

But what happens if you want to increase the visibility of your (not ad) content on social media? A lot of companies use social media to bring people to their own sites/channels where they make money. In some cases, they can pay to be promoted, giving them an advantage in the algorithm. In other cases, they can manipulate the algorithm using clickbait (to engage users using the doomscrolling trend) or even using bots to give a false sense of engagement.

In recent major elections/referendums, there were a lot of ads and promoted content intended to sway opinions. People would intentionally be shown content to upset them, increasing doomscrolling and increasing their chances of getting out to vote against these things. However, in many cases, the content that people would see would be half-truths or outright lies. Because they were earning money, social media sites did not care about verifying the content of the ads they were showing.

It's been proven that Brexit, for example, was decided by voters who were manipulated via targeted ads and clickbait delivered by social media to believe falsehoods that swayed their vote. And in many cases, these lies weren't just spread by specific political campaigns, but actually by external governmental entities who had a vested interest in the outcome. Namely Russia, who had a lot to gain from a weaker EU.

Lemmy is not immune to doomscrolling and bot manipulation, but it doesn't have ads and, as far as we know, does not sell user data. It's harder to be targeted here because the only thing people can do is try to game the vote system to make their content more visible (which is sadly easier than it should be). But all you have access to are people subscribed to specific communities or registered on specific instances. It's harder to target people en masse, and you only have a single data point to target, namely people who like [community topic].

load more comments (5 replies)
[–] NumbersCanBeFun@kbin.social 6 points 1 year ago (1 children)

I mentioned this on another thread a while back, but most of Reddit is just bots or shills pushing a paid agenda. It's been nice not to see that activity here, but I know eventually it will rear its ugly head.

load more comments (1 replies)
load more comments