this post was submitted on 16 Dec 2023
262 points (96.1% liked)

YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest

New research shows that a person's ideological leaning might affect what videos YouTube's algorithms recommend to them. For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories and otherwise problematic content.

all 28 comments
[–] flying_gel@lemmy.world 64 points 10 months ago (6 children)

I'm not sure it's just right-leaning users. I'm pretty far to the left and I keep getting anti-trans, anti-covid right-wing talking points quite frequently. I keep pressing thumbs down, but they keep coming.

[–] Stovetop@lemmy.world 32 points 10 months ago (2 children)

What YouTube sees:

"These videos keep eliciting reactions from users, which means that they prefer to engage with this content. This bodes well for our advertisers."

[–] Steve@communick.news 7 points 10 months ago (1 children)

Exactly. If you don't want to see them, the best thing is to ignore them.

[–] xangadix@lemmy.world 2 points 10 months ago

I actually report every single one of them, usually the channel too, for hate speech. That seems to keep them out of my feed.

[–] LWD@lemm.ee 6 points 10 months ago* (last edited 9 months ago)
[–] Caligvla@lemmy.dbzer0.com 16 points 10 months ago

Thumbs down actually makes YouTube recommend more stuff to you, because you've engaged with the content. As others have said, the best way to avoid recommendations you don't want is to just ignore the content and not watch it. If it's being recommended to you, click the three dots on the video thumbnail and choose "Not interested"/"Don't recommend channel".
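For illustration (a purely hypothetical sketch; YouTube's real ranking features and weights aren't public), an engagement-weighted ranker that counts any interaction as a positive signal would behave something like this:

```python
# Hypothetical sketch of an engagement-weighted ranker in which ANY
# interaction (like, dislike, comment, share) raises a video's score.
# Feature names and weights are invented for illustration only.

def engagement_score(stats: dict) -> float:
    weights = {
        "watch_seconds": 0.01,  # watch time is usually the dominant signal
        "likes": 1.0,
        "dislikes": 0.7,        # a dislike is still an interaction, so it still counts
        "comments": 1.5,
        "shares": 2.0,
    }
    return sum(weight * stats.get(signal, 0) for signal, weight in weights.items())

# A disliked-but-watched video can outrank a video you simply ignored:
rage_bait = {"watch_seconds": 600, "dislikes": 1, "comments": 3}
ignored = {"watch_seconds": 0}
print(engagement_score(rage_bait) > engagement_score(ignored))  # True
```

In a setup like that, the only interaction that doesn't feed the ranker is no interaction at all, which is why "just don't watch it" works better than the thumbs-down button.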

[–] ArcaneSlime@lemmy.dbzer0.com 14 points 10 months ago (1 children)

They're supposed to enrage you so you use their platform longer, hate-share the videos so others use their platform, etc. They know what they're doing.

[–] spudwart@spudwart.com 1 points 10 months ago

Which is why I don't share it, and I downvote it.

Clearly what I need to do next is immediately kill the tab or close the app.

I have definitely been hitting that “don’t recommend this channel to me” button noticeably more frequently lately…

[–] Thorny_Insight@lemm.ee 4 points 10 months ago (1 children)

I'd be really curious to know why this seems to be happening to so many people but not me. I'm a hardcore YouTube addict, but there's zero politics in my feed. I even follow many right-wing gun tubers, watch plenty of police bodycam footage, and occasionally might even view one or two videos from people like Jordan Peterson, Joe Rogan and Ben Shapiro, but even after that I might only get a few more recommendations for their videos, and once I ignore them they stop showing up. The only videos YouTube seems to be trying to force-feed me are game streamers I've never heard of, and judging by the view count on their videos, neither has anyone else.

[–] dexa_scantron@lemmy.world 5 points 10 months ago

You might be in a different test group. They always have a few different groups with different settings to check how well the algorithm is meeting their goals. That's how they know they make less money if they don't radicalize people.
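For what it's worth, this is how holdout groups are usually assigned in A/B testing generally (a generic sketch, not anything confirmed about YouTube's internals): a deterministic hash of a stable user ID, so each user stays in the same bucket across sessions.

```python
# Hypothetical sketch of holdout-group assignment for an A/B experiment;
# nothing here is specific to YouTube. Hashing a stable user ID keeps
# each user in the same bucket for the lifetime of the experiment.
import hashlib

def experiment_bucket(user_id: str, experiment: str, holdout_pct: int = 5) -> str:
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return "holdout" if bucket < holdout_pct else "treatment"

# Metrics such as revenue or watch time can then be compared between groups.
print(experiment_bucket("user-12345", "recs-ranker-v2"))
```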

[–] whenigrowup356@lemmy.world 1 points 10 months ago

Yeah, I've gotten similar results too. FWIW, I don't think downvoting is a good way to change your results. It seems to key into any interaction at all, and also watch time. As soon as I see certain people, I just start swiping away immediately.

[–] Amphobet@lemmy.dbzer0.com 43 points 10 months ago (1 children)

As tempted as I am to reply "Well, duh," I suppose it's good that we're getting research to back up what we already knew.

[–] PlzGivHugs@sh.itjust.works 7 points 10 months ago (2 children)

Yep. Every time I open YouTube before signing in, much of the front page is just far-right conspiracies, blatant misinformation, and other sketchy content.

[–] rckclmbr@lemm.ee 5 points 10 months ago* (last edited 10 months ago) (1 children)

Mine's all cycling, music videos, movie trailers, and lofi music. What are we doing differently?

Edit: oh, "before signing in"

Edit 2: I loaded it in an incognito window and it just says "youtube watch history is off" with no recommendations. I really think it's you man

[–] PlzGivHugs@sh.itjust.works 1 points 10 months ago

> Edit 2: I loaded it in an incognito window and it just says "youtube watch history is off" with no recommendations. I really think it's you man

Might be something to do with Chrome or the incognito mode. I'm on Firefox and get a filled-out home page when first loading YouTube (no history or cookies, mid-level fingerprint blocking). After logging in, I get the notification for disabled watch history, but not before.

[–] MrScottyTay@sh.itjust.works 2 points 10 months ago

Even before signing in, YouTube will still try to recommend what the overall household watches by matching you up to the IP address. It's also why, when signed in, you sometimes get recommendations for what someone else in your family watches even if you don't. The same thing happens when you're not signed in.

[–] vexikron@lemmy.zip 17 points 10 months ago (1 children)

They optimize recommendations to a large degree to induce anger and rage, because anger and rage are the most effective ways to drive platform engagement.

Facebook does the same.

[–] PoliticalAgitator@lemm.ee 1 points 10 months ago

We also have no idea what measures they take to stop the system being manipulated (if any).

The far-right could be working to ensure they're recommended as often as possible and if it just shows up as "engagement" or "impressions" on their stats, YouTube is unlikely to fight it with much enthusiasm.

[–] foggy@lemmy.world 9 points 10 months ago (1 children)

It's such a slippery slope. I avoid anything even mildly right-leaning to keep my algorithm clean. I sometimes watch stuff incognito to preserve my account's algorithm. What a world...

[–] shalafi@lemmy.world 8 points 10 months ago (3 children)

LiberalGunNut™ here! (Yes, we exist.) I do not experience this. Bear with me a moment...

I consume loads of gun related content on YouTube. Historical, gunsmithing, basic repair, safety, reviews, testing, whatever. My favorite presenters are apolitical, or at least their presentations are.

My recommendations should be overrun with right-wing bullshit. Yet they are not. My recommendations are more of the same, and often include interesting and related media. I may stray off into other fringe areas like prepping, but even that doesn't get radical, and my feed comes back to center in a hurry.

Can someone explain what I'm seeing here?

As a side note, I do experience this with my default "news" tab on Edge. Yes, it's 95% crap, but I sometimes see real news I want to follow up on. But fuck me, one time I clicked on a GenA vs. GenB article and got flooded with them. My Android feed does the same thing: clicked on a couple of stories about wild pigs, flooded. Hummingbird story? Flooded.

But I'm not getting this on YouTube. 🤷🏻‍♂️

[–] dexa_scantron@lemmy.world 2 points 10 months ago

They don't do it to everyone. Some people get put in test groups that get 'nice' algorithms that don't try to make you angry, so they can measure the effect on their revenue.

Subjective biases can play a huge part in stuff like this. The researchers behind this story had to go through a bunch of YouTube channels and determine whether they constitute extremist right wing content or not.

I think it’s a safe assumption that if you took the people consuming that content and asked them whether the video they just watched was right wing extremist content, most of them would say no.

So, it’s possible that you don’t think you’re being overwhelmed with right wing extremist content, but that somebody else looking at your viewing history might think you are.

[–] snek@lemmy.world 1 points 10 months ago

It is entirely possible that YouTube's algorithm doesn't see you as someone interested in right-wing rhetoric (or perhaps you've also downvoted such videos).

They explain how it works here (without technical details): https://blog.youtube/inside-youtube/on-youtubes-recommendation-system/

[–] KinNectar@kbin.run 2 points 10 months ago

I am so sick of getting redpilled on YouTube Shorts. On the one hand, I know that having the algorithm present content from across the political spectrum is healthy for discourse in some ways, but I have had enough Ben Shapiro for my lifetime, thank you. Also, it doesn't seem like the algorithm learns well from the "Don't recommend this channel to me" function.

[–] jtk@lemmy.sdf.org 1 points 10 months ago* (last edited 10 months ago)

Isn't that basically the whole point of the algorithm? Isn't it behaving exactly as advertised?

You don't even have to feed the algorithm to get those videos. I have my history turned off, so I don't get any suggestions on my home page anymore, but when I'm watching a video, the suggestions on the side invariably include a handful of right-wing idiots. You can sometimes see how YT might think they're related to what I'm watching (usually retro tech stuff), but they never actually are. I rarely see the same misfires with left-wing videos.

My guess is the brain-dead right-leaning viewers put that content in the high-engagement buckets, so it just gets suggested more often. I don't think left-leaning people engage much with left-wing media, because it's usually boring politics that doesn't infringe on basic human rights, and we already know how bad the right is just by seeing them suck with our own eyes; we don't need to be told about it over and over to believe it, or to get bullshit "gotcha" material for water-cooler conversations. We also already see how the politicians on the left suck in their own special ways without needing anyone to explain it to us. So what's the point in investing in a video when a few words, or none at all, will do the trick?

Algorithms target idiots with unfounded rage; it's just that simple. I wouldn't even call it an algorithm, just basic number crunching.