this post was submitted on 10 Jul 2023
488 points (92.5% liked)

Asklemmy


Hi, I'm a guy in my early thirties with a wife and two kids, and whenever I go on YouTube it always suggests conservative things: guys dunking on women, Ben Shapiro reacting to some bullshit, Joe Rogan, the works. I almost never let it take me to that type of video, and when I do it's either by accident or out of curiosity. My interests are gaming, stand-up comedy, memes, react videos, metallurgy, machining, and blacksmithing, with the occasional song or video about funny bullshit. I'm not from America, and I'd consider myself pretty liberal if I had to put it in American terms. But European liberal, so by American standards a socialist. Why does it recommend this shit to me? Is this some kind of vector for radicalizing guys in my category? Do you have a similar experience?

(page 3) 50 comments
[–] NAS89@lemmy.world 3 points 1 year ago (1 children)

Follow-up question: why won't YouTube Shorts trust me when I say I don't want a channel recommended?

So…many…dancing…trends. And they keep sending more.

[–] NightOwl@lemmy.one 2 points 1 year ago (1 children)

The YouTube algorithm only cares about engagement, not likes or dislikes as such. It's neutral between the two; all it registers is that people are actively leaving impressions. Whether you like something with joy or dislike it with anger, the engagement is a signal to show more. Counterintuitive as it sounds, I've heard it's better to just skip immediately and not press anything, since any interaction is read as receptiveness to the content.
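A toy sketch of that idea (purely illustrative, not YouTube's actual system; all names and numbers here are made up): if a ranker scores videos by total interactions without caring whether they're positive or negative, outrage that draws angry dislikes and comments outranks content people quietly enjoy.

```python
# Hypothetical engagement-only ranker: likes and dislikes count the same.
def engagement_score(video):
    # Any interaction is treated as a positive signal, regardless of sentiment.
    interactions = video["likes"] + video["dislikes"] + video["comments"]
    return interactions / max(video["impressions"], 1)

videos = [
    {"title": "calm tutorial", "likes": 50, "dislikes": 2, "comments": 5, "impressions": 1000},
    {"title": "rage bait", "likes": 30, "dislikes": 60, "comments": 90, "impressions": 1000},
]

ranked = sorted(videos, key=engagement_score, reverse=True)
# "rage bait" ranks first: 180 interactions per 1000 impressions beats 57.
```

Under this (assumed) scoring, angrily disliking a video helps it, which is why "just skip, don't touch anything" is the advice.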

[–] NAS89@lemmy.world 2 points 1 year ago

Yeah, I can understand that for likes/dislikes or comments like "this is dumb", but after hundreds of "do not recommend this channel" clicks, the algorithm should be able to infer a lack of interest in that kind of content.

[–] EssentialCoffee@midwest.social 3 points 1 year ago (1 children)

Like others have said, the things you watch are prime interests for the right wing in the US. You have to train the algorithm so it learns you don't want that.

[–] GoodEye8@lemm.ee 3 points 1 year ago* (last edited 1 year ago) (2 children)

I think the algorithm is so distorted by right-wing engagement that it will end up recommending right-wing content even if you actively try to avoid it. I watch YouTube Shorts and I always skip if it's Shapiro, Peterson, Tate, or Piers Morgan. I also skip the moment I sense a short might be right-wing. Scroll enough and eventually the algorithm will go, "How about some Shapiro, Peterson, Tate, or Morgan?" Give it enough time and it will always try to feed you right-wing content.

[–] jungekatz@lib.lgbt 1 points 1 year ago

Idk about that tbh. I have an account and my algo is pretty well trained, and there are times I've even told YouTube not to suggest Peterson and Shapiro stuff! I mostly watch leftist content! But I still get Ben Shapiro reacting to Barbie and whatnot (I'm not even American).

[–] erogenouswarzone@lemmy.ml 3 points 1 year ago (3 children)

If they piss you off, you will stay on their platform longer, and they make more money.

That is the sad truth of EVERY social network.

Lemmy might not be that advanced yet, but once it gets big enough to need ads to pay for bandwidth and storage, it will add algorithms that show you stuff that pisses you off.

One way to combat this is to take a break from the site. Usually after a week, when you come back it will be better for a while.

[–] EssentialCoffee@midwest.social 2 points 1 year ago* (last edited 1 year ago)

I think it has more to do with the stuff you watch than wanting to piss you off.

All YouTube recommends to me are videos of kpop, dog grooming, Kitten Lady, and some Friesian horse stable that went across my feed once. Oh, and some historical sewing stuff.

If they started recommending stuff that pissed me off, I wouldn't bother going back except for direct videos linked from elsewhere.

Edit: Rereading what OP said they watch, their interests are primary interests of the right wing in the US. If they don't train the algorithm to know they don't want that content, it has no way of learning that those interests don't intersect.

[–] Contramuffin@lemmy.world 3 points 1 year ago

I would wager a guess that it's your regular interests. YouTube sees that people who like machining, blacksmithing, etc. have a good chance of also being conservative. You probably are just part of the odd cases where you like those hobbies but aren't conservative.

Your post raises an interesting point, though: even if YouTube didn't intend for their algorithm to be a pipeline for radicalism, simply by encouraging engagement and viewership, their algorithm ends up becoming a radicalization pipeline anyways.

[–] Dohnakun@lemmy.fmhy.ml 3 points 1 year ago (1 children)

Just ignore the recommendations, they're mostly bullshit anyway.

[–] omidmnz@lemmy.world 3 points 1 year ago

https://addons.mozilla.org/en-US/firefox/addon/youtube-recommended-videos/ Unhook hides them for me! It's available for other browsers too.

[–] IDe@lemmy.one 3 points 1 year ago

The best way to tune the algorithm on Youtube is to aggressively prune your watch/search history.
Even just one "stereotypical" video can cause your recommendations to go to shit.

[–] ScotinDub@lemm.ee 2 points 1 year ago

I used to get reasonable content matching my interests. Now, after having a kid, it's just baby songs and Bluey. I use SmartTubeNext on a Fire Stick, so at least I don't get ads, but it's not great.

[–] Monkeyhog@lemmy.world 2 points 1 year ago

You need to learn how to properly prune your feed. I got some of that stuff briefly, but I kept blocking it and choosing "not interested", and eventually it stopped. My feed shows me nothing I don't want now. It's just a matter of shaping it into what you want.

[–] SaltySalamander@lemmy.fmhy.ml 2 points 1 year ago (5 children)

Why does it recommend this shit to me

Because you, or someone using your account, has watched this type of shit in the past.

[–] OneNot@lemmy.world 2 points 1 year ago

Yeah. I never really watched Shorts, but I recently started, and I get tons of right-wing/red-pill shit in there despite clicking "don't recommend this channel" (or whatever it says) on like 50 of those channels.

[–] reddig33@lemmy.world 2 points 1 year ago

Could be paid promotion. I get a lot of suggestions in my feed for some really awful music in genres that I never listen to. I wouldn’t be surprised if the record label is paying to put it there.

[–] 80085@lemmy.world 2 points 1 year ago

If I use a private window and don't log in, I get a lot of right-wing stuff. I've noticed it probably depends on IP/location as well; at work, YouTube seems to recommend things other people at the office watch.

If I'm logged in, I only get occasional right-wing recommendations interspersed with the left-wing stuff I typically like. About 1/20 videos are right-wing.

YouTube Shorts is different. It's almost all thirst-traps and right-wing, hustle culture stuff for me.

It could also be because a lot of the people who watch the same videos you do tend to also watch right-wing stuff.

In general, the algorithm tries to boost the stuff that maximizes "engagement," which is usually outrage-type stuff.

[–] angrymouse@lemmy.world 2 points 1 year ago

You don't need to have watched something to be bombarded with similar content. YouTube recommends things that are watched by people who watch the things you watch (sorry about that). And it seems to factor in overall popularity, at least for me, so it usually recommends stupidly popular right-wing things just because they're popular overall and happen to be watched by a lot of people who also watch Dota 2, for example. I had to stop YouTube from using my history to suggest content; now my front page is full of things I've already watched from my subscriptions, but for me it's better than YouTube's stupid suggestions.

I'm guessing you probably viewed enough of these videos that YouTube's dumb algorithm is like, "Oh hey, @V01t45@lemmy.fmhy.ml wants to see right-wing stuff, so let's show him that." I agree that it's very annoying. This is why we need to rally behind PeerTube and cancel YouTube.
