this post was submitted on 10 Jul 2023
488 points (92.5% liked)

Asklemmy


Hi, I am a guy in my early thirties with a wife and two kids, and whenever I go on YouTube it always suggests conservative things like guys dunking on women, Ben Shapiro reacting to some bullshit, Joe Rogan, the works. I almost never allow it to go to that type of video, and when I do it is either by accident or out of curiosity. My interests are gaming, standup comedy, memes, react videos, metallurgy, machining, blacksmithing, with occasional songs and videos about funny bullshit. I am not from America, and I consider myself pretty liberal if I had to put it into the terms used in America. But European liberal, so by American standards a socialist. Why does it recommend this shit to me? Is this some kind of vector for radicalization of guys in my demographic? Do you have a similar experience?

(page 2) 50 comments
[–] bownt@lemmy.ml 10 points 1 year ago

You are a closet conservative. The algorithm has spoken.

[–] nom_nom@lemmy.ml 9 points 1 year ago* (last edited 1 year ago)

A few years ago I installed a browser extension that hides recommended videos, both on the homepage and beside the video I'm watching. I can only watch what I'm subscribed to and what I search for. You'd think it's a big sacrifice because you can't discover as many videos, but in reality I've gained back so much of my time and so much control over what I actually want to watch.

[–] CurlyWurlies4All@prxs.site 8 points 1 year ago
[–] ImplyingImplications@lemmy.ca 8 points 1 year ago (1 children)

metallurgy, machining, blacksmithing

That could be it. I don't know YouTube's algorithm specifically, but recommenders typically work by finding other users who watch the same videos you do and then recommending the other videos those users also watched. I wouldn't be surprised if the guys watching blacksmithing videos also tend to watch Joe Rogan and the like.
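
For the curious, here is a minimal sketch of that idea, item-to-item co-occurrence filtering, in Python. The watch histories and video names are invented, and YouTube's real system is vastly more elaborate; the point is only that co-viewing patterns, not your stated interests, drive the suggestions.

```python
from collections import Counter
from itertools import combinations

# Toy watch histories (entirely made up): user -> set of videos they watched.
histories = {
    "user_a": {"blacksmithing_101", "forge_build", "joe_rogan_ep"},
    "user_b": {"blacksmithing_101", "knife_grind", "joe_rogan_ep"},
    "user_c": {"blacksmithing_101", "machining_basics"},
}

# Count how often each pair of videos shows up in the same person's history.
co_counts = Counter()
for watched in histories.values():
    for a, b in combinations(sorted(watched), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(seed_video, already_watched, top_n=3):
    """Rank videos by how often they co-occur with the seed video."""
    scores = Counter()
    for (a, b), count in co_counts.items():
        if a == seed_video and b not in already_watched:
            scores[b] += count
    return [video for video, _ in scores.most_common(top_n)]

# A viewer who has only watched a blacksmithing video still gets
# "joe_rogan_ep" as the top suggestion, purely because past viewers overlapped.
print(recommend("blacksmithing_101", {"blacksmithing_101"}))
```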

[–] Bencodec@waveform.social 7 points 1 year ago

The algorithm is clever enough to know that people who watch a few of those videos are likely to watch a whole lot more. So it's good business to recommend them as often as possible. If it can convince you to dive in, the numbers say you will start watching a ton more YouTube content.

[–] PowerCrazy@lemmy.ml 7 points 1 year ago

The algorithm wants engagement first and foremost (positive vs. negative is irrelevant). After that, it pushes viewpoints that preserve the status quo, since change is scary to shareholders. So of course capitalist/fascist propaganda is preferred, especially if the host is wrong about basic facts (being wrong drives engagement).
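
As a toy illustration of that engagement-first incentive (not YouTube's actual ranker, and every number below is invented), here is what ranking purely by expected watch time looks like:

```python
# Toy illustration of engagement-first ranking: score candidates purely
# by expected watch time, with no notion of whether the viewer liked it.
# All candidates and numbers below are invented for the example.
candidates = [
    # (title, predicted click probability, predicted minutes watched if clicked)
    ("blacksmithing tutorial",  0.30, 12.0),
    ("rage-bait politics clip", 0.25, 25.0),
    ("standup comedy special",  0.20, 15.0),
]

def expected_watch_minutes(p_click: float, minutes_if_clicked: float) -> float:
    """Expected engagement = P(click) * expected watch time given a click."""
    return p_click * minutes_if_clicked

ranked = sorted(
    candidates,
    key=lambda c: expected_watch_minutes(c[1], c[2]),
    reverse=True,
)

for title, p, m in ranked:
    print(f"{expected_watch_minutes(p, m):5.2f} expected minutes -> {title}")
```

Even with a lower click probability, the rage-bait clip tops the list because it is predicted to hold attention longer.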

[–] ChaoticEntropy@feddit.uk 7 points 1 year ago* (last edited 1 year ago)

I almost never allow it

The times you do allow it are all the algorithm cares about, sadly. Any kind of engagement is great for companies.

"Hate Rogan? Cool, watch some Rogan as hateporn, hate watching is still watching."

[–] jballs@sh.itjust.works 7 points 1 year ago* (last edited 1 year ago)

You can see what Google (thinks it) knows about you.

  • Go to your Google Account (https://myaccount.google.com) and choose "Manage your Google Account".
  • Select "Privacy and personalisation".
  • On the Data & privacy page you'll find History settings, Ad settings, and more.
  • For example, go to Ad settings and click on "Ad personalization".
  • Now you'll see how your ads are personalized.

I think you can even remove stuff if you want.

[–] ezmack@lemmy.ml 7 points 1 year ago

Hi, I am a guy in my early thirties with a wife and two kids, and whenever I go on YouTube it always suggests conservative things like guys dunking on women

Yeah, it really wants me to watch Tim Pool. It's like, c'mon guys, I'm too old for that guy's tastes. TikTok will at least take the hint.

[–] maniajack@lemmy.world 6 points 1 year ago (2 children)

I'll see major swings in the algorithm from time to time (a week of bass guitar recommendations for some reason), but I can usually trace it back to a video or two I watched, after which it decided to cram the topic down my throat. I would just say keep avoiding the conservative vids to get the recommendations to stop. I wonder if maybe one of your kids is watching videos on your account and skewing it?

[–] NightOwl@lemmy.one 4 points 1 year ago

It can also be link-based, where it recommends videos that conservatives have enjoyed even though they seem unrelated. Maybe you watch home improvement videos and historical war videos, and the algorithm then feeds you new content that looks unrelated but, according to it, actually is related, since it's trying to expose you to new interests and expand the time you spend there.

The best I've found is just not using YouTube with an account and clearing cookies on exit, or using NewPipe (Android) and FreeTube (desktop), which let you have an accountless subscription feed with ad blocking and SponsorBlock support.
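
If you'd rather roll your own accountless subscription feed, YouTube still publishes a public Atom feed per channel. Here's a small standard-library-only sketch; the channel ID is a placeholder you'd swap for a real one, and this covers only the feed, not ad blocking or SponsorBlock:

```python
import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

# Placeholder channel ID -- replace with real IDs taken from channel URLs.
SUBSCRIPTIONS = [
    "UCxxxxxxxxxxxxxxxxxxxxxx",
]

def latest_videos(channel_id, limit=5):
    """Fetch a channel's public Atom feed and return (title, url) pairs."""
    feed_url = f"https://www.youtube.com/feeds/videos.xml?channel_id={channel_id}"
    with urllib.request.urlopen(feed_url) as resp:
        root = ET.fromstring(resp.read())
    videos = []
    for entry in root.findall(f"{ATOM}entry")[:limit]:
        title = entry.findtext(f"{ATOM}title")
        link = entry.find(f"{ATOM}link").attrib.get("href")
        videos.append((title, link))
    return videos

if __name__ == "__main__":
    for channel in SUBSCRIPTIONS:
        for title, url in latest_videos(channel):
            print(f"{title}\n  {url}")
```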

[–] lustyargonian@lemm.ee 6 points 1 year ago

I think such content gets the most engagement. Dunking on leftist ideas brings right-wingers celebrating and parroting the piece, while pissed-off left-wingers try to explain why the argument doesn't make sense.

You can view what Google "knows" about you in your account settings. I made my account when I was very young and lied about my age and gender, and it then made assumptions about my professional situation based on my interests. I guess many people in my claimed gender and age group who share my actual interests (tech, movies, culture, food) are also interested in the kind of content you described (Joe Rogan, Jordan Peterson, Yiannopoulos, etc.). I keep clicking "Not interested", but the algorithm keeps suggesting these videos to me. I don't mind that Google doesn't know my politics. I'm a feminist, but there's really not a lot of interesting discourse about feminism on YouTube, so I just read and attend real-life lectures instead.

[–] borlax@lemmy.borlax.com 5 points 1 year ago

Because controversy makes money and conservatism is filled with controversial opinions and purposely obtuse takes intended to spark conversation and promote divisiveness. That’s the grift.

[–] kava@lemmy.world 5 points 1 year ago

I typically only get right-wing stuff on YouTube Shorts. And I think it's because I'll watch them; I find it interesting in a detached sense.

Good to know what you're up against. Same reason I try and watch as many Trump speeches as I reasonably can.

[–] Atrabiliousaurus@lemmy.world 5 points 1 year ago (1 children)

machining

No idea about your algorithm problems, but have you seen the Cutting Edge Engineering Australia channel? It's so good.

[–] PipedLinkBot@feddit.rocks 9 points 1 year ago (1 children)

Here is an alternative Piped link(s): https://piped.video/zvKG5dgUHNw

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source, check me out at GitHub.

[–] spacedancer@lemmy.world 5 points 1 year ago

Piped

Cool, I didn't know about this. Good bot.
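
For the curious, here is roughly what a link bot like that might be doing: spotting YouTube links in a comment and replying with piped.video equivalents. The regex and output format below are guesses modelled on the bot's reply above, not its actual code.

```python
import re

# Match youtube.com/watch?v=... or youtu.be/... links and capture the video ID.
YOUTUBE_LINK = re.compile(
    r"https?://(?:www\.)?(?:youtube\.com/watch\?v=|youtu\.be/)([\w-]{11})"
)

def piped_alternatives(comment_text: str) -> list[str]:
    """Return a piped.video link for every YouTube link found in the text."""
    return [
        f"https://piped.video/{video_id}"
        for video_id in YOUTUBE_LINK.findall(comment_text)
    ]

print(piped_alternatives(
    "Check out https://www.youtube.com/watch?v=zvKG5dgUHNw for the channel."
))
# -> ['https://piped.video/zvKG5dgUHNw']
```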

[–] Hikermick@lemmy.world 4 points 1 year ago

"The calls are coming from inside the house"~~__****

[–] yoz@aussie.zone 3 points 1 year ago

This can't be any clearer:

  • Android TV: install SmartTubeNext.
  • Android phone: set Private DNS to NextDNS and block all ads, plus install Firefox with uBlock Origin. Or, if you want an app, install NewPipe from the F-Droid app store.
