“We just joined Truth Social, mostly because we thought it would be very funny,” it explained. “Follow us there for truths and retruths or whatever they call them.”

  • @Cossty@lemmy.world
    39 points · 1 year ago

    A couple of hours ago I saw a video on YT titled “Biden joins Truth Social. THEY ARE DESPERATE.” I didn’t watch the video, so I don’t know what was actually said. Some years ago I watched conservative content. Even though I don’t watch it at all anymore, I still sometimes get shit like this on my homepage.

    • @Sacha@lemmy.world
      12 points · 1 year ago

      Yup, I watched part of a Tim Cast video (never heard of him before the YT recommendation, nor had I consumed any similar content previously). A lot of it was unhinged, and I clicked out quickly as the hatred and rage-baiting ramped up.

      It took like a year for YT to stop recommending me similar alt-right bullshit because I watched maybe five or ten minutes of a Tim Cast video.

      • @PoliticalAgitator@lemm.ee
        14 points · 1 year ago

        You don’t have to click on a far-right video. You can also get them when you click on a progressive video, or a medical video, or a gaming video.

        When Amazon first introduced their “People who liked X also liked Y” feature, it was hilariously easy to manipulate. You could just spam links back and forth and, within a few minutes, get “People who liked The King James Bible also liked A Hand in the Bush: The Lost Art of Vaginal Fisting”.
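
        For the curious, here’s a toy sketch of why that kind of recommender is so easy to game. Everything below is made up for illustration (the session data, item names, and the `build_cooccurrence`/`also_liked` helpers); it’s a naive co-occurrence counter, not Amazon’s actual system, but it shows the basic failure mode:

        ```python
        # Toy "people who liked X also liked Y" recommender built on raw
        # co-occurrence counts. Illustration only -- not Amazon's algorithm.
        from collections import defaultdict
        from itertools import combinations

        def build_cooccurrence(sessions):
            """Count how often each pair of items appears in the same session."""
            counts = defaultdict(lambda: defaultdict(int))
            for items in sessions:
                for a, b in combinations(set(items), 2):
                    counts[a][b] += 1
                    counts[b][a] += 1
            return counts

        def also_liked(counts, item, k=3):
            """Return the top-k items most often seen alongside `item`."""
            ranked = sorted(counts[item].items(), key=lambda kv: kv[1], reverse=True)
            return [other for other, _ in ranked[:k]]

        # Organic traffic: the Bible mostly co-occurs with other religious titles.
        sessions = [
            ["king_james_bible", "study_guide"],
            ["king_james_bible", "hymnal"],
            ["king_james_bible", "study_guide"],
        ]

        # A handful of coordinated spam sessions pairing two unrelated items is
        # enough to dominate the raw counts, because nothing here normalizes for
        # popularity or filters coordinated behavior.
        sessions += [["king_james_bible", "unrelated_prank_title"]] * 10

        counts = build_cooccurrence(sessions)
        print(also_liked(counts, "king_james_bible"))
        # -> ['unrelated_prank_title', 'study_guide', 'hymnal']
        ```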

        It didn’t take too long for them to put a stop to it, but decades later, are we actually sure content-suggestion algorithms are any better at withstanding manipulation?

        The far-right aren’t exactly good people, so I doubt they’re saying “Sure, I openly celebrate mass shooters that target minorities, but I draw the line at using sleazy techniques to manipulate content suggestions”.

        • @SkyezOpen@lemmy.world
          5 points · 1 year ago

          My YT algorithm has turned into endless lefty stuff with the occasional funny video, and my ads are nothing but Jordan Peterson and TPUSA.

          • @PoliticalAgitator@lemm.ee
            6 points · 1 year ago

            Usually the ones I see are clearly fishing for vulnerable people, a practice they put a huge amount of time, money, and effort into.

            They’re active in conspiracy groups because it’s a good place to find unmedicated schizophrenics.

            They’re active in gaming groups because it’s a good place to find lonely, disaffected young men.

            They’re active in anti-vax groups because it’s a good place to find uneducated, paranoid people.

            They’re active in fundamentalist circles because it’s a good place to find bigots.

            I’ve found that going anywhere near any of those topics will cause the algorithm to start baiting its hook.

        • @Bytemeister@lemmy.world
          3 points · edited · 1 year ago

          I made the mistake of watching a nice little old guy showing how to collect and refine clay. In his video, I got an ad from the Epoch Times about their new anti-trans smear piece, and two videos from some dude saying that a Democrat-planned global food shortage was going to subdue the resistant population, so here’s why you need solar panels and colloidal silver. The website in the ad went to a single page that tried to sell me “free” Trump gold bars that would skyrocket in value. No other links on the page worked. Needless to say, I reported all the ads, but YouTube is still running this harmful, scam-bait content.

      • I have a similar problem from watching firearm content on YouTube. YouTube just assumes I love right-wing media because I’m interested in guns.