Recommendation algorithms operated by social media giants TikTok and X have shown evidence of substantial far-right political bias in Germany ahead of a federal election that takes place Sunday, according to new research carried out by Global Witness.

The non-governmental organization (NGO) analyzed the social media content displayed to new users via the platforms’ algorithmically sorted “For You” feeds, finding that both skewed heavily toward amplifying content favoring the far-right AfD party.

Global Witness’ tests identified the most extreme bias on TikTok, where 78% of the political content that was algorithmically recommended to its test accounts, and came from accounts the test users did not follow, was supportive of the AfD party. (It notes this figure far exceeds the level of support the party is achieving in current polling, where it attracts backing from around 20% of German voters.)

  • Nalivai@lemmy.world · 18 hours ago

    You don’t know that, like at all. They say that it is, but there is absolutely no reason to believe them; the Musks and Zuckerbergs are lying pieces of shit.

    • KeenFlame@feddit.nu · 17 hours ago

      Let’s pretend that if you want. I know they are lying, but in perfectly “legal” ways.

      • Nalivai@lemmy.world · 7 hours ago (edited)

        Pretend what? Pretend that you (and also everyone else) don’t have any insight into how their algorithms work, besides their words? We need to pretend that?