• @herpaderp

    I’ve literally watched friends of mine descend into far-right thinking, and I can point to the moment the algorithms started suggesting content that sent them down a “rabbit hole.”

    Like, you’re not wrong that they were right-wing initially, but they became the “lmao I’m an unironic fascist and you should be pilled like me” variety over a period of about six months. They started stockpiling guns, etc.

    This phenomenon is so commonly reported that it makes you wonder where all these people who supposedly “radicalized themselves” suddenly came from, seemingly in droves and all at once.

    Additionally, these companies are responsible for their content-serving algorithms. If those algorithms didn’t shape users’ thinking, why would nation-state propaganda efforts work so hard to get their narratives and interests surfaced by them? Did we forget the origins and ensuing fallout of the Arab Spring?