• FauxLiving@lemmy.world · 5 upvotes · 22 hours ago

    How many social media sites are you a part of that are run by a person in the Epstein files?

    Most of them, apparently.

    • partofthevoice@lemmy.zip · 2 upvotes · 18 hours ago

      Are we a part of the social media site? I’d argue, given all the effects those damned sites have on their users, their governments, and the economy… the damned sites are actually a part of us. Our behavior changes after the introduction of those things, because we are now (1) whatever we were before + (2) whatever those sites have done to our species. That makes us (3) the result.

      • FauxLiving@lemmy.world · 2 upvotes · 17 hours ago

        Even on Lemmy, we are certainly affected by the social norms and memes generated by the larger sites’ algorithms.

        Even here you see Reddit-styled arguments, self-censorship carried over from TikTok (unalived, f*ck, etc.), and the purest outrage-bait from X. A huge portion of the content we see on Lemmy was first surfaced by the primary social media sites’ algorithms.

        We do get to dodge the hyper-targeted nature of the content, though. Assuming you’re Lemmy-only (if not, delete those apps for your own sanity), you won’t be served content hypertuned to your specific psychological traits, so a lot of it will fail to make as big an impact on you, which gives you enough mental space to maintain perspective.

        • partofthevoice@lemmy.zip · 2 upvotes · 14 hours ago (edited)

          Yeah, and now that you mention it directly, it’s got me thinking… technology in its own right seems to have this capacity to destabilize power dynamics, given that it can change the fundamental ways we depend on the world. With social media, you could say discourse has in many ways become dependent on platforms built to serve the private interests of their creators. In a perverse way, maybe as a consequence of its ability to change our way of life, technology poses this constant risk, doesn’t it? And with our societal culture of glorifying technological innovation (e.g., social media at its start) without proper risk assessment, aren’t we inviting this kind of power disruption?

          I suppose, in a way, a “functional” government should be able to intervene to prevent changes in the power structure where they shouldn’t occur. Or perhaps some kind of social paradigm that has the passive capacity to cannibalize any such movements within its power structures? What do you think the cause-and-effect relationship is there, and what would be a proper response for maintaining long-term stability?

          • FauxLiving@lemmy.world · 2 upvotes · 14 hours ago (edited)

            You’re right: as a consequence of its power to change our lives, this new technology poses a constant risk to the fabric of society and to our ability to understand facts about the world.

            Discourse and culture are shaped by the structure of these social networks, and those structures are designed for the benefit of a dozen or so specific people. The amount of power this gives them over all of society is not an amount of power that should be wielded by any private individual.

            We wouldn’t let Oppenheimer keep an arsenal of nuclear weapons just because he was part of the team that invented The Bomb. We recognized, as a species, that this technology was too dangerous for anybody to have (even though we all thought we were the exceptions), so we tightly control access to it and stack all kinds of safeguards and checks on its usage as if our lives depend on it… because they do.

            We can all see the power of controlling the perception and attention of society. We can see how discourse is shaken and manipulated for views and profit instead of for understanding and knowledge. We need to treat these technologies like they are dangerous cyber weapons: they need to be studied by professionals, and the structure of these systems of discourse needs to be set for the public good.

            Just to head off the obvious attack angle: I don’t mean regulating speech. But the upvote system from Reddit is a terrible way to handle the ‘which comments should we show people’ problem. It’s also probably not a good idea to use machine learning to optimize ‘engagement’ or other metrics when we know the outcome is that it drives content that creates fear, hatred, disgust, and anger. A video recommendation algorithm that prioritizes views and comment engagement over everything else ends up amplifying the most extreme opinions, and that creates a false perception of consensus around extremism. Allowing programs to advertise themselves as ‘News’ when they’re just entertainment shows is about as harmful as letting companies claim their peanut butter is ‘allergen free’.
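
            To make the amplification point concrete, here is a toy sketch in Python (purely illustrative: the ‘extremity’ scores and the engagement model are made-up assumptions, not any real platform’s ranker). It ranks the same pool of posts once by predicted engagement and once by plain upvotes, then compares how extreme the two front pages end up.

                # Toy model, not any platform's actual ranker: the 'extremity' scores
                # and the engagement model are invented for illustration only.
                import random

                random.seed(0)

                # Hypothetical pool of posts: each has an extremity score in [0, 1]
                # and an ordinary upvote count that is unrelated to extremity.
                posts = [{"id": i,
                          "extremity": random.random(),
                          "upvotes": random.randint(0, 50)}
                         for i in range(1000)]

                def predicted_engagement(post):
                    # Assumption doing all the work: angrier content draws more clicks
                    # and comments, so an engagement-trained model learns to favor it.
                    return 0.2 + 0.8 * post["extremity"]

                top_by_engagement = sorted(posts, key=predicted_engagement, reverse=True)[:10]
                top_by_upvotes = sorted(posts, key=lambda p: p["upvotes"], reverse=True)[:10]

                def avg_extremity(front_page):
                    return sum(p["extremity"] for p in front_page) / len(front_page)

                print("engagement-optimized front page:", round(avg_extremity(top_by_engagement), 2))
                print("plain-upvote front page:        ", round(avg_extremity(top_by_upvotes), 2))
                # The engagement-optimized feed converges on the most extreme posts,
                # which is the false-consensus-towards-extremism effect described above.

            The only assumption the sketch makes is that extremity correlates with engagement; grant that, and the optimizer drifts the front page toward the fringe on its own, no deliberate malice required.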

            We’re in the wild wild west with an incredibly destructive technology, driven by a couple dozen people who appear to have little empathy and a taste for power that may lead them to fly too close to the sun.