Foreign influence campaigns, or information operations, have been widespread in the run-up to the 2024 U.S. presidential election. Influence campaigns are large-scale efforts to shift public opinion, push false narratives or change behaviors among a target population. Russia, China, Iran, Israel and other nations have run these campaigns by exploiting social bots, influencers, media companies and generative AI.

[…]

[Influence campaigns rely on] what researchers call coordinated inauthentic behavior. [Researchers] identify clusters of social media accounts that post in a synchronized fashion, amplify the same groups of users, share identical sets of links, images or hashtags, or perform suspiciously similar sequences of actions.
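One common way to operationalize this kind of detection is to compare what accounts share: pairs whose sets of posted links are nearly identical are candidates for coordination. The sketch below is a minimal illustration using Jaccard similarity over toy data; the account names, link sets and the 0.6 threshold are all hypothetical assumptions, not any platform's actual method.

```python
from itertools import combinations

# Toy data: the set of links each account has shared (hypothetical handles).
shared_links = {
    "acct_a": {"news.example/1", "news.example/2", "news.example/3"},
    "acct_b": {"news.example/1", "news.example/2", "news.example/3"},
    "acct_c": {"news.example/2", "news.example/3", "news.example/4"},
    "acct_d": {"blog.example/9"},
}

def jaccard(a, b):
    """Overlap of two link sets, from 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b)

def coordinated_pairs(accounts, threshold=0.6):
    """Flag account pairs whose shared-link sets are suspiciously similar."""
    flagged = []
    for u, v in combinations(accounts, 2):
        if jaccard(accounts[u], accounts[v]) >= threshold:
            flagged.append((u, v))
    return flagged

print(coordinated_pairs(shared_links))  # → [('acct_a', 'acct_b')]
```

Real detection systems extend this idea to timing, retweet targets and hashtag sequences, and cluster the resulting similarity graph rather than just listing pairs.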

[…]

[Researchers] have uncovered many examples of coordinated inauthentic behavior. For example, we found accounts that flood the network with tens or hundreds of thousands of posts in a single day. The same campaign can post a message with one account and then have other accounts that its organizers also control “like” and “unlike” it hundreds of times in a short time span. Once the campaign achieves its objective, all these messages can be deleted to evade detection. Using these tricks, foreign governments and their agents can manipulate the social media algorithms that determine what is trending and engaging, and therefore what users see in their feeds.
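The like/unlike trick described above leaves a telltale trace: the same account toggling engagement on the same post many times. A minimal sketch of catching that pattern from an event log follows; the event format, account names and the toggle threshold are illustrative assumptions.

```python
from collections import Counter

# Toy event log of (account, post_id, action) tuples — hypothetical data.
events = [
    ("bot1", "post42", "like"), ("bot1", "post42", "unlike"),
    ("bot1", "post42", "like"), ("bot1", "post42", "unlike"),
    ("user9", "post42", "like"),
]

def flag_like_toggling(events, max_toggles=2):
    """Flag (account, post) pairs with suspiciously many like/unlike events."""
    counts = Counter((acct, post) for acct, post, _action in events)
    return [pair for pair, n in counts.items() if n > max_toggles]

print(flag_like_toggling(events))  # → [('bot1', 'post42')]
```

Because campaigns delete messages after the fact, this kind of analysis has to run on engagement events as they happen, not on whatever content survives.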

[…]

One increasingly common technique is using generative artificial intelligence to create and manage armies of fake accounts. [Researchers] estimate that at least 10,000 accounts like these were active daily on the platform, and that was before X CEO Elon Musk dramatically cut the platform’s trust and safety teams. We also identified a network of 1,140 bots that used ChatGPT to generate humanlike content to promote fake news websites and cryptocurrency scams.

In addition to posting machine-generated content, harmful comments and stolen images, these bots engaged with each other and with humans through replies and retweets.

[…]

These insights suggest that social media platforms should engage in more – not less – content moderation to identify and hinder manipulation campaigns and thereby increase their users’ resilience to the campaigns.

The platforms can do this by making it more difficult for malicious agents to create fake accounts and to post automatically. They can also challenge accounts that post at very high rates to prove that they are human. They can add friction in combination with educational efforts, such as nudging users to reshare accurate information. And they can educate users about their vulnerability to deceptive AI-generated content.
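Challenging accounts that post at very high rates is typically implemented as a sliding-window rate check: once an account exceeds a human-plausible number of posts inside a time window, it is asked to prove it is human (for example, via a CAPTCHA). The sketch below illustrates the idea; the class name, the 30-posts-per-hour default and the challenge mechanism are assumptions for illustration, not any platform's actual policy.

```python
from collections import deque

class RateChallenger:
    """Flag accounts whose posting rate exceeds a human-plausible limit.

    Thresholds here are illustrative assumptions, not platform policy.
    """

    def __init__(self, max_posts=30, window_seconds=3600):
        self.max_posts = max_posts
        self.window = window_seconds
        self.history = {}  # account -> deque of recent post timestamps

    def record_post(self, account, timestamp):
        """Record one post; return True if the account should be challenged."""
        q = self.history.setdefault(account, deque())
        q.append(timestamp)
        # Drop timestamps that have fallen outside the sliding window.
        while q and q[0] <= timestamp - self.window:
            q.popleft()
        return len(q) > self.max_posts

# Ten posts in ten seconds from one account, with a 5-posts-per-minute limit.
challenger = RateChallenger(max_posts=5, window_seconds=60)
flags = [challenger.record_post("suspect", t) for t in range(10)]
print(flags)  # → first 5 posts pass, the next 5 trigger a challenge
```

Adding friction this way slows automation without blocking ordinary users, who rarely approach such rates.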

[…]

These types of content moderation would protect, rather than censor, free speech in the modern public squares. The right of free speech is not a right of exposure, and since people’s attention is limited, influence operations can be, in effect, a form of censorship by making authentic voices and opinions less visible.

  • niucllos@lemm.ee

    No, the Republicans also don’t have the power to fix the system. That’s not their goal. Both parties have the power to completely gum up the works of the government, which is antithetical to fixing the system, but is perfectly acceptable if your goal is to weaken protections to allow a privileged few to gain more power through extragovernmental levers. If we entered a mirror world where the Democratic party were gunning to be a fascist dictatorship and the Republicans were gunning to stop them, but all voters retained their current alliances, not much would change long-term because there are enough people in both parties to obstruct and roadblock, unless the now-pro-civil-rights Supreme Court kept being radical but in a positive direction.

      • averyminya@beehaw.org

        People like us getting into politics, which will only happen if we are allowed to hold positions of office.

        Right now, we’ve seen that hate crimes and death threats against PoC and queer politicians forced them to step down out of fear for their livelihoods after the events of 2016, and we’ve seen throughout 2008-2016 and after 2020 that Democratic candidates can actually accomplish small steps toward progress, even if the machine as a whole still tries to stop them. If Republicans/conservatives had their way, someone like Jabari Brisport would not be safe to exist in politics.