Two weeks ago, executives from Facebook, Twitter, and Google took the stand to testify about the role of Russian bots and trolls in the 2016 Presidential election.
While the coverage was fascinating to follow, two deeper, more pervasive lessons became apparent: first, this will not be an isolated event; second, social media will forever be changed by it. As a social media marketer and user, here's what you need to know.
First, let's define the key terms:
Bot: a device or piece of software that can execute commands, reply to messages, or perform routine tasks, such as online searches, either automatically or with minimal human intervention.
Troll: to post inflammatory or inappropriate messages or comments online, especially on a message board, for the purpose of upsetting other users and provoking a response.
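The bot definition above can be made concrete with a minimal sketch. Everything here is hypothetical for illustration: the trigger keywords, the canned reply, and the function name are invented, and a real bot would sit behind a platform API rather than a plain function.

```python
# Minimal illustration of the "bot" definition: software that replies to
# messages automatically, with no human in the loop. All names here are
# hypothetical; a real bot would call a platform API instead.
from typing import Optional

def auto_reply(message: str) -> Optional[str]:
    """Return a canned reply when a trigger keyword appears, else None."""
    triggers = {"election", "vote"}  # hypothetical watch list
    words = {w.strip(".,!?").lower() for w in message.split()}
    if triggers & words:
        return "RT if you agree!"  # canned, automated response
    return None

print(auto_reply("Get out and vote today!"))  # RT if you agree!
print(auto_reply("Nice weather"))             # None
```

The point of the sketch is how little machinery is involved: a keyword match and a template reply are enough to automate engagement at scale.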
To get one perspective of how this works, check out this five-minute video.
In the video, Ben Nimmo, senior fellow at the Atlantic Council's Digital Forensic Research Lab, states:
“You can try and make an individual tweet look really popular by getting a hundred thousand bots to retweet it.”
Herein lies why this tactic is so effective and yet so problematic. Most social media algorithms use factors like timeliness and engagement to drive trending topics and raise the visibility of content in the News Feed. Bad actors have engineered a system that turns the algorithm toward their own agenda. Until this tactic proves ineffective, we should expect it to continue.
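The amplification mechanism can be sketched in a few lines. This is an illustrative model only: real platform ranking algorithms are proprietary, and the weights and decay rate below are invented. It simply combines engagement volume with a recency discount, which is enough to show why a hundred thousand bot retweets inflate a post's visibility.

```python
# Toy model of engagement-driven ranking. The scoring formula is an
# assumption for illustration, not any platform's actual algorithm.
import math
from dataclasses import dataclass

@dataclass
class Post:
    retweets: int
    likes: int
    age_hours: float

def trending_score(post: Post, decay: float = 0.1) -> float:
    """Engagement (retweets weighted over likes) discounted by age."""
    engagement = 2 * post.retweets + post.likes
    return engagement * math.exp(-decay * post.age_hours)

organic = Post(retweets=120, likes=300, age_hours=2.0)
botted = Post(retweets=100_000, likes=300, age_hours=2.0)  # 100k bot retweets

print(trending_score(organic) < trending_score(botted))  # True
```

Because the score rewards raw engagement, a botnet that manufactures retweets outranks genuinely popular content, which is exactly the vulnerability Nimmo describes.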
A recent NPR article stated,
“Internet freedom is on the decline for the seventh consecutive year as governments around the world take to distorting information on social media in order to influence elections, a new report says.”
This came one day after news reports alleged that Russia used Twitter bots to influence the Brexit vote. Freedom House found that online "manipulation and disinformation tactics played an important role" in elections in 18 countries.
In response, users and platforms are changing how they function. According to a recent Pew Research Center report, 67% of Americans get at least some news on social media, yet 51% say they see inaccurate news. Only 24% said social media does a good job separating fact from fiction, and just 5% have a lot of trust in the news they receive there. Science content fares even worse: only a quarter of social media users trust these platforms as a source of science news. This distrust is inevitable, and likely prudent, considering the rise of misinformation campaigns. But if users no longer trust what they see on social media, will it continue to benefit them? Will they continue to use it?
Last month, Facebook announced its new ad transparency efforts. Users will be able to view all ads a Page is running, and all ads must be associated with a Page. These changes are expected to reach U.S. Pages this summer. Federal election-related ads must comply with a higher level of transparency, disclosing details on ad spend, impressions, and demographics.
What this means for the social media health care marketer:
What do you think about the use of bots and trolls and how is your organization responding?