A new tool in the fight against online disinformation, called BotSlayer, has been launched by Indiana University’s Observatory on Social Media.
The software, which is free and open to the public, scans social media in real time to detect evidence of automated Twitter accounts – or bots – pushing messages in a coordinated manner, an increasingly common practice to manipulate public opinion by creating the false impression that many people are talking about a particular subject. The method is also known as “astroturfing” because it mimics the appearance of legitimate grassroots political activity.
By leveraging the observatory’s expertise and technological infrastructure, BotSlayer gives groups and individuals of any political affiliation the power to detect coordinated disinformation campaigns in real time — without any prior knowledge of these campaigns. The software’s development was supported by a gift from Craig Newmark Philanthropies.
“We developed BotSlayer to make it easier for journalists and political campaigns to monitor potential new disinformation campaigns that attempt to manipulate public opinion using bots,” said Filippo Menczer, a professor in the IU School of Informatics, Computing and Engineering and director of the Observatory on Social Media.
“If there is a suspicious spike in traffic around some specific topic, BotSlayer allows you to spot it very quickly so you can investigate the content and its promoters and, if there appears to be abuse of the platform, report it or communicate to your followers about it.”
The use of deceptive bots to sway public opinion is a growing issue in politics in the U.S. and internationally, added Menczer, who is also part of a group of researchers that found widespread use of bots in the run-up to the 2016 U.S. presidential election.
Other bot campaigns have sought to influence votes related to the U.K. Brexit movement and elections in France, Germany and Italy.
During the run-up to the 2018 midterm elections, for example, the Democratic Congressional Campaign Committee used publicly available tools created by the observatory to identify more than 10,000 bots spreading voter suppression messages and report them to Twitter, which shut down the accounts.
The tools used to inform the report were Botometer, which uses an algorithm to assign a score to Twitter accounts based upon the likelihood they’re automated, and Hoaxy, which lets individuals search and visualize the spread of specific topics on Twitter in real time.
Botometer is one of the observatory’s most popular tools, currently receiving over 100,000 queries per day.
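For readers who want to try Botometer programmatically, the observatory also publishes a Python client (the botometer package). The sketch below is only an outline of how a single account check works: the credentials are placeholders, the handle is hypothetical, and the exact authentication details and response fields depend on the library and API version.

```python
import botometer  # pip install botometer

# Placeholder credentials: a RapidAPI key for the Botometer Pro API and
# Twitter app credentials (details vary with the library version).
rapidapi_key = "YOUR_RAPIDAPI_KEY"
twitter_app_auth = {
    "consumer_key": "YOUR_CONSUMER_KEY",
    "consumer_secret": "YOUR_CONSUMER_SECRET",
    "access_token": "YOUR_ACCESS_TOKEN",
    "access_token_secret": "YOUR_ACCESS_TOKEN_SECRET",
}

bom = botometer.Botometer(wait_on_ratelimit=True,
                          rapidapi_key=rapidapi_key,
                          **twitter_app_auth)

# Score a single (hypothetical) account; the result is a dictionary of
# bot-likelihood scores whose exact fields depend on the API version.
result = bom.check_account("@some_account")
print(result)
```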
How does it work?
BotSlayer, which combines technology from Hoaxy and Botometer, was created in part based on feedback from political and news organizations asking for the observatory’s tools to be made faster, more powerful and more user-friendly. These organizations include The Associated Press, The New York Times and CNN.
The system uses an “anomaly detection algorithm” to quickly report trending activity whose sudden surge is likely driven by bots, Menczer said.
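The article does not describe the algorithm itself, so the following Python sketch is only an illustration of the general idea: compare an entity’s latest mention count with its recent baseline (here via a simple z-score) and combine that with the average bot score of the accounts pushing it. The z-score test, the thresholds and the example data are assumptions, not BotSlayer’s actual method.

```python
from statistics import mean, stdev

def spike_score(counts):
    """Z-score of the latest interval's mention count against the
    preceding intervals (the baseline)."""
    baseline, current = counts[:-1], counts[-1]
    sigma = stdev(baseline) or 1.0  # avoid division by zero on a flat baseline
    return (current - mean(baseline)) / sigma

def flag_entity(counts, bot_scores, z_threshold=3.0, bot_threshold=0.5):
    """Flag an entity when its volume spikes and the accounts pushing it
    look automated (high average bot score). Thresholds are assumptions."""
    z = spike_score(counts)
    avg_bot = mean(bot_scores) if bot_scores else 0.0
    return {"z": round(z, 1), "avg_bot_score": round(avg_bot, 2),
            "suspicious": z >= z_threshold and avg_bot >= bot_threshold}

# Hourly mention counts for a hashtag (oldest to newest) and bot scores
# (0-1) for the accounts tweeting it in the most recent hour; made-up data.
print(flag_entity([12, 9, 15, 11, 180], [0.8, 0.7, 0.9, 0.2]))
```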
For example, BotSlayer could be used during a presidential debate to not only instantly detect when a candidate’s username or related hashtags are trending, but also automatically assign a “bot score” to indicate whether the surge appears related to bot activity.
In business, BotSlayer could help organizations protect against reputational threats that rely on automated accounts to amplify messages. In journalism, the tool could be used to monitor for manipulation of reporting on trending topics, or to warn the public about disinformation attacks.
In addition to detecting trends, BotSlayer can instantly generate a “network map” that illustrates how a particular topic is spreading over time. A bot score is also assigned to each user in the network, providing an easy way to see the most influential accounts — real or fake — in the conversation.
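As a rough illustration of the network-map idea (not the observatory’s actual implementation), the snippet below builds a toy retweet graph with the networkx library, attaches a bot score to each account and lists the most retweeted users; the account names and scores are made up.

```python
import networkx as nx  # pip install networkx

# Toy retweet edges (retweeter -> original poster) and per-user bot scores.
edges = [("bot_42", "candidateX"), ("bot_17", "candidateX"),
         ("alice", "candidateX"), ("bot_42", "bot_17")]
bot_scores = {"bot_42": 0.93, "bot_17": 0.88, "alice": 0.08, "candidateX": 0.12}

G = nx.DiGraph()
G.add_edges_from(edges)
nx.set_node_attributes(G, bot_scores, "bot_score")

# The most retweeted (highest in-degree) accounts, shown with their bot
# scores, give a quick view of who is driving the conversation and how
# automated those accounts appear to be.
for user, indeg in sorted(G.in_degree(), key=lambda x: -x[1]):
    print(f"{user:12s} retweeted_by={indeg}  bot_score={G.nodes[user]['bot_score']}")
```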
Each trending “entity” — a hashtag; a user handle; an image, video, gif or meme; or a keyword or phrase — is also assigned a percentage to indicate how quickly it’s surging. A percentage of 5,000 indicates a 50-fold increase in mentions in the past four hours, for example.
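The article does not spell out the formula behind that percentage, but the stated example is consistent with expressing the latest four-hour volume as a percentage of a baseline volume, as in this assumed sketch:

```python
def surge_percentage(current_mentions, baseline_mentions):
    """Latest four-hour mention volume expressed as a percentage of the
    baseline volume; 5,000% corresponds to a 50-fold increase."""
    return 100 * current_mentions / baseline_mentions

print(surge_percentage(2500, 50))  # -> 5000.0, i.e. a 50-fold jump
```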