The Spectator: Can you distinguish between a bot and a human?
We've all gone a bit bot-mad in the past few weeks. Automated accounts have invaded our civic life, especially pesky Russian ones, and politicians on both sides of the Atlantic have woken up to the fact that a new propaganda war is taking place online.
Bots, which is of course short for "robot", are essentially accounts which can be programmed to automatically post, share, re-tweet, or do whatever the programmer chooses. Creating a bot is extremely easy, and huge numbers of cheap bots are available on dark net markets for next to nothing. There are millions of harmless bots out there doing all sorts of helpful and funny things, including breaking news stories. But Russia twigged early that bots can also be usefully deployed to influence public opinion. It has been using them for years. During the Senate investigation into Russian meddling in the US election, Twitter revealed the handles of some 36,746 Russian-linked bots. They had tweeted a total of 1.4 million times in the two months before the election, and the accounts had been viewed almost 300 million times.
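[To give a sense of how simple "programmed to automatically post, share, re-tweet" really is, here is a minimal illustrative sketch using the third-party Tweepy 3.x client. The credentials, messages, and hashtag are placeholders assumed for the example, not anything described in the article.]

```python
# Illustrative sketch only: a minimal automated account ("bot") built with the
# Tweepy 3.x client. Credentials, messages, and hashtag are placeholders.
import time
import tweepy

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

CANNED_MESSAGES = ["Breaking: example headline", "Another automated update"]

while True:
    for text in CANNED_MESSAGES:
        try:
            api.update_status(text)       # post a canned tweet
        except tweepy.TweepError:
            pass                          # e.g. duplicate-status errors
    for status in api.search(q="#news", count=5):
        try:
            api.retweet(status.id)        # amplify tweets on a chosen hashtag
        except tweepy.TweepError:
            pass                          # already retweeted, rate-limited, etc.
    time.sleep(3600)                      # repeat every hour
```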
This new world of pseudonyms, virals, and digital public opinion is becoming murky. It's not always easy to tell humans and bots apart, because some bots behave like humans, and some humans behave like bots. One academic report earlier this year tried to measure Labour bots during the election. It estimated that any account which tweets over 50 times a day on a single hashtag is a bot. My colleagues at Demos and I took a closer look at this, and it turned out that many of these "bots" were in fact fanatical Labour supporters who were tweeting so frenetically that they looked like machines. Equally, improvements in machine learning mean bots are looking more and more human. Soon, it will be very difficult to tell them apart.
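[To make that 50-tweets-a-day rule concrete, the heuristic amounts to something like the sketch below. The field names and input format are assumptions for illustration, not taken from the report.]

```python
# Illustrative sketch of the report's heuristic: flag any account that tweets
# more than 50 times in a single day on a single hashtag. The field names and
# input format are assumptions for the example.
from collections import Counter

def flag_suspected_bots(tweets, threshold=50):
    """tweets: iterable of dicts like
    {"account": "user1", "date": "2017-06-01", "hashtag": "#GE2017"}.
    Returns the set of accounts exceeding the per-day, per-hashtag threshold."""
    counts = Counter((t["account"], t["date"], t["hashtag"]) for t in tweets)
    return {account for (account, _, _), n in counts.items() if n > threshold}

# An account tweeting 60 times on one hashtag in one day is flagged, even
# though (as Demos found) it may just be a very enthusiastic human.
sample = [{"account": "keen_supporter", "date": "2017-06-01", "hashtag": "#GE2017"}] * 60
print(flag_suspected_bots(sample))  # {'keen_supporter'}
```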
Far more worrying than bots, though, are the paid content producers. A decent amount of the Russian interference appears to emanate from the Russian troll factory based in St Petersburg, where hundreds of people work 12-hour shifts spreading information that supports the Kremlin's line. (Salaries are around £575 to £830 per month.) My guess is that a lot of the accounts they run are "cyborg" accounts, which are half bot, half human. A human operator runs thousands of accounts, adding the odd bit of human content to the bots in order to evade standard spam filters.
https://blogs.spectator.co.uk/2017/11/can-you-distinguish-between-a-bot-and-a-human/
2 replies
The Spectator: Can you distinguish between a bot and a human? (Original Post by Denzil_DC, Nov 2017)
Eliot Rosewater (32,537 posts)
1. Not sure how, but what I am sure of is that on this site and others I frequent
there are many of them.
Buckeye_Democrat (15,059 posts)
2. According to a study I read months ago, the bots that fooled people the most...
... were the "angry" and "insulting" ones. The bots that behaved like trolls.