It is a playbook that sets out to subvert democracy and sow discord through social media. And it plays out on sites like Reddit, through accounts like Bootinbull and JerryRansom mentioned above, both identified by Stringhini and colleagues, which feed in a controversial message while using a stream of ordinary, mundane posts as cover. Like Bootinbull, JerryRansom used the same cute animal images and 4chan-baiting memes, then gradually slipped into political rhetoric, alongside posts to r/sexygirls. Notably, several accounts that Stringhini says behave similarly to those emphatically linked to Russia had previously posted on r/aww, a subreddit that encourages users to share images likely to elicit an “aww” reaction — often of lovable animals.
The way troll accounts behave can be discerned through what Stringhini calls “loose formatting patterns”. With less sophisticated bot accounts, their nature can often be determined from the timing and type of content posted, because they tend to send the same message from a number of different Twitter accounts — created specifically for the purpose, or hijacked from innocent users through cyber-attacks that steal their login details. But troll accounts require deeper analysis.
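The bot-detection signal described above — the same message appearing across several accounts in a short span of time — can be sketched as a simple heuristic. This is a hypothetical illustration, not the researchers' actual method; the function name, data format, and thresholds are all invented for the example.

```python
from collections import defaultdict


def flag_coordinated_accounts(posts, min_accounts=3, window_seconds=3600):
    """Flag accounts that post an identical message alongside several
    other accounts within a short time window.

    ``posts`` is a list of (account, timestamp, text) tuples.
    Thresholds are illustrative, not empirically derived.
    """
    by_text = defaultdict(list)  # message text -> [(timestamp, account)]
    for account, timestamp, text in posts:
        by_text[text].append((timestamp, account))

    flagged = set()
    for entries in by_text.values():
        entries.sort()
        for i in range(len(entries)):
            # distinct accounts posting this exact text inside the window
            in_window = {acc for ts, acc in entries
                         if 0 <= ts - entries[i][0] <= window_seconds}
            if len(in_window) >= min_accounts:
                flagged |= in_window
    return flagged
```

Three accounts pushing the same slogan within an hour would all be flagged, while a lone account posting something unique would not — roughly the "timing and type of content" signal the paragraph describes.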
The troll method – which puts real people behind the accounts, rather than pre-programmed bots – is becoming more popular as the older, blunter robotic tools lose their power. Such campaigns are often linked to Russia, says Eliot Higgins, founder of Bellingcat, which documents and exposes them using open-source intelligence. “Trolls tend to be more influential, because they take advantage of the natural evolution of these online communities, rather than trying to build something from scratch, which is a lot harder to do.”
Instead, Russian trolls build fake personas online, trying to embed themselves in pre-existing Reddit communities before steering the conversation towards their real targets. Like Bootinbull and JerryRansom, they start with harmless posts about dogs and other animals before moving on to geopolitics. The goal is to make the person behind the accounts appear more realistic, and more human – making it easier to seed the most controversial content. Their focus, Stringhini and colleagues say, is on contentious social issues: troll accounts have exploited the split over Black Lives Matter, and the US presidential election – arguably a factor in Donald Trump’s victory over Hillary Clinton in 2016. Troll-farm posts on Facebook were viewed by 140 million Americans ahead of the 2020 election, according to an internal document compiled by the social media platform. However, they also post on topics that help them integrate into the Reddit community. “They are totally pro-cryptocurrency, and they defend it on social media,” Stringhini says, while at the same time the account itself might push some political rhetoric as well. “They are trying to integrate.”
It’s all part of the state-sponsored playbook that Russian trolls work from – and it’s increasingly common across a number of different countries. Stringhini points to Russia, China, Venezuela and Iran as countries trying to shape conversations through organized trolling campaigns on social media. However, despite their attempts at a veneer of normalcy, the academics have found tells that can betray the accounts’ lack of authenticity. Troll accounts tend to post less than 10 percent as many comments as a “real” Reddit account, based on a random sample, indicating that it is hard to keep up the pretence for long – or that they give up when they think their work is done. Conversely, they are more willing to broadcast than to engage in conversation: they make an average of 42 submissions over their lifetime, compared with 32 for a non-troll account. Most tellingly – and much as deep-cover spies are often discovered because they end up meeting known spies – state-sponsored trolls are often caught on social media because they can’t help but post on each other’s threads.
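The tells reported above reduce to two measurable signals per account: a low comment-to-submission ratio (broadcasting rather than conversing) and frequent activity in threads started by other suspected accounts. The sketch below is a hypothetical restatement of those signals, not the researchers' pipeline; the data layout and function name are assumptions for illustration.

```python
def troll_signals(account_activity, peer_threads):
    """Compute two heuristic signals from the reported findings.

    ``account_activity`` is an assumed dict with the account's comment
    count, submission count, and the thread IDs it commented in.
    ``peer_threads`` is the set of thread IDs started by other
    suspected troll accounts.
    """
    comments = account_activity["comments"]
    submissions = account_activity["submissions"]

    # Trolls comment far less relative to their submissions than
    # ordinary users, so a low ratio is suspicious.
    ratio = comments / submissions if submissions else 0.0

    # Trolls tend to show up in each other's threads, so count
    # overlaps with threads opened by other suspected accounts.
    co_posting = sum(1 for t in account_activity["threads_commented"]
                     if t in peer_threads)

    return {"comment_ratio": ratio, "co_posting": co_posting}
```

An account with 42 submissions but only a handful of comments, several of them in threads opened by other flagged accounts, would score as suspicious on both signals — the broadcast-heavy, mutually reinforcing pattern the paragraph describes.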