Fake news ‘as a Service’ on the rise according to Digital Shadows

[Image: a screenshot of the Dark Web site, TheInsider]

New research indicates that criminals are increasingly exploiting the growth of ‘fake news’ for commercial gain. So-called fake news has traditionally been associated with the political sphere and deployed for ideological purposes. However, research conducted by analysts at Digital Shadows reveals a growing market of toolkits and services – available for as little as $7 – aimed specifically at causing financial and reputational damage to companies.

In particular, Digital Shadows has noticed a recent trend on the Dark Web towards so-called ‘pump and dump’ services. These work by gradually buying up large holdings of an altcoin (a cryptocurrency other than bitcoin) and drumming up interest in the coin through posts on social media. The tool then trades the coins between the multiple accounts it controls, driving the price up, before selling them on currency exchanges to unsuspecting traders looking to buy while the price is still rising. An analysis of the bitcoin wallet of one such popular service showed that it received the equivalent of $326,000 from would-be criminals in less than two months.

Similarly, Digital Shadows identified more than ten services that allow users to download software for controlling the activities of social media bots. One offers a downloadable trial for just $7. Other tools claim to promote content across hundreds of thousands of platforms, including forums, blogs and bulletin boards. These work by controlling large numbers of bots – armies of computers under the operator’s control, which can be configured to post on specific types of forums and on different topics. Mentions of these sites across criminal forums give an indication of their popularity: they have more than tripled in two years, from 418 in 2015 to 1,381 so far in 2017.

The battle against fake news could become even more difficult: advertisements for these toolkits increasingly claim to include built-in features that bypass CAPTCHA checks, which were originally introduced to stop bots and automated scripts from posting indiscriminately across these platforms.

Unsurprisingly, media organisations are a particular target of purveyors of fake news. Digital Shadows analysed the top 40 global news websites and checked over 85,000 possible variations on their domains, discovering some 2,858 live spoof domains. Simply by altering characters in a domain (for example, replacing an ‘m’ with ‘rn’) and using site-cloning services, it is possible to create a realistic-looking fake version of a legitimate news organisation’s website.
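To illustrate the kind of character-substitution spoofing described above, here is a minimal sketch – an illustrative assumption, not Digital Shadows’ actual tooling – of how a defender might enumerate simple look-alike variants of a domain and check whether any of them currently resolve in DNS. The substitution table and the example.com domain are placeholders only.

```python
import socket

# Common visually confusable substitutions (illustrative, far from exhaustive).
SUBSTITUTIONS = {
    "m": ["rn", "nn"],
    "l": ["1", "i"],
    "o": ["0"],
    "w": ["vv"],
}


def spoof_variants(domain: str) -> set:
    """Generate single-substitution look-alike variants of a domain name."""
    name, _, tld = domain.partition(".")
    variants = set()
    for i, ch in enumerate(name):
        for repl in SUBSTITUTIONS.get(ch, []):
            variants.add(f"{name[:i]}{repl}{name[i + 1:]}.{tld}")
    return variants


def is_registered(domain: str) -> bool:
    """Rough check: does the domain currently resolve to an IP address?"""
    try:
        socket.gethostbyname(domain)
        return True
    except socket.gaierror:
        return False


if __name__ == "__main__":
    for candidate in sorted(spoof_variants("example.com")):
        status = "LIVE" if is_registered(candidate) else "unregistered"
        print(f"{candidate}: {status}")
```

In practice, candidates like these would feed into a continuous monitoring and takedown workflow rather than a one-off script.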

Retailers too are a target. One managed service offers ‘Amazon ranking, reviews, votes, listing optimisation and selling promotions’, with pricing that ranges from $5 for an unverified review and $10 for a verified review up to a $500 monthly retainer.

“The sheer availability of tools means that barriers to entry are lower than ever. It means this now extends beyond geopolitical to financial interests that affect businesses and consumers,” said Rick Holland, VP Strategy, Digital Shadows. “Of course, rumours, misinformation and fake news have always been part of human society. But what has changed in the digital world is the speed at which such techniques spread around the world, and the fact that tools are freely available on the Dark and Surface Web, enabling those wanting to carry out these sorts of campaigns to do so with ease by locating and using the services and tools they need online.”

Digital Shadows has issued the following advice for firms looking to combat disinformation:

• Combat domain spoofing – organisations should proactively monitor for the registration of malicious domains and have a defined process of dealing with infringements when they occur. An agile and scalable takedown capability is critical for combating domain spoofing.

• Combat the ‘bots’ – monitor social media for brand mentions and seek to detect the ‘bots’. Though it’s not always immediately obvious, there are often clues, such as the age of the account, the content being posted, and the number of friends and followers (a simple illustrative scoring sketch follows this list).

• Monitor forums for information that could manipulate the share price – organisations should search for mentions of their brand or staff across forums; such mentions could be instances of malicious actors spreading disinformation.

• Keep an eye on trending activity – monitor trending activity as it relates to the organisation’s digital footprint in order to identify potential disinformation activity.
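To make the bot-spotting clues above concrete, here is a minimal heuristic sketch. It is an illustrative assumption rather than Digital Shadows’ methodology: the Account fields, thresholds and weights are hypothetical, chosen only to show how account age, posting rate and follower balance can be combined into a rough score.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class Account:
    created_at: datetime  # when the account was registered
    posts: int            # total posts made
    friends: int          # accounts it follows
    followers: int        # accounts following it


def bot_score(account: Account, now: Optional[datetime] = None) -> float:
    """Return a 0..1 score; higher means more bot-like (heuristic only)."""
    now = now or datetime.now(timezone.utc)
    age_days = max((now - account.created_at).days, 1)

    score = 0.0
    if age_days < 30:                             # very new account
        score += 0.4
    if account.posts / age_days > 50:             # implausibly high posting rate
        score += 0.3
    if account.followers < account.friends / 10:  # follows many, followed by few
        score += 0.3
    return min(score, 1.0)


if __name__ == "__main__":
    suspect = Account(
        created_at=datetime(2017, 9, 15, tzinfo=timezone.utc),
        posts=4000,
        friends=5000,
        followers=120,
    )
    check_date = datetime(2017, 10, 1, tzinfo=timezone.utc)
    print(f"bot score: {bot_score(suspect, now=check_date):.2f}")
```

In a real deployment, signals like these would feed a broader brand-monitoring workflow rather than stand alone as a single score.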
