Disinformation: The next big threat

Lim Hui Jie • 12 min read
In an increasingly connected world, a subtle but more damaging cyberattack could be coming

In such a scenario, industrial control systems, power grids and critical infrastructure would fall prey to cyberattacks. As in a horror movie plot, cities could shut down, TVs could go dark and trains could grind to a halt, paralysing everyday life.

Amid the Covid-19 pandemic, commentators called 2020 the “year of digitalisation”. Forced to stay home, people conducted business and work online, and reliance on digital communication technologies grew more than ever. As a result, digital activity increased, with more external connections made to companies’ networks, leading to greater exposure to cyberattacks.

According to findings from a 2020 Harvey Nash/KPMG CIO Survey titled Everything Changed, Or Did It, companies have been spending the equivalent of around US$15 billion ($19.8 billion) more a week on technology to enable safe and secure home working during Covid-19.

This represents one of the biggest surges in technology investment in history, with companies spending in three months what they would usually spend in a year before the crisis hit. Most of this spending went to security and privacy. Despite the surge, four in 10 information technology (IT) leaders report that their companies suffered more cyberattacks during the pandemic.

While viruses, malware and hacking attempts can be stopped via firewalls, antivirus software, encryption as well as physical measures like air gapping and two-factor authentication, there are still additional vulnerabilities that this long list of measures cannot plug.

There is a more subtle but damaging attack that could be more widespread in the future. Such attacks will not be directed at cyber systems, but at the human mind. The weapons in this battle are not lines of code, but rather, the content that people see on their screens and the thoughts they translate to action. This is exacerbated by the popularity of the internet’s crowning achievement — social media.


Disinformation in businesses

Increasingly, businesses need to fight so-called information wars, such as disinformation campaigns or influence operations. A rival need not penetrate layers of enterprise-grade cybersecurity if it can launch a social engineering attack instead, whether aimed at a company’s staff or at the company’s credibility itself. Fake news — false or misleading information presented as news — can drive customers and partners away by smearing a company’s reputation over social media. Often, it takes no more than starting a rumour in the industry.

But before companies dismiss these campaigns as simply limited to the world of political punch-ups or Twitter tirades, there is a real and tangible cost to disinformation. A 2019 study conducted by the University of Baltimore revealed that fake news now costs the global economy a staggering US$78 billion annually.

The report, which analyses the direct economic cost of fake news, also estimates that it has contributed to a loss in stock market value of about US$39 billion a year.

“Disinformation can directly impact businesses, either through reputation, their valuation, or share price of the organisation,” says Brice Chambraud, APAC managing director of AI technology provider Blackbird.ai.

Disinformation can affect the company’s reputation by “affecting their existing customer base and customer loyalty, down to very precious market share,” he adds.

For example, various conspiracy theories have proliferated as countries roll out Covid-19 vaccines worldwide, ranging from claims that the shots can alter people’s DNA to claims that the vaccination drive is a microchipping attempt by Bill Gates.

“While it is anticipated that governments will roll out the vaccines when ready — undoubtedly a key piece to normalcy and economic recovery — the influence of these conspiracy narratives creates volatility,” says Chambraud.

“Narratives not anchored in truth can create the opportunity for resistance from the masses. This not only results in safety concerns, but the reputational impact can erode trust in governments and vaccine manufacturers,” he adds. If the resistance is widespread, this can hurt pharmaceutical companies’ bottom lines through uncertainty around order volumes and vaccine uptake across affected markets.

Disinformation also saddles companies with additional costs. The University of Baltimore study found that firms subjected to targeted misinformation spend about US$9.5 billion a year on reputation management.

“Businesses really need to take it very seriously that disinformation doesn’t only affect the psychological operations space, but there is a commercial impact to it as well. So recovery is very hard once the organisation takes a reputational hit,” explains Chambraud.

The appeal of a disinformation campaign also lies in how hard the avenue of attack is to secure. With a conventional cyberattack, a company can review and patch vulnerabilities in its cyber infrastructure. But short of governing people’s thoughts, there is no way to “patch” the human mind.

Manipulating cyberspace

Disinformation campaigns are not new. But the difference now, according to an Overseas Development Institute briefing in September last year, is the pace at which digital information ecosystems have evolved and the way in which they are being manipulated.

To put it another way, information — true or false — can now travel at much greater speed and scale, including across borders, through social media, which is designed to encourage people to share information and content quickly, with very little fact-checking.

Jeffrey Kok, vice president, solution engineers for Asia Pacific and Japan at identity security company CyberArk, observes that basic human psychology is at play when people receive new information. “The first thing that comes into your mind becomes the predominant source of truth, or source of relevance,” he tells The Edge Singapore.

For example, in the 2016 book iWar, Catherine A. Theohary, an information specialist at the US Congressional Research Service, said “the ability to rapidly disseminate graphic images and ideas to shape the narrative transforms social media into a strategic weapon”.

Chambraud says the sheer volume of information — and by extension, disinformation — that people encounter today means companies are not set up to beat disinformation effectively. “We see organisations, governments, big agencies or PR agencies hiring hundreds of analysts, manually reading upwards of 200,000 words a day trying to understand what’s happening in the digital conversational landscape,” he adds.

Overwhelming the narrative

Writing in the US Army War College Quarterly in 2018, Timothy McGeehan noted that one strategy of disinformation campaigns or influence operations is to overwhelm the information space and “pollute this space with falsehoods to the point where all truth becomes relative”. In iWar, technology expert Shelly Palmer also said: “The narrative that wins is not one that can draw the best fit to the truth, but the one that is inside the blanket that comforts the listener”.

“Typically consumers pick their own truths to suit their beliefs. They are very influenced by the communities that they are part of, so it gets increasingly challenging to position yourself well against a very strong disinformation campaign,” offers Chambraud.

He believes the platforms that are currently spreading digital content are typically optimised for engagement. “So disinformation is very much naturally a driver of engagement, due to the fact that it is polarising news, and these campaigns can easily find the target audiences because of the way that platforms optimise [their content].” This means that the ability to provoke a huge reaction from these echo chambers can be hugely damaging to a company’s reputation. “If you want to make up a lie about a company that is burning fossil fuels, you just need to target an echo chamber of environmentalists and you know you will be able to get a huge engagement,” he adds.

Coupled with social media algorithms that feed people more of what they want to see and what is thought to engage them, this leads to the creation and hardening of echo chambers — spaces where a person encounters only beliefs or opinions that coincide with their own, so that existing views are reinforced while alternative ideas are never considered.

Flip the script

But there are ways to fight back. “You actually have to put out narratives that, in a sense, pre-empt these attempts and misinformation,” says associate professor Alan Chong, acting head of the Centre for Multilateralism Studies at the S. Rajaratnam School of International Studies (RSIS). He acknowledges that it is not easy, describing the process as “building the equivalent of a firewall in the minds of your target consumers, your partners, your friends. It is easier said than done.” As such, Chong thinks the average company needs to keep its partners in the loop, whether on shifts in business strategy or new products, so as to pre-empt wild rumours.

Disinformation, being better optimised for social media algorithms, has also been shown to reach further than any correction. A 2018 Massachusetts Institute of Technology study found that false news stories are 70% more likely to be retweeted than true ones. “It also takes true stories about six times as long to reach 1,500 people as it does for false stories to reach the same number of people,” the study said.

In 2013, Forbes reported that US$130 billion in stock value was wiped out in a matter of minutes following an Associated Press tweet about an “explosion” that injured then US President Barack Obama. Although the news agency later said its Twitter account had been hacked and stock prices recovered shortly after, the episode illustrates just how fast disinformation on social media can affect businesses, especially listed companies whose share prices are at stake.

“If you can lower the price of a stock by 1% by purposefully manipulating the news flow by producing content, and if you have the right trading mechanism in place, you can capitalise on that,” says Anton Gordon, co-founder of Indexer.me, a software developer that builds algorithms to decipher the reliability of text and visual content.

Vigilance is key

However, Chong believes there is no real “firewall” against disinformation. “One has to have a PR department, or PR section, operating almost like a 24/7 kind of information patrol unit.”

Another way to ward off such attacks is for companies to build up their “social factor” by strengthening “the corporate branding, corporate image, and how they are building the trust and integrity with their community and their customers,” says CyberArk’s Kok. However, trust has to be built over a long period, so that there is sufficient goodwill to mount an effective defence, he adds.

Chong calls trust between the company, its consumers and partners a “key ingredient” that can guard against disinformation. However, he warns: “Trust is part and parcel of building this equivalent of a psychological firewall. But trust is one of those things like the weather. It comes and it goes”.

In other words, companies have to not only gain trust, but also constantly devote what Chong calls “existential efforts” to keeping it. For Chambraud, the threat of disinformation is growing quickly and touches all sectors, including politics, healthcare, business and financial markets.

“There is so much disruption happening. All it takes is one really strong campaign or one really strong message to completely shift consumer focus. Organisations need to start preparing themselves and start to look at disinformation more seriously,” he warns.

“Even though they are not attacked, there is a huge strategic advantage to understand the drivers behind certain campaigns within the sector to help you navigate and to steer clear of this minefield”.

Disinformation in the geopolitical sphere

In the world of geopolitics, the goal of an influence operation is not to directly degrade the target’s systems with a virus. Instead, it is to sway public opinion within the target state so that the state becomes more vulnerable to attack. In the case of liberal democracies, this means using the weight of public opinion to push the target into taking, or refraining from, actions that favour the attacker.

One well-known example is the alleged Russian interference in the 2016 US election to help Donald Trump win office. A subsequent investigation by Special Counsel Robert Mueller concluded that Russia had interfered in the election by waging a social media campaign that favoured Trump while disparaging rival candidate Hillary Clinton.

Before wrapping up its investigation in 2019, Mueller’s office netted eight convictions and charged more than two dozen Russian individuals and entities, reports Politico.

The shift in public opinion caused by disinformation campaigns cannot be pinned down to a single pivotal moment, like the 1914 assassination of Archduke Franz Ferdinand that precipitated the First World War, or the 1941 attack on Pearl Harbor that pulled the US out of its self-imposed neutrality and into the Second World War.

In many ways, fake news campaigns are much more subtle with a lot of grey areas. For example, there is no way to conclusively determine that a single piece of news, sent by a single entity, is responsible for the disinformation. It could be an epiphany for some, or reinforce a belief in others — which makes it all the harder to defend against.
