Is the coronavirus merely an invention of the pharmaceutical lobby, or is it perhaps triggered by 5G technology? Conspiracy theories have not only become more prominent in the public eye in view of the corona pandemic, but are also finding a growing number of followers. The conspiracy movement QAnon in particular is becoming increasingly popular, in Germany as well. At the same time, QAnon is a special case among conspiracy myths: the movement is not only extremely “successful”, it can also be seen as an example of a new dimension of digital opinion-making and mobilisation, as it exploits the functional mechanisms of the social web in a highly elaborate manner. In this article we explain what the QAnon movement is, how it works, and what threats and implications can be derived from it for business and politics. For one thing is clear: online movements can be directed not only against a fictional conspiracy, but also against real companies, institutions and individuals.
The origins of QAnon go back to the dark depths of the internet in October 2017. At that time, an anonymous user spoke up on the online platform 4chan. 4chan is a so-called imageboard, on which users can publicly exchange information without registration or an account – i.e. completely anonymously. Due to its few community rules, 4chan is known, among other things, as a meeting place for internet trolls and a starting point for (right-wing extremist) hate and propaganda campaigns.
Back then, the user claimed on 4chan to be a member of Donald Trump’s innermost government circle and to have access to top-secret government information and plans. His pseudonym “Q” refers to the highest clearance level in the US Department of Energy, the Q clearance. Q quickly found a growing number of followers on the internet, who from then on attentively followed his posts and were convinced of their truthfulness. The fact that Q’s announcements – such as the imminent arrest of Hillary Clinton – never came true did not bother users. Soon the online community around Q and his posts started calling itself QAnon: Q in reference to the user’s pseudonym, and Anon in reference to “Anonymous”, the label that appears instead of a user name on imageboards.
Q continued to post allegedly secret information from the White House (so-called Q-Drops), and users quickly began spinning crude conspiracy theories from it: in summary, a secret elite of corrupt politicians, celebrities, and representatives of the press and business supposedly controls the US. This so-called deep state, which according to the users includes, for example, Barack Obama and Hillary Clinton, is preparing a coup to turn the USA into a dictatorship.
Moreover, this “deep state” is said to be satanic and paedophilic, kidnapping children, holding them captive and torturing them in order to “extract” the metabolic substance adrenochrome from them and consume it as a life-prolonging elixir. The only “saviour” fighting these criminal machinations is Donald Trump, according to QAnon supporters. Added to this are elements of various other conspiracy myths as well as current social developments. The corona pandemic, too, has been absorbed into the QAnon narrative: according to QAnon followers, it does not exist – the Covid-19 lockdown is in fact only a pretext for freeing the captured children from the hands of the deep state.
Far from it. Even if the content of the QAnon narrative sounds absurd, it is not only short-sighted but downright dangerous to dismiss the movement as mere nonsense or a gathering of crackpots. The success of the movement speaks for itself: in recent years QAnon has generated a large number of online contributions and offline demonstrations, Time magazine named Q one of the most influential people on the internet in 2018, and a book written by QAnon followers about the ideology and movement reached second place on Amazon’s US bestseller list in 2019.
All in all, QAnon seems to be more than a conspiracy community – scientists are convinced of this, too, and increasingly describe the movement as a kind of religious community or sect. How dangerous the QAnon “faith” can be is shown by terrible events outside the digital space: the Pittsburgh attacker, who killed a total of 11 people in 2018, is for example considered a follower of the QAnon movement.
Anyone who becomes aware of the dimensions of QAnon quickly asks: how could the posts of a single user grow into such a phenomenon? The answer lies somewhere between content and distribution. What is clear, however, is that the QAnon movement and its international success illustrate a new dimension of opinion-making and mobilisation via digital and, above all, social media. If one abstracts from QAnon’s content to its mechanics, the narrative’s content seems secondary and exchangeable: using QAnon’s “methodology”, other content could be propagated and a large following generated for it – as long as it is mysterious, sensational and, above all, emotionalizing enough. And this content can be turned against a multitude of actors, such as companies, industries, individuals or political institutions.
Let’s take a closer look at QAnon. In many respects the QAnon narrative resembles other well-known conspiracy myths in both content and structure: the story is not only exciting and mysterious, it also offers its followers a simplifying explanation of highly complex events in a constantly changing world. In addition, it names an actor who is responsible for all the suffering in the world and thus conveys the feeling of regaining control over one’s own life.
At the same time, QAnon conveys positive feelings such as hope, sublimity and meaning: on the one hand, the “great awakening” of the entire population and the rescue by Donald Trump are announced again and again; on the other hand, followers are given the feeling of being something special. For they alone have recognized the truth and fight – almost like secret agents – against the evil in the world. Moreover, QAnon follows an overarching master narrative that mixes facts with fiction, unites many other conspiracy narratives and, through its density and interweaving, immunizes itself against any form of criticism. Arguments against the truthfulness of QAnon can thus always be turned around into arguments for QAnon. This self-immunizing effect is also exploited, for example, in misinformation and disinformation campaigns.
At the same time, QAnon differs from other conspiracy myths and communities in that it takes advantage of two factors that work particularly well in the digital world: emotionalization and interaction or participation. The internet is no longer just about the passive reception of informative content, but about a reciprocal exchange in which every user can participate. Classical gatekeepers are losing relevance, and the individual user becomes an information producer and distributor himself. Content is published, shared, liked, commented on or remixed and can quickly achieve enormous reach.
This in turn is further amplified by new gatekeepers such as search engines and algorithms. In this context, social media becomes not only a platform for displaying opinions and stories that would probably have received little or no attention in the traditional media system, but an amplifier of them. And what works particularly well in social networks? Emotionalizing, exciting and polarizing content: it generates clicks, triggers discussions and manages to stand out in the outrage economy of the internet.
The large following and strong mobilisation of QAnon can be attributed to these factors: the content is not only mysterious and exciting, it is also strongly emotionalizing – after all, it is about the well-being of children. In addition, it strongly encourages participation and collaborative theorizing: the foundation of QAnon – the Q-Drops – is usually short and cryptic and therefore has to be deciphered, interpreted and assembled into a big picture by the online users themselves. This makes QAnon something like a collaborative conspiracy narrative, in which not a single person sets the story but every follower can contribute his or her part.
Moreover, the process seems almost gamified: deciphering and interpreting vague, cryptic hints and posts is fun, and active Q-Drop interpreters receive recognition from their peers. QAnon thus becomes a kind of online game in which the aim is to find the most conclusive explanations and convince as many users as possible of their truth. It follows that the followers of the QAnon movement are also interested in spreading their opinions and interpretations as far as possible in order to awaken and convince the unbelievers – because those who successfully persuade are rewarded by the community with prestige and confirmation.
QAnon also works so well because certain framework conditions have developed in the digital age which favour the spreading of, and belief in, rumours, disinformation and conspiracy myths. Through digital media we are permanently confronted with a multiplicity of (false) information and opinions from different sources. Checking whether these sources and their information are credible and valid is not only difficult, but also time-consuming and exhausting. Our convenience means that alleged facts are not always checked and false information is taken at face value. In addition, distributing (false) information in a news-like format is becoming ever easier for the individual user: anyone with internet access can quickly and easily publish and disseminate information or their own opinion on social networks or news websites.
Finally, in the so-called “post-factual age” we are experiencing a permanent state of doubt: we know about attempts to manipulate elections and the targeted dissemination of disinformation, and ultimately this leads to the feeling that we can never be completely sure what is true and what is false. Even scientific facts can be doubted or pseudo-scientifically fabricated (“fake science”), and the basis of our debates no longer seems to be a common consensus on facts. Rather, the apparently strongest arguments are constituted not by their truth content and factual basis, but by their emotional connectivity (such as their potential for outrage). The dangerous thing about the post-factual age is thus not only the massive spread of disinformation, but the consequence resulting from it: nowadays every assertion, whether based on true or false facts, at first seems legitimate and convincing – as long as it has high emotional persuasiveness.
This emotionality attracts attention, makes a piece of information, a conspiracy myth or a story stand out from the abundance of available (false) information, and leads us to ignore or actively block out contradictory information. In QAnon, followers are strongly emotionalized, for example by the “child abuse” component – following the motto “This is so terrible, we have to do something about it immediately”, believers subsequently block out counter-arguments.
In principle, conspiracies against companies or entire industries are nothing new – think, for example, of the pharmaceutical industry, which time and again is cast as the enemy by various conspiracy supporters. It is no surprise that companies can easily be written into such narratives as villains: on the one hand, corporations and large companies in particular are ascribed greater power and responsibility because of their resources. This, in turn, leads to a sharper moral assessment of corporate missteps, and the suspicion that companies abuse their power to pursue secret goals is far more plausible than in the case of the average consumer. On the other hand, corporations and large companies virtually represent the Goliaths of our time: they embody money-hungry, ice-cold power machines in direct conflict with the well-being of the little man, thus inviting a multitude of online Davids to fight against the big evil.
These points mean that companies can become the focus of a conspiracy campaign not only organically, but also purposefully – i.e. with a specific intention behind it. When individual companies become the focus of such a campaign, one can speak of “corporate conspiracies”.
There are usually two groups of actors behind such corporate conspiracies: attackers and profiteers. For attackers, the aim is usually to cause (financial) damage to a company or an industry with a targeted campaign. The motives vary: a competitor may want to gain a competitive advantage, a dissatisfied ex-employee may want to harm the company, or an interest group may want to advance a certain economic, political or social agenda. It is not difficult for attackers to spread false information about companies or industries and thus manipulate relevant stakeholders in a targeted manner.
Profiteers, on the other hand, have realized that spreading conspiracy myths can be lucrative. Around the QAnon movement, for example, there is a variety of merchandise such as T-shirts and mugs. Furthermore, content about QAnon generates a large number of clicks from QAnon followers: convincing more people of the QAnon story simply brings YouTubers, influencers or online media more traffic and therefore more advertising revenue. In the corporate context, this makes well-known corporations especially endangered: they attract attention, generate more clicks and traffic when mentioned, and thus larger returns. If profiteers are themselves the initial disseminators of corporate conspiracies, they are calculating precisely on the chance that conspiracy stories around a company will achieve a wide reach. If profiteers merely jump on the bandwagon of an attack by third parties, they usually reinforce the effect of the attacker’s campaign.
This question can hardly be answered in a generalized way. Although there are factors that increase the likelihood of becoming the victim of a conspiracy campaign (e.g. the size and publicity of the company), companies that do not meet these factors are by no means out of danger. A corporate conspiracy can also hit small businesses at a local or regional level, along the lines of: “This little restaurant never has many customers, and yet it has survived for years… surely they are laundering money!”
You don’t believe that something like this happens? And if it does, what is so bad about it? Here is a short real-life example to illustrate the seriousness of such a situation: during the American presidential election campaign in 2016, a conspiracy theory was shared on the internet which eventually led to the event known today as Pizzagate. At the centre of the theory was a small pizzeria in Washington, D.C., from whose basement online users were convinced a child-porn ring was operating.
Part of the criminal network – note the strong resemblance to QAnon – were, among others, Hillary Clinton, Barack Obama and Lady Gaga. The theory was shared by both real users and social bots on social networks, found many supporters and finally culminated in an attack in December 2016: an armed man stormed into the pizzeria to free the children allegedly abused there and fired two shots – luckily nobody was hurt. This example tragically illustrates what a conspiracy story on the net can lead to, and that it can strike just about anyone.
To find out to what extent your own company is at risk or already affected, a targeted risk assessment is suitable. Such an assessment should examine, on the one hand, how high the chance is of becoming the focus of a digital smear campaign, and on the other hand, how great the resulting damage would be. Possible questions here are: Could such a campaign have a negative impact on my reputation? Could policymakers be influenced to my disadvantage? What are the consequences for me if my entire industry becomes the target of such a conspiracy attack?
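The two dimensions of such an assessment – likelihood and impact – can be combined in a classic risk matrix. The following sketch shows one possible way to score them; all factor names, ratings and weights are hypothetical examples, not a validated methodology.

```python
# Illustrative sketch of a likelihood-x-impact risk score for a
# corporate-conspiracy assessment. Each factor is rated 1 (low) to 5 (high);
# the factor names and ratings below are invented examples.

def risk_score(likelihood_factors, impact_factors):
    """Average each factor group and multiply the two averages (range 1-25)."""
    likelihood = sum(likelihood_factors.values()) / len(likelihood_factors)
    impact = sum(impact_factors.values()) / len(impact_factors)
    return likelihood * impact

likelihood = {
    "company_size_and_publicity": 4,  # large, well-known brand
    "controversial_industry": 3,      # e.g. pharma, tech, finance
    "prior_online_attacks": 2,
}
impact = {
    "reputational_damage": 5,
    "political_influence_on_regulators": 3,
    "industry_wide_spillover": 4,
}

print(f"Risk score: {risk_score(likelihood, impact):.1f} / 25")
```

A score like this is only a prioritisation aid: it makes the two questions above (how likely, how damaging) explicit and comparable across business units, but the real work lies in choosing and justifying the factors.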
Web monitoring is also worthwhile in order to keep a permanent eye on the risk – after all, who wants to search the depths of the internet for new conspiracy myths every week? To achieve the greatest possible benefit, monitoring should be set up specifically for defined topics and channels. That is, based on the risk assessment, certain trigger keywords and sources are defined and monitored. Platforms and well-known groups in which conspiracy theories often originate are suitable here, for example Reddit, 4chan, Telegram or the Russian social network VKontakte. Furthermore, influencers of the conspiracy scene can be identified and recorded on an influencer map, which then helps to classify monitoring findings faster and more easily.
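At its core, such keyword-and-source monitoring can be sketched as a simple filter. The keywords, source names and posts below are mock data for illustration; in practice the posts would come from platform APIs or a commercial monitoring service.

```python
# Illustrative sketch of keyword-triggered web monitoring.
# Keywords and sources are hypothetical examples from a risk assessment.

TRIGGER_KEYWORDS = {"money laundering", "deep state", "cover-up"}
MONITORED_SOURCES = {"reddit", "4chan", "telegram", "vkontakte"}

def scan_posts(posts):
    """Return posts from monitored sources whose text contains a trigger keyword."""
    hits = []
    for post in posts:
        text = post["text"].lower()  # case-insensitive matching
        matched = [kw for kw in TRIGGER_KEYWORDS if kw in text]
        if matched and post["source"] in MONITORED_SOURCES:
            hits.append({**post, "matched": matched})
    return hits

# Mock input: in a real setup these would be fetched from the platforms.
sample_posts = [
    {"source": "reddit", "text": "That restaurant must be a money laundering front."},
    {"source": "reddit", "text": "Great pizza, friendly staff."},
]

for hit in scan_posts(sample_posts):
    print(hit["source"], "->", hit["matched"])
```

Real monitoring tools add layers on top of this (deduplication, sentiment, reach estimation, the influencer map mentioned above), but the principle – defined triggers scanned against defined sources – stays the same.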
At this point there is good news and bad news. The bad news first: you can never completely prevent a negative campaign from spreading on the web, especially if it is a direct attack – because then many forces are working to spread rumours, false reports or conspiracies about you and your company online.
Now the good news: You can mitigate the risk as well as possible effects of such a campaign. Risk analysis and monitoring help to identify emerging campaigns early on and take timely action against them. A comprehensive strategy helps you to act adequately in case of an emergency, to initiate the right measures and to minimize consequential damages. And sensitized and trained employees reduce risks and independently contribute to greater crisis resilience.
Do you need support and advice on this topic? We are happy to help you.
Better safe than sorry.