Disinformation Soviet and American Style

During the early 1980s, I went through a phase in which I read anything I could about the KGB, the Soviet counterpart to the CIA. Of the books I read on this topic, John Barron’s KGB: The Secret Work of Soviet Secret Agents, published in 1974, was and remains my favorite. In 1983 he wrote a follow-up, The KGB Today, and in the late 1980s he wrote Breaking the Ring, which described how the Walker family spy network plundered the US Navy’s cryptographic keys (giving the Soviets unlimited access to our military planning and making this the worst intelligence disaster in US history).

In this post, I’ll focus on Barron’s first book on the KGB (the one published in 1974 and depicted above), and then draw lessons from it for the present. The book covers many different aspects of Soviet spycraft, but the one I’ll be concerned with is disinformation, a concept that in the last few years has received enormous attention in the US. In particular, Russian disinformation aimed at US election interference has been much discussed in the mainstream media. In this respect, the current Russian leader, Vladimir Putin, has become an archvillain. Putin, who will be 72 this October, served as a KGB officer from 1975 to 1991, rising to the rank of lieutenant colonel (unlike the civilian CIA, the KGB used military ranks).

The KGB, short for Komitet Gosudarstvennoy Bezopasnosti (Committee for State Security), was the main security agency for the Soviet Union from 1954 until both it and the Soviet government collapsed in 1991. Originating from earlier Soviet secret police organizations like the Cheka and NKVD, the KGB played a crucial role in imposing Soviet control both domestically and internationally during the Cold War. Even after its official dissolution, the KGB’s legacy is evident in Russia’s current intelligence services, such as the FSB, which operates out of the KGB’s former quarters in Moscow. Putin’s inner circle includes former KGB associates such as Nikolai Patrushev and Sergei Ivanov.

Barron’s KGB is now 50 years old and long out of print. Used copies still circulate, and the book can be viewed at the Internet Archive. In what follows, I’ll quote from it extensively, citing page numbers from the 1974 Reader’s Digest Press edition. I’ll also intersperse commentary of my own. Then, in the second part of this post, I’ll draw on Soviet disinformation as a lens for understanding American disinformation in the present. To get that ball rolling, let’s start with Barron’s definition of disinformation:

The KGB tries to discredit individuals, institutions, governments, or policies of other nations through what it calls Dezinformatsiya (Дезинформация), or disinformation. This Russian term embraces not only forgeries, literary hoaxes, and dissemination of false information but also the commission of physical acts such as sabotage and murder for psychological effect. (p. 23)

According to this definition, disinformation is about more than simply disseminating misleading information intended to deceive people. Compare this definition to the Wikipedia entry on disinformation, where disinformation is characterized as a form of deceptive verbal communication. But, as the Soviet example shows, disinformation’s arsenal contains a lot more than deceptive words.

Nor should disinformation be understood simply in terms of propaganda—getting people to believe some party line, whatever it happens to be. Governments engage in overt propaganda. With such propaganda, we know who is doing the communicating and who the target audience is. With disinformation, there’s always secrecy about its ultimate source—something overt is happening, but something behind the scenes is pulling the strings.

The key word in Barron’s definition of disinformation is discredit. Whatever its means, that is its aim. Everything has a reputation. And so, if the KGB judged that a reputation weighed against Soviet interests, that reputation became fair game as a target for disinformation. Disinformation uses covert means to discredit reputations, be they the reputations of individuals, institutions, governments, policies, or whatever.

In this light, consider the letter from October 2020 signed by 51 former US intelligence officials. Released just before the November election, it claimed that Hunter Biden’s laptop was Russian disinformation—an elaborate hoax concocted by the Russians. Supposedly, the Russians had fabricated this laptop to discredit not only Hunter Biden but also his father, Joe Biden, who at the time was running for president of the US.

As it is, the laptop was confirmed to be genuine after the election, so any discrediting done by the laptop to the Biden family could be squarely laid at the feet of Hunter Biden for foolishly letting the laptop out of his hands. In this case, the discrediting was deserved, and none of it could reasonably be laid at the feet of the Russians.

Unlike discredit that is earned and deserved, disinformation is in the business of producing undeserved discredit. Thus it could be argued that the CIA letter was itself an act of disinformation, designed to discredit the laptop’s authenticity. It can be an act of disinformation to call something disinformation when in fact it is true.

This example also illustrates how disinformation can be time-sensitive: it was enough to discredit the Hunter Biden laptop just until the election. If the CIA letter was disinformation, it worked splendidly: a 2022 poll indicates that voters would likely have swung the election away from Joe Biden had the laptop’s authenticity been accurately reported in the weeks before the 2020 election.

Hunter Biden’s laptop, whatever its status as disinformation, took the form of digital communication/information (emails, pictures, videos). Yet for the Soviets, the means of disinformation were far wider and more varied (not just alphanumeric characters shared across communication channels). Consider the following illustration of disinformation, which comes immediately after Barron’s definition of the term. Even though the operation described here never made it past the drawing board, the Soviets seriously considered carrying it out:

Some disinformation operations are designed to exploit legitimate popular concerns about world problems or issues. The U.S. Department of Defense, for example, has learned authoritatively that the KGB seriously considered leaking radioactive waste into waters around a base used by Polaris submarines. “Discovery” of the contamination would have been widely cited as proof that nuclear submarines menace the environment. Such “proof” in turn was to be used in a KGB propaganda campaign against construction of the new generation of Trident submarines, which will form the backbone of the Western nuclear deterrent in coming years. The KGB abandoned this particular operation as technically unfeasible. But it continues to survey other means of capitalizing on environmental apprehensions to obstruct Western military and technological programs inimical to Soviet interests. (p. 24)

Submarine in the Polaris Programme

From this example, we see that disinformation is achieved by manipulating perceptions using any available means, not limited to disseminating verbal or digital information. Radioactive waste leaked near Polaris submarines, once discovered, would have led to “the obvious conclusion” that these submarines were environmental catastrophes waiting to happen, and would have made it much harder for the US military to convince politicians (especially with worried constituencies breathing down their necks) to fund the newer generation of Trident nuclear submarines. The US nuclear submarine program would thereby have been discredited. The disinformation here would have consisted of environmental sabotage.

Later in Barron’s book, in a chapter devoted entirely to disinformation (titled “Disinformation: Poisoning Public Opinion”), readers learn the official Soviet definition of disinformation. Even though the definition is short and sweet, and seems simply to denote the spread of false information, as Barron explains, it covers quite a bit more ground:

The Russians define disinformation as “the dissemination of false and provocative information.” As practiced by the KGB, disinformation is far more complex than the definition implies. It entails the distribution of forged or fabricated documents, letters, manuscripts, and photographs; the propagation of misleading or malicious rumors and erroneous intelligence by agents; the duping of visitors to the Soviet Union; and physical acts committed for psychological effect.

These techniques are used variously to influence policies of foreign governments, disrupt relations among other nations, undermine the confidence of foreign populations in their leaders and institutions, discredit individuals and groups opposed to Soviet policies, deceive foreigners about Soviet intentions and conditions within the Soviet Union, and, at times, simply to obscure depredations and blunders of the KGB itself. (pp. 165–166)

As this elaboration of the Soviet idea of disinformation shows, while the principal aim of disinformation is to discredit, a constellation of related aims accompanies this principal aim. These aims—with discredit at the top of the list—include:

  • Discrediting: Undermine the credibility of individuals, organizations, governments, or ideas.

  • Demoralizing: Weaken the morale and resolve of a population or group, often by spreading false negative rumors about one’s own or one’s adversary’s strength.

  • Defaming: Tarnish or ruin the reputation of a target by spreading harmful lies or exaggerations.

  • Covering Tracks: Divert attention from real actions or events, obscuring the truth and preventing accountability.

  • Creating Confusion: Generate uncertainty and doubt by spreading conflicting information, making it difficult for people to discern the truth.

  • Manipulating Public Opinion: Influence the beliefs and attitudes of a population to serve specific political, ideological, or economic interests.

  • Inciting Conflict: Foster division and strife within or between groups through inflammatory or divisive provocations.

  • Undermining Social Cohesion: Erode trust and unity within a society, making it more vulnerable to external or internal threats.

  • Obfuscation: Cloud the understanding of an issue by mixing facts with lies, making it difficult to separate truth from falsehood.

  • Distracting: Shift public focus away from critical issues or scandals by introducing sensational or misleading narratives.

The most instructive example of disinformation in Barron’s book concerns a Soviet campaign in the late 1950s to discredit West Germany by getting people around the world to think that Germany’s Nazi past was anything but past and that antisemitism was running amok within its borders. The Soviet disinformation campaign in this instance was masterfully executed and at the time caused an international sensation. I’ll quote Barron at length because the example is so revealing about Soviet disinformation and so illuminating about disinformation in our own day. It also underscores how difficult it can be to recognize disinformation as disinformation—only years later did defectors reveal the true story.

The Soviets organized the following disinformation campaign to reduce Germany’s standing in NATO and with its allies:

The KGB perceived another means of defaming West Germany (when) a West German high-school teacher in 1958 made vile anti-Jewish remarks, juvenile delinquents defiled gravestones in a Jewish cemetery, and a few Jewish families received anonymous hate letters. The incidents angered the German press and stimulated a spate of stories in the foreign press speculating about a possible rebirth of Nazism. They also bred an idea in the mind of General (Ivan) Agayants.

On Christmas Eve 1959, a twenty-five-year-old German, aided by an accomplice, smeared swastikas and a slogan, “Germans Demand That Jews Get Out,” on the synagogue in Cologne. A Jewish memorial a mile away also was defaced. In the next few nights, swastikas and anti-Semitic slogans were painted on synagogues, tombstones, and Jewish-owned stores in more than twenty West German towns and cities. Jews received threatening anonymous telephone calls and letters.

During the New Year’s weekend, swastikas and slogans were daubed on synagogues and Jewish buildings in London, Oslo, Vienna, Paris, Parma, Glasgow, Copenhagen, Stockholm, Milan, Antwerp, and New York. On January 3, further outbreaks of anti-Semitism were reported in Melbourne, Manchester, Athens, and Perth, Australia. On January 6, more desecrations occurred in Bogota, Buenos Aires, Milan, Oslo, Vienna, and the summer home of King Frederik IX of Denmark. A Jewish member of the British Parliament was provided a bodyguard after his life was threatened by an anonymous caller purporting to be a representative of the “British Nazi Party.” Meanwhile, the epidemic of desecrations intensified and spread throughout West Germany.

The worldwide reaction was instant and almost uniformly disparaging of West Germany. The American poet Carl Sandburg advocated death for anyone caught painting swastikas. In London, Lord Robert Boothby, saying he had been told of a “rising tide of Nazism,” announced that he was going to Germany to investigate the situation. West German diplomats were ostracized. British businessmen canceled contracts for purchases of West German products, fired German employees, and removed German goods from their shelves. British newspapers voiced doubts about whether West Germany could be trusted as a partner in NATO.

The attitude of leading newspapers in Europe and America was typified by a headline in the New York Herald Tribune: “Bonn Unable to Eliminate Nazi Poison.” The loudest shrieks came from Moscow. “These disgusting fascist provocations and manifestations of the swastika are directed toward the fanning of the cold war and toward the poisoning of peoples against peoples,” intoned Pravda.

West German leaders were reduced to offering abject apologies and statements of self-abasement. Bishop Otto Dibelius termed the outbreaks proof that the German nation had not overcome its past. West Germany found few defenders. One foreign leader who came publicly to its defense was Israeli Prime Minister David Ben-Gurion. “The young generation in Germany is not Hitlerist but the opposite,” he declared.

Between Christmas Eve 1959 and mid-February 1960, West German authorities recorded 833 separate anti-Jewish acts. Then the epidemic ceased almost as suddenly and mysteriously as it had begun. Police arrested and interrogated 234 people. Analyzing their motives, the government concluded that 24 percent acted out of “subconscious Nazi motives”; 8 percent were inspired by extreme rightist or leftist beliefs; 48 percent were drunks or thugs; 15 percent were children; and 5 percent were mentally deranged.

Responsibility for a majority of the acts remained unfixed. All along, West German spokesmen expressed a vague suspicion of organized, clandestine communist involvement. The young German and his accomplice who were convicted of desecrating the Cologne synagogue on Christmas Eve belonged to a small right-wing political party and at their trial unrepentantly mouthed Nazilike statements. But police established that both frequently had visited East Germany, and one had a Communist Party badge hidden behind his coat lapel.

In a separate case, Bernhard Schlottmann, the twenty-two-year-old treasurer of a neo-Nazi organization in West Berlin, confessed after his arrest that he was an East German agent under orders to infiltrate and foment anti-Semitism among extremist factions. The simultaneous appearance of swastikas in many cities on different continents as well as the sudden abatement of the campaign suggest an organized operation. But in the absence of proof, insinuations of communist complicity sounded unpersuasive at the time.

Western security services did not begin to learn what actually happened until later in the 1960s, when defectors revealed that the whole swastika operation had been conceived by General Agayants himself. In the shocked reaction to the early isolated incidents of anti-Semitism in 1959, Agayants discerned the world’s sensitivity to anything smacking of resurgent Nazism. He reasoned that if a few acts caused concern, a massive and continuing rash of them would generate great fear and distrust of the West Germans.

To test the feasibility and practicality of desecrating Jewish shrines, he sent a team of KGB agents to a village about fifty miles from Moscow. One night they smeared swastikas, kicked over tombstones, and painted anti-Jewish slogans, then escaped undetected. KGB agents stationed in the village to gauge the public reaction reported that most people were disturbed or frightened by the swastikas. But appearance of the Nazi insignia also awakened latent anti-Semitism among a few Russians and inspired them to commit a variety of anti-Jewish acts on their own.

Some weeks after this trial run in the Soviet village, the KGB began the operation, relying upon East Germans in West Germany and its own agents in other parts of the world. A later defector, Rupert Sigl, who worked for the KGB in East Berlin from 1957 to 1969, corroborated the reports received from other KGB officers. He stated that at the height of the swastika campaign, he was ordered to translate hate letters from Russian into German for mailing to Jewish families in West Germany. (pp. 172–174)

This KGB disinformation campaign successfully discredited West Germany by fabricating a wave of anti-Semitic incidents, including the defacement of synagogues and Jewish cemeteries with swastikas. This campaign, conceived by Soviet General Ivan Agayants, was designed to create the appearance of a resurgent Nazi movement on German soil, leading to widespread international condemnation of West Germany.

Coordinated antisemitic acts extended beyond Germany to other parts of the world, severely damaging West Germany’s reputation and credibility. Though it seemed at the time a spontaneous movement of Nazi fervor, defectors later revealed the operation’s KGB origins, highlighting the Soviet Union’s strategic use of psychological operations to manipulate global opinion. Repercussions from this disinformation campaign included the following:

  • Economic Impact on West Germany: British businesses canceled contracts, fired German employees, and removed German goods from their shelves, resulting in significant economic loss for West Germany.

  • Visceral Reaction from Public Intellectuals: The reaction of the American poet and three-time Pulitzer Prize winner Carl Sandburg was emblematic of the public outrage: he advocated the death penalty for anyone caught painting swastikas.

  • Ostracism of West German Diplomats: West German diplomats were socially and politically ostracized by the international community due to the widespread belief that Nazism was resurging in the country.

  • Doubts about NATO Partnership: British newspapers and other international media expressed doubts about West Germany’s reliability as a NATO partner, questioning whether the country could be trusted.

  • Religious Shaming of West Germany: Bishop Otto Dibelius publicly declared that the anti-Semitic outbreaks were proof that Germany had not fully overcome its Nazi past, adding to the national shame and self-reproach.

  • Chest Beating by West German Leaders: West German leaders were compelled to offer abject apologies and statements of self-abasement, reflecting the deep impact of the campaign on the nation’s dignity.

  • Heightened Global Fear of Nazi Resurgence: The orchestrated campaign successfully heightened global fears of a potential Nazi resurgence, significantly damaging West Germany’s international reputation.

  • Public Outcry and Media Criticism: Leading newspapers in Europe and America harshly criticized West Germany, with headlines like “Bonn Unable to Eliminate Nazi Poison,” contributing to the global condemnation.

  • Reputation Damage and Diplomatic Isolation: The disinformation campaign significantly damaged West Germany’s reputation on the global stage, leading to its diplomatic isolation and increased scrutiny from its allies.

  • Soviet Propaganda Victory: Soviet media, particularly Pravda, exploited the situation to further discredit West Germany and fan the flames of the Cold War, portraying the incidents as proof of Western decadence.

The deviousness of this disinformation campaign almost makes one want to say to the KGB, “Job well done!” In fact, the only downside for the Soviets was the discovery, years after the fact, that the KGB had indeed incited world opinion against West Germany by deceptively making it seem that antisemitism was rampant within its borders. Even the feasibility study in a village outside Moscow, which provoked and entrapped Russians into committing antisemitic acts of their own, was conducted masterfully.

My own experience confirms the likely effectiveness of this disinformation campaign. I spent three and a half years growing up in West Germany, from 1963 to 1967. In subsequent years, I would also spend entire summers in Germany (mainly in the university town of Erlangen). I well remember Germans being obsessive about trying to live down their Nazi past, denying as much as possible any knowledge of or complicity in the persecution of Jewish people. At one point, I approached a German bookseller to buy a copy of Hitler’s Mein Kampf in the original German. Buying the book was forbidden. The only way to get a copy was as an academic doing research that required it, and then only after an official review.

To round out our discussion of Soviet disinformation, consider some additional examples and insights by Barron on the topic.

Disinformation’s Potential to Backfire: Disinformation is based on deception, so if the deception is uncovered, the disinformation can backfire.

In practicing what it calls disinformation, the Soviet Union has for years sponsored grand deceptions calculated to mislead, confound, or inflame foreign opinion. Some of these subterfuges have had considerable impact on world affairs. Some also have had unforeseeable consequences severely detrimental to Soviet interests. Ultimately, they have made the Soviet Union the victim of its own deceit.

After World War II, when the Russians were threatening Greece, Iran, and Berlin, they leaked data grossly exaggerating the strength of the Red Army. They intended to intimidate the West. Instead, they provided an added stimulus to Western rearmament and the formation of NATO. (p. 164)

The Need for Secrecy in Disinformation: This point has already been made, but it underscores why organizations engaged in disinformation operate through covert measures.

Disinformation operations differ from conventional propaganda in that their true origins are concealed, and they usually involve some form of clandestine action. For this reason, Soviet rulers always have charged their clandestine apparatus with primary responsibility for disinformation. (p. 166)

Creating a Potemkin Village: Disinformation thrives on creating alternate realities, thereby turning its targets into stooges willing to drink the Kool-Aid and recommend it to others. No less a figure than George Bernard Shaw was duped in this way.

Visiting politicians, scholars, journalists, clergymen, and other professionals whose opinions are influential are the most common targets of concerted disinformation efforts. Typically, the KGB strives to control what the foreigner sees while persuading him that he is freely seeing what he wishes. It tries to shape his conclusions while making him think that he is reaching them on his own. The KGB often succeeds because of its ability to control the Soviet environment. The record reveals many illustrious victims of its manipulations.

Sir John Maynard, a British agricultural expert, went on an OGPU-guided tour of the Ukraine at the height of the 1932–33 famine that, according to Robert Conquest’s calculations, took five to six million lives. Back in London, Maynard assured the world there was no famine; isolated food scarcities perhaps, but certainly no widespread hunger. Similarly, George Bernard Shaw returned from an OGPU tour to aver that there was no evidence of starvation. After all, he noted, the hotels where he dined abounded with food. (p. 167)


Forged and Fraudulent Documents: Disinformation employs symbols of authenticity, often purloined, to convince its targets of the truth of something that is in fact false. In reading the next quote, imagine if the Soviets had had access to today’s deep fakes.

The most common form of KGB disinformation is based upon forged documents and fraudulent literature. The KGB and satellite services throughout the world regularly collect signatures of prominent foreigners, letterheads, and governmental forms for use in forgeries. The STB (the Czech equivalent of the KGB) perceived some years ago that many Westerners receiving a Christmas card feel socially obligated to send one in return. So Czech embassies adopted the custom of mailing handsome Christmas greetings to government officials and prominent people. The signatures harvested in response are filed away at the monastery in Prague, but are also shared with the KGB. Occasionally the KGB distributes forged documents through communist agents. More often, the bogus documents are mailed to newspapers, magazines, and government offices with a covering letter bearing an illegible signature or none at all. (p. 174)

Aleksandr Solzhenitsyn

Making Heroes Look Contemptible: It’s hard to imagine any Russian figure in the latter half of the 20th century with greater moral authority than Aleksandr Solzhenitsyn. Yet KGB disinformation attempted—unsuccessfully—to make him look vain and contemptible.

(KGB agent Victor) Louis returned to Europe in the spring of 1968, this time peddling a stolen copy of Solzhenitsyn’s then unpublished Cancer Ward. Had Louis’s machinations not been exposed by the émigré publishing house of Grani, the KGB could have arrested Solzhenitsyn on grounds of unauthorized publication abroad and justified its suppression of his book in the Soviet Union on the pretext that the work was being exploited as anticommunist propaganda abroad.

In another disinformation effort against Solzhenitsyn, Louis on March 16, 1969, published in the Washington Post an article purportedly based on an “interview” with the author. The statements Louis attributes to him smack of sheer fabrication. Solzhenitsyn has demonstrated personal courage, integrity, and selflessness, yet the quotations Louis ascribes to him are those of a whining, self-pitying, selfish seeker of martyrdom. The Louis article suggests that Solzhenitsyn viewed Russia’s wartime occupation by the Nazis favorably and that the horrors of the concentration camps were the fault of Beria rather than Stalin. (pp. 178–179)

Disinformation as Damage Control: Sometimes KGB intelligence operations went completely off the rails. In the late 1960s, for instance, the KGB tried suborning a Lebanese pilot to steal a French Mirage fighter jet. The pilot wore a wire, recorded all the conversations, and caught the KGB agents red-handed. Two KGB agents even ended up in a hospital after engaging Lebanese soldiers in a gun battle and getting shot. It should have been the ultimate embarrassment for the KGB, except it wasn’t…

The Soviet Union resolved to brazen and lie its way out of the mess, and the KGB hastily began a massive disinformation campaign. By midmorning of October 1, KGB agents were closeted with influential Soviet sympathizers in Lebanon, Egypt, Syria, and Iraq, telling them what to say: the whole Mirage affair was an American provocation from beginning to end, contrived to sabotage Soviet-Lebanese relations.

Novosti (the Soviet news agency) flooded Arab newspapers with stories stressing all that the Russians had done for the Arabs. In Beirut, Vladislav Petrovich Zhukov, a KGB officer assigned to work with Palestinian guerrillas, went quickly to the Lebanese Foreign Office. Hissing his words because of a dental problem, Zhukov threatened the expulsion of all Lebanese diplomats from Moscow if the Lebanese government continued to publicize the incident…

The pressures generated by the KGB began to have their effects. “A provocation and fabrication!” shrieked Radio Damascus in Syria. “All indications point to a foreign intelligence conspiracy,” declared the semiofficial Egyptian newspaper Al Ahram. “A cheap American conspiracy to undermine our relations with the Soviet Union,” charged Kamal Jumblatt, a leading leftist member of the Lebanese Parliament. Other leftists denounced the Second Bureau for daring to embarrass Lebanon’s good friend, the Soviet Union.

More important than the public bombast were the private visitations by representatives of other Arab states, especially Egypt and Syria, all of whom conveyed the same message: hush up the Mirage affair immediately and get our Russian friends off the hook.

The pressure intensified all day on October 2, and that afternoon little Lebanon, which survives by the sufferance of its bigger Arab neighbors, capitulated. The government issued a censorship ban prohibiting press mention of the case and forbidding foreign correspondents from filing any more dispatches about it. Further references would be “extremely harmful to Lebanon’s higher interests,” said the censorship decree.

The story of the Soviet plot to steal the Mirage flashed only fleetingly before world attention, then vanished entirely. Confronted by the censorship order, foreign journalists wondered whether the government did not, in fact, have something to hide, and few publications troubled to pursue the story further. The KGB had succeeded in making the happening a non-event. (pp. 185–186)

The brazenness of this disinformation campaign calls to mind the classic scene in the film A Guide for the Married Man (1967) in which Joey Bishop is caught by his wife in bed with another woman, only to escape any fallout through relentless denial. Who are you going to believe, him or your lying eyes? How different, really, is Joey Bishop gaslighting his wife from our media shills gaslighting Americans?

In their day, the Soviets took disinformation to new heights, turning it into both a science and an art form. In the second half of this post, I want to examine disinformation in our own place and time, drawing lessons, where relevant, from what the Soviets have to teach about disinformation.

Much of what gets called disinformation these days is just garden-variety lying that one side in a controversy ascribes to the other in order to invalidate its views. This approach at once trivializes and weaponizes disinformation. It trivializes disinformation by making it largely a synonym for misinformation (false information), and it weaponizes it by depicting those who promote it as bad actors driven by criminal intentions. And so, obviously, these bad actors must be brought to account. And who better to bring them to account than the government and its legal muscle?

By contrast, all the examples of KGB disinformation we considered earlier were carried out in the shadows. Indeed, the success of these campaigns depended on people not recognizing that they were being disinformed. We learned of the KGB disinformation operations discussed earlier only through defectors and other breaches of the KGB’s veil of secrecy. It’s probably safe to say that the vast majority of KGB disinformation operations remain to this day behind an intact veil of secrecy, so we don’t even know of the many instances in which the KGB disinformed its victims.

But most of the time these days that we hear about disinformation, it is used merely as a label applied to narratives deemed inconvenient by those applying the label. Applying this label gets serious—weaponized—when it is given legal teeth and used to enforce compliance by tech masters and government overlords. Disinformation whose source is known is no longer disinformation. At best, it is misinformation. And often it is malinformation—information that’s true but that makes ruling the masses more difficult for those in power.

Nina Jankowicz embodies this unserious understanding of disinformation. Briefly in 2022, she headed the Department of Homeland Security’s Disinformation Governance Board (DGB). Jankowicz was the DGB’s first and last executive director. The DGB began on April 27, 2022, and lasted all of three weeks, being dissolved May 18, 2022. Jankowicz helped ensure the DGB’s quick demise by being so thoroughly partisan and unprofessional. She released a goofy video of herself singing about disinformation to a Mary Poppins tune. Lacking serious academic credentials, she nonetheless styled herself an expert on disinformation because of research she did as a “disinformation fellow” with the Smithsonian’s Woodrow Wilson Center.

This research led to her 2020 book on disinformation titled How to Lose the Information War: Russia, Fake News and the Future of Conflict. Casting Russia as a prime purveyor of disinformation, it reads like a partisan piece that openly advocates progressive politics, dismissing any opposing views as threats to democracy. Nowhere in the book does she define the term disinformation, though she regards it as a serious problem: “It shakes the bedrock of democracy, and it can be deadly.” (p. xiii) Imagine her as the Counter-Disinformation Czar charging citizens with disinformation, but without needing to define the term and able to make it mean whatever she wants it to mean.

Lest readers think I’m exaggerating Jankowicz’s unseriousness, consider the following two paragraphs from her book. This passage typifies her lack of substance. In reading it, ask yourself how her book could reasonably have qualified her as our chief government official to secure America against actual disinformation (this book was her main qualification for the job). For context, the person described below is Ryan Clayton, a progressive friend of hers, whose opposition to Donald Trump she applauds, but who she claims, following the protest described below, became an object of Russian disinformation (apparently an $80 Facebook ad was enough to dupe him). Here’s her account of her friend before the Russians supposedly disinformed him:

Clayton and a few progressive friends started ATA (Americans Take Action) right after Trump’s November 2016 victory. They were in New York, standing “outside of Clinton’s Victory Party,” he says, using air quotes, “and I’m like, look, he’s going to win … We can literally be the first people protesting during his victory speech.” Clayton and his friends headed to the Trump Victory Party. “We are the first people protesting Trump, on the timeline. Everyone else is drinking their tears and shell-shocked and in hiding.” ATA’s protests continued; they secured tickets to Trump’s inauguration and stood up and linked arms as the President-elect was taking the Oath of Office, revealing blue T-shirts with red-and-white letters that spell out “RESIST.” Clayton wore the letter T. He and his friends were arrested; the picture of their protest became an icon of the Trump era.

At the Conservative Political Action Conference that year, Clayton and ATA handed out Russian flags emblazoned with Trump’s name in gold letters during his address to the attendees. They were found out and ejected from the speech, but not before hundreds of audience members, unaware that they were holding the Russian standard, blissfully waved the flags while chanting “USA! USA!” Later that year ATA attended the Washington Nationals baseball home opener and dropped a “RESIST” banner from the upper levels of the stadium. In short, creative protest is what ATA built their movement around. (pp. 12–13)

Disinformation is always aimed not at friends but at opponents. Instead of falling under a hot or kinetic war, disinformation falls under PsyOp or PsyWar (psychological operations or psychological warfare). PsyOp/PsyWar is well studied. The CIA and the military have even produced PsyOp manuals that are available from Amazon: The CIA PSYOP Manual (2017) and Military Psychological Operations Manual (2009).

In the aftermath of Covid-19, PsyOp/PsyWar has come into greater public consciousness, with many people suspecting that disinformation campaigns were used to influence public opinion about the virus’s origin and its proper treatment. Recently, Hollywood’s Mike Smith did an interesting documentary on PsyOp/PsyWar called Into the Light (which includes a remarkable interview with Lara Logan). Robert Malone’s PsyWar: Enforcing the New World Order is due out October 8, 2024. Mike Benz has a fascinating perspective on what he regards as the chief purveyor of disinformation in the US—the Blob:

Psychological warfare covers an array of techniques, some of which overlap. The following list highlights the scope of PsyOp/PsyWar. This list is meant to be wide-ranging but not exhaustive:

Disinformation: Covertly injecting words and actions into situations to discredit people, institutions, policies, and ideas in the eyes of a target audience by deceiving and manipulating their perceptions and attitudes.

Propaganda: Overtly promoting certain ideas, doctrines, or ideologies, typically by a government or political group, to shape public opinion or behavior in favor of the propagator’s objectives.

Behavioral Influence/Operant Conditioning: Employing methods from cognitive-behavioral psychology to shape the behavior of individuals or groups through psychological pressure, rewards, punishments, or persuasive messaging.

Social Engineering: Manipulating individuals or groups by exploiting their trust, emotions, or cognitive biases to make decisions that compromise their security, privacy, or well-being.

Demoralization: Undermining the morale and willpower of adversaries through various psychological tactics that create feelings of hopelessness, fear, and doubt, leading to decreased combat effectiveness and weakened resolve to resist or fight.

Information Overload: Overwhelming targets with excessive information to impair decision-making abilities by causing confusion, distraction, and difficulty in distinguishing between important and irrelevant data, thereby paralyzing their ability to respond effectively.

Cognitive Dissonance: Creating conflicting beliefs or values to induce psychological discomfort and influence behavior, leading individuals to change their attitudes or actions to reduce the mental stress caused by internal conflict.

Emotional Manipulation: Exploiting emotions like fear, anger, or hope to shape perceptions and actions by bypassing rational thought, thereby guiding individuals or groups towards decisions that align with the manipulator’s objective.

Perception Management: Controlling how information is presented to influence public opinion and decision-making by carefully crafting messages, framing narratives, and selectively disclosing facts to create a desired perception or reality.

Symbol Manipulation/Cultural Subversion: Leveraging cultural symbols, values, and norms for the specific emotional or ideological responses they evoke, exploiting their grip on society to influence attitudes, behaviors, and beliefs.

Cultural Awareness: Understanding and leveraging cultural nuances to enhance message resonance by incorporating local traditions, languages, and symbols, thus ensuring that psychological operations are culturally sensitive, contextually relevant, and more likely to influence the target audience effectively.

Gaslighting: Manipulating targets to question their own memory, perception, or sanity by presenting false information, denying facts, and creating doubts, often leading individuals to depend on the manipulator’s version of reality.

Narrative Control: Shaping and controlling the dominant stories or explanations surrounding events or issues by promoting certain interpretations, suppressing alternative views, and ensuring that the preferred narrative is accepted as true.

Rumor Campaigns: Deliberately spreading rumors or unverified information to create uncertainty, panic, or mistrust among the target population, often leading to social unrest or confusion.

Psychological Intimidation: Using threats, fear, psychological coercion, and even terror in small doses to compel individuals or groups to comply with certain demands or to deter them from taking undesired actions.

Target Audience Analysis: Identifying and studying specific groups to tailor messages for maximum impact, ensuring that psychological operations resonate with the targeted audience to exploit vulnerabilities and reinforce desired behaviors.

Credibility Building: Establishing trust and reliability with the target audience to increase message effectiveness by aligning with the audience’s existing beliefs and trusted sources, thereby enhancing the perceived legitimacy of a PSYOP campaign.

Measurement and Evaluation: Assessing the effectiveness of PSYOP campaigns by analyzing their impact on target behaviors, monitoring feedback loops, and refining messages to improve future planning and operations.

Disinformation is about surreptitiously influencing people’s psychology to benefit those doing the influencing and undermine those being influenced. Because a disinformation campaign, by definition, operates in the shadows, with the people behind it trying to keep their roles and identities hidden, disinformation always results from a conspiracy.

The very word conspiracy comes from the Latin conspirare, which combines con, meaning together, and spirare, meaning to breathe. Its original sense therefore was to agree, to unite, to act in harmony (breathing the same air together) to achieve a common goal—which could be good, bad, or indifferent. Over time, the word took on a negative connotation, referring to a secret agreement between people to perform illegal, harmful, or deceitful acts.

It’s in this sense that we use the word today. Yet in this sense, conspiracies are and have always been widespread and unexceptional. The economist Adam Smith noted the ordinariness of conspiracies in his Wealth of Nations when he wrote, “People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices.” Who can therefore doubt that big tech, big pharma, big ag, big military, big energy, and big government are all involved in such conspiracies, colluding within and among themselves?

When people try to dismiss and ridicule conspiracies, often they play scary music and pretend that conspiracies, to be conspiracies, must be overtly satanic. But as Hannah Arendt noted after the Adolf Eichmann trial, most of the evil that people do—even as done by an Eichmann who, in his role as a paper-pushing Nazi bureaucrat, saw to the efficient destruction of millions of Jews—most of that evil is banal. It doesn’t require devious or nefarious planning. It is at home with complicity and compromise. It flows quite naturally out of human corruption.

The comedian George Carlin saw through attempts to dismiss conspiracies when he remarked, “You don’t need a formal conspiracy when interests converge.” Indeed, converging interests readily give rise to tacit conspiracies (video @ 92 sec):

These days, it’s common to disparage people who believe in conspiracies by labeling them conspiracy theorists. Applying this label is meant to discredit their views and make them seem like kooks. But critics applying the label then go further by invoking disinformation as the reason that they are conspiracy theorists. Consequently, they are portrayed as dupes who have been the target of and succumbed to a disinformation campaign, fooled into thinking there’s a conspiracy when in fact there is none.

Also, as dupes of a disinformation campaign, they now presumably embrace the conspiracy and are therefore conduits of the disinformation. Thus, like a contagion, they spread the disinformation that was previously spread to them. In this way, conspiracy theorists are both dupes of disinformation and disseminators of it. Depending on the laws of the land, they may therefore be liable to civil and criminal penalties—and of course they will face censorship and vilification. In consequence, the US’s robust tradition of free speech now faces considerable pressure.

What then are we supposed to think when someone like Harvard’s Steven Pinker writes:

False beliefs about vaccines, public health measures, and climate change threaten the well-being of billions. Conspiracy theories incite terrorism, pogroms, wars, and genocide. A corrosion of standards of truth undermines democracy and clears the ground for tyranny. But for all the vulnerabilities of human reason, our picture of the future need not be a bot tweeting fake news forever. The arc of knowledge is a long one, and it bends toward rationality. (Rationality, 2021)

Pinker is known to be a free-speech fundamentalist, allowing only the usual exceptions (such as falsely shouting “Fire!” in a crowded theater). And yet he claims to know the truth about a host of issues, including “vaccines, public health measures, and climate change.” Moreover, he claims that false beliefs about these issues, when given expression, threaten the well-being of billions. Many smart and well-informed people differ with him on these issues (such as Robert Malone on health and William Happer on climate). Pinker, to his credit, would allow such dissenters at the table for debate.

But what happens when dissenters like Malone and Happer are no longer considered merely to be dissenters but rather are demonized as conspiracy theorists inciting massive human suffering because they themselves have been disinformed and are in turn disinforming others? A generous free-speech perspective, such as the one taken by Pinker, seems hard to square with conspiracies that are the product of disinformation and that, by being promulgated, are likely to cause widespread harm.

A literature is now emerging in which thinkers and public intellectuals write about how to understand and deal with conspiracy theorists. Skeptic Michael Shermer’s Conspiracy: Why the Rational Believe the Irrational (2022) and behavioral economist Dan Ariely’s Misbelief: What Makes Rational People Believe Irrational Things (2023) are contributions to this literature. Even the titles are very close.

Shermer calls out conspiracy directly in his title. Ariely softens the language of his title by referring to misbelief rather than conspiracy. But the word “conspiracy” occurs over 130 times in Ariely’s book. Their arguments for why people become conspiracy theorists are based in psychology. Their method is to debunk by putting conspiracy theorists on the couch, analyzing the cognitive biases and irrational emotional factors that lead them to embrace conspiracies.

What I find particularly annoying about these books, and I include here as well Pinker’s book quoted earlier titled Rationality, is that the authors present themselves as having a God’s-eye view. They portray themselves as infallible oracles giving the right answer on controversial matters where conspiracy theorists give the wrong answer. They know they have the right answer. They know the conspiracy theorists have the wrong answer. Nowhere do they even gesture at their own fallibility.

Thus on such controversial questions as vaccines, public health, climate change, and many more, they give the conventional, politically correct answers. This means that they regard those who don’t provide the same answers as cognitively defective. Shermer and Ariely try to take a tolerant view toward such individuals. But it’s easy to imagine government bureaucrats reading this literature, tiring of the opposition to their governing efforts from so-called conspiracy theorists, and ultimately giving up on rational discourse in favor of enforced compliance.

This, then, is where the conspiratorialization of disinformation is headed: the formation of a new underclass defined as disinformed people who give in to conspiracy theories, which then lead them to disinform others. In the definition of this underclass, it doesn’t matter if its members are in fact disinformed or if the conspiracies they believe are in fact real. It’s enough if the conspiracies run afoul of the ruling class. Nina Jankowicz, whose job it would have been to deal with this underclass, was our first Counter-Disinformation Czar. She may not have been the last.

The following points describe some of the ways that technology can empower disinformation in our digital age:

Social Media Amplification: Social media platforms like Facebook, Twitter, and YouTube have become central hubs for spreading disinformation. Their algorithms prioritize engagement, which can lead to the amplification of misleading or sensationalist content. This can result in disinformation spreading rapidly and widely, reaching global audiences and creating confusion and discord. An echo chamber effect with these platforms further reinforces biases and isolates users from diverse viewpoints.
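The amplification dynamic described above can be made concrete with a toy sketch. The weights and post data below are hypothetical, not any platform’s actual algorithm; the point is only that a feed ranked purely by predicted engagement will surface sensational content over sober corrections, with accuracy playing no role in the score.

```python
# Hypothetical engagement-ranked feed: accuracy never enters the score.
posts = [
    {"title": "Careful correction of a false claim", "likes": 40, "shares": 5, "comments": 10},
    {"title": "Sensational outrage bait", "likes": 900, "shares": 400, "comments": 700},
]

def engagement_score(post, w_likes=1.0, w_shares=3.0, w_comments=2.0):
    # Shares and comments are weighted higher (illustrative assumption):
    # they push content into new feeds, driving further engagement.
    return (w_likes * post["likes"]
            + w_shares * post["shares"]
            + w_comments * post["comments"])

feed = sorted(posts, key=engagement_score, reverse=True)
# The outrage-bait post ranks first regardless of its truth value.
```

Under any plausible positive weighting, the sensational post dominates the feed, which is the echo-chamber mechanism the paragraph describes.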

Bots and Automated Accounts: The use of bots and automated accounts on social media can create the illusion of widespread consensus or outrage. These bots can engage in “astroturfing,” simulating grassroots support for particular ideas, thereby manipulating public opinion. They can overwhelm users with high-volume content production, making genuine information difficult to discern. Advanced bots can mimic human behavior, making it increasingly difficult to distinguish bots from real humans.

Deepfakes and AI-Generated Content: Advances in AI technology have enabled the creation of deepfakes—highly realistic fake videos or audio recordings. These can be used to discredit individuals or institutions by fabricating evidence of actions or statements that never occurred. Deepfakes are particularly concerning because they are difficult to detect and can significantly impact public perception. AI can also generate convincing text-based disinformation at scale, flooding platforms with seemingly real fabricated content.

Micro-Targeting through Personal Data Exploitation: Technologies like smartphone tracking and listening devices (e.g., Amazon Alexa) collect vast amounts of personal data, which can be exploited for targeted disinformation. By analyzing search history, location data, or overheard conversations, disinformation agents can craft highly personalized false narratives that exploit specific fears or biases.

Hack-and-Leak Operations: Hack-and-leak operations steal and strategically release private information to damage reputations or influence public opinion, and they can sway political outcomes. The verified authenticity of leaked information can lend credibility to associated disinformation campaigns.

Search Engine Manipulation: As explained by Robert Epstein, search engines, by manipulating search engine results to control information visibility, can subtly influence public opinion without people’s awareness. Manipulation extends to omitting, re-ranking, or including irrelevant and misleading results, obscuring important information. Even autofill can manipulate search by steering users toward certain topics, thereby influencing the direction of their search.

Information Overload: Overwhelming targets with excessive information impairs critical thinking and decision-making. By flooding individuals with an abundance of data, disinformation agents can make it difficult for people to review and evaluate information critically, leading to confusion and indecision. In an information age where digital information can overwhelm people and where finding relevant information becomes increasingly difficult, information overload is an increasingly effective tool in disinformation campaigns.

Encrypted Messaging Platforms: End-to-end encrypted messaging apps, such as WhatsApp and Telegram, offer disinformation campaigns a secure and private channel to spread false narratives without the scrutiny of content moderation or fact-checking. These platforms create closed environments where disinformation can be shared and amplified within like-minded groups, making it difficult for outside observers to monitor or counter false claims. The use of encryption not only protects the identity of those spreading disinformation but also allows these campaigns to operate covertly, bypassing the broader public discourse while still effectively influencing specific target audiences.

Cross-Platform Coordination: Disinformation campaigns can exploit the interconnectedness of modern digital platforms by coordinating the spread of false information across multiple channels simultaneously. By seeding the same misleading content on social media, news websites, forums, and messaging apps, this multi-platform approach not only amplifies the reach of disinformation but also gives it credibility, as users encounter the same narrative in different contexts (“converging lines of evidence”), making it appear as widely accepted or corroborated by various independent sources.

Cryptocurrency and Money Laundering: Cryptocurrency offers disinformation campaigns a potent tool for funding operations and obscuring financial trails. Cryptocurrencies enable anonymous transactions, which can be impossible to trace back to their source. Cryptocurrencies allow disinformation agents to move funds across borders with minimal traceability, effectively laundering money without the oversight facing traditional financial systems. This financial flexibility enables the covert funding of large-scale disinformation efforts, including the hiring of bot armies, the creation of fake accounts, and the production of sophisticated disinformation content. The use of cryptocurrency in disinformation campaigns undermines efforts to track and disrupt these activities.

The Soviets taught us that disinformation is always about discredit—lowering our opinion of things where previously it was higher. So the first question we should ask when we suspect disinformation is this: What have we previously regarded as normal or acceptable that we are now required to regard as abnormal or unacceptable? I use the word “required” rather than “asked” because disinformation comes at us with an urgency and moral imperative—disinformation obliges right-thinking, morally just people to lower their opinion of the thing discredited. That in itself provides a clue as to whether we are dealing with disinformation or merely a justified reassessment of value.

Over the last 25 years, we’ve seen many novel orthodoxies touted and many traditional views belittled. The list is long and uncontroversial even if the issues under question remain controversial. ChatGPT-4o readily delivers such lists. Here’s a prompt I gave it: “Review 20 hot-button cultural issues that in the last 25 years have seen a considerable shift in public perceptions away from traditional positions and where those who now dissent from the current orthodoxy are regarded as ‘conspiracy theorists’ or people otherwise out of touch with the current best information and practices. For each issue you list, include a single sentence explanation.” ChatGPT-4o responded as follows:

Here are 20 hot-button cultural issues from the last 25 years where public perception has shifted significantly, often resulting in those who dissent being labeled as out of touch or conspiracy theorists:

1. Climate Change: The overwhelming scientific consensus now recognizes human-driven climate change, and skepticism or denial is often labeled as out-of-touch or anti-science.

2. Same-Sex Marriage: Public support for same-sex marriage has risen sharply, with opposition increasingly viewed as outdated or discriminatory.

3. Transgender Rights: There has been a growing acceptance of transgender identities, and questioning or opposing this acceptance can lead to being labeled as transphobic.

4. Vaccine Mandates: Public health policies promoting vaccine mandates have become widely accepted, and opposition is often seen as anti-science or dangerous.

5. Critical Race Theory: The teaching of systemic racism and white privilege in schools has gained support, with critics often dismissed as racist or uninformed.

6. Gun Control: Advocacy for stricter gun control laws has gained momentum, and those opposed are sometimes seen as indifferent to public safety.

7. Immigration Policy: There has been a shift towards more inclusive immigration policies, with opposition increasingly viewed as xenophobic or nativist.

8. Cancel Culture: The rise of social accountability through “canceling” has been widely accepted, with detractors often labeled as defending harmful behavior.

9. #MeToo Movement: The movement against sexual harassment and assault has gained broad support, and skeptics or opponents are often seen as dismissive of victims.

10. Defunding the Police: Calls to defund or reform police departments have grown, with opponents often labeled as resistant to necessary change.

11. Gender-Neutral Language: The use of gender-neutral pronouns and language has become more common, and those who resist are often seen as insensitive or transphobic.

12. Legalization of Marijuana: Public support for the legalization of marijuana has increased, with opposition often viewed as outdated or uninformed.

13. Renewable Energy: The shift towards renewable energy sources is broadly supported, with dissenters often labeled as anti-environmental or backward-looking.

14. Intersectionality: The concept of intersectionality has become mainstream in discussions of social justice, and those who question it may be seen as ignorant of modern social issues.

15. Sex Work Decriminalization: There is growing support for the decriminalization of sex work, with opponents often viewed as moralistic or out of touch with contemporary views on personal autonomy.

16. Euthanasia and Assisted Suicide: Support for euthanasia and assisted suicide has increased, and opposition is often seen as lacking compassion or being overly religious.

17. Body Positivity Movement: The body positivity movement has gained traction, and critics are often viewed as perpetuating harmful beauty standards.

18. Animal Rights: There has been a significant shift towards recognizing animal rights, with opposition increasingly viewed as unethical or cruel.

19. Legalization of Same-Sex Adoption: Public support for same-sex couples adopting children has risen, and opponents are often seen as holding outdated views on family structures.

20. Feminist Theory: Modern feminist theory, including concepts like toxic masculinity, has gained acceptance, with critics often labeled as sexist or misogynistic.

These issues reflect significant cultural shifts, where traditional perspectives have been largely replaced by new norms, and those who dissent from these norms often find themselves marginalized or labeled as out of touch.

Has each of the issues on this list been the object of a disinformation campaign? Probably not. In advancing the credibility of the new orthodoxies represented here, propaganda has no doubt played a key role. Unlike disinformation, propaganda operates in the open. It can accredit as well as discredit. Still, to discredit effectively when the discredit is undeserved, disinformation is the weapon of choice. Even if propaganda promoted these new orthodoxies, it’s not unreasonable to think that disinformation was involved in discrediting at least some of the views here that are at variance with the new orthodoxies.

What, then, is the answer to disinformation? It is better information. Disinformation paints a picture that discredits a person, institution, or idea. Better information corrects that picture. Better information is not just more information. Mere quantity of information, especially in a digital age that thrives on information overload, means nothing. What matters is the quality of the information, the relevant facts it identifies, the broader truths it reveals, and the falsehoods it uncovers.

To say that the answer to disinformation is better information may sound bland, but it is the right answer. Disinformation campaigns are typically designed to resist correction, making it difficult for better information to refute disinformation. Even if better information can be discovered, getting the word out about it may be difficult, as when social media and search engines conspire to suppress the better information. Worse yet, governments are increasingly viewing freedom of thought and expression as obstacles to their agendas, leading them to ban or outlaw better information that challenges their preferred narratives.

Simply countering disinformation with better information is therefore not enough. To counter disinformation effectively requires not just better information but also a dogged attitude about how to deploy it. Such an attitude includes:

  • an unwillingness to back down in the face of intimidation,

  • a refusal to be bribed or otherwise sell out,

  • a readiness to step outside echo chambers and avoid groupthink,

  • a willingness to be disliked for jumping off the bandwagon,

  • a skepticism toward governments, corporations, NGOs, media, and academics with conflicts of interest,

  • a commitment to think for yourself and not have opinions foisted on you,

  • a mindset that follows evidence wherever it leads,

  • a rejection of barriers on where you can go and what you can think,

  • and above all, an insistence on freedom of thought and expression as a fundamental right and not as a privilege that can be lost or revoked.

If you want an example of how to effectively answer disinformation, look at the work of Jeffrey Sachs, Rand Paul, and Robert F. Kennedy Jr. on the origin of SARS-CoV-2. If you read the Wikipedia entry on the origin of this virus, if you query the major search engines (like Google and Bing), or if you chat with the popular chatbots (like ChatGPT, Claude, and Perplexity), they will all tell the same story about the origin of the virus, attributing it to a zoonotic origin from animals like bats or pangolins, which then unhappily but naturally jumped to humans.

The argument that the virus jumped naturally from animals to humans is based on the virus’s genetic similarities to coronaviruses found in animals, and also on the history of zoonotic spillovers involving other coronaviruses such as SARS and MERS (in Bayesian terms, there’s said to be a high prior probability of natural origin). To leave no doubt as to where the current orthodoxy lies, Wikipedia states in its entry on the origin of the virus: “While other explanations, such as speculations that SARS-CoV-2 was accidentally released from a laboratory have been proposed, such explanations are not supported by evidence. Conspiracy theories [note the obligatory hat tip] about the virus’s origin have also proliferated.”
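The Bayesian framing mentioned in passing can be spelled out. The numbers below are purely illustrative assumptions, not estimates from the source: even granting a high prior for natural origin, evidence judged more likely under the lab-leak hypothesis pulls the posterior substantially toward it.

```python
# Bayes' rule: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]
def posterior(prior, likelihood_h, likelihood_not_h):
    num = likelihood_h * prior
    return num / (num + likelihood_not_h * (1 - prior))

# Illustrative assumptions: a 0.9 prior for natural origin (history of
# zoonotic spillovers), but evidence (e.g., an outbreak near a lab doing
# gain-of-function work) judged 5x more likely under the lab-leak hypothesis.
p_natural = posterior(0.9, 1.0, 5.0)
# Posterior for natural origin falls from 0.90 to about 0.64.
```

The lesson of the toy calculation is that a high prior settles nothing; what matters is how strongly the specific evidence discriminates between the hypotheses.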

In fact, not only is the evidence for the lab-leak hypothesis convincing, but the disinformation campaign behind the wildlife-to-human transmission hypothesis stands exposed. Email correspondence from NIH head Francis Collins, NIAID head Anthony Fauci, and their colleagues just prior to February 2020 reveals a worried group of scientists who funded gain-of-function research at the Wuhan Institute of Virology (through Peter Daszak’s EcoHealth Alliance) and who thought it very likely that the virus was in fact the result of a lab leak. And then, beginning that February, they closed ranks and published articles officially declaring the virus to have a natural origin, which remains the dominant narrative to this day. As with the Soviet disinformation campaigns that were later uncovered, this one stands uncovered.

A full discussion of the lab-leak hypothesis and the evidence for it can be found in Rand Paul’s Deception: The Great COVID Cover-Up (2023) and in RFK Jr’s The Wuhan Cover-Up: And the Terrifying Bioweapons Arms Race (2023). Readers who don’t have the time to read these books and check out their citations can go to shorter online sources that summarize the case for the lab-leak hypothesis—provided the search engines allow these sources to be found! Using not Google but Freespoke, I found an article dated this month (September 3, 2024) from the Brownstone Institute. Written by Pat Fidopiastis, a professor of microbiology at California Polytechnic State University, the article is titled “The War against the Lab Leak Hypothesis.” It’s a good place to start if all you want is quick talking points on why the lab-leak hypothesis is reasonable.

I close with a quote from Scripture that gives me personally great confidence that disinformation is never the last word and that better information will in the end always destroy its machinations by exposing it to the light of truth: “For there is nothing hidden that will not be disclosed, and nothing concealed that will not be known or brought out into the open.” —Luke 8:17 (NIV)
