
The Golden Age Of Disinformation Has Only Just Begun

by Tyler Durden

Authored by Boyan Radoykov via The Epoch Times,

Disinformation is all about power: for all the harmful and far-reaching influence it exerts, disinformation cannot achieve much without power.

As a tool for shaping public perceptions, disinformation can be used by authoritarian regimes and democracies alike. The dissemination of false information is not a new practice in human history. However, over the last few decades, it has become professionalized and has taken on exorbitant proportions at both national and international levels.

The Origins of Disinformation

Disinformation can be understood as false or misleading information, intentionally produced and deliberately disseminated in order to deceive public opinion, harm a target group, or advance political or ideological objectives.

The term disinformation is a translation of the Russian дезинформация (dezinformatsiya). On Jan. 11, 1923, the Politburo of the Communist Party of the Soviet Union decided to create a Department of Disinformation. Its mission was “to mislead real or potential adversaries about the true intentions” of the USSR. From then on, disinformation became a tactic of Soviet political warfare known as “active measures,” a crucial element of Soviet intelligence strategy involving falsification, subversion, and media manipulation.

During the Cold War, from 1945 to 1989, this tactic was used by numerous intelligence agencies. The expression “disinformation of the masses” came into increasing use in the 1960s and became widespread in the 1980s. Former Soviet bloc intelligence officer Ladislav Bittman, the first disinformation professional to defect to the West, observed in this regard: “The interpretation [of the term] is slightly distorted because public opinion is only one of the potential targets. Many disinformation games are designed only to manipulate the decision-making elite, and receive no publicity.”

With its creation in July 1947, the CIA was given two main missions: to prevent surprise foreign attacks against the United States and to hinder the advance of Soviet communism in Europe and Third World countries. During the four decades of the Cold War, the CIA was also at the forefront of U.S. counter-propaganda and disinformation.

The Soviet Union’s successful test of a nuclear weapon in 1949 caught the United States off guard and set the two nuclear powers on a collision course on the world stage, in an international atmosphere of extreme tension, fear, and uncertainty. In 1954, President Dwight Eisenhower received a top-secret report from a commission chaired by retired Gen. James H. Doolittle, which concluded: “If the United States is to survive, long-standing American concepts of ‘fair play’ must be reconsidered. We must develop effective espionage and counterespionage services and must learn to subvert, sabotage and destroy our enemies by more clever, more sophisticated and more effective methods than those used against us. It may become necessary that the American people be acquainted with, understand and support this fundamentally repugnant philosophy.” Of course, this “repugnant” philosophy includes subversion through disinformation.

Although the United States had considerable expertise in this field, it did little to counter the disinformation directed its way until 1980, when a forged document claimed that Washington supported apartheid in South Africa. Later, it also took offense at Operation Denver, a Soviet disinformation campaign aimed at convincing the world that the United States had intentionally created HIV/AIDS.

In the United States, the intellectual influence of Edward Bernays is at the root of institutional political propaganda and opinion manipulation. A double nephew of Sigmund Freud, he worked as a press agent for Italian tenor Enrico Caruso and for the Ballets Russes. He took part, alongside President Woodrow Wilson, in the Creel Commission (1917), which helped turn American public opinion in favor of going to war. His wife and business partner, Doris Fleischman, advised him to avoid using the overused term “propaganda.” Instead, she coined the term “public relations” to replace it, a term still in use today.

China and Its Digital Authoritarianism

In China, deception, lies, and the rewriting of history are disinformation techniques used by the Chinese Communist Party, according to tactics learned in the Soviet Union in the 1950s. Today, the CCP has a sophisticated arsenal of disinformation on all fronts. Its main objectives are to overturn public opinion, interfere in foreign political circles, influence elections, discredit its opponents, and hide its own intentions and priorities.

In September 2021, the French Institute for Strategic Research at the École Militaire published a report on China’s influence operations, which warned: “For a long time, it could be said that China, unlike Russia, sought to be loved rather than feared; that it wanted to seduce, to project a positive image of itself in the world, to arouse admiration. Beijing has not given up on seduction ... but, at the same time, Beijing is increasingly taking on the role of infiltrator and coercer: its influence operations have become considerably tougher in recent years, and its methods increasingly resemble those employed by Moscow.”

On Sept. 28, 2023, the U.S. government published a report in which it accused China of seeking to “reshape the global information landscape” through a vast network specialized in disinformation. “[China’s] global information manipulation is not simply a matter of public diplomacy—but a challenge to the integrity of the global information space.” This “manipulation” encompasses “propaganda, disinformation, and censorship.”

“Unchecked, [China’s] efforts will reshape the global information landscape, creating biases and gaps that could even lead nations to make decisions that subordinate their economic and security interests to Beijing’s,” according to the report.

According to the U.S. State Department, China spends billions of dollars every year on these “foreign information manipulation” operations. At the same time, Beijing suppresses critical information that runs counter to its rhetoric on politically sensitive subjects. The report goes on to state that China manipulates information by resorting to “digital authoritarianism,” exploiting international and UN organizations and controlling Chinese-language media abroad.

When Disinformation Becomes Military Doctrine

In some countries, policymakers may turn to their national history to justify the implementation of certain regulations on information. German politicians, for example, frequently refer to the Nazi past or to that of the communist Stasi to justify the regulations they want to put in place. Yet these historical comparisons don’t always hold water. The Nazis, for example, did not come to power because they controlled the then-new technology of radio. Rather, once in power, they turned to their own benefit the state control over radio stations that the previous Weimar governments had put in place in the hope of saving democracy. That decision by the Weimar governments had the perverse effect of enabling the Nazis to take control of radio much more quickly than they could the press.

Disinformation is mainly orchestrated by government agencies. In the post-Soviet era, and with the advent of the information society, when the media and social networks became a central relay for the dissemination of fake news, disinformation evolved to become a fundamental tactic in the military doctrine of powerful countries. In the early 2000s, the European Union and NATO realized that the problem of Russian disinformation was such that they had to set up special units to process and debunk mass-produced false information.

The Methods and Processes of Disinformation

There are four main methods of spreading disinformation: selective censorship, manipulation of search indexes, hacking and dissemination of fraudulently obtained data, and amplification of disinformation through excessive sharing.

By way of example, disinformation activities involve the following processes:

  • The creation of fabricated personas or websites, with networks of fake experts who disseminate supposedly reliable references.

  • The creation of “deepfakes” and synthetic media through photos, videos, and audio clips that have been digitally manipulated or entirely fabricated to deceive the public. Today’s artificial intelligence (AI) tools can make synthetic content almost impossible to detect or distinguish from reality.

  • The development or amplification of conspiracy theories, which attempt to explain important events through the secret actions of powerful actors acting in the shadows. Conspiracy theories aim not only to influence people’s understanding of events, but also their behavior and worldview.

  • Astroturfing and inundation of information environments. At the root of disinformation campaigns are huge quantities of similar content, published from fabricated sources or accounts. This practice, called astroturfing, creates the impression of widespread support or opposition to a message while concealing its true origin. A similar tactic, inundation, involves spamming social media posts and comment sections with the aim of shaping a narrative or stifling opposing viewpoints. In recent years, the use of troll factories to spread misleading information on social networks has gained momentum.

  • Exploiting alternative social media platforms to reinforce beliefs in a disinformation narrative. Disinformation actors take advantage of platforms offering fewer protections for users and fewer options for detecting and removing inauthentic content and accounts.

  • Exploitation of information gaps, which arise when there isn’t enough credible information to answer a specific search. Disinformation actors can exploit these gaps by generating their own content and feeding it into the search results.

  • Manipulating unsuspecting protagonists. Disinformation facilitators target high-profile individuals and organizations to corroborate their stories. Targets are often not even aware that they are repeating a disinformation actor’s narrative, or that this narrative is intended to influence or manipulate public opinion.

  • Dissemination of targeted content: The instigators of disinformation produce customized, persuasive content likely to resonate with a specific audience, based on its worldview, beliefs, and interests. It’s a long-term tactic that involves disseminating targeted content over time to build trust and credibility with the target audience, making it easier to manipulate them.

A Race Against Time to Protect the Younger Generation

In the early 2000s, most publications about the internet hailed its unprecedented potential for development. Only a few years later, commentators, analysts, and policymakers began to worry that the internet, and social media platforms in particular, posed new threats to democracy, global governance, and the integrity of information.

Since then, the world has become increasingly interconnected and interdependent, and the opportunities for misinformation have become almost limitless. With more than 5.5 billion internet users and more than 8.58 billion mobile subscriptions worldwide by 2022, against a mid-year global population of 7.95 billion, the great paradox is that the rise of information technology has created a far more conducive, even thriving, environment for misinformation, and the development of AI is making it worse and more rampant still.

Some experts acknowledge that while online misinformation and propaganda are widespread, it is difficult to determine how much impact they have on the public’s political attitudes and, consequently, on political outcomes. Other data have shown that disinformation campaigns rarely succeed in changing the policies of targeted states, but it would be irresponsible to believe that misinformation has little impact. If that were the case, major countries would have abandoned the practice long ago. The opposite is true. With the gradual increase in the foolishness of ruling elites and the rise of new technologies, the policy of destabilization through disinformation has a bright future ahead of it. The risks and stakes remain enormous, and the erosion of public trust in institutions and the media is deeply significant in this regard.

The fight against disinformation must go beyond simplistic solutions such as shutting down Facebook or X (formerly Twitter) accounts, publicly denouncing the actions of one’s adversary, or containing false information through technical means. Nor is it enough to focus on measures such as fact-checking or media education to help individuals evaluate and consume information critically; the average person carries little weight in the face of government disinformation machines.

It would therefore be preferable to address the political and economic operating conditions of the structures that facilitate the spread of disinformation, such as large technology companies, the state actors involved, the media, and other information systems.

Of course, the human factor must remain at the center of leaders’ concerns in the face of growing state and media disinformation. The price of educating young people will always be less than the price of their ignorance.
