Propaganda
from Wikipedia

James Montgomery Flagg’s famous “Uncle Sam” propaganda poster, made during World War I

Propaganda is communication primarily used to influence or persuade an audience in order to further an agenda; it may not be objective, may present facts selectively to encourage a particular synthesis or perception, or may use loaded language to produce an emotional rather than a rational response to the information presented.[1] Propaganda can be found in a wide variety of contexts.[2]

Beginning in the twentieth century, the English term propaganda became associated with a manipulative approach, but historically, propaganda had been a neutral descriptive term of any material that promotes certain opinions or ideologies.[1][3]

A wide range of materials and media are used to convey propaganda messages, and these have changed as new technologies were invented, including paintings, cartoons, posters,[4] pamphlets, films, radio shows, TV shows, and websites. More recently, the digital age has given rise to new ways of disseminating propaganda: in computational propaganda, for example, bots and algorithms are used to manipulate public opinion, whether by creating fake or biased news and spreading it on social media or by using chatbots to mimic real people in social-network discussions.

Etymology


Propaganda is a modern Latin word, the neuter plural gerundive form of propagare, meaning 'to spread' or 'to propagate', thus propaganda means the things which are to be propagated.[5] Originally this word derived from a new administrative body (congregation) of the Catholic Church created in 1622 as part of the Counter-Reformation, called the Congregatio de Propaganda Fide (Congregation for Propagating the Faith), or informally simply Propaganda.[3][6] Its activity was aimed at "propagating" the Catholic faith in non-Catholic countries.[3]

From the 1790s, the term began being used also to refer to propaganda in secular activities.[3] In English, the cognate began taking a pejorative or negative connotation in the mid-19th century, when it was used in the political sphere.[3]

Non-English cognates of propaganda, as well as some similar non-English terms, retain neutral or positive connotations. For example, in official Chinese party discourse, xuanchuan is treated as a neutral or even positive term, though it can be used pejoratively in protest or other informal settings within China.[7][8]: 4–6

Definitions

Nazi propaganda poster of the 27th SS Volunteer Division Langemarck with the antisemitic title: "Together we will crush him!"

Historian Arthur Aspinall observed that when newspapers began to play an important part in political life in the late 1700s, they were not expected to be independent organs of information but were assumed to promote the views of their owners or government sponsors.[9] In the 20th century, the term propaganda emerged along with the rise of mass media, including newspapers and radio. As researchers began studying the effects of media, they used suggestion theory to explain how people could be influenced by emotionally resonant persuasive messages. Harold Lasswell provided a broad definition of propaganda: "the expression of opinions or actions carried out deliberately by individuals or groups with a view to influencing the opinions or actions of other individuals or groups for predetermined ends and through psychological manipulations."[10] Garth Jowett and Victoria O'Donnell theorize that propaganda and persuasion are linked, as humans use communication as a form of soft power through the development and cultivation of propaganda materials.[11]

In a 1929 literary debate with Edward Bernays, Everett Dean Martin argues that, "Propaganda is making puppets of us. We are moved by hidden strings which the propagandist manipulates."[12] In the 1920s and 1930s, propaganda was sometimes described as all-powerful. For example, Bernays acknowledged in his book Propaganda that "The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country. We are governed, our minds are molded, our tastes formed, our ideas suggested, largely by men we have never heard of."[13]

NATO's 2011 guidance for military public affairs defines propaganda as "information, ideas, doctrines, or special appeals disseminated to influence the opinion, emotions, attitudes, or behaviour of any specified group in order to benefit the sponsor, either directly or indirectly".[14] More recently the RAND Corporation coined the term Firehose of Falsehood to describe how modern communication capabilities enable a large number of messages to be broadcast rapidly, repetitively, and continuously over multiple channels (like news and social media) without regard for truth or consistency.

History


Primitive forms of propaganda have been a human activity for as long as reliable recorded evidence exists. The Behistun Inscription (c. 515 BCE), detailing the rise of Darius I to the Persian throne, is viewed by most historians as an early example of propaganda.[15] Another striking example from ancient history comes from the last Roman civil wars (44–30 BCE), during which Octavian and Mark Antony accused each other of obscure and degrading origins, cruelty, cowardice, oratorical and literary incompetence, debauchery, luxury, drunkenness, and other slanders.[16] This defamation took the form of vituperatio (the Roman rhetorical genre of invective), which was decisive in shaping Roman public opinion at the time. Another early example of propaganda comes from Genghis Khan, who would send some of his men ahead of his army to spread rumors among the enemy. In many cases, his army was actually smaller than his opponents'.[17]

Holy Roman Emperor Maximilian I was the first ruler to harness the power of the printing press for propaganda: to build his image, to stir up patriotic feeling among the population of his empire (he was also the first ruler to use one-sided battle reports, early predecessors of modern newspapers or neue zeitungen, aimed at a mass audience[18][19]), and to influence the populations of his enemies.[20][21][22] Propaganda during the Reformation, helped by the spread of the printing press throughout Europe, and in particular within Germany, made new ideas, thoughts, and doctrines available to the public in ways never seen before the 16th century. During the American Revolution, the American colonies had a flourishing network of newspapers and printers who specialized in the topic on behalf of the Patriots (and, to a lesser extent, the Loyalists).[23] Academic Barbara Diggs-Brown argues that the negative connotations of the term "propaganda" stem from the social and political transformations of the French Revolutionary period (1789–1799) and took hold between the start and middle of the 19th century, when the word began to be used in a nonclerical, political context.[24]

A 1918 Finnish propaganda leaflet signed by General Mannerheim circulated by the Whites urging the Reds to surrender during the Finnish Civil War. [To the residents and troops of Tampere! Resistance is futile. Raise the white flag and surrender. The blood of the citizen has been shed enough. We will not kill our prisoners as the Reds do. Send your representative with a white flag.]

The first large-scale and organised propagation of government propaganda was occasioned by the outbreak of the First World War in 1914. After the defeat of Germany, military officials such as General Erich Ludendorff suggested that British propaganda had been instrumental in their defeat. Adolf Hitler came to echo this view, believing that it had been a primary cause of the collapse of morale and the revolts on the German home front and in the Navy in 1918 (see also: Dolchstoßlegende). In Mein Kampf (1925), Hitler expounded his theory of propaganda, which provided a powerful base for his rise to power in 1933. Historian Robert Ensor explains that "Hitler...puts no limit on what can be done by propaganda; people will believe anything, provided they are told it often enough and emphatically enough, and that contradicters are either silenced or smothered in calumny."[25] This proved true in Germany, where the army made it difficult for outside propaganda to flow in.[26] Most propaganda in Nazi Germany was produced by the Ministry of Public Enlightenment and Propaganda under Joseph Goebbels, who saw propaganda as a means of reaching the masses, employing symbols such as justice, liberty, and devotion to one's country.[27] World War II saw the continued use of propaganda as a weapon of war, building on the experience of WWI, by Goebbels and the British Political Warfare Executive, as well as the United States Office of War Information.[28]

In the early 20th century, the invention of motion pictures (films and diafilms) gave propaganda creators a powerful tool for advancing political and military interests, reaching a broad segment of the population and creating consent for, or encouraging rejection of, a real or imagined enemy. In the years following the October Revolution of 1917, the Soviet government sponsored the Russian film industry with the purpose of making propaganda films (e.g., the 1925 film The Battleship Potemkin glorifies Communist ideals). In WWII, Nazi filmmakers produced highly emotional films to create popular support for occupying the Sudetenland and attacking Poland. The 1930s and 1940s, which saw the rise of totalitarian states and the Second World War, are arguably the "Golden Age of Propaganda". Leni Riefenstahl, a filmmaker working in Nazi Germany, created one of the best-known propaganda movies, Triumph of the Will. In 1942, the propaganda song Niet Molotoff was made in Finland during the Continuation War, making fun of the Red Army's failure in the Winter War; the song's title refers to the Soviet Minister of Foreign Affairs, Vyacheslav Molotov.[29] In the US, animation became popular, especially for winning over youthful audiences and aiding the U.S. war effort, e.g., Der Fuehrer's Face (1942), which ridicules Hitler and advocates the value of freedom. Some American war films in the early 1940s were designed to create a patriotic mindset and convince viewers that sacrifices needed to be made to defeat the Axis powers.[30] Others were intended to help Americans understand their Allies in general, as in films like Know Your Ally: Britain and Our Greek Allies.
Apart from its war films, Hollywood did its part to boost American morale in a film intended to show how stars of stage and screen who remained on the home front were doing their part not just in their labors, but also in their understanding that a variety of peoples worked together against the Axis menace: Stage Door Canteen (1943) features one segment meant to dispel Americans' mistrust of the Soviets, and another to dispel their bigotry against the Chinese. Polish filmmakers in Great Britain created the anti-Nazi color film Calling Mr. Smith[31][32] (1943) about Nazi crimes in German-occupied Europe and about lies of Nazi propaganda.[33]

The John Steinbeck novel The Moon Is Down (1942), about the Socrates-inspired spirit of resistance in an occupied village in Northern Europe, was presumed to be about Norway's response to the German occupiers. In 1945, Steinbeck received the King Haakon VII Freedom Cross for his literary contributions to the Norwegian resistance movement.[34]

The West and the Soviet Union both used propaganda extensively during the Cold War. Both sides used film, television, and radio programming to influence their own citizens, each other, and Third World nations. Through a front organization called the Bedford Publishing Company, the CIA, via a covert department called the Office of Policy Coordination, disseminated over one million books to Soviet readers over the span of 15 years, including novels by George Orwell, Albert Camus, Vladimir Nabokov, James Joyce, and Boris Pasternak, in an attempt to promote anti-communist sentiment and sympathy for Western values.[35] George Orwell's contemporaneous novels Animal Farm and Nineteen Eighty-Four portray the use of propaganda in fictional dystopian societies. During the Cuban Revolution, Fidel Castro stressed the importance of propaganda.[36][better source needed] Propaganda was used extensively by Communist forces in the Vietnam War as a means of controlling people's opinions.[37]

During the Yugoslav wars, propaganda was used as a military strategy by the governments of the Federal Republic of Yugoslavia and Croatia. Propaganda was used to create fear and hatred, and particularly to incite the Serb population against the other ethnicities (Bosniaks, Croats, Albanians, and other non-Serbs). Serb media made great efforts to justify, revise, or deny mass war crimes committed by Serb forces during these wars.[38]

Public perceptions


In the early 20th century, the term propaganda was used by the founders of the nascent public relations industry to describe their work. Literally translated from the Latin gerundive as "things that must be disseminated", the term is neutral or even positive in some cultures, while in others it has acquired a strongly negative connotation. The connotations of the term "propaganda" can also vary over time. For example, in Portuguese- and some Spanish-speaking countries, particularly in the Southern Cone, the word "propaganda" usually refers to advertising, the most common form of manipulative media in business.[39]

Poster of the 19th-century Scandinavist movement

In English, propaganda was originally a neutral term for the dissemination of information in favor of any given cause. During the 20th century, however, the term acquired a thoroughly negative meaning in Western countries, representing the intentional dissemination of often false, but certainly "compelling", claims to support or justify political actions or ideologies. According to Harold Lasswell, the term began to fall out of favor due to growing public suspicion of propaganda in the wake of its use during World War I by the Creel Committee in the United States and the Ministry of Information in Britain. Writing in 1928, Lasswell observed, "In democratic countries the official propaganda bureau was looked upon with genuine alarm, for fear that it might be suborned to party and personal ends. The outcry in the United States against Mr. Creel's famous Bureau of Public Information (or 'Inflammation') helped to din into the public mind the fact that propaganda existed. ... The public's discovery of propaganda has led to a great deal of lamentation over it. Propaganda has become an epithet of contempt and hate, and the propagandists have sought protective coloration in such names as 'public relations council,' 'specialist in public education,' 'public relations adviser.' "[40] In 1949, political science professor Dayton David McKean wrote, "After World War I the word came to be applied to 'what you don't like of the other fellow's publicity,' as Edward L. Bernays said...."[41]

Contestation


The term is essentially contested. Some have argued for a neutral definition,[42][43]: 9  holding that ethics depend on intent and context,[43] while others define propaganda as necessarily unethical and negative.[44] Emma Briant defines it as "the deliberate manipulation of representations (including text, pictures, video, speech etc.) with the intention of producing any effect in the audience (e.g. action or inaction; reinforcement or transformation of feelings, ideas, attitudes or behaviours) that is desired by the propagandist."[43]: 9  The same author explains the importance of consistent terminology across history, particularly as contemporary euphemistic synonyms such as "information support" and "strategic communication" are used in governments' continual efforts to rebrand their operations.[43]: 9  Other scholars also see benefits to acknowledging that propaganda can be interpreted as beneficial or harmful, depending on the message sender, target audience, message, and context.[2]

David Goodman argues that the 1936 League of Nations "Convention on the Use of Broadcasting in the Cause of Peace" tried to create the standards for a liberal international public sphere. The Convention encouraged empathetic and neighborly radio broadcasts to other nations. It called for League prohibitions on international broadcast containing hostile speech and false claims. It tried to define the line between liberal and illiberal policies in communications, and emphasized the dangers of nationalist chauvinism. With Nazi Germany and Soviet Russia active on the radio, its liberal goals were ignored, while free speech advocates warned that the code represented restraints on free speech.[45]

Types

Poster in a North Korean primary school targeting the United States military. The Korean text reads: "Are you playing the game of catching these guys?"

Identifying propaganda has always been a problem.[46] The main difficulties have involved differentiating propaganda from other types of persuasion, and avoiding a biased approach. Richard Alan Nelson provides a definition of the term: "Propaganda is neutrally defined as a systematic form of purposeful persuasion that attempts to influence the emotions, attitudes, opinions, and actions of specified target audiences for ideological, political or commercial purposes[47] through the controlled transmission of one-sided messages (which may or may not be factual) via mass and direct media channels."[48] The definition focuses on the communicative process involved – or more precisely, on the purpose of the process – and allows "propaganda" to be interpreted as positive or negative behavior depending on the perspective of the viewer or listener.

Propaganda can often be recognized by the rhetorical strategies used in its design. In the 1930s, the Institute for Propaganda Analysis identified a variety of propaganda techniques that were commonly used in newspapers and on the radio, which were the mass media of the time period. Propaganda techniques include "name calling" (using derogatory labels), "bandwagon" (expressing the social appeal of a message), or "glittering generalities" (using positive but imprecise language).[49] With the rise of the internet and social media, Renee Hobbs identified four characteristic design features of many forms of contemporary propaganda: (1) it activates strong emotions; (2) it simplifies information; (3) it appeals to the hopes, fears, and dreams of a targeted audience; and (4) it attacks opponents.[50]

Propaganda is sometimes evaluated based on the intention and goals of the individual or institution that created it. According to historian Zbyněk Zeman, propaganda is defined as either white, grey, or black. White propaganda openly discloses its source and intent. Grey propaganda has an ambiguous or non-disclosed source or intent. Black propaganda purports to be published by the enemy or some organization other than its actual origin[51] (compare with black operation, a type of clandestine operation in which the identity of the sponsoring government is hidden). These types of propaganda can also be distinguished by the potential of true and correct information to compete with them. For example, opposition to white propaganda is often readily found and may slightly discredit the propaganda source. Opposition to grey propaganda, when revealed (often by an inside source), may create some level of public outcry. Opposition to black propaganda is often unavailable and may be dangerous to reveal, because public cognizance of black propaganda tactics and sources would undermine or backfire against the very campaign the black propagandist supported.

The propagandist seeks to change the way people understand an issue or situation for the purpose of changing their actions and expectations in ways that are desirable to the interest group. Propaganda, in this sense, serves as a corollary to censorship, in which the same purpose is achieved not by filling people's minds with approved information, but by preventing people from being confronted with opposing points of view. What sets propaganda apart from other forms of advocacy is the willingness of the propagandist to change people's understanding through deception and confusion rather than persuasion and understanding. The leaders of an organization know the information to be one-sided or untrue, but this may not be true for the rank-and-file members who help to disseminate the propaganda.

Woodcuts (1545) known as the Papstspotbilder or Depictions of the Papacy in English,[52] by Lucas Cranach, commissioned by Martin Luther.[53] Title: Kissing the Pope's Feet.[54] German peasants respond to a papal bull of Pope Paul III. Caption reads: "Don't frighten us Pope, with your ban, and don't be such a furious man. Otherwise we shall turn around and show you our rears."[55][56]

Religious


Propaganda was often used to influence opinions and beliefs on religious issues, particularly during the split between the Roman Catholic Church and the Protestant churches or during the Crusades.[57]

The sociologist Jeffrey K. Hadden has argued that members of the anti-cult movement and Christian counter-cult movement accuse the leaders of what they consider cults of using propaganda extensively to recruit followers and keep them. Hadden argued that ex-members of cults and the anti-cult movement are committed to making these movements look bad.[58]

Propaganda against other religions in the same community or propaganda intended to keep political power in the hands of a religious elite can incite religious hate on a global or national scale. It could make use of many propaganda mediums. War, terrorism, riots, and other violent acts can result from it. It can also conceal injustices, inequities, exploitation, and atrocities, leading to ignorance-based indifference and alienation.[59]

Wartime

A famous example of propaganda, this poster made by Paul Revere portrays the Boston Massacre in a way that he hoped would make Americans angry and support the Revolutionary War.

In the Peloponnesian War, the Athenians exploited figures from the stories of Troy, as well as other mythical images, to incite feelings against Sparta. For example, Helen of Troy was even portrayed as an Athenian whose mother, Nemesis, would avenge Troy.[60][61] During the Punic Wars, both sides carried out extensive propaganda campaigns. To dissolve the Roman system of socii and the Greek poleis, Hannibal released without conditions the Latin prisoners he had treated generously, sending them back to their native cities, where they helped disseminate his propaganda.[62] The Romans, on the other hand, tried to portray Hannibal as a person devoid of humanity who would soon lose the favour of the gods. At the same time, led by Q. Fabius Maximus, they organized elaborate religious rituals to protect Roman morale.[63][62]

In the early sixteenth century, Maximilian I devised a kind of psychological warfare targeting his enemies. During his war against Venice, he attached pamphlets to balloons that his archers would then shoot down. The pamphlets spoke of freedom and equality and urged the populace to rebel against their tyrants (the Signoria).[22]

Propaganda poster shows a terrifying gorilla with a helmet labeled "militarism" holding a bloody club labeled "kultur" and a half-naked woman as he stomps onto the shore of America.
Destroy this Mad Brute: Enlist— propaganda poster encouraging men in the United States to enlist and fight Germany as part of WWI, by Harry R. Hopps, c. 1917
Soviet "Ne Boltai" poster. Translates to "Don't Chatter". Similar to American "Loose Lips Sink Ships" posters, this iconic piece of propaganda tries to warn citizens against giving out secrets.

Propaganda is a powerful weapon in war; in certain cases, it is used to dehumanize and create hatred toward a supposed enemy, either internal or external, by creating a false image in the minds of soldiers and citizens. This can be done by using derogatory or racist terms (e.g., the racist terms "Jap" and "gook" used during World War II and the Vietnam War, respectively), avoiding certain words or language, or by making allegations of enemy atrocities. The goal is to demoralize the opponent into believing that what is being projected is actually true.[64] Most propaganda efforts in wartime require the home population to feel that the enemy has inflicted an injustice, which may be fictitious or may be based on facts (e.g., the sinking of the passenger ship RMS Lusitania by the German Navy in World War I). The home population must also believe that the cause of their nation in the war is just. In these efforts, it has been difficult to determine how much impact the propaganda truly had on the war.[65] In NATO doctrine, propaganda is defined as "Information, especially of a biased or misleading nature, used to promote a political cause or point of view."[66] Within this perspective, the information provided need not be false, but must be relevant to the specific goals of the "actor" or "system" that disseminates it.

Propaganda is also one of the methods used in psychological warfare, which may also involve false flag operations in which the identity of the operatives is depicted as those of an enemy nation (e.g., The Bay of Pigs Invasion used CIA planes painted in Cuban Air Force markings). The term propaganda may also refer to false information meant to reinforce the mindsets of people who already believe as the propagandist wishes (e.g., During the First World War, the main purpose of British propaganda was to encourage men to join the army, and women to work in the country's industry. Propaganda posters were used because regular general radio broadcasting was yet to commence and TV technology was still under development).[67] The assumption is that, if people believe something false, they will constantly be assailed by doubts. Since these doubts are unpleasant (see cognitive dissonance), people will be eager to have them extinguished, and are therefore receptive to the reassurances of those in power. For this reason, propaganda is often addressed to people who are already sympathetic to the agenda or views being presented. This process of reinforcement uses an individual's predisposition to self-select "agreeable" information sources as a mechanism for maintaining control over populations.[improper synthesis?]

Serbian propaganda from the Bosnian War (1992–95) presented as an actual photograph from the scene; the report below the image describes a "Serbian boy whose whole family was killed by Bosnian Muslims". The image is in fact derived from the 1879 painting "Orphan on mother's grave" by Uroš Predić (shown alongside).[68]

Propaganda may be administered in insidious ways. For instance, disparaging disinformation about the history of certain groups or foreign countries may be encouraged or tolerated in the educational system. Since few people actually double-check what they learn at school, such disinformation will be repeated by journalists as well as parents, thus reinforcing the idea that the disinformation item is really a "well-known fact", even though no one repeating the myth is able to point to an authoritative source. The disinformation is then recycled in the media and in the educational system, without the need for direct governmental intervention on the media. Such permeating propaganda may be used for political goals: by giving citizens a false impression of the quality or policies of their country, they may be incited to reject certain proposals or certain remarks or ignore the experience of others.

Britannia arm-in-arm with Uncle Sam symbolizes the British-American alliance in World War I.
Poster depicting Winston Churchill as a "British Bulldog"

In the Soviet Union during the Second World War, the propaganda designed to encourage civilians was controlled by Stalin, who insisted on a heavy-handed style that educated audiences could easily see was inauthentic. On the other hand, the unofficial rumors about German atrocities were well founded and convincing.[69] Stalin was a Georgian who spoke Russian with a heavy accent. That would not do for a national hero, so starting in the 1930s all new visual portraits of Stalin were retouched to erase his Georgian facial characteristics[clarify][70] and make him a more generalized Soviet hero. Only his eyes and famous moustache remained unaltered. Zhores Medvedev and Roy Medvedev say his "majestic new image was devised appropriately to depict the leader of all times and of all peoples."[71]

Article 20 of the International Covenant on Civil and Political Rights prohibits any propaganda for war as well as any advocacy of national or religious hatred that constitutes incitement to discrimination, hostility or violence by law.[72]

Naturally, the common people don't want war; neither in Russia nor in England nor in America, nor for that matter in Germany. That is understood. But, after all, it is the leaders of the country who determine the policy and it is always a simple matter to drag the people along, whether it is a democracy or a fascist dictatorship or a Parliament or a Communist dictatorship. The people can always be brought to the bidding of the leaders. That is easy. All you have to do is tell them they are being attacked and denounce the pacifists for lack of patriotism and exposing the country to danger. It works the same way in any country.

Notably, the covenant does not define the content of propaganda. In the simplest terms, an act of propaganda used in reply to a wartime act is not prohibited.[74]

Advertising


Propaganda shares techniques with advertising and public relations, each of which can be thought of as propaganda that promotes a commercial product or shapes the perception of an organization, person, or brand. For example, after claiming victory in the 2006 Lebanon War, Hezbollah campaigned for broader popularity among Arabs by organizing mass rallies where Hezbollah leader Hassan Nasrallah combined elements of the local dialect with classical Arabic to reach audiences outside Lebanon. Banners and billboards were commissioned in commemoration of the war, along with various merchandise items with Hezbollah's logo, flag color (yellow), and images of Nasrallah. T-shirts, baseball caps and other war memorabilia were marketed for all ages. The uniformity of messaging helped define Hezbollah's brand.[75]

In the journalistic context, advertising has evolved from traditional commercial advertisements to include paid articles or broadcasts disguised as news. These generally present an issue in a subjective and often misleading light, and are primarily meant to persuade rather than inform. Normally they use only subtle propaganda techniques rather than the more obvious ones used in traditional commercial advertisements. If the reader believes that a paid advertisement is in fact a news item, the message the advertiser is trying to communicate will be more easily "believed" or "internalized". Such advertisements are considered obvious examples of "covert" propaganda because they take on the appearance of objective information rather than of propaganda, which is misleading. Federal law[where?] specifically mandates that any advertisement appearing in the format of a news item must state that it is in fact a paid advertisement.

Edmund McGarry argues that advertising is more than selling to an audience: it is a type of propaganda that seeks to persuade the public rather than offer balanced judgement.[76]

Politics

[edit]
Propaganda and manipulation can be found on television and in news programs that influence mass audiences. One example was the Dziennik (Journal) newscast, which criticised capitalism in the then-communist Polish People's Republic using emotive and loaded language.

Propaganda has become more common in political contexts, in particular in reference to certain efforts sponsored by governments and political groups, but also often by covert interests. In the early 20th century, propaganda was exemplified in the form of party slogans. Propaganda also has much in common with public information campaigns by governments, which are intended to encourage or discourage certain forms of behavior (such as wearing seat belts, not smoking, not littering, and so forth); in propaganda, again, the emphasis is more political. Propaganda can take the form of leaflets, posters, and TV and radio broadcasts, and can extend to any other medium. In the case of the United States, there is also an important legal (imposed by law) distinction between advertising (a type of overt propaganda) and what the Government Accountability Office (GAO), an arm of the United States Congress, refers to as "covert propaganda". In political situations, propaganda is divided into two types: preparation, which creates a new frame of mind or view of things, and operation, which instigates action.[77]

Roderick Hindery argues[78][79] that propaganda exists on the political left and right, and in mainstream centrist parties. Hindery further argues that debates about most social issues can be productively revisited in the context of asking "what is or is not propaganda?" Not to be overlooked is the link between propaganda, indoctrination, and terrorism/counterterrorism. He argues that threats to destroy are often as socially disruptive as physical devastation itself.

Since 9/11 and the appearance of greater media fluidity, propaganda institutions, practices and legal frameworks have been evolving in the US and Britain. Briant shows how this included expansion and integration of the apparatus cross-government and details attempts to coordinate the forms of propaganda for foreign and domestic audiences, with new efforts in strategic communication.[80] These were subject to contestation within the US Government, resisted by Pentagon Public Affairs and critiqued by some scholars.[43] The National Defense Authorization Act for Fiscal Year 2013 (section 1078 (a)) amended the US Information and Educational Exchange Act of 1948 (popularly referred to as the Smith-Mundt Act) and the Foreign Relations Authorization Act of 1987, allowing for materials produced by the State Department and the Broadcasting Board of Governors (BBG) to be released within U.S. borders for the Archivist of the United States. The Smith-Mundt Act, as amended, provided that "the Secretary and the Broadcasting Board of Governors shall make available to the Archivist of the United States, for domestic distribution, motion pictures, films, videotapes, and other material 12 years after the initial dissemination of the material abroad (...) Nothing in this section shall be construed to prohibit the Department of State or the Broadcasting Board of Governors from engaging in any medium or form of communication, either directly or indirectly, because a United States domestic audience is or may be thereby exposed to program material, or based on a presumption of such exposure." Public concerns were raised upon passage due to the relaxation of prohibitions of domestic propaganda in the United States.[81]

In the wake of this, the internet has become a prolific channel for distributing political propaganda, aided by software agents known as bots. Bots can be used for many purposes, including populating social media with automated messages and posts of varying sophistication. During the 2016 U.S. election, a cyber-strategy was implemented using bots to direct US voters to Russian political news and information sources, and to spread politically motivated rumors and false news stories. Deploying bots to achieve political goals is now considered commonplace in contemporary political strategy around the world.[82]
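The amplification mechanism described above can be illustrated with a toy simulation (all names and numbers here are hypothetical, not drawn from any documented campaign): a handful of automated accounts repeatedly post the same planted message into a shared feed, making a fringe item occupy a disproportionate share of what users see.

```python
import random
from collections import Counter

def simulate_feed(organic_posts, bot_message, n_bots, posts_per_bot, seed=0):
    """Build a toy social feed: organic posts mixed with bot repetitions
    of a single planted message, shuffled together deterministically."""
    rng = random.Random(seed)
    feed = list(organic_posts)
    feed += [bot_message] * (n_bots * posts_per_bot)
    rng.shuffle(feed)
    return feed

def message_share(feed, message):
    """Fraction of the feed occupied by a given message."""
    counts = Counter(feed)
    return counts[message] / len(feed)

organic = [f"organic story {i}" for i in range(90)]
feed = simulate_feed(organic, "planted rumor", n_bots=5, posts_per_bot=6)
# 5 bots x 6 posts = 30 copies out of 120 items: the planted rumor
# fills a quarter of the feed despite being a single message.
print(round(message_share(feed, "planted rumor"), 2))  # → 0.25
```

Real bot networks add sophistication (varied wording, staggered timing, fake personas), but the core effect is this one: repetition inflates apparent prevalence.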

Techniques

[edit]
Anti-capitalist propaganda (1911 Industrial Workers of the World poster)

Common media for transmitting propaganda messages include news reports, government reports, historical revision, junk science, books, leaflets, movies, radio, television, posters and social media. Some propaganda campaigns follow a strategic transmission pattern to indoctrinate the target group. This may begin with a simple transmission, such as a leaflet dropped from a plane or an advertisement. Generally, these messages will contain directions on how to obtain more information, via a website, hotline, radio program, etc. (as is also seen for selling purposes, among other goals). The strategy aims to move the individual from information recipient to information seeker through reinforcement, and then from information seeker to opinion leader through indoctrination.[83]
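The staged funnel just described (recipient → seeker → opinion leader) can be sketched as a minimal state machine; the stage names follow the text, while the transition triggers are illustrative labels, not a formal model from the cited source.

```python
# Toy model of the staged transmission pattern: each (stage, trigger)
# pair maps to the next stage; any other trigger leaves the stage unchanged.
FUNNEL = {
    ("recipient", "reinforcement"): "seeker",
    ("seeker", "indoctrination"): "opinion_leader",
}

def advance(stage, trigger):
    """Move an individual one step through the funnel if the trigger applies."""
    return FUNNEL.get((stage, trigger), stage)

stage = "recipient"
for trigger in ["reinforcement", "indoctrination"]:
    stage = advance(stage, trigger)
print(stage)  # → opinion_leader
```

The point of the sketch is the one-way ordering: each stage requires a different kind of exposure, and skipping a step (e.g. attempting indoctrination on a mere recipient) leaves the individual where they were.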

A number of techniques based in social psychological research are used to generate propaganda. Many of these same techniques can be found under logical fallacies, since propagandists use arguments that, while sometimes convincing, are not necessarily valid.

Some time has been spent analyzing the means by which the propaganda messages are transmitted. That work is important but it is clear that information dissemination strategies become propaganda strategies only when coupled with propagandistic messages. Identifying these messages is a necessary prerequisite to study the methods by which those messages are spread.

Theodor W. Adorno wrote that fascist propaganda encourages identification with an authoritarian personality characterized by traits such as obedience and extreme aggression.[84]: 17  In The Myth of the State, Ernst Cassirer wrote that while fascist propaganda mythmaking flagrantly contradicted empirical reality, it provided a simple and direct answer to the anxieties of the secular present.[84]: 63 

Propaganda can also be turned on its makers. For example, postage stamps have frequently been tools for government advertising, such as North Korea's extensive issues.[85] The presence of Stalin on numerous Soviet stamps is another example.[86] In Nazi Germany, Hitler frequently appeared on postage stamps in Germany and some of the occupied nations. A British program to parody these, and other Nazi-inspired stamps, involved airdropping them into Germany on letters containing anti-Nazi literature.[87][88]

In 2018 a scandal broke in which the journalist Carole Cadwalladr, several whistleblowers and the academic Emma Briant revealed advances in digital propaganda techniques showing that online human intelligence techniques used in psychological warfare had been coupled with psychological profiling using illegally obtained social media data for political campaigns in the United States in 2016 to aid Donald Trump by the firm Cambridge Analytica.[89][90][91] The company initially denied breaking laws[92] but later admitted breaking UK law, the scandal provoking a worldwide debate on acceptable use of data for propaganda and influence.[93]

Models

[edit]

Persuasion in social psychology

[edit]
Public reading of the anti-Semitic newspaper Der Stürmer, Worms, Germany, 1935

The field of social psychology includes the study of persuasion. Social psychologists can be sociologists or psychologists. The field includes many theories and approaches to understanding persuasion. For example, communication theory points out that people can be persuaded by the communicator's credibility, expertise, trustworthiness, and attractiveness. The elaboration likelihood model, as well as heuristic models of persuasion, suggests that a number of factors (e.g., the degree of interest of the recipient of the communication) influence the degree to which people allow superficial factors to persuade them. Nobel Prize–winning psychologist Herbert A. Simon argued that people are cognitive misers: in a society of mass information, people are forced to make decisions quickly and often superficially, rather than logically.

According to William W. Biddle's 1931 article "A psychological definition of propaganda", "[t]he four principles followed in propaganda are: (1) rely on emotions, never argue; (2) cast propaganda into the pattern of "we" versus an "enemy"; (3) reach groups as well as individuals; (4) hide the propagandist as much as possible."[94]

More recently, studies from behavioral science have become significant in understanding and planning propaganda campaigns. These include, for example, nudge theory, which was used by the Obama campaign in 2008 and then adopted by the UK Government's Behavioural Insights Team.[95] Behavioural methodologies became the subject of great controversy in 2016 after the company Cambridge Analytica was revealed to have applied them, using millions of people's breached Facebook data, to encourage them to vote for Donald Trump.[96]

Haifeng Huang argues that propaganda is not always necessarily about convincing a populace of its message (and may actually fail to do this) but can instead function as a means of intimidating the citizenry and signalling the regime's strength and ability to maintain its control and power over society; by investing significant resources in propaganda, the regime can forewarn its citizens of its strength and deter them from attempting to challenge it.[97]

Propaganda theory and education

[edit]

During the 1930s, educators in the United States and around the world became concerned about the rise of anti-Semitism and other forms of violent extremism. The Institute for Propaganda Analysis was formed to introduce methods of instruction for high school and college students, helping learners to recognize and resist propaganda by identifying persuasive techniques. This work built upon classical rhetoric and was informed by suggestion theory and social scientific studies of propaganda and persuasion.[98] In the 1950s, propaganda theory and education examined the rise of American consumer culture, work popularized by Vance Packard in his 1957 book, The Hidden Persuaders. The European theologian Jacques Ellul's landmark work, Propaganda: The Formation of Men's Attitudes, framed propaganda in relation to larger themes about the relationship between humans and technology. Media messages did not serve to enlighten or inspire, he argued; they merely overwhelm by arousing emotions and oversimplifying ideas, limiting human reasoning and judgement.

In the 1980s, academics recognized that news and journalism could function as propaganda when business and government interests were amplified by mass media. The propaganda model is a theory advanced by Edward S. Herman and Noam Chomsky which argues that systemic biases exist in mass media, shaped by structural economic causes. It argues that the way in which commercial media institutions are structured and operate (e.g. through advertising revenue, concentration of media ownership, or access to sources) creates an inherent conflict of interest that makes them act as propaganda for powerful political and commercial interests:

The 20th century has been characterized by three developments of great political importance: the growth of democracy, the growth of corporate power, and the growth of corporate propaganda as a means of protecting corporate power against democracy.[99][100]

First presented in their book Manufacturing Consent: The Political Economy of the Mass Media (1988), the propaganda model analyses commercial mass media as businesses that sell a product – access to readers and audiences – to other businesses (advertisers) and that benefit from access to information from government and corporate sources to produce their content. The theory postulates five general classes of "filters" that shape the content that is presented in news media: ownership of the medium, reliance on advertising revenue, access to news sources, threat of litigation and commercial backlash (flak), and anti-communism and "fear ideology". The first three (ownership, funding, and sourcing) are generally regarded by the authors as being the most important. Although the model was based mainly on the characterization of United States media, Chomsky and Herman believe the theory is equally applicable to any country that shares the basic political economic structure, and the model has subsequently been applied by other scholars to study media bias in other countries.[101]

By the 1990s, the topic of propaganda was no longer a part of public education, having been relegated to a specialist subject. Secondary English educators grew fearful of the study of propaganda genres, choosing to focus on argumentation and reasoning instead of the highly emotional forms of propaganda found in advertising and political campaigns.[102] In 2015, the European Commission funded Mind Over Media, a digital learning platform for teaching and learning about contemporary propaganda. The study of contemporary propaganda is growing in secondary education, where it is seen as a part of language arts and social studies education.[103]

Self-propaganda

[edit]

Self-propaganda is a form of propaganda in which an individual convinces themself of something, no matter how irrational that idea may be.[104] Self-propaganda makes it easier for individuals to justify their own actions as well as the actions of others. It often works to lessen the cognitive dissonance felt by individuals when their personal actions, or the actions of their government, do not line up with their moral beliefs.[105] Self-propaganda is a type of self-deception[106] and can have a negative impact on those who perpetuate the beliefs it creates.[106]

Children

[edit]
A 1938 propaganda of the Estado Novo (New State) regime depicting Brazilian president Getúlio Vargas flanked by children. The text reads: "Children! Learning, at home and in school, the worship of the Fatherland, you will bring all chances of success to life. Only love builds and, strongly loving Brazil, you will lead it to the greatest of destinies among Nations, fulfilling the desires of exaltation nestled in every Brazilian heart."

Of all the potential targets of propaganda, children are the most vulnerable because they are the least prepared with the critical reasoning and contextual comprehension needed to determine whether a message is propaganda or not. The attention children give their environment during development, as they build their understanding of the world, causes them to absorb propaganda indiscriminately. Children are also highly imitative: studies by Albert Bandura, Dorothea Ross and Sheila A. Ross in the 1960s indicated that, to a degree, socialization, formal education and standardized television programming can be seen as using propaganda for the purpose of indoctrination. The use of propaganda in schools was highly prevalent during the 1930s and 1940s in Germany in the form of the Hitler Youth.

Anti-Semitic propaganda for children

[edit]

In Nazi Germany, the education system was thoroughly co-opted to indoctrinate German youth with anti-Semitic ideology. From the 1920s on, the Nazi Party targeted German youth as one of its special audiences for propaganda messages.[107] Schools and textbooks mirrored what the Nazis aimed to instill in German youth through the use and promotion of racial theory. Julius Streicher, the editor of Der Stürmer, headed a publishing house that disseminated anti-Semitic propaganda picture books in schools during the Nazi dictatorship. This was accomplished through the National Socialist Teachers League, of which 97% of all German teachers were members in 1937.[108]

The League encouraged the teaching of racial theory. Picture books for children such as Trust No Fox on his Green Heath and No Jew on his Oath, Der Giftpilz (translated into English as The Poisonous Mushroom) and The Poodle-Pug-Dachshund-Pinscher were widely circulated (over 100,000 copies of Trust No Fox... were circulated during the late 1930s) and contained depictions of Jews as devils, child molesters and other morally charged figures. Slogans such as "Judas the Jew betrayed Jesus the German to the Jews" were recited in class. During the Nuremberg Trial, Trust No Fox on his Green Heath and No Jew on his Oath and Der Giftpilz were received as documents in evidence because they documented the practices of the Nazis.[109] The following is an example of a propagandistic math problem recommended by the National Socialist Essence of Education: "The Jews are aliens in Germany—in 1933 there were 66,606,000 inhabitants in the German Reich, of whom 499,682 (0.75%) were Jews."[110]

Comparisons with disinformation

[edit]
Whether and to what degree disinformation and propaganda overlap is subject to debate. Some (like the U.S. Department of State) define propaganda as the use of non-rational arguments to either advance or undermine a political ideal, and use disinformation as an alternative name for undermining propaganda,[111] while others consider them separate concepts altogether.[112] One popular distinction holds that disinformation also describes politically motivated messaging designed explicitly to engender public cynicism, uncertainty, apathy, distrust, and paranoia, all of which disincentivize citizen engagement and mobilization for social or political change.[113]

See also

[edit]

References

[edit]

Further reading

[edit]
from Grokipedia
Propaganda is a deliberate form of communication designed to influence the perceptions, emotions, and behaviors of targeted audiences toward objectives set by the communicator, frequently employing selective truths, omissions, or fabrications to achieve ideological, political, or military aims, and differing from persuasion in its one-directional structure, lack of emphasis on rational dialogue, and prioritization of the propagandist's intent over audience autonomy. The term derives from the Latin propagare, meaning "to propagate" or "to spread," originally applied to the 1622 Congregatio de Propaganda Fide, a Vatican committee established by Pope Gregory XV to oversee the global dissemination of Catholic doctrine through missionary work. Propaganda techniques have been employed since antiquity for religious proselytizing and political mobilization, but the concept crystallized in its modern pejorative sense during the 20th century, particularly amid the mass media revolutions of the World Wars, when state-directed campaigns shaped public opinion, justified military action, and vilified adversaries through posters, films, and pamphlets. In totalitarian regimes, such as Nazi Germany and the Soviet Union, it served as a core instrument of control, integrating censorship, repetitive messaging, and cult-of-personality narratives to enforce ideological conformity and suppress dissent. Democracies have also relied on it during conflicts, as seen in Allied efforts to sustain morale and demonize enemies, revealing its utility beyond overt coercion when aligned with national interests. Central to propaganda are techniques like name-calling to discredit opponents, bandwagon appeals to exploit conformity, card-stacking to present lopsided evidence, and glittering generalities invoking vague virtues, all of which exploit cognitive biases and emotional priming over empirical scrutiny.

Its efficacy stems from causal mechanisms including repetition fostering illusory truth and reliance on heuristics, though outcomes vary with audience predispositions and counter-narratives; empirically, wartime propaganda has demonstrably boosted enlistment and resource allocation but also prolonged conflicts by entrenching dehumanizing stereotypes. Controversies arise from its ethical dual use, capable of unifying against existential threats or of inciting atrocities, and from definitional ambiguity, wherein prevailing powers often designate adversaries' efforts as propaganda while framing their own as legitimate information, underscoring systemic biases in source evaluation across media and academic institutions. In contemporary contexts, digital platforms amplify its reach, blending state-sponsored operations with algorithmic echo chambers that simulate consensus, challenging distinctions between organic and orchestrated influence.

Etymology and Terminology

Origins of the Term

The term "propaganda" derives from the Latin propaganda, the neuter plural gerundive of propagare, meaning "to propagate" or "to spread," referring to things that ought to be propagated, such as doctrines or ideas. This linguistic root entered common usage through the Catholic Church's institutional efforts to disseminate its faith during the Counter-Reformation. In response to the Protestant Reformation's gains and the need to coordinate missionary activities amid expanding European exploration, Pope Gregory XV established the Sacra Congregatio de Propaganda Fide (Sacred Congregation for the Propagation of the Faith) on January 6, 1622, formalized by the papal bull Inscrutabili Divinae Providentiae Arcano issued on June 22, 1622. The congregation served as the central body overseeing global missionary activity, including training missionaries, printing materials in local languages, and countering non-Catholic influences abroad. The "propaganda" element of the name directly reflected the congregation's mandate to propagate (propagare) the faith, particularly among non-believers and lapsed faithful, without the pejorative implications the term later acquired. At the time, it connoted the organized dissemination of religious truth—systematic extension rather than manipulation—and was viewed positively within ecclesiastical contexts as essential for the Church's survival and expansion. This ecclesiastical origin marked the term's formal institutionalization, distinguishing it from earlier informal uses of related Latin concepts in classical texts, such as Cicero's references to spreading ideas. Over the following century, "propaganda" began appearing in European vernaculars to describe the congregation's activities, initially retaining its neutrality but gradually broadening to secular contexts by the 18th-century Enlightenment, when it started shifting toward critiques of manipulative influence.

Evolution in Modern Usage

In the early 20th century, "propaganda" largely retained its historical neutrality as a term for organized efforts to disseminate ideas or ideologies, but its application expanded into secular political contexts amid the rise of mass media. During World War I, the U.S. government formed the Committee on Public Information (CPI) in April 1917, led by George Creel, to mobilize public support for the war through pamphlets, posters, films, and speeches reaching an estimated 75 million Americans. Creel explicitly rejected the label "propaganda" for his initiatives, citing its association with German deception, and instead framed them as "educational and informative" campaigns, reflecting the term's still-ambivalent status even as it described systematic opinion-shaping. Postwar disillusionment accelerated a pejorative shift, as revelations of wartime exaggerations—such as unverified atrocity claims against Germans—fostered distrust of state-managed messaging. By the 1920s, the term connoted manipulation and excess, prompting rebranding in professional circles; Edward Bernays, a CPI veteran and nephew of Sigmund Freud, titled his 1928 book Propaganda to advocate "engineering consent" via psychological insights, yet acknowledged the word's growing stigma and pivoted to "public relations" to sanitize similar practices for commercial and political use. This evolution mirrored broader causal dynamics: mass democracy's demands for public buy-in clashed with transparency ideals, rendering overt propaganda suspect. World War II and the Cold War solidified "propaganda" as predominantly derogatory, evoking totalitarian control rather than mere advocacy. Nazi Germany's Reich Ministry of Public Enlightenment and Propaganda, established in 1933 under Joseph Goebbels, centralized the media to enforce ideological conformity, producing over 1,300 newspapers and films that distorted facts for regime loyalty, which postwar analyses framed as paradigmatic propaganda. In Allied and democratic contexts, the term increasingly delegitimized opponents' narratives—e.g., Soviet disinformation—while one's own efforts were rephrased as "information" or "public diplomacy," highlighting its weaponized asymmetry.

From the late 20th century into the present, modern usage broadened to encompass advertising, public relations, and digital campaigns perceived as ideologically skewed, often without requiring outright falsehoods but implying selective framing or emotional appeals over evidence. This reflects causal realism in information ecosystems: amid fragmented media, the label critiques systemic biases, whether in authoritarian regimes (e.g., China's global broadcasting via CGTN since 2016) or in Western outlets' narrative alignment, though accusations remain subjective and are rarely self-applied. Scholarly distinctions persist—e.g., propaganda as intentional manipulation versus neutral information—but colloquial deployment prioritizes intent, underscoring the role of source credibility in evaluation.

Definitions and Core Characteristics

Essential Elements

Propaganda fundamentally entails a deliberate, organized effort to influence the attitudes, beliefs, or behaviors of a target audience, typically through the selective presentation of information via mass media channels. Its systematic nature and its aim of advancing ideological, political, or institutional agendas distinguish it from casual persuasion, and it often prioritizes emotional resonance over comprehensive factual disclosure. Harold Lasswell defined propaganda as the "manipulation of collective attitudes by the use of significant symbols (words, pictures, tunes)" rather than violence, a "peaceful" battle of words. Jacques Ellul further characterized it as a sociological phenomenon inherent to technological mass societies, where it functions continuously to integrate individuals into prevailing social norms, rendering it unavoidable and total in scope. Key elements include intentionality and secrecy regarding sources or ultimate goals, ensuring the message appears authoritative or spontaneous while concealing manipulative objectives. Ellul, drawing on earlier analyses such as John Albig's, identified core definitional components: the covert nature of propaganda's origins and aims; an explicit intention to alter opinions or actions; broad diffusion to mass publics; unrelenting continuity rather than episodic bursts; and structured organization involving specialized personnel and resources. These features enable propaganda to exploit pre-existing narratives, amplifying them through repetition and simplification to foster integration or agitation without requiring overt coercion. Psychological targeting forms another essential pillar, leveraging symbols, slogans, and emotional appeals—such as fear, pride, or enmity—to bypass rational scrutiny and embed ideas subconsciously. Lasswell's examination of World War I propaganda highlighted techniques like stereotyping enemies and glorifying national virtues to unify publics, demonstrating propaganda's reliance on symbolic manipulation over empirical debate.

Unlike neutral information, propaganda often omits counterevidence or distorts causality to attribute outcomes to favored narratives, as seen in its historical adaptation to media at every scale, from print to digital platforms. Ellul noted that while falsehoods occur, propaganda's efficacy stems more from the contextual orchestration of truths, fostering dependency on mediated realities. In practice, these elements converge in orchestrated campaigns, where audience segmentation—via Lasswell's "who says what to whom" framework—tailors messages for maximum effect, such as reinforcing in-group solidarity against out-groups. This causal mechanism, rooted in human predispositions toward heuristic processing, underscores propaganda's distinction from education: the latter seeks autonomous understanding, while the former engineers compliance through perpetual exposure. Empirical analyses confirm that without reach and sustained application, such efforts devolve into mere advocacy, lacking propaganda's transformative potency.

Distinctions from Persuasion and Influence

Propaganda differs from persuasion primarily in its intent, methods, and scope. Persuasion encompasses a broad range of communicative efforts to alter attitudes or behaviors through appeals to reason, emotion, or mutual interest, often in reciprocal or dialogic contexts. In contrast, propaganda constitutes a deliberate, systematic subcategory of persuasion aimed at advancing a predetermined ideological or political agenda, frequently employing selective truths, omissions, or distortions to manipulate rather than inform. This distinction hinges on propaganda's covert asymmetry: it prioritizes the propagandist's objectives over audience autonomy, using techniques like repetition and emotional priming to foster uncritical acceptance, as opposed to persuasion's potential for verifiable exchange. Jacques Ellul further delineates propaganda from mere persuasion by emphasizing its totalizing effect in modern technological societies, where it integrates individuals into a comprehensive worldview through pervasive, non-rational conditioning rather than isolated argumentative influence. Edward Bernays, while framing propaganda as an essential mechanism for "establishing reciprocal understanding" between leaders and publics, acknowledged its manipulative underpinnings in mass-scale opinion engineering, blurring the lines with public relations but underscoring its departure from transparent advocacy. Empirical analyses, such as those by Jowett and O'Donnell, quantify this through propaganda's reliance on ideological intent—measured by the communicator's exclusion of counter-evidence—versus persuasion's openness to scrutiny, with historical cases like World War I atrocity stories illustrating propaganda's willingness to fabricate for mobilization. Influence, broader than both, refers to any process—intentional or incidental—by which external factors shape belief or action, including cultural norms or personal example without structured messaging.

Propaganda qualifies as a targeted form of influence only when it involves the organized dissemination of biased information to predefined ends, distinguishing it from diffuse social pressures; for instance, campaigns during the Cold War systematically propagated narratives to sway alliances, unlike organic cultural shifts. This causal realism reveals propaganda's efficacy in overriding individual reasoning via scale and repetition, as evidenced by studies showing higher susceptibility in low-information environments, whereas neutral influence lacks such engineered deceit.

Historical Development

Ancient and Pre-Modern Instances

In ancient , Assyrian kings disseminated propaganda through royal annals and reliefs to exaggerate military victories and instill fear in subjects and enemies. (r. 883–859 BCE), for instance, inscribed detailed accounts of campaigns involving mass executions and impalements, claiming to have built a with materials from 50 enemy cities while thousands, thereby projecting unassailable power and divine favor. Ancient Egyptian pharaohs similarly employed temple inscriptions and monumental art to fabricate narratives of triumph and god-like supremacy. Ramesses II's reliefs at the and temples (circa 1270s BCE) portrayed the against the (1274 BCE) as a personal rout of the enemy, omitting the battle's inconclusive outcome and subsequent , to affirm his role as protector of ma'at (cosmic order) and deter internal dissent. In , strategic deception served propagandistic ends during conflicts. Athenian general in 480 BCE spread false rumors via a "trusted" defector to mislead Persian king Xerxes into deploying his fleet at the narrow Salamis straits, enabling a Greek victory that orators like later mythologized in works such as to exalt Athenian heroism and democracy. Roman emperors systematized visual and epigraphic propaganda to consolidate imperial legitimacy across diverse provinces. (r. 27 BCE–14 CE) commissioned the , an autobiographical inscription erected posthumously at key sites like , enumerating 35 specific achievements—including closing temple doors signaling peace after 200 years of war—to frame his rule as a restoration of republican virtues rather than . Coins bearing his image alongside motifs of victory and circulated empire-wide, reinforcing loyalty among illiterate masses. During the medieval period, the propagated crusading ideology through papal decrees and sermons to mobilize European knights against perceived Islamic threats. 
Pope Urban II's address at the Council of Clermont in 1095 promised spiritual rewards for reclaiming Jerusalem, framing the First Crusade as a divinely sanctioned penitential undertaking, which later chronicles amplified by emphasizing miraculous signs and enemy atrocities to sustain fervor amid high casualties. Secular rulers, such as England's Henry II, used illuminated manuscripts and charters to justify Angevin expansion, depicting conquests as rightful inheritance while vilifying rivals such as Thomas Becket after his assassination in 1170 to mitigate rebellion.

Enlightenment to World War I

The Enlightenment facilitated the expansion of print media, enabling the dissemination of political ideas beyond elite circles and foreshadowing systematic propaganda through appeals to reason and public sentiment. Pamphlets and essays critiqued absolutism and promoted concepts of liberty and popular sovereignty that influenced revolutionary movements. The period's emphasis on reasoned public debate shifted persuasion from oral traditions to reproducible texts, amplifying reach amid rising print circulation in Europe and America. In the American Revolution, Thomas Paine's Common Sense, published January 10, 1776, served as a pivotal propagandistic tool, framing British rule as tyrannical and advocating republican independence in accessible, emotive language that resonated with colonists. The 47-page pamphlet sold approximately 120,000 copies in its first three months, equivalent to reaching about one in five free Americans, and galvanized support for the Continental Congress's Declaration of Independence on July 4, 1776. Paine's subsequent series The American Crisis, beginning December 23, 1776, further boosted morale; its opening line, "These are the times that try men's souls," was reportedly read aloud to troops before the Battle of Trenton. The French Revolution intensified pamphlet warfare, with an estimated 100,000 distinct titles produced between 1788 and 1795, targeting the monarchy's legitimacy and rallying support for radical change through satirical caricatures, accusations of corruption, and visions of egalitarian utopias. Prints and engravings depicted figures like Marie Antoinette as decadent, fueling public outrage that contributed to events such as the storming of the Bastille on July 14, 1789. Revolutionary leaders, including the Jacobins, leveraged these materials to consolidate power, though their hyperbolic claims often distorted facts to justify purges such as the Reign of Terror from September 1793 to July 1794. Throughout the nineteenth century, nationalism drove propagandistic campaigns in Europe, particularly in fragmented states seeking unification.
In the German territories after the Napoleonic Wars, authorities promoted national identity through patriotic festivals, monuments, and school curricula emphasizing shared language and history, as seen in the 1817 Wartburg Festival, where students burned foreign symbols to assert cultural purity. Similar efforts in Italy during the Risorgimento, led by figures such as Giuseppe Mazzini, used writings and secret societies to evoke romanticized past glories against Austrian dominance, culminating in unification by 1870. The rise of cheap newspapers, such as Britain's Daily Telegraph with a circulation of 240,000 by 1877, allowed governments and movements to shape public opinion on imperial expansion and domestic reforms. World War I elevated propaganda to a state-orchestrated industry, with belligerents deploying dedicated agencies to sustain recruitment, morale, and resource mobilization amid total war. Britain's War Propaganda Bureau at Wellington House, operational from September 1914, circulated reports of German atrocities in Belgium, such as alleged mass executions of civilians in August 1914, to vilify Germany and justify Allied intervention, though subsequent inquiries revealed exaggerations in claims like the widespread bayoneting of babies. In the United States, the Committee on Public Information, formed April 13, 1917, under George Creel, distributed over 75 million pamphlets and produced 6,000 reels of film, employing slogans like "The Hun Within" to stoke fears of subversion and enforce loyalty via the Espionage Act of 1917, under which over 2,000 dissenters were prosecuted. Techniques emphasized dehumanization of the enemy, patriotic symbolism, and atrocity narratives, but post-war revelations, including 1920s disavowals of the Bryce Committee's findings, exposed fabricated elements designed to override rational skepticism and secure support for war aims.

Interwar and World War II Totalitarian Regimes

Totalitarian regimes of the interwar period and World War II elevated propaganda to a core mechanism of governance, centralizing control over information to indoctrinate populations, legitimize leaders, and mobilize societies for ideological conformity and conflict. In Nazi Germany, Fascist Italy, the Stalinist Soviet Union, and Imperial Japan, state apparatuses systematically deployed media monopolies, mass events, and repetitive messaging to demonize enemies, exalt rulers, and suppress dissent, achieving unprecedented penetration into daily life. These efforts relied on modern technologies such as radio and film, enabling regimes to bypass traditional elites and directly influence the masses. Nazi Germany's propaganda machine, directed by Joseph Goebbels after his appointment as Reich Minister for Public Enlightenment and Propaganda on March 13, 1933, achieved total dominance over print, broadcast, and visual media within months of Hitler's rise. Goebbels insisted that effective propaganda required a single authoritative source to prevent contradictory messages, subordinating all cultural and informational outlets to the ministry. Key techniques included the "big lie" strategy, coined in Hitler's Mein Kampf and operationalized through ceaseless repetition of falsehoods such as Jewish conspiracies orchestrating Germany's defeat, and the orchestration of spectacles like the annual Nuremberg rallies, attended by over 400,000 in 1938, to instill fervor. Radio broadcasts reached 70% of households by 1939, with cheap "people's receivers" (Volksempfänger) designed for single-purpose listening to state content. Anti-Semitic campaigns, via outlets like Der Stürmer, escalated from the 1933 boycotts to justifying violent persecution by portraying Jews as existential threats. In the Soviet Union under Stalin, the Communist Party's Agitprop (Agitation and Propaganda) Department, formalized in the 1920s and intensified during the Great Purges of the 1930s, coordinated indoctrination across newspapers like Pravda, films, posters, and theater troupes targeting workers.
Propaganda glorified collectivization and industrialization as triumphs over "kulaks" and saboteurs, with Lenin depicted in over 5,000 statues and films like Lenin in October (1937) rewriting history to center his role. Techniques emphasized "agitpoints," mobile units disseminating simplified Bolshevik ideology, and the suppression of facts, such as the 1932–1933 famine that killed 3–5 million, reframed as capitalist slander. During World War II, the department, renamed the Propaganda Department in 1946 but active earlier, mobilized 20 million recruits partly through patriotic narratives blending socialist ideology with Russian nationalism. Fascist Italy under Mussolini centralized propaganda through the Ministry of Popular Culture, established in 1937, which censored press and cinema while promoting imperial revival via youth groups like the Balilla and grandiose architecture emulating ancient Rome. Mussolini's persona as Il Duce was propagated through more than 3,000 speeches broadcast on radio from 1924 onward, emphasizing virility and anti-Bolshevism, with campaigns like the 1935 invasion of Ethiopia framed as a civilizing mission. Despite early successes in consolidating power after the 1922 March on Rome, propaganda faltered in sustaining war enthusiasm, as battlefield defeats in 1940–1943 exposed regime boasts, contributing to Mussolini's ouster in 1943. Film production, under state control from 1922, yielded over 1,000 features by 1943, often embedding fascist values subtly to evade public resistance. Imperial Japan's Cabinet Information Bureau, created in 1936 and expanded during World War II, enforced media compliance to propagate the "Greater East Asia Co-Prosperity Sphere" as liberation from Western imperialism, censoring dissent under the Peace Preservation Law. Techniques involved school indoctrination, radio scripts reaching 80% coverage by 1941, and posters depicting Allied forces as barbaric, sustaining mobilization despite defeats like Midway in 1942. The Kempeitai military police augmented this with terror, arresting 70,000 people for "thought crimes" by 1945, ensuring propaganda's coercive efficacy.
These regimes' propaganda facilitated not only internal purges, such as the Nazi Night of the Long Knives (1934) and the Soviet Great Terror (1936–1938), but also wartime atrocities, with dehumanizing narratives enabling the Nazi mass killings of 1.5 million Jews by 1943 and the Japanese Nanjing Massacre (1937, with over 200,000 deaths). Postwar analyses reveal propaganda's limits against empirical failures: as Allied victories eroded regime credibility, they underscored propaganda's dependence on perceived successes for sustained belief.

Cold War and Decolonization Era

The Cold War (1947–1991) featured intense ideological propaganda between the United States and the Soviet Union, each seeking to portray its system as superior while demonizing the opponent. The United States initiated the "Campaign of Truth" on April 20, 1950, when President Harry Truman called for expanded information efforts to combat "the big lie" of Soviet propaganda, emphasizing factual broadcasting over deception. This initiative boosted funding for outlets like Voice of America (VOA), which by the 1950s transmitted news in over 40 languages to counter Soviet narratives in Europe, Asia, and beyond. Complementing VOA, Radio Free Europe (RFE) commenced operations on July 4, 1950, delivering uncensored news and cultural programming to Soviet-occupied Eastern Europe from Munich, initially under covert CIA funding to undermine communist regimes without direct U.S. government attribution. The Soviet Union responded with state-controlled media such as Radio Moscow, which broadcast anti-capitalist messages globally, often exaggerating Western imperialism and U.S. racial inequalities to erode American credibility. Soviet propaganda also glorified proletarian internationalism through posters, films, and literature, depicting the USSR as the vanguard against fascism and exploitation, though these efforts relied on centralized censorship that suppressed dissenting views. In 1953, the U.S. formalized its propaganda apparatus by creating the United States Information Agency (USIA), tasked with disseminating American values such as democracy and free enterprise via libraries, films, and exchanges in over 100 countries, reaching millions annually at the era's peak. USIA materials highlighted economic successes under capitalism, such as post-Marshall Plan recoveries in Western Europe, contrasting them with Soviet famines and purges. The Soviets, through agencies such as Agitprop and the Cominform (until its dissolution in 1956), propagated narratives of inevitable communist victory, using state media and front organizations in occupied territories and allied states to foster loyalty.
This bilateral contest extended to psychological operations; for instance, U.S. leaflet drops and broadcasts during the Korean War (1950–1953) urged North Korean defections by promising humane treatment, while Soviet counterparts accused the U.S. of bacteriological warfare without evidence. Decolonization from the late 1940s to the 1970s amplified superpower propaganda as over 50 African and Asian nations gained independence, becoming battlegrounds for influence. The Soviet Union positioned itself as an anti-colonial champion, supporting liberation movements with rhetoric and material aid; for example, it backed the Algerian Front de Libération Nationale (FLN) during the 1954–1962 war against France through propaganda framing Moscow as a partner in dismantling imperialism. Soviet posters and broadcasts in the 1950s and 1960s celebrated African decolonization waves, such as Ghana's independence in 1957, while critiquing Western neocolonialism to attract leaders like Kwame Nkrumah. This approach yielded alliances, including Cuba after its 1959 revolution and Soviet arms to Angola's MPLA in the civil war, where propaganda portrayed interventions as solidarity against "reactionary" forces. The United States countered with development-focused messaging via USIA and AID programs, promoting non-communist paths to modernity; in the Congo, U.S. operations, including CIA-backed propaganda, supported Joseph Mobutu against the Soviet-favored Patrice Lumumba, emphasizing stability and anti-communism to prevent Soviet footholds. U.S. efforts often involved covert media manipulation, such as funding anti-communist outlets in Indonesia during the 1965 coup, though these prioritized geopolitical containment over unvarnished truth. Both powers exploited local grievances, the Soviets via class-struggle appeals and the Americans via modernization promises, but the Soviet state monopoly on information enabled more uniform narratives, while U.S. initiatives faced domestic scrutiny over covert elements.
In non-aligned forums such as the Bandung Conference, propaganda clashes highlighted third-world leaders' navigation of bipolar pressures, with Soviet denunciations of colonialism contrasting with U.S. portrayals of partnership.

Post-Cold War to Digital Age

The dissolution of the Soviet Union in 1991 reduced the scale of global ideological propaganda contests, yet propaganda persisted in regional ethnic conflicts and Western-led interventions. In the Yugoslav Wars of the 1990s, Serbian state-controlled media under Slobodan Milošević propagated narratives of Serb victimhood and demonized other ethnic groups, inciting violence through broadcasts that exaggerated threats and historical grievances. Similarly, during the 2003 U.S.-led invasion of Iraq, administration officials cited intelligence on weapons of mass destruction, later revealed as erroneous, to justify preemptive action, shaping public opinion and media coverage to emphasize imminent threats from Saddam Hussein's regime. The rise of the internet and social media platforms in the 2000s marked a pivotal shift toward decentralized, rapid dissemination of propaganda, enabling both grassroots mobilization and state countermeasures. The Arab Spring protests of 2010–2012 demonstrated this duality: activists in Tunisia, Egypt, and elsewhere used Facebook and Twitter to organize demonstrations and share uncensored footage, circumventing regime monopolies on traditional media, though participation remained limited to digitally connected urban elites. Authoritarian governments responded by enhancing digital surveillance and censorship, deploying pro-regime bots, and fabricating counter-narratives to discredit protesters as foreign agents. State actors adapted legacy tactics to social media, with Russia and China exemplifying hybrid approaches blending official outlets and covert operations. Russia's Internet Research Agency, a St. Petersburg-based entity operational by 2013, ran troll farms employing hundreds to post divisive content on platforms like Facebook and Twitter, including efforts to exacerbate U.S. racial tensions and influence the 2016 presidential election through fake personas and targeted ads reaching millions. China, meanwhile, orchestrates vast campaigns, generating approximately 448 million fabricated social media comments yearly to amplify positive state narratives, distract from criticisms, and promote state policies via official media and influencers.
Advancements in artificial intelligence have further amplified digital propaganda's potency since the mid-2010s, facilitating deepfakes and automated content generation. Russian operations during the 2022 invasion of Ukraine incorporated AI-generated deepfakes and sham websites to spread false narratives, such as fabricated atrocities, to undermine Western support. Chinese state-linked actors have used AI for similar ends, including deepfake videos targeting Taiwanese elections to erode trust in democratic institutions. These tools extend Cold War-era disinformation, rooted in KGB "active measures", into algorithmic precision, allowing exact targeting while challenging attribution and verification.
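The bot-driven amplification described above leaves a detectable signature: bursts of near-identical messages posted under many distinct account names. As a minimal illustrative sketch (not any platform's actual detection method; the function name, thresholds, and sample feed are hypothetical), such clusters can be surfaced with simple text similarity:

```python
from difflib import SequenceMatcher

def flag_coordinated_comments(comments, similarity=0.9, min_accounts=3):
    """Cluster near-duplicate comments and flag clusters spanning many accounts.

    `comments` is a list of (account, text) pairs. A cluster whose
    near-identical text appears under `min_accounts` or more distinct
    accounts is returned as possible coordinated amplification.
    """
    clusters = []  # each entry: [representative_text, set_of_accounts]
    for account, text in comments:
        for cluster in clusters:
            rep, accounts = cluster
            if SequenceMatcher(None, rep.lower(), text.lower()).ratio() >= similarity:
                accounts.add(account)
                break
        else:
            clusters.append([text, {account}])
    return [(rep, sorted(accs)) for rep, accs in clusters if len(accs) >= min_accounts]

# Hypothetical sample feed: three accounts pushing one near-identical line.
feed = [
    ("acct_a", "The protests are staged by foreign agents!"),
    ("acct_b", "The protests are staged by foreign agents!!"),
    ("acct_c", "the protests are staged by foreign agents"),
    ("acct_d", "Lovely weather in the park today."),
]
print(flag_coordinated_comments(feed))
```

Real attribution research combines many more signals (posting times, network structure, account metadata); raw text similarity only illustrates why scale and repetition make such campaigns statistically conspicuous.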

Techniques and Methodologies

Psychological Manipulation Tactics

Psychological manipulation tactics in propaganda target cognitive and emotional vulnerabilities to shape perceptions and induce compliance, often circumventing rational evaluation. These tactics draw on principles of social psychology, including the exploitation of heuristics, biases, and conformity pressures, to foster uncritical acceptance of messages. Unlike transparent persuasion, they prioritize subconscious influence through repetition, emotional priming, and selective framing, as Edward Bernays outlined in his 1928 book Propaganda, where he emphasized manipulating "organized habits and opinions" via psychological stimuli to form habits without conscious resistance. Empirical studies confirm their efficacy; for instance, repeated exposure to claims increases their perceived truthfulness via the illusory truth effect, regardless of factual accuracy, as demonstrated in experiments where subjects rated statements as more valid after multiple viewings. A seminal classification comes from the Institute for Propaganda Analysis, established in 1937, which identified seven core devices based on patterns observed in mass media during the 1930s. These devices systematically exploit emotional responses and cognitive shortcuts:
  • Name-calling: Propagandists attach loaded, negative labels (e.g., "traitor" or "extremist") to opponents or ideas to provoke instinctive aversion, bypassing evidence-based scrutiny. This tactic leverages the affect heuristic, by which emotional disgust overrides factual assessment.
  • Glittering generalities: Positive, vague terms such as "freedom" or "democracy", linked to cherished values but devoid of specifics, are invoked to evoke uncritical approval, exploiting the halo effect, whereby association with ideals transfers unearned credibility.
  • Transfer: Symbols of authority, sanctity, or prestige (e.g., flags, religious icons) are borrowed to lend legitimacy to unrelated claims, capitalizing on conditioned respect to manipulate associations.
  • Testimonial: Endorsements from ostensibly credible figures (celebrities, experts) are used to sway audiences, invoking authority bias even when the endorser's expertise is irrelevant or fabricated.
  • Plain folks: Propagandists present themselves or their messages as relatable to ordinary people, fostering trust through feigned commonality and reducing perceived elitism, which appeals to in-group identification.
  • Card stacking: Selective presentation of facts, omitting contradictions or unfavorable data, creates a skewed picture, exploiting confirmation bias by reinforcing desired interpretations while ignoring disconfirming evidence.
  • Bandwagon: Urging adoption of a position by claiming "everyone" supports it, this device preys on conformity and social-proof pressures, as individuals align with perceived majorities to avoid isolation, a dynamic amplified in group settings.
Beyond these devices, propaganda frequently employs fear appeals, which heighten perceived threats to trigger fight-or-flight responses and compliance, as seen in historical wartime campaigns where exaggerated dangers prompted enlistment and acquiescence; neuroimaging studies show such appeals activate amygdala-driven processing over deliberative reasoning. Confirmation bias further sustains manipulation, as audiences favor and retain propaganda aligning with prior beliefs, filtering out dissonant information, a pattern observed in political media where partisan sources reinforce echo chambers. Repetition compounds this: Bernays noted its role in habit formation, where frequent exposure embeds ideas subconsciously, a mechanism validated in research showing diminished skepticism after three to five iterations. In digital contexts, algorithmic amplification exacerbates these tactics by prioritizing engaging (often manipulative) content, exploiting the availability heuristic to make skewed views seem normative. Countering these tactics requires meta-cognitive awareness, as unexamined biases enable sustained influence without overt coercion.
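The IPA's device classification has inspired simple media-literacy screening tools. As a toy sketch, using hypothetical cue-word lists (illustrative stand-ins, not the Institute's own vocabulary), a text can be flagged for devices whose characteristic wording it contains:

```python
# Hypothetical cue-word lists for three of the IPA's seven devices
# (illustrative stand-ins, not the Institute's own vocabulary).
DEVICE_CUES = {
    "name_calling": {"traitor", "extremist", "enemy of the people"},
    "glittering_generality": {"freedom", "honor", "patriotism"},
    "bandwagon": {"everyone", "join the winning side"},
}

def screen_for_devices(text):
    """Return the devices whose cue words appear in `text` (crude substring match)."""
    lowered = text.lower()
    return sorted(device for device, cues in DEVICE_CUES.items()
                  if any(cue in lowered for cue in cues))

print(screen_for_devices("Everyone knows the traitor opposes freedom."))
```

Keyword matching captures only vocabulary, not meaning; serious propaganda analysis relies on context, framing, and intent, which is why the IPA paired its device list with training in critical reading rather than mechanical rules.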

Rhetorical and Narrative Devices

Propaganda frequently employs rhetorical devices to manipulate emotions and bypass rational scrutiny, drawing on classical appeals to ethos, pathos, and logos but distorting them for ideological ends. The Institute for Propaganda Analysis, founded in 1937 by educators including Clyde Miller, systematically outlined seven key devices in its publications to educate the public in detecting manipulative rhetoric amid rising totalitarian influence in Europe. These include name-calling, which substitutes derogatory labels for substantive debate, such as branding political opponents "traitors" or "enemies of the people" to incite visceral rejection without evidence; Soviet propaganda under Stalin routinely applied such terms to purge rivals, contributing to the execution of over 680,000 individuals deemed disloyal between 1937 and 1938. Glittering generalities invoke vague, emotionally charged virtues such as "freedom" or "honor" to link ideas to unassailable ideals, evading specific scrutiny; Nazi propaganda exalted the "Volksgemeinschaft" (people's community) in this manner to foster uncritical loyalty, as seen in speeches by Hitler emphasizing abstract "Aryan purity" without empirical backing. Transfer associates a cause with respected symbols, such as draping policies in religious or national icons to borrow their prestige; British recruitment posters transferred imperial glory to enlistment drives, portraying service as a sacred duty akin to historical heroism. Testimonial leverages endorsements from admired figures, often out of context; during the 1930s, for example, fascist regimes secured quotes from intellectuals to legitimize their rule, despite the endorsers' limited expertise in statecraft. Plain folks portrays leaders as ordinary people to build relatability and trust, masking elite agendas; American politicians in the 20th century, including Franklin D. Roosevelt, used radio "fireside chats" starting in 1933 to project approachable personas amid economic crisis.
Card stacking selectively presents facts while omitting counterevidence, creating a skewed picture; tobacco industry campaigns in the mid-20th century highlighted isolated studies on cigarette mildness to downplay health risks, influencing public perception until epidemiological data from the 1950s exposed the deception. Bandwagon exploits conformity by implying widespread support, urging individuals to join the "winning side"; this was evident in Cold War-era McCarthyist rhetoric claiming inevitable communist takeover unless opposed en masse, amplifying fears documented in congressional hearings from 1950 to 1954. Narrative devices in propaganda construct overarching stories that simplify complex realities into digestible, emotionally resonant plots, often employing binary oppositions of protagonists versus antagonists to foster group cohesion. Demonization narratives frame adversaries as existential threats embodying pure evil, as in Imperial Japanese propaganda during World War II depicting Americans as barbaric "devils" to justify aggression, a tactic analyzed in postwar declassified materials revealing its role in sustaining troop morale. Hero-villain archetypes glorify in-group figures while vilifying out-groups, as in Bolshevik narratives after the 1917 Revolution portraying Lenin as a savior against "bourgeois villains", which omitted internal famines like the 1921–1922 Volga crisis that killed over 5 million. Framing techniques selectively emphasize attributes to shape interpretation, such as portraying economic policies as "rescue missions" during crises while ignoring causal failures; this was critiqued in analyses of interwar fascist media, where recovery claims under Mussolini ignored persistent unemployment rates exceeding 20% by 1939. Repetition reinforces narratives through redundancy, embedding them subconsciously; Goebbels' principle that a lie repeated often enough becomes truth underpinned Nazi radio broadcasts from 1933 onward, which aired anti-Semitic tropes daily to normalize them among the populace.
These devices, rooted in universal cognitive biases toward coherent stories, enable propagandists to engineer consent by prioritizing narratives that align with power interests over verifiable data, as empirical studies in social psychology have since corroborated through experiments on persuasion susceptibility.

Technological and Media Strategies

The advent of mass communication technologies has amplified the reach and precision of propaganda by enabling rapid, scalable dissemination of targeted messages to large audiences. The printing press, invented by Johannes Gutenberg around 1440, marked an early technological milestone, allowing the inexpensive production of pamphlets and books that spread ideological narratives, as during the Protestant Reformation, when Martin Luther's writings reached broad European readerships within months. This shift from manuscript copying to mechanized printing reduced costs and barriers, facilitating state and religious authorities' control over information flows while also enabling dissident voices to challenge orthodoxies through vernacular translations. Broadcast media, particularly radio, revolutionized propaganda during the 20th century by providing one-to-many communication that bypassed literacy requirements and penetrated private homes. In Nazi Germany, Goebbels' Propaganda Ministry, established in 1933, centralized radio control, equipping 70–80% of households with affordable "people's receivers" (Volksempfänger) by 1939 to broadcast speeches and ideological content, fostering national unity and demonizing enemies in real time during events like the 1936 Olympics. Similarly, Allied forces employed radio for morale-boosting broadcasts and psychological operations, such as the BBC's wartime programming, which reached millions across occupied Europe. Film complemented radio's auditory focus with visual symbolism; Leni Riefenstahl's Triumph of the Will (1935) used innovative cinematography to glorify the Nazi regime, screening to over 10 million Germans and influencing propaganda filmmaking worldwide. These media allowed propagandists to synchronize messages across formats, exploiting emotional appeals through combined sound and imagery for greater persuasive impact. In the post-World War II era, television extended these strategies by combining motion pictures with live broadcasting, enabling immersive narratives that shaped public perceptions during conflicts such as the Vietnam War.
State broadcasters such as the Soviet Union's central television aired scripted content promoting collectivism, while Western networks disseminated anti-communist footage; U.S. television viewership reached 90% of households by 1960 for events like the Kennedy–Nixon debates, which highlighted television's role in image-based persuasion. The digital revolution from the 1990s onward introduced algorithmic amplification and micro-targeting, in which platforms such as Facebook and Twitter (now X) use data analytics to tailor content, creating filter bubbles that reinforce biases; a 2021 study documented over 80 countries employing computational propaganda, with bots generating 20–30% of certain political discussions to sway elections. Social media's virality, driven by engagement metrics that favor sensationalism, has enabled state actors such as Russia's Internet Research Agency to deploy troll farms, disseminating 2016 U.S. election interference content viewed by millions, while non-state groups leverage encrypted apps for decentralized coordination. Emerging technologies such as deepfakes and AI-generated content further refine media strategies by fabricating hyper-realistic audiovisual deceptions; instances like 2023 videos mimicking Ukrainian President Volodymyr Zelensky surrendering, viewed over 10 million times before removal, illustrate the risk of eroded trust in visual evidence. These tools exploit cognitive heuristics, prioritizing speed over verification, and underscore how technological advances privilege virality and personalization over factual accuracy, often amplifying propaganda in low-gatekeeper environments.
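The engagement-driven ranking described above can be sketched as a toy simulation. The item names and per-impression click probabilities below are assumed purely for illustration, under the hedged premise that sensational content draws more clicks than sober reporting:

```python
import random

def rank_feed(items, clicks):
    """Order items by accumulated clicks: the core of engagement-based ranking."""
    return sorted(items, key=lambda item: clicks[item], reverse=True)

def simulate(rounds=1000, seed=0):
    rng = random.Random(seed)
    # Assumed per-impression click probabilities (illustrative only).
    click_prob = {"sensational claim": 0.6, "sober report": 0.3}
    clicks = {item: 0 for item in click_prob}
    for _ in range(rounds):
        for item, p in click_prob.items():
            clicks[item] += rng.random() < p  # one simulated impression each
    return rank_feed(list(click_prob), clicks), clicks

ranking, clicks = simulate()
print(ranking)  # the higher click-rate item rises to the top of the feed
```

Because the ranking feeds on its own click counts, content that provokes stronger reactions accumulates position over time, which is the feedback loop that makes engagement metrics a lever for propagandists rather than a neutral popularity measure.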

Categories of Propaganda

Political and Ideological Forms

Political propaganda involves systematic campaigns by governments, parties, or movements to influence public opinion toward specific policies, leaders, or electoral outcomes, often employing biased narratives to mobilize support or discredit opponents. Ideological forms extend this by promoting overarching belief systems, such as racial hierarchies or class-struggle doctrines, framing them as inevitable truths while suppressing contradictory evidence. These efforts typically rely on state-controlled media in authoritarian contexts to achieve saturation, contrasting with more fragmented applications in democracies, where independent outlets limit total dominance.
In Nazi Germany, ideological propaganda under Goebbels' Reich Ministry of Public Enlightenment and Propaganda centralized control over radio, film, and press to inculcate Aryan supremacy and anti-Semitism as core tenets, portraying Jews as existential threats to the Volk through posters, newsreels like Der Ewige Jude, and school curricula revised from 1933 to exclude "degenerate" influences. This apparatus facilitated the regime's shift from electoral gains in 1932, when the Nazis became Germany's largest party, to dictatorial consolidation, with propaganda deceiving the public on events like the staged Gleiwitz incident used to justify invading Poland on September 1, 1939. Academic analyses note that while such propaganda exploited economic despair after the Versailles Treaty, its effectiveness stemmed from repetitive demonization rather than empirical validation, as evidenced by sustained support amid military setbacks after 1943. Fascist Italy under Mussolini similarly harnessed propaganda to forge a mythic national identity, using posters and rallies to exalt the Duce as a bulwark against Bolshevik threats and glorify Roman imperial revival, with media like the Istituto Luce producing newsreels and films that reached millions. Official manifestos in voting stations listed Mussolini atop candidate slates, embedding party loyalty into electoral processes from 1924 onward, while events like the 1932 Exhibition of the Fascist Revolution reinforced ideological continuity from the March on Rome. This approach, blending motifs of antiquity with modern mass media, sustained regime stability until the Allied invasions of 1943, though Western archival sources highlight its role in masking economic stagnation under corporatist policies. Soviet communist propaganda propagated Marxist-Leninist ideology through historical revisionism, such as the 1938 Short Course on the History of the All-Union Communist Party, which airbrushed rivals like Trotsky and reframed events to depict the Bolshevik Revolution as a predestined proletarian triumph, disseminated via posters and Pravda to over 100 million citizens by Stalin's death in 1953.
It emphasized class enemies as saboteurs, justifying purges that executed 681,692 people in 1937–1938 alone, while glorifying the Five-Year Plans despite famines like the 1932–1933 Holodomor, which killed 3–5 million Ukrainians and which propaganda attributed to kulak resistance rather than collectivization failures. After World War II it pivoted to anti-fascist narratives while promoting global revolution, with continuity in Russian state media tactics observed into the 2020s. In democracies, political propaganda often surfaces in wartime mobilization or elections, as with the U.S. Committee on Public Information's 1917–1919 posters urging enlistment against the German "Huns", which reached 20 million people via 3,000 speakers but faced postwar backlash for exaggerated atrocity claims. Modern instances include partisan ads that card-stack facts, yet pluralistic media and fact-checking mitigate totalitarian-style indoctrination, though studies indicate vulnerability to echo chambers in the digital era. Sources from military academies emphasize that while authoritarian regimes integrate propaganda into governance for ideological hegemony, democratic variants prioritize persuasion over coercion, reflecting causal differences in institutional accountability.

"I Want You for U.S. Army" recruiting poster by James Montgomery Flagg

Wartime propaganda encompasses government-led campaigns to mobilize populations, sustain morale, recruit personnel, and delegitimize adversaries during armed conflicts. These efforts often employ posters, films, leaflets, and media broadcasts to foster unity and portray the enemy as a barbaric or existential threat. In World War I, the United States established the Committee on Public Information (CPI) in April 1917 under George Creel to coordinate propaganda, producing over 2,000 titles in posters, pamphlets, and films that emphasized patriotic duty and German atrocities.
The CPI's "Four Minute Men" initiative deployed 75,000 volunteers to deliver short speeches in theaters and public spaces, reaching an estimated 400 million cumulative listeners and contributing to war bond sales exceeding $18 billion. British propaganda similarly amplified reports of German crimes in Belgium, as detailed in the 1915 Bryce Report, which, while based on witness accounts, included unverified claims of bayoneting babies to incite Allied support. During World War II, propaganda intensified with state-controlled apparatuses on both sides. Nazi Germany's Reich Ministry of Public Enlightenment and Propaganda, headed by Joseph Goebbels from March 1933, monopolized media to glorify the regime, demonize Jews and the Allies, and justify expansionism through films like Triumph of the Will (1935) and radio broadcasts reaching millions. The ministry orchestrated the 1933 book burnings and censored dissent, fostering a cult of personality around Hitler that sustained domestic support until late 1944. In response, the U.S. Office of War Information (OWI), created in June 1942, disseminated posters such as "Rosie the Riveter" imagery to encourage women's workforce participation, which helped boost female employment from 12 million in 1940 to 18 million by 1944, while films and cartoons depicted the Axis powers as monstrous aggressors. Allied campaigns also included leaflet drops over enemy territories, with the U.S. distributing over 6 billion leaflets in the European theater alone to undermine morale and promote surrender. In later conflicts, propaganda adapted to television and evolving media landscapes. During the Vietnam War (1955–1975), North Vietnamese forces used posters and radio to frame the U.S. as imperial invaders, portraying downed American aircraft as victories to rally domestic support and international sympathy. U.S. efforts, including over 20 billion leaflets dropped via psychological operations, aimed to induce defections but largely failed amid graphic media coverage of events like the Tet Offensive in January 1968, which shifted American public opinion against the war despite tactical U.S. successes. In the 2003 Iraq War, U.S.
administration claims of weapons of mass destruction and of Saddam Hussein's alleged ties to the 9/11 attacks, echoed uncritically by major media, facilitated initial invasion support but eroded credibility post-invasion when no such stockpiles were found, as confirmed by the Iraq Survey Group's 2004 report. Official framing such as the "war on terror" further shaped perceptions, though insurgent videos on early internet platforms countered official narratives. These cases illustrate propaganda's dual role: short-term mobilization and long-term risk of backlash when discrepancies emerge.

Commercial and Economic

Commercial propaganda adapts systematic persuasion methods, originally refined during World War I, to commercial ends, fostering consumerism by linking products to emotional desires, social aspirations, and cultural narratives rather than mere utility. This approach treats consumers as malleable audiences whose behaviors can be directed toward consumption, often prioritizing emotional appeal over factual product attributes. Edward Bernays, leveraging insights from his uncle Sigmund Freud's theories of the unconscious, pioneered these tactics in the 1920s by reorienting wartime propaganda toward private enterprise. In his 1928 book Propaganda, Bernays argued that an "invisible government" of experts must organize public opinion to avert chaos, explicitly applying propaganda methods to boost sales for industries ranging from tobacco to appliances. A landmark example was his breakfast campaign for Beech-Nut Packing, in which he commissioned a survey of 5,000 physicians endorsing a hearty breakfast as health-promoting, resulting in widespread media adoption of "bacon and eggs" as the ideal meal and a surge in bacon sales. Similarly, the 1929 "Torches of Freedom" effort for the American Tobacco Company staged a march of hired women smoking cigarettes during New York's Easter parade, framing the act as a symbol of female emancipation and normalizing female consumption; the share of women among smokers rose from 5% in 1924 to 12% by 1929. Common techniques mirror political propaganda: bandwagon appeals urge purchases by implying universal participation, as in ads claiming "everyone's switching to [brand]"; testimonials deploy celebrities or experts for endorsement, as in athlete-backed energy drinks; and transfer associates products with aspirational values, such as luxury cars evoking freedom or prestige. These methods, while effective in driving revenue—U.S. advertising spending rose from $1.3 billion in 1920 to $3.4 billion by 1930—have drawn scrutiny for cultivating artificial needs and debt-driven economies, with Bernays himself acknowledging the deliberate creation of demand to sustain growth.
Economic propaganda, distinct yet overlapping, deploys similar tools on behalf of states or institutions to legitimize policies, obscure failures, or rally support for resource distribution amid scarcity or ideological mobilization. In the Soviet Union, Joseph Stalin's First Five-Year Plan (1928–1932) was propagandized via posters, films, and rallies depicting steel mills and collective farms as engines of proletarian triumph, with slogans like "Fulfill the Five-Year Plan in Four!" mobilizing labor quotas under threat of punishment. Official claims touted industrial output growth—steel production jumped from 4 million tons in 1928 to 5.9 million in 1932—but concealed inefficiencies, forced collectivization, and the 1932-1933 famine that killed millions, using metrics selectively to project socialist superiority. Such campaigns sustained regime control by equating economic sacrifice with ideological destiny, influencing subsequent plans through 1941. In the United States, the Roosevelt administration's New Deal (1933–1939) employed posters and radio broadcasts to portray programs like the National Recovery Administration as collective salvation from the Great Depression, with symbols of unity and recovery encouraging compliance despite mixed empirical outcomes, such as temporary unemployment spikes attributed to NRA codes. During World War II, Treasury Department efforts sold $185 billion in war bonds via celebrity drives and ads framing purchases as economic patriotism, while rationing campaigns justified shortages by emphasizing shared burden, reaching 85 million bond participants by 1945. These instances highlight economic propaganda's role in aligning public action with policy imperatives, often amplifying successes while downplaying causal trade-offs like inflation or postwar debt.

Religious and Cultural

Religious propaganda refers to organized efforts by religious authorities to disseminate doctrines, inspire devotion, and expand influence through persuasive narratives, symbols, and media. The term "propaganda" originated with the Roman Catholic Church's Sacra Congregatio de Propaganda Fide, founded on June 22, 1622, by Pope Gregory XV through the bull Inscrutabili Divinae Providentiae, to oversee missionary propagation amid the Protestant Reformation and European colonial ventures. This congregation standardized training at the Urban College, funded expeditions, and produced vernacular texts, contributing to Catholicism's growth in regions like Latin America, where millions had been baptized by 1700, often blending evangelization with colonial administration. Early Christianity employed similar tactics; the Apostle Paul's epistles, circulated from approximately 50-60 CE, adapted Jewish messianic claims to Gentile contexts, using rhetoric to counter Roman paganism and foster communities across the empire. In Islam, da'wah—the call to faith—has functioned as a core propagation strategy since the religion's founding, with Muhammad's Meccan preaching (610-622 CE) emphasizing monotheism through public recitation, and later expansion via conquests that integrated persuasion with territorial control. By the Umayyad Caliphate (661-750 CE), da'wah incorporated administrative policies favoring converts, such as tax incentives, leading to rapid demographic shifts in the Middle East, where non-Muslim populations declined from majorities to minorities over centuries. Modern Islamist groups, including those affiliated with the Muslim Brotherhood (founded 1928), have digitized da'wah via online media, reaching billions while framing it as defensive against Western cultural influence, though critics note its selective emphasis on appealing verses over doctrinal rigor. Cultural propaganda promotes or defends shared identities, norms, and aesthetics to foster cohesion or superiority, often intersecting with religious elements.
During British colonial rule in India (1858-1947), officials deployed photography and exhibitions to portray indigenous customs as primitive, justifying "civilizing" interventions; for instance, images of sati and other practices, captured after the 1857 rebellion, were exhibited in Britain to garner public support for empire, with selective framing that ignored indigenous reform movements. In Nazi Germany (1933-1945), Goebbels' propaganda ministry synchronized culture via the Reich Chamber of Culture, purging over 16,000 "degenerate" artworks for the 1937 exhibitions while glorifying Nordic myths in films like Triumph of the Will (1935), which drew 500,000 viewers, to instill racial purity as cultural destiny. Post-World War II, U.S. cultural exports—Hollywood films averaging 200 annual releases by the 1950s—projected democratic individualism abroad, influencing global tastes but drawing criticism for eroding local traditions, as evidenced by European quotas limiting American imports to counter "Coca-Colonization." Such efforts highlight propaganda's dual role in preservation, as seen in indigenous resistance media, and imposition, where dominant narratives marginalize alternatives through institutional control.

State-Sponsored and Institutional

State-sponsored propaganda refers to efforts by governments to systematically produce and distribute information aimed at shaping public opinion in favor of official policies, ideologies, or wartime objectives, often through dedicated ministries or agencies. In Nazi Germany, the Ministry of Public Enlightenment and Propaganda, established in 1933 under Joseph Goebbels, centralized control over media, arts, and public communications to promote Aryan supremacy and anti-Semitism, achieving near-total domination of information flow by 1939. Similarly, during World War II, the U.S. government created the Office of War Information in 1942 to coordinate propaganda campaigns, producing posters, films, and radio broadcasts that mobilized public support for the war effort, with over 200,000 posters distributed to encourage enlistment and resource conservation. Institutional propaganda extends to state-controlled media outlets and educational systems designed to indoctrinate populations. In authoritarian regimes such as North Korea, government directives integrate propaganda into schooling, with posters and curricula emphasizing loyalty to the ruling family and anti-Western narratives, fostering generational adherence to state ideology. China's Xinhua News Agency, employing over 8,000 staff and operating 105 branches worldwide as of 2005, functions as the primary conduit for official narratives, blending news with ideological messaging to project Beijing's global influence while suppressing dissenting views domestically. Russia's RT (formerly Russia Today), state-funded since its 2005 launch, broadcasts content challenging Western narratives, reaching millions internationally through multilingual platforms. Contemporary state-sponsored efforts increasingly incorporate digital tools, with at least 62 countries employing government agencies for computational propaganda as of 2021, including automated accounts that amplify official positions.
In China, state media like CGTN extend this through strategies promoting governance models that portray authoritarian efficiency as superior to democratic alternatives, targeting global audiences amid U.S.-China tensions. These institutional mechanisms often operate under the guise of journalism, but their alignment with state directives raises questions of credibility, particularly when Western analyses highlight adversarial propaganda while domestic efforts, such as historical U.S. wartime campaigns, are retrospectively framed as patriotic information campaigns rather than equivalent manipulation. Empirical studies indicate that such propaganda's effectiveness depends on audience predispositions and repetition, underscoring the causal role of institutional monopoly on information in sustaining regime legitimacy.

Theoretical Frameworks

Models from Social Psychology

Social psychology examines propaganda through models that highlight mechanisms of influence on individual cognition, group dynamics, and attitude formation. These frameworks reveal how propaganda exploits innate tendencies toward conformity, obedience, and identity-based biases to shape beliefs and behaviors without requiring rational scrutiny. Empirical studies, such as those on conformity and authority, demonstrate that ordinary individuals can adopt propagated views under social pressure, often prioritizing group harmony or hierarchical cues over personal judgment.

Conformity and Social Proof. Solomon Asch's 1951 line-judgment experiments illustrated how individuals conform to erroneous group consensus, with about 75% of participants yielding at least once to a unanimous majority even when aware of the inaccuracy. Propaganda leverages this by fabricating perceived support through repeated messaging or staged endorsements, creating an illusion of normative behavior that pressures dissenters to align. Robert Cialdini's principle of social proof, derived from observational studies, posits that people look to others' actions in ambiguous situations to guide their own—a tendency evident in propaganda campaigns that amplify testimonials or crowd simulations to imply widespread acceptance. For instance, wartime posters depicting unified public enthusiasm exploit this to foster compliance.

Obedience to Authority. Stanley Milgram's 1961-1962 obedience studies found that 65% of participants administered what they believed were dangerous electric shocks to a learner when instructed by an experimenter in a white lab coat, underscoring the potency of perceived authority in overriding inhibitions. In propaganda contexts, this model explains adherence to directives from leaders or institutions portrayed as legitimate experts, where cues like uniforms, titles, or official insignia reduce personal responsibility.
Theoretical extensions link authority propagation to evolutionary adaptations for hierarchical coordination, enabling rapid belief shifts in populations via top-down inculcation.

Cognitive Dissonance. Leon Festinger's 1957 theory describes the psychological discomfort of holding conflicting cognitions, which prompts individuals to resolve it by altering beliefs or rationalizing actions. Propaganda induces dissonance by juxtaposing new narratives against existing views—such as portraying out-groups as threats—motivating acceptance to restore consistency, particularly when commitment to initial actions (e.g., public endorsements) entrenches the shift. Empirical applications show this in disinformation campaigns, where repeated exposure amplifies selective reinforcement, biasing information processing toward propagated ideologies.

Social Identity Theory. Henri Tajfel and John Turner's 1979 framework argues that self-concept derives from group memberships, fostering in-group favoritism and out-group discrimination via minimal cues alone, as shown in Tajfel's 1970s experiments where arbitrary groupings led to biased reward allocations. Propaganda amplifies this by emphasizing collective identities (e.g., national or ideological) to heighten perceived intergroup threats, justifying hostility or exclusion; studies confirm stronger effects under perceived threat, where identity-affirming messages solidify loyalty. This model underscores propaganda's role in sustaining divisions, as individuals derogate contrary evidence to protect group-derived esteem.

Sociological and Educational Theories

Sociological theories frame propaganda as an embedded mechanism for maintaining social cohesion and control in mass societies. Jacques Ellul's 1965 analysis posits propaganda not merely as deliberate persuasion but as a pervasive sociological process in technological civilizations, in which individuals are continuously integrated into collective attitudes through "pre-propaganda" mechanisms like education, media, and group affiliations. This horizontal propaganda operates subtly, fostering conformity by aligning personal needs with societal norms, distinct from vertical political directives that impose top-down ideology. Ellul argued that modern efficiency demands total propaganda, rendering it inevitable and inescapable, as it exploits the individual's isolation in urban, industrialized settings to manufacture unanimous public opinion. Harold Lasswell's foundational work in the 1920s and 1930s examined propaganda as a tool of elite influence over mass behavior, emphasizing the strategic dissemination of symbols to mobilize support during conflicts. In Propaganda Technique in the World War (1927), Lasswell documented how belligerents used repetitive messaging across channels to sustain morale and demonize enemies, laying groundwork for viewing propaganda as a rational instrument of power in democratic and authoritarian contexts alike. His communication model—who says what, through which channel, to whom, with what effect—highlights propaganda's causal role in shaping perceptions and actions within stratified societies. Educational theories distinguish propaganda from genuine education by its intent to suppress critical thinking in favor of ideological uniformity. In autocratic systems, state-controlled curricula function as propaganda vectors, embedding ruling narratives to deter dissent and justify authority, as evidenced by models showing that such indoctrination correlates with reduced political opposition and sustained regime stability.
For instance, North Korean schooling integrates propaganda posters and texts promoting leader worship from the earliest grades, conditioning obedience over empirical inquiry. Conversely, democratic educational responses, such as media literacy programs developed since the 1930s, treat propaganda as a teachable subject, training students to evaluate sources and biases through frameworks of public pedagogy that view learning as a battleground for competing narratives. These approaches underscore propaganda's epistemological threat, whereby deliberate falsehoods erode fact-based discourse, prompting curricular reforms that prioritize verification skills amid rising digital manipulation.

Cognitive and Self-Propaganda Mechanisms

Propaganda exploits cognitive biases that shape human judgment and decision-making, notably confirmation bias, whereby individuals preferentially process and retain information aligning with preexisting beliefs, amplifying receptivity to ideologically congruent messages while discounting disconfirming evidence. This bias operates through selective exposure and interpretation: empirical studies show that people gravitate toward sources reinforcing their views, fostering echo chambers that entrench propagandistic claims. Complementing this, the availability heuristic renders vivid or recent propagandistic imagery more persuasive, as repeated exposure elevates perceived plausibility independent of factual accuracy. Emotional mechanisms further underpin cognitive susceptibility, with reliance on affective cues over deliberative reasoning correlating with heightened belief in deceptive narratives; for instance, experimental data indicate that emotion-driven processing increases endorsement of false claims by 20-30% compared to analytical approaches. Cognitive dissonance, triggered when propaganda contradicts held convictions, prompts rationalization or selective reinterpretation to alleviate discomfort, as individuals adjust attitudes to align with authoritative or group-endorsed messages. Group dynamics exacerbate these effects via conformity and groupthink, where social pressures lead to uncritical acceptance of collective narratives, as modeled in conformity experiments. Self-propaganda manifests through internalized processes in which individuals actively construct arguments supporting external propaganda, thereby deepening personal commitment; field experiments at deliberative forums reveal that self-articulation of positions boosts perceived factual and moral validity by 15-25%, simulating voluntary endorsement. This self-persuasion hinges on effort justification, with greater anticipated cognitive investment yielding stronger attitudinal shifts, as demonstrated in controlled studies varying effort assumptions.
Recursive reasoning contributes by equating narrative coherence with truth, enabling absurd or ideologically extreme claims to gain traction through iterative self-reinforcement, particularly in isolated informational environments. Such mechanisms sustain long-term adherence, as habitual rumination on aligned content overrides metacognitive scrutiny, consistent with psychological models of motivated reasoning.

Empirical Applications and Examples

Major Historical Case Studies

One prominent historical case study involves the efforts of the United States during World War I, when the Committee on Public Information, established on April 13, 1917, under George Creel, produced over 20 million posters, 75 million pamphlets, and thousands of films to mobilize public support for the war. These materials emphasized enlistment, bond purchases, and conservation, with iconic posters like James Montgomery Flagg's "I Want You" depicting Uncle Sam directly addressing viewers to boost recruitment, contributing to the mobilization of over 4 million American troops by 1918. The campaign also fostered anti-German sentiment, leading to suppression of German-language publications and cultural elements, as evidenced by the closure of over 500 German-language newspapers and the renaming of sauerkraut as "liberty cabbage." In Nazi Germany, propaganda was centralized under the Reich Ministry of Public Enlightenment and Propaganda, created on March 13, 1933, and led by Joseph Goebbels, who controlled media, film, radio, and the arts to promote Aryan supremacy and anti-Semitism. Films like The Eternal Jew (1940) and posters depicted Jews as vermin or economic parasites, facilitating the Nuremberg Laws of September 15, 1935, and escalating to the Holocaust, in which propaganda justified the deportation and extermination of 6 million Jews by portraying them as threats to national purity. This apparatus reached broad audiences via mandatory radio ownership initiatives, with 70% of households equipped with receivers by 1939, sustaining support for the regime late into the war despite military setbacks. Soviet propaganda under Joseph Stalin exemplified state control through the agitation and propaganda (agitprop) department of the Communist Party, which from the 1920s onward used posters, newspapers like Pravda, and films to glorify collectivization and industrialization, such as the Five-Year Plans starting in 1928 that claimed to transform the USSR into an industrial power, though at the cost of millions of lives in the famine of 1932-1933.
During World War II, after the German invasion of June 22, 1941, Soviet propaganda shifted to nationalism, producing over 200,000 posters depicting the "Great Patriotic War" and Stalin as a defender, which helped mobilize 34 million Soviet soldiers and maintain civilian resolve amid 27 million deaths. Postwar, it falsified history, for example rewriting the 1939 Molotov-Ribbentrop Pact to emphasize Soviet victimhood. Allied propaganda in World War II, particularly by the Office of War Information formed in June 1942, utilized emotional appeals in posters to promote war bond sales totaling $185 billion and rationing compliance, with designs focusing on fear of Axis brutality rather than abstract ideals for greater impact. British efforts, including BBC broadcasts and leaflets dropped over Germany, aimed to undermine morale, with studies indicating limited but measurable effects on desertions in occupied Europe. These campaigns contrasted with Axis efforts by emphasizing democratic values and unity, contributing to sustained home-front production that outpaced the enemy's, as U.S. industrial output rose 96% from 1941 to 1945.

Contemporary Instances in Media and Politics

In the digital era, propaganda in media and politics has proliferated through algorithms and state-sponsored campaigns, enabling rapid dissemination of tailored narratives to influence public opinion and electoral outcomes. During the 2020 U.S. presidential election, false claims of widespread voter fraud propagated by President Donald Trump and his supporters were amplified across platforms, contributing to the January 6, 2021, Capitol riot, though subsequent investigations found no evidence of fraud sufficient to alter the results. Mainstream outlets, often characterized by left-leaning institutional biases, framed these claims uniformly as disinformation, potentially suppressing debate on verifiable irregularities like ballot harvesting in states where over 1 million mail-in ballots were processed amid chain-of-custody concerns raised in legal filings. The 2024 U.S. presidential election saw disinformation further define narratives, with foreign actors such as Russia and Iran deploying bots and fake accounts to exacerbate divisions on issues such as immigration and economic policy, reaching millions via platforms like X. A Stanford study revealed that partisan loyalty overrides factual accuracy, with both Democrats and Republicans accepting misleading information aligning with their views—e.g., conservatives endorsing unverified interference claims, while liberals dismissed documented border security data as exaggerated. This echoes patterns in media coverage, where major cable outlets, including MSNBC, allocated 90% negative airtime to Trump in 2024, per media-watchdog analysis, fostering perceptions of coordinated anti-conservative propaganda rather than objective reporting. In international conflicts, Russia's invasion of Ukraine on February 24, 2022, prompted extensive state propaganda via outlets like RT and Sputnik, reframing the operation as a "special military operation" and denying atrocities such as the Bucha massacre, where over 400 civilian bodies were documented by investigators and eyewitness accounts.
Western media, while countering Russian narratives, have been critiqued for selective emphasis on Ukrainian successes—e.g., underreporting Russian territorial gains in the east, with Russia controlling roughly 20% of Ukraine by mid-2023—potentially serving alliance-building propaganda amid Western aid totaling some $100 billion by 2024. RAND analysis found Russian extremist content reaching 500 million impressions globally via proxies, manipulating data such as casualty figures to claim Ukrainian losses of 1 million versus official estimates of 500,000 combined. During the COVID-19 pandemic, governments and media propagated unified messaging on measures like mask mandates and vaccination, with the U.S. CDC reporting over 1 million excess deaths, some linked to hesitancy fueled by counter-narratives, though suppression of lab-leak hypotheses—later deemed plausible in 2023 FBI assessments—exemplified institutional alignment over empirical inquiry. Chinese state media disseminated propaganda minimizing questions about the virus's origins, touting the efficacy of lockdowns, and promoting unsubstantiated claims of Western bioweapon development, which garnered billions of views on Chinese platforms and influenced global skepticism toward WHO data. An NIH review linked such misinformation to 20-30% vaccine refusal rates in low-trust populations, underscoring propaganda's role in eroding compliance.

Emerging AI-Driven and Digital Propaganda

The integration of artificial intelligence into propaganda has enabled the rapid generation of synthetic media, including deepfakes, text, and images, allowing actors to disseminate tailored disinformation at unprecedented scale and low cost. Tools like generative adversarial networks and large language models facilitate the creation of convincing content that mimics real events or personas, often evading initial detection by human reviewers. For instance, benchmarks indicate that AI-driven "fake news" sites proliferated tenfold between 2023 and 2024, flooding online ecosystems with algorithmically optimized misinformation. State and non-state actors exploit these capabilities to amplify influence operations, with Russia's government-directed campaigns employing AI to produce election-related content targeting Western democracies as early as 2024. Similarly, Iranian and Chinese entities have leveraged generative AI, such as Google's Gemini, to accelerate narrative dissemination, though empirical assessments reveal limited behavioral sway compared to traditional methods. In the 2024 U.S. presidential election, AI-generated visuals emerged as a vector for partisan messaging, with Donald Trump posting at least 19 such images or videos on social media to rally supporters and critique opponents, including depictions of fabricated scenarios like immigrants invading suburbs. Deepfake audio and video, hyped as an existential threat, appeared in scattered instances—such as a robocall mimicking President Biden's voice urging voters to abstain—but analyses of 78 election-related deepfakes found they were no more persuasive than conventional misinformation, with detection improving via forensic tools and public skepticism. Foreign malign influence compounded this, as U.S. intelligence reported Russia, Iran, and China deploying AI to generate divisive content, including synthetic endorsements and fabricated scandals, though AI providers such as OpenAI disrupted several state-affiliated attempts by revoking access to their models.
Digital platforms exacerbate AI-driven propaganda through algorithmic amplification, in which recommendation systems prioritize engaging—often polarizing—content, creating echo chambers that reinforce preconceptions rather than convert skeptics. Chinese state-aligned firms, such as GoLaxy, have pioneered AI-powered multilingual propaganda bots that simulate grassroots discourse on social media, targeting diasporas and international audiences with narratives aligned to Beijing's interests. In authoritarian regimes, governments influence AI models by requiring alignment with state narratives, raising concerns about embedded propaganda, though no direct evidence links training on government data to widespread skewing of propaganda influence in 2025-2026. Experts anticipate increased use of AI for disinformation and propaganda by state actors as capabilities advance. Despite these advances, causal evaluations underscore that AI's propaganda efficacy hinges on audience priors: synthetic media reinforces biases but rarely shifts entrenched views, as evidenced by post-election studies showing minimal vote impact from deepfakes in contests like Slovakia's 2023 ballot. Countermeasures, including watermarking standards and AI-detection classifiers, are proliferating, yet they lag behind the evolution of generative tools, posing ongoing risks to informational integrity in hybrid analog-digital environments.
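The amplification loop described above can be sketched with a toy simulation (a hypothetical model for illustration, not any platform's actual ranking system): an invented `engagement_score` rewards both alignment with a user's current leaning and sheer intensity, and repeated exposure pulls the user toward one-sided content even though the underlying post pool is balanced.

```python
import random

def engagement_score(post_leaning, user_leaning):
    # Hypothetical engagement proxy: alignment with the user's current
    # leaning (in [0, 1]) plus the post's intensity (in [0, 1]).
    alignment = 1.0 - abs(post_leaning - user_leaning) / 2.0
    intensity = abs(post_leaning)
    return alignment + intensity

def rank_feed(posts, user_leaning, k=5):
    # Pure engagement ranking: surface the k highest-scoring posts.
    return sorted(posts, key=lambda p: engagement_score(p, user_leaning),
                  reverse=True)[:k]

random.seed(1)
posts = [random.uniform(-1, 1) for _ in range(200)]  # leanings; 0 = neutral

user = 0.2  # a mildly leaning user
for _ in range(10):
    feed = rank_feed(posts, user)
    # The user's leaning drifts toward the average of what the feed shows.
    user = 0.8 * user + 0.2 * (sum(feed) / len(feed))

print(round(user, 2))  # ends far from the neutral 0 that the pool averages
```

Even this crude model exhibits the feedback loop: ranking by engagement alone selects intense, congruent posts, so the user's exposure diet is far more extreme than the balanced pool, and their leaning drifts accordingly.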

Societal Perceptions and Debates

Contested Legitimacy and Bias Claims

The term "propaganda" carries contested legitimacy due to its evolving and ambiguous definitions, originally denoting the neutral propagation of faith by the Catholic Church's Congregatio de Propaganda Fide, established in 1622, but increasingly connoting deliberate manipulation since the early 20th century. Scholars debate whether propaganda encompasses all organized persuasion or requires intent to deceive, with some arguing its inherent power renders it illegitimate in open societies, while others view such distinctions as subjective and ideologically driven. This definitional fluidity allows actors to label disfavored communications as propaganda while exempting aligned efforts, undermining claims of objective legitimacy. Bias accusations frequently frame media and institutional outputs as propagandistic, with empirical patterns showing partisan asymmetry: political conservatives issue such claims more often than liberals, correlating with documented disparities in news coverage favoring left-leaning perspectives on contested issues. For instance, one analysis of 2024 data found that extreme partisan views and one-sided media consumption predict biased perceptions, yet objective metrics reveal mainstream outlets underrepresenting conservative viewpoints relative to their share of public opinion. Public trust metrics reinforce these contests, as Gallup polls indicate only 31% of Americans held a "great deal" or "fair amount" of confidence in mass media in 2024, with Republicans at 14% versus Democrats at 54%, reflecting perceptions of systemic institutional bias rather than mere ideological disagreement. In academic and journalistic institutions, claims of propaganda arise from evidence of overrepresentation of left-leaning viewpoints, with surveys showing faculty political donations skewing 96% Democratic in the social sciences in recent cycles, potentially causal in shaping narratives presented as neutral scholarship. These biases manifest in selective emphasis or omission, as seen in coverage of events like the 2020 U.S.
election disputes, where outlets accused of right-wing propaganda faced counter-claims of left-driven suppression. Such mutual delegitimization highlights how propaganda labels often prioritize self-interest over empirical verification, with the higher frequency of accusations from ideologically marginalized groups potentially signaling genuine distortions rather than symmetric equivalence.

Cross-Ideological Accusations

Accusations of propaganda frequently traverse ideological divides, as partisans on both the left and right attribute manipulative intent to opponents' communications, often framing them as deliberate distortions to advance agendas. Studies indicate that both major U.S. political affiliations routinely accuse the opposing party of conspiratorial behavior, including spreading propaganda, which reinforces mutual distrust and contributes to affective polarization. For instance, conservatives have long charged mainstream outlets with left-leaning bias, labeling coverage of contested social issues and elections as propagandistic efforts to undermine traditional values and institutions; a 2024 analysis of online discourse found that claims of "leftist propaganda" predominate in such accusations, though they do not span a broad ideological spectrum among accusers. Conversely, liberals and left-leaning commentators accuse conservative media and figures of deploying propaganda to stoke division, such as portraying right-wing narratives on election fraud or cultural issues as echoes of foreign disinformation tactics. Notable examples include Democratic-aligned critics labeling former President Trump's rhetoric and his appointees' statements as parroting Russian propaganda, as seen in 2024 objections to Tulsi Gabbard's nomination to an intelligence role for allegedly promoting narratives aligned with adversarial states. This bidirectional pattern extends to entertainment and education, where right-leaning voices decry left-influenced content in television and schools as indoctrination—evident in critiques of children's programming for embedding progressive ideologies—while left-leaning sources counter that conservative outlets amplified misinformation on topics like vaccines during the COVID-19 pandemic.
Such cross-accusations are amplified in digital echo chambers, where partisan sharing of "fake news" claims correlates with ideological affiliation, yet empirical reviews reveal asymmetry in vulnerability: right-leaning users show higher rates of sharing misleading content, though each side perceives the other's information ecosystem as propagandistic. This dynamic not only mirrors historical propaganda rivalries but also sustains polarization, as each side's claims of victimhood at the hands of the other's tactics discourage cross-ideological dialogue and bolster in-group cohesion. Mainstream academic and media analyses, often from left-leaning institutions, tend to emphasize right-wing propaganda risks while downplaying equivalent left-wing efforts, raising credibility concerns in source selection for these debates.

Resistance and Counter-Propaganda

Resistance to propaganda encompasses both individual psychological mechanisms that mitigate susceptibility and organized societal efforts to debunk or neutralize propagandistic messaging. Empirical studies identify cognitive factors such as prior knowledge, analytical thinking, and emotional regulation as key barriers to persuasion by misleading narratives. For instance, individuals with higher cognitive reflection tendencies are less prone to endorsing false claims, as they engage in effortful scrutiny rather than passive acceptance. Social influences, including exposure to diverse viewpoints, further bolster resistance by fostering skepticism toward uniform echo chambers. Inoculation theory, developed in the 1960s and validated through decades of experimentation, provides a structured approach to building attitudinal resistance by preemptively exposing individuals to weakened forms of propagandistic arguments, enabling them to generate refutations. This "vaccination" analogy has demonstrated efficacy in reducing susceptibility to conspiracy theories, such as those surrounding the 9/11 attacks, where inoculated participants showed sustained motivational defenses against subsequent exposure. Meta-analyses confirm inoculation's robustness across domains, outperforming post-hoc corrections by activating threat recognition and counterarguing prior to full confrontation. Recent applications, including social media campaigns, have scaled this method to confer resilience against misinformation tactics like discrediting sources or false dichotomies, with prebunking videos increasing resistance by up to 20% in controlled trials. Counter-propaganda involves deliberate state or non-state initiatives to expose and dismantle adversarial messaging, often through revelation of origins, factual rebuttals, or amplification of alternative narratives.
Historically, the U.S. Active Measures Working Group in the 1980s systematically debunked Soviet disinformation campaigns, such as fabricated atrocity claims, by disseminating evidence of their orchestration to targeted audiences abroad. During World War II, Allied psychological operations countered Axis propaganda by airdropping leaflets that highlighted inconsistencies in Nazi claims, such as exaggerated military successes, thereby eroding enemy morale and civilian compliance. Similar tactics were employed against Islamic State propaganda in the 2010s, combining content takedowns with counter-narratives emphasizing ideological contradictions, though cyber disruptions proved more immediately disruptive than persuasive rebuttals. Fact-checking represents a common modern counter-strategy, verifying claims against documentary evidence to undermine propagandistic assertions; however, its impact is limited, primarily enhancing factual recall without consistently altering deeply held beliefs or attitudes. Studies indicate that while fact-checks correct specific inaccuracies, they can trigger backfire effects among audiences ideologically aligned with the original message, particularly when checkers are perceived as biased—a concern substantiated by analyses revealing selective scrutiny in prominent fact-checking outlets. In contrast, inoculation and media literacy programs, which train recognition of manipulative techniques rather than disputing content, yield more durable resistance, as evidenced by reduced polarization in experimental groups exposed to propaganda simulations. Overall, effective counter-propaganda prioritizes preemption over reactive correction, aligning with causal pathways in which early intervention disrupts belief formation more reliably than ex post interventions.

Differentiations from Adjacent Concepts

Propaganda vs. Disinformation and Misinformation

Propaganda involves the deliberate and systematic dissemination of information—facts, arguments, rumors, half-truths, or lies—to advance a specific political, ideological, or organizational agenda, often by state or institutional actors. Unlike mere persuasion, it employs techniques such as selective emphasis, emotional appeals, and repetition to shape public attitudes or behaviors in alignment with the propagator's interests. This distinguishes it from neutral information-sharing, as propaganda prioritizes persuasion over comprehensive truth, though it may incorporate verifiable facts when they serve the agenda. In contrast, misinformation refers to false or inaccurate information circulated without deliberate intent to deceive, often resulting from errors, misunderstandings, or careless sharing. For instance, an individual might unwittingly spread outdated statistics due to reliance on unverified sources, lacking awareness of their inaccuracy. Disinformation, however, entails the intentional creation and distribution of fabricated or manipulated falsehoods to mislead audiences, typically for strategic gains like sowing discord or undermining trust. The core differentiator here is intent: disinformation requires purposeful deception, as seen in coordinated campaigns fabricating events, whereas misinformation arises from error or negligence. Key variances emerge in veracity, structure, and objectives. Propaganda can be truthful in parts but is inherently biased through omission or framing, aiming to mobilize support rather than merely confuse. Disinformation and misinformation, by definition, involve untruths, but propaganda's organized, agenda-driven nature—often involving media control or mass campaigns—sets it apart from the potentially sporadic spread of dis- and misinformation via social networks. Overlaps exist, as propaganda may incorporate disinformation (e.g., state-sponsored fabrications during wartime), yet not all disinformation qualifies as propaganda without a broader persuasive framework.
Empirical analyses highlight that while misinformation proliferates virally through cognitive biases like confirmation seeking, propaganda leverages institutional resources for sustained influence, as evidenced in historical cases such as state radio broadcasts.
| Aspect | Propaganda | Misinformation | Disinformation |
|---|---|---|---|
| Truth content | Can include facts, but selectively biased | Always false or inaccurate | Deliberately false or misleading |
| Intent | Deliberate, for a specific agenda | None; unintentional error | Deliberate, to deceive and harm |
| Organization | Systematic, often institutional | Sporadic, individual or viral | Coordinated, often covert |
| Examples | Government posters rallying support | Outdated statistics shared by mistake | Fabricated stories to incite panic |
This table summarizes distinctions drawn from communication scholarship, underscoring propaganda's potential legitimacy in some contexts versus the inherent unreliability of dis- and misinformation. Such delineations aid in assessing information ecosystems, where conflating terms risks overlooking propaganda's role in shaping narratives through truthful but partial disclosures.

Relations to Public Relations, Education, and Journalism

Public relations emerged in the early 20th century as a professionalized extension of propaganda, with Edward Bernays, often called the father of PR, explicitly framing it as such in his 1928 book Propaganda, where he described propaganda as an essential tool for managing public opinion through psychological manipulation and media influence. Bernays, a nephew of Sigmund Freud, applied psychoanalytic insights to corporate and government campaigns, such as promoting women's smoking in 1929 by linking it to women's emancipation, demonstrating how PR uses selective facts and emotional appeals akin to wartime propaganda but rebranded for commercial and policy goals. While PR practitioners emphasize two-way communication and mutual understanding—distinguishing it from one-sided propaganda—critics argue this distinction is semantic, as both prioritize persuasion over unfiltered truth, with PR often serving elite interests through staged events and narrative control rather than empirical scrutiny. In education, propaganda manifests through curricula designed to instill ideological conformity, as seen in historical regimes where state-controlled schooling propagated nationalism or state ideology; for instance, Nazi Germany's 1933 curriculum reforms under the Reich Ministry of Education integrated racial ideology into textbooks, reaching 11 million students by 1939 to foster loyalty to Hitler, reinforced via mandatory youth organizations like the Hitler Youth, which enrolled over 7.7 million members by 1939. Similarly, Soviet indoctrination under Stalin from the 1930s emphasized Marxist-Leninist doctrine in schools, with history texts rewritten to glorify the regime, affecting generations through purges of dissenting educators and compulsory ideological training. Even in democracies, educational materials have served propagandistic ends, such as U.S. Cold War-era films from 1945-1965 produced by the government to counter communism, blending factual instruction with anti-Soviet messaging to shape student perceptions without overt labeling as propaganda.
These cases illustrate how education systems, by controlling narratives under the guise of patriotism or civic duty, can prioritize causal engineering of beliefs over critical inquiry, with long-term effects measurable in surveys showing heightened regime support among exposed youth. [Image: Propaganda poster in a North Korean primary school] Journalism intersects with propaganda when reporting systematically favors certain viewpoints, often through source selection and framing that aligns with institutional biases rather than balanced empiricism; the propaganda model, proposed by Edward S. Herman and Noam Chomsky in 1988, posits five filters—ownership, advertising, sourcing, flak, and anti-communism (later adapted to other ideologies)—that produce media content serving dominant power structures, as evidenced by U.S. coverage of Central American conflicts in the 1980s, where elite sources dominated 80-90% of quotes in major outlets like The New York Times. Empirical studies confirm systematic left-leaning bias in Western journalism, with a 2013 analysis of U.S. media finding 28.6% liberal vs. 7.1% conservative opinion pieces in prestige papers, correlating with underreporting of facts challenging progressive narratives. In authoritarian contexts, state media such as North Korea's KCNA function as overt propaganda, fabricating events to sustain regime legitimacy, while in free societies, overlaps arise from advertiser pressures and ideological homogeneity in newsrooms, where 2022 surveys indicated 90% of U.S. journalists identify as Democrats or independents leaning left, leading to selective emphasis that propagandizes by omission. Distinguishing journalism from propaganda requires assessing intent and verifiability: genuine journalism verifies claims against primary data, whereas propaganda subordinates facts to agenda, though blurred lines persist when bias distorts causal reality.

Impacts, Effectiveness, and Measurement

Quantifiable Effects on Behavior and Belief

Empirical research on propaganda's effects, often studied under influence operations or persuasion campaigns, indicates measurable but typically small impacts on beliefs and attitudes, with effects varying by medium, duration, and audience predispositions. A systematic review of 82 studies from 1995 to 2020 found that 41% reported small effects, 16% medium effects, and 5% large effects on outcomes such as political opinions, behaviors, and social norms, with traditional media campaigns showing persistence over periods from days to decades. These findings challenge early minimal-effects theories, such as those from the limited-effects paradigm, by demonstrating causal links through field experiments and natural experiments, though publication bias toward positive results may inflate estimates. In political contexts, propaganda has shifted voting intentions and policy support in controlled settings. For instance, randomized field experiments on voter mobilization showed nonpartisan door-to-door canvassing increased turnout by 8-10 percentage points in U.S. elections, with effects lasting up to eight months, though partisan messaging yielded smaller shifts of 1-2 points due to reinforcement of existing views rather than conversion. Long-term media campaigns, such as radio broadcasts in post-World War I U.S. elections, weakened biracial coalitions and boosted Democratic vote shares by an estimated 2-5% in targeted areas, as evidenced by archival data on propaganda exposure correlating with reduced cross-racial voting. Social media influence operations, like Twitter bots countering hate speech, reduced racist language usage by 10-20% over two months in experimental groups. Health propaganda campaigns provide clearer quantifiable behavioral changes. In one large radio trial in West Africa, messaging on child health increased clinic consultations for under-5s by 35-56% for targeted symptoms following sustained exposure, with effects sustained for months via repeated reinforcement.
Short exposures to misleading health claims, simulating propaganda, altered behaviors in lab settings; for example, articles viewed for under 5 minutes modified unconscious donation preferences toward affected causes by 15-25% in implicit association tests. Meta-analyses of fear-based appeals, common in propaganda, confirm average increases of 0.2-0.3 standard deviations in intentions and behaviors, such as vaccination uptake, when paired with efficacy messages, though effects diminish without audience vulnerability. Limitations persist: effects often fail against strong priors, with meta-analyses of political field experiments showing average zero persuasion for oppositional audiences, and backfiring in 3-5% of cases, such as increased polarization from cross-ideological exposure. The "sleeper effect," where discounted propaganda gains persuasiveness over time, occurs in 20-30% of cases per meta-review, amplifying delayed behavioral shifts by decoupling source credibility from message content. Overall, while propaganda reliably nudges marginal beliefs and actions—e.g., 5-10% shifts in aggregate surveys—large-scale transformations require institutional authority and repetition, not isolated efforts.
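To give the standardized effects reported above an intuitive reading, the standard common-language effect size converts Cohen's d into the probability that a randomly chosen exposed individual responds more strongly than an unexposed one, via Φ(d/√2). This is a minimal sketch using only the standard library; the conversion formula is textbook statistics and the 0.2-0.3 range comes from the meta-analyses cited above, but the pairing is illustrative, not a calculation from those studies' raw data.

```python
from math import erf

def probability_of_superiority(d: float) -> float:
    """Common-language effect size for Cohen's d: the chance that a
    randomly chosen member of the treated group outscores a randomly
    chosen control, assuming normal outcomes with equal variance.
    Uses Phi(d / sqrt(2)) = 0.5 * (1 + erf(d / 2))."""
    return 0.5 * (1 + erf(d / 2))

# The 0.2-0.3 SD range reported for fear-based appeals translates to
# only a modest edge over a coin flip:
for d in (0.2, 0.3):
    print(f"d = {d}: P(superiority) = {probability_of_superiority(d):.3f}")
# prints: d = 0.2: P(superiority) = 0.556
#         d = 0.3: P(superiority) = 0.584
```

This makes the substantive point concrete: "small" standardized effects correspond to roughly a 55-58% chance of moving any given individual, consistent with the article's claim that propaganda nudges margins rather than transforming populations.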

Factors Influencing Success or Failure

The effectiveness of propaganda hinges on audience predispositions, including pre-existing beliefs and cognitive biases that make individuals more receptive to messages aligning with their worldview; empirical studies indicate that propaganda reinforcing existing attitudes succeeds by exploiting intuitive rather than analytical thinking, as people are less likely to scrutinize familiar narratives. Social influences, such as reliance on group norms or authority figures, further amplify success, with research showing that perceived credibility—rooted in trust or expertise—can override factual inaccuracies, as demonstrated in behavioral coevolution models where prestige cues drive compliance even against material self-interest. Conversely, failure occurs when propaganda clashes with deeply held values or evident realities, leading to reactance or dismissal, particularly among audiences with high media literacy or exposure to counter-narratives. Message design plays a pivotal role, with emotional appeals—targeting fear, anger, or hope—proving more potent than rational arguments, as psychological analyses reveal that affective content bypasses critical evaluation and fosters behavioral change through heightened arousal. Repetition enhances perceived truthfulness via the "illusory truth effect," where familiar statements are rated as more credible regardless of veracity, supported by experiments showing increased belief after mere exposure. Simplicity and vivid imagery further boost efficacy, as complex or abstract propaganda demands greater cognitive resources, reducing persuasion under low-motivation conditions; historical cases, such as World War II posters leveraging stark visuals to drive enlistment, illustrate how concise, emotive formats achieved measurable upticks in voluntary participation. Failure ensues from overly nuanced or contradictory messaging, which invites scrutiny and erodes impact, as seen in analyses of state propaganda undermined by internal inconsistencies.
Contextual elements, including media saturation and societal conditions, determine propagation scale; total control over information channels, as in authoritarian regimes, correlates with higher success rates by limiting alternatives, whereas fragmented digital environments dilute effects through competing voices. Crises or economic distress heighten vulnerability, enabling propaganda to frame events favorably—Nazi Germany's exploitation of post-Versailles Treaty resentment, for instance, mobilized support via scapegoating narratives that resonated amid mass unemployment, which peaked near 30% in 1932. In contrast, stable or prosperous contexts foster skepticism, contributing to failures like Iran's military propaganda, which persists despite inefficacy due to audience awareness of discrepancies between claims and outcomes, such as unfulfilled regional dominance promises. Empirical reviews of influence operations underscore that while short-term attitude shifts occur, sustained behavioral change requires alignment with audience motivations, with quantification challenges highlighting overestimation in uncontrolled settings.

Long-Term Societal Consequences

Sustained exposure to propaganda over generations can entrench distorted beliefs, fostering societal polarization that persists beyond the initial campaign. Empirical reviews of influence operations indicate that long-term efforts measurably alter public beliefs and behaviors, such as voting patterns, by reinforcing ideological divides. In democratic contexts, this contributes to affective polarization, where partisan animosity intensifies, eroding interpersonal trust across group lines. Historical cases demonstrate propaganda's capacity to imprint biases that endure for decades. In Nazi Germany, indoctrination through state-controlled media and education correlated with a persistent elevation in anti-Semitic attitudes; individuals born in the 1920s and exposed to intensive propaganda during youth exhibited significantly higher anti-Semitism levels in surveys conducted post-World War II, even after regime collapse. This effect stemmed from the regime's monopolization of information channels, which suppressed counter-narratives and normalized dehumanizing rhetoric, leading to societal complicity in atrocities like the Holocaust. In autocratic settings, prolonged propaganda diminishes public support for democratic reforms by cultivating perceptions of external threats and internal stability under authoritarian rule. Analysis of autocratic propaganda campaigns shows they reduce collective protest inclinations and bolster legitimacy across generations, perpetuating cycles of conformity over critical dissent. Post-regime transitions, such as in former Soviet states, reveal lingering distrust in institutions and media, attributable to decades of state narratives that prioritized ideological purity over empirical reality, hindering economic and social adaptation. These dynamics undermine societal resilience, as populations habituated to manipulated information exhibit lower adaptability to factual challenges, amplifying vulnerability to future manipulations.
Studies link chronic exposure to slanted media—often propagandistic—to generalized declines in media trust and heightened reliance on partisan sources, which in turn sustains echo chambers and impedes consensus on shared facts. Over time, this fosters fragmented public discourse, elevating risks of instability, as polarized societies prove less capable of collective problem-solving.

Moral Critiques and Justifications

Moral critiques of propaganda frequently invoke deontological principles, asserting that its manipulative techniques inherently violate duties to truthfulness and respect for rational autonomy. Immanuel Kant's categorical imperative, which prohibits treating individuals as mere means, condemns propaganda for deploying deception or selective emphasis to engineer consent rather than foster informed judgment. Similarly, Platonic critiques, echoed in modern moralist philosophy, view propaganda as a form of rhetoric aimed at domination over truth-seeking, eroding personal agency and societal freedom by systematically distorting reality. Jacques Ellul further argued that propaganda constitutes a total assault on human personality, substituting conditioned responses for critical thought, which undermines ethical responsibility. Consequentialist objections highlight propaganda's empirical harms, such as fostering division, justifying atrocities, or breeding long-term cynicism toward institutions. For instance, Nazi wartime campaigns, while effective in mobilizing support, contributed to genocide by amplifying ethnic stereotypes, illustrating how unchecked propaganda can cascade into irreversible societal damage. Utilitarian analyses sometimes weigh against it when short-term gains, like boosted morale, yield net losses through distorted decision-making or backlash, as seen in interwar revelations of fabricated atrocity stories eroding public trust in Allied efforts. Justifications for propaganda often adopt a realist or utilitarian stance, portraying it as a morally neutral instrument whose ethics depend on context and ends rather than intrinsic qualities. Harold Lasswell described it as akin to a tool like a "pump handle," analyzable scientifically without presuming immorality, applicable for defensive or unifying purposes in crises.
In wartime, utilitarian defenses argue that even deceptive elements—such as exaggerated threat portrayals—can maximize welfare by sustaining troop morale and civilian resolve, potentially shortening conflicts and averting greater casualties, as evidenced by World War I efforts that maintained home-front support amid attrition. Some ethicists propose conditional criteria for "just" use, including transparency where feasible, proportionality to threats, and avoidance of exploiting vulnerabilities, allowing counter-hegemonic applications against dominant ideologies without descending into pure manipulation. These views acknowledge propaganda's inevitability in mass societies, prioritizing regulated application over outright prohibition to harness its persuasive power for survival or equity.

Regulatory Approaches and Free Speech Tensions

In the United States, the Foreign Agents Registration Act (FARA) of 1938 mandates public disclosure by individuals or entities acting on behalf of foreign principals to influence U.S. policy or public opinion, primarily targeting foreign propaganda dissemination without prohibiting the content itself. Enacted amid concerns over Nazi and communist influence, FARA emphasizes transparency over censorship, requiring agents to label materials and report activities quarterly, with over 700 active registrations reported by the Department of Justice as of 2023. This approach aligns with First Amendment protections, which courts have interpreted to safeguard even deceptive or propagandistic speech absent direct incitement to imminent harm, as affirmed in cases like United States v. Alvarez (2012), where the Supreme Court struck down the Stolen Valor Act for punishing false statements without sufficient justification. The Smith-Mundt Act of 1948 originally barred the U.S. government from disseminating its international broadcasting materials domestically to prevent state propaganda targeting citizens, but the 2012 modernization amendment lifted this restriction, allowing access to programs like Voice of America content within the U.S. Critics argue this enables covert government influence on public discourse, prompting legislative pushes such as H.R. 5704 in 2025 to repeal the modernization and reinstate the ban, citing risks of "apple pie propaganda" that normalizes federal narrative control. Empirical analyses indicate limited abuse post-2013, with domestic viewership remaining low—under 1% of U.S. audiences for such content—but tensions persist over whether disclosure suffices or if outright prohibitions are needed to preserve an independent marketplace of ideas. In the European Union, the Digital Services Act (DSA), enforced from August 2023 for large platforms, imposes obligations on intermediaries to mitigate systemic risks from propaganda and disinformation, including rapid removal of illegal content and algorithmic transparency to curb amplification.
The accompanying 2018 Code of Practice on Disinformation, strengthened in 2022, commits signatories like Meta and Google to demonetizing disinformation and labeling political ads, with 83% of Europeans perceiving disinformation as a democratic threat per EU surveys. However, these measures have sparked free expression concerns, as platforms' overcompliance—such as preemptive content demotion—can chill legitimate debate, evidenced by a 2023 Center for International Media Assistance report documenting censorship in 20+ countries under similar "fake news" laws. Internationally, the International Covenant on Civil and Political Rights (ICCPR) Article 20(1), ratified by 173 states as of 2023, explicitly prohibits "propaganda for war," while international law restricts subversive propaganda aimed at destabilizing governments through violence, though non-violent advocacy for political change remains protected. Enforcement is sporadic, with bodies like the UN Human Rights Committee critiquing broad applications that suppress dissent, as in Russia's 2012 foreign agent law mirroring FARA but extending to domestic NGOs labeled as propagandists. These regulatory frameworks engender profound tensions with free speech principles, rooted in the causal reality that content-based restrictions invite selective enforcement favoring incumbents, as historical precedents like interwar propaganda bans demonstrate evasion via proxies and underground channels. Proponents of regulation invoke harm prevention, citing studies linking unchecked propaganda to polarized beliefs and reduced trust—e.g., a 2022 analysis showing disinformation's role in electoral interference—but skeptics, drawing from first-amendment absolutism, warn of viewpoint discrimination, where defining "propaganda" subjectively erodes the epistemic competition essential for truth discernment. In practice, disclosure regimes like FARA prove less intrusive than outright bans, fostering accountability without preempting speech, though global trends toward platform liability risk privatized censorship, as platforms err toward removal to evade fines exceeding 6% of global revenue under DSA rules.
Empirical evidence from counter-disinformation efforts underscores that transparency and media literacy yield behavioral shifts without legal coercion, contrasting with censorship's potential to amplify biases in enforcement institutions.

Truth-Seeking Alternatives and Debunking

Truth-seeking alternatives to propaganda emphasize empirical verification, logical scrutiny, and decentralized evaluation over centralized narrative control. Critical thinking skills, such as assessing evidence quality, identifying logical fallacies, and cross-referencing multiple independent sources, enable individuals to resist manipulative messaging by prioritizing falsifiable claims testable against observable reality. These methods draw from philosophical traditions like Karl Popper's emphasis on falsifiability and refutation, where hypotheses are rigorously challenged rather than accepted on authority. Unlike propaganda, which often relies on emotional appeals and repetition, truth-seeking fosters epistemic humility—acknowledging uncertainty and updating beliefs based on new data—reducing susceptibility to coordinated influence campaigns. Debunking propaganda involves targeted corrections that provide accurate alternatives while avoiding reinforcement of falsehoods. Empirical studies indicate that simple fact-checks can reduce belief in false claims by 10-20% on average, though effects diminish over time without reinforcement. Prebunking, an application of inoculation theory, proves more proactive: exposing individuals to weakened forms of deceptive arguments beforehand builds cognitive resistance, akin to vaccination, with meta-analyses showing sustained reductions in persuasion by misleading claims up to months later. For instance, online games teaching recognition of manipulation tactics like false dichotomies or emotional exploitation have lowered acceptance of propaganda narratives by alerting users to common rhetorical ploys. However, debunking carries risks, including the "backfire effect," where corrections entrench false beliefs among those with strong prior convictions, particularly if the source lacks perceived credibility. Recent replications find this effect rare and mostly tied to low-reliability measures or worldview conflicts, occurring in fewer than 5% of cases under controlled conditions, but it underscores the need for source-neutral delivery and emphasis on facts over the falsehood itself.
Effective strategies mitigate this by leading with a "truth sandwich"—stating facts first, addressing the falsehood briefly, then reiterating the truth—which preserves accuracy without undue repetition that could foster illusory truth via familiarity. Independent verification tools, such as blockchain-based provenance tracking or crowdsourced fact-checking, offer scalable alternatives, though their adoption remains limited by technical barriers and institutional resistance. Long-term societal resilience requires institutional reforms, like incentivizing transparent sourcing and adversarial collaboration in research, to counter propaganda's entrenchment. While fact-checking organizations provide utility, their outputs must be scrutinized for ideological skew, as studies reveal selective application that amplifies certain narratives over others. Truth-seeking thrives through open platforms where claims compete on evidentiary merit, not authority, yielding higher discernment than top-down debunking alone.
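The belief-updating at the heart of epistemic humility can be modeled minimally with Bayes' rule: credence in a claim is revised by how much more likely the observed evidence is if the claim is true than if it is false. The sketch below uses purely illustrative numbers—the 0.70 prior and the 0.20/0.90 likelihoods for an independent fact-check are assumptions, not values from the cited studies.

```python
def update_belief(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Bayes' rule: posterior probability of hypothesis H after evidence E,
    given P(E|H) and P(E|not H)."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# A claim held at 70% credence is contradicted by a fact-check assumed to
# flag a false claim 90% of the time but a true one only 20% of the time.
belief = 0.70
for _ in range(3):  # three independent contradicting checks
    belief = update_belief(belief, 0.20, 0.90)
    print(round(belief, 3))
# prints 0.341, then 0.103, then 0.025
```

The design point mirrors the section's argument: a single correction only dents a strong prior, while repeated independent evidence erodes it decisively—whereas a propagandistic habit of ignoring likelihood ratios leaves credence wherever repetition put it.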
