Propaganda

Propaganda is communication that is primarily used to influence or persuade an audience in order to further an agenda. It may not be objective, may selectively present facts to encourage a particular synthesis or perception, or may use loaded language to produce an emotional rather than a rational response to the information presented.[1] Propaganda can be found in a wide variety of contexts.[2]
Beginning in the twentieth century, the English term propaganda became associated with a manipulative approach, but historically, propaganda had been a neutral descriptive term of any material that promotes certain opinions or ideologies.[1][3]
A wide range of materials and media are used to convey propaganda messages, and these have changed as new technologies were invented, including paintings, cartoons, posters,[4] pamphlets, films, radio shows, TV shows, and websites. More recently, the digital age has given rise to new ways of disseminating propaganda. In computational propaganda, for example, bots and algorithms are used to manipulate public opinion, such as by creating and spreading fake or biased news on social media or by using chatbots to mimic real people in discussions on social networks.
Etymology
Propaganda is a modern Latin word, the neuter plural gerundive form of propagare, meaning 'to spread' or 'to propagate', thus propaganda means the things which are to be propagated.[5] Originally this word derived from a new administrative body (congregation) of the Catholic Church created in 1622 as part of the Counter-Reformation, called the Congregatio de Propaganda Fide (Congregation for Propagating the Faith), or informally simply Propaganda.[3][6] Its activity was aimed at "propagating" the Catholic faith in non-Catholic countries.[3]
From the 1790s, the term also began to be used to refer to propaganda in secular activities.[3] In English, the word began taking on a pejorative or negative connotation in the mid-19th century, when it was used in the political sphere.[3]
Non-English cognates of propaganda, as well as some similar non-English terms, retain neutral or positive connotations. For example, in official Chinese party discourse, xuanchuan is treated as a more neutral or positive term, though it can be used pejoratively in protests or other informal settings within China.[7][8]: 4–6
Definitions
Historian Arthur Aspinall observed that newspapers were not expected to be independent organs of information when they began to play an important part in political life in the late 1700s, but were assumed to promote the views of their owners or government sponsors.[9] In the 20th century, the term propaganda emerged along with the rise of mass media, including newspapers and radio. As researchers began studying the effects of media, they used suggestion theory to explain how people could be influenced by emotionally resonant persuasive messages. Harold Lasswell provided a broad definition of propaganda: "the expression of opinions or actions carried out deliberately by individuals or groups with a view to influencing the opinions or actions of other individuals or groups for predetermined ends and through psychological manipulations."[10] Garth Jowett and Victoria O'Donnell theorize that propaganda and persuasion are linked, as humans use communication as a form of soft power through the development and cultivation of propaganda materials.[11]
In a 1929 literary debate with Edward Bernays, Everett Dean Martin argues that, "Propaganda is making puppets of us. We are moved by hidden strings which the propagandist manipulates."[12] In the 1920s and 1930s, propaganda was sometimes described as all-powerful. For example, Bernays acknowledged in his book Propaganda that "The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country. We are governed, our minds are molded, our tastes formed, our ideas suggested, largely by men we have never heard of."[13]
NATO's 2011 guidance for military public affairs defines propaganda as "information, ideas, doctrines, or special appeals disseminated to influence the opinion, emotions, attitudes, or behaviour of any specified group in order to benefit the sponsor, either directly or indirectly".[14] More recently, the RAND Corporation coined the term Firehose of Falsehood to describe how modern communication capabilities enable a large number of messages to be broadcast rapidly, repetitively, and continuously over multiple channels (like news and social media) without regard for truth or consistency.
History
Primitive forms of propaganda have been a human activity as far back as reliable recorded evidence exists. The Behistun Inscription (c. 515 BCE), detailing the rise of Darius I to the Persian throne, is viewed by most historians as an early example of propaganda.[15] Another striking example of propaganda in ancient history comes from the last Roman civil wars (44–30 BCE), during which Octavian and Mark Antony accused each other of obscure and degrading origins, cruelty, cowardice, oratorical and literary incompetence, debauchery, luxury, drunkenness and other slanders.[16] This defamation took the form of uituperatio (the Roman rhetorical genre of invective), which was decisive in shaping Roman public opinion at the time. Another early example of propaganda comes from Genghis Khan, who would send some of his men ahead of his army to spread rumors among the enemy. In many cases, his army was actually smaller than his opponents'.[17]
Holy Roman Emperor Maximilian I was the first ruler to utilize the power of the printing press for propaganda – to build his image, to stir up patriotic feelings in the population of his empire (he was the first ruler to use one-sided battle reports – early predecessors of modern newspapers, or neue zeitungen – aimed at the masses[18][19]), and to influence the population of his enemies.[20][21][22] Propaganda during the Reformation, helped by the spread of the printing press throughout Europe, and in particular within Germany, caused new ideas, thoughts, and doctrine to be made available to the public in ways that had never been seen before the 16th century. During the era of the American Revolution, the American colonies had a flourishing network of newspapers and printers who specialized in the topic on behalf of the Patriots (and to a lesser extent on behalf of the Loyalists).[23] Academic Barbara Diggs-Brown argues that the negative connotations of the term "propaganda" are associated with the social and political transformations of the French Revolutionary period (1789–1799) and the first half of the 19th century, when the word started to be used in a nonclerical and political context.[24]

The first large-scale and organised propagation of government propaganda was occasioned by the outbreak of the First World War in 1914. After the defeat of Germany, military officials such as General Erich Ludendorff suggested that British propaganda had been instrumental in their defeat. Adolf Hitler came to echo this view, believing that it had been a primary cause of the collapse of morale and revolts in the German home front and Navy in 1918 (see also: Dolchstoßlegende). In Mein Kampf (1925) Hitler expounded his theory of propaganda, which provided a powerful base for his rise to power in 1933. Historian Robert Ensor explains that "Hitler...puts no limit on what can be done by propaganda; people will believe anything, provided they are told it often enough and emphatically enough, and that contradicters are either silenced or smothered in calumny."[25] This proved true in Germany, where propaganda was backed by the army, which made it difficult for competing propaganda to flow in.[26] Most propaganda in Nazi Germany was produced by the Ministry of Public Enlightenment and Propaganda under Joseph Goebbels, who presented propaganda as a way of reaching the masses; symbols such as justice, liberty and devotion to one's country were pressed into its service.[27] World War II saw continued use of propaganda as a weapon of war, building on the experience of WWI, by Goebbels and the British Political Warfare Executive, as well as the United States Office of War Information.[28]
In the early 20th century, the invention of motion pictures (movies and diafilms) gave propaganda-creators a powerful tool for advancing political and military interests when it came to reaching a broad segment of the population and creating consent or encouraging rejection of the real or imagined enemy. In the years following the October Revolution of 1917, the Soviet government sponsored the Russian film industry with the purpose of making propaganda films (e.g., the 1925 film The Battleship Potemkin glorifies Communist ideals). In WWII, Nazi filmmakers produced highly emotional films to create popular support for occupying the Sudetenland and attacking Poland. The 1930s and 1940s, which saw the rise of totalitarian states and the Second World War, are arguably the "Golden Age of Propaganda". Leni Riefenstahl, a filmmaker working in Nazi Germany, created one of the best-known propaganda movies, Triumph of the Will. In 1942, the propaganda song Niet Molotoff was made in Finland during the Continuation War, making fun of the Red Army's failure in the Winter War, its title referring to the Soviet Minister of Foreign Affairs, Vyacheslav Molotov.[29] In the US, animation became popular, especially for winning over youthful audiences and aiding the U.S. war effort, e.g., Der Fuehrer's Face (1942), which ridicules Hitler and advocates the value of freedom. Some American war films in the early 1940s were designed to create a patriotic mindset and convince viewers that sacrifices needed to be made to defeat the Axis powers.[30] Others were intended to help Americans understand their Allies in general, as in films like Know Your Ally: Britain and Our Greek Allies. Apart from its war films, Hollywood did its part to boost American morale in a film intended to show how stars of stage and screen who remained on the home front were doing their part not just in their labors, but also in their understanding that a variety of peoples worked together against the Axis menace: Stage Door Canteen (1943) features one segment meant to dispel Americans' mistrust of the Soviets, and another to dispel their bigotry against the Chinese. Polish filmmakers in Great Britain created the anti-Nazi color film Calling Mr. Smith[31][32] (1943) about Nazi crimes in German-occupied Europe and about the lies of Nazi propaganda.[33]
The John Steinbeck novel The Moon Is Down (1942), about the Socrates-inspired spirit of resistance in an occupied village in Northern Europe, was presumed to be about Norway's response to the German occupiers. In 1945, Steinbeck received the King Haakon VII Freedom Cross for his literary contributions to the Norwegian resistance movement.[34]
The West and the Soviet Union both used propaganda extensively during the Cold War. Both sides used film, television, and radio programming to influence their own citizens, each other, and Third World nations. Through a front organization called the Bedford Publishing Company, the CIA's covert Office of Policy Coordination disseminated over one million books to Soviet readers over the span of 15 years, including novels by George Orwell, Albert Camus, Vladimir Nabokov, James Joyce, and Boris Pasternak, in an attempt to promote anti-communist sentiment and sympathy for Western values.[35] George Orwell's contemporaneous novels Animal Farm and Nineteen Eighty-Four portray the use of propaganda in fictional dystopian societies. During the Cuban Revolution, Fidel Castro stressed the importance of propaganda.[36][better source needed] Propaganda was used extensively by Communist forces in the Vietnam War as a means of controlling people's opinions.[37]
During the Yugoslav wars, propaganda was used as a military strategy by the governments of the Federal Republic of Yugoslavia and Croatia. Propaganda was used to create fear and hatred, and particularly to incite the Serb population against the other ethnicities (Bosniaks, Croats, Albanians and other non-Serbs). Serb media made a great effort to justify, revise or deny mass war crimes committed by Serb forces during these wars.[38]
Public perceptions
In the early 20th century the term propaganda was used by the founders of the nascent public relations industry to refer to their people. Literally translated from the Latin gerundive as "things that must be disseminated", in some cultures the term is neutral or even positive, while in others it has acquired a strong negative connotation. The connotations of the term "propaganda" can also vary over time. For example, in Portuguese-speaking and some Spanish-speaking countries, particularly in the Southern Cone, the word "propaganda" usually refers to commercial advertising.[39]

In English, propaganda was originally a neutral term for the dissemination of information in favor of any given cause. During the 20th century, however, the term acquired a thoroughly negative meaning in western countries, representing the intentional dissemination of often false, but certainly "compelling" claims to support or justify political actions or ideologies. According to Harold Lasswell, the term began to fall out of favor due to growing public suspicion of propaganda in the wake of its use during World War I by the Creel Committee in the United States and the Ministry of Information in Britain. Writing in 1928, Lasswell observed, "In democratic countries the official propaganda bureau was looked upon with genuine alarm, for fear that it might be suborned to party and personal ends. The outcry in the United States against Mr. Creel's famous Bureau of Public Information (or 'Inflammation') helped to din into the public mind the fact that propaganda existed. ... The public's discovery of propaganda has led to a great deal of lamentation over it. Propaganda has become an epithet of contempt and hate, and the propagandists have sought protective coloration in such names as 'public relations council,' 'specialist in public education,' 'public relations adviser.' "[40] In 1949, political science professor Dayton David McKean wrote, "After World War I the word came to be applied to 'what you don't like of the other fellow's publicity,' as Edward L. Bernays said...."[41]
Contestation
The term is essentially contested, and some have argued for a neutral definition,[42][43]: 9 arguing that ethics depend on intent and context,[43] while others define it as necessarily unethical and negative.[44] Emma Briant defines it as "the deliberate manipulation of representations (including text, pictures, video, speech etc.) with the intention of producing any effect in the audience (e.g. action or inaction; reinforcement or transformation of feelings, ideas, attitudes or behaviours) that is desired by the propagandist."[43]: 9 The same author explains the importance of consistent terminology across history, particularly as contemporary euphemistic synonyms, such as 'information support' and 'strategic communication', are used in governments' continual efforts to rebrand their operations.[43]: 9 Other scholars also see benefits to acknowledging that propaganda can be interpreted as beneficial or harmful, depending on the message sender, target audience, message, and context.[2]
David Goodman argues that the 1936 League of Nations "Convention on the Use of Broadcasting in the Cause of Peace" tried to create standards for a liberal international public sphere. The Convention encouraged empathetic and neighborly radio broadcasts to other nations. It called for League prohibitions on international broadcasts containing hostile speech and false claims. It tried to define the line between liberal and illiberal policies in communications, and emphasized the dangers of nationalist chauvinism. With Nazi Germany and Soviet Russia active on the radio, its liberal goals were ignored, while free speech advocates warned that the code represented restraints on free speech.[45]
Types
Identifying propaganda has always been a problem.[46] The main difficulties have involved differentiating propaganda from other types of persuasion, and avoiding a biased approach. Richard Alan Nelson provides a definition of the term: "Propaganda is neutrally defined as a systematic form of purposeful persuasion that attempts to influence the emotions, attitudes, opinions, and actions of specified target audiences for ideological, political or commercial purposes[47] through the controlled transmission of one-sided messages (which may or may not be factual) via mass and direct media channels."[48] The definition focuses on the communicative process involved – or more precisely, on the purpose of the process – and allows "propaganda" to be interpreted as positive or negative behavior depending on the perspective of the viewer or listener.
Propaganda can often be recognized by the rhetorical strategies used in its design. In the 1930s, the Institute for Propaganda Analysis identified a variety of propaganda techniques that were commonly used in newspapers and on the radio, which were the mass media of the time period. Propaganda techniques include "name calling" (using derogatory labels), "bandwagon" (expressing the social appeal of a message), or "glittering generalities" (using positive but imprecise language).[49] With the rise of the internet and social media, Renee Hobbs identified four characteristic design features of many forms of contemporary propaganda: (1) it activates strong emotions; (2) it simplifies information; (3) it appeals to the hopes, fears, and dreams of a targeted audience; and (4) it attacks opponents.[50]
Propaganda is sometimes evaluated based on the intention and goals of the individual or institution who created it. According to historian Zbyněk Zeman, propaganda is defined as either white, grey or black. White propaganda openly discloses its source and intent. Grey propaganda has an ambiguous or non-disclosed source or intent. Black propaganda purports to be published by the enemy or some organization besides its actual origins[51] (compare with black operation, a type of clandestine operation in which the identity of the sponsoring government is hidden). In scale, these different types of propaganda can also be defined by the potential of true and correct information to compete with the propaganda. For example, opposition to white propaganda is often readily found and may slightly discredit the propaganda source. Opposition to grey propaganda, when revealed (often by an inside source), may create some level of public outcry. Opposition to black propaganda is often unavailable and may be dangerous to reveal, because public cognizance of black propaganda tactics and sources would undermine, or cause to backfire, the very campaign the black propagandist supported.
The propagandist seeks to change the way people understand an issue or situation for the purpose of changing their actions and expectations in ways that are desirable to the interest group. Propaganda, in this sense, serves as a corollary to censorship in which the same purpose is achieved, not by filling people's minds with approved information, but by preventing people from being confronted with opposing points of view. What sets propaganda apart from other forms of advocacy is the willingness of the propagandist to change people's understanding through deception and confusion rather than persuasion and understanding. The leaders of an organization know the information to be one-sided or untrue, but this may not be true for the rank and file members who help to disseminate the propaganda.

Religious
Propaganda was often used to influence opinions and beliefs on religious issues, particularly during the split between the Roman Catholic Church and the Protestant churches or during the Crusades.[57]
The sociologist Jeffrey K. Hadden has argued that members of the anti-cult movement and Christian counter-cult movement accuse the leaders of what they consider cults of using propaganda extensively to recruit followers and keep them. Hadden argued that ex-members of cults and the anti-cult movement are committed to making these movements look bad.[58]
Propaganda against other religions in the same community or propaganda intended to keep political power in the hands of a religious elite can incite religious hate on a global or national scale. It could make use of many propaganda mediums. War, terrorism, riots, and other violent acts can result from it. It can also conceal injustices, inequities, exploitation, and atrocities, leading to ignorance-based indifference and alienation.[59]
Wartime

In the Peloponnesian War, the Athenians exploited figures from stories about Troy as well as other mythical images to incite feelings against Sparta. For example, Helen of Troy was even portrayed as an Athenian, whose mother Nemesis would avenge Troy.[60][61] During the Punic Wars, extensive campaigns of propaganda were carried out by both sides. To dissolve the Roman system of socii and the Greek poleis, Hannibal released without conditions the Latin prisoners he had treated generously, sending them back to their native cities, where they helped to disseminate his propaganda.[62] The Romans, on the other hand, tried to portray Hannibal as a person devoid of humanity who would soon lose the favour of the gods. At the same time, led by Q. Fabius Maximus, they organized elaborate religious rituals to protect Roman morale.[63][62]
In the early sixteenth century, Maximilian I devised a kind of psychological warfare targeting his enemies. During his war against Venice, he attached pamphlets to balloons that his archers would shoot down. The content spoke of freedom and equality and urged the populace to rebel against their tyrants (the Signoria).[22]


Propaganda is a powerful weapon in war; in certain cases, it is used to dehumanize and create hatred toward a supposed enemy, either internal or external, by creating a false image in the mind of soldiers and citizens. This can be done by using derogatory or racist terms (e.g., the racist terms "Jap" and "gook" used during World War II and the Vietnam War, respectively), avoiding some words or language, or by making allegations of enemy atrocities. The goal is to demoralize the opponent into thinking what is being projected is actually true.[64] Most propaganda efforts in wartime require the home population to feel the enemy has inflicted an injustice, which may be fictitious or may be based on facts (e.g., the sinking of the passenger ship RMS Lusitania by the German Navy in World War I). The home population must also believe that the cause of their nation in the war is just. In these efforts, it has been difficult to determine how much propaganda truly impacted the war.[65] In NATO doctrine, propaganda is defined as "Information, especially of a biased or misleading nature, used to promote a political cause or point of view."[66] Within this perspective, the information provided does not necessarily need to be false, but must instead be relevant to the specific goals of the "actor" or "system" that performs it.
Propaganda is also one of the methods used in psychological warfare, which may also involve false flag operations in which the identity of the operatives is depicted as that of an enemy nation (e.g., the Bay of Pigs invasion used CIA planes painted in Cuban Air Force markings). The term propaganda may also refer to false information meant to reinforce the mindsets of people who already believe as the propagandist wishes (e.g., during the First World War, the main purpose of British propaganda was to encourage men to join the army and women to work in the country's industry; propaganda posters were used because regular general radio broadcasting was yet to commence and TV technology was still under development).[67] The assumption is that, if people believe something false, they will constantly be assailed by doubts. Since these doubts are unpleasant (see cognitive dissonance), people will be eager to have them extinguished, and are therefore receptive to the reassurances of those in power. For this reason, propaganda is often addressed to people who are already sympathetic to the agenda or views being presented. This process of reinforcement uses an individual's predisposition to self-select "agreeable" information sources as a mechanism for maintaining control over populations.[improper synthesis?]
Propaganda may be administered in insidious ways. For instance, disparaging disinformation about the history of certain groups or foreign countries may be encouraged or tolerated in the educational system. Since few people actually double-check what they learn at school, such disinformation will be repeated by journalists as well as parents, thus reinforcing the idea that the disinformation item is really a "well-known fact", even though no one repeating the myth is able to point to an authoritative source. The disinformation is then recycled in the media and in the educational system, without the need for direct governmental intervention on the media. Such permeating propaganda may be used for political goals: by giving citizens a false impression of the quality or policies of their country, they may be incited to reject certain proposals or certain remarks or ignore the experience of others.


In the Soviet Union during the Second World War, the propaganda designed to encourage civilians was controlled by Stalin, who insisted on a heavy-handed style that educated audiences easily saw was inauthentic. On the other hand, the unofficial rumors about German atrocities were well founded and convincing.[69] Stalin was a Georgian who spoke Russian with a heavy accent. That would not do for a national hero, so starting in the 1930s all new visual portraits of Stalin were retouched to erase his Georgian facial characteristics[clarify][70] and make him a more generalized Soviet hero. Only his eyes and famous moustache remained unaltered. Zhores Medvedev and Roy Medvedev say his "majestic new image was devised appropriately to depict the leader of all times and of all peoples."[71]
Article 20 of the International Covenant on Civil and Political Rights requires that any propaganda for war, as well as any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence, be prohibited by law.[72]
As Hermann Göring told the psychologist Gustave Gilbert during the Nuremberg trials: "Naturally, the common people don't want war; neither in Russia nor in England nor in America, nor for that matter in Germany. That is understood. But, after all, it is the leaders of the country who determine the policy and it is always a simple matter to drag the people along, whether it is a democracy or a fascist dictatorship or a Parliament or a Communist dictatorship. The people can always be brought to the bidding of the leaders. That is easy. All you have to do is tell them they are being attacked and denounce the pacifists for lack of patriotism and exposing the country to danger. It works the same way in any country."
Notably, the covenant does not define the content of propaganda. In simplest terms, an act of propaganda used in reply to a wartime act is not prohibited.[74]
Advertising
Propaganda shares techniques with advertising and public relations, each of which can be thought of as propaganda that promotes a commercial product or shapes the perception of an organization, person, or brand. For example, after claiming victory in the 2006 Lebanon War, Hezbollah campaigned for broader popularity among Arabs by organizing mass rallies where Hezbollah leader Hassan Nasrallah combined elements of the local dialect with classical Arabic to reach audiences outside Lebanon. Banners and billboards were commissioned in commemoration of the war, along with various merchandise items with Hezbollah's logo, flag color (yellow), and images of Nasrallah. T-shirts, baseball caps and other war memorabilia were marketed for all ages. The uniformity of messaging helped define Hezbollah's brand.[75]
In the journalistic context, advertising has evolved beyond traditional commercial advertisements to include a new type in the form of paid articles or broadcasts disguised as news. These generally present an issue in a very subjective and often misleading light, primarily meant to persuade rather than inform. Normally they use only subtle propaganda techniques and not the more obvious ones used in traditional commercial advertisements. If the reader believes that a paid advertisement is in fact a news item, the message the advertiser is trying to communicate will be more easily "believed" or "internalized". Such advertisements are considered obvious examples of "covert" propaganda because they take on the appearance of objective information rather than the appearance of propaganda, which is misleading. Federal law[where?] specifically mandates that any advertisement appearing in the format of a news item must state that the item is in fact a paid advertisement.
Edmund McGarry argues that advertising is more than selling to an audience: it is a type of propaganda that tries to persuade the public rather than to be balanced in judgement.[76]
Politics
Propaganda has become more common in political contexts, in particular to refer to certain efforts sponsored by governments and political groups, but also often by covert interests. In the early 20th century, propaganda was exemplified in the form of party slogans. Propaganda also has much in common with public information campaigns by governments, which are intended to encourage or discourage certain forms of behavior (such as wearing seat belts, not smoking, not littering, and so forth). Again, the emphasis in propaganda is more political. Propaganda can take the form of leaflets, posters, TV, and radio broadcasts and can also extend to any other medium. In the case of the United States, there is also an important legal (imposed by law) distinction between advertising (a type of overt propaganda) and what the Government Accountability Office (GAO), an arm of the United States Congress, refers to as "covert propaganda." In political situations, propaganda is divided into two types: preparatory propaganda, which aims to create a new frame of mind or view of things, and operational propaganda, which instigates action.[77]
Roderick Hindery argues[78][79] that propaganda exists on the political left and right, and in mainstream centrist parties. Hindery further argues that debates about most social issues can be productively revisited in the context of asking "what is or is not propaganda?" Not to be overlooked is the link between propaganda, indoctrination, and terrorism/counterterrorism. He argues that threats to destroy are often as socially disruptive as physical devastation itself.
Since 9/11 and the appearance of greater media fluidity, propaganda institutions, practices and legal frameworks have been evolving in the US and Britain. Briant shows how this included expansion and integration of the apparatus cross-government and details attempts to coordinate the forms of propaganda for foreign and domestic audiences, with new efforts in strategic communication.[80] These were subject to contestation within the US Government, resisted by Pentagon Public Affairs and critiqued by some scholars.[43] The National Defense Authorization Act for Fiscal Year 2013 (section 1078 (a)) amended the US Information and Educational Exchange Act of 1948 (popularly referred to as the Smith-Mundt Act) and the Foreign Relations Authorization Act of 1987, allowing for materials produced by the State Department and the Broadcasting Board of Governors (BBG) to be released within U.S. borders for the Archivist of the United States. The Smith-Mundt Act, as amended, provided that "the Secretary and the Broadcasting Board of Governors shall make available to the Archivist of the United States, for domestic distribution, motion pictures, films, videotapes, and other material 12 years after the initial dissemination of the material abroad (...) Nothing in this section shall be construed to prohibit the Department of State or the Broadcasting Board of Governors from engaging in any medium or form of communication, either directly or indirectly, because a United States domestic audience is or may be thereby exposed to program material, or based on a presumption of such exposure." Public concerns were raised upon passage due to the relaxation of prohibitions of domestic propaganda in the United States.[81]
In the wake of this, the internet has become a prolific method of distributing political propaganda, benefiting from the development of software agents known as bots. Bots can be used for many things, including populating social media with automated messages and posts with a range of sophistication. During the 2016 U.S. election, a cyber-strategy was implemented using bots to direct US voters to Russian political news and information sources, and to spread politically motivated rumors and false news stories. The use of bots to achieve political goals is now considered a commonplace element of contemporary political strategy around the world.[82]
Techniques
Common media for transmitting propaganda messages include news reports, government reports, historical revision, junk science, books, leaflets, movies, radio, television, posters and social media. Some propaganda campaigns follow a strategic transmission pattern to indoctrinate the target group. This may begin with a simple transmission, such as a leaflet dropped from a plane or an advertisement. Generally, these messages will contain directions on how to obtain more information, via a website, hotline, radio program, etc. (as is also seen for selling purposes, among other goals). The strategy intends to move the individual from information recipient to information seeker through reinforcement, and then from information seeker to opinion leader through indoctrination.[83]
A number of techniques based in social psychological research are used to generate propaganda. Many of these same techniques can be found under logical fallacies, since propagandists use arguments that, while sometimes convincing, are not necessarily valid.
Some time has been spent analyzing the means by which propaganda messages are transmitted. That work is important, but it is clear that information dissemination strategies become propaganda strategies only when coupled with propagandistic messages. Identifying these messages is a necessary prerequisite to studying the methods by which those messages are spread.
Theodor W. Adorno wrote that fascist propaganda encourages identification with an authoritarian personality characterized by traits such as obedience and extreme aggression.[84]: 17 In The Myth of the State, Ernst Cassirer wrote that while fascist propaganda mythmaking flagrantly contradicted empirical reality, it provided a simple and direct answer to the anxieties of the secular present.[84]: 63
Propaganda can also be turned on its makers. For example, postage stamps have frequently been tools for government advertising, such as North Korea's extensive issues.[85] The presence of Stalin on numerous Soviet stamps is another example.[86] In Nazi Germany, Hitler frequently appeared on postage stamps in Germany and some of the occupied nations. A British program to parody these, and other Nazi-inspired stamps, involved airdropping them into Germany on letters containing anti-Nazi literature.[87][88]
In 2018 a scandal broke in which the journalist Carole Cadwalladr, several whistleblowers and the academic Emma Briant revealed advances in digital propaganda techniques, showing that the firm Cambridge Analytica had coupled online human intelligence techniques used in psychological warfare with psychological profiling based on illegally obtained social media data, in order to aid Donald Trump during the 2016 political campaigns in the United States.[89][90][91] The company initially denied breaking laws[92] but later admitted breaking UK law; the scandal provoked a worldwide debate on the acceptable use of data for propaganda and influence.[93]
Models
Persuasion in social psychology
The field of social psychology includes the study of persuasion. Social psychologists can be sociologists or psychologists. The field includes many theories and approaches to understanding persuasion. For example, communication theory points out that people can be persuaded by the communicator's credibility, expertise, trustworthiness, and attractiveness. The elaboration likelihood model, as well as heuristic models of persuasion, suggest that a number of factors (e.g., the degree of interest of the recipient of the communication) influence the degree to which people allow superficial factors to persuade them. Herbert A. Simon, who won the Nobel Memorial Prize in Economics for his research on decision-making, argued that people are cognitive misers: in a society of mass information, they are forced to make decisions quickly and often superficially, as opposed to logically.
According to William W. Biddle's 1931 article "A psychological definition of propaganda", "[t]he four principles followed in propaganda are: (1) rely on emotions, never argue; (2) cast propaganda into the pattern of "we" versus an "enemy"; (3) reach groups as well as individuals; (4) hide the propagandist as much as possible."[94]
More recently, studies from behavioral science have become significant in understanding and planning propaganda campaigns; these include, for example, nudge theory, which was used by the Obama campaign in 2008 and then adopted by the UK Government's Behavioural Insights Team.[95] Behavioural methodologies became subject to great controversy in 2016 after the company Cambridge Analytica was revealed to have applied them to millions of people's breached Facebook data in order to encourage them to vote for Donald Trump.[96]
Haifeng Huang argues that propaganda is not necessarily about convincing a populace of its message (and may actually fail to do this) but can instead function as a means of intimidating the citizenry and signalling the regime's strength and ability to maintain its control and power over society; by investing significant resources in propaganda, the regime can forewarn its citizens of its strength and deter them from attempting to challenge it.[97]
Propaganda theory and education
During the 1930s, educators in the United States and around the world became concerned about the rise of anti-Semitism and other forms of violent extremism. The Institute for Propaganda Analysis was formed to introduce methods of instruction for high school and college students, helping learners to recognize and resist propaganda by identifying persuasive techniques. This work built upon classical rhetoric and was informed by suggestion theory and social scientific studies of propaganda and persuasion.[98] In the 1950s, propaganda theory and education examined the rise of American consumer culture, and this work was popularized by Vance Packard in his 1957 book, The Hidden Persuaders. European theologian Jacques Ellul's landmark work, Propaganda: The Formation of Men's Attitudes, framed propaganda in relation to larger themes about the relationship between humans and technology. Media messages did not serve to enlighten or inspire, he argued. They merely overwhelm by arousing emotions and oversimplifying ideas, limiting human reasoning and judgement.
In the 1980s, academics recognized that news and journalism could function as propaganda when business and government interests were amplified by mass media. The propaganda model is a theory advanced by Edward S. Herman and Noam Chomsky which argues that systemic biases exist in mass media, shaped by structural economic causes. It argues that the way in which commercial media institutions are structured and operate (e.g. through advertising revenue, concentration of media ownership, or access to sources) creates an inherent conflict of interest that makes them act as propaganda for powerful political and commercial interests:
The 20th century has been characterized by three developments of great political importance: the growth of democracy, the growth of corporate power, and the growth of corporate propaganda as a means of protecting corporate power against democracy.[99][100]
First presented in their book Manufacturing Consent: The Political Economy of the Mass Media (1988), the propaganda model analyses commercial mass media as businesses that sell a product – access to readers and audiences – to other businesses (advertisers) and that benefit from access to information from government and corporate sources to produce their content. The theory postulates five general classes of "filters" that shape the content that is presented in news media: ownership of the medium, reliance on advertising revenue, access to news sources, threat of litigation and commercial backlash (flak), and anti-communism and "fear ideology". The first three (ownership, funding, and sourcing) are generally regarded by the authors as being the most important. Although the model was based mainly on the characterization of United States media, Chomsky and Herman believe the theory is equally applicable to any country that shares the basic political economic structure, and the model has subsequently been applied by other scholars to study media bias in other countries.[101]
By the 1990s, the topic of propaganda was no longer a part of public education, having been relegated to a specialist subject. Secondary English educators grew fearful of the study of propaganda genres, choosing to focus on argumentation and reasoning instead of the highly emotional forms of propaganda found in advertising and political campaigns.[102] In 2015, the European Commission funded Mind Over Media, a digital learning platform for teaching and learning about contemporary propaganda. The study of contemporary propaganda is growing in secondary education, where it is seen as a part of language arts and social studies education.[103]
Self-propaganda
Self-propaganda is a form of propaganda that refers to the act of an individual convincing themself of something, no matter how irrational that idea may be.[104] Self-propaganda makes it easier for individuals to justify their own actions as well as the actions of others. Self-propaganda often works to lessen the cognitive dissonance felt by individuals when their personal actions or the actions of their government do not line up with their moral beliefs.[105] Self-propaganda is a type of self-deception,[106] and it can have a negative impact on those who perpetuate the beliefs it creates.[106]
Children

Of all the potential targets for propaganda, children are the most vulnerable because they are the least prepared with the critical reasoning and contextual comprehension they need to determine whether a message is propaganda or not. The attention children give their environment during development, as they build their understanding of the world, causes them to absorb propaganda indiscriminately. Children are also highly imitative: studies by Albert Bandura, Dorothea Ross and Sheila A. Ross in the 1960s indicated that, to a degree, socialization, formal education and standardized television programming can be seen as using propaganda for the purpose of indoctrination. The use of propaganda in schools was highly prevalent during the 1930s and 1940s in Germany in the form of the Hitler Youth.
Anti-Semitic propaganda for children
In Nazi Germany, the education system was thoroughly co-opted to indoctrinate the German youth with anti-Semitic ideology. From the 1920s on, the Nazi Party targeted German youth as one of its special audiences for its propaganda messages.[107] Schools and texts mirrored what the Nazis aimed to instill in German youth through the use and promotion of racial theory. Julius Streicher, the editor of Der Stürmer, headed a publishing house that disseminated anti-Semitic propaganda picture books in schools during the Nazi dictatorship. This was accomplished through the National Socialist Teachers League, of which 97% of all German teachers were members in 1937.[108]
The League encouraged the teaching of racial theory. Picture books for children such as Trust No Fox on his Green Heath and No Jew on his Oath, Der Giftpilz (translated into English as The Poisonous Mushroom) and The Poodle-Pug-Dachshund-Pinscher were widely circulated (over 100,000 copies of Trust No Fox... were circulated during the late 1930s) and contained depictions of Jews as devils, child molesters and other morally charged figures. Slogans such as "Judas the Jew betrayed Jesus the German to the Jews" were recited in class. During the Nuremberg Trial, Trust No Fox on his Green Heath and No Jew on his Oath, and Der Giftpilz were received as documents in evidence because they document the practices of the Nazis.[109] The following is an example of a propagandistic math problem recommended by the National Socialist Essence of Education: "The Jews are aliens in Germany—in 1933 there were 66,606,000 inhabitants in the German Reich, of whom 499,682 (0.75%) were Jews."[110]
Comparisons with disinformation
[edit]See also
- Agitprop
- Artificial intelligence and elections
- Big lie
- Brainwashing
- Cartographic propaganda
- Firehose of falsehood
- Hate media
- Incitement
- Internet troll
- Mind control
- Misinformation
- Music and political warfare
- Overview of 21st century propaganda
- Political warfare
- Psychological warfare (aka Psyops)
- Propaganda model
- Public diplomacy
- Sharp power
- Smear campaign
- Spin (propaganda)
- The Basic Principles of War Propaganda
References
- ^ a b Smith, Bruce L. (17 February 2016). "Propaganda". Encyclopædia Britannica, Inc. Retrieved 23 April 2016.
- ^ a b Hobbs, Renee (2020). Mind Over Media: Propaganda Education for a Digital Age. New York: W.W. Norton.
- ^ a b c d e Diggs-Brown, Barbara (12 August 2011). Strategic Public Relations: An Audience-Focused Approach. Cengage Learning. ISBN 978-0-534-63706-4.
- ^ Gibson, Stephanie (1 June 2008). "Display folk: Second World War posters at the Museum of New Zealand Te Papa Tongarewa". Tuhinga. 19: 7–27. doi:10.3897/tuhinga.19.e34165. ISSN 2253-5861.
- ^ "propaganda, n." Oxford English Dictionary. Oxford University Press. December 2020. Retrieved 20 April 2021.
- ^ "Online Etymology Dictionary". Retrieved 6 March 2015.
- ^ Edney, Kingsley (2014). The Globalization of Chinese Propaganda. New York: Palgrave Macmillan US. pp. 22–24, 195. doi:10.1057/9781137382153. ISBN 978-1-349-47990-0.
Outside the realm of official discourse, however, propaganda (xuanchuan), is occasionally used in a negative way...(p. 195)
- ^ Lin, Chunfeng (2023). Red Tourism in China: Commodification of Propaganda. Routledge. ISBN 9781032139609.
- ^ Arthur Aspinall, Politics and the Press 1780–1850. New York: Barnes and Noble Books (1949), p. v. ISBN 978-0-208-01240-1.
- ^ Ellul, Jacques (1965). Introduction by Konrad Kellen in Propaganda: The Formation of Men's Attitudes, pp. xi–xii. Trans. Konrad Kellen & Jean Lerner from original 1962 French edition Propagandes. Knopf, New York. ISBN 978-0-394-71874-3 (1973 edition by Vintage Books, New York).
- ^ Jowett, Garth; O'Donnell, Victoria (2012). Propaganda and Persuasion (5th ed.). Sage Publications Inc. ISBN 978-1412977821.[page needed]
- ^ Martin, Everett Dean (March 1929). Leach, Henry Goddard (ed.). "Are We Victims of Propaganda, Our Invisible Masters: A Debate with Edward Bernays" (PDF). The Forum. 81. Forum Publishing Company: 142–150. Retrieved 22 February 2020.
- ^ Bernays, Edward L. (1928). Propaganda. Horace Liveright. p. 9.
- ^ Kuehl, Dan (10 March 2014). "Chapter 1: Propaganda in the Digital Age". In Snow, Nancy (ed.). Propaganda and American Democracy. Louisiana State University Press. p. 12. ISBN 978-0-8071-5416-8.
- ^ Nagle, D. Brendan; Stanley M Burstein (2009). The Ancient World: Readings in Social and Cultural History. Pearson Education. p. 28. ISBN 978-0-205-69187-6.
- ^ Borgies, Loïc (2016). Le conflit propagandiste entre Octavien et Marc Antoine. De l'usage politique de la uituperatio entre 44 et 30 a. C. n. Éditions Latomus. ISBN 978-90-429-3459-7.
- ^ Davison, W. Phillips (1971). "Some Trends in International Propaganda". The Annals of the American Academy of Political and Social Science. 398: 1–13. doi:10.1177/000271627139800102. ISSN 0002-7162. JSTOR 1038915. S2CID 145332403.
- ^ Kunczic, Michael (2016). "Public Relations in Kriegzeiten – Die Notwendigkeit von Lüge und Zensur". In Preußer, Heinz-Peter (ed.). Krieg in den Medien (in German). Brill. p. 242. ISBN 978-94-012-0230-5. Retrieved 7 February 2022.
- ^ Kunczik, Michael (6 May 2016). Images of Nations and International Public Relations. Routledge. p. 158. ISBN 978-1-136-68902-4. Retrieved 7 February 2022.
- ^ Museum, Cincinnati Art; Becker, David P. (1993). Six Centuries of Master Prints: Treasures from the Herbert Greer French Collection. Cincinnati Art Museum. p. 68. ISBN 978-0-931537-15-8. Retrieved 7 February 2022.
- ^ Silver, Larry (2008). Marketing Maximilian: The Visual Ideology of a Holy Roman Emperor. Princeton University Press. p. 235. ISBN 978-0-691-13019-4. Retrieved 7 February 2022.
- ^ a b Füssel 2020, pp. 10–12.
- ^ Cole, Richard G. (1975). "The Reformation in Print: German Pamphlets and Propaganda". Archive for Reformation History. 66: 93–102. doi:10.14315/arg-1975-jg07. ISSN 0003-9381. S2CID 163518886.
- ^ Diggs-Brown, Barbara (2011). Cengage Advantage Books: Strategic Public Relations: An Audience-Focused Approach. Cengage Learning. p. 48. ISBN 978-0-534-63706-4.
- ^ Robert Ensor in David Thomson, ed., The New Cambridge Modern History: volume XII The Era of Violence 1890–1945 (1st edition 1960), p 84.
- ^ Yourman, Julius (November 1939). "Propaganda Techniques Within Nazi Germany". Journal of Educational Sociology. 13 (3): 148–163. doi:10.2307/2262307. JSTOR 2262307.
- ^ Cantril, Hadley (1938). "Propaganda Analysis". The English Journal. 27 (3): 217–221. doi:10.2307/806063. JSTOR 806063.
- ^ Fox, J. C., 2007, "Film propaganda in Britain and Nazi Germany : World War II cinema.", Oxford:Berg.
- ^ "Fono.fi – Äänitetietokanta". www.fono.fi (in Finnish). Retrieved 13 March 2020.
- ^ Philip M. Taylor, 1990, "Munitions of the mind: A history of propaganda", Pg. 170.
- ^ "Calling Mr. Smith – LUX". Archived from the original on 25 April 2018. Retrieved 30 January 2018.
- ^ "Calling Mr Smith". Centre Pompidou.
- ^ "Franciszka and Stefan Themerson: Calling Mr. Smith (1943) – artincinema". 21 June 2015.
- ^ "THE MOON IS DOWN by John Steinbeck on Sumner & Stillman". Sumner & Stillman. Archived from the original on 13 January 2019. Retrieved 13 January 2019.
- ^ Nick Romeo (17 June 2014). "Is Literature 'the Most Important Weapon of Propaganda'?". The Atlantic. Retrieved 28 February 2022.
- ^ prudentiapolitica (20 May 2014). "Prudentia Politica". Retrieved 6 March 2015.
- ^ Sophana Srichampa (30 August 2007). "Vietnamese propaganda reflections from 1945 to 2000" (PDF). Mon-Khmer Studies. 37. Thailand: Mahidol University: 87–116.
- ^ "Serbian Propaganda: A Closer Look". 12 April 1999. Archived from the original on 4 June 2013. Retrieved 21 December 2007.
NOAH ADAMS: The European Center for War, Peace and the News Media, based in London, has received word from Belgrade that no pictures of mass Albanian refugees have been shown at all, and that the Kosovo humanitarian catastrophe is only referred to as the one made up or over-emphasised by Western propaganda.
- ^ "English translation of Portuguese 'propaganda'". collinsdictionary.com. Retrieved 2 January 2024.
- ^ Lasswell, Harold D. (1928). "The Function of the Propagandist". International Journal of Ethics. 38 (3): 258–268. doi:10.1086/intejethi.38.3.2378152. JSTOR 2378152. S2CID 145021449. pp. 260–261
- ^ p. 113, Party and Pressure Politics, Boston: Houghton Mifflin Company, 1949.
- ^ Taylor, Philip M. (2002). "Strategic Communications or Democratic Propaganda?". Journalism Studies. 3 (3): 437–441. doi:10.1080/14616700220145641. S2CID 144546254.
- ^ a b c d e Briant, Emma Louise (2015). Propaganda and Counter-terrorism. Manchester: Manchester University Press. ISBN 9780719091056. JSTOR j.ctt18mvn1n.
- ^ Doob, L.W. (1949), Public Opinion and Propaganda, London: Cresset Press p 240
- ^ David Goodman, "Liberal and Illiberal Internationalism in the Making of the League of Nations Convention on Broadcasting in the Cause of Peace." Journal of World History 31.1 (2020): 165–193. excerpt
- ^ Daniel J Schwindt, The Case Against the Modern World: A Crash Course in Traditionalist Thought, 2016, pp. 202–204.
- ^ McNearney, Allison (29 August 2018). "This WWII Cartoon Taught Soldiers How to Avoid Certain Death". HISTORY. Retrieved 29 March 2020.
- ^ Richard Alan Nelson, A Chronology and Glossary of Propaganda in the United States (1996) pp. 232–233
- ^ Hobbs, Renee (9 November 2014). "Teaching about Propaganda: An Examination of the Historical Roots of Media Literacy". Journal of Media Literacy Education. 6 (2): 56–67. doi:10.23860/jmle-2016-06-02-5. ISSN 2167-8715.
- ^ Hobbs, Renee (2020). Mind Over Media. Norton.
- ^ Zeman, Zbynek (1978). Selling the War. Orbis Publishing. ISBN 978-0-85613-312-1.
- ^ Oberman, Heiko Augustinus (1 January 1994). The Impact of the Reformation: Essays. Wm. B. Eerdmans Publishing. ISBN 9780802807328 – via Google Books.
- ^ Luther's Last Battles: Politics And Polemics 1531-46 By Mark U. Edwards, Jr. Fortress Press, 2004. ISBN 978-0-8006-3735-4
- ^ In Latin, the title reads "Hic oscula pedibus papae figuntur"
- ^ "Nicht Bapst: nicht schreck uns mit deim ban, Und sey nicht so zorniger man. Wir thun sonst ein gegen wehre, Und zeigen dirs Bel vedere"
- ^ Edwards, Mark U. (2004). Luther's Last Battles: Politics and Polemics 1531–46. Fortress Press. p. 199. ISBN 9781451413984 – via Google Books.
- ^ Fisher, Lane (7 July 2022). "Trouvère Poets: Reflectors of Societal Zeal". Truvere. Truvere. Archived from the original on 5 April 2023. Retrieved 13 September 2022.
- ^ "The Religious Movements Page: Conceptualizing "Cult" and "Sect"". Archived from the original on 7 February 2006. Retrieved 4 December 2005.
- ^ "Religious propaganda". Encyclopedia.uia.org. Union of International Associations. Retrieved 9 November 2023.
- ^ Magill, Frank Northen (23 January 2003). Dictionary of World Biography. Taylor & Francis. p. 422. ISBN 978-1-57958-040-7. Retrieved 7 February 2022.
- ^ Rutter, Keith (31 March 2020). Word And Image In Ancient Greece. Edinburgh University Press. p. 68. ISBN 978-0-7486-7985-0. Retrieved 7 February 2022.
- ^ a b Stepper, R. (2006). "Politische parolen und propaganda im Hannibalkrieg". Klio. 88 (2): 397–407. doi:10.1524/klio.2006.88.2.397. S2CID 190002621. Retrieved 7 February 2022.
- ^ Hoyos, Dexter (26 May 2015). A Companion to the Punic Wars. John Wiley & Sons. p. 275. ISBN 978-1-119-02550-4. Retrieved 7 February 2022.
- ^ Williamson, Samuel R.; Balfour, Michael (Winter 1980). "Propaganda in War, 1939–1945: Organisations, Policies and Publics in Britain and Germany". Political Science Quarterly. 95 (4): 715. doi:10.2307/2150639. JSTOR 2150639.
- ^ Eksteins, Modris; Balfour, Michael (October 1980). "Propaganda in War, 1939–1945: Organisations, Policies and Publics in Britain and Germany". The American Historical Review. 85 (4): 876. doi:10.2307/1868905. JSTOR 1868905.
- ^ North Atlantic Treaty Organization, NATO Standardization Agency AAP-6 – Glossary of terms and definitions, 2-P-9.
- ^ Callanan, James D. The Evolution of The CIA's Covert Action Mission, 1947–1963. Durham University. 1999.
- ^ "Pravda za Uroša Predića!". e-novine.com. Retrieved 5 May 2015.
- ^ Karel C. Berkhoff, Motherland in Danger: Soviet Propaganda during World War II (2012)
- ^ Smithfield, Brad (30 July 2015). "10 Facts You Didn't Know About Stalin". The Vintage News. Timera Media. Retrieved 23 April 2021. Stalin had his likeness softened on propaganda posters to reduce his Georgian facial characteristics.
- ^ Zhores A. Medvedev; Roy A. Medvedev (2003). The Unknown Stalin. I.B. Tauris. p. 248. ISBN 9781860647680.
- ^ "International Covenant on Civil and Political Rights". United Nations Human Rights: Office of the High Commissioner for Human Rights. United Nations. Retrieved 2 September 2015.
- ^ Gustave Gilbert's Nuremberg Diary (1947). In an interview with Gilbert in Göring's jail cell during the Nuremberg War Crimes Trials (18 April 1946).
- ^ Snow, Nancy. "US Propaganda". American Thought and Culture in the 21st Century: 97–98.
- ^ Khatib, Lina (2014). The Hizbullah Phenomenon: Politics and Communication. Oxford University Press. p. 84.
- ^ McGarry, Edmund D. (1958). "The Propaganda Function in Marketing". Journal of Marketing. 23 (2): 131–132. doi:10.2307/1247829. JSTOR 1247829.
- ^ How to Be a Spy: The World War II SOE Training Manual. Toronto: Dundurn Press. 2004. p. 192. ISBN 978-1-55002-505-7.
- ^ "About Roderick Hindery". Propaganda and Critical Thought Blog. Archived from the original on 2 December 2020. Retrieved 4 December 2019.
- ^ Hindery, Roderick (2001). Indoctrination and self-deception or free and critical thought. Lewiston, New York: Edwin Mellen Press. ISBN 0-7734-7407-2. OCLC 45784333.
- ^ Briant, Emma Louise (April 2015). "Allies and Audiences Evolving Strategies in Defense and Intelligence Propaganda". The International Journal of Press/Politics. 20 (2): 145–165. doi:10.1177/1940161214552031. S2CID 145697213.
- ^ "Smith-Mundt Act". 'Anti-Propaganda' Ban Repealed, Freeing State Dept. To Direct Its Broadcasting Arm at American Citizens. Techdirt. 15 July 2013. Retrieved 1 June 2016.
- ^ Howard, Philip N.; Woolley, Samuel; Calo, Ryan (3 April 2018). "Algorithms, bots, and political communication in the US 2016 election: The challenge of automated political communication for election law and administration". Journal of Information Technology & Politics. 15 (2): 81–93. doi:10.1080/19331681.2018.1448735. ISSN 1933-1681.
- ^ Garth S. Jowett and Victoria J. O'Donnell, Propaganda & Persuasion (5th ed. 2011)
- ^ a b Tu, Hang (2025). Sentimental Republic: Chinese Intellectuals and the Maoist Past. Harvard University Asia Center. ISBN 9780674297579.
- ^ Quito, Anne (30 June 2017). "North Korea's America-hating postage stamps are mini-masterpieces of anti-imperialist propaganda". Quartz.
- ^ Kolchinsky, Alexander (2013). "Stalin on Stamps and other Philatelic Materials: Design, Propaganda, Politics". The Carl Beck Papers in Russian and East European Studies (2301). doi:10.5195/cbp.2013.184.
- ^ Friedman, Herbert A. "Propaganda & Espionage Philately – Part I". PsyWar.Org. Archived from the original on 7 February 2020. Retrieved 1 July 2019.
- ^ "10 WWII Stamp Forgeries Used as Psychological Warfare". Best Masters in Psychology.
- ^ Carole Cadwalladr, as told to Lee Glendinning (29 September 2018). "Exposing Cambridge Analytica: 'It's been exhausting, exhilarating, and slightly terrifying'". The Guardian.
- ^ Briant, Emma (16 April 2018). "Research on Leave.EU and Cambridge Analytica strategy published". UK Parliamentary Committee on Digital, Culture Media and Sport.
- ^ Merelli, Annalisa (17 March 2018). "Facebook knew Cambridge Analytica was misusing users' data three years ago and only banned the company this week". Quartz.
- ^ Musil, Steven (9 April 2018). "Cambridge Analytica denies breaking any laws in Facebook data scandal". CNET.
- ^ Lomas, Natasha (9 January 2019). "Cambridge Analytica's parent pleads guilty to breaking UK data law". TechCrunch. Archived from the original on 21 February 2021. Retrieved 24 October 2020.
- ^ Biddle, W. W. (1931). "A psychological definition of propaganda". The Journal of Abnormal and Social Psychology. 26 (3): 283–295. doi:10.1037/h0074944.
- ^ Wright, Oliver (16 September 2015). "Barack Obama to bring Whitehall's 'nudge' theory to the White House". The Independent.
- ^ Cadwalladr, Carole; Graham-Harrison, Emma (19 March 2018). "Facebook and Cambridge Analytica face mounting pressure over data scandal". The Guardian.
- ^ Huang, Haifeng (1 July 2015). "Propaganda as Signaling". Comparative Politics. 47 (4): 419–444. doi:10.5129/001041515816103220. JSTOR 43664158.
- ^ Lasswell, Harold (1927). Propaganda Technique in World War I. M.I.T. Press.
- ^ "Letter from Noam Chomsky" to Covert Action Quarterly, quoting Alex Carey, Australian social scientist, "Letter from Noam Chomsky". Archived from the original on 10 July 2012. Retrieved 1 April 2007.
- ^ "Review of Alex Carey, Taking the Risk out of Democracy: Propaganda in the US and Australia". Retrieved 6 March 2015.
- ^ Pedro-Carañana, Joan; Broudy, Daniel; Klaehn, Jeffery, eds. (2018). The Propaganda Model Today: Filtering Perception and Awareness. Vol. 8. University of Westminster Press. ISBN 9781912656172. JSTOR j.ctv7h0ts6.
- ^ Fleming, David (2019). "Fear of persuasion in the English language arts". College English. 81 (6): 508–541. doi:10.58680/ce201930223. S2CID 201379873.
- ^ Hobbs, Renee (2020). Mind Over Media: Propaganda Education for a Digital Age. W.W. Norton.
- ^ A Mask for Privilege: Anti-Semitism in America. Transaction Publishers. 1999 [1948]. ISBN 978-1-4128-1615-1.
- ^ Gambrill, Eileen (20 February 2012). Propaganda in the Helping Professions. Oxford University Press, USA. ISBN 978-0-19-532500-3.
- ^ a b Gambrill, Eileen (6 March 2006). Critical Thinking in Clinical Practice: Improving the Quality of Judgments and Decisions. John Wiley & Sons. ISBN 978-0-471-78112-7.
- ^ Goutam, Urvashi (2014). "Pedagogical Nazi Propaganda (1939–1945)". Proceedings of the Indian History Congress. 75: 1018–1026. ISSN 2249-1937. JSTOR 44158487.
- ^ Corelli, Marie (May–June 2002). "Poisoning young minds in Nazi Germany: children and propaganda in the Third Reich". Social Education. 66 (4): 228. Archived from the original on 21 February 2021. Retrieved 22 February 2020.
- ^ Mills, Mary. "Propaganda and Children during the Hitler Years". The Nizkor Project. Retrieved 22 February 2020.
- ^ Hirsch, Herbert. Genocide and the Politics of Memory. Chapel Hill & London: University of North Carolina Press, 1995. p. 119.
- ^ Can public diplomacy survive the internet? (PDF), May 2017, p. 73, archived from the original (PDF) on 30 March 2019
- ^ The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money (PDF), Institute of Modern Russia, 2014, archived from the original (PDF) on 3 February 2019
- ^ Jackson, Dean (2018), Distinguishing Disinformation from Propaganda, Misinformation, and 'Fake News' (PDF), National Endowment for Democracy, archived (PDF) from the original on 7 April 2022, retrieved 31 May 2022
Sources
[edit]- "Appendix I: PSYOP Techniques". Psychological Operations Field Manual No. 33-1. Washington, D.C.: Department of the Army. 31 August 1979. Archived from the original on 24 May 2001.
- Bytwerk, Randall L. (2004). Bending Spines: The Propagandas of Nazi Germany and the German Democratic Republic. East Lansing: Michigan State University Press. ISBN 978-0-87013-710-5.
- Edwards, John Carver (1991). Berlin Calling: American Broadcasters in Service to the Third Reich. New York: Praeger. ISBN 978-0-275-93905-2.
- Füssel, Stephan (2020). Gutenberg and the Impact of Printing. Routledge. ISBN 978-1-351-93187-8.
- Hindery, Roderick. "The Anatomy of Propaganda within Religious Terrorism". Humanist (March–April 2003): 16–19.
- Howe, Ellic (1982). The Black Game: British Subversive Operations Against the Germans During the Second World War. London: Futura.
- Huxley, Aldous (1989) [1958]. Brave New World Revisited. New York: Harper. ISBN 978-0-06-080984-3.
- Jowett, Garth S.; O'Donnell, Victoria (2006). Propaganda and Persuasion (4th ed.). Thousand Oaks, California: Sage Publications, Inc. ISBN 978-1-4129-0897-9.
- Le Bon, Gustave (1977) [1895]. The Crowd: A Study of the Popular Mind. Penguin Books. ISBN 978-0-14-004531-4.
- Linebarger, Paul M. A. (1972) [1948]. Psychological Warfare. Washington, D.C.: Infantry Journal Press. ISBN 978-0-405-04755-8.
- Nelson, Richard Alan (1996). A Chronology and Glossary of Propaganda in the United States. Westport, CT: Greenwood Press. ISBN 978-0-313-29261-3.
- Shirer, William L. (2002) [1941]. Berlin Diary: The Journal of a Foreign Correspondent, 1934–1941. New York: Alfred A. Knopf. ISBN 978-5-9524-0081-8.
- Young, Emma (10 October 2001). "Psychological warfare waged in Afghanistan". New Scientist. Archived from the original on 13 February 2002. Retrieved 5 August 2010.
Further reading
Books
- Altheide, David L. & John M. Johnson. Bureaucratic Propaganda. Boston: Allyn and Bacon, 1980.
- Bernays, Edward. Propaganda. New York: H. Liveright, 1928.
- (See also the version of the text at www.historyisaweapon.com: "Propaganda.")
- Borgies, Loïc. Le conflit propagandiste entre Octavien et Marc Antoine: De l'usage politique de la uituperatio entre 44 et 30 a. C. n.. Brussels: Latomus, 2016.
- Brown, J.A.C. Techniques of Persuasion: From Propaganda to Brainwashing. Harmondsworth: Pelican, 1963.
- Chomsky, Noam & Herman, Edward S. Manufacturing Consent: The Political Economy of the Mass Media. New York: Pantheon Books. (1988)
- Chomsky, Noam. Media Control: The Spectacular Achievements of Propaganda. Seven Stories Press, 1997.
- Cole, Robert. Propaganda in Twentieth Century War and Politics: An Annotated Bibliography. London: Scarecrow, 1996.
- Cole, Robert, ed. Encyclopedia of Propaganda. 3 vols. Armonk, NY: M.E. Sharpe, 1998.
- Combs, James E. & Nimmo, Dan. The New Propaganda: The Dictatorship of Palaver in Contemporary Politics. White Plains, N.Y.: Longman. (1993)
- Cull, Nicholas John, David Culbert, and David Welch, eds. Propaganda and Mass Persuasion: A Historical Encyclopedia, 1500 to the Present (2003)
- Cunningham, Stanley B. The Idea of Propaganda: A Reconstruction. Westport, Conn.: Praeger, 2002.
- Cunningham, Stanley B. "Reflections on the Interface Between Propaganda and Religion", in The Future of Religion, eds. P. Rennick, S. Cunningham, & R.H. Johnson. Newcastle upon Tyne: Cambridge Scholars Pub., 2010, pp. 83–96.
- De Lange, William (2023). A History of Japanese Journalism: State of Affairs and Affairs of State. Toyo Press. ISBN 978-94-92722-393.
- DelHagen, Jacob M. Modern Propaganda: The Art of Influencing Society, Individuals, and the News Media through Digital Communication. 2016. ISBN 9780998315607
- Kitsikis, Dimitri. Propagande et pressions en politique internationale. Paris: Presses Universitaires de France, 1963. 537 pages.
- Ellul, Jacques, Propaganda: The Formation of Men's Attitudes. (1965).
- Hamilton, John M. (2020) Manipulating the Masses: Woodrow Wilson and the Birth of American Propaganda. Louisiana State University Press.
- Hale, Oron James. Publicity and Diplomacy: With Special Reference to England and Germany, 1890–1914 (1940) online Archived 4 December 2020 at the Wayback Machine
- Hench, John B. Books as Weapons: Propaganda, Publishing, and the Battle for Global Markets in the Era of World War II. Cornell University Press, 2010.
- Hirschberger, Bernd (2021). External Communication in Social Media During Asymmetric Conflicts A Theoretical Model and Empirical Case Study of the Conflict in Israel and Palestine. Bielefeld: transcript Verlag. ISBN 978-3-8394-5509-8. Retrieved 11 October 2021.
- Jowett, Garth S. & Victoria O'Donnell. Propaganda and Persuasion, 6th edn. California: Sage Publications, 2014. A detailed overview of the history, function, and analyses of propaganda.
- Lasswell, Harold. Propaganda Technique in the World War. K. Paul, Trench, Trubner & Company, Limited, 1927.
- Lohrey, Andrew, ed. Taking the Risk out of Democracy: Corporate Propaganda versus Freedom and Liberty. Urbana, Ill.: University of Illinois Press, 1997.
- Marlin, Randal. Propaganda & The Ethics of Persuasion. Orchard Park, New York: Broadview Press, 2002.
- McCombs, M. E. & D. L. Shaw. "The agenda-setting function of mass media", Public Opinion Quarterly 36, no. 2 (1972): 176–187.
- Mackenzie, A. J., Propaganda Boom (London: John Gifford, 1938)
- Moran, T. "Propaganda as Pseudocommunication", Et Cetera 2 (1979): 181–197.
- Nelson, Richard Alan. A Chronology and Glossary of Propaganda in the United States. Westport, Conn.: Greenwood Press, 1996.
- Oddo, J. (2018). The Discourse of Propaganda: Case Studies from the Persian Gulf War and the 'War on Terror'. University Park, PA: Pennsylvania State University Press.
- Pratkanis, Anthony & Elliot Aronson. Age of Propaganda: The Everyday Use and Abuse of Persuasion. New York: W.H. Freeman and Company, 1992.
- Rutherford, Paul, Endless Propaganda: The Advertising of Public Goods. Toronto: University of Toronto Press. (2000)
- Rutherford, Paul, Weapons of Mass Persuasion: Marketing the War Against Iraq. Toronto: University of Toronto Press, 2004.
- Shanahan, James, ed. Propaganda without Propagandists? Six Case Studies in U.S. Propaganda. Hampton Press, 2001.
- Shaw, Jeffrey M. Illusions of Freedom: Thomas Merton and Jacques Ellul on Technology and the Human Condition. Eugene, OR: Wipf and Stock, 2014. ISBN 978-1625640581
- Snow, Nancy (10 March 2014). Propaganda and American Democracy. Baton Rouge: LSU Press. ISBN 978-0-8071-5415-1.
- Snow, Nancy (4 January 2011). Propaganda, Inc.: Selling America's Culture to the World. New York: Seven Stories Press. ISBN 978-1-60980-082-6.
- Sproule J. Michael, Channels of Propaganda. Bloomington, IN: EDINFO Press. (1994)
- Stanley, Jason (2016). How Propaganda Works. Princeton University Press. ISBN 978-0691173429.
- Steinbeck, John. The Moon Is Down. New York: Viking, 1942.
- Stauber, John & Sheldon Rampton. Toxic Sludge Is Good for You! Lies, Damn Lies and the Public Relations Industry. Monroe, Maine: Common Courage Press, 1995.
Essays and articles
- Rosenfeld, Bryn; Wallace, Jeremy (2024). "Information Politics and Propaganda in Authoritarian Societies". Annual Review of Political Science 27(1).
- Brown, John H. "Two Ways of Looking at Propaganda" (2006)
- Garcia, Hugo. "Reluctant liars? Public debates on propaganda and democracy in twentieth-century Britain (ca. 1914–1950)", Contemporary British History, vol. 33, no. 3 (2019), pp. 383–404.
- Kosar, Kevin R., Public Relations and Propaganda: Restrictions on Executive Branch Activities, CRS Report RL32750, February 2005.
- Auerbach, Jonathan, and Russ Castronovo. "Thirteen Propositions about Propaganda." The Oxford Handbook of Propaganda Studies, December 2013.
Propaganda
View on Grokipedia
Etymology and Terminology
Origins of the Term
The term "propaganda" derives from the Latin propaganda, the neuter plural gerundive of propagare, meaning "to propagate" or "to spread," referring to things that ought to be propagated, such as doctrines or ideas.[3][12] This linguistic root entered common usage through the Catholic Church's institutional efforts to disseminate its faith during the Counter-Reformation. In response to the Protestant Reformation's gains and the need to coordinate missionary activities amid expanding European exploration, Pope Gregory XV established the Sacra Congregatio de Propaganda Fide (Sacred Congregation for the Propagation of the Faith) on January 6, 1622, formalized by the papal bull Inscrutabili Divinae Providentiae Arcano issued on June 22, 1622.[13][14] The congregation served as the central Roman Curia body overseeing global Catholic missions, including training missionaries, printing materials in vernacular languages, and countering non-Catholic influences in regions like the Americas, Asia, and Africa.[15] The name's "propaganda" element directly reflected its mandate to propagate (propagare) Christianity, particularly among non-believers and lapsed faithful, without the pejorative implications it later acquired.[4] At inception, the term connoted organized dissemination of religious truth, akin to propagation in agriculture or botany—systematic extension rather than deception—and was viewed positively within ecclesiastical contexts as essential for the Church's survival and expansion.[5] This ecclesiastical origin marked the term's formal institutionalization, distinguishing it from earlier informal uses of related Latin concepts in classical texts, such as Cicero's references to spreading ideas. By the late 17th century, "propaganda" began appearing in European vernaculars to describe the congregation's activities, initially retaining neutrality but gradually broadening to secular contexts by the 18th century Enlightenment, where it started shifting toward critiques of manipulative influence.[3][4]Evolution in Modern Usage
In the early 20th century, "propaganda" largely retained its historical neutrality as organized efforts to disseminate ideas or ideologies, but its application expanded into secular political contexts amid mass media's rise. During World War I, the U.S. government formed the Committee on Public Information (CPI) in April 1917, led by George Creel, to mobilize public support for the war through pamphlets, posters, films, and speeches reaching an estimated 75 million Americans. Creel explicitly rejected the label "propaganda" for his initiatives, citing its association with German deception, and instead framed them as "educational and informative" campaigns, reflecting the term's still-ambivalent status even as it described systematic opinion-shaping.[16][17] Postwar disillusionment accelerated a pejorative shift, as revelations of wartime exaggerations—such as unverified atrocity claims against Germans—fostered distrust of state-managed messaging. By the 1920s, the term connoted manipulation and excess, prompting rebranding in professional circles; Edward Bernays, a CPI veteran and nephew of Sigmund Freud, titled his 1928 book Propaganda to advocate "engineering consent" via psychological insights, yet acknowledged the word's growing stigma and pivoted to "public relations" to sanitize similar practices for commercial and political use. This evolution mirrored broader causal dynamics: mass democracy's demands for public buy-in clashed with transparency ideals, rendering overt persuasion suspect.[18][3] World War II and the Cold War solidified "propaganda" as predominantly derogatory, evoking totalitarian control rather than mere advocacy. Nazi Germany's Reich Ministry of Public Enlightenment and Propaganda, established in 1933 under Joseph Goebbels, centralized media to enforce ideology, producing over 1,300 newspapers and films that distorted facts for regime loyalty, which postwar analyses framed as paradigmatic deception. In Allied and democratic contexts, the term increasingly delegitimized opponents' narratives—e.g., Soviet disinformation—while one's own efforts were rephrased as "information" or "counterpropaganda," highlighting its weaponized asymmetry.[18] By the late 20th century into the present, modern usage broadened to encompass advertising, journalism, and digital campaigns perceived as ideologically skewed, often without requiring outright falsehoods but implying selective framing or emotional appeals over evidence. This reflects causal realism in information ecosystems: amid fragmented media, the label critiques systemic biases, such as state media in authoritarian regimes (e.g., China's global broadcasting via CGTN since 2016) or Western outlets' narrative alignment, though accusations remain subjective and rarely self-applied. Scholarly distinctions persist—e.g., propaganda as intentional belief manipulation versus neutral persuasion—but colloquial deployment prioritizes pejorative intent, underscoring source credibility's role in evaluation.[3]
Definitions and Core Characteristics
Essential Elements
Propaganda fundamentally entails a deliberate, organized effort to influence the attitudes, beliefs, or behaviors of a target audience, typically through the selective presentation of information via mass communication channels. This distinguishes it from casual persuasion by its systematic nature and aim to advance ideological, political, or institutional agendas, often prioritizing emotional resonance over comprehensive factual disclosure. Harold Lasswell defined propaganda as the "manipulation of collective attitudes by the use of significant symbols (words, pictures, tunes) rather than violence, a 'peaceful' battle of words."[19] Jacques Ellul further characterized it as a sociological phenomenon inherent to technological mass societies, where it functions continuously to integrate individuals into prevailing social norms, rendering it unavoidable and total in scope.[20] Key elements include intentionality and secrecy regarding sources or ultimate goals, ensuring the message appears authoritative or spontaneous while concealing manipulative objectives. Ellul, drawing on earlier analyses like John Albig's, identified core definitional components: the covert nature of propaganda's origins and aims; an explicit intention to alter opinions or actions; broad dissemination to mass publics; unrelenting continuity rather than episodic bursts; and structured organization involving specialized personnel and resources.[20] These features enable propaganda to exploit pre-existing narratives, amplifying them through repetition and simplification to foster conformity or agitation without requiring overt coercion.[21] Psychological targeting forms another essential pillar, leveraging symbols, slogans, and emotional appeals—such as fear, pride, or enmity—to bypass rational scrutiny and embed ideas subconsciously. Lasswell's examination of World War I efforts highlighted techniques like stereotyping enemies and glorifying national virtues to unify publics, demonstrating propaganda's reliance on symbolic manipulation over empirical debate.[22] Unlike neutral information exchange, propaganda often omits counterevidence or distorts causality to attribute outcomes to favored narratives, as seen in its historical adaptation to media scales from print to digital platforms. Ellul noted that while falsehoods occur, propaganda's efficacy stems more from contextual orchestration of truths, fostering dependency on mediated realities.[21] In practice, these elements converge in orchestrated campaigns, where audience segmentation—via Lasswell's "who says what to whom" framework—tailors messages for maximum effect, such as reinforcing in-group solidarity against out-groups.[23] This causal mechanism, rooted in human predispositions to heuristic processing, underscores propaganda's distinction from education: the latter seeks autonomous understanding, while the former engineers compliance through perpetual exposure. Empirical analyses confirm that without mass reach and sustained application, such efforts devolve into mere advocacy, lacking propaganda's transformative potency.[24]
Distinctions from Persuasion and Influence
Propaganda differs from persuasion primarily in its intent, methods, and scope. Persuasion encompasses a broad range of communicative efforts to alter attitudes or behaviors through appeals to reason, evidence, or mutual interest, often in reciprocal or dialogic contexts.[25] In contrast, propaganda constitutes a deliberate, systematic subcategory of persuasion aimed at advancing a predetermined ideological or political agenda, frequently employing selective truths, omissions, or distortions to manipulate rather than inform.[26] This distinction hinges on propaganda's covert asymmetry: it prioritizes the propagandist's objectives over audience autonomy, using techniques like repetition and emotional priming to foster uncritical acceptance, as opposed to persuasion's potential for verifiable debate.[27] Jacques Ellul further delineates propaganda from mere persuasion by emphasizing its totalizing effect in modern technological societies, where it integrates individuals into a comprehensive worldview through pervasive, non-rational conditioning rather than isolated argumentative influence.[21] Edward Bernays, while framing propaganda as an essential mechanism for "establishing reciprocal understanding" between leaders and publics, acknowledged its manipulative underpinnings in mass-scale opinion engineering, blurring lines with public relations but underscoring its departure from transparent advocacy.[28] Empirical analyses, such as those by Jowett and O'Donnell, quantify this through propaganda's reliance on ideological intent—measured by the communicator's exclusion of counter-evidence—versus persuasion's openness to scrutiny, with historical cases like World War I atrocity stories illustrating propaganda's willingness to fabricate for mobilization.[26] Influence, broader than both, refers to any process—intentional or incidental—by which external factors shape cognition or action, including cultural osmosis or personal example without structured messaging.[29] Propaganda qualifies as a targeted form of influence only when it involves organized dissemination of biased information to predefined ends, distinguishing it from diffuse social pressures; for instance, state media campaigns during the Cold War systematically propagated narratives to sway alliances, unlike organic cultural shifts.[30] This causal realism reveals propaganda's efficacy in overriding individual reasoning via scale and repetition, as evidenced by studies showing higher susceptibility in low-information environments, whereas neutral influence lacks such engineered deceit.[10]
Historical Development
Ancient and Pre-Modern Instances
In ancient Mesopotamia, Assyrian kings disseminated propaganda through royal annals and palace reliefs to exaggerate military victories and instill fear in subjects and enemies. Ashurnasirpal II (r. 883–859 BCE), for instance, inscribed detailed accounts of campaigns involving mass executions and impalements, claiming to have built a palace with materials from 50 enemy cities while flaying thousands, thereby projecting unassailable power and divine favor.[31] Ancient Egyptian pharaohs similarly employed temple inscriptions and monumental art to fabricate narratives of triumph and god-like supremacy. Ramesses II's reliefs at the Ramesseum and Abu Simbel temples (circa 1270s BCE) portrayed the Battle of Kadesh against the Hittites (1274 BCE) as a personal rout of the enemy, omitting the battle's inconclusive outcome and subsequent peace treaty, to affirm his role as protector of ma'at (cosmic order) and deter internal dissent.[32][33] In classical Greece, strategic deception served propagandistic ends during conflicts. Athenian general Themistocles in 480 BCE spread false rumors via a "trusted" defector to mislead Persian king Xerxes into deploying his fleet at the narrow Salamis straits, enabling a Greek victory that orators like Aeschylus later mythologized in works such as The Persians to exalt Athenian heroism and democracy.[34] Roman emperors systematized visual and epigraphic propaganda to consolidate imperial legitimacy across diverse provinces. Augustus (r. 27 BCE–14 CE) commissioned the Res Gestae Divi Augusti, an autobiographical inscription erected posthumously at key sites like Ankara, enumerating 35 specific achievements—including closing temple doors signaling peace after 200 years of war—to frame his rule as a restoration of republican virtues rather than monarchy. Coins bearing his image alongside motifs of victory and piety circulated empire-wide, reinforcing loyalty among illiterate masses.[35][36] During the medieval period, the Catholic Church propagated crusading ideology through papal decrees and sermons to mobilize European knights against perceived Islamic threats. Pope Urban II's 1095 CE Council of Clermont address promised spiritual rewards for reclaiming Jerusalem, framing the First Crusade as a penitential pilgrimage divinely sanctioned, which chronicles like Fulcher of Chartres amplified by emphasizing miraculous signs and enemy atrocities to sustain fervor amid high casualties.[5] Secular rulers, such as England's Henry II, used illuminated manuscripts and charters to justify Angevin expansion, depicting conquests as rightful inheritance while vilifying rivals like Thomas Becket post-1170 assassination to mitigate rebellion.[37]
Enlightenment to World War I
The Enlightenment facilitated the expansion of print media, enabling the dissemination of political ideas beyond elite circles and foreshadowing systematic propaganda through appeals to reason and public sentiment. Pamphlets and essays critiqued absolutism, promoting concepts of liberty and governance that influenced revolutionary movements. This period's emphasis on literacy and debate shifted persuasion from oral traditions to reproducible texts, amplifying reach amid rising newspaper circulation in Europe and America.[38] In the American Revolution, Thomas Paine's Common Sense, published January 10, 1776, served as a pivotal propagandistic tool, framing British rule as tyrannical and advocating republican independence through accessible, emotive language that resonated with colonists. The 47-page pamphlet sold approximately 120,000 copies in its first three months, equivalent to reaching about one in five free Americans, and galvanized support for the Continental Congress's Declaration of Independence on July 4, 1776. Paine's subsequent The American Crisis series, beginning December 23, 1776, further boosted morale, with its opening line—"These are the times that try men's souls"—reportedly read aloud to troops before the Battle of Trenton.[39] The French Revolution intensified pamphlet warfare, with an estimated 100,000 distinct titles produced between 1788 and 1795, targeting the monarchy's legitimacy and rallying support for radical change through satirical caricatures, accusations of corruption, and visions of egalitarian utopias. Prints and engravings depicted figures like Marie Antoinette as decadent, fueling public outrage that contributed to events such as the storming of the Bastille on July 14, 1789. Revolutionary leaders, including the Jacobins, leveraged these materials to consolidate power, though their hyperbolic claims often distorted facts to justify purges like the Reign of Terror from September 1793 to July 1794.[40][41] Throughout the 19th century, nationalism drove propagandistic campaigns in Europe, particularly in fragmented states seeking unification. In German territories after the Napoleonic Wars, authorities promoted national identity via patriotic festivals, monuments, and school curricula emphasizing shared language and history, as seen in the 1817 Wartburg Festival where students burned foreign symbols to assert cultural purity. Similar efforts in Italy during the Risorgimento, led by figures like Giuseppe Mazzini, used writings and secret societies to evoke romanticized past glories against Austrian dominance, culminating in unification by 1870. The rise of cheap newspapers, such as Britain's Daily Telegraph reaching 240,000 circulation by 1877, allowed governments and movements to shape public opinion on imperial expansion and domestic reforms.[42] World War I elevated propaganda to a state-orchestrated industry, with belligerents deploying agencies to sustain recruitment, morale, and resource mobilization amid total war. Britain's Wellington House, operational from September 1914, circulated reports of German atrocities in Belgium—such as the alleged execution of 6,000 civilians in Dinant on August 23, 1914—to vilify the Kaiser and justify Allied intervention, though subsequent inquiries revealed exaggerations in claims like widespread bayoneting of babies. 
In the United States, the Committee on Public Information, formed April 13, 1917, under George Creel, distributed over 75 million pamphlets and produced 6,000 reels of film, employing slogans like "The Hun Within" to stoke fears of subversion and enforce loyalty via the Espionage Act of 1917, which prosecuted over 2,000 dissenters. Techniques emphasized enemy dehumanization, patriotic symbolism, and atrocity narratives, but post-war revelations, including the 1920 Bryce Committee disavowals, exposed fabricated elements designed to override rational skepticism for causal support of war aims.[43][7][44]
Interwar and World War II Totalitarian Regimes
Totalitarian regimes in the interwar period and World War II elevated propaganda to a core mechanism of governance, centralizing control over information to indoctrinate populations, legitimize leaders, and mobilize societies for ideological conformity and conflict. In Nazi Germany, Fascist Italy, Stalinist Soviet Union, and Imperial Japan, state apparatuses systematically deployed media monopolies, mass events, and repetitive messaging to demonize enemies, exalt rulers, and suppress dissent, achieving unprecedented penetration into daily life.[45] These efforts relied on modern technologies like radio and film, enabling regimes to bypass traditional elites and directly influence the masses.[46] Nazi Germany's propaganda machine, directed by Joseph Goebbels after his appointment as Reich Minister for Public Enlightenment and Propaganda on March 13, 1933, achieved total dominance over print, broadcast, and visual media within months of Hitler's rise.[47] Goebbels insisted that effective propaganda required a single authoritative source to prevent contradictory messages, subordinating all cultural and informational outlets to the Reich Ministry.[48] Key techniques included the "big lie" strategy—coined in Hitler's Mein Kampf but operationalized through ceaseless repetition of falsehoods, such as Jewish conspiracies orchestrating Germany's World War I defeat—and orchestration of spectacles like the annual Nuremberg rallies, attended by over 400,000 in 1938, to instill fervor.[49] Radio broadcasts reached 70% of households by 1939, with cheap "People's Receivers" designed for one-purpose listening to state content.[46] Anti-Semitic campaigns, via outlets like Der Stürmer, escalated from 1933 boycotts to justifying the Holocaust by portraying Jews as existential threats.[50] In the Soviet Union under Stalin, the Communist Party's Agitprop (Agitation and Propaganda) Department, formalized in the 1920s and intensified during the 1930s Great Purges, coordinated indoctrination across newspapers like Pravda, films, posters, and theater troupes targeting workers.[51] Propaganda glorified collectivization and industrialization as triumphs over "kulaks" and saboteurs, with Stalin depicted in over 5,000 statues by 1940 and films like Lenin in October (1937) rewriting history to center his role.[52] Techniques emphasized "agitpoints"—mobile units disseminating simplified Bolshevik ideology—and suppression of facts, such as the 1932-1933 Holodomor famine killing 3-5 million, reframed as capitalist slander.[53] By World War II, renamed the Propaganda Department in 1946 but active earlier, it mobilized 20 million Red Army recruits partly through patriotic narratives blending Marxism with Russian nationalism.[52] Fascist Italy under Mussolini centralized propaganda through the Ministry of Popular Culture, established in 1937, which censored press and cinema while promoting imperial revival via youth groups like Balilla and grandiose architecture emulating Rome.[54] Mussolini's persona as Il Duce was propagated through 3,000+ speeches broadcast on radio from 1924 onward, emphasizing virility and anti-Bolshevism, with campaigns like the 1935 Ethiopia invasion framed as civilizing missions.[55] Despite early successes in consolidating power post-1922 March on Rome, propaganda faltered in sustaining war enthusiasm, as battlefield defeats in 1940-1943 exposed regime boasts, contributing to Mussolini's 1943 ouster.[54] Film production, under state control from 1922, produced over 1,000 features by 1943, often 
embedding fascist values subtly to evade public resistance.[56] Imperial Japan's Cabinet Information Bureau, created in 1936 and expanded during WWII, enforced media compliance to propagate the "Greater East Asia Co-Prosperity Sphere" as liberation from Western imperialism, censoring dissent under the 1925 Peace Preservation Law.[57] Techniques involved school indoctrination, radio scripts reaching 80% coverage by 1941, and posters depicting Allied forces as barbaric, sustaining mobilization despite defeats like Midway in 1942.[58] The Kempeitai military police augmented this with terror, arresting 70,000 for "thought crimes" by 1945, ensuring propaganda's coercive efficacy.[59] These regimes' propaganda not only facilitated internal purges—Nazi Night of the Long Knives (1934), Soviet Great Terror (1936-1938)—but also wartime atrocities, with dehumanizing narratives enabling events like the Nazi Einsatzgruppen killings of 1.5 million Jews by 1943 and Japanese Nanjing Massacre (1937, 200,000+ deaths).[45] Postwar analyses reveal propaganda's limits against empirical failures, as Allied victories eroded credibility, underscoring its dependence on perceived successes for sustained belief.[54]
Cold War and Decolonization Era
The Cold War (1947–1991) featured intense ideological propaganda between the United States and the Soviet Union, each seeking to portray its system as superior while demonizing the opponent. The United States initiated the "Campaign of Truth" on April 20, 1950, when President Harry Truman called for expanded information efforts to combat "the big lie" of Soviet propaganda, emphasizing factual broadcasting over deception. This initiative boosted funding for outlets like Voice of America (VOA), which by the 1950s transmitted news in over 40 languages to counter Soviet narratives in Europe, Asia, and beyond. Complementing VOA, Radio Free Europe (RFE) commenced operations on July 4, 1950, delivering uncensored news and cultural programming to Soviet-occupied Eastern Europe from Munich, initially under covert CIA funding to undermine communist regimes without direct U.S. government attribution. The Soviet Union responded with state-controlled media such as Radio Moscow, which broadcast anti-capitalist messages globally, often exaggerating Western imperialism and U.S. racial inequalities to erode American credibility. Soviet propaganda also glorified proletarian internationalism through posters, films, and literature, depicting the USSR as the vanguard against fascism and exploitation, though these efforts relied on centralized censorship that suppressed dissenting views. In 1953, the U.S. formalized its propaganda apparatus by creating the United States Information Agency (USIA), tasked with disseminating American values like democracy and free enterprise via libraries, films, and exchanges in over 100 countries, reaching millions annually during the era's peak. USIA materials highlighted economic successes under capitalism, such as post-Marshall Plan recoveries in Western Europe, contrasting them with Soviet famines and purges. The Soviets, through agencies like the KGB and Cominform (until 1956), propagated narratives of inevitable communist victory, using agitprop in occupied territories and allied states to foster loyalty. This bilateral contest extended to psychological operations; for instance, U.S. leaflet drops and broadcasts during the Korean War (1950–1953) urged North Korean defections by promising humane treatment, while Soviet counterparts accused the U.S. of bacteriological warfare without evidence. Decolonization from the late 1940s to the 1970s amplified superpower propaganda as over 50 African and Asian nations gained independence, becoming battlegrounds for influence. The Soviet Union positioned itself as an anti-colonial champion, supporting liberation movements with rhetoric and material aid; for example, it backed the Algerian Front de Libération Nationale (FLN) during the 1954–1962 war against France through propaganda framing Moscow as a partner in dismantling imperialism. Soviet posters and broadcasts in the 1950s–1960s celebrated African decolonization waves, such as Ghana's independence in 1957, while critiquing Western neocolonialism to attract leaders like Kwame Nkrumah. This approach yielded alliances, including Cuba's 1959 revolution and Soviet arms to Angola's MPLA in the 1970s civil war, where propaganda portrayed interventions as solidarity against "reactionary" forces. The United States countered with development-focused messaging via USIA and AID programs, promoting non-communist paths to modernity; in the Congo Crisis (1960–1965), U.S. 
operations, including CIA-backed propaganda, supported Joseph Mobutu against Soviet-favored Patrice Lumumba, emphasizing stability and anti-communism to prevent Soviet footholds. U.S. efforts often involved covert media manipulation, such as funding anti-communist outlets in Indonesia during the 1965 coup, though these prioritized geopolitical containment over unvarnished truth. Both powers exploited local grievances—Soviets via class struggle appeals, Americans via modernization promises—but Soviet state monopoly on information enabled more uniform narratives, while U.S. initiatives faced domestic scrutiny over covert elements. In non-aligned forums like the 1955 Bandung Conference, propaganda clashes highlighted third-world leaders' navigation of bipolar pressures, with Soviet denunciations of colonialism contrasting U.S. portrayals of partnership.
Post-Cold War to Digital Age
The dissolution of the Soviet Union in 1991 reduced the scale of global ideological propaganda contests, yet propaganda persisted in regional ethnic conflicts and Western-led interventions. In the Yugoslav Wars of the 1990s, Serbian state-controlled media under Slobodan Milošević propagated narratives of Serb victimhood and demonized other ethnic groups, inciting violence through broadcasts that exaggerated threats and historical grievances.[60] [61] Similarly, during the 2003 U.S.-led invasion of Iraq, administration officials cited intelligence on weapons of mass destruction—later revealed as erroneous—to justify preemptive action, shaping public opinion and media coverage to emphasize imminent threats from Saddam Hussein's regime.[62] [63] The rise of the internet and social media platforms in the 2000s marked a pivotal shift toward decentralized, rapid dissemination of propaganda, enabling both grassroots mobilization and state countermeasures. The Arab Spring protests from 2010 to 2012 demonstrated this duality: activists in Tunisia, Egypt, and elsewhere used Facebook and Twitter to organize demonstrations and share uncensored footage, circumventing regime monopolies on traditional media, though participation remained limited to digitally connected urban elites.[64] [65] Authoritarian governments responded by enhancing digital surveillance, deploying pro-regime bots, and fabricating counter-narratives to discredit protesters as foreign agents.[66] State actors adapted legacy tactics to cyberspace, with Russia and China exemplifying hybrid approaches blending official outlets and covert operations. Russia's Internet Research Agency, a St. Petersburg-based entity operational by 2013, ran troll farms employing hundreds to post divisive content on platforms like Facebook and Twitter, including efforts to exacerbate U.S. racial tensions and influence the 2016 presidential election through fake accounts and targeted ads reaching millions.[67] [68] China, meanwhile, orchestrates vast astroturfing campaigns, generating approximately 448 million fabricated social media comments yearly to amplify positive state narratives, distract from criticisms, and promote policies like the Belt and Road Initiative via state media and influencers.[69] [70] Advancements in artificial intelligence have further amplified digital propaganda's potency since the mid-2010s, facilitating deepfakes and automated content generation. Russian operations during the 2022 Ukraine invasion incorporated AI-generated deepfakes and sham websites to spread false narratives, such as fabricated atrocities to undermine Western support.[71] Chinese state-linked actors have used AI for similar ends, including deepfake videos targeting Taiwanese elections to erode trust in democratic institutions.[72] These tools extend Cold War-era disinformation—rooted in KGB "active measures"—into algorithmic precision, allowing precise targeting while challenging attribution and verification.[73]
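Researchers who study such computational propaganda typically infer coordination from posting patterns rather than from content alone. The following is a minimal, hypothetical Python sketch of one common heuristic, flagging near-identical messages pushed by many accounts within a short window; the data schema and thresholds are assumptions for illustration, and real detection systems combine far richer signals such as account age, network structure, and posting cadence.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def normalize(text: str) -> str:
    """Crude normalization so trivially edited copies still match."""
    return " ".join(text.lower().split())

def flag_coordinated_posts(posts, window_minutes=10, min_accounts=5):
    """Group posts by normalized text and flag any message pushed by many
    distinct accounts inside a short time window, a simple proxy for
    copy-paste amplification (a hint, not proof of a troll farm).

    `posts` is an iterable of (account_id, timestamp, text) tuples;
    the schema is illustrative, not any platform's real API."""
    by_text = defaultdict(list)
    for account, ts, text in posts:
        by_text[normalize(text)].append((ts, account))

    flagged = []
    window = timedelta(minutes=window_minutes)
    for text, entries in by_text.items():
        entries.sort()  # chronological order
        for start_ts, _ in entries:
            burst = {a for t, a in entries if start_ts <= t <= start_ts + window}
            if len(burst) >= min_accounts:
                flagged.append((text, sorted(burst)))
                break
    return flagged

if __name__ == "__main__":
    t0 = datetime(2016, 10, 1, 12, 0)
    sample = [
        (f"acct_{i}", t0 + timedelta(minutes=i), "They are hiding the truth from you!")
        for i in range(6)
    ] + [("acct_99", t0, "Nice weather today.")]
    # Only the message copied by six accounts within minutes is returned.
    print(flag_coordinated_posts(sample))
```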
Techniques and Methodologies
Psychological Manipulation Tactics
Psychological manipulation tactics in propaganda target cognitive and emotional vulnerabilities to shape perceptions and induce compliance, often circumventing rational evaluation. These tactics draw on principles of social psychology, including the exploitation of heuristics, biases, and group dynamics, to foster uncritical acceptance of messages. Unlike transparent persuasion, they prioritize subconscious influence through repetition, emotional priming, and selective framing, as Edward Bernays outlined in his 1928 book Propaganda, where he emphasized manipulating "organized habits and opinions" via psychological stimuli to form habits without conscious resistance.[28] Empirical studies confirm their efficacy; for instance, repeated exposure to claims increases perceived truthfulness via the illusory truth effect, regardless of factual accuracy, as demonstrated in experiments where subjects rated statements as more valid after multiple viewings.[74] A seminal classification comes from the Institute for Propaganda Analysis, established in 1937, which identified seven core devices based on observed patterns in mass communication during the interwar period. These devices systematically exploit emotional responses and cognitive shortcuts (an illustrative encoding of the taxonomy is sketched after the list):[75]
- Name-calling: Propagandists attach loaded, negative labels (e.g., "traitor" or "extremist") to opponents or ideas to provoke instinctive aversion and prejudice, bypassing evidence-based scrutiny. This tactic leverages affective bias, where emotional disgust overrides factual assessment.[76]
- Glittering generalities: Positive, vague terms like "freedom" or "justice"—linked to cherished values but devoid of specifics—are invoked to evoke uncritical approval, exploiting the halo effect where association with ideals transfers unearned credibility.[76]
- Transfer: Symbols of authority, sanctity, or prestige (e.g., flags, religious icons) are borrowed to lend legitimacy to unrelated claims, capitalizing on conditioned respect to manipulate associations.[76]
- Testimonial: Endorsements from ostensibly credible figures (celebrities, experts) are used to sway audiences, invoking the authority bias even when the endorser's expertise is irrelevant or fabricated.[76]
- Plain folks: Propagandists present themselves or their messages as relatable to ordinary people, fostering trust through feigned commonality and reducing perceived elitism, which appeals to in-group identification.[76]
- Card stacking: Selective presentation of facts—omitting contradictions or unfavorable data—creates a skewed narrative, exploiting confirmation bias by reinforcing desired interpretations while ignoring disconfirming evidence.[76]
- Bandwagon: Urging adoption of a position by claiming "everyone" supports it, this preys on social proof and conformity pressures, as individuals conform to perceived majorities to avoid isolation, a dynamic amplified in group settings.[76]
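As a purely illustrative aside, and not part of the Institute's own work, the taxonomy above is simple enough to encode the way a media-literacy exercise might: a mapping from each device to example cue phrases, with a naive matcher that suggests which devices a passage could exhibit. The cue phrases below are invented for the sketch; keyword matching can only prompt discussion, since reliably identifying propaganda devices requires human judgment of context and intent.

```python
# Hypothetical encoding of the IPA's seven devices (1937) for a toy
# labeling exercise; the example cues are invented, not from the IPA.
IPA_DEVICES = {
    "name_calling": ["traitor", "extremist", "enemy of the people"],
    "glittering_generalities": ["freedom", "justice", "honor"],
    "transfer": ["flag", "sacred", "founding fathers"],
    "testimonial": ["experts agree", "as the doctor says"],
    "plain_folks": ["ordinary folks like you", "just a regular person"],
    "card_stacking": ["the only fact you need to know"],
    "bandwagon": ["everyone is joining", "don't be left behind"],
}

def naive_device_hints(text: str) -> list[str]:
    """Return the devices whose cue phrases appear in `text`.
    A keyword match is only a hint for discussion, never a verdict."""
    lowered = text.lower()
    return [device for device, cues in IPA_DEVICES.items()
            if any(cue in lowered for cue in cues)]

print(naive_device_hints("Everyone is joining the fight for freedom!"))
# ['glittering_generalities', 'bandwagon']
```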
Rhetorical and Narrative Devices
Propaganda frequently employs rhetorical devices to manipulate emotions and bypass rational scrutiny, drawing from classical appeals to ethos, pathos, and logos but distorting them for ideological ends. The Institute for Propaganda Analysis, founded in 1937 by educators including Clyde Miller, systematically outlined seven key devices in its publications to educate the public on detecting manipulative rhetoric amid rising totalitarian influences in Europe.[80] These include name-calling, which substitutes derogatory labels for substantive debate, such as branding political opponents as "traitors" or "enemies of the people" to incite visceral rejection without evidence; for instance, Soviet propaganda under Stalin routinely applied such terms to purge rivals, contributing to the execution of over 680,000 individuals deemed disloyal between 1937 and 1938.[81][82] Glittering generalities invoke vague, emotionally charged virtues like "freedom" or "honor" to link ideas to unassailable ideals, evading specific scrutiny; Nazi propaganda exalted the "Volk" community in this manner to foster uncritical loyalty, as seen in speeches by Joseph Goebbels emphasizing abstract "Aryan purity" without empirical backing.[76] Transfer associates a cause with respected symbols, such as draping policies in religious or national icons to borrow their prestige—British World War I posters transferred imperial glory to recruitment drives, portraying enlistment as a sacred duty akin to historical heroism.[81] Testimonial leverages endorsements from admired figures, often out of context; for example, during the 1930s, fascist regimes secured quotes from intellectuals to legitimize expansionism, despite the endorsers' limited expertise in geopolitics.[81] Plain folks portrays leaders as ordinary people to build relatability and trust, masking elite agendas; American politicians in the 20th century, including Franklin D. Roosevelt, used radio "fireside chats" starting in 1933 to project approachable personas amid economic crisis.[81] Card stacking selectively presents facts while omitting counterevidence, creating a skewed reality; tobacco industry campaigns in the mid-20th century highlighted isolated studies on mildness to downplay health risks, influencing public perception until epidemiological data from the 1950s exposed the deception.[81] Bandwagon exploits conformity by implying widespread support, urging individuals to join the "winning side"; this was evident in Cold War-era McCarthyist rhetoric claiming inevitable communist takeover unless opposed en masse, amplifying fears documented in congressional hearings from 1950 to 1954.[81] Narrative devices in propaganda construct overarching stories that simplify complex realities into digestible, emotionally resonant plots, often employing binary oppositions of protagonists versus antagonists to foster group cohesion. 
Demonization narratives frame adversaries as existential threats embodying pure evil, as in Imperial Japanese propaganda during World War II depicting Americans as barbaric "devils" to justify aggression, a tactic analyzed in postwar declassified materials revealing its role in sustaining troop morale.[83] Hero-villain archetypes glorify in-group figures while vilifying out-groups, evident in Bolshevik narratives post-1917 Revolution portraying Lenin as a savior against "bourgeois villains," which omitted internal famines like the 1921-1922 Volga crisis that killed over 5 million.[84] Framing techniques selectively emphasize attributes to shape interpretation, such as portraying economic policies as "rescue missions" during crises while ignoring causal failures; this was critiqued in analyses of interwar fascist media, where recovery claims under Mussolini ignored persistent unemployment rates exceeding 20% in Italy by 1939.[9] Repetition reinforces narratives through redundancy, embedding them subconsciously—Goebbels' principle that a lie repeated becomes truth underpinned Nazi radio broadcasts from 1933 onward, which aired anti-Semitic tropes daily to normalize them among the populace.[85] These devices, while rooted in universal cognitive biases toward storytelling, enable propagandists to engineer consent by prioritizing causal narratives that align with power interests over verifiable data, as empirical studies in social psychology have since corroborated through experiments on persuasion susceptibility.[10]
Technological and Media Strategies
The advent of mass communication technologies has amplified the reach and precision of propaganda efforts by enabling rapid, scalable dissemination of targeted messages to large audiences. The printing press, invented by Johannes Gutenberg around 1440, marked an early technological milestone, allowing for the inexpensive production of pamphlets and books that spread ideological narratives, such as those during the Protestant Reformation where Martin Luther's writings reached broad European readerships within months.[86] This shift from manuscript copying to mechanized printing reduced costs and barriers, facilitating state and religious authorities' control over information flows while enabling dissident voices to challenge orthodoxies through vernacular translations.[9] Broadcast media, particularly radio, revolutionized propaganda during the 20th century by providing one-to-many communication that bypassed literacy requirements and penetrated private homes. In Nazi Germany, Joseph Goebbels' Propaganda Ministry, established in 1933, centralized radio control, equipping 70-80% of households with affordable "Volk receivers" by 1939 to broadcast speeches and ideological content, fostering national unity and demonizing enemies in real-time during events like the 1936 Berlin Olympics.[87] Similarly, Allied forces employed radio for morale-boosting broadcasts and psychological operations, such as the BBC's wartime programming that reached millions across Europe. Film complemented radio's auditory focus with visual symbolism; Leni Riefenstahl's Triumph of the Will (1935) used innovative cinematography to glorify Adolf Hitler, screening to over 10 million Germans and influencing cinematic propaganda techniques worldwide.[88] These media allowed propagandists to synchronize messages across formats, exploiting emotional appeals through synchronized sound and imagery for greater persuasive impact.[89] In the post-World War II era, television extended these strategies by combining motion pictures with live broadcasting, enabling immersive narratives that shaped public perceptions during conflicts like the Cold War. State broadcasters in the Eastern Bloc, such as Poland's Telewizja Polska, aired scripted content promoting collectivism, while Western networks like CBS disseminated anti-communist footage, with viewership spiking to 90% of U.S. households by 1960 for events like the Kennedy-Nixon debates, which highlighted television's role in image-based persuasion.[86] The digital revolution from the 1990s onward introduced algorithmic amplification and micro-targeting, where platforms like Facebook and Twitter (now X) use data analytics to tailor content, creating filter bubbles that reinforce biases; a 2021 Oxford study documented over 80 countries employing computational propaganda, including bots generating 20-30% of certain political discussions to sway elections.[79] Social media's virality, driven by engagement metrics favoring sensationalism, has enabled state actors like Russia's Internet Research Agency to deploy troll farms, disseminating 2016 U.S.
election interference content viewed by millions, while non-state groups leverage encrypted apps for decentralized coordination.[90] Emerging technologies such as deepfakes and AI-generated content further refine media strategies by fabricating hyper-realistic audiovisual deceptions, with instances like 2023 videos mimicking Ukrainian President Volodymyr Zelenskyy surrendering, viewed over 10 million times before removal, illustrating risks of eroded trust in visual evidence.[91] These tools exploit cognitive heuristics and prioritize speed over verification, underscoring how technological advancements favor virality and personalization over factual accuracy, often amplifying propaganda in low-gatekeeper environments.[9]
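The engagement-driven amplification described above can be illustrated with a deliberately simplified ranking model. In this hypothetical Python sketch, the post fields and scoring weights are assumptions chosen for the example rather than any platform's actual algorithm; the point is only that a feed ranked purely on engagement signals surfaces the most emotionally charged item regardless of its accuracy.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int
    fact_checked: bool  # ignored by the ranker, which is the point

def engagement_score(post: Post) -> float:
    """Toy ranking: weight shares and comments above likes because they
    spread content further; accuracy contributes nothing to the score."""
    return post.likes + 3 * post.shares + 2 * post.comments

feed = [
    Post("Routine budget report published", 120, 5, 10, True),
    Post("SHOCKING: they don't want you to see this!", 80, 400, 150, False),
    Post("Local council meeting minutes", 40, 2, 3, True),
]

# The unverified, sensational post ranks first by a wide margin.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>6.0f}  {post.text}")
```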
Categories of Propaganda
Political and Ideological Forms
Political propaganda involves systematic campaigns by governments, parties, or movements to influence public opinion toward specific policies, leaders, or electoral outcomes, often employing biased narratives to mobilize support or discredit opponents.[92] Ideological forms extend this by promoting overarching belief systems, such as racial hierarchies or class struggle doctrines, framing them as inevitable truths while suppressing contradictory evidence.[93] These efforts typically rely on state-controlled media in authoritarian contexts to achieve saturation, contrasting with more fragmented applications in democracies where independent outlets limit total dominance.[94]
In Nazi Germany, ideological propaganda under Joseph Goebbels' Ministry of Public Enlightenment and Propaganda centralized control over radio, film, and press to inculcate Aryan supremacy and anti-Semitism as core tenets, portraying Jews as existential threats to the Volk through posters, newsreels like Der Ewige Jude, and school curricula revised by 1933 to exclude "degenerate" influences.[93][95] This apparatus facilitated the regime's shift from electoral gains in 1932—when the Nazis became Germany's largest party—to dictatorial consolidation, with propaganda deceiving the public on events like the staged Gleiwitz incident to justify invading Poland on September 1, 1939.[96][97] Academic analyses note that while such propaganda exploited economic despair post-Versailles Treaty, its effectiveness stemmed from repetitive demonization rather than empirical validation, as evidenced by sustained support amid military setbacks after 1943.[98]
Fascist Italy under Benito Mussolini similarly harnessed propaganda to forge a mythic national identity, using posters and rallies to exalt the Duce's persona against Bolshevik threats and glorify Roman imperial revival, with media like the Istituto Luce producing films that reached millions by the 1930s.[99] Official manifestos in voting stations listed Mussolini atop candidate slates, embedding party loyalty into electoral processes from 1924 onward, while events like the 1932 Exhibition of the Fascist Revolution reinforced ideological continuity from ancient Rome.[100] This approach, blending antiquity motifs with modern mass media, sustained regime stability until the Allied invasions of 1943, though sources from Western archives highlight its role in masking economic stagnation under corporatist policies.[55]
Soviet communist propaganda propagated Marxist-Leninist ideology through historical revisionism, such as the 1938 Short Course on the History of the All-Union Communist Party that airbrushed rivals like Trotsky and reframed events to depict the Bolshevik Revolution as a predestined proletarian triumph, disseminated via posters and Pravda to over 100 million citizens by Stalin's death in 1953.[101] It emphasized class enemies as saboteurs, justifying purges that executed 681,692 people in 1937-1938 alone, while glorifying the Five-Year Plans despite famines like the 1932-1933 Holodomor killing 3-5 million Ukrainians, which propaganda attributed to kulak resistance rather than collectivization failures.[102] Post-WWII, it pivoted to anti-fascist narratives while promoting global revolution, with continuity in Russian state media tactics observed into the 2020s.[103]
In democracies, political propaganda often surfaces in wartime mobilization or elections, as with the U.S. Committee on Public Information's 1917-1919 posters urging enlistment against German "Huns," which reached 20 million via 3,000 speakers but faced postwar backlash for exaggerating atrocity claims.[7] Modern instances include partisan ads card-stacking facts, yet pluralistic media and fact-checking mitigate totalitarian-style indoctrination, though studies indicate vulnerability to echo chambers in digital eras.[104] Sources from military academies emphasize that while authoritarian regimes integrate propaganda into governance for ideological hegemony, democratic variants prioritize persuasion over coercion, reflecting causal differences in institutional accountability.[92]
Wartime and Conflict-Related
[Image: "I Want You for U.S. Army" recruitment poster by James Montgomery Flagg]
Wartime propaganda encompasses government-led campaigns to mobilize populations, sustain morale, recruit personnel, and delegitimize adversaries during armed conflicts. These efforts often employ posters, films, leaflets, and media broadcasts to foster unity and portray the enemy as a barbaric or existential threat. In World War I, the United States established the Committee on Public Information (CPI) in April 1917 under George Creel to coordinate propaganda, producing over 2,000 titles in posters, pamphlets, and films that emphasized American exceptionalism and German atrocities.[17] The CPI's "Four Minute Men" initiative deployed 75,000 volunteers to deliver short speeches in theaters and public spaces, reaching a cumulative audience estimated at 400 million and contributing to war bond sales exceeding $18 billion.[105] British propaganda similarly amplified reports of German crimes in Belgium, as detailed in the 1915 Bryce Report, which, while based on witness accounts, included unverified claims of bayoneting babies to incite Allied support.[106]
During World War II, propaganda intensified with state-controlled apparatuses on both sides. Nazi Germany's Reich Ministry of Public Enlightenment and Propaganda, headed by Joseph Goebbels from March 1933, monopolized media to glorify the regime, demonize Jews and Allies, and justify expansionism through films like Triumph of the Will (1935) and radio broadcasts reaching millions.[107] The ministry orchestrated the 1933 book burnings and censored dissent, fostering a cult of personality around Adolf Hitler that sustained domestic support until late 1944.[87] In response, the U.S. Office of War Information (OWI), created in June 1942, disseminated posters such as "Rosie the Riveter" to encourage women's workforce participation, boosting female employment from 12 million in 1940 to 18 million by 1944, while films and cartoons depicted Axis powers as monstrous aggressors.[108] Allied campaigns also included leaflet drops over enemy territories, with the U.S. distributing over 6 billion leaflets in Europe alone to undermine morale and promote surrender.[83]
In later conflicts, propaganda adapted to asymmetric warfare and media landscapes. During the Vietnam War (1955–1975), North Vietnamese forces used posters and radio to frame the U.S. as imperial invaders, portraying downed American aircraft as victories to rally domestic support and international sympathy.[109] U.S. efforts, including over 20 billion leaflets via psychological operations, aimed to induce defections but largely failed amid graphic media coverage of events like the Tet Offensive in January 1968, which shifted American public opinion against the war despite tactical U.S. successes.[110] In the 2003 Iraq War, U.S. administration claims of weapons of mass destruction and Saddam Hussein's alleged 9/11 ties, echoed uncritically by major media, facilitated initial invasion support but eroded credibility post-invasion when no such stockpiles were found, as confirmed by the 2004 Iraq Survey Group report.[111] Embedded journalism and "shock and awe" framing further shaped perceptions, though insurgent videos via early internet platforms countered official narratives.[62] These cases illustrate propaganda's dual role in short-term mobilization and long-term risks of backlash when discrepancies emerge.
Commercial and Economic
Commercial propaganda adapts systematic persuasion methods, originally refined during World War I, to the marketing of products, fostering consumerism by linking goods to emotional desires, social status, and cultural narratives rather than mere utility.[18] This approach treats consumers as malleable audiences whose behaviors can be directed toward profit maximization, often prioritizing psychological influence over factual product attributes.[112] Edward Bernays, leveraging insights from his uncle Sigmund Freud's theories on the unconscious, pioneered these tactics in the 1920s by reorienting wartime propaganda toward private enterprise.[113] In his 1928 book Propaganda, Bernays argued that an "invisible government" of public relations experts must organize public opinion to avert chaos, explicitly applying crowd psychology to boost sales for industries like tobacco and appliances.[114] A landmark example was his 1920s breakfast campaign for Beech-Nut Packing, where he commissioned surveys of 5,000 physicians endorsing bacon as a health-promoting food, resulting in widespread media adoption of "bacon and eggs" as the ideal meal and a sales surge.[112] Similarly, the 1929 "Torches of Freedom" effort for American Tobacco staged a march of hired women smoking cigarettes during New York's Easter Parade, framing the act as a symbol of gender emancipation and normalizing female consumption, which correlated with a rise in women smokers from 5% in 1924 to 12% by 1929.[113] Common techniques mirror political propaganda: bandwagon appeals urge purchases by implying universal participation, as in ads claiming "everyone's switching to [brand]"; testimonials deploy celebrities or experts for endorsement, like athlete-backed energy drinks; and transfer associates products with aspirational values, such as luxury cars evoking freedom or prestige.[115] These methods, while effective in driving revenue—U.S. advertising spending rose from $1.3 billion in 1920 to $3.4 billion by 1930—have drawn scrutiny for cultivating artificial needs and debt-driven economies, with Bernays himself acknowledging the deliberate creation of demand to sustain growth.[18][112]
Economic propaganda, distinct yet overlapping, involves states or institutions deploying similar tools to legitimize policies, obscure failures, or rally support for resource distribution amid scarcity or ideology. In the Soviet Union, Joseph Stalin's First Five-Year Plan (1928–1932) was propagandized via posters, films, and rallies depicting steel mills and collective farms as engines of proletarian triumph, with slogans like "Fulfill the Five-Year Plan in Four!" mobilizing labor quotas under threat of purge.[116] Official claims touted industrial output growth—steel production jumped from 4 million tons in 1928 to 5.9 million in 1932—but concealed inefficiencies, forced collectivization, and the Holodomor famine killing millions, using metrics selectively to project socialist superiority.[117] Such campaigns sustained regime control by equating economic sacrifice with ideological destiny, influencing subsequent plans through 1941.[118] In the United States, the Roosevelt administration's New Deal (1933–1939) employed posters and radio broadcasts to portray programs like the National Recovery Administration as collective salvation from the Great Depression, with symbols of unity and recovery encouraging compliance despite mixed empirical outcomes, such as temporary unemployment spikes attributed to its codes.[7] During World War II, Treasury Department efforts sold $185 billion in war bonds via celebrity drives and ads framing purchases as economic patriotism, while rationing campaigns justified shortages by emphasizing shared burden, achieving 85 million participants by 1945.[83] These instances highlight economic propaganda's role in aligning public action with policy imperatives, often amplifying successes while downplaying causal trade-offs like inflation or coercion.[5]
Religious and Cultural
Religious propaganda refers to organized efforts by religious authorities to disseminate doctrines, inspire devotion, and expand influence through persuasive narratives, symbols, and media. The term "propaganda" originated with the Roman Catholic Church's Sacra Congregatio de Propaganda Fide, founded on June 22, 1622, by Pope Gregory XV through the bull Inscrutabili Divinae Providentiae, to oversee missionary propagation amid the Protestant Reformation and European colonial ventures.[119] This congregation standardized training at the Urban College, funded expeditions, and produced vernacular texts, contributing to Catholicism's growth in regions like Latin America, where by 1700, millions had been baptized, often blending evangelization with colonial administration.[120] Early Christianity employed similar tactics; the Apostle Paul's epistles, circulated from approximately 50-60 CE, adapted Jewish messianic claims to Gentile contexts, using rhetoric to counter Roman paganism and foster communities across the empire.[121]
In Islam, da'wah—the call to faith—has functioned as a core propagation strategy since the 7th century, with the Prophet Muhammad's Meccan preaching (610-622 CE) emphasizing monotheism through public recitation and treaties, later expanding via conquests that integrated persuasion with territorial control.[122] By the Umayyad Caliphate (661-750 CE), da'wah incorporated administrative policies favoring converts, such as tax incentives, leading to rapid demographic shifts in the Middle East and North Africa, where non-Muslim populations declined from majorities to minorities over centuries. Modern Islamist groups, including those affiliated with the Muslim Brotherhood since 1928, have digitized da'wah via social media, reaching billions while framing it as defensive against secularism, though critics note its selective emphasis on appealing verses over doctrinal rigor.[123]
Cultural propaganda promotes or defends shared identities, norms, and aesthetics to foster cohesion or superiority, often intersecting with religious elements. During British colonialism in India (1858-1947), officials deployed photography and exhibitions to portray indigenous customs as primitive, justifying "civilizing" interventions; for instance, images of sati or caste practices, captured post-1857 Rebellion, were exhibited in London to garner public support for empire, despite selective framing that ignored adaptive reforms.[124] In Nazi Germany (1933-1945), Joseph Goebbels' Ministry of Propaganda synchronized culture via the Reich Chamber of Culture, purging over 16,000 "degenerate" artworks in 1937 exhibitions while glorifying Nordic myths in films like Triumph of the Will (1935), which drew 500,000 viewers to instill racial purity as cultural destiny.[125] Post-World War II, U.S. cultural exports—Hollywood films averaging 200 annual releases by the 1950s—projected democratic individualism abroad, influencing global tastes but critiqued for eroding local traditions, as evidenced by European quotas limiting American imports to counter "Coca-Colonization."[126] Such efforts highlight propaganda's dual role in preservation, as seen in indigenous resistance media, and imposition, where dominant narratives marginalize alternatives through institutional control.
State-Sponsored and Institutional
State-sponsored propaganda refers to efforts by governments to systematically produce and distribute information aimed at shaping public opinion in favor of official policies, ideologies, or wartime objectives, often through dedicated ministries or agencies.[107] In Nazi Germany, the Ministry of Public Enlightenment and Propaganda, established in 1933 under Joseph Goebbels, centralized control over media, arts, and public communications to promote Aryan supremacy and anti-Semitism, achieving near-total domination of information flow by 1939.[107] Similarly, during World War II, the United States government created the Office of War Information in 1942 to coordinate propaganda campaigns, producing posters, films, and radio broadcasts that mobilized public support for the war effort, with over 200,000 posters distributed to encourage enlistment and resource conservation.[83]
Institutional propaganda extends to state-controlled media outlets and educational systems designed to indoctrinate populations. In authoritarian regimes, such as the Democratic People's Republic of Korea, government directives integrate propaganda into primary education, with posters and curricula emphasizing loyalty to the ruling family and anti-Western narratives, fostering generational adherence to state ideology.[83] China's Xinhua News Agency, employing over 8,000 staff and operating 105 branches worldwide as of 2005, functions as the primary conduit for official narratives, blending news with ideological messaging to project Beijing's global influence while suppressing dissenting views domestically.[127] Russia's RT (formerly Russia Today), funded by the state since its 2005 launch, broadcasts content challenging Western narratives on issues like Ukraine, reaching millions internationally through multilingual platforms.[128]
Contemporary state-sponsored efforts increasingly incorporate digital tools, with at least 62 countries employing government agencies for computational propaganda as of 2021, including automated social media accounts to amplify official positions.[79] In China, state outlets like CGTN extend this through Twitter strategies that promote governance models portraying authoritarian efficiency over democratic alternatives, targeting global audiences amid U.S.-China tensions.[129] These institutional mechanisms often operate under the guise of journalism, but their alignment with government directives raises questions of credibility, particularly when Western analyses highlight adversarial propaganda while domestic efforts, such as U.S. historical wartime mobilization, are retrospectively framed as patriotic information campaigns rather than equivalent manipulation.[128][83] Empirical studies indicate that such propaganda's effectiveness depends on audience predispositions and repetition, underscoring the causal role of institutional monopoly on information in sustaining regime legitimacy.[79]
Theoretical Frameworks
Models from Social Psychology
Social psychology examines propaganda through models that highlight mechanisms of influence on individual cognition, group dynamics, and attitude formation. These frameworks reveal how propaganda exploits innate tendencies toward conformity, obedience, and identity-based biases to shape beliefs and behaviors without necessitating rational scrutiny. Empirical studies, such as those on conformity and authority, demonstrate that ordinary individuals can adopt propagated views under social pressure, often prioritizing group harmony or hierarchical cues over personal judgment.[130]
Conformity and Social Proof
Solomon Asch's 1951 line judgment experiments illustrated how individuals conform to erroneous group consensus, with about 75% of participants yielding at least once to a unanimous majority, even when aware of the inaccuracy.[131] Propaganda leverages this by fabricating perceived majority support through repeated messaging or staged endorsements, creating an illusion of normative behavior that pressures dissenters to align. Robert Cialdini's principle of social proof, derived from observational studies, posits that people look to others' actions in ambiguous situations to guide their own, a tactic evident in propaganda campaigns that amplify testimonials or crowd simulations to imply widespread acceptance. For instance, wartime posters depicting unified public enthusiasm exploit this to foster compliance.[132]
Obedience to Authority
Stanley Milgram's 1961-1962 obedience studies found that 65% of participants administered what they believed were lethal electric shocks to a learner when instructed by an experimenter in a white lab coat, underscoring the potency of perceived authority in overriding moral inhibitions.[133] In propaganda contexts, this model explains adherence to directives from leaders or institutions portrayed as legitimate experts, where cues like uniforms, titles, or official rhetoric reduce personal accountability. Theoretical extensions link authority propagation to evolutionary adaptations for hierarchical coordination, enabling rapid belief shifts in populations via top-down inculcation.[130]
Cognitive Dissonance
Leon Festinger's 1957 theory describes the psychological discomfort from holding conflicting cognitions, prompting individuals to resolve it by altering beliefs or rationalizing actions. Propaganda induces dissonance by juxtaposing new narratives against existing views—such as portraying out-groups as threats—motivating acceptance to restore consistency, particularly when commitment to initial actions (e.g., public endorsements) entrenches the shift.[134] Empirical applications show this in disinformation campaigns, where repeated exposure amplifies selective reinforcement, biasing information processing toward propagated ideologies.[135]
Social Identity Theory
Henri Tajfel and John Turner's 1979 framework argues that self-concept derives from group memberships, fostering in-group favoritism and out-group discrimination via minimal cues alone, as shown in Tajfel's 1970s experiments where arbitrary groupings led to biased resource allocation.[136] Propaganda amplifies this by emphasizing collective identities (e.g., national or ideological) to heighten perceived intergroup threats, justifying aggression or exclusion; studies confirm stronger effects under uncertainty, where identity-affirming messages solidify loyalty.[130] This model underscores propaganda's role in sustaining divisions, as individuals derogate contrary evidence to protect group-derived esteem.
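The social-proof mechanism described above can be illustrated with a deliberately simple threshold model: each agent adopts a claim once the share of apparent supporters it perceives exceeds a personal conformity threshold. The sketch below is an illustration of that dynamic only, not a model drawn from the cited studies; the population size, threshold distribution, and "fabricated seed" parameter are assumptions chosen for demonstration.

```python
# Toy threshold model of social proof (illustrative assumptions, not a
# reconstruction of the cited experiments): agents adopt a claim once the
# perceived share of supporters exceeds their personal conformity threshold.
# A fabricated seed of apparent supporters (staged endorsements, bot
# accounts) can tip an otherwise inert population into a cascade.
import random

random.seed(0)

def cascade(n_agents=1000, fake_seed_share=0.0, rounds=50):
    # Conformity thresholds: the perceived adoption share an agent needs
    # to see before going along with the apparent majority.
    thresholds = [random.uniform(0.05, 0.6) for _ in range(n_agents)]
    adopted = [False] * n_agents
    for _ in range(rounds):
        # Perceived support = genuine adopters plus the fabricated seed.
        perceived = fake_seed_share + sum(adopted) / n_agents
        newly = [i for i in range(n_agents)
                 if not adopted[i] and perceived >= thresholds[i]]
        if not newly:
            break
        for i in newly:
            adopted[i] = True
    return sum(adopted) / n_agents

for seed_share in (0.0, 0.03, 0.08):
    final = cascade(fake_seed_share=seed_share)
    print(f"fabricated support {seed_share:.0%} -> final adoption {final:.0%}")
```

In this toy setup nothing changes below a tipping point, while a slightly larger fabricated seed produces near-universal adoption, mirroring the all-or-nothing character of manufactured consensus sketched in the conformity and social-proof discussion above.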
Sociological and Educational Theories
Sociological theories frame propaganda as an embedded mechanism for maintaining social cohesion and control in mass societies. Jacques Ellul's 1965 analysis posits propaganda not merely as deliberate persuasion but as a pervasive sociological process in technological civilizations, where individuals are continuously integrated into collective attitudes through "pre-propaganda" mechanisms like education, media, and group affiliations.[21] This horizontal propaganda operates subtly, fostering conformity by aligning personal needs with societal norms, distinct from vertical political directives that impose top-down ideology.[137] Ellul argued that modern efficiency demands total propaganda, rendering it inevitable and inescapable, as it exploits the individual's isolation in urban, industrialized settings to manufacture unanimous public opinion.[138]
Harold Lasswell's foundational work in the 1920s and 1930s examined propaganda as a tool of elite influence over mass behavior, emphasizing the strategic dissemination of symbols to mobilize support during conflicts.[139] In Propaganda Technique in the World War (1927), Lasswell documented how World War I belligerents used repetitive messaging across channels to sustain morale and demonize enemies, laying groundwork for viewing propaganda as a rational instrument of power in democratic and authoritarian contexts alike.[19] His communication model—who says what, through which channel, to whom, with what effect—highlights propaganda's causal role in shaping perceptions and actions within stratified societies.[23]
Educational theories distinguish propaganda from genuine pedagogy by its intent to suppress critical thinking in favor of ideological uniformity. In autocratic systems, state-controlled curricula function as propaganda vectors, embedding ruling narratives to deter dissent and justify authority, as evidenced by models showing how such indoctrination correlates with reduced political opposition and sustained regime stability.[140] For instance, North Korean primary education integrates propaganda posters and texts promoting leader worship from early grades, conditioning obedience over empirical inquiry.[141] Conversely, democratic educational responses, such as media literacy programs developed since the 1930s, treat propaganda as a teachable distortion, training students to evaluate sources and biases through frameworks like public pedagogy, which views learning as a battleground for competing narratives.[142] These approaches underscore propaganda's epistemological threat, where deliberate falsehoods erode fact-based discourse, prompting curriculum reforms to prioritize verification skills amid rising digital manipulation.[143]
Cognitive and Self-Propaganda Mechanisms
Propaganda exploits cognitive biases that shape human judgment and decision-making, notably confirmation bias, whereby individuals preferentially process and retain information aligning with preexisting beliefs, thereby amplifying receptivity to ideologically congruent messages while discounting disconfirming evidence.[144] This bias operates through selective exposure and interpretation, as empirical studies show people gravitate toward sources reinforcing their views, fostering echo chambers that entrench propagandistic claims.[145] Complementing this, the availability heuristic renders vivid or recent propagandistic imagery more persuasive, as repeated exposure elevates perceived plausibility independent of factual accuracy.[134]
Emotional mechanisms further underpin cognitive susceptibility, with reliance on affective cues over deliberative reasoning correlating with heightened belief in deceptive narratives; for instance, experimental data indicate that emotion-driven processing increases endorsement of false claims by up to 20-30% compared to analytical approaches.[146] Cognitive dissonance, triggered when propaganda contradicts held convictions, prompts rationalization or selective reinterpretation to alleviate discomfort, as individuals adjust attitudes to align with authoritative or group-endorsed messages.[134] Group dynamics exacerbate these effects via social proof and in-group favoritism, where conformity pressures lead to uncritical acceptance of collective narratives, as modeled in social influence experiments.[144]
Self-propaganda manifests through internalized persuasion processes, where individuals actively construct arguments supporting external propaganda, thereby deepening personal commitment; field experiments at deliberative forums reveal that self-articulation of positions boosts perceived factual and moral validity by 15-25%, simulating voluntary endorsement.[147] This self-persuasion hinges on effort perception, with greater anticipated cognitive investment yielding stronger attitudinal shifts, as demonstrated in controlled studies varying target audience assumptions.[148] Recursive cognition contributes by equating narrative coherence with truth, enabling absurd or ideologically extreme claims to gain traction through iterative self-reinforcement, particularly in isolated informational environments.[149] Such mechanisms sustain long-term adherence, as habitual rumination on aligned content overrides metacognitive scrutiny, per psychological models of belief perseverance.[150]
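A minimal numerical sketch can make the confirmation-bias mechanism concrete. The example below is an illustration under assumed parameters (the likelihood ratio and discount factor are invented for demonstration and are not drawn from the cited studies): an agent that updates beliefs by Bayes' rule but under-weights disconfirming evidence drifts toward near-certainty in a propagated claim even when the evidence stream it sees is perfectly balanced.

```python
# Toy model of confirmation-biased updating (illustrative assumptions, not
# from the cited sources): belief in a claim is updated on the odds scale,
# but evidence against the claim is shrunk by a discount exponent.

def update(belief, supports_claim, likelihood_ratio=2.0, disconfirm_discount=1.0):
    """One Bayesian update of P(claim) given a single evidence item.

    likelihood_ratio: how strongly a confirming item favors the claim.
    disconfirm_discount: 1.0 treats disconfirming items at face value;
    values below 1.0 under-weight them (confirmation bias).
    """
    odds = belief / (1.0 - belief)
    if supports_claim:
        odds *= likelihood_ratio
    else:
        odds /= likelihood_ratio ** disconfirm_discount
    return odds / (1.0 + odds)

# A perfectly balanced evidence stream: 20 confirming, 20 disconfirming items.
evidence = [True, False] * 20

for discount in (1.0, 0.5):
    belief = 0.5
    for item in evidence:
        belief = update(belief, item, disconfirm_discount=discount)
    print(f"disconfirm_discount={discount}: final belief {belief:.2f}")
```

With the discount at 1.0 the balanced stream leaves the belief at its prior of 0.5; discounting disconfirming items to half weight drives the same stream to a belief near 1.0, the selective-reinforcement pattern the passage describes.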
Empirical Applications and Examples
Major Historical Case Studies
One prominent historical case study of propaganda involves the efforts of the United States during World War I, where the Committee on Public Information, established on April 13, 1917, under George Creel, produced over 20 million posters, 75 million pamphlets, and thousands of films to mobilize public support for the war. These materials emphasized enlistment, bond purchases, and conservation, with iconic posters like James Montgomery Flagg's "I Want You" depicting Uncle Sam directly addressing viewers to boost recruitment, contributing to over 4 million American troops mobilized by 1918.[7] The campaign also fostered anti-German sentiment, leading to suppression of German-language publications and cultural elements, as evidenced by the closure of over 500 German newspapers and the renaming of sauerkraut to "liberty cabbage."[151]
In Nazi Germany, propaganda was centralized under the Reich Ministry of Public Enlightenment and Propaganda, created on March 13, 1933, and led by Joseph Goebbels, who controlled media, film, radio, and arts to promote Aryan supremacy and antisemitism.[152] Posters and films such as The Eternal Jew (1940) depicted Jews as vermin or economic parasites; this propaganda accompanied the Nuremberg Laws of September 15, 1935, and escalated through the Holocaust, justifying the deportation and extermination of 6 million Jews by portraying them as threats to national purity.[93] The apparatus reached broad audiences via state-promoted radio ownership initiatives, with 70% of households equipped by 1939, sustaining support for the regime until late in World War II despite military setbacks.[153]
Soviet propaganda under Joseph Stalin exemplified state control through the Agitprop department of the Central Committee, which from the 1920s onward used posters, newspapers like Pravda, and films to glorify collectivization and industrialization, such as the Five-Year Plans starting in 1928 that claimed to transform the USSR into an industrial power, though at the cost of millions in the Holodomor famine of 1932-1933. During World War II, after the German invasion on June 22, 1941, propaganda shifted to nationalism, producing over 200,000 posters depicting the "Great Patriotic War" and Stalin as a defender, which helped mobilize 34 million Soviet soldiers and maintain civilian resolve amid 27 million deaths.[154] Postwar, it falsified history, for example by rewriting accounts of the 1939 Molotov-Ribbentrop Pact to emphasize Soviet victimhood.[101]
Allied propaganda in World War II, particularly by the United States Office of War Information formed in June 1942, utilized emotional appeals in posters to promote war bond sales totaling $185 billion and rationing compliance, with designs focusing on fear of Axis brutality rather than abstract ideals for greater impact.[83] British efforts, including BBC broadcasts and leaflets dropped over Germany, aimed to undermine morale, with studies indicating limited but measurable effects on desertions in occupied Europe.[155] These campaigns contrasted with Axis efforts by emphasizing democratic values and unity, contributing to sustained home-front production that outpaced enemies, as U.S. industrial output rose 96% from 1941 to 1945.[87]
Contemporary Instances in Media and Politics
In the digital era, propaganda in media and politics has proliferated through social media algorithms and state-sponsored campaigns, enabling rapid dissemination of tailored narratives to influence public opinion and electoral outcomes. During the 2020 U.S. presidential election, false claims of widespread voter fraud propagated by former President Donald Trump and supporters were amplified across platforms, contributing to the January 6, 2021, Capitol riot, though subsequent investigations found no evidence of fraud sufficient to alter results.[156] Mainstream media outlets, often characterized by left-leaning institutional biases, framed these claims uniformly as disinformation, potentially suppressing debate on verifiable irregularities like ballot harvesting in states such as Pennsylvania, where over 1 million mail-in ballots were processed amid chain-of-custody concerns raised in court filings.[157][158] The 2024 U.S. election saw disinformation further define narratives, with foreign actors like Russia and Iran deploying bots and fake accounts to exacerbate divisions on issues such as immigration and economic policy, reaching millions via platforms like X and TikTok.[159] A Stanford study revealed that partisan loyalty overrides factual accuracy, with both Democrats and Republicans accepting misleading information aligning with their views—e.g., conservatives endorsing unverified election interference claims, while liberals dismissed documented border security data as exaggerated.[157] This echoes patterns in media coverage, where outlets like CNN and MSNBC gave Trump 90% negative coverage in 2024, per Media Research Center analysis, fostering perceptions of coordinated anti-conservative propaganda rather than objective journalism.[158]
In international conflicts, Russia's invasion of Ukraine on February 24, 2022, prompted extensive state propaganda via outlets like RT and Sputnik, reframing the operation as "denazification" and denying atrocities such as the Bucha massacre, where over 400 civilian bodies were documented by satellite imagery and eyewitness accounts.[160] Western media, while countering Russian narratives, has been critiqued for selective emphasis on Ukrainian successes—e.g., underreporting Russian territorial gains in the Donbas, where Russian forces held roughly 20% of Ukraine by mid-2023—potentially serving alliance-building propaganda amid NATO aid totaling $100 billion by 2024.[161] RAND analysis showed Russian extremist content reaching 500 million impressions globally via proxies, manipulating data like casualty figures to claim Ukrainian losses of 1 million versus official estimates of roughly 500,000 combined casualties.[160][162]
During the COVID-19 pandemic, governments and media propagated unified messaging on measures like mask mandates and vaccines, with the U.S. CDC reporting over 1 million excess deaths linked to hesitancy fueled by counter-narratives, though suppression of lab-leak hypotheses—later deemed plausible by FBI assessments in 2023—exemplified institutional alignment over empirical inquiry.[163] Chinese state media disseminated propaganda deflecting questions about the virus's origins and touting the efficacy of its lockdowns, while promoting unsubstantiated claims of Western bioweapon development, which garnered billions of views on Weibo and influenced global skepticism toward WHO data.[164] An NIH review linked such misinformation to 20-30% vaccine refusal rates in low-trust populations, underscoring propaganda's role in eroding public health compliance.[163]
Emerging AI-Driven and Digital Propaganda
The integration of artificial intelligence into propaganda has enabled the rapid generation of synthetic media, including deepfakes, text, and images, allowing actors to disseminate tailored narratives at unprecedented scale and low cost. Tools like generative adversarial networks and large language models facilitate the creation of convincing audiovisual content that mimics real events or personas, often evading initial detection by human reviewers. For instance, benchmarks indicate that AI-driven "fake news" sites proliferated tenfold between 2023 and 2024, flooding online ecosystems with algorithmically optimized disinformation.[165] State and non-state actors exploit these capabilities to amplify influence operations, with Russia's government-directed campaigns employing AI to produce election-related content targeting Western democracies as early as 2024.[166] Similarly, Iranian and Chinese entities have leveraged generative AI, such as Google's Gemini, to accelerate narrative dissemination, though empirical assessments reveal limited behavioral sway compared to traditional methods.[167]
In the 2024 U.S. presidential election, AI-generated visuals emerged as a vector for partisan messaging, with Donald Trump posting at least 19 such images or videos on Truth Social to rally supporters and critique opponents, including depictions of fabricated scenarios like immigrants invading suburbs.[168] Deepfake audio and video, hyped as an existential threat, appeared in scattered instances—such as a robocall mimicking President Biden's voice urging voters to abstain—but analyses of 78 election-related deepfakes found they were no more persuasive than conventional fake news, with detection rates improving via forensic tools and public skepticism.[169][170] Foreign malign influence compounded this, as U.S. intelligence reported Russia and Iran deploying AI to generate divisive content, including synthetic endorsements and scandal fabrications, though AI providers such as OpenAI disrupted several state-affiliated attempts by revoking access to models.[171][172]
Digital platforms exacerbate AI-driven propaganda through algorithmic amplification, where machine learning prioritizes engaging—often polarizing—content, creating echo chambers that reinforce preconceptions rather than convert skeptics. Chinese state-aligned firms, such as GoLaxy, have pioneered AI for multilingual propaganda bots that simulate grassroots discourse on social media, targeting diasporas and international audiences with narratives aligned to Beijing's interests. In authoritarian regimes, governments influence AI models by requiring alignment with state narratives, raising concerns about embedded propaganda, though as of 2025-2026 no direct evidence links training on government data to widespread skewing of model outputs toward propaganda. Experts anticipate increased use of AI for disinformation and propaganda by state actors as capabilities advance.[173]
Despite these advances, causal evaluations underscore that AI's propaganda efficacy hinges on audience priors; synthetic media reinforces biases but rarely shifts entrenched views, as evidenced by post-election studies showing minimal vote impact from deepfakes in contests like Slovakia's 2023 ballot.[174] Countermeasures, including watermarking standards and AI detection classifiers, are proliferating, yet lag behind generative tools' evolution, posing ongoing risks to informational integrity in hybrid analog-digital environments.[175]
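The algorithmic-amplification dynamic described above can be sketched with a deliberately simplified ranking function. The example below is illustrative only: the item fields, scores, and weights are invented, and no real platform's ranking system is implied. Scoring purely on predicted engagement pushes a polarizing claim above accurate but less engaging material; adding even a crude accuracy penalty changes the ordering.

```python
# Minimal sketch of engagement-weighted feed ranking (illustrative only;
# item fields, scores, and weights are invented and do not represent any
# real platform's system).
from dataclasses import dataclass

@dataclass
class Item:
    headline: str
    predicted_engagement: float  # modeled clicks/shares per impression
    accuracy: float              # hypothetical fact-check score in [0, 1]

def rank_by_engagement(items):
    # Objective contains no accuracy term: virality alone decides placement.
    return sorted(items, key=lambda it: it.predicted_engagement, reverse=True)

def rank_with_accuracy_penalty(items, penalty=1.0):
    # One possible countermeasure: subtract a penalty for low accuracy.
    return sorted(
        items,
        key=lambda it: it.predicted_engagement - penalty * (1.0 - it.accuracy),
        reverse=True,
    )

feed = [
    Item("Outrage: rival bloc 'rigging' the vote", 0.9, 0.2),
    Item("Official turnout statistics released", 0.3, 0.95),
    Item("Fact-check of viral fraud claim", 0.4, 0.9),
]

print([item.headline for item in rank_by_engagement(feed)])
print([item.headline for item in rank_with_accuracy_penalty(feed)])
```

Under the engagement-only objective the low-accuracy outrage item ranks first; with the accuracy penalty it drops to last, which is the sense in which ranking objectives, rather than content alone, drive amplification.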
Societal Perceptions and Debates
Contested Legitimacy and Bias Claims
The term "propaganda" carries contested legitimacy due to its evolving and ambiguous definitions, originally denoting the neutral propagation of faith by the Catholic Church's Congregatio de Propaganda Fide established in 1622, but increasingly connoting deliberate manipulation since the early 20th century. Scholars debate whether propaganda encompasses all organized persuasion or requires intent to deceive, with some arguing its inherent power renders it illegitimate in open societies, while others view distinctions as subjective and ideologically driven. This definitional fluidity allows actors to label disfavored communications as propaganda while exempting aligned efforts, undermining claims of objective legitimacy. Bias accusations frequently frame media and institutional outputs as propagandistic, with empirical patterns showing partisan asymmetry: political conservatives issue such claims more often than liberals, correlating with documented disparities in news coverage favoring left-leaning perspectives on issues like economics and social policy. For instance, a Stanford University analysis of 2024 data found that extreme partisan views and one-sided media consumption predict biased perceptions, yet objective metrics reveal mainstream outlets underrepresenting conservative viewpoints relative to public opinion distributions. Public trust metrics reinforce these contests, as Gallup polls indicate only 31% of Americans held a "great deal" or "fair amount" of confidence in mass media in 2024, with Republicans at 14% versus Democrats at 54%, reflecting perceptions of systemic institutional bias rather than mere ideological disagreement.[176][177] In academic and journalistic institutions, claims of propaganda arise from evidence of overrepresentation of left-leaning viewpoints, with surveys showing faculty political donations skewing 96% Democratic in social sciences as of recent cycles, potentially causal in shaping narratives presented as neutral scholarship. These biases manifest in selective emphasis or omission, as seen in coverage of events like the 2020 U.S. election disputes, where outlets accused of right-wing propaganda faced counter-claims of left-driven suppression. Such mutual delegitimization highlights how propaganda labels often prioritize causal self-interest over empirical verification, with higher-frequency accusations from marginalized ideological groups signaling genuine distortions rather than symmetric equivalence.[157][158]Cross-Ideological Accusations
Accusations of propaganda frequently traverse ideological divides, as partisans on both the left and right attribute manipulative intent to opponents' communications, often framing them as deliberate distortions to advance agendas. Studies indicate that both major U.S. political affiliations routinely accuse the opposing party of conspiratorial behavior, including spreading propaganda, which reinforces mutual distrust and contributes to affective polarization.[178] For instance, conservatives have long charged mainstream media outlets with left-leaning bias, labeling coverage of issues like climate change, immigration, and elections as propagandistic efforts to undermine traditional values and electoral integrity; a 2024 analysis of social media discourse found that claims of "leftist news media bias" predominate in such accusations, though they do not span a broad ideological spectrum among accusers.[158]
Conversely, liberals and left-leaning commentators accuse conservative media and figures of deploying propaganda to stoke division, such as portraying right-wing narratives on election fraud or cultural issues as echoes of foreign disinformation tactics. Notable examples include Democratic-aligned critics labeling former President Donald Trump's rhetoric and appointees' statements as parroting Russian propaganda, as seen in 2024 objections to Tulsi Gabbard's intelligence role nomination for allegedly promoting narratives aligned with adversarial states.[179] This bidirectional pattern extends to entertainment and education, where right-leaning voices decry left-influenced content in television and schools as indoctrination—evident in critiques of programs like Sesame Street for embedding progressive ideologies—while left-leaning sources counter that conservative outlets amplify misinformation on topics like public health during the COVID-19 pandemic.[180]
Such cross-accusations are amplified in digital echo chambers, where partisan sharing of "fake news" claims correlates with ideological affiliation, yet empirical reviews reveal asymmetry in vulnerability: right-leaning users show higher rates of sharing misleading content, though both sides perceive the other's information ecosystem as propagandistic.[181] This dynamic not only mirrors historical propaganda rivalries but also sustains polarization, as each side's claims of victimhood to the other's tactics discourage cross-ideological dialogue and bolster in-group cohesion. Mainstream academic and media analyses, often from left-leaning institutions, tend to emphasize right-wing propaganda risks while downplaying equivalent left-wing efforts, highlighting credibility concerns in source selection for these debates.[182][183]
Resistance and Counter-Propaganda
Resistance to propaganda encompasses both individual psychological mechanisms that mitigate susceptibility and organized societal efforts to debunk or neutralize propagandistic messaging. Empirical studies identify cognitive factors such as prior knowledge, analytical thinking, and emotional regulation as key barriers to persuasion by misleading narratives. For instance, individuals with higher cognitive reflection tendencies are less prone to endorsing misinformation, as they engage in effortful scrutiny rather than heuristic acceptance.[144] Social influences, including exposure to diverse viewpoints, further bolster resistance by fostering skepticism toward uniform echo chambers.[144] Inoculation theory, developed in the 1960s and validated through decades of experimentation, provides a structured approach to building attitudinal resistance by preemptively exposing individuals to weakened forms of propagandistic arguments, enabling them to generate refutations. This "vaccination" analogy has demonstrated efficacy in reducing susceptibility to conspiracy theories, such as those surrounding the 9/11 attacks, where inoculated participants showed sustained motivational defenses against subsequent exposure.[184] Meta-analyses confirm inoculation's robustness across domains, outperforming post-hoc corrections by activating threat recognition and counterarguing prior to full confrontation.[185] Recent applications, including social media campaigns, have scaled this method to confer resilience against misinformation tactics like discrediting sources or false dichotomies, with prebunking videos increasing resistance by up to 20% in controlled trials.[186] [187] Counter-propaganda involves deliberate state or non-state initiatives to expose and dismantle adversarial messaging, often through revelation of origins, factual rebuttals, or amplification of alternative narratives. Historically, the United States Information Agency in the 1980s systematically debunked Soviet disinformation campaigns, such as fabricated atrocity claims, by disseminating evidence of KGB orchestration to targeted audiences in Eastern Europe and beyond.[188] During World War II, Allied psychological operations countered Axis propaganda by airdropping leaflets that highlighted inconsistencies in Nazi claims, such as exaggerated military successes, thereby eroding enemy morale and civilian compliance. 
Similar tactics were employed against ISIS propaganda in the 2010s, combining content takedowns with counter-narratives emphasizing ideological contradictions, though cyber disruptions proved more immediately effective than persuasive rebuttals.[189] Fact-checking represents a common modern counter-strategy, verifying claims against empirical evidence to undermine propagandistic assertions; however, its impact is limited, primarily enhancing factual recall without consistently altering deeply held beliefs or voting behavior.[190] Studies indicate that while fact-checks correct specific inaccuracies, they can trigger backfire effects among audiences ideologically aligned with the original message, particularly when checkers are perceived as biased—a critique substantiated by analyses revealing selective scrutiny in outlets like PolitiFact.[191][192] In contrast, inoculation and media literacy programs, which train recognition of manipulative techniques rather than disputing content, yield more durable resistance, as evidenced by reduced polarization in experimental groups exposed to propaganda simulations.[193] Overall, effective counter-propaganda prioritizes preempting persuasion over reactive correction, aligning with causal pathways where early skepticism disrupts belief formation more reliably than ex post interventions.[194]
Differentiations from Adjacent Concepts
Propaganda vs. Disinformation and Misinformation
Propaganda involves the deliberate and systematic dissemination of information—facts, arguments, rumors, half-truths, or lies—to advance a specific political, ideological, or organizational agenda, often by state or institutional actors.[195] Unlike mere persuasion, it employs techniques such as selective emphasis, emotional appeals, and repetition to shape public attitudes or behaviors in alignment with the propagator's interests.[196] This distinguishes it from neutral information-sharing, as propaganda prioritizes advocacy over comprehensive truth, though it may incorporate verifiable facts when they serve the narrative.[9]
In contrast, misinformation refers to false or inaccurate information circulated without deliberate intent to deceive, often resulting from errors, misunderstandings, or careless sharing.[197] For instance, an individual might unwittingly spread outdated statistics due to reliance on unverified sources, lacking awareness of their inaccuracy.[198] Disinformation, however, entails the intentional creation and distribution of fabricated or manipulated falsehoods to mislead audiences, typically for strategic gains like sowing discord or undermining trust.[199] The core differentiator here is mens rea: disinformation requires purposeful deception, as seen in coordinated campaigns fabricating events, whereas misinformation arises from negligence or ignorance.[200]
Key variances emerge in veracity, structure, and objectives. Propaganda can be truthful in parts but is inherently biased through omission or framing, aiming to mobilize support rather than merely confuse.[201] Disinformation and misinformation, by definition, involve untruths, but propaganda's organized, agenda-driven nature—often involving media control or mass campaigns—sets it apart from the potentially sporadic spread of dis/misinformation via social networks.[202] Overlaps exist, as propaganda may incorporate disinformation (e.g., state-sponsored fabrications during wartime), yet not all disinformation qualifies as propaganda without a broader persuasive framework.[196] Empirical analyses highlight that while misinformation proliferates virally through cognitive biases like confirmation seeking, propaganda leverages institutional resources for sustained influence, as evidenced in historical cases like Cold War broadcasts.[9]

| Aspect | Propaganda | Misinformation | Disinformation |
|---|---|---|---|
| Truth Content | Can include facts, but selectively biased | Always false or inaccurate | Deliberately false or misleading |
| Intent | Persuasion for specific agenda | None; unintentional error | Deception and harm |
| Organization | Systematic, often institutional | Ad hoc, individual or viral | Coordinated, often covert |
| Examples | Government posters rallying support | Shared rumor based on mistake | Fabricated stories to incite panic |