
Please reply to this post by responding to the following:
The readings this week covered the topic of deception in the Cold War, highlighting the practices of both the Americans and the Soviets. Select one of those examples and explain which information operations practices (Week 3) and biases/errors (Weeks 1 and 2) are at play in your chosen case study. Then explain one possible change that might have helped improve the results of your chosen case. Support your arguments.

Forum Feedback Instructions

Your Initial Post must be:
posted by Thursday evening;
a minimum of 250 words, and ideally closer to 500; and
reference at least two of the lesson’s assigned readings.
Peer Responses: Respond to at least 2 other students:
by Sunday evening;
with responses between 100 and 200 words;
containing reference to at least one of the lesson’s assigned readings; and
including direct questions.
Forum Engagement & Professor Queries: In addition, you need to:
Monitor the postings throughout the week; and
Respond to my queries/questions.
Initial Post Due: Thursday, by 11:55pm ET
Responses Due: Sunday, by 11:55pm ET

__________________________________________
INTL653: DECEPTION, PROPAGANDA AND DISINFORMATION

Lesson 1: The Psychology of Influence
Greetings everyone! Welcome to Deception, Propaganda, and Disinformation (INTL653)!

In this lesson we will cover the topic of psychological methods and their influence. The definitions and concepts will be essential as you build your understanding of underlying concepts related to deception and information operations.

“The receptivity of the great masses is very limited, their intelligence is small, but their power of forgetting is enormous. In consequence of these facts, all effective propaganda must be limited to a very few points and must harp on these slogans until the last member of the public understands what you want him to understand by your slogan.”

–Adolf Hitler (Mein Kampf Vol. 1, Ch. VI)

Introduction
This is a fast-moving class for professionals who want to add to their skill sets or hone existing ones. This is indeed a murky arena of thinking and practice; however, it is not unfathomable. It does tend to challenge many who are comfortable with the more routinized forms of intelligence operations.

Here is a koan that points to the way we’ll likely wrestle with some ideas in this class.

Not the Wind, Not the Flag:

Two monks were arguing about a flag.
One said: “The flag is moving.”
The other said: “The wind is moving.”

The sixth patriarch happened to be passing by.
He told them: “Not the wind, not the flag; the mind is moving.”

Mumon’s Comment: The sixth patriarch said: “The wind is not moving, the flag is not moving. Mind is moving.” What did he mean? If you understand this intimately, you will see the two monks there trying to buy iron and gaining gold. The sixth patriarch could not bear to see those two dullards, so he made such a bargain (Zen@Metalab n.d.).

Wind, flag, mind moves.
The same understanding.
When the mouth opens
All are wrong.

Much of what goes on in deception, propaganda, and disinformation shares in the thought behind this koan. There appear to be many things going on. Many look and see the flag — the obvious. Others sense the wind, though nothing can be seen. Some look beyond to other factors. Some look to the mind, but all can change with the output of the mouth (communication) which shapes new meaning, new activity.

Today, many view discussions of these topics as repugnant or worse. Perhaps you had that feeling when you saw the picture of, or read the quote from, Adolf Hitler. Throughout much of history, terms such as propaganda carried no negative connotation. That is a recent phenomenon, one that largely came about because of counter-propaganda efforts by the Americans and British. Even if that history is enough to give you pause, consider this. These practices are used routinely by many agencies and people — even private companies make use of persuasion practices. If you’re to learn how to thwart the denial and deception efforts of others, you will need to understand the principles and practices behind them. This is the primary reason for the course.

For a professional to improve his/her ability to cut through the denial and deception inherent to these practices, a greater understanding of psychology, sociology, communication, culture, history, and more is needed. Further, the analyst needs tools that help order “habits of the mind” and limit inherent bias that enables successful deception. That’s the focus of this lesson and the entire course.

Persuasion and its Many Aspects
We begin this week by establishing fundamental definitions, concepts, and related practices. For that reason, this is one of the most fundamental of all the lessons in this course. It’s important to remember that this course, despite elements that are very personal or individual in nature, is to be viewed from the perspective of a state (country) achieving national objectives. Thus, the tools provided in the course should be viewed from that perspective to achieve the best understanding.

Within this material, it’s important to remember that there is also a hierarchy of concepts. Influence is the broadest of ideas and covers all that we talk about. Yet, it’s so broad that it’s not very helpful for much of what we’re doing in this course.

Hierarchy of Persuasion
It’s important to recognize that many things are related to persuasion, but not all of them are inherently deceptive in nature. This lesson explains elements of the first and second tiers as well as related psychological concepts. In this discussion, a number of terms will be defined in order to reduce the confusion caused by their inaccurate use in the common vernacular. This will make the class discussion more productive by reducing confusion as we examine aspects of the third tier.

[Interactive graphic: the three-tier hierarchy of persuasion]

Definition of Key Terms: Persuasion, Influence, and Deception
This section discusses the nature of influence and the subordinate concepts of deception and information. It’s important to remember that many of the concepts discussed in this class look at human psychology and the ways it can be exploited or protected. In the case of deception, the exploitation normally uses “lies” of commission or omission. In other words, the deception is done by presenting untrue information as truth or by simply leaving information out. Yet, this arena can be a murky one with frequent overlap. To help increase your ability to work through this domain, let’s first look at the concept of influence.

Key Psychological Functions
This lesson looks at two specific types of functions that are known to cause cognitive errors. The first are biases. The second are heuristics. The term bias is likely well known to you. One may be biased against a category of people, a way of doing something, or a specific thing; however, unlike the media-promoted notion of bias, biases are not always negative. For example, there’s nothing wrong with preferring rice to potatoes or westerns to mysteries. Nevertheless, biases lead individuals to make decisions that by necessity leave out, ignore, or alter information in ways that might not otherwise occur. In intelligence collection, analysis, targeting, etc., biases may unduly remove viable, even critical, targets from consideration.

The second area involves heuristics. Heuristics are mental algorithms that speed thinking by focusing attention and streamlining analysis. If you were asked whether you’d like fish or fowl for dinner, it’s unlikely that you’d run through the entire list of fish or birds that you know. Most people already have in mind what fowl they might eat. In the United States, that list would likely include chicken, turkey, and duck. More exotic eaters might include grouse, partridge, squab (young pigeon), etc. Nevertheless, it’s virtually impossible to find someone who would include hummingbirds, ostriches, penguins, egrets, golden eagles, etc. when considering what to eat for dinner. That limiting process is the result of a heuristic that tells your brain there are only a few variables to be considered. Yet, while this may help in making dinner decisions, it may impede efforts to determine what an opponent might do in a real-world threat situation. A rough computational analogy appears below.
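To make the “mental algorithm” analogy concrete, here is a minimal sketch in Python (the menu data and the “familiar” list are invented for illustration): a heuristic search consults a small, pre-cached candidate list instead of the full space of options, which is fast but silently excludes anything outside the cache.

```python
# All data here is hypothetical, purely to illustrate the trade-off.
ALL_FOWL = ["chicken", "turkey", "duck", "grouse", "partridge", "squab",
            "hummingbird", "ostrich", "penguin", "egret", "golden eagle"]

# The heuristic: a small cached list of "things people actually eat".
FAMILIAR_FOWL = {"chicken", "turkey", "duck"}

def dinner_candidates(use_heuristic=True):
    if use_heuristic:
        # Fast: considers 3 options instead of 11...
        return [f for f in ALL_FOWL if f in FAMILIAR_FOWL]
    # Exhaustive: slower, but nothing is excluded up front.
    return list(ALL_FOWL)

print(dinner_candidates())       # ['chicken', 'turkey', 'duck']
print(dinner_candidates(False))  # the full space, including 'ostrich'
# An analyst's equivalent heuristic can exclude an adversary's option
# simply because it was never in the cached list.
```

The analyst’s version of FAMILIAR_FOWL is the set of adversary behaviors he or she has seen before; the heuristic’s speed comes precisely from never scanning the rest.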


Confirmation Bias
Confirmation bias refers to the tendency for people to select those elements that support their preconceived notions. Depending on how the information is handled, it may also be called the Texas Sharpshooter Fallacy, cherry picking, “myside” bias, confirmatory bias, etc. Regardless of the name, the process involves mental efforts to eliminate any competing ideas and focus on those data points that support one’s case. One simply draws the proverbial bullseye around those data points that fit preconceived ideas.

Elections are an excellent place to look for confirmation bias. If you love Candidate Smith but despise Candidate Jones, you’ll look for information that supports your candidate and denigrates his/her opponent. Further, when things become too troubling, you might find yourself coloring that information to fit your biases. For example, if Candidate Smith barely squeaks out of criminal charges after an investigation, you might trumpet Smith’s vindication and decry the abuse of power by those doing the investigation, even if many examples of troubling or questionable behavior came to light in that investigation. Conversely, you’d delight in seeing Jones called out for kicking his/her neighbor’s dog. For you, this might be taken as clear evidence of how evil Jones really is. The bottom line is simple. Humans seek to be right. Thus, they’ll look for evidence to support their views, even if that means ignoring clear evidence to the contrary.

One of the easiest examples of confirmation bias to visualize is the Texas Sharpshooter Fallacy. Imagine a less-than-stellar shooter firing at the side of a barn. After firing all of his/her rounds, the next step is to see how accurate the shooting was. Now imagine the shooter circling the biggest group of shots with a bullseye and then drawing concentric circles out from there. Success! Of course, not starting with the bullseye brings the shooter’s accuracy into question. Yet this is often how people approach analysis of issues. Begin with existing biases, desired outcomes, etc. Then find ways to pull them together into a credible package to show to others. The sketch below simulates this move.
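Here is a small simulation of that post-hoc move (all numbers invented): random “shots” are scattered on a wall, and the bullseye is then drawn wherever it scores best. By construction, the post-hoc bullseye never looks worse than one declared in advance.

```python
import random

# A toy simulation: 50 random "shots" at a 10x10 wall, bullseye drawn afterward.
random.seed(1)
shots = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(50)]

def hits_within(center, radius=1.5):
    """Count shots falling inside a circle of the given radius."""
    cx, cy = center
    return sum((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2 for x, y in shots)

honest_target = (5.0, 5.0)  # bullseye declared BEFORE shooting
# The "sharpshooter" instead draws the bullseye wherever it scores best.
post_hoc_target = max(shots + [honest_target], key=hits_within)

print("hits on pre-declared bullseye:", hits_within(honest_target))
print("hits on post-hoc bullseye:   ", hits_within(post_hoc_target))
# The post-hoc count is never lower: "accuracy" manufactured by choosing
# the success criterion after seeing the data.
```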

In the realm of intelligence, it’s easy to see where this could influence one’s analysis or actions. When someone “knows” the bad guy and what the bad guy will do, such an individual will be looking for confirmation of what is already known. That’s true for many people, even when there is clear evidence to the contrary. People like to know they’re right, and they like things that are easy. Voilà! An answer with little analysis, based on assumptions. One such example can be seen in the Vietnam War. Many U.S. analysts looked at the North Vietnamese military leader General Vo Nguyen Giap through the lens of their training. Because he was an Asian Communist, they “knew” that he must have borrowed his warfare theories from the Chinese Communist leader Mao Zedong. After the war, corrections were necessary. In fact, the underlying animosity between Vietnam and China, as well as Giap’s European education, meant that he favored Clausewitz and Jomini for his insights into conducting warfare. By the way, these were the two predominant influences on U.S. land force doctrine and practice!

One of the best counters to this problem involves establishing criteria for analysis beforehand. Such criteria must be written out and available to others who review the final work. This increases the odds that both the originator and reviewer might catch problems in the analysis. Without established analytical frameworks and measures, it’s virtually impossible to avoid some degree of confirmation bias.


Normalcy Bias
This problem tends to be most evident in high-stress or crisis situations. Though there is often discussion of the “fight or flight” reflex, in which humans either confront or actively avoid a problem or conflict, normalcy bias is the less-discussed “hide” reflex. Many creatures exhibit this behavior. It can be a life-saving process when a creature relies upon its natural camouflage or superior position to avoid detection by predators. Unfortunately, this doesn’t help individuals trapped in a fire or other situations wherein the threat will overtake them.

The type of behavior caused by normalcy bias has often been cited as a cause of injury and death in humans. For example, in aircraft fires on the ground, many survivors report other unharmed passengers sitting motionless in their seats or going about normal tasks, e.g., collecting their belongings, that were inappropriate for the situation. Seldom were these people among the survivors unless others intervened. Mentally, the situation overwhelmed their ability to process, because they had never experienced or even considered such a catastrophic situation. Survivors of such accidents typically came from three groups. The first group had received prior training and/or had considered the possibility beforehand and prepared a plan (remember those boring pre-flight briefings by the flight crew?). The second group had been helped out of the burning craft by members of the first group or by outside rescuers. The third and final group might be considered the “blind luck” group, because they were often ones who had fallen out, been blown out, or otherwise been removed from the situation through no effort of their own or others. Of course, normalcy bias doesn’t just come into play in aircraft accidents. It’s seen in many human interactions.

Open conflict such as fights, arrests, and combat also tends to trigger normalcy bias. If you’ve ever been in any of these situations, you know the responses triggered by your body and the way time awareness changes. You probably also realize how important your training and experience were in moving you through the process. For those who lack such experience, trust the rest of us! The human brain must process new situations, but not all situations are conducive to on-the-job learning. Many combat readiness systems use a color scheme to represent this process. For example, military and police often use one that visualizes green as normal conditions, yellow as high-alert status, and red as active threat/crisis. That’s simple enough, because similar concepts are seen elsewhere. However, it’s the final stage that is tied to normalcy bias — black. In crisis, people can easily go from green or yellow to black — the stage in which the mind shuts down or slows so dramatically that meaningful action is no longer possible. It’s at the black stage of mental processing that normalcy bias is at its worst. It’s here that an opponent might capture, wound, or kill you while your mind is still processing options or, for lack of any decision, trying to focus on the things you’re accustomed to.

Normalcy bias can affect those in intelligence in a number of ways. Because humans seek to establish a normal state, any change to that can cause mental roadblocks to analysis. It may be as simple as slowing the process or as bad as “locking up” someone’s mental functions for a period of time. This can happen even outside of direct combat.

As has already been discussed, one of the best counters to normalcy bias is experience. This may come from actual experience or experience gained in training. This may also be “borrowed” from others by the use of simple devices like checklists — mental or actual. This is why you see most military organizations and some aspects of the intelligence community having checklists at hand. When the feces hit the proverbial rotating blades, it’s not time to start thinking from scratch. Even in analysis, one might run a checklist, analysis form or some other device that moves one step by step through the needed analysis action items.

Another device for overcoming the problems associated with normalcy bias is the practice of running worst case analysis or planning. If one considers what might happen next, that individual will be more likely to respond effectively if conditions change. This is somewhat true, even when the conditions don’t change exactly as predicted. As noted earlier, a good example of this can be seen in the testimonies of those in aircraft accidents. Many of those who survive often attribute their actions to thinking about the worst-case possibilities and the necessary actions to respond. Many of the remaining survivors attribute their survival to being pushed, pulled or otherwise directed by those who had planned ahead.


Framing
When a person looks at a problem, it is seldom with fresh eyes (without bias). Not surprisingly, the new problem is viewed through the lens of past experience. That experience is linked to the current context of the problem and helps put the “frame” around what will be examined and how it will be viewed. As a rule, younger people have fewer points of reference to draw on and thus may be more flexible in their processing. As one ages, as long as cognitive faculties remain intact, one’s increased number of points of reference can increase analytic speed while also providing less biased results (Erber 2010; Peters, Finucane, MacGregor, and Slovic 2000). Age alone isn’t sufficient for this, though; individuals must have training and life experience to draw on. Older or younger individuals without cognitive resources from training and experience are more likely to use emotional frames of reference (Watanabe and Shibutani 2010).

Because framing leads individuals to apply past practices and ideas, it is often linked to or used synonymously with agenda setting. Often the term agenda setting is used in the context of past media messaging that set the “agenda” for one’s thinking and actions. In this, there is normally a prioritization of what is to be accepted and what is to be rejected (McCombs and Shaw 1972).

One of the ways intelligence professionals have found to reduce the effects of framing has been to employ techniques like Red Team or Red Hat exercises that place them in the role of an opponent or other actor. When one is forced to think like someone else, it can easily highlight the problems that come from applying one’s own experience to that of others. The more distant the culture between analyst and target, the more necessary such efforts are to eliminate errors caused by framing.


Priming
Priming is related to but notably different from framing. Humans respond to their environment both physically and mentally. One of the most common aspects of mental engagement is called priming. Priming helps the mind focus on a specific schema (the way humans order/categorize the world, e.g., all cats purr and have tails, all chairs have four legs, etc.). Despite debate about how it works, there is clear evidence that humans tend to use the most immediate schema created by recent stimuli. It may be immediate because the person in question has just heard, seen, or experienced something related. Nevertheless, the resulting mental activity is implicit, meaning it is not consciously recognized or processed.

A classic example might be seen in the famous Alfred Hitchcock movie Psycho. This movie includes one of the most famous horror scenes in Western cinema. A shadowy figure with a knife attacks and kills a young woman in a motel shower. Viewers continue to report an increased fear of attack while bathing/showering after viewing this scene. If you were one of them, you might consider the actions you took — lock the door(s), check the window(s), consider routes of escape, consider means of defense, etc. If you did anything like this while still using your regular bathroom, the only thing that changed was the awareness (priming) provided by that movie scene.

A common example of this becomes evident to many people when they make a major purchase. For example, when a person buys a new car, he or she may suddenly see the same type of car “everywhere.” The cars existed before the purchase, but there was no reason to focus on their presence. Now they seem frequent, and the purchaser begins to construct ideas about who buys them, why they buy them, etc. Both positive and negative priming use the most recent, relevant information to interpret events in the current environment. In the shower example, one uses negative priming cues about what bad things might happen in a similar situation; notably, negative priming can slow mental processing. In the car example, the new buyer suddenly sees things that never evoked awareness before and, with this new awareness, recognizes similar vehicles more quickly; positive priming of this kind can help speed processing time.

As might be seen from these examples, the priming effect tends to be shorter-lived than the effects of framing (Roskos-Ewoldsen, Roskos-Ewoldsen, and Carpenter 2009). More recent and more intense primes will create stronger effects (Roskos-Ewoldsen et al. 2009). Not surprisingly, once one is primed with specific concepts and the conditions influence one’s emotions, attitude and behavior changes will normally follow quickly and without conscious thought.

Though priming, like framing, tends to highlight or make salient a specific point, it doesn’t tend to provide specific evaluative or prioritizing suggestions as framing does (Scheufele and Tewksbury 2007). Once opinions are set, there is a tendency for individuals to seek information that is consistent with their views. This can be present in both framing and priming; however, in priming this information tends to fit existing evaluative measures rather than providing the evaluative measures as in framing. Priming effects of this kind have been seen in evaluations of politicians (Iyengar and Kinder 1987; Sheafer and Weimann 2005; Moy, Xenos, and Hess 2006) and in the way other genders, races, classes, etc. are perceived (Hansen and Hansen 1988; Oliver, Ramasubramanian, and Kim 2007). From these primes, people construct mental models to better understand the situation and to prepare for future events (Wyer 2004; Johnson-Laird 1983; Roskos-Ewoldsen et al. 2009; Wyer and Radvansky 1999).

Analysts are constantly influenced in ways that might not be evident. For example, the subtle facial gestures, body gestures, or changes in tone or word use by managers and commanders may trigger thinking that is less than optimal for good analysis. Though leaders sometimes make it clear what they want an analyst to find, the influence is often more subtle. Yet the human mind has been wired to detect and act on these cues. Studies support this; for example, exposure to certain words has been shown to trigger changes in audience response (Draine and Greenwald 1998).

An essential counter to the problem of priming is self-awareness. Where are your blind spots, problem areas, personal biases, etc.? These are the areas where priming is most likely to pass undetected. It’s impossible to deflect every prime, given the mass of information inputs in any given day; however, one can minimize the effect by increased analysis of inputs. The more emotionally laden the input, the more care is needed to analyze it.


Availability Heuristic
Some estimates put the number of decisions made by the average American at more than 50,000 a day! Thus, it’s not surprising that the brain creates shortcuts to reduce the overall processing burden (Tversky and Kahneman 1974). Mundane activities may therefore be categorized or analyzed in ways that are not necessarily accurate. For example, in one study, test subjects were asked to list six reasons they might consider themselves assertive. Subjects in another group were asked to list 12 reasons. Not surprisingly, more of the subjects asked to list six were able to complete all or most of the list, while those asked for 12 reasons generally did not complete theirs. When both groups were asked how assertive they felt, the six-reason group scored themselves higher. Evidence suggested they did this because they believed they had a more complete data set, even though those in the 12-reason group often had more than six reasons to support their assertiveness.

Often those who use this technique to influence others will use more vivid or emotional content in their communications. They will also repeat key elements more often. The result is a message more firmly lodged in the target’s brain. A classic example of this in interpersonal communication involves a “hunk” or “babe” communicating with a target who “wasn’t in their league.” A touch, a wink, and some suggestions might lead the target to “decide for himself/herself” to do exactly what is being suggested. Of course, there are many other examples, but the key is the impact of highly visual, emotive, and/or repetitive language from others.

In a professional setting, it is common for one to act on the most immediate recall. The easier it is to recall the potential benefit or penalty for an action or non-action (often reinforced by visual, emotive, or repetitious elements), the more it will drive the decision-making process. Perceived frequency may not always come from a single event; it can be inferred from the mind’s attempts to link seemingly related events of sufficient immediacy and impact (Tversky and Kahneman 1973).

Countering the problems created by the availability heuristic calls for the same kinds of actions used in dealing with framing and priming. One must know oneself and ask questions about what is being decided, how it’s being decided, why it’s being decided, etc. Consider who might have influenced you in the process. Was there a push from leadership or a friend?


Anchoring
Framing draws on existing knowledge. Priming relates to recent stimuli, but anchoring relates to your very first impression. This is the tendency for humans to fixate on the first thing they see or hear. Those selling you things rely on this heavily. Ever seen the price tag that’s been “slashed” to give you deep discounts? That great Item X was $975, but now you can get it for only $375! Wow! How could you turn it down? Anchoring is at the heart of negotiations too. The first one to announce a price or other negotiating point has set the anchor point. Trained negotiators know how to work around this, but most people just stick to that anchor when they make their counteroffer, if any. Yet, it doesn’t occur only in sales and negotiations. Here’s an example from David McRaney’s You Are Not So Smart (2012).

Answer this: Is the population of Uzbekistan greater or fewer than 12 million? Go ahead and guess.

OK, another question: how many people do you think live in Uzbekistan? Come up with a figure and keep it in your head. We’ll come back to this in a few paragraphs (McRaney 2012, 215).

Humans are given things to consider every day. Analysts are no exception; however, these considerations are never made in isolation. Consider a situation in which you begin the day with a briefing from your intel manager. The emphasis of the briefing involves a new problem with Terror Organization X. You’re given information that shows some of this activity is taking place in your area of responsibility. What’s your likely tendency? Your leadership is interested in you finding something. Of course, you’re a professional and want to succeed. Thus, you have both organizational and personal motivations to start from this anchor point. This will help you look within a specific range for things. This could lead you to find or miss things that aren’t specifically connected.

Back to Uzbekistan. The populations of Central Asian states probably aren’t numbers you have memorized. You need some sort of cue, a point of reference. You searched your mental assets for something of value concerning Uzbekistan — the terrain, the language, Borat — but the population figures aren’t in your head. What is in your head is the figure I gave, 12 million, and it’s right there up front. When you have nothing else to go on, you fixate on the information at hand. The population of Uzbekistan is about 28 million people. How far away was your answer? If you are like most people, you assumed something much lower. You probably thought it was more than 12 million but less than 28 million.

You depend on anchoring every day to predict the outcome of events, to estimate how much time something will take or how much money something will cost. When you need to choose between options, or estimate a value, you need footing to stand on. How much should your electricity bill be each month? What is a good price for rent in this neighborhood? You need an anchor from which to compare, and when someone is trying to sell you something, that salesperson is more than happy to provide one. The problem is, even when you know this, you can’t ignore it (McRaney 2012, 215).
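As a rough illustration, here is a toy “anchoring and adjustment” model in Python. The adjustment factor is invented for the sketch; the point is only the direction of the effect: estimates start at the anchor and move partway toward what the person would otherwise believe.

```python
# A toy "anchoring and adjustment" sketch; the 0.4 adjustment factor is invented.
def anchored_estimate(own_belief, anchor, adjustment=0.4):
    # People adjust away from the anchor but typically stop short of
    # where they would have landed without it.
    return anchor + adjustment * (own_belief - anchor)

# Uzbekistan example: actual population about 28 million, anchor given as 12 million.
estimate = anchored_estimate(own_belief=28_000_000, anchor=12_000_000)
print(f"{estimate:,.0f}")  # 18,400,000 -- above the 12M anchor, well short of 28M
```

That lands the estimate above 12 million but well short of 28 million, exactly the range the passage above predicts most readers will guess.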

In the realm of analysis, there are a number of counters to this problem. The key is to use techniques that evaluate your answers. Humans are too quick to accept their own answers, whether because they lack insight, are lazy, are arrogant, or for many other reasons. Thus, one must use devices to constructively question one’s decision-making processes and the validity of the final decision. Some recommended tools from the intelligence community include processes like Diagnostic Reasoning, Analysis of Competing Hypotheses, and Argument Mapping. This is not an exhaustive list, though; there are many techniques and practices that can help in this area. Sometimes something as simple as getting a disinterested party to evaluate your conclusion can help in a pinch. A toy version of one such technique appears below.
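As a rough sketch of one such tool, here is the scoring step of Analysis of Competing Hypotheses (ACH) in Python. The hypotheses, evidence, and scores are invented, and real ACH adds further steps (refining evidence, weighting diagnosticity, sensitivity checks); what the sketch shows is the core idea that hypotheses are ranked by how much evidence contradicts them, not by how much supports them.

```python
# Minimal ACH-style scoring matrix; hypotheses, evidence, and scores are invented.
# Scores: +1 = consistent with the hypothesis, 0 = neutral, -1 = inconsistent.
hypotheses = ["H1: routine exercise", "H2: attack preparation", "H3: deception display"]

evidence = {
    "Increased radio traffic":          [+1, +1, +1],  # fits everything: low diagnostic value
    "No logistics buildup observed":    [+1, -1,  0],
    "Units left visible to satellites": [ 0, -1, +1],
}

# ACH ranks by inconsistency: count the evidence that argues AGAINST each hypothesis.
inconsistency = [
    sum(1 for scores in evidence.values() if scores[i] < 0)
    for i in range(len(hypotheses))
]

for hyp, count in sorted(zip(hypotheses, inconsistency), key=lambda pair: pair[1]):
    print(f"{hyp}: contradicted by {count} piece(s) of evidence")
# Evidence consistent with every hypothesis (the radio traffic) cannot
# discriminate among them; the least-contradicted hypotheses survive review.
```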

___________________________________________

INTL653: DECEPTION, PROPAGANDA AND DISINFORMATION

Lesson 3: Applying Influence Practices by Type
The diverse efforts incorporated within deception, propaganda, and information operations do not lend themselves to a single focus. Those who work in these arenas tend to be specialists in one or a few areas rather than generalists. The reason for this should become obvious in this lesson. Nevertheless, anyone who wants a working understanding of these areas needs at least a passing knowledge of the many component parts. That’s why this lesson focuses on key processes and concepts that are linked to our earlier psychological discussions but are now discussed in terms of application.

This lesson’s discussion largely focuses on something known as “soft power”. This is an important point to remember, since many people are focused on the use of “hard power” and do not appreciate how valuable it can be to avoid the costs of coercive forces like military power or economic sanctions.

Introduction
We again start this lesson by establishing fundamental definitions, concepts, and related practices. It’s important to remember that this course, despite elements that are very personal or individual in nature, is to be viewed from the perspective of a state (country) achieving national objectives. Thus, the tools provided in the course should be viewed from that perspective to achieve the best understanding. As such, the tools being discussed in this lesson are state-based. Further, they are drawn from the American experience but limited by classification constraints. Therefore, the emphasis is almost solely military in nature. This should not lead one to believe that all information operations that involve deception are military; it doesn’t help that even military doctrine sometimes implies as much. As you read on, keep in mind that deception practices can be performed by agencies other than the U.S. military.

Within the material provided, it’s important to remember that there is also a hierarchy of concepts, as discussed in the Week One lesson. Influence is the broadest of ideas and covers all that we talk about. Yet, it’s so broad that it’s not very helpful for much of what we’re doing in this course. Within influence, information usage can be either deceptive or informative. Both seek to influence, but the latter makes no effort to hide the truth. Sometimes the line between the two can be fine indeed. This can be seen in practices like MISO/PSYOP and propaganda. As you’ve read, propaganda can be both deceptive (black and gray) and informational (white), though all three types seek to influence. This leads us to the third tier for this week’s discussion.

[Interactive graphic: the three-tier hierarchy of influence concepts]

Definitions
As with past weeks, there are definitions to be derived; however, since some of these are plainly laid out in the readings, they’ll only be discussed briefly here. Specifically, these are MILDEC and MISO/PSYOP. There is another overarching term that will be added here — information operations. It has been added because you’ll see it used in many military publications that encompass aspects of our discussion. However, let’s first consider a term used in this course that you won’t hear much about in the readings. Nevertheless, it’s an important part of deception. The term is disinformation.

Disinformation
Disinformation is a term found in the name of this course, but not a term you’ll find in this week’s readings. Disinformation sounds like an English word. It does have its roots there, but it came back to English after editing by the Soviets, who created the Russian term дезинформация (dezinformatsiya). Disinformation is best thought of as a specific form of false or inaccurate information that has been spread to achieve a political end (Pacepa and Rychlak 2013). In discussing the Soviet use of disinformation, Pacepa and Rychlak (2013) explain that “Elsewhere in the world, foreign intelligence services are primarily engaged in collecting information to help their heads of state conduct foreign affairs, but in Russia and later throughout the Russian sphere of influence, that task has always been more or less relevant. There the goal is to manipulate the future, not just to learn about the past. Specifically, the idea is to fabricate a new past for enemy targets in order to change how the world perceives them” (5).

Don’t confuse this with misinformation, which is material that is unintentionally incorrect. In military usage, disinformation may involve the communication of information designed to redirect enemy efforts, as in Operation Mincemeat during WWII. This is where there tends to be overlap with the concept of deception. However, the term is often used differently, so be wary.

It’s an ill-defined term that is often applied widely to describe false or deceptive efforts, as in “Country X engaged in a disinformation campaign to discredit its leading critic.” For historical references, one might look to the case of the United States’ attack on Jacobo Arbenz or the Soviet attack on Leon Trotsky. Both countries initially sought to discredit their respective targets through rumor campaigns, falsified news reports, legal efforts, diplomatic efforts, and other means. In the case of Arbenz, the effort worked. In the case of Trotsky, who advocated a form of Communism counter to Stalinism, these efforts, which included widely publicized show trials, failed. This led to several assassination attempts. Finally, an ice axe was deemed successful.

Fundamentally, disinformation is both intentionally false and intentionally spread. This is in contrast to a related term, misinformation, which refers to the unwitting spread of false or inaccurate information. Normally, both can be involved in deception efforts. For example, an agent may intentionally promote false information (disinformation) in order to turn opinion against a target. Those who receive such information and believe it to be true might very well pass it on. If they do, they are sharing misinformation, because they don’t know that it isn’t true. Because of the convoluted nature of these related ideas, the term disinformation doesn’t appear in the lesson readings, nor will it be discussed in any substantive way.

Propaganda
We began the discussion of propaganda in the last lesson, but there is more to say. First, it’s important to remember that, for the purposes of this class, propaganda involves a (state or proto-state) government transmitting a message, though the message outlets may be non-governmental (Morris 2016). Originally, propaganda had a positive connotation; it was long held as a neutral term related to communicating ideas and attitudes, and many countries had some official entity known as the propaganda arm. This changed most dramatically after Allied propaganda efforts successfully attacked the propaganda of the Axis, especially that of the Nazis. The United States was especially successful in executing an information campaign against the Nazi propaganda apparatus, an effort that gave propaganda a bad name. The American effort was more subtle. For example, during World War II the United States had several propaganda arms, with the U.S. Office of War Information being one of the most effective. Though the psychological techniques may be the same, the difference between propaganda and other forms of influence (public relations, marketing, advertising, political campaigning, etc.) is the source.

Jowett and O’Donnell define propaganda as, “a form of communication that attempts to achieve a response that furthers the desired intent of the propagandist” (2012, 1). Of course, this isn’t very helpful. That’s like saying that persuasive writing is used by an author to persuade. Fortunately, Jowett and O’Donnell build on this later in the text. They add this: “propaganda is the deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist” (2012, 7).

One addition that goes beyond the bounds of specificity found in our readings involves efforts by political entities. Though these most commonly may be thought of as governments, they would also include other organizations such as al Qaeda, Sinn Fein, and others. Most of these have political agendas that involve the creation or change of political structures that would allow them to create a state in their own image, thus my use of the term proto-state in the forum this week. By focusing our attention on propaganda as a product of states and proto-states, we stay more consistent with the modern application of this term in much of the literature and practice. Further, it helps keep us from muddling issues when considering techniques that are found in a wide range of influence and deception efforts: marketing, advertising, public relations, political campaigns, intimate relationships, etc. A former student added to this discussion with the following statement:

Propaganda takes a variety of forms with white, gray, and black. White propaganda makes no effort to hide the source. A common form of this type of propaganda can be found in public diplomacy efforts conducted by the U.S. State Department. The information is normally verifiable and perceived as true by its originators. Gray propaganda tends to obscure the source but doesn’t purport to be from someone else. It also tends to be verifiable and perceived as true by its originators, though the source is obscured to increase receptivity. Black propaganda promotes a message that supports political aims but does so by attributing the material to sources other than the originator. This form of propaganda often involves falsehoods, innuendo, and other material that appeals to the recipients.

Propaganda Continued…
Though your readings don’t discuss it, there is a role known as counter-propaganda. If one understands propaganda as “the deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist” (Jowett and O’Donnell 2012, 7), then the counter is relatively simple: it involves thwarting such efforts. How that might be done depends upon the nature of the information operations being waged. Nevertheless, certain elements remain true in most situations. Whatever counter-propaganda efforts may be mounted, they normally rely upon clearly and quickly conveying understandable and true information that is suitable for the targeted audience. Successful counter-propaganda unmasks the actual source behind gray and black propaganda as well as any falsehoods found in said propaganda products.

Remember that as one looks at information operations (IO), it will not always be clear where propaganda and counter-propaganda efforts fall. Many of the functions fall within the realm of MISO, but because not all propaganda/counter-propaganda efforts are solely military (no matter what the military doctrine says), these efforts will extend to other agencies as well.

Information Operations
This term is commonly applied by the U.S. military to describe a wide range of offensive and defensive functions. These are employed within the physical, informational, and cognitive domains to achieve national objectives. These operations and the underlying concepts are developed within JP 3-13, Information Operations. That publication defines IO “as the integrated employment of electronic warfare (EW), computer network operations (CNO), psychological operations (PSYOP), military deception (MILDEC), and operations security (OPSEC), in concert with specified supporting and related capabilities, to influence, disrupt, corrupt or usurp adversarial human and automated decision making while protecting our own” (DOD 2014, GL-3).

MISO/PSYOP and MILDEC
The first acronym reflects the new, politically inspired name, Military Information Support Operations (MISO), for the category of influence — both deceptive and informational — that has long been known as psychological operations (PSYOP). Obviously, if a government entity other than the military employed such methodologies using other than military assets, it would not be called MISO. There are a number of names and euphemisms that may be applied in such cases, but they are not necessary for understanding the principles.

Other Information Operation Functions
The other components placed in the domain of information operations are electronic warfare (EW), computer network operations (CNO), and operations security (OPSEC). EW is often focused on more kinetic and technical aspects. For example, there is often a military imperative to destroy or incapacitate sensor systems that provide opponents information about friendly movements. There are also deception efforts that can occur in this arena. A frequently used acronym that describes the four primary forms is MIJI: Meaconing, Intrusion, Jamming, and Interference. The first involves intercepting and rebroadcasting signals, usually with some error(s) inserted. Think of what might happen if navigational information sent by an enemy to its aircraft were re-coded with faulty elevation information. This might throw off the effectiveness of the enemy aircraft or cause the aircraft to crash on landing. Intrusion involves penetrating an enemy network and causing confusion by generating erroneous messages. Jamming and interference are processes that block or reduce reception of signals. Given the nature of EW, its influence and nature won’t be discussed further in this class.

CNO involves the defense of friendly systems and offensive operations against the systems of opponents. Though there can be some element of deception in some of these processes, this area of investigation is better suited to cyber electives offered by APUS.

OPSEC, on the other hand, is a fundamental process that is inherently defensive in nature. The poster below underscores the most basic nature of OPSEC — don’t share information with others who don’t have the clearance AND the need to know. It’s surprising how simple little bits of information can be pieced together to make a much clearer picture of an opponent’s plans or activities. Often those assigned to collect information on a target look for mundane things: changes in shifts, increased security, increased deliveries, etc. In fact, one of the open secrets in Washington, D.C. involves pizza delivery to the Pentagon. When there are significant spikes in delivery requests, those who are watching know that there’s likely some crisis or military operation in the offing. Thus, it’s time to start asking people in and around the Pentagon questions to better understand what’s happening. A toy version of this kind of indicator monitoring appears below the poster. The Bottom Line on OPSEC outlines the elements of OPSEC.

[Image: “The Enemy is Listening” poster]
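As a toy illustration of how such mundane indicators get flagged (every number below is invented), here is a sketch that marks a night when deliveries spike far above the recent baseline:

```python
from statistics import mean, stdev

# Toy indicator monitoring; all counts here are invented for illustration.
nightly_deliveries = [22, 25, 19, 24, 21, 23, 20, 58]  # hypothetical counts

baseline = nightly_deliveries[:-1]          # the week of "normal" nights
mu, sigma = mean(baseline), stdev(baseline)

tonight = nightly_deliveries[-1]
z_score = (tonight - mu) / sigma
if z_score > 3:
    print(f"Anomaly: {tonight} deliveries (z = {z_score:.1f}) vs. baseline {mu:.0f}")
# A watcher piecing together mundane indicators works the same way:
# the signal is the deviation from an established pattern of life.
```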

Connecting the Dots
One of the things I had hoped you’d gain from this week’s discussion is a sense of how the terms influence, deception, disinformation and propaganda fit together. As discussed, influence is the broadest area of endeavor with deception and propaganda falling within it. Further, deception and propaganda overlap but do not subsume each other. Thus, to conceptualize these ideas you would see something like the following image.

As for other functions or terms discussed earlier in the lesson, their placement would vary. Disinformation and MILDEC would be squarely within the red field labeled deception. CNO and EW would overlap influence and deception; however, most of their functions would extend outside these circles because of their non-influence elements. Of course, these are not the only functions that rely on technology. There are many ways in which technology is essential to influence and deception efforts.

[Image: Venn-style chart showing how influence, deception, and propaganda connect and overlap]

Technical Considerations
The authors mention the effects of technology in seeking these ends. It’s important to realize that although the authors seem to impute innovation to technology, there’s more going on. It’s true that deception in many forms has been changed by the advances of technology; however, it might be more accurate to say that more is now possible in the realm of deception. Many ideas were conceived long ago that could not be advanced for lack of means. For example, the idea of giving the illusion of a sentry, a war machine, or some other item has long been conceived of as a lure and a deterrent. However, it’s only in recent history that such things have become more practicable.

Imagery and Soft Power
One area that has increasingly been exploited involves imagery, especially video. In fact, there was a great example not that long ago: the announcement of working hover boards like those from the Back to the Future movie series. There was the announcement video itself, and another that discusses what really happened. Of course, countries like to get in on this act too. For example, China’s new J-10 fighter and the movie Top Gun apparently come together in a Chinese propaganda piece. It can be embarrassing when such frauds come to light. First, there’s the speculation and then the confirmation.

Others are more controversial. Faked Palestinian videos have been a common point of discussion in this regard. The al Dura video is an excellent example of how readily “true” images can be created that generate a public perception and response, or at least that’s what some say. The speed of the internet further compounds the power of these things. Some argue that the speed at which analysis can be done on internet material limits the consequences of such efforts; however, I would suggest the impact of the al Dura video and many other doctored videos from across the world suggests otherwise. This is part of what your authors are referring to when they discuss the impact of technological change on the field of deception. Here’s a discussion. Here’s another link for a roundup story on al Dura that you might find interesting, though the outlet in question does not hold that the video is a fake.

———————————————————————–
Lesson 6: Deception in the Cold War
This week we move on from World War II, in which deception became formalized and specialized in nature and in practice. These changes helped dramatically fuel the behavior of countries in the post-WWII era. In this lesson, we reach the most complicated phase of deception and information operations yet: the art and science of the last two world wars combined with increasing globalization, rapid technological change, and the threat of global nuclear warfare. The capabilities, the reach, and the risk had never been greater. Further, new actors were beginning to enter the picture.

The United States, the Soviet Union, their respective allies, and most other countries in the international system began to engage in deception and information efforts on a daily basis. With propaganda infusing government statements at all levels, it became increasingly difficult to determine what was true and what wasn’t. For example, the Soviet practice of maskirovka would grow to permeate its entire society. Likewise, the open, democratic society of the United States would discover there were more than enough secrets and deceptive efforts from its leaders. Many of these involved extensive information campaigns that built on lessons from Operation Bodyguard, though in some cases the complexity was arguably greater. This lesson touches on some of these as a means to build your understanding further. It focuses on case studies from various actors during this time frame.

[Image: The Red Iceberg (1960) propaganda poster]

Introduction
Deception in the Cold War built on efforts developed in the First and Second World Wars, bringing some practices to new heights of development. What had been art in WWI became science in WWII, and the Cold War provided opportunities worldwide to develop that science further. Though there were two “gorillas” in this conflict — the United States and the Soviet Union — there were other affiliated actors and some free-agent countries. For both superpowers and many of their allies, the Cold War was framed as an existential threat: a life-and-death struggle in which no mistakes could be made. This lesson will look at all of these with an emphasis on the competing efforts of the United States and the Soviet Union. However, other elements will also be touched on.

Arguably, the struggle between the superpowers to outdo each other during the Cold War could not have been sustained by either side without deceptive tactics. As in WWII, the practice of deceiving enemies and domestic audiences continued. In fact, it arguably grew.

[Images: Soviet decoy efforts; U.S. recruiting poster]
Below you see a decoy road-mobile nuclear missile launcher that was used to deceive the intelligence agencies of the United States and its allies.

The 45th Separate Engineer-Camouflage Regiment of Russia

United States
The United States of America had the challenge of being an open democracy facing a more closed, oligarchic state. This meant that U.S. efforts had to be more robust in order to keep plans from being leaked (OPSEC), consistent with U.S. law, and able to withstand congressional and judicial scrutiny.

The need for more subtle and controlled deception efforts from within an open, democratic society cannot be overstated. In contrast, countries like the Soviet Union and China could control most forums of media. This often led to less careful efforts.

Likewise, the United States developed a range of deceptive efforts that covered all kinds of strategic systems, including the underwater sonar arrays (SOSUS) that were essential to detecting Soviet nuclear submarine movements. However, efforts here generally needed to be more nuanced, more controlled.

Soviet Union

The integration of deception into the Soviet government was extensive. One example that can easily be drawn upon is the Soviet use of fake missiles in parades around Red Square. This was done for a few reasons: to confuse adversaries about how many missiles the USSR truly had, and to further bolster nationalism within the USSR, as the missile arsenal was seen as a source of pride among Soviets.

Kuban Cossacks during the Moscow Parade

Allies/Surrogates/Other Actors
For those who didn’t live through the Cold War, it can be difficult to conceive of how pervasive deceptive efforts really were. In addition to the United States and the Union of Soviet Socialist Republics, there were allies, independents, and other actors.

The United States was considered the “leader of the West” with greater or lesser degrees of leadership given to North Atlantic Treaty Organization (NATO) countries, Southeast Asia Treaty Organization (SEATO) countries (Australia, France, New Zealand, the Philippines, Thailand, and the United Kingdom), and countries in other organizations in South America, Central America, Africa, and the Caribbean. All of these member states had their own interests to promote and used influence and information operations to promote those interests.

Allies/Surrogates/Other Actors Continued…
As we look at the Cold War this week, one thing you’ll see is that the truth becomes increasingly difficult to sort out. Deception and propaganda become constants in all phases of life. Consider this as you translate previous weeks’ concepts into this new, challenging period.


Conclusion

In this lesson, we investigated the highly globalized, highly technical new world order in which deception and information efforts by states became common instruments of policy. Often these efforts were deemed essential to avoid the worst-case scenario. They were also becoming a staple of non-state revolutionary movements. In Lesson Seven, these factors will become more important as the Cold War comes to an end and such groups grow in number and importance. During this period, understanding deception efforts continued to be important, even as peace seemed ever more common.


Image Citations
Everything for the Front, USSR WWII propaganda poster. Public domain.

“The 45th Separate Engineer-Camouflage Regiment of Russia.” Wikimedia Commons. https://commons.wikimedia.org/wiki/File:Aircraft_preparation_-_S-300_SAM_mock_up_%283%29.jpg.

“‘America Calling’ Civilian Defense Poster.” Wikimedia Commons. https://commons.wikimedia.org/wiki/File:%22America_Calling%22_Civilian_Defense_-_NARA_-_513793.jpg.

“Kuban Cossacks 1937.” Photo courtesy of Alexander Kiyan, rkka.ru. GFDL (http://www.gnu.org/copyleft/fdl.html) or CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0/), via Wikimedia Commons.

