Cutting Through the Tangled Web:
An Information-Theoretic Perspective on Information Warfare
Air Power Australia Analysis 2012-02
20th October 2012
A Monograph by
Lachlan N. Brumley, BSE(Hons),
Dr Carlo Kopp, AFAIAA, SMIEEE, PEng,
Dr Kevin B. Korb, SMIEEE, MAAAI
Text, computer graphics © 2012 Lachlan Brumley, Carlo Kopp, Kevin Korb

Information Warfare in social systems has a long and colourful history dating back to antiquity. Despite the plethora of well documented historical instances, and well known instances in the biological domain, mathematical formalisms based on Information Theory are a very recent development, produced over the last two decades. Depicted is an RC-135V/W Rivet Joint electronic and signals intelligence aircraft of the 763rd Expeditionary Reconnaissance Squadron in South-West Asia, 2009, a critical asset in both theatre and strategic Information Operations (U.S. Air Force image).
Abstract
Information Warfare, which is the competitive use of information in survival contests, has been a pervasive feature of conflicts since the beginnings of recorded history. The advent of digital technologies for the collection, storage, analysis and distribution of information has not only driven growth in the use of Information Warfare, but also created numerous new opportunities for its use.
The study of Information Warfare has progressed considerably since the term was first formally used by Rona thirty-six years ago. Robust foundation theory, rooted in Shannon's information theory and mathematical game theory, emerged over a decade ago, with the formalisms now identified as the Borden-Kopp model.
This paper surveys extant and past research in the information-theoretic foundations of Information Warfare. Specifically, both qualitative and quantitative definitions of information are surveyed, and closely compared. The most commonly used definitions of Information Warfare are then explored and analysed from an information-theoretic perspective.
Research covering the four canonical strategies, which are based on Shannon's definition of information and the Turing machine model, is explored in detail, with a specific focus on the boundary conditions between the four respective canonical strategies, and the differences between the Borden and Kopp models. Mappings between qualitative and information-theoretic models of Information Warfare are explored. The paper also surveys numerous applications for information-theoretic representations of Information Warfare.
The advent of information-theoretic models for Information Warfare establishes this as an area of study within the information sciences, in addition to its interest for the social sciences, military science and information systems.
Keywords: Information Warfare, Information Operations, Information Theory, Shannon, Turing Machine, Game Theory, Hypergame, Cognitive Cycle, OODA Loop, Mills' Paradox.
“Oh what a tangled web we weave, When first we practise to deceive!”
Sir Walter Scott, Marmion, Canto VI, Stanza 17.
Index
  • Index
  • The Information Warfare Problem
  • What is meant by “Information”?
  • Processing of Information by an Entity
  • What is meant by “Information Warfare”?
  • Shannon's Communication Theory and Information Warfare
  • The Generality of Information Warfare
  • Applications of Information Warfare
  • Conclusions
  • Endnotes
  • References

The Information Warfare Problem
It is widely accepted that the term “Information Warfare” was first used by Thomas Rona in 1976 [3] when discussing the advantages of targeting the information and communication systems an opponent depends upon. This general area was later explored by a number of researchers, with the National Defense University establishing a School of Information Warfare Studies in 1992, and the InfoWarCon series of conferences launching shortly thereafter.
The usage of the term “Information Warfare”, for better or worse, encompasses the full gamut of techniques whereby information is employed to gain a competitive advantage in a conflict or dispute. The problems raised by “Information Warfare” are fundamental in nature and are applicable not just to social or computing systems, but to any competitive survival situation where information is exploited by multiple players. In any contest between two machines, for example between a spam generator and a spam filtering engine, both entities are subject to the very same constraints as biological organisms exploiting information while competing for survival.
This usage amounted to a “soft” definition of “Information Warfare”, followed in 1997 by a formal definition, produced by the US Air Force. The US Air Force definition focussed primarily on military applications in social systems [60]. The formal account has provided a basis for elaboration in the independently developed information-theoretic formalisms of Borden [7] and Kopp [25], which identified “four canonical strategies of Information Warfare” (see §4); “strategy” in both instances was defined in the game-theoretic sense and was related to Shannon’s information theory. Further research has aimed to establish the range of environments in which the canonical strategies apply and tie them to established research in areas such as game theory, the Observation Orientation Decision Action loop, and the theory of deception, propaganda, and marketing. Applications of the theory to network security and military electronic warfare have also emerged.
Perhaps the most surprising application of the four canonical strategies has been in evolutionary biology. Very little effort was required to establish that the four strategies are indeed biological survival strategies, evolved specifically for gaining an advantage in survival games. In the evolutionary arms race pursued by all organisms, the use of information is a powerful offensive and defensive weapon against competitors, prey and predators.
The four canonical strategies of Information Warfare provide a common mathematical model for problems that arise in biological, military, social and computing systems. This allows, importantly, a unified approach to Information Warfare, simplifying automation. For example, implementing a robust security strategy requires a common model for analyzing and understanding problems which arise from a potentially wide range of an opponent’s offensive and defensive uses of information.
In this paper we shall review recent research in this area and provide a comprehensive description of the information-theoretic foundations of “Information Warfare”. The paper does not aim to address broader issues in strategy and how “Information Warfare” may aid or hinder existing paradigms of conflict, nor will this paper attempt a deeper study of the psychological dimensions of “Information Warfare”.
What is meant by “Information”?
In the context of Information Warfare, information can be either a weapon or a target. Somewhat surprisingly, as the discussion below details, the definitions and usage of “information” in this literature are often left vague.
Information is defined as “knowledge communicated concerning some particular fact, subject, or event; that of which one is apprised or told; intelligence, news” by the Oxford English Dictionary [54]. Another definition from the same source states that it is “separated from, or without the implication of, reference to a person informed: that which inheres in one of two or more alternative sequences, arrangements, etc., that produce different responses in something, and which is capable of being stored in, transferred by, and communicated to inanimate things”. The first definition equates information with news or intelligence regarding a fact, subject or event, while the second describes it as data which can be stored and communicated by machines. Combining these definitions, we may say that information provides knowledge of an object, event or phenomenon that can be stored or communicated by people or machines.

Figure 1: A General Communication System (Shannon, 1948)
The mathematical definition of information comes from Shannon’s [52] work on communication theory, which proposes an abstract model of communication represented by an information channel between an Information Source and a Destination [53]. This model (Figure 1) describes a generalisation of communication and consists of five parts — the Information Source, the Transmitter, the Channel, the Receiver and the Destination.
The Information Source selects the Message to send from a set of possible messages. The Message is mapped by the Transmitter into a Signal and transmitted over the communication channel. The channel is simply the medium that carries the signal, and its physical instantiation depends upon the communication method. The channel is non-ideal, and therefore impairments which damage the signal are inevitably introduced during transmission. These impairments may be additive, such as noise or other signals, or may distort the signal. A frequent model for such impairments is simple additive white Gaussian noise (AWGN), due to its wide applicability in electronic communications and to its mathematical tractability. In Shannon’s model noise is produced by a Noise Source that is connected to the channel. The Receiver collects the signal from the channel and performs the inverse of the Transmitter’s mapping operation, converting the signal with any impairments back into a Message, which is then passed on to the Destination.
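As a concrete illustration of this pipeline, the following Python sketch passes a random binary Message through a Transmitter, an additive white Gaussian noise channel and a Receiver, and reports the resulting bit error rate. The bit-to-signal mapping (binary antipodal signalling) and the noise power chosen here are illustrative assumptions, not part of Shannon's general model.

  # A minimal sketch of Shannon's communication model over an AWGN channel.
  import numpy as np

  rng = np.random.default_rng(0)

  def transmit(bits):
      # Transmitter: map each bit of the Message to an antipodal Signal level.
      return np.where(bits == 1, 1.0, -1.0)

  def channel(signal, noise_power):
      # Channel: the Noise Source adds white Gaussian noise to the Signal.
      return signal + rng.normal(0.0, np.sqrt(noise_power), size=signal.shape)

  def receive(signal):
      # Receiver: invert the Transmitter's mapping by thresholding at zero.
      return (signal > 0).astype(int)

  message = rng.integers(0, 2, size=10_000)     # Information Source
  received = receive(channel(transmit(message), noise_power=0.5))
  print("bit error rate:", np.mean(received != message))

Raising the noise power degrades the channel; as the signal-to-noise ratio falls, the Receiver's reconstruction of the Message increasingly diverges from what the Information Source selected.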
In communication theory, information is a measure of the freedom of choice one has when selecting a message for transmission [53]. According to this definition, highly unlikely messages contain much information, while highly predictable messages contain little information. The Shannon information measure (Shannon entropy), H, of a message consisting of N symbols appearing with respective probabilities p_i is (Equation 1):

H = -\sum_{i=1}^{N} p_i \log_2 p_i \qquad (1)
Shannon and Weaver provide a simple demonstration of how the probability of a message’s selection affects the amount of information in the transmitted message [53]. Suppose there is a choice between two possible messages, whose probabilities are p1 and p2. The measure of information, H, is maximised when each message is equally probable, that is p1 = p2 = 1/2. This occurs when one is equally free to select between the two messages. Should one message become more likely than the other, H will decrease, falling toward zero as that message approaches certainty.
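This behaviour is easy to verify numerically. The short Python sketch below implements Equation 1 directly and reproduces the two-message example; the probability values are illustrative.

  # Shannon entropy (Equation 1) of a source with the given symbol probabilities.
  import math

  def shannon_entropy(probs):
      # H = -sum(p_i * log2(p_i)), in bits; zero-probability terms contribute nothing.
      return -sum(p * math.log2(p) for p in probs if p > 0)

  print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: two equiprobable messages maximise H
  print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: H falls as one message dominates
  print(shannon_entropy([1.0, 0.0]))   # 0.0 bits: a certain message carries no information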
Shannon [52] also demonstrated that the capacity of a noisy communications channel of a given bandwidth is bounded by the relationship in Equation 2, where C is the channel capacity, W is the channel bandwidth, P is the signal power and N is the noise power. The channel capacity is measured in bits per second, the bandwidth in Hertz, and the signal and noise powers in Watts. Clearly, how much information a channel can carry can be manipulated via the channel’s bandwidth and signal-to-noise ratio. Given a fixed channel capacity, information transmission will be maximised if the actual code lengths of messages are equal to their Shannon information measure, that is, by using an “efficient code” [53].
C = W \log_2 \left( 1 + \frac{P}{N} \right) \qquad (2)
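A direct transcription of Equation 2 makes the trade-off explicit; the bandwidth and power figures below are arbitrary illustrative values.

  # Shannon-Hartley capacity bound (Equation 2), in bits per second.
  import math

  def channel_capacity(bandwidth_hz, signal_w, noise_w):
      # C = W * log2(1 + P/N)
      return bandwidth_hz * math.log2(1.0 + signal_w / noise_w)

  # A 3 kHz channel (roughly telephone bandwidth) at a 30 dB signal-to-noise ratio:
  print(channel_capacity(3_000, 1.0, 0.001))   # approximately 29,900 bit/s

Doubling the bandwidth doubles the capacity, whereas the capacity grows only logarithmically with the signal-to-noise ratio.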
Shannon’s definition of information is a purely quantitative measure, determined by the probability of the message’s transmission [55]. Therefore, information as defined by Shannon is a property of a communication signal and should not be confused with the semantics (meaning) of the signal. Weaver linked Shannon information to thermodynamic entropy [55]. Entropy is used in the physical sciences as a measure of the level of organisation and structure. A system with high entropy is highly chaotic or random, while low entropy indicates a well-ordered and predictable system. Following Weaver, Shannon entropy reports how organised the information source is, which determines the rate of information generation [53].
Shannon’s model relativises information to the prior probability distribution of the receiver. Prior probabilities report the predictability of the message and thus its information content, or “surprise value”.
Another definition of information, similar to Shannon’s, is provided by Wiener [61]. Wiener describes information from a cybernetic point of view, where it is a property of the signals communicated between the various components of a system, describing the system’s state and operations. While Shannon’s definition was only applied to communication, Wiener applies his to the control and communication processes in complex mechanical, electronic, biological and organisational systems. Wiener also defines information mathematically in terms of probability, where the probabilities describe the choice between alternatives. Thus, Wiener’s definition also relates information and entropy; in particular, Wiener’s information measures negative entropy, where the information in a communicated message is a measure of the order, or lack of randomness, in the message. Shannon’s measure, on the other hand, was the information source’s entropy, describing the uncertainty of the message being transmitted, which can be interpreted as the number of bits needed in an efficiently utilised (noiseless) channel to report the state of the information source. Thus, in Shannon’s case information might be described as a potentiality, a measure of how likely a signal is to occur. In both these definitions information is a property of the communicated signal.
Bateson [1] instead defines information as a “difference which makes a difference”. According to Bateson, this definition is based upon Kant’s assertion that an object or phenomenon has a potentially infinite number of facts associated with it. Bateson argues that sensory receptors select certain facts from an object or phenomenon, which become information. Bateson suggests that a piece of chalk may be said to have an infinite number of “differences” between itself and the rest of the universe. These differences are mostly useless to an observer; however, a few of these differences are important and convey information to the observer. The filtered subset of important differences for the chalk could include its colour, location, shape and size. In Bateson’s definition of information, which differences are filtered to become information depends upon the perspective of the interested party.
Determining or quantifying the value of an item of information is thus dependent upon the observer, the circumstances of that observer, and the time at which the information is acquired. For instance, knowing which stocks will gain in the market before other observers know can yield a higher value than learning this information at the same time as others. Learning such information under circumstances where it cannot be exploited inevitably diminishes its value.
In games of incomplete information, discussed later, the value of information in reducing uncertainty can be related directly to the payoff in the game [42, 18]. If the information secures a high payoff that would otherwise be denied, the information is of high value. If the game is iterated or comprises multiple turns or steps, the time at which the information is acquired determines the manner in which the value of information changes over time. In this sense, Bateson’s representation is a qualitative mapping of what modern game theory tells us indirectly about the context and time variant properties of the value of information. This paper will not explore the problem of how to quantitatively determine the value of an item of information, an area well studied in recent game theory, as that problem is distinct from problems arising from the use of information to gain an advantage in a contest or conflict.
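While a quantitative treatment is beyond this paper's scope, the basic relationship between information and payoff is easily illustrated with a toy two-state, two-action game. The payoff matrix and the uniform prior below are assumed purely for illustration; the value of a perfect signal about the state is the expected payoff achievable with it, minus the best expected payoff without it.

  # Toy illustration: the value of perfect information in a game of
  # incomplete information, with an assumed payoff matrix and prior.
  prior = {"state1": 0.5, "state2": 0.5}
  payoff = {
      "A": {"state1": 10, "state2": 0},
      "B": {"state1": 0,  "state2": 10},
  }

  # Without information: commit to the one action with the best expected payoff.
  without = max(sum(prior[s] * payoff[a][s] for s in prior) for a in payoff)

  # With perfect information: pick the best action in each state, then average.
  with_info = sum(prior[s] * max(payoff[a][s] for a in payoff) for s in prior)

  print("value of information:", with_info - without)   # 10 - 5 = 5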
Boisot [6] provides a different model of information, arguing that what has previously been called information can instead be considered three different elements — data, information and knowledge. Entities first observe and make sense of data, converting it to information, which is then understood and incorporated into the entity’s knowledge base. Data describes the attributes of objects, while information is a subset of the data, produced by the filtering of an entity’s perceptual or conceptual processes. Boisot’s definition of information is more psychologically oriented and much broader than the mathematical definitions of Shannon or Wiener.1
Definitions of “information” fall into the categories of quantitative or qualitative: the strictly mathematical definitions of Shannon and Wiener versus ordinary language definitions, such as those of the Oxford English Dictionary, Bateson and Boisot. From a mathematical perspective, information is a property of a communicated signal, determined by the probability of that signal. The more likely a signal is, the less information it has, while the less likely it is, the more surprising its arrival and so the more information it possesses. The informal definitions consider information to be descriptions of some aspects of the world that can be transmitted and manipulated by biological organisms and machines. Of course, quantitative and qualitative definitions are potentially compatible and can be used jointly.
When “information” is used in the context of Information Warfare, it is commonly under its qualitative meaning. For example, when describing Information Warfare against computer systems, information may be used to refer to a computer program, stored data or a message sent between systems. The qualitative definitions of information are, however, vague, leading to conflation with distinct concepts such as knowledge, data and belief. Applying Shannon’s definition of information allows Information Warfare to be studied more rigorously. Whereas the mathematical definitions treat information as a property of a communicated signal, under a qualitative interpretation it is likely to be confused with the semantics of the signal.
In any case, the term “information” as it appears in much of the literature is context sensitive and that context must be interpreted carefully if the meaning of the text is to be read as intended (i.e., with high signal-to-noise ratio).
Processing of Information by an Entity
The target of an action to manipulate or impair information used by a competing entity is the victim’s decision-making. From a game theoretic perspective the intention is to compel or entice the victim into making choices which are to the advantage of the attacker. Understanding such mechanisms is therefore important to understanding how Information Warfare produces its intended effect.
Any decision-making mechanism is inherently constrained by the information collection and processing behaviours of the system of which it is a part. There are many conceptual models which attempt to describe the information collection and processing behaviours of entities. Such models provide a relatively simple representation of the decision-making process, against which the effects of Information Warfare on the entity’s decision-making process can be studied. One such model of the decision-making cycle is the Observation Orientation Decision Action (OODA) loop model [10, 48], which we have employed in our research due to its generality, and because it is widely used and understood.
The OODA loop model is a method of representing the decision-making and action cycles of an entity. It was originally developed to model the decision-making process of fighter pilots; however, its generality makes it suitable for modelling most decision-making cycles. The OODA loop is commonly used to describe the decision-making process in both military and business strategy [56].
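As a concrete rendering of the cycle, the Python sketch below implements the four OODA stages as a repeating loop. The stub behaviours, an agent stepping its position toward a smoothed estimate of a fixed target, are illustrative assumptions only; the OODA model itself prescribes just the four stages and their ordering.

  # A minimal sketch of the OODA loop as a repeating decision cycle.
  def observe(world):
      return world["target"]                     # Observe: sense the environment

  def orient(belief, observation):
      return 0.8 * belief + 0.2 * observation    # Orient: fold the observation into beliefs

  def decide(belief, position):
      return 1 if belief > position else -1      # Decide: choose a course of action

  def act(world, step):
      world["position"] += step                  # Act: alter the environment
      return world

  world, belief = {"target": 10.0, "position": 0.0}, 0.0
  for _ in range(20):                            # each pass is one OODA cycle
      belief = orient(belief, observe(world))
      world = act(world, decide(belief, world["position"]))
  print(world["position"])                       # the agent settles near the target

An attacker who can corrupt the Observation stage, or slow the Orientation stage, degrades every decision that follows, which is precisely the leverage Information Warfare seeks.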