Information warfare uses “means and methods of imparting information to achieve the objectives of the attacking side.” These include intelligence, counterintelligence, disinformation, electronic warfare, deceit, debilitation of communications, degradation of navigation support, psychological pressure, information systems, and propaganda [1]. Distributed denial of service (DDoS) attacks, advanced exploitation techniques, and foreign media outlets all facilitate a foreign agenda in this context. From this perspective, using intrusive digital influence operations (DIOs) as part of a broader influence operations strategy makes perfect sense. Influence operations campaigns are tailored to sow doubt and confusion in order to undermine trust and confidence in the governments of targeted nations. Given the limited possibilities for attribution and the absence of any real threat provoking an armed (or any other kind of) response, DIOs are low-risk, low-cost capabilities that can contribute to an adversary’s destabilization. The problematic nature of the attribution of cyberattacks ensures that it often remains unclear who is actually behind an attack, allowing a degree of plausible deniability even when the source of an attack has been determined [2].

Influence operations are an integral part of hybrid warfare, which is the coordinated overt and covert use of a broad range of instruments, military and civilian, conventional and unconventional, to mount an ambiguous attack on another state. The objective of influence operations is exerting power by influencing the behavior of a target audience: the ability of A “to get B to do something that B would not otherwise do.” Influence operations are thus assumed to modify attitudes and shape opinions through the dissemination of information and the conveying of messages. However, there are more intrusive ways to influence a specific audience that remain in the information realm but can no longer be regarded as the application of soft power, as they are no longer designed to achieve their objective solely through “attraction.” Cyberspace offers numerous possibilities for these kinds of coercive operations, which are designed to influence a target audience by changing, compromising, destroying, or stealing information by accessing information systems and networks. In principle, influence operations offer the promise of victory through “the use of non-military [non-kinetic] means to erode the adversary’s willpower, confuse and constrain his decision-making, and undermine his public support, so that victory can be attained without a shot being fired.” Intrusive cyber capabilities may also be part of military operations. “Influence operations are the coordinated, integrated, and synchronized application of national diplomatic, informational, military, economic, and other capabilities in peacetime, crisis, conflict, and post-conflict to foster attitudes, behaviors, or decisions by foreign target audiences that further [a nation’s] interests and objectives.” The U.S. Department of Defense defines information operations as “the integrated employment, during military operations, of information-related capabilities in concert with other lines of operations to influence, disrupt, corrupt, or usurp the decision making of adversaries and potential adversaries while protecting [its] own.” Information operations include all the efforts undertaken by states or other groups to influence the behavior of a target audience, in peacetime or during an armed conflict. The term is an umbrella for all operations in the information domain, including all soft power activities. Although influence operations are, in principle, non-violent, they can be part of military operations [2].

Hybrid warfare provides many opportunities for the use of cyber capabilities as one of a broad range of possible non-kinetic or non-violent options. If the main goal of political influence operations outside of an armed conflict is to destabilize and confuse adversaries, then attacking the opponent’s digital infrastructure to undermine trust, by compromising, altering, and disrupting the digital services of both the government and the private sector through the use of malware, could be effective. It is inevitable that the future of cyberwarfare will be as much about hacking energy infrastructure, such as power grids, as about hacking minds and shaping the environments in which political debates occur [2].

Information warfare and influence operations are, in principle, intended to get your own message across or to prevent your adversary from doing so. It is not just about developing a coherent and convincing storyline, however; it also involves confusing, distracting, dividing, and demoralizing the adversary. From that perspective, cyberspace seems ideal for conducting operations that will have disruptive, rather than destructive, outcomes. The means through which influence can be exerted rely mostly on spreading information. There are more intrusive ways to influence specific audiences, however, that remain in the information realm but are designed to change, compromise, inject, destroy, or steal information by accessing information systems and networks [2].

With intrusive influence operations, it becomes necessary to separate the “apples” of information content from the “apple carts” of information systems. This is in line with Russian thinking on information warfare, which traditionally makes the distinction between “informational-technical” and “informational-psychological” activities. The semantic or cognitive actions (apples) consist mainly of attacks of information on information (typically narrative vs. narrative) that affect the semantic layer of cyberspace. In other words, these activities create a crafted informational environment. These content-oriented activities can be defined as inform and influence operations (IIOs): “efforts to inform, influence, or persuade selected audiences through actions, utterances, signals, or messages.” Strategic communications (STRATCOM) and propaganda activities fall under this category, as does the deliberate dissemination of disinformation to confuse audiences [2].

These operations target the logical layer of cyberspace through unauthorized access to destroy or alter information. DIOs occur at the logical layer of cyberspace with the intention of influencing attitudes, behaviors, or decisions of target audiences. Because DIOs are undertaken in cyberspace, they qualify as cyberattacks. A cyberattack is “an act or action initiated in cyberspace to cause harm by compromising communication, information or other electronic systems, or the information that is stored, processed, or transmitted in these systems.” Harm includes physical damage and effects on information systems, including direct or indirect harm to a communication and information system, such as compromising the confidentiality, integrity, or availability of the system and any information exchanged or stored [2].
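The integrity component of this harm definition can be made concrete with a minimal sketch. The snippet below (an illustrative example, not drawn from the source; the message contents and variable names are hypothetical) shows how a cryptographic digest recorded at write time reveals that stored information has been altered by an intruder, which is precisely the kind of integrity compromise the definition describes.

```python
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a message."""
    return hashlib.sha256(data).hexdigest()

# A stored message and the digest recorded when it was written
# (hypothetical content, for illustration only).
original = b"Meeting moved to 14:00, room B."
recorded = digest(original)

# An intrusion that silently alters the stored information ...
tampered = b"Meeting moved to 15:00, room C."

# ... is detectable because the digests no longer match.
print(digest(original) == recorded)  # integrity intact
print(digest(tampered) == recorded)  # integrity compromised
```

The same logic underpins real integrity controls such as file-integrity monitoring and signed software updates: the defender compares the current state of information against a trusted baseline rather than inspecting the content itself.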

Influence operations focus on manipulating the psychology of targets through strategic communication, public affairs, inundation, or human-technology interfaces as a mechanism to alter their feelings, experiences, behavior, beliefs, or actions. These campaigns present a unique and pressing technical challenge, because they affect the logical or tactical layer of cyberspace but remain below the international legal threshold for armed conflict. Whether through hacking, disinformation, propaganda, or numerous other weapons, digital influence attacks alter, disrupt, or initiate the flow of data between two entities to achieve a strategic effect [2].

The greatest failure of strategic operations, whether psychological, kinetic, digital, or along any other vector, is the inability of the actor to impart a lasting impact on the behavior or ideology of the target at the culmination of the campaign. Too often, military and intelligence entities compartmentalize and focus on specific short-term objectives rather than exerting the effort to coerce the target to think or act differently in the future. Effective communication is about storytelling. Optimal influence operations disseminate content, deliver propaganda, and leverage multiple platforms, including social media, to affect the outlook, opinions, and actions of specific populations directly and indirectly according to their demographic and psychographic characteristics [2].

Perhaps the most profound characteristic of influence operations is the ease with which adversaries can launch attacks and succeed in modifying the thoughts or behaviors of particular audiences. Operations may consist of multiple layers; however, most vectors do not require technical sophistication, skill, or overwhelming resources. The unsophisticated portions of an attack preclude any risk of escalation, since many amount to defacements or the distribution of false or misleading data. Influence campaigns are extremely effective during both periods of peace and turmoil because they do not amount to an armed attack or cyber war and, even if the source is known, the attacks are difficult to respond to, whether or not the response is proportional. In most instances, various proxies are employed to obscure attribution and provide plausible deniability [2].

Coercive DIOs will become more prevalent in the near future, because they offer the opportunity to undermine an opponent’s credibility with little risk of escalation. The main attraction of DIOs lies in the fact that they are limited in scope and difficult to attribute, thereby limiting the risks of escalation and countermeasures. This is especially reflected in the Russian approach to information warfare, which considers it an instrument of hard power. In Western democracies, by contrast, issues of legality and transparency remain that, in principle, limit the options for using DIOs [2].

It is difficult to counter DIOs, since responding to them might result in a counterproductive outcome or be disproportionate and thus lead to escalation. The international law of state responsibility provides grounds to determine whether a state has breached an obligation under international law (e.g., violation of sovereignty, violation of the principle of non-intervention) in a way that would be deemed an internationally wrongful act. To identify such a violation, it is essential to determine whether a state exercised “effective control” over the group or organization in question. Under the stringent criteria defined by the International Court of Justice, it is difficult to attribute many actions in cyberspace to a state, making the options to respond highly limited [2].

Given their effects, DIOs do not reach the level of an armed attack in the legal sense; that is to say, these activities may not prompt action in self-defense by the injured state under Article 51 of the United Nations Charter. Their low intensity does not, however, imply a legal void; DIOs are simply difficult to grasp through the legal lens. The attraction of DIOs for states lies mainly in the fact that they are difficult to attribute and thus provide plausible deniability, without provoking a strong or quick response from the target nation. The more target audiences and organizations become aware of the need for adequate protection of their digital infrastructure, and of the limited long-term impact of cyberattacks, the less useful DIOs will become. Most DIOs do not require significant technical capabilities because they exploit vulnerabilities in the human psyche and configuration flaws in “low-hanging fruit” networks. Influence operations campaigns can fuel an already existing sense of insecurity and thereby support the overall narrative of the campaign. A study conducted by Chapman University showed, for instance, that Americans fear a cyber-terrorist attack more than a physical terrorist attack. This indicates that an adversary can exploit the fear of the unknown, whether that fear is realistic or mostly imaginary [2].
