by Justin Sherman
For years, the cybersecurity industry -- and, more broadly, the field of cyber strategy -- has suffered from a serious bout of inertia. While many great thinkers have done much to advance the field, many more remain firmly planted, holding the same positions and ways of thinking they have held for decades. This inertia is highlighted by many thoughtful articles; by the New York Cyber Task Force's report on leverage, which found that organizations are developing innovative technologies yet failing to change the fundamental, asymmetric advantage held by attackers; and by my forthcoming conversations with cybersecurity executives and senior cyber strategists, who say the same. It is further evidenced by a simple examination of how "cyber" itself is treated: as its own discipline, often locked away within the computer or information sciences, never making contact with academic coursework in ethics or business or healthcare. And private-sector organizations are only now waking up to the notion of human-centered design, despite its long history in the startup world. Rather than complain about this problem, we as a society -- meaning state and federal governments, schools and universities, and private-sector corporations -- need to fight this inertia by empowering and encouraging diverse thinking.

First, the government must stop treating cybersecurity as the purview of just "cyber people," a point that future-of-war strategist Lydia Kostopoulos highlighted in our recent interview. While the U.S. military's view of cyber as a domain is perhaps an easy "out," it seriously hampers the ways in which strategists and key decision-makers discuss cyberspace itself.
There are challenging jurisdictional questions that must be answered, yes -- such as the division of authority between the NSA and CYBERCOM, or deciding whether DHS or DOE has authority over protecting critical infrastructure -- but that doesn't excuse the segmentation and isolation of cyber discussions. This is especially a problem at the state and local levels of government.

Second, educational institutions must dedicate resources to teaching cyber, and not just through the lens of computer and information science. As I recently argued, all students -- from business to policy to healthcare to media -- need a "Tech 101" education that prepares tomorrow's leaders to face the challenges of digitization. Looking to cybersecurity in particular, we not only need awareness beyond the circle of developers and hackers who maintain security in code; we also need diverse individuals to enter the field in the first place. This simply cannot happen without appropriate coursework in elementary schools, middle schools, high schools, and colleges, or without certificate programs that provide alternative forms of learning. As New America's Laura Bate has written, "for scalable solutions to the cybersecurity workforce shortage, the U.S. government will need to look beyond just higher education." Diverse teaching will empower diverse thinking -- fighting this cyber inertia.

Third, organizations must work harder to hire more diverse people. The field remains extremely homogeneous, as anyone who has ever set foot in a conference or cybersecurity workplace can tell you, and there is clear data that this lack of diversity is making us less safe. Different people handle risk in different ways, which means they think about cyber differently -- again pushing against the inertia that keeps cybersecurity conversations so stagnant.
Organizations must therefore take clear steps to hire diverse individuals, looking to such groups as "Women in Homeland Security" and "Help a Sister Up," or to such events as Europe's first all-female cybersecurity conference. If we want better strategies and policies around cyberspace, hiring different types of people (really, anyone outside the current frame of thinking) is a necessary step forward. We will never attain total security in cyberspace, as such a state doesn't exist. However, we can fight the inertia of thought we currently face -- and it starts with bringing in new thinkers who will challenge existing assumptions.

by Justin Sherman
Cyber, herein referring broadly to the digital and online space, does not operate in isolation from "conventional" elements that affect foreign policy. Geopolitical economy plays a direct role in shaping the physical infrastructure behind the Internet, which in turn impacts everything from browsing speeds to content censorship. Philosophical works on deterrence and honor still hold enormous value in the digital era. And as the last two weeks have already shown me, the same goes for semantic understanding.

To use a demonstrative anecdote: I'm reminded of Duke University's 2018 Winter Forum, "Crisis Near Fiery Cross Reef," during which Georgetown's Dr. Oriana Mastro gave a fascinating talk on the thought process (and actual logistics) behind Chinese military decision-making. Chinese military leadership, Dr. Mastro explained, sees deterrence quite differently than its American counterparts do -- which, perhaps obviously, leads to some tangible misunderstandings in the international arena. These misunderstandings of course have their enormous complexities (which I am not qualified to fully understand myself), but they are in some way caused by semantics: two nation-states using the same terminology but thinking and meaning fundamentally different things.

Cyber is not exempt from this reality. Much of the West thinks of "information security" as the ability to ensure the confidentiality, integrity, and availability (CIA) of information; the tech community even uses the abbreviation "InfoSec" in this regard. Encryption, hashing, and data segmentation are just some of the techniques that fall under this "information security" umbrella, as are standards compliance, breach reporting, and crisis management. Yet despite our enormous reliance on what many assume to be a technically objective definition, other nation-states do not hold the same understanding.
Perhaps most notably, Russia sees "information security" in a different light -- related to the government's ability to control the flow of information (e.g., as it does with television) in order to maintain national sovereignty and political order. While data security is encapsulated in this idea, it arguably refers more to censorship, surveillance, and control of the Internet than anything else; its meaning isn't just technical and operational, but philosophical and deeply political as well. What many think of as a clear term, it turns out, is quite semantically ambiguous.

The same semantic issues occur with other powerful nation-states like China, whose ideas of "cultural security" and "innovation security" might not resonate with the West as-is, let alone when taken in a cyber context. These challenges arise when trying to translate English cyber terminology into other languages; they even occur within our own country, where debates over the difference between cybersecurity, cyber-security, and cyber security are quite contentious.

Cyberspace is not immune from semantics, and just as two physicians should be on the same semantic page when discussing a patient, cyber strategists need to think more carefully about the words they use and try to reach consensus definitions. As nation-states begin to develop their international cyber strategies and domestic cyber laws, such as Russia and China's 2015 International Code of Conduct for Information Security, we're going to need to speak in cyber terms without losing meaning entirely. This is just another reason for collaboration and consensus-building in the digital era. (We also need to teach students more about this: hence my first article for New America's Cybersecurity Initiative, entitled "Colleges, It's Time for a General Technology Class.")