Cyber security is a dispersed field, comprising both a variety of topics (ranging from malware analysis, to psychology, to policy) and numerous stakeholders (including government, industry and academia). Silos have unsurprisingly emerged, making it critical for different communities to work together. None of this should be seen as particularly controversial: an interdisciplinary approach is vital for many of the world's challenges, from climate change to artificial intelligence. Yet cross-silo discussions are clearly not functioning as they should in cyber security. I have seen this most clearly from my perspective within the international relations and security studies academic community: more people are getting interested in cyber security (which is great), but their engagement with the broader cyber security field is disappointingly lacking. Young academics are looking at really important issues, and their assumptions rest on what seems like sound logic, yet they regularly produce flawed research and arguments. These people aren't stupid (in fact, they often possess frighteningly high IQs); rather, they are approaching cyber security from a position of ignorance.
Of course, not everyone engaging with cyber security should be expected to engage with other cyber security silos. I have met some really impressive area studies thinkers with unique insight into states like Iran or North Korea. Their knowledge can go a long way towards helping us understand the broader context in which cyber operations take place. There are also undoubtedly bright minds in psychology, business management and linguistics who can all be part of the solution, even if these individuals lack the bandwidth or interest to engage with cyber security in serious depth. Crucially, there is a difference between thinkers who largely want to carve out a career in cyber security (where broader engagement is vital and should be expected) and those who instead specialise in separate issues (yet whose knowledge can enhance our understanding of cyber security).
For those focused on cyber security from an international relations perspective, however, the lack of appreciation for operational realities can really inhibit progress. A large part of this problem stems from two myths or misperceptions. The first is intentional ignorance: the idea that engagement with operational and technical realities is simply unnecessary. If international relations thinkers could develop theories about nuclear deterrence without understanding the intricacies of nuclear fission physics, the argument goes, then the same logic should apply to cyber security. This mentality is all too common in academia, but the argument rests on flawed assumptions.
The difference between nuclear and cyber security is that while the strategic implications of nuclear fission are relatively simple (causing things to go bang), the technical and operational realities of cyber security lead to much more nuanced outcomes. By dismissing them, academics soon resort to making trite remarks about the impossibility of attribution, or the way cyber attacks occur at light speed. The good news is that this misperception is actually easy to fix. Academics who spend even a short amount of time reading APT reports or following some of the threat intel crowd on Twitter will quickly find important nuances that inform their understanding of vital issues. Engaging in respectful and productive debate that demonstrates the importance of wider engagement can therefore lead to meaningful progress.
The second myth is that the technical and operational details are too esoteric: a sentiment that it is too difficult to learn and understand how all of this works without a computer science degree. This is simply not the case, and there are some straightforward measures people can take to build a broader awareness. This takes a bit of initiative, although it is an area where we as a community need to do a better job of providing mentorship and useful resources that help people bridge the gap (something I hope to do much more of going forward). Indeed, this roadblock is, more than anything, one of perception.
The incentive structures of academia do not always encourage cross-disciplinary engagement. Academics predominantly build career capital by publishing in their own discipline, and this means faulty cyber security claims are not always rigorously tested. Yet, while academic incentives may not always actively encourage engagement across silos, nor do they necessarily discourage it. Work that bridges the gap and elucidates the strategic implications of technical realities will provide an important contribution to the field. Academics can also build their reputation through impact metrics that specifically measure how research has affected the wider world. The structural incentives of academia are an important issue, and there are certainly areas that could be improved. Yet we should avoid adopting a defeatist attitude. By placing too much of the blame for the current lack of interdisciplinary engagement on academic structures, we risk overlooking our own agency in fixing the problem.
Furthermore, there are positive examples of such cross-silo engagement working well. Thomas Rid is one of the best-established academics working on strategic cyber security issues (amongst other topics) and provides an example of what can be achieved by engaging beyond self-referential academic debates. Likewise, institutions such as the Oxford Centre for Doctoral Training in Cyber Security (which I am part of) have developed an encouraging track record of building an interdisciplinary community and integrating an eclectic mix of backgrounds (ranging from mathematics and computer science to law and philosophy) into a single cohort. There is a desperate need for more interdisciplinary educational initiatives going forward.
The academic community clearly has some work to do, yet a significant part of the problem also lies with operational communities themselves. Here, there is plenty of arrogance and snideness that easily alienates outsiders. While charlatans should rightfully be called out, mocking non-technical people for being perhaps overenthusiastic about the cyber security possibilities of blockchain, for example, is counterproductive. Indeed, the culture of readily laughing at bad marketing and faulty arguments in such a superior manner may be one reason why imposter syndrome appears to be so prevalent in the industry. Ultimately, none of this scares people into engaging with operational realities; it simply discourages them from talking to other communities.
Academics will continue to engage with cyber security issues, and if their experience with operational crowds is consistently negative, silos will only become more insular. It is also ironic that many of the individuals so ready to humiliate those who miss operational and technical nuances often prove lacking in their own takes on policy and geopolitical issues. Nor do you have to look very far to find some rather questionable interpretations of state characteristics in APT reports. In this regard, the academic community is by no means a charity case; it brings unique perspectives and enhances the conceptual depth of a variety of cyber security topics. Operational communities may dismiss academia on the basis of its current state, but again, this comes down to taking the lead in improving the situation. By instead engaging in productive ways, the operational community has much to learn itself.
While there is still a lot of work to do, we have the agency to enhance mutual understanding and build what could become a truly symbiotic relationship between the two communities.