Cyber warfare is a defining capability in the same way the rifled barrel or mechanisation changed the nature of warfare. Nations with cutting-edge cyber capabilities hold a significant competitive advantage. Interception of and interference with military communications pose a significant threat to the UK’s economic interests and security. They affect political and military communications systems but also, in today’s technology-driven world, scientific research, all forms of communication including social media, and the many government and commercial organisations in the defence supply chain.
Our big challenge is keeping computers and data networks safe from malign actors. Cyber security is big business. Cybersecurity Ventures (a Californian research agency) assessed this market was worth more than $120 billion in 2017 and predicted global spending on cyber security products and services would exceed $1 trillion cumulatively between 2017 and 2021. The traditional emphasis has been on developing technological solutions to protect user end-points and data networks, but we should assume those who seek to harm us by exploiting military communications will find ways to defeat even the best technical solutions.
A 2015 Harvard Business Review (HBR) article quoted a US Department of Defense source with an alarming statistic: the DoD experiences 41 million scans, probes and attacks each month. However, ‘secure’ architecture and state-of-the-art technology are only part of the answer. My background in high-threat environments suggests many, if not most, ‘failures’ have root causes in the human beings designing, building and operating systems. We therefore need to train and rehearse operators to avoid mistakes and to detect and correct issues before they morph into mission-critical failure.
Not all cyber security breaches are well-publicised, but it was revealed that Islamic State briefly took control of the US Central Command’s Twitter feed in 2015 because the account had not been upgraded to two-factor authentication. Technology, like body armour, may create a false sense of security when what’s really required is a high-performance culture which consistently analyses and minimises risk. We identify such cultures in many high-reliability organisations needing to avoid errors and minimise their consequences, including airlines, air traffic control systems, Formula 1 motor racing teams and bomb disposal units. These are technical operations conducted in potentially dangerous and complex environments where systems, sub-systems, human operators and the environment interact to create dynamic risk that must be addressed before it turns into disastrous and potentially fatal problems.
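The missing control behind the CENTCOM incident is concrete enough to sketch. Below is a minimal illustration of the standard time-based one-time-password scheme (RFC 6238 TOTP) that underpins most two-factor authentication, written with Python’s standard library only; it is a teaching sketch, not any particular platform’s implementation, and the `verify` helper and its parameters are illustrative choices.

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, timestep: int = 30, digits: int = 6, now=None) -> str:
    """Generate an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    # The moving factor is the number of whole timesteps since the Unix epoch.
    counter = int((time.time() if now is None else now) // timestep)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): the low nibble of the last byte picks an offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret: bytes, submitted: str, window: int = 1) -> bool:
    """Check a submitted code, tolerating +/- `window` timesteps of clock drift."""
    now = time.time()
    return any(
        hmac.compare_digest(totp(secret, now=now + step * 30), submitted)
        for step in range(-window, window + 1)
    )
```

Even a second factor this simple defeats the pure credential theft that compromised the Twitter account, which is precisely why leaving it switched off was a procedural failure rather than a technological one.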
High-reliability organisations are ‘situationally aware’: they have a well-developed awareness of the environment and of their own vulnerabilities. One such organisation, the US Submarine Service, has identified six interconnected principles that help it reduce and contain the impact of human error, many of which will be familiar to anyone with experience in high-reliability organisations.
The same 2015 HBR article found that military cyber security breaches caused by human error invariably involved a breach of one or more of these six principles. Military commanders should simply ask what each principle means in their own organisations.
Military commanders must ensure accountability is widely shared, making everyone responsible for ‘safety’ and stewardship of military data networks. They should ensure high-reliability behaviours are as embedded in day-to-day routine as keeping personal weapons clean and operational. This means everyone must know and comply with basic rules and standards, irrespective of rank.
When failures occur – and they will – military organisations must treat unintentional, occasional errors as opportunities to learn and to correct the processes that allowed them to occur. Commanders must nurture a culture in which people speak up when they make mistakes or notice others doing so, while remaining publicly intolerant of those who wilfully ignore established standards and procedures. A Formula 1 client of ours adopted a version of the ‘cockpit resource management’ model to help engineers and manufacturing teams listen to their intuition, identify early causes of defects and take corrective action. They reward contributions to the safety framework and have hard evidence of improved reliability and reductions in safety-critical failure.
Communications security depends on leadership, technology and high-reliability people. Technology alone cannot defend military communications networks; reducing human error is at least as important. Building and maintaining a culture of high reliability will help, but it is labour-intensive for leaders at all levels. The return on investment in building high-reliability organisations may be difficult to measure, but it is ultimately worthwhile, because military communications security – and thus valuable lives – depends on it.