Interview with Dr Simon Bennett of the Civil Safety and Security Unit
Aviation safety and risk management demand more than theory; they require lived experience and sharp observation. Dr Simon Bennett, Director of the Civil Safety and Security Unit at the University of Leicester, draws on decades of fieldwork to reveal how teamwork, culture, and vigilance shape safer skies and stronger security systems.
What first inspired your interest in aviation safety and the broader study of risk, and how have your experiences as both a researcher and educator shaped your perspective?
Regarding my interest in risk management, in 1966 I was the same age as those children killed in the Aberfan, Wales, tip disaster. I think, subliminally, that disaster, which saw 116 children killed when a coal tip buried their school, affected my outlook on life and spurred an interest in social injustice and, frankly, the arrogance, self-interest and ineptitude of the rich and powerful. I lived the first two decades of my life in a Welsh coal-mining village, very much like Aberfan.
As to my interest in aviation safety, that started when one of my students suggested I visit her airline with a view to observing what pilots do on the flight-deck. I have spent the last 25 years making observations from the flight-deck jump seat. I have worked with numerous passenger airlines, for example, GoFly and easyJet, and with freight airlines such as DHL Air. I have also worked with the Royal Air Force and United Kingdom National Police Air Service (NPAS). My aviation safety books include: A Sociology of Commercial Flight Crew (Ashgate, 2006), How Pilots Live: An Examination of the Lifestyle of Commercial Pilots (Peter Lang International Academic Publishers, 2014), Aviation Safety and Security: The Importance of Teamwork, Leadership, Creative Thinking and Active Learning (Libri Publishing, 2015) and Safety in Aviation and Astronautics: A socio-technical approach (Routledge, 2022).
As Director of the Civil Safety and Security Unit and a university teacher, how do you balance academic theory with real-world practice when preparing students to address complex safety and security challenges?
In my opinion, too many academics spend their careers closeted in academia, talking to, and writing for, each other rather than the world at large. Too many academics believe the world owes them a living. When I entered academia after a career in business, I promised myself I would produce work that was helpful to working people. I decided early in my career that the best way I could serve my students and society was to spend as much time as possible in the field observing those who manage risk daily, for example, pilots, soldiers and police officers. I feel more at home on the flight-deck than in the lecture theatre. I see nothing wrong in that.
What key lessons about human factors and team-based decision-making stand out most from those experiences?
The most important lesson I have learned from my work on the flight deck is that teamwork, informed by mindfulness, works. It delivers safety and efficiency. Further, it creates a welcoming and rewarding work environment – something many working people will never know, unfortunately. Since the 1970s, the aviation industry has invested heavily in teamwork training, known in aviation circles as crew resource management (CRM). Pilots are taught to respect colleagues and to canvass opinion. They are taught that every opinion matters and that, before making a safety-critical decision, they must solicit as many views as possible in the time available. This can see a captain who has been flying for twenty years canvassing the opinion of her first officer who has been flying for twelve months. When it comes to solving a difficult problem, pilots are taught that what matters is intelligence and insight rather than time served or rank attained. The promulgation of CRM across the industry has significantly reduced the number of human-factors-related errors made by captains and first officers.
In both civil and military aviation, what systemic or cultural weaknesses do you believe still pose the greatest threats to safety today?Â
In civil aviation the greatest threat to safety is posed by profit-seeking behaviour on the part of airline managements. Aviation is a cut-throat business organised along Darwinian lines. The fit survive. The unfit go bankrupt. Cost reduction is aviation’s religion. Sometimes the relentless push to reduce costs impacts safety. Consider, for example, the outsourcing of core activities such as maintenance. Airlines outsource to save money. Unfortunately, because outsourcing inhibits communication and impedes oversight, it can create latent errors or resident pathogens – accidents waiting to happen – in an airline’s operation. It is not uncommon for airlines registered in Europe to outsource their aircraft maintenance to companies based in Pacific Rim countries where labour costs are lower. Physical distance, language differences and cultural differences inhibit communication and impede oversight. Quality control suffers.
How can approaches like Crew Resource Management (CRM) and fatigue-risk management be applied beyond aviation – for example, in policing, counterinsurgency, or anti-terror operations?
Teamworking protocols such as CRM have been applied in several enterprises, including healthcare, where they have had a small impact on the number of deaths and injuries attributable to medical error. Crew resource management would have a greater impact in healthcare if medical staff were less deferential. Healthcare’s culture of deference – where junior doctors and nurses habitually defer to senior clinicians – makes further significant progress on reducing the number of deaths or injuries attributable to medical error unlikely. Aviation’s culture of deference, which held sway during the early post-war years, has been all but eliminated through a concerted CRM training programme across the industry. Teamwork training, endorsed by the International Civil Aviation Organization (ICAO), is today de rigueur.
Many, but not all, militaries reference the principles that inform CRM in their training programmes, where new recruits are encouraged and empowered to voice their ideas and concerns. It is interesting to note that the Ukrainian army, despite finding itself under pressure because of the war with Russia, has embraced CRM training. As the former Commander-in-Chief of the Armed Forces of Ukraine, Valerii Zaluzhnyi, explained in a 2023 interview with Ukrainska Pravda: “I have tried to … change the culture in the Armed Forces of Ukraine …. To listen to the opinion of my subordinate, to listen to my subordinate as a human being and to build normal relations between people …. [T]his is the fundamental difference between us and the Soviet army … [T]he fact that we are no longer like the armed forces of the Russian Federation is a huge advantage for us …. [T]his new culture … has united everyone around us. Generals, junior commanders and, most importantly, soldiers …. [T]he Soviet army lived in the Armed Forces of Ukraine for a long time and there are still echoes of it.” I review the Ukrainian army’s investment in CRM in my new book The Russia-Ukraine War – Security Lessons, published by Peter Lang International Academic Publishers.
Your research also covers military tactics, psychological warfare, and PSYOPS. What insights from these fields should civil safety and security leaders be paying more attention to?
Because the consequences of error on the battlefield are so devastating, with lives and, possibly, battles lost, the wealthier and more progressive militaries invest heavily in training their commanders and troops in the latest risk-management techniques such as CRM. It follows that civilian enterprises such as healthcare and policing would benefit from studying the training regimes of the more innovative militaries.
During the post-war period, the aircraft carrier arm of the United States Navy (USN) invested heavily in developing new safety management tools such as high-reliability organisation (HRO) theory, which advocates, amongst other things, that junior ratings should be authorised and empowered to interrogate the decisions and actions of their superiors and, if necessary, intervene should they conclude that those decisions and actions would lead to a near-miss, incident or accident. The resulting mindful organising of the aircraft carrier flight-deck – a highly dangerous working environment – has seen a steady reduction in near-misses, incidents and accidents on board the USN’s aircraft carriers. Lives have been saved and assets protected. Mission readiness has been improved. Civilian risk-managers can learn much from studying the USN’s application of HRO theory. American carrier operations are a safety exemplar.
As technologies like AI, autonomous systems, and advanced weaponry reshape aviation and security, what emerging risks or unintended consequences do you foresee?
First, I would like to challenge the term Artificial Intelligence. It is a misnomer in my opinion. Intelligence is a uniquely human attribute – a property of sentient beings and, in part, a product of emotional labour. Computers are machines. Consequently, computers’ capacity to mine large volumes of data at high speed in order to answer user queries and, where necessary, to modify algorithms or create new ones should be called machine or computer logic rather than artificial intelligence.
Computers’ Achilles heel is that they are only as good as their architecture and algorithms allow. Any weakness or flaw in either could send a drone, cruise missile or ballistic missile careening out of control. It is not beyond the realms of possibility that, in a badly designed system with little or no human oversight, a ballistic missile, perhaps carrying multiple nuclear warheads, could be launched by the computers that manage it. Testing rarely reveals every possible failure mode of complex microprocessor-based control or guidance systems. Consequently, most complex microprocessor-based systems harbour latent errors or resident pathogens – near-misses, incidents or accidents waiting to happen – deep inside the millions of lines of machine code and/or the physical architectures that allow them to function. Manufacturers and operators insist the public has nothing to worry about and that microprocessor-based products are 100% reliable and safe. But then, manufacturers and operators would say that, wouldn’t they? In their best-selling 1962 novel Fail-Safe, political scientists Eugene Burdick and Harvey Wheeler questioned the wisdom of using high-speed, tightly coupled and increasingly opaque computerised systems to manage nuclear arsenals. Eugene Burdick was familiar with the military mind-set and associated culture, having attained the rank of Lieutenant Commander in the United States Navy during the Second World War. In director Sidney Lumet’s 1964 film adaptation of Burdick and Wheeler’s novel, also called Fail Safe, one of the protagonists, General Black (played by Daniel O’Herlihy), observes: “We’re going too fast. Things are getting out of hand …. We are setting up a war machine that acts faster than the ability of men to control it. We are putting men into situations that are getting too tough for men to handle …. We have got to slow down”.
Looking ahead, what steps should governments, industries, and universities take to strengthen aviation safety, develop resilient security strategies, and prepare the next generation of safety professionals?
It is important that governments and managements listen to those who actually do the heavy lifting – the people who work at the sharp end of an enterprise making goods, providing services or defending a nation. It is they, and they alone, who understand what works and what does not. Managers like to think they know how an enterprise is faring. Often, they do not. Why? Because they are too remote. Too cut off. The Japanese economy is organised around worker participation, which helps explain Japan’s post-war economic miracle that has seen a country devastated by conventional war and two nuclear strikes rebuild and re-invent itself. The University of Michigan Law School’s Bill Rutchow observes: “Foreign corporations, particularly Japanese, investing directly in the United States have imported a style of labour-management relations which emphasises worker participation in the management of the organisation. American corporations faced with competition from more efficient foreign enterprises have also adopted worker-participation programmes to increase efficiency and productivity.” While this all sounds very promising, it should be noted that it takes a good deal of humility for middle-class, university-educated and well-paid managers to admit they don’t know everything. Some will possess the necessary humility. Some won’t. The former will prove an asset. The latter a liability.