Scapegoats and systems: contrasting approaches to managing human error in organisations
Much can be learnt about an organisation by observing what management does when things go wrong. One reaction is to hunt for a scapegoat – someone who can be held responsible for the mess. The other is to take a systemic view that focuses on finding the root cause of the issue and figuring out what can be done to prevent it from recurring. In a highly cited paper published in 2000, James Reason compared and contrasted these two approaches to error management in organisations. This post is an extensive summary of the paper.
The author gets to the point in the very first paragraph:
The human error problem can be viewed in two ways: the person approach and the system approach. Each has its model of error causation and each model gives rise to quite different philosophies of error management. Understanding these differences has important practical implications for coping with the ever present risk of mishaps in clinical practice.
Reason’s paper was published in the British Medical Journal, hence its focus on the practice of medicine. His arguments and conclusions, however, have a much wider relevance, as evidenced by the diverse areas in which his paper has been cited.
The person approach – which, I think, is more accurately called the scapegoat approach – is based on the belief that errors can and should be traced back to an individual or a group, and that the party responsible should then be held to account for the error. This is the approach taken in organisations that are colloquially referred to as having a “blame culture.”
To an extent, looking around for a scapegoat is a natural emotional reaction to an error. The oft-unstated reason behind scapegoating, however, is to avoid management responsibility. As the author tells us:
People are viewed as free agents capable of choosing between safe and unsafe modes of behaviour. If something goes wrong, it seems obvious that an individual (or group of individuals) must have been responsible. Seeking as far as possible to uncouple a person’s unsafe acts from any institutional responsibility is clearly in the interests of managers. It is also legally more convenient…
However, the scapegoat approach has a couple of serious problems that hinder effective risk management.
Firstly, an organisation depends on its frontline staff to report any problems or lapses. Clearly, staff will report problems only if they feel it is safe to do so – something that is simply not possible in an organisation that takes a scapegoat approach. The author suggests that the Chernobyl disaster can be attributed to the lack of a “reporting culture” within the erstwhile Soviet Union.
Secondly, and perhaps more importantly, the focus on a scapegoat leaves the underlying cause of the error unaddressed. As the author puts it, “by focusing on the individual origins of error it [the scapegoat approach] isolates unsafe acts from their system context.” As a consequence, the scapegoat approach overlooks systemic features of errors – for example, the empirical fact that the same kinds of errors tend to recur within a given system.
The system approach accepts that human errors will happen. However, in contrast to the scapegoat approach, it views these errors as being triggered by factors that are built into the system. So, when something goes wrong, the system approach focuses on the procedures that were used rather than on the people who were executing them. This shift in focus makes a world of difference.
The system approach looks for generic reasons why errors or accidents occur. Organisations usually have a series of measures in place to prevent errors – e.g. alarms, procedures, checklists and trained staff. Each of these measures can be looked upon as a “defensive layer” against error. However, as the author notes, each defensive layer has holes which can let errors “pass through” (more on how the holes arise a bit later). A good way to visualise this is as a series of slices of Swiss cheese (see Figure 1).

Figure 1: The Swiss cheese model (from: http://patientsafetyed.duhs.duke.edu/module_e/swiss_cheese.html)
The important point is that the holes in a given slice are not at a fixed position; they keep opening, closing and even shifting around, depending on the state of the organisation. An accident occurs when the ephemeral holes in different layers temporarily line up to “let an error through”.
There are two reasons why holes arise in defensive layers:
- Active errors: These are unsafe acts committed by individuals. Active errors could be violations of set procedures or momentary lapses. The scapegoat approach focuses on identifying the active error and the person responsible for it. However, as the author points out, active errors are almost always caused by conditions built into the system, which brings us to…
- Latent conditions: These are flaws that are built into the system. The author uses the term resident pathogens to describe these – a nice metaphor that I have explored in a paper review I wrote some years ago. These “pathogens” are usually baked into the system by poor design decisions and flawed procedures on the one hand, and ill-thought-out management decisions on the other. Manifestations of the former include faulty alarms, unrealistic or inconsistent procedures or poorly designed equipment; manifestations of the latter include things such as unrealistic targets, overworked staff and the lack of funding for appropriate equipment.
The important thing to note is that latent conditions can lie dormant for a long period before they are noticed. Typically, a latent condition comes to light only when an error caused by it occurs…and only if the organisation does a root cause analysis of the error – something that is simply not done in an organisation that takes a scapegoat approach.
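To make this concrete, here is a minimal simulation sketch of the Swiss cheese model – my own illustration in Python, not something from Reason’s paper. On each trial, every layer has a hole open with some probability: a low one for ordinary layers (transient active errors) and a higher one for layers weakened by a latent condition, which – true to the account above – can weaken several layers at once. The layer counts and probabilities are made-up illustrative values.

```python
import random


def simulate_swiss_cheese(n_layers=4, p_active=0.1, p_latent=0.4,
                          latent_layers=(1, 2), trials=500_000, seed=42):
    """Estimate how often an error slips through every defensive layer.

    On each trial, each layer independently has a hole open with some
    probability: p_active for ordinary layers (transient holes caused
    by active errors) and p_latent for layers weakened by a latent
    condition (the 'resident pathogens' baked into the system). An
    accident occurs only when open holes line up across all layers.
    """
    rng = random.Random(seed)
    accidents = 0
    for _ in range(trials):
        if all(rng.random() < (p_latent if i in latent_layers else p_active)
               for i in range(n_layers)):
            accidents += 1
    return accidents / trials


if __name__ == "__main__":
    # Baseline: four layers, two of them weakened by a latent condition
    # (e.g. understaffing can punch holes in several defences at once).
    print(f"baseline accident rate:  {simulate_swiss_cheese():.6f}")
    # Remove the latent condition entirely.
    print(f"latent condition fixed:  {simulate_swiss_cheese(latent_layers=()):.6f}")
    # Alternative: keep the latent condition but add a fifth leaky layer.
    print(f"extra defensive layer:   {simulate_swiss_cheese(n_layers=5):.6f}")
```

With these numbers, fixing the latent condition cuts the accident rate by a factor of about sixteen, whereas adding a fifth, equally leaky layer cuts it by a factor of about ten – a toy demonstration of why addressing latent conditions pays off more than simply piling on defences.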
The author draws a nice analogy that clarifies the link between active errors and latent conditions:
…active failures are like mosquitoes. They can be swatted one by one, but they still keep coming. The best remedies are to create more effective defences and to drain the swamps in which they breed. The swamps, in this case, are the ever present latent conditions.
“Draining the swamp” is not a simple task. The author draws upon studies of high-performance organisations (combat units, nuclear power plants and air traffic control centres) to understand how they minimised active errors by reducing system flaws. He notes that these organisations:
- Accept that errors will occur despite standardised procedures, and train their staff to deal with and learn from them.
- Practice responses to known error scenarios and try to imagine new ones on a regular basis.
- Delegate responsibility and authority, especially in crisis situations.
- Do a root cause analysis of any error that occurs and address the underlying problem by changing the system if needed.
In contrast, an organisation that takes a scapegoat approach assumes that standardisation will eliminate errors, ignores the possibility of novel errors occurring, centralises control and, above all, focuses on finding scapegoats instead of fixing the system.
Acknowledgement:
Figure 1 was taken from the Patient Safety Education website of Duke University Hospital.
Further reading:
The Swiss Cheese model was first proposed in 1991. It has since been applied in many areas. Here are a couple of recent applications and extensions of the model to project management:
- Stephen Duffield and Jon Whitty use the Swiss Cheese model as a basis for their model of Systemic Lessons Learned and Knowledge Captured (SLLKC model) in projects.
- In this post, Paul Culmsee extends the SLLKC model to incorporate aspects relating to teams and collaboration.
Towards a critical practice of management – a book review
Introduction
Management, as it is practiced, is largely about “getting things done.” Consequently, management education and research tend to focus on improving the means by which specified ends are achieved. The ends themselves are not questioned as rigorously as they ought to be. The truth of this is reflected in the high-profile corporate scandals that have come to light over the last decade or so, not to mention the global financial crisis.
Today, more than ever, there is a need for a new kind of management practice, one in which managers critically reflect on the goals they pursue and the means by which they aim to achieve them. In their book, Making Sense of Management: A Critical Introduction, management academics Mats Alvesson and Hugh Willmott describe what such an approach to management entails. This post is a summary of the central ideas described in the book.
Critical theory and its relevance to management
The body of work that Alvesson and Willmott draw from is Critical Theory, a discipline founded on the belief that knowledge ought to rest on dialectical reasoning – i.e. reasoning through dialogue – rather than scientific rationality alone. The main reason for this is that science (as it is commonly practiced) is value-free and is therefore incapable of addressing problems that have a social or ethical dimension. This idea is not new; even scientists such as Einstein have commented on the limits of scientific reasoning.
Although Critical Theory has its roots in the Renaissance and Enlightenment, its modern avatar is largely due to a group of German social philosophers who were associated with the Frankfurt-based Institute of Social Research, which was established in the 1920s. Among other things, these philosophers argued that knowledge in the social sciences (such as management) can never be truly value-free or objective. Our knowledge of social matters is invariably coloured by our background, culture, education and sensibilities. This ought to be obvious, but it isn’t: economists continue to proclaim objective truths about the right way to deal with economic issues, and management gurus remain ready to show us the one true path to management excellence.
The present-day standard bearer of the Frankfurt School is the German social philosopher Juergen Habermas, who is best known for his theory of communicative rationality – the idea that open dialogue, free from any constraints, is the most rational way to decide on matters of importance. For a super-quick introduction to the basic ideas of communicative rationality and its relevance in organisational settings, see my post entitled, More than just talk: rational dialogue in project environments. For a more detailed (and dare I say, entertaining) introduction to communicative rationality with examples drawn from The Borg and much more, have a look at Chapter 7 of my book, The Heretic’s Guide to Best Practices, co-written with Paul Culmsee.
The demise of command and control?
Many professional managers see their jobs in purely technical terms, involving things such as administration, planning and monitoring. They tend to overlook the fact that these technical functions are carried out within a particular social and cultural context. More to the point, and this is crucial, managers work under constraints of power and domination: they are not free to do what they think is right but have to do whatever their bosses order them to do, and so, in turn, they behave in exactly the same way towards their subordinates.
As Alvesson and Willmott put it:
Managers are intermediaries between those who hire them and those whom they manage. Managers are employed to coordinate, motivate, appease and control the productive efforts of others. These ‘others’ do not necessarily share managerial agendas…
Despite the talk of autonomy and empowerment, modern-day management is still very much about control. However, modern-day employees are unlikely to accept a command and control approach to being managed, so organisations have taken recourse to subtler means of achieving the same result. For example, organisational culture initiatives aimed at getting employees to “internalise” the values of the organisation are attempts to “control sans command.”
The point is that, despite the softening of the rhetoric of management, its principal focus remains much the same as it was in the days of Taylor and Ford.
A critical look at the status quo
A good place to start with a critical view of management is in the area of decision-making. Certain decisions, particularly those made at executive levels, can have a long-term impact on an organisation and its employees. Business schools and decision theory texts tell us that decision-making is a rational process. Unfortunately, reality belies that claim: decisions in organisations are more often made on the basis of politics and ideology than on objective criteria. This being the case, it is important that decisions be subject to critical scrutiny. Indeed, it is possible that many of the crises of the last decade could have been avoided had the decisions that led to them been subjected to a critical review.
Many of the initiatives that are launched in organisation-land have their origins in executive-level decisions made on flimsy grounds, such as “best practice” recommendations from Big 4 consulting companies. Mid-level managers who are required to see these initiatives through to completion are then faced with the problem of justifying them to the rank and file. Change management in modern organisation-land is largely about justifying the unjustifiable or defending the indefensible.
The critique, however, goes beyond just the practice of management. For example, Alvesson and Willmott also draw attention to things such as the objectives of the organisation. They point out that short-sighted objectives such as “maximising shareholder value” are what led to the downfall of companies such as Enron. Moreover, they remind us of an issue that is becoming increasingly important in today’s world: that natural resources are not unlimited and should be exploited in a judicious, sustainable manner.
As interesting and important as these “big picture” issues are, in the remainder of this post I’ll focus attention on management practices that impact mid and lower level employees.
A critical look at management specialisations
Alvesson and Willmott analyse organisational functions such as Human Resource Management (HRM), Marketing and Information Systems (IS) from a critical perspective. It would take far too many pages to do justice to their discussion, so I’ll just present a brief summary of two areas: HR and IS.
The rhetoric of HRM in organisations stands in stark contradiction to its actions. Despite platitudinous sloganeering about empowerment and the like, the actions of most HR departments are aimed at getting people to act and behave in organisationally acceptable ways. Seen in a critical light, seemingly benign HR initiatives such as organisational culture events or self-management initiatives are exposed as subtle means of managerial control over employees (see this paper for an example of the former and this one for an example of the latter).
Since the practice of IS centres largely on technology, IS research and practice tend to focus on technology trends and “best practices.” As might be expected, the focus is on the fad of the month and thus turns stale rather quickly. As examples: the 1990s saw an explosion of papers and projects in business process re-engineering; the flavour of the decade in the 2000s was service-oriented architecture; more recently, we’ve seen a great deal of hot air about the cloud. Underlying much technology-related decision-making is the tacit assumption that choices pertaining to technology are value-free and can be decided on the basis of technical and financial criteria alone. The profession as a whole tends to take an overly scientific/rational approach to design and implementation, often ignoring issues such as power and politics. It can be argued that many failures of large-scale IS projects are due to the hyper-rational approach taken by many practitioners.
In a similar vein, most management specialisations can benefit from the insights that come from taking a critical perspective. Alvesson and Willmott discuss marketing, accounting and other functions. However, since my main interest is in solutions rather than listing the (rather well-known) problems, I’ll leave it here, directing the interested reader to the book for more.
Towards an enlightened practice of management
In the modern workplace it is common for employees to feel disconnected from their work, at least from time to time if not always. In a prior post, I discussed how this sense of alienation is a consequence of our work and personal lives being played out in two distinct spheres – the system and the lifeworld. In brief, the system refers to the professional and administrative sphere in which we work and/or interact with institutional authority, and the lifeworld is the everyday world that we share with others. Actions in the lifeworld are based on a shared understanding of the issue at hand, whereas those in the system are not.
From the critical analysis of management specialisations presented in the book, it is evident that the profession, being mired in a paradigm of prescriptive, top-down practices, serves to perpetuate the system by encroaching on the lifeworld values of employees. There are those who will say that this is exactly how it should be. However, as Alvesson and Willmott point out in their book, this kind of thinking is perverse because it is ultimately self-defeating:
The devaluation of lifeworld properties is perverse because …At the very least, the system depends upon human beings who are capable of communicating effectively and who are not manipulated and demoralized to the point of being incapable of cooperation and productivity.
Alvesson and Willmott use the term emancipation to describe any process whereby employees are freed from the shackles of system-oriented thinking, even if only partially (note: here I’m using the term system in the sense defined above – not to be confused with systems thinking, which is another beast altogether). Acknowledging that it is impossible to do this at the level of an entire organisation or even a department, they coin the term micro-emancipation to describe any process whereby sub-groups within organisations are empowered to think through issues and devise appropriate actions by themselves, free (to the extent possible) from management constraints or directives.
Although this might sound much too idealistic to some readers, be assured that it is eminently possible to implement micro-emancipatory practices in real-world organisations. See this paper for one possible framework that can be used within a multi-organisation project, along with a detailed case study that shows how the framework can be applied in a complex project environment.
Alvesson and Willmott warn that emancipatory practices are not without costs, both for employers and employees. For example, employees who have gained autonomy may end up being less productive, which will in turn affect their job security. In my view, this issue can be addressed through an incrementalist approach wherein both employers and employees work together to come up with micro-emancipatory projects at the grassroots level, as in the case study described in the paper mentioned in the previous paragraph.
…and so to conclude
Despite the rhetoric of autonomy and empowerment, much of present-day management is stuck in a Taylorist/Fordist paradigm. In modern organisations command and control may not be overt, but it often sneaks in through the back door in not-so-obvious ways. For example, employees almost always know that certain things are simply “out of bounds” for discussion, and that the consequences of breaching those unstated boundaries can be severe.
In its purest avatar, a critical approach to management seeks to remove those boundaries altogether. This is unrealistic because nothing would ever get done in an organisation in which everything is open for discussion; as in all social systems, compromise is necessary. The concept of micro-emancipation offers just such a compromise. To be sure, one has to go beyond the rhetoric of empowerment to actually creating an environment that enables people to speak their minds and debate issues openly. Though it is impossible to do this at the level of an entire organisation, it is definitely possible to achieve it (albeit approximately) in small workgroups.
To conclude: the book is worth a read, not just by management researchers but also by practicing managers. Unfortunately, the overly academic style may be a turn-off for practitioners, the very people who need to read it the most.
Overcoming the corporate immune system – some lessons from the dengue virus
Introduction
The term corporate immune system was coined by Julian Birkinshaw to describe the tendency of corporate head offices to resist entrepreneurial initiatives by their subsidiaries. These days, the term is also used to refer to the tendency of organisations to reject or suppress novel ideas or processes that employees come up with. This post is about the latter usage of the phrase.
The metaphor of an immune system is an apt one: apart from being a good description of what happens, it also suggests ways in which one can overcome or bypass managerial resistance to initiatives that are seen as threats. In this post I build on Stefan Lindegaard’s excellent article to discuss how the Dengue virus can teach us a trick or two about getting around the corporate immune system.
The mechanics of Dengue infection
Dengue fever, also known as breakbone fever, is endemic to many tropical countries. Its symptoms are fever, severe headaches, muscle and joint pain and a characteristic skin rash. Dengue is caused by a virus that is transmitted by the Aedes aegypti mosquito, which can be identified by the white bands on its legs. Although it originated in Africa, the species is now found in most tropical and sub-tropical countries throughout the world.
There are four closely related strains (or serotypes) of the Dengue virus – imaginatively named Dengue 1 through Dengue 4. This has interesting consequences, as we shall see shortly. First, let’s have a quick look at what goes on in the human body after a bite from a carrier mosquito. My discussion is based on this article from the Scitable website.
Once a person is bitten by a carrier mosquito, the virus starts to infect skin cells and specialised immune cells (called Langerhans cells) near the site of the bite. The infected Langerhans cells travel via the bloodstream to the lymph nodes, which are responsible for producing the white blood cells (WBCs) that combat infections.
The WBCs are the body’s first line of defence against infection. The problem is that WBCs generally do not succeed in destroying the Dengue virus; worse, they actually end up getting infected by it. The infected white blood cells then help spread the virus to other organs in the body.
However, all is not lost because the body has another line of defence – the adaptive immune system – which produces antibodies that target specific intruders. Once the infection spreads, the adaptive immune system kicks in, producing antibodies that recognise and neutralise the virus. The fever an infected person experiences is a manifestation of the battle between the antibodies and the virus. In a healthy person, the immune system eventually wins and the person recovers.
Now here’s the interesting bit: a person who has been infected by the virus gains long-term immunity, but only against the particular Dengue serotype that he or she was infected by. If the person is bitten by a mosquito carrying another serotype, the antibodies for the old serotype actually assist the new strain in spreading within the body. Essentially, this happens because the antibodies for the old strain see the new strain as the old one and attempt to bind with it. However, because the virus is different, the antibody cannot bind with it completely. It thus forms an antibody-virus complex within which the virus is still capable of replicating.
These circulating antibody-virus complexes then infect other white blood cells, which in turn carry the virus to other parts of the body. This results in a higher volume of virus in the bloodstream than would otherwise have occurred, and hence a more severe infection. This is well documented: subsequent Dengue infections often lead to considerably more severe symptoms than the first.
The above description is sufficient for the present discussion, but you may want to see this article to learn more about this fascinating virus.
Overcoming the corporate immune system
The processes of primary and secondary Dengue infection hold some lessons for those who want to gain executive support for proposals that might be just a tad too radical for their workplaces. A direct approach, wherein the idea is pitched straight to executives, is unlikely to work for at least a couple of reasons:
- The generic corporate immune system (akin to the white blood cells in the human body) will attempt to take the idea down. This is typified by the reflexive “It will never work here (so let’s not try it)” response.
- Let’s assume that you are at your persuasive best and manage to get past that generic first line of corporate defence. You still cannot rest easy because, in time, managerial ingenuity will come up with objections specific to your idea (these are akin to strain-specific antibodies).
However, all is not lost: we can take inspiration from the secondary infection process described in the previous section. The second serotype is able to do a more thorough job of infecting its host because the antibodies actually help transport the virus through the body. This happens because the antibodies do not fully recognise the virus and thus bind with it incompletely.
So the trick to getting your idea past the corporate immune system is to cast it in terms that are familiar to managers and to get them to have a stake in it. Here’s one way to do this:
- Make a connection between your idea and an already well-established element or aspect of your organisation. Be sure to stress this connection in your pitch (see point 2). This way, the idea is seen as a logical continuation of what already exists – i.e. it is seen as old rather than new, much as the old serotype’s antibodies see the new strain as the old one.
- Present your idea to a manager who may be in a position to help you, seeking her advice on it.
- Take the advice offered seriously – i.e. modify the idea in a way that incorporates the advice.
- Re-present the idea to the manager, thanking her for her advice and emphasising how it makes a difference.
- If she is receptive, ask her if she would be willing to socialise the idea amongst her peers. If you have genuinely taken her advice, chances are she’ll be willing to do this. After all, the idea is now hers too.
The above are generic steps that can be tailored to specific situations. For example, the same principles apply when writing a business case for a new system or any similar proposal – emphasise continuity and get people to be a part of the idea by offering them a stake in it. The bottom line is that the corporate immune response can be “tricked” into accepting novel ideas, much as the human immune system is fooled by the Dengue virus.
Conclusion
The metaphor of a corporate immune system not only provides an evocative description of how organisations kill novel ideas, but also suggests how such organisational resistance can be overcome. In this post I have described one such strategy, based on the fiendishly clever Dengue virus.


