Dysfunctional IT attitudes: processes are more important than people
The service desk phone rang one morning. The guys were busy attending to other jobs, so the manager picked up the call. “Morning, IT service desk, Jake speaking. How can I help you?”
“I had asked for Consolidate to be installed on my new computer, but have just noticed that it wasn’t.” The lady at the other end of the line sounded irritated. The software should have been installed on her computer – it was at the top of the list she had provided to the service desk when she’d put in the request for her new computer.
“Have you logged a service request?” enquired Jake.
“Yes,” she said, “but this is urgent. I have to send my sales figures for the month to head office this morning, and I can’t do it without Consolidate. Could you please send someone up right away?”
There was a short pause at Jake’s end. “I’m looking at the SLA right now, and Consolidate isn’t listed as a business critical application. There’s no way we can do this right now.”
“Look, it’s critical as far as I’m concerned. It’s got to be done right away or head office won’t get their sales figures. So, when can I expect a response?” Her annoyance levels were starting to rise.
“Not before tomorrow, or maybe even the day after, depending on how soon we clear the other pending jobs.”
“I think I’ve made it clear this is important. Can’t you do it sooner?”
“No.” Jake clearly thought that no further explanation was necessary. Can’t have folks jumping the queue; service desk processes were put in place for a reason.
She took a more conciliatory tone, “Please understand,” she said, “I wouldn’t make an issue out of it if it weren’t important… the sales figures must be done by this afternoon. I just need the application installed; it shouldn’t take more than five minutes.”
“Sorry, you’ll just have to wait.” He didn’t sound sorry at all.
She was getting really ticked off now. “It was a help desk mess-up in the first place. You should take responsibility and fix it now.”
“Perhaps you didn’t hear what I said; someone will come by tomorrow or the day after. That’s the best we can do given that Consolidate is not a business critical application. You’ll just have to wait your turn.” There was no response from her side, so he added, “We have processes in place. We can’t bypass them for just any request.”
Jake’s reference to processes only annoyed her further. “Obviously your processes – whatever they may be – don’t work. The application should have been installed when I got my computer.”
“I’m sorry about that, but I can’t make any exceptions to the way we deal with service requests.” He sounded even less sorry now.
She seethed. “Thanks… you’ve been so very helpful.” Her tone made it clear that she thought Jake was being singularly unhelpful. She hung up, not waiting for a response.
—
Jake had a point: the proper functioning of a service desk depends on processes. Bypassing these can lead to problems – not the least being that everyone would come to expect an instant response. Service desk processes ensure efficiency and transparency. Everyone knows what to expect when they lodge a request, with expected service levels documented in excruciating detail in service level agreements. Yes, all this is true, and can’t be argued. Even so, I can’t help thinking that the lady deserved better. Jake could have explained his position in a more acceptable way or, damn it, even got off his rear and fixed the issue himself in five minutes flat. He would have bypassed his beloved processes, but gained much goodwill in doing so.
Over the years, processes have become entrenched in corporate IT, as witnessed by the plethora of best practices such as ITIL and CMMI. Implementation of processes based on these frameworks and methodologies helps standardise the way corporate IT carries out its functions. This, in most cases, is a good thing. Yet, processes aren’t the be-all and end-all of IT. At the receiving end of IT services are… yes, real people doing real work that keeps businesses ticking. Conflicts between IT and the business occur when IT folks – like Jake in the true incident described above – forget that people are more important than processes. This holds not just for operational IT (like the service desk), but also for development work (i.e. projects), as I’ve mentioned in an earlier post. Trouble is, processes trump people more often than not. When that happens, things aren’t working the way they should – processes are intended to help people, not to hinder them. This is something folks who work in corporate IT would do well to keep in mind; especially these days, when business leaders are being seduced by the call of outsourcers and the IT-as-utility crowd.
All too often, IT management thinks of processes as a panacea for all IT ills. The way I look at it is a little different: processes are all well and good, and even necessary; but the people who are served by IT must come first. If that means making the occasional exception to a mandated process, then so be it.
Why I didn’t do some of the things I had to do…
Why do people postpone important tasks? Research by Sean McCrea and his colleagues may provide a partial answer. They found that people tend to procrastinate when asked to perform tasks that are defined in abstract terms. What this means is best explained through one of their experiments: half of a group of students were asked to describe how they would carry out a mundane task such as opening a bank account, and the other half were asked to describe reasons why one might do that task – i.e. why one might want to open a bank account. The first task is straightforward, and needs little thought prior to execution. The second one is more abstract; some deliberation is required before doing it. Even though all participants were offered a small (but interesting enough) sum of money if they completed the task within three weeks, it was found that most of those who were given the concrete task completed it on time whereas more than half of those assigned the abstract task failed to complete it. The researchers use the concept of psychological distance to describe this behaviour. Psychological distance in this context is a measure of the closeness (or remoteness) a person feels to a task, abstract tasks being more “distant” in this sense than concrete ones.
Reading about this reminded me of an incident that occurred many years ago, just after I’d made a career switch from academic research to business consulting. One of the partners in the firm I was working for had asked me to write a project proposal for a new client. He assumed I knew what was needed, and offered no guidance. I had a half-hearted try at it, but couldn’t make much headway. Like the stereotypical student, I then put it off for several days. The day before the deadline, fearing the consequences of inaction, I got down to it. I spoke to a few colleagues to make the task clearer, spent some time thinking it through and then, finally, wrote (and rewrote) the proposal well into the night.
Seen in the light of Dr. McCrea’s research, my procrastination was simply a normal human reaction to an abstract task. Once I was able to define the task better – with the help of my colleagues and some thought – my reasons for procrastination vanished, and with them my mental block.
I see this operate in my current job too. I work with a small group of developers who tackle a wide range of projects, from enterprisey stuff (such as the implementation of CRM systems) to the development of niche applications used by a handful of people. The small size of our group means that everyone has to do a bit of everything – design, coding, testing, maintenance, support and (unfortunately)… documentation. Now, in keeping with the stereotypical developer, most of the mob detest doing documentation. “I’d rather do maintenance coding,” said one. When asked why, he replied that it took him a lot more effort to write documentation than it did to do design or coding work. Of course, this is not to say that cutting code is easy, but that developers (or the ones I work with, at any rate) find it less remote psychologically – and hence easier – than writing. So, when required to do documentation, they typically put it off as much as possible.
The relationship between task abstraction and procrastination indicates how managers can help reduce the tendency to procrastinate. The basic idea is to reduce task abstraction, and hence reduce the psychological remoteness an assignee feels in relation to a task. For example, when asking a coder to write documentation, it might help to provide a template with headings and sub-headings, or make suggestions on what should and should not be included in the documentation. Anything that makes the task less abstract will help counter procrastination.
Tasks can be made more concrete in a number of ways. Some suggestions:
- Outline the steps required to perform the task.
- Provide more detail about the task.
- Narrow the task down to specifics.
- Provide examples or templates of how the task might be done.
Of course, not all procrastination can be attributed to task abstraction. Folks put off tasks for all kinds of reasons – and sometimes even for no reason at all. However, speaking from personal experience, Dr. McCrea’s work does ring true: I didn’t do some of the things I had to do simply because they weren’t clear enough to me – like that project plan I was supposed to have started on a week ago. But advice is easier given than taken. With only a gentle pang of guilt, I put it off until tomorrow.
Anchored and over-optimistic: why quick and dirty estimates are almost always incorrect
Some time ago, a sales manager barged into my office. “I’m sorry for the short notice,” she said, “but you’ll need to make some modifications to the consolidated sales report by tomorrow evening.”
I could see she was stressed and I wanted to help, but there was an obvious question that needed to be asked. “What do you need done? I’ll have to get some details before I can tell you if it can be done within the time,” I replied.
She pulled up a chair and proceeded to explain what was needed. Within a minute or two I knew there was no way I could get it finished by the next day. I told her so.
“Oh no…this is really important. How long will it take?”
I thought about it for a minute or so. “OK, how about I try to get it to you by the day after?”
“Tomorrow would be better, but I can wait till the day after.” She didn’t look very happy about it though. “Thanks,” she said and rushed away, not giving me a chance to reconsider my off-the-cuff estimate.
—
After she left, I had a closer look at what needed to be done. Soon I realised it would take me at least twice as long if I wanted to do it right. As it was, I’d have to work late to get it done in the agreed time, and might even have to cut a corner or two (or three) in the process.
So why was I so wide of the mark?
I had been railroaded into giving the manager an unrealistic estimate without even realising it. When the manager quoted her timeline, my subconscious latched on to it as an initial value for my estimate. Although I revised the initial estimate upwards, I was “pressured” – albeit unknowingly – into quoting an estimate that was biased towards the timeline she’d mentioned. I was a victim of what psychologists call anchoring bias – a human tendency to base judgements on a single piece of information or data, ignoring all other relevant factors. In arriving at my estimate, I had focused on one piece of data (her timeline) to the exclusion of all other potentially significant information (the complexity of the task, other things on my plate etc.).
Anchoring bias was first described by Amos Tversky and Daniel Kahneman in their pioneering paper, Judgment under Uncertainty: Heuristics and Biases. Tversky and Kahneman found that people often make quick judgements based on initial (or anchor) values that are suggested to them. As the incident above illustrates, the anchor value (the manager’s timeline) may have nothing to do with the point in question (how long it would actually take me to do the work). To be sure, folks generally adjust the anchor values based on other information. These adjustments, however, are generally inadequate. The final estimates arrived at are incorrect because they remain biased towards the initial value. As Tversky and Kahneman state in their paper:
In many situations, people make estimates by starting from an initial value that is adjusted to yield the final answer. The initial value, or starting point, may be suggested by the formulation of the problem, or it may be the result of a partial computation. In either case, adjustments are typically insufficient. That is, different starting points yield different estimates, which are biased toward the initial values. We call this phenomenon anchoring.
Although the above quote may sound somewhat academic, be assured that anchoring is very real. It affects even the day-to-day decisions that people make. For example, in this paper Neil Stewart presents evidence that credit card holders repay their debt more slowly when their statements suggest a minimum payment. In other words, the minimum payment works as an anchor, causing card holders to pay a smaller amount than they would have been prepared to (in the absence of an anchor).
Anchoring, however, is only part of the story. Things get much worse for complex tasks because another bias comes into play. Tversky and Kahneman found that subjects tended to be over-optimistic when asked to make predictions regarding complex matters. Again, quoting from their paper:
Biases in the evaluation of compound events are particularly significant in the context of planning. The successful completion of an undertaking, such as the development of a new product, typically has a conjunctive character: for the undertaking to succeed, each of a series of events must occur. Even when each of these events is very likely, the overall probability of success can be quite low if the number of events is large. The general tendency to overestimate the probability of conjunctive events leads to unwarranted optimism in the evaluation of the likelihood that a plan will succeed or that a project will be completed on time.
Such over-optimism in the face of complex tasks is sometimes referred to as the planning fallacy.1
Of course, as discussed by Kahneman and Frederick in this paper, biases such as anchoring and the planning fallacy can be avoided by a careful, reflective approach to estimation – as opposed to a “quick and dirty” or intuitive one. Basically, a reflective approach seeks to eliminate bias by reducing the effect of individual judgements. This is why project management texts advise us (among other things) to:
- Base estimates on historical data for similar tasks. This is the basis of reference class forecasting which I have written about in an earlier post.
- Draft independent experts to do the estimation.
- Use multipoint estimates (best and worst case scenarios).
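To make the multipoint idea concrete, here is a minimal sketch of a three-point (PERT-style) estimate, which weights the most likely case against the best and worst cases rather than anchoring on a single number. The figures below are made up for illustration:

```python
# Three-point (PERT) estimate: combine best, most likely and worst case
# figures instead of anchoring on one number. The task and the numbers
# below are hypothetical, purely for illustration.

def pert_estimate(best, likely, worst):
    """Expected duration using the classic PERT weighting."""
    return (best + 4 * likely + worst) / 6

def pert_spread(best, worst):
    """Rough standard deviation of the estimate: (worst - best) / 6."""
    return (worst - best) / 6

# Hypothetical sales-report modification, estimated in hours
best, likely, worst = 4, 8, 20

expected = pert_estimate(best, likely, worst)
spread = pert_spread(best, worst)
print(f"Expected: {expected:.1f}h, give or take {spread:.1f}h")
# prints: Expected: 9.3h, give or take 2.7h
```

Note how the long tail on the worst case pulls the expected value above the “most likely” figure – exactly the correction the planning fallacy tends to suppress.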
In big-bang approaches to project management, one has to make a conscious effort to eliminate bias because there are fewer chances to get it right. On the other hand, iterative/incremental methodologies have bias elimination built in: one starts with initial estimates, which include inaccuracies due to bias, and subsequently refines them as one progresses. The estimates get better as one goes along because every refinement is based on an improved knowledge of the task.
Anchoring and the planning fallacy are examples of cognitive biases – patterns of deviation in judgement that humans display in a variety of situations. Since the pioneering work of Tversky and Kahneman, these biases have been widely studied by psychologists. It is important to note that these biases come into play whenever quick and dirty judgements are involved. They occur even when subjects are motivated to make accurate judgements. As Tversky and Kahneman state towards the end of their paper:
These biases are not attributable to motivational effects such as wishful thinking or the distortion of judgments by payoffs and penalties. Indeed, several of the severe errors of judgment reported earlier (in the paper) occurred despite the fact that subjects were encouraged to be accurate and were rewarded for the correct answers.
The only way to avoid cognitive biases in estimating is to proceed with care and consideration. Yes, that’s a time consuming, effort-laden process, but that’s the price one pays for doing it right. To paraphrase Euclid, there is no royal road to estimation.
1 The planning fallacy is related to optimism bias which I have discussed in my post on reference class forecasting.