Eight to Late

Sensemaking and Analytics for Organizations

Archive for the ‘Consulting’ Category

Capturing and using knowledge in project-based organisations


Many organisations find it hard to capture and use knowledge effectively. This problem is especially acute in project-based organisations because project teams – the primary “generators” of knowledge in such organisations – are temporary structures that are disbanded when a project is completed. Hence most project-based organisations emphasise (and enforce!) the capture of knowledge through end-of-project activities such as post-mortems, documentation, etc.

That’s fine as far as it goes, but capturing knowledge is only part of the story. There is the other (not so small) matter of using it. Here’s where most efforts fall flat – all that supposedly useful knowledge is rarely used. This is the point addressed by Katrina Pugh and Nancy Dixon in a short note entitled “Don’t Just Capture Knowledge – Put It to Work”, published in the May 2008 issue of Harvard Business Review. Pugh and Dixon suggest using what they call a knowledge harvest – an approach aimed at capturing knowledge and putting it to use. In brief, a knowledge harvest consists of:

  1. Identifying those who might find the knowledge useful. Pugh and Dixon call these people knowledge seekers. These folks are interested in the knowledge on offer and so already have the motivation to learn. In projectised organisations, programme managers have a broad view of project activity, and can thus help identify suitable seekers to participate in knowledge capture (or harvesting) sessions.
  2. Involving knowledge seekers in harvesting sessions. The idea here is that seekers, being self-motivated, will ask pointed questions aimed at extracting information that is often left out of typical post-project documentation. They might, for instance, ask probing questions about what went wrong and why – points that are often glossed over for political or other reasons.

In their note, Pugh and Dixon present a case study where this method was used successfully in a project situation. Following the initial success of the technique, it has been adopted by other programmes within the organisation that was the subject of the study.

This simple technique has much to commend it. For one, conversation is a more effective way than documentation to get at tacit knowledge, and the presence of knowledge seekers at harvest sessions improves the chances that the right questions – i.e. those that “tease out” tacit knowledge – will be asked. Secondly, the captured knowledge will almost certainly be used, since seekers are identified by their interest in what’s on offer. Finally, if seekers find the knowledge gained to be useful on their own projects, they’ll pass it on to other seekers in harvesting sessions down the line, thus ensuring that what’s learnt becomes a part of organisational memory.

Written by K

June 1, 2008 at 9:27 am

Appstronauts


Eliciting and documenting application requirements is hard work. It’s made even harder if one is dealing with an appstronaut. “So who or what’s an appstronaut?” – I hear you ask. Here’s a definition, along with some explanatory notes:

Appstronaut (noun): a person who can hold forth for hours on the big picture, but cannot (will not!) get down to details.
Distribution: Generally found in the upper echelons of management, but sightings have been reported at other levels too.
Field notes: Appstronauts are characterised by verbosity coupled with a short attention span. They exhibit extreme fondness for strategy, vision and all that “big picture” stuff, but have little patience for details. They are known to have good ideas, but generally lack the focus to see them through to fruition.

Appstronauts infuriate analysts, who need nitty-gritty details in order to understand and document application requirements. The big picture stuff, as fascinating as it is to the appstronaut, is simply of no use to the analyst. But, if left unchecked, appstronauts will be content to float around at rarefied heights where ideas are thick but details thin. Analysts charged with compiling application requirements must bring them down to earth. But how? This post aims to answer that question by highlighting some simple techniques for tethering appstronauts to reality.

Without further ado, here they are:

  • Start by zooming in on a small part of the big picture: The main problem is that the canvas painted by appstronauts is way too big. A practical way to start filling in detail is by focusing on one part of the picture and drilling down to specifics. Which part? The answer should be clear from the appstronaut’s spiel. In brief: the part should be important yet easy enough to action or implement. It may be something that ties into existing systems, for instance. What the analyst should look for is a quick win – something that can be built quickly, but also provides value to the appstronaut.
  • Contribute to the picture by painting a part of it: The best analysts understand the business thoroughly. Given this, they should be able to contribute to the appstronaut’s vision. In fact, I have sat through meetings where smart analysts have steered the discussion (with tact!) in productive directions. At this point the analyst is contributing to the vision too.
  • Help appstronauts drill down to specifics: Appstronauts abhor details; analysts thrive on them. The analyst’s job is to get appstronauts to talk about specifics.  To do this, it can be helpful to send out a list of questions before the meeting so that appstronauts come in prepared. Very often, they’ll throw up their hands and say, “It’s your job to fill in the details.” At this point the analyst has to gently, but firmly, insist that details of business requirements have to come from (no surprise!) the business.
  • Verify mutual understanding: It is important to verify that both parties – the appstronaut and the analyst – have a common understanding of what transpires in a requirements analysis session. This should be done both during and after the meeting. During the session the analyst should, by asking appropriate questions, check that their understanding of the requirements is correct. After the session, a compilation of notes should be sent out, asking users to send their corrections and comments within the next day or two. This gives them a chance to make any revisions before work on documenting the requirements is started. This is standard practice in requirements elicitation, but is absolutely critical when one is dealing with an appstronaut (remember the “short attention span” bit in the field notes above).
  • Use visual aids (screen mock-ups, process diagrams etc.) liberally: This applies both to the analysis sessions and the documents. Often, in requirements sessions I use the whiteboard to sketch process flows, relationships and even app screen layouts. Any documents or presentations should have lots of visuals, as appstronauts have even less patience than others for wading through large swathes of text.

After bagging appstronauts through most of this piece, I should acknowledge that they can play a key role in driving important business initiatives. As mentioned in the field notes above, they often have good ideas but need some help implementing them. This is where the analyst comes in: he or she is very familiar with the business and/or associated systems, and is thus well placed to help appstronauts add substance to their grand, but ethereal, visions. Approached constructively, working with appstronauts can be an opportunity rather than a trial.

Written by K

May 25, 2008 at 9:33 pm

Great expectations and how to manage them


Many project management books contain a section or two on managing stakeholder expectations. The emphasis in most of these tomes tends to be on the bureaucratic aspects of the process – developing stakeholder management plans or getting the project scope statement signed off, for example. Although documents and signatures may be useful in proving that you’ve performed your duties with due diligence (aka CYA), they don’t guarantee stakeholder satisfaction. Despite reams of documentation, a good number of projects are deemed failures because of a mismatch between stakeholder expectations and completed deliverables. I’d go so far as to say that chronic project failure is one of the main reasons why many corporate IT departments get a bad rap.

The root of the problem, in my opinion, is that documented requirements very often do not reflect what a client really wants. It is impossible to get a comprehensive view of client requirements in a limited number of analysis sessions. This difficulty is particularly acute when one tries to document all requirements in one go, as is the case when a waterfall development methodology (also known as Big Design Up Front, or BDUF) is used. It is well known that iterative/incremental methodologies are better because they ensure that requirements are reviewed and revised several times in the development cycle. Incidentally, this article by Joe Marasco has an excellent, visual explanation of why iterative development is superior to BDUF. Despite this being common knowledge, many corporate IT departments remain stuck under the waterfall. Why this is so is a topic for another post – I’ll just take it as a given here. However, the consequent stakeholder dissatisfaction is one of the major reasons why these departments have low credibility within the organisations they serve.

So what can be done about this?

The two-word answer: better communication.

The one-sentence answer: adapt flexible stakeholder management practices from the iterative/incremental world and integrate them into your BDUF approach.

The longer answer:

Iterative/incremental approaches mandate regular meetings between stakeholders and project team members. Consequently these methodologies have stakeholder management built in, as opposed to BDUF approaches, which don’t require regular interaction. So when using BDUF methodologies one has to work doubly hard at communicating with stakeholders. When doing so, it is a good idea to adapt practices from iterative/incremental approaches and integrate them into your development methodology. In particular, the following points are worth considering:

  • Frequent feedback: You don’t need to schedule formal meetings to give people status updates or feedback. Wander over to their office to have a chat and give them an update. If you make a habit of this, you may even find that formal meetings are required less often.
  • Frequent delivery (or demos): Although BDUF methods appear to preclude frequent delivery (as opposed to iterative techniques, which mandate frequent releases), it is often possible to show users work in progress. Schedule informal demos where a developer shows end users the current state of the product. By doing so, you may get feedback early enough to do something about it. It’s no good having users complain about the user interface at the final demo.
  • Flexible attitude: Yes, I know, your requirements are frozen (the users have signed off on them, after all). However, if you can accommodate changes, it is best that you do (via whatever change control process you have). A blanket “next phase” response to all change requests will only annoy your clients. A flexible, can-do attitude will go a long way in getting users “on your side”. Once they see that you do your best to help them, they’ll (usually) reciprocate by not asking you for the moon. Further, on the rare occasions that you turn them down, they’ll (generally) understand that you’re doing so for good reasons.
  • Present alternatives when saying no: This is important. If you do have to say “no” to a user request, try to present a viable workaround or alternative. Often, a well-thought-out alternative may be more than enough to satisfy the user. Besides, it also gives the user a sense that you are working with them to solve their problems, not off on your own trip building something that is divorced from their needs.

That’s it. I know, when you look at the list the points seem pretty obvious. However, they are hard to apply in practice – not just because of recalcitrant users, but also because of the numerous constraints on project teams operating in BDUF environments. In practice you’ll also find that these techniques work better on some projects than others, mainly because of differences in user attitudes.

Everyone has great (or should I say, inflated) expectations of things to come. This invariably leads to disappointment in the end. A good part of managing users is making their expectations more realistic rather than simply lowering them. The best way to do this is by involving users as much as possible in the development process, which is essentially what incremental/iterative and agile techniques advocate. BDUF practitioners – which include many corporate IT development teams – should take a leaf out of their book.

Written by K

April 6, 2008 at 10:00 pm