Now, as a reasonably indolent consultant, I am all for diagnostic tools which will save me time during data gathering and at the same time give me fundamental insights without having to unduly tax my poor little brain: ask a standard set of questions, crank the handle et voilà! The answers to the client's problems appear before their very eyes and you have earned your fee by simply being clever enough to source the tool and point them to the URL containing the questionnaire. Alas, life is never straightforward!
The obvious weakness in these tools (and I've used a few in my time) is that, inevitably, the knowledge embedded (codified, if you like) within them has had to make certain assumptions about the situation being analysed. The number and nature of these assumptions add a margin of error to the results which can render the tools almost redundant. Some attempt to get round this limitation by being template-based: a template for Government organisations, one for Enterprise, one for family-owned publishers based in the West Midlands with 150 to 250 employees specialising in fishing magazines and ventriloquism, etc. The issue here is making the templates generic enough to be used by more than one client in 10,000, and yet specialist enough to take into account a particular business landscape.
The advantage of diagnostic tools, of course, is that results can be obtained quickly, can be compared across organisations (from different sectors if necessary), and the questions asked and answers received are not dictated by the prejudices of the questioner or answerer (if that is a real word).
The key disadvantage, of course, is that the answers will not be nuanced by real-world subtleties, and the conclusions are based on generic correlations made without any context. Although the data will not lie, the model interpreting it may be flawed. Possibly fatally. The computer might say "no" when the real answer is "it's not as simple as that actually".
My view is that there is a place for this kind of analysis tool, but only in conjunction with more subjective approaches such as good old-fashioned interviews, backed up by human systems analysis methods such as VPEC-T, which include in their machinations the contextual subtleties and complex human factors, including values and trust relationships, which significantly influence why things go wrong, organisations underperform, or things are just plain rubbish.
Making use of both the subjective and the objective is the trick: if the results agree then happy days, and if not then the conflicting results will suggest areas requiring further investigation.
Monday, March 8, 2010
His case studies are impressive (although I don't suppose anyone would trouble to outline 250 pages of their greatest mistakes, where a few numbers led to the death of thousands), but his approach speaks loud and clear to the nerd in me: imagine being able to solve the woes of the world with a laptop, strong coffee and a serious kick-ass spreadsheet!
What I like most is the fact that techniques like Mr BdeM's use cold logic and numbers to come up with answers which strip out the emotion and prejudices of the stakeholders and strategists who, being usually human, are therefore the victims of extended bouts of megalomania, paranoia or any of the myriad other '-oias to which powerful people are often prone.
So, can a few equations solve global warming? Maybe not, but they can tell us that it won't be solved by a few men not signing anything in a nice city called Kyoto or Copenhagen or Slough or wherever… worth a scan if you like numbers, facts, unexpected outcomes and "letting the data speak", as my friend CB would say…