Chess and competitive intelligence (part 1)

Posted: July 30, 2014
In one of his many excellent science fiction novels, the late Arthur C. Clarke “invented” a device that essentially eliminated privacy. Describing its effect, one of his characters said:
“Before [that invention], business was a closed game. Nobody knew my cards….And that gave me a lot of leverage for bluff, counterbluff…I could minimize my weaknesses, advertise my strengths, surprise the competition with a new strategy, whatever. But now the rules have changed. Now the game is more like chess….Now – for a price – any shareholder or competitor, or regulator come to that, can check up on any aspect of my operation….”
It sounds as if Clarke was predicting the creation of CI. Not exactly, but the effect is similar. So what did Clarke mean when he said business had become a lot more like chess? Hint: just substitute the word “business” for “chess” and “competition” for “game” in the following three quotes from chess Grand Masters and World Champions:
Accept the inevitability of imperfection
One of the problems in competitive intelligence (CI) is that you are asking someone — and that someone may be you — to predict what another person or organization may do, based on what they have done, the resources they have or will have, and how they perceive the competitive environment. It is impossible to be right 100% of the time:
“A good player is always lucky.” (José Raúl Capablanca y Graupera)
- You are usually analyzing moving targets. When your analysis is finished, the targets do not then conveniently stop moving. That means that your analysis can be correct, as of the date it was done, but it may not necessarily be so when you or the end-user actually reads it, much less uses it.
- Your CI efforts are an attempt to provide a precise output in a world of imprecise inputs. Mistakes in interpretation and analysis are almost inevitable in this context. Experience will help minimize them.
- You may be seeking to develop specific intelligence reflecting an evaluation of the target’s intentions, as distinguished from its assets or capabilities. The end product may be in error, for it is ultimately very hard to predict with certainty how someone else, in a different environment, and with access to different facts, will act at some point in the future. Also, the target may later just change its mind. Or you may not have allowed (or been given) enough time to do an appropriate amount of analysis.
“The worst enemy of the strategist is the clock. Time trouble…reduces us all to pure reflex and reaction, tactical play. Emotion and instinct cloud our strategic vision when there is no time for proper evaluation.” (Garry Kimovich Kasparov)
All of this means you and management should expect some CI failures. However, you and management must do more – not only expect failure, but affirmatively accept it. Why? Pressure for perfection in CI can actually be counter-productive.
An analogy may help illustrate this. In hospitals, surgeons were once called to task if they appeared to have an excessive number of deaths in the operating room. The goal: weed out those surgeons who were taking too many risks with their patients’ lives.
However, as medicine has become more sophisticated, hospitals now apply the same type of review to surgeons who have a substantially lower death rate than average. The reasoning: these doctors may not be taking enough risks in their efforts to save patients. That means their patients may be dying outside of the operating room instead of being saved on the table.
The same dichotomy is present in CI. You and management must understand the likelihood of some failures in CI, since they reflect the limits inherent in the intelligence process itself. But, management must also be suspicious of a CI analyst or unit which is never wrong. A source which avoids being wrong usually ends up providing intelligence of such generality, and subject to such qualifications, that it ultimately becomes unusable. In other words, in an effort to avoid being wrong, it may also never be right – or actionable.
So your efforts should be focused on reducing errors, but always accepting that perfection is not possible.
“The winner of the game is the player who makes the next-to-last mistake.” (Ksawery Tartakower)
Arthur C. Clarke and Stephen Baxter, The Light of Other Days (New York: Tor Books, 2000), p. 144.