September 19, 2018
Over ten years ago, I spoke to a SCIP chapter in Atlanta about some of the major problems I saw in CI, both current and future.
Here are the notes for that discussion.
Current Major Flaws in CI
Most CI staff are forced to operate by looking into a rearview mirror. That means that management rarely gets a view of where they are going, but rather of where they and their competitors have been. For example, virtually all of CI’s current processes, including in particular the use of KITs and KIQs, are backwards looking and reactive. Ultimately, they lead to CI of less and less value – and that makes those providing the CI less and less valuable.
- Solution: Phase out KITs & KIQs. Free CI staff to anticipate needs and to work directly with key end-users.
Many CI collectors and analysts are being forced to rely on fewer and fewer primary sources by a variety of technological and legal restraints. But this increased reliance on secondary sources actually tends to support a growing image of CI as akin to library work. And that image may be right!
- Solution: Get clear guidelines/ethical statements affirmatively allowing primary research; point out where the lack of primary research – interviews, attending trade shows, etc. – is constraining CI. But don’t whine!
Too many efforts to integrate CI with market research fail. In fact, these efforts are almost always doomed to fail. Why? Because CI is qualitative and MR quantitative. Most MR managers cannot deal with what they see as a lack of precision in CI. And most CI analysts cannot understand the obsession with “numbers”.
- Solution: The very few successes have occurred when the head of the combined unit either (a) comes up through CI or (b) has been a “customer” of CI in the past.
Your client knows more each day, or at least thinks she does. But what she knows might not be right.
- Solution: Read what your client reads and get critical CI to her before she asks for it. If trade sources are wrong, make sure to point that out!
CI as we know it today totally fails to provide any meaningful tools for the next generation of managers who expect to do, or will be expected to do, their own CI.
- Solution: Wait a few minutes.
Fatal flaw – not all intelligence has to be actionable. Generating real understanding in a clear context can be critical.
- Solution: Increased education of end-users and decreased use of KITs, etc.
Some Failures That Will Emerge Due to Reliance on the “Government Model”
First, a question: According to some, intelligence analysis in government is still not a true profession. Why should we assume it is one in the private sector? Just a thought.
The Government Model – the black box after 5 decades. Based on looking at a few targets, most of which generated external “objective” data, coupled with access to secrets, often internal deliberations.
Now being challenged for several reasons:
- Too many end users
- Too rigid in needs determination – Sound familiar?
- Too driven by the reports to be delivered instead of by intelligence on questions not yet asked
- Still believing that secret data is more useful than data in the public domain – essentially dissing analysts!
- Focus on the short term, not the long term
- Focus on what the other side does, not what they think or what they feel
- Too much demand for certainty; drives out dissent in the name of unanimity
- Designed to focus on one or a few targets; not a functional model in an era of multiple, changing targets.
- End users also get data, good or bad, from other sources; intelligence is no longer their sole source. Again, sound familiar?
- Buying into the end users’ definition of needs begins a buy-in process of their biases, their politics, their world view, etc.
- Analysts are too far removed from end users of intelligence, with the exception of military/tactical – that is, battlefield – intelligence.
- Lack of experienced analysts to train the next generation of analysts. Current training lacks hands-on “interning” type environment that is most helpful. Also lacks the ability to test analysis of the analysts.
- Inability to deal with “too much” data. Just what do they do with all of that take from the NSA? Welcome to the Internet Age!
These same failings are occurring or will occur in almost every case for the private sector. Why? We have adopted, or at best adapted, a flawed model.
Biggest Future Unmet Needs
Above all of this is a problem not yet recognized by CI. That is the problem of success.
In the past 25 years, the most common model was a formal (or informal) CI unit, charged with collecting data, generating analysis, and communicating the finished intelligence to an end-user. In some cases, the unit was one person, serving as both collector and analyst; in others, it was a dozen or more people, dividing the collection and analytical functions among themselves.
CI is finally achieving its goals: incorporated in graduate schools, understood by other disciplines, talked about at the AMA [marketing], MRA [market research], SLA [libraries], AIIP [information brokers], LMA [law office managers], PDA [product development], ASIS [security].
In 1986, Carolyn Vella warned that CI could go the way of strategic planning. She meant the over-promising, over-bureaucratization, etc. that marked SP at that time. As it turned out, strategic planning’s association imploded as planners were replaced by executives telling managers to do their own planning. CI may be close to that position.
Now, as CI becomes integrated into other business processes, a new model is emerging – not easily. That is one where the collector, the analyst, and the end-user are all the same person. CI’s present models and processes do not fit that new archetype, whether it is in the areas of ethical conduct, needs determination, communication, data collection, or utilization.
The CI cycle and all of the literature surrounding it no longer applies – Needs, Collection, Analysis, Dissemination, Utilization.
What about ethics – where is the check on unethical means of collection?
- Revise ethical policies to acknowledge the incorporation of CI into general management. And really train everyone on them. Yearly, if needed.
Where is the pushback in defining needs more sharply when you do it yourself?
- Stop before you start collecting, even though you have already been doing it. Recall your time as a student. Realize that collection and analysis are now not merely linked – they are merged!
Where is the need to write up a separate analysis when you just synthesize it into what you are doing?
- It is good practice to separate what you know, suspect, and yes, guess, in a report or presentation. You are not God. Don’t deliver a message on stone tablets!
Where is the review of what you did before you present it to be used?
- Get someone else to at least read it – critically, really critically. If you cannot deal with that thought, your work product probably could not stand up to it anyway.
Where is the completion of analysis when the data is just absorbed as it comes in?
- It is not a bad thing that the analysis is continuing – changes in the competitive environment don’t stop for your Monday briefing, do they?
Where is the feedback in terms of success of the intelligence work?
- You had better learn to evaluate what you do poorly – you know what you do well.
Where is the ability (or is there a need) to justify the costs of getting the intelligence?
- If you cannot use what you collect, why are you bothering? Develop new targets, new data sources, new tools!
Where is the application of a variety of analytical tools? To a man with a hammer, everything looks like a nail!
- This is a problem now for analysts. In most cases, you only have to decide if you are assembling a puzzle or proving/disproving a working hypothesis. Remember, if you think you are only assembling a puzzle, how do you know the pieces you have are even from the same puzzle?
What is the relationship with the library/information center process?
- They can and should become your first stop in collecting data. Even if they do not have the data you need, they can help identify where/from whom you can get it. They should become more valuable, not less.
What kinds of skills are needed? The current CI research indicates that there are a variety of skills needed to be a CI analyst. The odds of having them in one person are nil. In fact, a parsing of these skills divides them into the Nero Wolfe-Archie Goodwin divide. If they are hard to get in one person now, how do you add them to a batch of OTHER skills/training a manager must have?
- Here, you will have to just make do. No one has all of the skills that they need. Just work at acquiring new ones and polishing old ones.
What this all means is that without a new CI process aimed at the end user as analyst and/or collector, end users will misuse CI, produce poor CI, or produce none at all. At best they will be subject to what one critic of US intelligence has recently called, and I paraphrase, the spectacle of the collection of factoids driving out real thinking.
Note: Oddly enough, this disconnect does not apply to outside sources of CI. There, the current CI model would seem to work perfectly well, with the exceptions noted earlier. And, based on our experience, as well as the facilitated sessions at SCIP06, company restrictions such as those above may actually drive work towards outside consultants.
What are the overall solutions for CI to avoid imploding as it achieves success?
In no particular order:
- Admit that CI is not a profession. Lawyers still refer to their “practice” – a guild term.
- Redirect training at SCIP and by private consultants to training non-CI professionals in CI. Already those in security are looking at CI (defensive) as a new tool. Others will too, and soon.
- Train all managers who might conduct or use CI on what CI really is. Stress the concept, not the process. CI is a tool for sales, marketing, crisis management, and strategy. It is also a tool for human resources and many other functions. Recognize that and deal with it. The Baldrige Awards already have – for years!
Looking back, I do not see much progress on any of these, do you?