DIY CI Blog

October 24, 2018

Today marks the end of this blog. I will leave the hundreds of posts I have written up here for a year or so, and then turn them all over to the Centre for Intelligence and Securities Studies, Mercyhurst College, along with all copyrights.

The reason for this is that I am retiring. I am following my beloved spouse and business partner, Carolyn Vella, who retired from The Helicon Group, which she founded, some time ago for health reasons.

Carolyn is now recuperating in the hospital from major kidney problems. I expect her home in a couple of weeks. I am now spending all of my time with her as she goes through her physical rehabilitation.


An Historical Precedent

October 2, 2018

From time to time, I run into an historical nugget that offers a link between competitive intelligence today and the distant past.

Most of the time, we talk about how CI has been heavily influenced by what happens in government, and how modern CI borrows heavily from the government model. However, in the past, that flow was reversed.

Take, for example, this extract from a wonderful biographical history of the international banking family the Warburgs:

“Max Warburg attained eminence in the heyday of imperial intrigue [the “eve of World War I”], when statesmen picked countries ripe for exploitation on unfurled maps and bankers served their will. Private bankers were ideal channels for such covert action because they didn’t answer to shareholders or publish balance sheets. They also prized intelligence and operated with sphinxlike discretion that mimicked diplomatic activity.”[1]

[1]Ron Chernow, The Warburgs, Random House, NY, 1993, p. 141.


Current and Future Problems for CI

September 19, 2018

Over ten years ago, I spoke to a SCIP chapter in Atlanta about some of the major problems I saw in CI, then and in the future.

Here are the notes for that discussion.

—–

Current Major Flaws in CI

 

Most CI staff are forced to operate by looking into a rearview mirror. That means that management rarely gets a view of where it is going, but rather of where it and its competitors have been. For example, virtually all of CI’s current processes, including in particular the use of KITs and KIQs, are backward-looking and reactive. Ultimately, they lead to the production of CI with less and less value. And that makes those providing the CI less and less valuable.

  • Solution: Phase out KITs & KIQs. Let CI staff be freer to anticipate needs and to work directly with key end-users.

Many CI collectors and analysts are being forced to rely on fewer and fewer primary sources by a variety of technological and legal restraints. But this increased reliance on secondary sources tends to reinforce a growing image of CI as akin to library work. And that image may be right!

  • Solution: Get clear guidelines/ethical statements affirmatively allowing for primary research; point out where lack of primary research, interviews, attending trade shows, etc. is constraining CI. But don’t whine!

Too many efforts to integrate CI with market research fail. In fact, these efforts are almost always doomed to fail. Why? Because CI is qualitative and MR is quantitative. Most MR managers cannot deal with what they see as a lack of precision in CI. And most CI analysts cannot understand the obsession with “numbers”.

  • Solution: The very few successes have occurred when the head of the combined unit either (a) comes up through CI or (b) has been a “customer” of CI in the past.

Your client knows more each day, or at least thinks they do. But what they know might not be right.

  • Solution: Read what your client reads and get critical CI to them before they ask for it. If trade sources are wrong, make sure to point that out!

CI as we know it today totally fails to provide any meaningful tools for the next generation of managers, who expect to do, or will be expected to do, their own CI.

  • Solution: Wait a few minutes.

Fatal flaw – not all intelligence has to be actionable. Generating real understanding in a clear context can be critical.

  • Solution: Increased education of end-users and decreased use of KITs, etc.

Some Failures That Will Emerge Due to Reliance on the “Government Model”

First, a question: according to some, intelligence analysis in government is still not a true profession. Why should we assume it is one in the private sector? Just a thought.

The Government Model – still a black box after five decades. It is based on looking at a few targets, most of which generated external “objective” data, coupled with access to secrets, often internal deliberations.

Now being challenged for several reasons:

  • Too many end users
  • Too rigid in needs determination – Sound familiar?
  • Too driven by the reports to be delivered rather than by intelligence on questions not yet asked
  • Still believing that secret data is more useful than data in the public domain – essentially dissing analysts!
  • Focus on the short term, not the long term
  • Focus on what the other side does, not what they think or what they feel
  • Too much demand for certainty; drives out dissent in name of unanimity
  • Designed to focus on one or a few targets; not a functional model in an era of multiple, changing targets.
  • End users also get data, good or bad, from other sources. Intelligence is no longer the sole source. Again, sound familiar?
  • Buying into the end users’ definition of needs begins a process of buying into their biases, their politics, their world view, etc.
  • Analysts are too far removed from the end users of intelligence, with the exception of military/tactical, that is, battlefield, intelligence.
  • Lack of experienced analysts to train the next generation of analysts. Current training lacks the hands-on “interning” environment that is most helpful. It also lacks the ability to test the analysts’ analyses.
  • Inability to deal with “too much” data. Just what do they do with all of that take from the NSA? Welcome to the Internet Age!

These same failings are occurring or will occur in almost every case for the private sector. Why? We have adopted, or at best adapted, a flawed model.

Biggest Future Unmet Needs

Above all of this is a problem not yet recognized by CI. That is the problem of success.

In the past 25 years, the most common model was a formal (or informal) CI unit, charged with collecting data, generating analysis, and communicating the finished intelligence to an end-user. In some cases, the unit was one person, serving as both collector and analyst; in others, it was a dozen or more people, dividing the collection and analytical functions among themselves.

CI is finally achieving its goals: incorporated in graduate schools, understood by other disciplines, talked about at the AMA [marketing], MRA [market research], SLA [libraries], AIIP [information brokers], LMA [law office managers], PDA [product development], ASIS [security].

In 1986, Carolyn Vella warned that CI could go the way of strategic planning. She meant the overpromising, the excessive bureaucracy, etc. that marked SP at that time. As it turned out, strategic planning’s association imploded as planners were replaced by executives telling managers to do their own planning. CI may be close to that position.

Now, as CI becomes integrated into other business processes, a new model is emerging – not easily. That is one where the collector, the analyst, and the end-user are all the same person. CI’s present models and processes do not fit that new archetype, whether it is in the areas of ethical conduct, needs determination, communication, data collection, or utilization.

The CI cycle – Needs, Collection, Analysis, Dissemination, Utilization – and all of the literature surrounding it no longer apply.

What about ethics – where is the check on unethical means of collection?

  • Revise ethical policies to acknowledge the incorporation of CI into general management. And really train everyone on them. Yearly, if needed.

Where is the pushback in defining needs more sharply when you do it yourself?

  • Stop before you start collecting, even though you have already been doing it. Recall your time as a student. Realize that collection and analysis are now not merely linked – they are merged!

Where is the need to write up a separate analysis when you just synthesize it into what you are doing?

  • It is good practice to separate what you know, suspect, and yes, guess, in a report or presentation. You are not God. Don’t deliver a message on stone tablets!

Where is the review of what you did before you present it to be used?

  • Get someone else to at least read it – critically, really critically. If you cannot deal with that thought, your work product probably could not stand up to it anyway.

Where is the completion of analysis when the data is just absorbed as it comes in?

  • It is not a bad thing that the analysis is continuing – changes in the competitive environment don’t stop for your Monday briefing, do they?

Where is the feedback in terms of success of the intelligence work?

  • You had better learn to evaluate what you do poorly – you know what you do well.

Where is the ability (or is there a need) to justify the costs of getting the intelligence?

  • If you cannot use what you collect, why are you bothering? Develop new targets, new data sources, new tools!

Where is the application of a variety of analytical tools? To a man with a hammer, everything looks like a nail!

  • This is a problem now for analysts. In most cases, you only have to decide whether you are assembling a puzzle or proving/disproving a working hypothesis. Remember, if you think you are only working on a puzzle, how do you know the pieces you have are even from the same puzzle?

What is the relationship with the library/information center process?

  • They can and should become your first stop in collecting data. Even if they do not have the data you need, they can help identify where/from whom you can get it. They should become more valuable, not less.

What kinds of skills are needed? Current CI research indicates that a variety of skills are needed to be a CI analyst. The odds of finding them all in one person are nil. In fact, a parsing of these skills divides them along the Nero Wolfe-Archie Goodwin divide. If they are hard to find in one person now, how do you add them to a batch of OTHER skills/training a manager must have?

  • Here, you will have to just make do. No one has all of the skills that they need. Just work at acquiring new ones and polishing old ones.

What this all means is that without a new CI process aimed at the end user as analyst and/or collector, end users will misuse CI, produce poor CI, or produce none at all. At best, they will be subject to what one critic of US intelligence has recently called, and I paraphrase, the spectacle of the collection of factoids driving out real thinking.

Note: Oddly enough, this disconnect does not apply to outside sources of CI. There, the current CI model would seem to work perfectly well, with the exceptions noted earlier. And, based on our experience, as well as the facilitated sessions at SCIP06, company restrictions such as those above may actually drive work toward outside consultants.

What are the overall solutions for CI to avoid imploding as it achieves success?

In no particular order:

  • Admit that CI is not a profession. Lawyers still refer to their “practice”, a guild term.
  • Redirect training at SCIP and by private consultants to training non-CI professionals in CI. Already those in security are looking at CI (defensive) as a new tool. Others will too, and soon.
  • Train all managers who might conduct or use CI on what CI really is. Stress the concept, not the process. CI is a tool for sales, marketing, crisis management, and strategy. It is also a tool for human resources and many other functions. Recognize that and deal with it. The Baldrige Awards already have – for years!

—–

Looking back, I do not see much progress on any of these, do you?


Stages of Development

September 14, 2018

I have updated and attached a short presentation I made several years ago about how a world-class competitive intelligence program develops. Click here to download it.


Down Time

August 6, 2018

This blog will not publish again until the week of September 4, 2018. Our office was flooded out in the heavy rains of August 3-5. We will reopen on September 4.

 


Data Spotting

August 3, 2018

The basics of CI call for effective collection of relevant data. In the past, I have been involved in the typical “counting cars” data collection, including studying aerial photos of huge chemical plants, and hiring people to count trucks coming and going at a construction site.

Now that kind of data collection, at least some of that data collection, is becoming easier. Consider the following reports in Bloomberg Businessweek[1]:

  • One company now offers data from around 200 satellites, public and private, that allows for the identification and tracking of many things by geographic area. The article pitches the ability to monitor all the parking lots for Six Flags, the amusement park operator. Full lots = profits; empty lots = losses.
  • Another firm’s satellites focus on the heights of the lids on oil tanks throughout the world. The goal? Provide data on oil supplies and sales where “official statistics are incomplete or untrustworthy”.

Of course, the data from these satellite services is not free and it still must be processed and analyzed, but it sure beats sitting in a car at Six Flags Over Texas, counting cars for three days, doesn’t it?

[1] Eric Roston, “The View From Way Up”, Bloomberg Businessweek, July 30, 2018, pp. 46-9.


Not All Interviews are Alike

July 24, 2018

“[A co-founder of Zoox, a self-driving car “hopeful”] reached out to some of the biggest names in the field and told them he was making a documentary on the rise of self-driving cars. The plan was to mine these people for information and feel out potential partners…. [He says] ’In my defense I might have been making a documentary.’”[1]

Legal? Yes. Ethical? No! Why? Let’s look at ethical standards in CI:

  • SCIP’s Code of Ethics requires its members “[t]o accurately disclose all relevant information, including one’s identity and organization, prior to all interviews.”[2] That never happened – unless he said he “might” be making a documentary, instead of saying that he was.
  • The Helicon Group “[n]ever employs questionable data collection activities. These are techniques, otherwise legal, which, if made public, might tend to embarrass Helicon’s reputation or that of a client.”[3] What sort of reputation does this person and his firm have now?

Now, what should these “big names” have done to protect themselves from this individual as well as CI professionals seeking competitively sensitive data? Here are a couple of suggestions for them (and for others):

  • Check out anyone seeking an interview. Is this person really who/what they say they are? In this case, he was a video producer. Maybe close enough to a documentary maker to skate by.
  • Do the conditions look and sound right? In this case, the interviewer showed up with a “Canon and a bullshit microphone”. Does that look professional? Probably not.
  • What is the interviewer’s approach? This one relied on flattery. Warning! No one is really that interested in what you are doing – except your competition.
  • What kind of interview is being conducted? This one was two hours long – another warning! After a while, your defenses fall and you speak more freely.
  • Also, it was conducted in a “grassy field”. Maybe it was sold as a good background for the video. But, it could have been a way to keep this person from his computer or other interruptions that might force him to reconsider “why am I still talking to this person and exactly what am I saying?”

[1] Ashlee Vance, “Hype Machine”, Bloomberg Businessweek, July 23, 2018, p. 53.

[2] https://www.scip.org/page/CodeofEthics.

[3] https://helicongroup.com/ethical-standards.