Data Spotting

August 3, 2018

The basics of CI call for effective collection of relevant data. In the past, I have been involved in the typical “counting cars” data collection, including studying aerial photos of huge chemical plants, and hiring people to count trucks coming and going at a construction site.

Now that kind of data collection, at least some of that data collection, is becoming easier. Consider the following reports in Bloomberg Businessweek[1]:

  • One company now offers data from around 200 satellites, public and private, that allows for the identification and tracking of many things by geographic area. The article pitches the ability to monitor all the parking lots for Six Flags, the amusement park operator. Full lots = profits; empty lots = losses.
  • Another firm’s satellites focus on the heights of the lids on oil tanks throughout the world. The goal? Provide data on oil supplies and sales where “official statistics are incomplete or untrustworthy”.

Of course, the data from these satellite services is not free and it still must be processed and analyzed, but it sure beats sitting in a car at Six Flags Over Texas, counting cars for three days, doesn’t it?

[1] Eric Roston, “The View From Way Up”, Bloomberg Businessweek, July 30, 2018, pp. 46-9.


Not All Interviews are Alike

July 24, 2018

“[A co-founder of Zoox, a self-driving car “hopeful”] reached out to some of the biggest names in the field and told them he was making a documentary on the rise of self-driving cars. The plan was to mine these people for information and feel out potential partners…. [He says] ’In my defense I might have been making a documentary.’”[1]

Legal? Yes. Ethical? No! Why? Let’s look at ethical standards in CI:

  • SCIP’s Code of Ethics requires its members “[t]o accurately disclose all relevant information, including one’s identity and organization, prior to all interviews.”[2] That never happened, unless he told them he “might” be making a documentary rather than saying that he was.
  • The Helicon Group “[n]ever employs questionable data collection activities. These are techniques, otherwise legal, which, if made public, might tend to embarrass Helicon’s reputation or that of a client.”[3] What sort of reputation does this person and his firm have now?

Now, what should these “big names” have done to protect themselves from this individual as well as CI professionals seeking competitively sensitive data? Here are a couple of suggestions for them (and for others):

  • Check out anyone seeking an interview. Is this person really who/what they say they are? In this case, he was a video producer. Maybe close enough to a documentary maker to skate by.
  • Do the conditions look and sound right? In this case, the interviewer showed up with a “Canon and a bullshit microphone”. Does that look professional? Probably not.
  • What is the interviewer’s approach? This one relied on flattery. Warning! No one is really that interested in what you are doing – except your competition.
  • What kind of interview is being conducted? This one was two hours long – another warning! After a while, your defenses fall and you speak more freely.
  • Also, it was conducted in a “grassy field”. Maybe it was sold as a good background for the video. But, it could have been a way to keep this person from his computer or other interruptions that might force him to reconsider “why am I still talking to this person and exactly what am I saying?”

[1] Ashlee Vance, “Hype Machine”, Bloomberg Businessweek, July 23, 2018, p. 53.

[2] https://www.scip.org/page/CodeofEthics.

[3] https://helicongroup.com/ethical-standards.


Maybe Someone is Paying Attention

Bloomberg Businessweek recently ran an article on Adidas’ new factory.[1] It is described as a “super factory” aimed at letting Adidas make its footwear in high-cost, developed economies, instead of contracting work out to suppliers and assemblers in China, Vietnam, and other lower-labor-cost countries. The article, in an aside, noted that Adidas could gain an additional benefit if this project works out:

“Adidas gains the added benefit of keeping the latest trends and ideas in-house rather than sharing them with suppliers.”

Axiomatic: In competitive intelligence, the fewer people that have access to competitively sensitive data, the harder it is for competitors to gather it.

[1] “Adidas Automates to Make Shoes Faster”, Bloomberg Businessweek, October 8, 2017, pp. 17-18.


This is Your Brain on CI – Part 1 of 3


July 7, 2015

When you are doing competitive intelligence (CI), you are relying on your intelligence to drive your research and analysis. But your brain, like any other part of your body, needs proper care and conditioning. What follows here and in the next 2 parts are a few notes on what works for me – and for others as well. Your suggestions and comments are, of course, very welcome.

First, let me deal with relaxing your mind.

Look at the things that you do to relax, such as reading and games. I’m a great believer that you should continually change these things. By that, I mean changing the “subject matter” of the material you are reading (or the games that you are playing) for something that’s new and different.

With respect to reading, my practice is to change magazine subscriptions on a regular basis. So I stop reading The Economist when a subscription ends, instead of just renewing it, even though I really like it. Then I start reading another magazine that is entirely different in terms of slant or subject. Think about it. For example, if you regularly read only Bloomberg BusinessWeek, try switching to Smithsonian magazine. For The Atlantic, switch out to National Review or Biblical Archaeology Review.

This holds true with books. Read mysteries? Try histories. Read archeology? Try psychology. The same is true with games. Do you do crossword puzzles? What about Sudoku? Anagrams?

You are still relaxing but doing it differently. And try doing these things, the reading, the games, and so on, in different places. Your relaxation is still real, but different.


Presenters and Presentations

May 27, 2014

A while ago, Bloomberg BusinessWeek ran a piece titled “Why Bezos Bought The Post”[1]. It contains a lesson for presenting your competitive intelligence findings. Brad Stone, the author, observed that

“[a] decade ago, frustrated with the pace of meetings at his company [Amazon], Bezos banished PowerPoint and proclaimed that all future Amazon meetings would begin with the presenter passing out a narrative document that outlined the topic being discussed. The first papers were endless, spanning dozens of pages, so Bezos decreed a six-page limit. Many of his colleagues still thought this managing-by-writing approach would fade. It didn’t.”

So?

The lessons here are several:

First, PowerPoint is not the only way to convey information at a business meeting. In fact, there are those that argue, in my words not theirs, PowerPoint serves less to communicate than to conceal[2]. So, master other ways. Or at least practice what you want to say, relying on PowerPoint only as a reminder – to you of what you want to say and to the attendees of what you have said.

Second, present your case the way that senior management wants, simply because they may pay less attention to your message if they are not comfortable with the way it is delivered. If that means PowerPoint, it means PowerPoint.

Third, whatever means you employ, master the subject before your presentation. At the actual meeting, you may not be able to present what you want, when you want, and/or in the order you want. The form of your presentation is a tool; the content is the key. A corollary to this is that you should avoid presenting where the presentation and the work behind it were largely (or exclusively) done by someone else.

Fourth, shorter is almost always better than longer. Longer presentations may be more detailed, but that risks losing attention – as well as actual attendees.

[1] By Brad Stone, August 8, 2013, http://www.businessweek.com/articles/2013-08-08/why-jeff-bezos-bought-the-em-washington-post-em.

[2] For more on that, see Edward R. Tufte, Beautiful Evidence, Graphics Press, LLC, 2006, p. 181: “Our comparison of various presentation tools in action indicate that PowerPoint is intellectually outperformed by alternative tools.”


Assumptions

August 14, 2013

When analyzing the data you collect to develop CI, sometimes you have to make assumptions. Actually, you have to make a lot of assumptions a lot of the time. Some of them are very small, and it is usually safe to make them. Others are not so small, and are potentially more dangerous.

Usually when dealing with assumptions, you know when you are assuming something. You have a gap between two sets of facts or you have a set of facts and you are trying to determine what they mean. There you understand that you are making assumptions, so you are usually careful.

But there is a special problem dealing with assumptions: serial assumptions. By serial assumptions, I mean assumptions that connect with other assumptions, or that are somehow dependent on a prior assumption. These can be difficult to manage.

Let me illustrate what I’m talking about by reference to a recent article in Bloomberg BusinessWeek[1]. In discussing adjustable-rate mortgages (ARMs), the authors noted that, while home prices will probably continue to rise, it is difficult to predict that by state or by region. That means potential buyers are usually making that assumption, without qualification, when deciding to buy a home with an ARM.

However, many homebuyers applying for ARMs, according to the article, also make another assumption, perhaps unstated, about their income. [Note: unstated assumptions are perhaps the most dangerous of all.] Their assumption is that their income will be higher by the end of the loan’s fixed payment period, some 3 to 5 years out. To them, that means they would be able to handle bigger mortgage payments, even if they cannot sell the house. So they take the ARM and buy the house.

The article makes very clear the danger of these serial (and unstated) assumptions: “[S]ays Henry Savage, president of PNC mortgage, ‘When you start making those calculations, you’re playing golf in the dark’”.


[1] “More Americans are Gambling on ARMs”, Bloomberg BusinessWeek, August 5-11, 2013.


How do you measure success?

August 6, 2013

As you do your own competitive intelligence or utilize CI provided by others, you will eventually run into the question, “So what’s the bottom line here? How much is the CI worth?”

      Commercial aside: I could just refer you to our book, Bottom Line Competitive Intelligence, but I want to talk about it from your point of view, not mine.

There are two ways to look at this problem. The first is to try to measure things in terms of dollars, which is to answer the question “What is the dollar return from the CI?” The second is to look at what was actually done with the CI.

How do we measure the monetary value of competitive intelligence? There are really two sides to that. One is to determine how much more money you made that you would not have made without it or, conversely, how much money you avoided losing that you would have lost without it. The problem is that most decisions have many inputs, so crediting one input with a decision’s entire success or failure is just not right. So you try to assign a percentage to it.

Say you were 60% sure your decision would succeed, whatever that decision is.  Your decision, if right, could produce $1 million in new profits. If you could increase the likelihood of success by 10% by using CI, you could assign a value of $100,000 to the CI. The same kind of calculation works in terms of reducing the costs of failure, although most people prefer not to make those calculations. Why? Because they do not like to be “negative”.
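The expected-value arithmetic above can be sketched in a few lines of Python. This is just an illustration of the calculation in the text, with the 60%, 70%, and $1 million figures as example inputs; the function name and structure are my own, not from any CI tool:

```python
def ci_value(payoff, p_without_ci, p_with_ci):
    """Expected-value estimate of CI's contribution to one decision.

    payoff        -- profit if the decision succeeds
    p_without_ci  -- estimated probability of success without CI
    p_with_ci     -- estimated probability of success with CI

    The CI is credited with the increase in expected payoff.
    """
    return (p_with_ci - p_without_ci) * payoff


# The example from the text: a $1 million payoff, 60% sure without CI,
# 70% sure with it.
value = ci_value(1_000_000, 0.60, 0.70)
print(f"${value:,.0f}")
```

The same function works for the cost-of-failure side of the calculation: plug in the loss avoided as the payoff and the change in the probability of avoiding it.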

The alternative is to see if and how the CI was used and then determine whether or not the decision-making process was improved. Simply put, if the decision-making process was not improved, or if the CI was ignored, then the CI was worth nothing. If it was improved, then the CI was valuable.

For people who believe that you must measure everything, consider the recent observation by the head of Yahoo: “Just because we have a ruler doesn’t mean we have to measure everything.”[1]


[1] Brad Stone, “Can Marissa Mayer Save Yahoo?”, Bloomberg BusinessWeek, August 5-11, 2013.