
Analysing weak signals for competitive & marketing intelligence

I’ve just read an interesting blog post by Philippe Silberzahn and Milo Jones. The post, “Competitive intelligence and strategic surprises: Why monitoring weak signals is not the right approach”, looks at the problem of weak signals in competitive intelligence and at how an organisation can hold lots of intelligence and still be surprised.

Silberzahn and Jones point out that it’s not usually the intelligence that is the problem, but the interpretation of the gathered intelligence. This echoed a statement by Issur Harel, the former head of Mossad responsible for capturing the Nazi war criminal Eichmann. Harel was quoted as saying “We do not deal with certainties. The world of intelligence is the world of probabilities. Getting the information is not usually the most difficult task. What is difficult is putting upon it the right interpretation. Analysis is everything.”

In their post, Silberzahn and Jones argue that more important than monitoring weak signals is monitoring one’s own assumptions and hypotheses about what is happening in the environment. They give several examples where weak signals were available but intelligence failures still occurred, and mention three types of failure:

  • Too much information: the problem faced by the US, which had lots of information prior to the Pearl Harbor attack of 7 December 1941;
  • Disinformation, as put out by Osama bin Laden to keep people in a high state of alert – dropping clues that “something was about to happen” when nothing was (and, of course, keeping silent when it was);
  • “Warning fatigue” (the crying-wolf syndrome), where constant repetition of weak signals leads to reinterpretation and discounting of threats, as happened prior to the Yom Kippur War.

Their conclusion is that with too much data you can’t sort the wheat from the chaff, and with too little you make analytical errors. Their solution is that rather than collecting data and subsequently analysing it to uncover its meaning, you should first come up with hypotheses and use these to drive data collection. They quote Peter Drucker (Management: Tasks, Responsibilities, Practices, 1973), who wrote: “Executives who make effective decisions know that one does not start with facts. One starts with opinions… To get the facts first is impossible. There are no facts unless one has a criterion of relevance.” They emphasise that “it is hypotheses that must drive data collection”.

Essentially this is part of the philosophy behind the “Key Intelligence Topic” or KIT process, as articulated by Jan Herring and viewed as a key technique by many competitive intelligence professionals.

I believe that KITs are an important part of CI: it is important to come up with hypotheses on what is happening in the competitive environment, and then test these hypotheses through data collection. However, this should not detract from general competitive monitoring, including the collection of weak signals.

The problem is how to interpret and analyse weak signals. Ignoring them, or even downplaying them, is NOT the solution in my view – and is in fact highly dangerous. Companies with effective intelligence do not get beaten through known problems but through unknown ones. It is the unknown that catches a company by surprise, and often it is the weak signals that, in hindsight, give clues to the unknown. In hindsight their interpretation is obvious; at the time, however, it is often missed, misunderstood, or dismissed as unimportant.

There is an approach to analysing weak signals that can help sort the wheat from the chaff: when you have a collection of weak signals, don’t treat them all the same. Categorise them.

  • Are they about a known target’s capabilities? Put these in box 1.
  • Are they relating to a target’s strategy? These go into box 2.
  • Do they give clues to a target’s goals or drivers? Place these in box 3.
  • Can the weak signal be linked to assumptions about the environment held by the target? These go into box 4.

Anything else goes into box 5. Box 5 holds the real unknowns – unknown target or topic or subject. You have a signal but don’t know what to link it to.
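The five-box triage above can be sketched in code. This is a minimal illustration, not a prescribed implementation: the field names (`target`, `aspect`, `note`), the aspect labels, and the example signals are all hypothetical.

```python
# Sketch of the five-box triage described above (labels are illustrative).
# Each weak signal is a dict; 'aspect' says which facet of a target it touches.

BOXES = {
    "capabilities": 1,  # box 1: a known target's capabilities
    "strategy": 2,      # box 2: a target's strategy
    "goals": 3,         # box 3: a target's goals or drivers
    "assumptions": 4,   # box 4: a target's assumptions about the environment
}

def categorise(signal):
    """Return the box number (1-5) for a weak signal."""
    # Box 5 holds the real unknowns: no known target, or no recognisable aspect.
    if signal.get("target") is None:
        return 5
    return BOXES.get(signal.get("aspect"), 5)

signals = [
    {"target": "RivalCo", "aspect": "strategy", "note": "hiring channel managers"},
    {"target": "RivalCo", "aspect": "capabilities", "note": "new pilot plant"},
    {"target": None, "aspect": None, "note": "unattributed patent filing"},
]

boxed = {}
for s in signals:
    boxed.setdefault(categorise(s), []).append(s)
# boxed now maps box number -> list of signals; the unattributed signal lands in box 5.
```

The point of the sketch is simply that categorisation forces every signal into a box, with box 5 as the explicit catch-all rather than a pile of ignored leftovers.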

First look at boxes 1-4 and compare each item of intelligence with the other information you hold.

  1. Does it fit in? If so, good – you’ve added to the picture.
  2. If it doesn’t, why not?

Consider the source of the information. What’s the chronology? Does the new information suggest a change? If so, what could have caused that change? Compare the other three boxes to see if there is any information that backs up the new signal – using the competitor analysis approach sometimes known as Four Corners analysis – and whether other information helps create a picture or hypothesis of what is happening.

If you find nothing, go back and look at the source.

  • Is it old information masquerading as new? If so, you can probably discount it.
  • Is it a complete anomaly, not fitting in with anything else at all? Consider why the information became available. Essentially this sort of information is similar to what goes into box 5.
    • Could it be disinformation? If so, what is likely to be the truth? Recognising it as disinformation may point towards what is being hidden.
    • Or is it misinformation, which can probably be discounted?
    • What if you can’t tell? Then you have another task: identify other intelligence that would provide further detail and help you evaluate the anomaly. Such weak signals become leads for future intelligence gathering.
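The source checks above amount to a small decision procedure, which can be expressed as a function. Again this is only a sketch: the flag names and outcome labels are illustrative, not part of any established method.

```python
def triage_anomaly(signal):
    """Apply the source checks above to an anomalous signal (illustrative).

    Flags on the signal dict are hypothetical; outcomes map to the bullets:
    discount old or mis-information, invert suspected disinformation,
    and turn undecidable signals into collection leads.
    """
    if signal.get("is_old"):
        return "discount"          # old information masquerading as new
    if signal.get("suspected_disinformation"):
        return "invert"            # ask what truth the disinformation hides
    if signal.get("suspected_misinformation"):
        return "discount"
    return "collection_lead"       # can't tell: a lead for future gathering

# Example: an undated rumour with no other markers becomes a tasking.
outcome = triage_anomaly({"note": "rumoured plant closure"})
print(outcome)  # collection_lead
```

The useful property is that nothing is silently dropped: even the "can't tell" branch produces a concrete next action.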

With box 5, try to work out why the signal is there. (It may be, for example, that you have information but no target to pin it to, so the above steps can’t be applied.) As with anomalies, consider why the information became available. You may need to come up with a number of hypotheses to explain the meaning behind the information. These can sometimes (but not always) be tested.

Silberzahn and Jones mention a problem from Nassim Taleb’s brilliant book “The Black Swan: The Impact of the Highly Improbable”: how do you stop being like a turkey before Thanksgiving? Prior to Thanksgiving the turkey is regularly fed and given lots and lots of food. Life seems good, until the fateful day, just before Thanksgiving, when the food stops and the slaughterer enters to prepare the turkey for the Thanksgiving meal. For the turkey this is a complete surprise, as all the prior evidence suggests that everything is going well. Taleb asks whether a turkey can learn from the events of yesterday what is about to happen tomorrow. Can an unknown future be predicted? In this case, the answer seems to be no.

For an organisation this is a major problem: if it behaves like a turkey, weak signals become irrelevant. The unknown can destroy it, however much information it holds prior to the unforeseen event. As Harel said, the problem is not information but analysis. The wrong analysis means death!

This is where a hypothesis approach comes in – and why hypotheses are needed for competitive intelligence gathering. In the Thanksgiving case, the turkey has lots of consistent information coming in saying “humans provide food”. The key is to look at the source of the information and try to understand it. In other words:

Information: Humans provide food.
Source: observation that humans give food every day – obtained from multiple reliable sources.

You now need to question the reason behind this observation. Why was it available? Come up with hypotheses, test the observations against them, and see which matches. Then choose a strategy based on an assessment of risk. In the turkey’s case there are two potential hypotheses:

  1. “humans like me and so feed me” (i.e. humans are nice)
  2. “humans feed me for some other reason” (i.e. humans may not be nice).

Until other information comes in to justify hypothesis 1, hypothesis 2 is the safer one to adopt: even if hypothesis 1 is true, you won’t get hurt by adopting a strategy predicated on hypothesis 2. (You may not eat as much, and may be called skinny by all the other turkeys near you; however, you are less likely to be killed.)
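This “adopt the hypothesis you can’t be hurt by” reasoning is the classic minimax rule from decision theory: pick the strategy whose worst-case loss is smallest. A tiny sketch makes it concrete; the loss values are invented for illustration, not taken from the post.

```python
# The turkey's choice as a minimax decision (loss values are illustrative).
# Rows: strategy adopted; columns: which hypothesis is actually true.
# Losses: 0 = fine, 1 = go a bit hungry, 100 = get eaten.

losses = {
    "trust_humans":  {"humans_nice": 0, "humans_not_nice": 100},
    "stay_cautious": {"humans_nice": 1, "humans_not_nice": 1},
}

def minimax(losses):
    """Pick the strategy whose worst-case loss is smallest."""
    return min(losses, key=lambda s: max(losses[s].values()))

print(minimax(losses))  # stay_cautious: its worst case is 1, not 100
```

Whatever turns out to be true, the cautious strategy caps the damage – which is exactly the argument for hypothesis 2 above.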

This approach can be taken with anomalous information in general, and used to handle weak signals. The problem then becomes not the analysis of the information but its quantity: with too much information you start to drown and can’t categorise it – it’s a human job, not a computer job. In this case one approach is to apply the above to a random sample of the information, sized according to your confidence needs and the quantity involved. This gets into concepts of sampling theory – which is another topic.
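To give a feel for what “depending on your confidence needs” means in practice, here is the standard sample-size formula for estimating a proportion, with a finite-population correction. The defaults (95% confidence, 5% margin of error, worst-case p = 0.5) are conventional choices, not recommendations from the post.

```python
import math

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Number of signals to review for a given confidence level and margin
    of error (standard proportion-estimation formula with finite-population
    correction; z=1.96 corresponds to 95% confidence)."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

# e.g. from 10,000 collected weak signals, a 95% confidence / 5% margin
# review needs roughly 370 randomly sampled items:
print(sample_size(10_000))  # 370
```

The striking property is how slowly the required sample grows: even a million signals need only a few hundred sampled items at these settings, which is what makes a human review of a random sample feasible at all.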

  1. March 5, 2012 at 9:52 pm

    Congratulations on this note, and thank you so much for quoting us so extensively. There is really not much I can add to this really comprehensive piece. I really like your “box” approach. The only limitation I would see is that the sheer volume of information (or rather data) available would make it, in practice, extremely difficult to implement. Similarly, for the turkey example, simply formulating the question of “why do humans feed me” seems to me extraordinarily difficult ex ante.
    Philippe Silberzahn

    • March 6, 2012 at 1:51 pm

      I agree that formulating the question on “why do humans feed me” is difficult (and not just for turkeys!). That’s why analysis is so important, and also why clever companies still make mistakes. Kodak had a sophisticated competitive intelligence programme and were fully aware of the threat of digital cameras and photography, yet they still failed. Compare Kodak, which embraced digital media, with Polaroid, which sold it off, and you see the difference. Polaroid behaved like a turkey – ignoring and even rejecting the threat. Kodak didn’t, but they still got eaten! In fact, I believe that the problem for Kodak was the wrong hypothesis – which led to the wrong strategy. If you look at Kodak’s latest annual reports you can see this in the way that they waffle on about their strategy (or actually non-strategy). Dick Rumelt’s book “Good Strategy Bad Strategy” is instructive in this case: if you use Rumelt’s guidelines you can see that Kodak had lost it.

      Formulating the right hypothesis does depend on more data, or an element of luck. Not asking a hypothesis at all is the problem. In the turkey case, it would be very easy to assume that what happened yesterday (i.e. I got fed) will continue today and tomorrow. You have no evidence to the contrary, so why assume that things will change?

      For businesses today, however, that should not be the case. There is so much change that assuming things will remain constant is dangerous. When I do training, one question I sometimes ask is “What will the impact of e-books be on the world?” and get attendees to think. When they really think, they see that it impacts education and publishing, but also global politics (as books become more available and easier to distribute / harder to suppress). I then ask about the fate of paper manufacturers. Some say they are doomed. I then ask what happens when / if oil runs out, or if oil prices rise – what will the impact be on plastic? At first they don’t see a connection, until I point out the plastic used in packaging. Then they realise paper can also be a packaging material – and so they realise how things are interconnected and how little is constant in the world.

      As for too much data – that IS a problem. I’m not sure of the solution. Although I intimated that computers aren’t a solution, in the end they probably have to be. Artificial intelligence approaches, and semantic analysis of material should allow categorisation of data. (Not just for CI targets but also to link other data that’s related based on key words). That should help in the data analysis as you would have sets of data. You’d not need to look at everything – just a selection to verify / assess importance. The technology exists or is being developed although I don’t (yet) think it’s being used properly in the CI world. It definitely is in the military world where there is much more data availability through electronic monitoring systems.

