Posts Tagged ‘marketing intelligence’

Analysing weak signals for competitive & marketing intelligence

March 5, 2012

I've just read an interesting blog post by Philippe Silberzahn and Milo Jones. The post, "Competitive intelligence and strategic surprises: Why monitoring weak signals is not the right approach", looked at the problems of weak signals in competitive intelligence, and at how an organisation can still be surprised even when it holds lots of intelligence.

Silberzahn and Jones point out that it is not usually the intelligence that is the problem, but the interpretation of the gathered intelligence. This echoed a statement by Isser Harel, the former head of Mossad responsible for capturing the Nazi war criminal Eichmann. Harel was quoted as saying: "We do not deal with certainties. The world of intelligence is the world of probabilities. Getting the information is not usually the most difficult task. What is difficult is putting upon it the right interpretation. Analysis is everything."

In their post, Silberzahn and Jones argue that monitoring one's own assumptions and hypotheses about what is happening in the environment is more important than monitoring for weak signals. They give several examples where weak signals were available but intelligence failures still occurred. Three different types of failure are mentioned:

  • Too much information: the problem faced by the US, which had plenty of information prior to the Pearl Harbor attack of 7 December 1941;
  • Disinformation, as put out by Osama bin Laden to keep people in a high state of alert – dropping clues that "something was about to happen" when nothing was (and, of course, keeping silent when it was);
  • "Warning fatigue" (the crying-wolf syndrome), where constant repetition of weak signals leads to reinterpretation and discounting of threats, as happened prior to the Yom Kippur War.

Their conclusion is that with too much data you can't sort the wheat from the chaff, and with too little you make analytical errors. Their solution is that, rather than collecting data and subsequently analysing it to uncover its meaning, you should first come up with hypotheses and use these to drive data collection. They quote Peter Drucker (Management: Tasks, Responsibilities, Practices, 1973), who wrote: "Executives who make effective decisions know that one does not start with facts. One starts with opinions… To get the facts first is impossible. There are no facts unless one has a criterion of relevance." They emphasise that "it is hypotheses that must drive data collection".

Essentially this is part of the philosophy behind the "Key Intelligence Topic" or KIT process – as articulated by Jan Herring and viewed as a key CI technique by many competitive intelligence professionals.

I believe that KITs are an important part of CI: it is important to come up with hypotheses on what is happening in the competitive environment, and then to test these hypotheses through data collection. However, this should not detract from general competitive monitoring, including the collection of weak signals.

The problem is how to interpret and analyse weak signals. Ignoring them, or even downplaying them, is NOT the solution in my view – and is in fact highly dangerous. Companies with effective intelligence do not get beaten or lose out because of known problems but because of unknown ones. It is the unknown that catches a company by surprise, and often it is the weak signals that give clues to the unknown. In hindsight, their interpretation is obvious. At the time, however, the interpretation is often missed, misunderstood, or dismissed as unimportant.

There is an approach to analysing weak signals that can help sort the wheat from the chaff. When you have a collection of weak signals, don't treat them all the same. Categorise them:

  • Are they about a known target’s capabilities? Put these in box 1.
  • Are they relating to a target’s strategy? These go into box 2.
  • Do they give clues to a target’s goals or drivers? Place these in box 3.
  • Can the weak signal be linked to assumptions about the environment held by the target? These go into box 4.

Anything else goes into box 5. Box 5 holds the real unknowns – unknown target, topic, or subject. You have a signal but don't know what to link it to.
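
To make the categorisation concrete, here is a minimal sketch in Python. It is an illustration only – the Signal fields and the category labels are my own assumptions, not part of any standard CI toolkit:

    # A sketch of the five-box categorisation of weak signals.
    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class Box(Enum):
        CAPABILITIES = 1  # box 1: a known target's capabilities
        STRATEGY = 2      # box 2: the target's strategy
        DRIVERS = 3       # box 3: the target's goals or drivers
        ASSUMPTIONS = 4   # box 4: the target's assumptions about the environment
        UNKNOWN = 5       # box 5: no target, topic, or subject to link it to

    @dataclass
    class Signal:
        text: str
        target: Optional[str] = None    # None when the signal can't be attributed
        category: Optional[str] = None  # e.g. "capability", "strategy", ...

    def categorise(signal: Signal) -> Box:
        """Place a weak signal into one of the five boxes."""
        if signal.target is None or signal.category is None:
            return Box.UNKNOWN
        return {
            "capability": Box.CAPABILITIES,
            "strategy": Box.STRATEGY,
            "driver": Box.DRIVERS,
            "assumption": Box.ASSUMPTIONS,
        }.get(signal.category, Box.UNKNOWN)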

First look at boxes 1-4 and compare each bit of intelligence to other information.

  1. Does it fit in? If so, good. You've added to the picture.
  2. If it doesn’t, why not?

Consider the source of the information. What's the chronology? Does the new information suggest a change? If so, what could have caused that change? For this, compare the other three boxes for information that backs up the new signal – using the competitor analysis approach sometimes known as four-corners analysis (boxes 1–4 correspond to the four corners: capabilities, strategy, drivers and assumptions) – to build a picture or hypothesis of what is happening.

If you find nothing, go back and look at the source.

  • Is it old information masquerading as new? If so, you can probably discount it.
  • Is it a complete anomaly, not fitting in with anything else at all? Think about why the information became available. Essentially this sort of information is similar to what goes into box 5.
    • Could it be disinformation? If so, what is likely to be the truth? Knowing that it may be disinformation can itself point towards what is being hidden.
    • Or is it misinformation – which can probably be discounted?
    • What if you can't tell? Then it suggests another task: identify other intelligence that would provide further detail and help you evaluate the anomaly. Such weak signals then become leads for future intelligence gathering.

With box 5, try to work out why it is in box 5. (You may have information but no target to pin it to, for example, so you can't do the above.) As with anomalies, think about why the information became available. You may need to come up with a number of hypotheses to explain the meaning behind the information. These can sometimes (but not always) be tested.
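
Pulling the above together, the triage of a signal might look something like the following sketch – again my own framing, with illustrative names rather than a formal method:

    # A sketch of the triage flow for a weak signal.
    from enum import Enum, auto
    from typing import Optional

    class Verdict(Enum):
        FITS_PICTURE = auto()          # consistent with what is already held
        DISCOUNT = auto()              # old information, or mere misinformation
        PROBE_DISINFORMATION = auto()  # ask what truth the disinformation hides
        COLLECT_MORE = auto()          # becomes a lead for future gathering

    def assess(is_new: bool, fits_other_boxes: bool,
               disinformation: Optional[bool]) -> Verdict:
        """Apply the rules above; `disinformation` is True, False
        (i.e. mere misinformation), or None when you can't tell."""
        if not is_new:
            return Verdict.DISCOUNT        # old info masquerading as new
        if fits_other_boxes:
            return Verdict.FITS_PICTURE    # it adds to the picture
        if disinformation is True:
            return Verdict.PROBE_DISINFORMATION
        if disinformation is False:
            return Verdict.DISCOUNT
        return Verdict.COLLECT_MORE        # can't tell: gather more intelligence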

Silberzahn and Jones mention a problem from Nassim Taleb's brilliant book "The Black Swan: The Impact of the Highly Improbable". The problem is how to stop being like a turkey before Thanksgiving. Prior to Thanksgiving the turkey is fed regularly and generously. Life seems good, until the fateful day just before Thanksgiving when the food stops and the slaughterer enters to prepare the turkey for the Thanksgiving meal. For the turkey this is a complete surprise, as all the evidence up to that point suggests that everything is going well. Taleb asks whether a turkey can learn from the events of yesterday what is about to happen tomorrow. Can an unknown future be predicted? In this case, the answer seems to be no.

For an organisation this is a major problem: if it behaves like the turkey, weak signals become irrelevant. The unknown can destroy it, however much information it holds prior to the unforeseen event. As Harel said, the problem is not information but analysis. The wrong analysis means death!

This is where a hypothesis approach comes in – and why hypotheses are needed for competitive intelligence gathering. In the Thanksgiving case, the turkey has lots of consistent information coming in saying "humans provide food". The key is to look at the source of the information and try to understand it. In other words:

Information: Humans provide food.
Source: observation that humans give food every day – obtained from multiple reliable sources.

You now need to question the reason for, or the objectives behind, this observation. Why was this observation available? Come up with hypotheses that can be used to test the observations and see what matches. Then choose a strategy based on an assessment of risk. In the case of the turkey there are two potential hypotheses:

  1. “humans like me and so feed me” (i.e. humans are nice)
  2. “humans feed me for some other reason” (i.e. humans may not be nice).

Until other information comes in to justify hypothesis 1, hypothesis 2 is the safer one to adopt: even if hypothesis 1 is true, you won't get hurt by adopting a strategy predicated on hypothesis 2. (You may not eat as much, and the other turkeys near you may call you skinny. However, you are less likely to be killed.)
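
The turkey's choice can be framed as a toy decision matrix – my construction, not Taleb's – where you pick the strategy whose worst-case outcome is least bad. The payoff numbers are arbitrary:

    # Maximin choice between the two hypotheses: best worst-case outcome.
    payoffs = {
        # strategy -> {true state of the world -> outcome for the turkey}
        "act as if humans are nice":   {"hypothesis 1 true": 1, "hypothesis 2 true": -100},
        "act as if humans may not be": {"hypothesis 1 true": 0, "hypothesis 2 true": -1},
    }

    def safest_strategy(payoffs):
        """Return the strategy with the best worst-case payoff."""
        return max(payoffs, key=lambda s: min(payoffs[s].values()))

    print(safest_strategy(payoffs))  # -> "act as if humans may not be"

Hypothesis 2 wins because its worst case (-1) beats the worst case of trusting the humans (-100) – exactly the reasoning above.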

This approach can be taken with anomalous information in general, and used to handle weak signals. The problem then becomes not the analysis of the information but its quantity. With too much information you start to drown and can't categorise it all – and categorisation is a human job, not a computer job. In this case one approach is to apply the above to a random sample of the information, depending on your confidence needs and the quantity involved. This gets into concepts of sampling theory – which is another topic.
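
For completeness, a minimal sketch of the sampling step – the sample size here is arbitrary and would in practice come from the sampling theory just mentioned:

    # Triage a random sample when there are too many signals to handle by hand.
    import random

    def sample_for_triage(signals, k=50, seed=None):
        """Return up to k randomly chosen signals for manual categorisation."""
        rng = random.Random(seed)
        return rng.sample(signals, min(k, len(signals)))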

Thinking Hats

August 7, 2007
This entry was prompted by a comment (a critique) on Jon Lowder's CI blog, pointing out that I don't publish very often. I could try to make excuses (work, laziness, inability – delete whichever is not applicable). However I won't, as I think the complaint is totally justified. In fact I tend to post in spurts – publishing when I get ideas. I'd prefer to blog something that fulfils the aims I have for this blog than just use it for a stream of consciousness – much of which would be just a way of asserting my ego. So thank you, Jon, for the prompt to think!

First, a couple of comments on Jon's blog, if you've never read it. He has some great tips, which I firmly second. For example, recent posts mention the uses of LinkedIn in CI. I've been a LinkedIn user for some time and have found it invaluable as a source for potential contacts. I've also signed up with other networking groups, although my network is smaller on these – Xing, Ecademy, etc. Also, don't ignore Facebook and MySpace. A lot of companies have signed up for pages on these networking sites, and you never know who or what you might find that could help with a project.

Jon mentions a new LinkedIn feature – the ability to ask questions and get answers from other users – as a strength of the service. Potentially it could be, although I felt the answers given were poor. I think a better service for answering questions is the FreePint Bar, which has a circulation approaching 100,000 expert searchers who answer questions on a massive range of topics – many of which are relevant for competitive intelligence professionals. (As an example, recent posts have looked at international tax comparisons, media monitoring, Swiss, Austrian & German company shareholders, and Russian export regulations.)

In the example Jon highlighted, half the answers suggested Hitwise. This is a great service, but I'm not sure it is the right solution for the questioner, from the bank JPMorgan Chase, who was looking for competitive intelligence vendors for paid search – asking "Is CI effective in search?". None of the answers took into account the questioner's origins in financial services – or asked what he meant by his question about whether CI was effective in search.

What Hitwise offers is a service giving customers knowledge of how Internet users interact with web-sites – your own and your competitors'. You can use it to compare how your site is performing against competitor sites – and if this is what was wanted, then Hitwise would be a good solution. However, Hitwise's strength is not really B2B web-sites, as these generally receive much less traffic than the consumer web-sites at which the Hitwise service is best aimed. If what was wanted were vendors who are experts at secondary Internet search, then Hitwise would not be the correct solution – members of the Association of Independent Information Professionals (www.aiip.org) would have been a better bet, as most are experts at searching the Internet and other databases, and many, including us at AWARE, specialise in competitive intelligence.

In fact, another interpretation of the question is completely different and takes into account both the nature of the questioner and the medium where the question was posed. LinkedIn attracts a lot of recruiters and recruitment agencies, who use it to look for candidates. Search is sometimes used in this context, so the question could have related to this, i.e. "Is CI effective in recruitment searching?" If this was what the questioner really wanted, then none of the eight responses was satisfactory.

This highlights a lesson for all competitive intelligence professionals – you need to know, for each research request:

  • who is actually asking the question (i.e. you are asked a question by your boss, but this is because his or her boss has asked them a question – are the two questions the same, or has something been lost in transmission?),
  • why they are asking it,
  • what they are really looking to achieve with the answer.
Only then can you really answer the question. It's a matter of putting on your thinking hat to get behind the often easy-looking question.

In fact, if you really want to study a problem it’s not one thinking hat that should be used but six! This idea comes from the work of Edward de Bono – and should be a key element of all competitive intelligence analytical approaches. Essentially every problem for which a decision is required should be looked at in six ways:

  1. Neutral: focusing on the data available, knowledge gaps, past trends and extrapolations from historical data. Unfortunately this is where a lot of CI people stop in their analyses – and just present the neutral view. This is rarely the full answer that the decision maker needs.
  2. Self-opinionated / emotionally: how will your customer react to the response you are giving him or her? Does your work answer the question they’ve posed – not the surface question, but the underlying driver that led to the question? You need to use intuition and your emotional instincts to look at the problem with this approach. What are the emotions involved? How will people respond to your research when they’ve not been through the process or followed the reasoning you took to reach the answer?
  3. Judgmentally: what are the bad points or weaknesses in your work or the decision suggested? What could go wrong? Be cautious and risk-averse. This approach lets you prepare for the worst, think of alternative options, and create contingency plans in case things don't work as expected.
  4. Positively: now look at the good points and the benefits that will result from any decision. Even if everything looks like a disaster, trying to see the positive can help find a way out of the mess. It can also help show the value in the decision – in a way that may not be immediately obvious.
  5. Creatively: brainstorm a bit. Try and think beyond the problem for alternative solutions or approaches. Don’t criticise any ideas – just go with the flow. This approach allows you to come up with further suggestions and ideas that could add increased value to what you are suggesting. More importantly they show that you’ve really considered all aspects of the problem.
  6. Take an overview of the other five approaches: this final approach evaluates the other five views, synthesising the responses into a single coherent, balanced position. If there are too few alternatives, it may be time to go back to the creative approach. If everything looks perfect, then be really judgmental and see if you can come up with anything wrong at all – just in case some gremlin was missed. If everything looks bad, go back to the positive approach and look for anything salvageable.
Answering problems and coming to decisions using de Bono's Six Thinking Hats technique will result in better solutions and safer, more resilient and robust decisions – avoiding potential disasters while letting you feel more confident about the actions you commit to.
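
For those who like checklists, the six approaches can also be run as a simple structured review. The sketch below is my own framing; the colour labels are de Bono's standard hat colours mapped to the six views above:

    # Run a problem through the six thinking hats as a structured checklist.
    HATS = {
        "white (neutral)":    "What does the data say? Where are the gaps?",
        "red (emotional)":    "How will the decision maker react to this?",
        "black (judgmental)": "What could go wrong? What are the weaknesses?",
        "yellow (positive)":  "What are the benefits, even in a bad scenario?",
        "green (creative)":   "What alternative solutions haven't we considered?",
        "blue (overview)":    "Do the other five views cohere into one position?",
    }

    def review(problem, answer):
        """`answer` is any callable returning a response to a prompt,
        e.g. `input` at the console or notes collected in a workshop."""
        return {hat: answer(f"[{hat}] {prompt} -- {problem}")
                for hat, prompt in HATS.items()}

Calling review("Should we enter market X?", input) would, for example, walk you through the six prompts at the console.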

Online Again!

December 2, 2005

Yesterday was the first day of the International Online Information Conference and Exhibition – the premier (well I think so anyway) trade show for those interested in anything to do with online information.

Whether you are interested in competitive intelligence, scientific information, history or knowledge management – or even just chilling out with some really great people – you'd find something to keep you interested, amused, or just full up with chocolate. (Yes, lots of stands were giving out free chocolates – which means this is one show you should skip if you subscribe to Chocoholics Anonymous.)

My day started with the annual AIIP breakfast – sponsored by the Thomson Organisation (yes, even corporates can be altruistic sometimes!) – and speaking to old (and new) friends within the AIIP community. (You don't know what AIIP is – and you call yourself an information professional? Go this minute to their web-site (www.aiip.org) and sign up – or, if you are not independent, find out how you can improve your research efforts by using some of the world's best searchers.)

And then to the day's keynote speaker: David Weinberger (for more on David, visit his site at www.evident.com or his blog at www.johotheblog.com). Unfortunately I spent too much time chatting at the breakfast and so missed the start of David's talk. However, what I heard was enough to make me realise how much further things will go in the information-using industries (and isn't that all industries?). He highlighted how blogs and wikis are changing the way people perceive information, contrasting corporate web-sites with the newer collaborative models. He suggested that corporate sites tend to be narcissistic, in that they are self-referencing, with links that only point to other parts of the same web-site or, sometimes, to paid advertisements. Compare this to blogs, which invite the reader to explore outside and visit other sites. Rather than focusing on sticky eyeballs and making sites sticky (whatever that means – I've yet to see anybody attach their eyes to a sticky screen showing some cool web-site!), bloggers have enough confidence in their content to know that readers will return for more – after they've visited the links of interest.

The impact of such collaborative approaches is sure to grow – just consider the number of entries on Wikipedia compared to something more traditional, such as the Encyclopedia Britannica. Wikipedia has more entries – many of them highly eclectic, showing the range of information that people view as interesting or important. The Britannica is more staid, serious, and tied to older ways of sharing knowledge. As a result it can't keep up with the dynamism of Wikipedia. (Could you imagine an entry such as the Wikipedia one for deep-fried Mars bars in the Britannica? This was one of several examples given by Weinberger.)

Apparently Weinberger has given a similar talk before, which was turned into a podcast. So if you missed the talk at Online, it is available for downloading at the Everything is Miscellaneous link on Paidcontent.org. (Thanks to Marydee Ojala for this – Marydee, apart from producing a great blog at InfotodayBlog, is the editor of Online Magazine.)

And then to my session. I spoke for 30 minutes on using online tools to find competitive intelligence that can help identify opportunities and threats. Obviously you can't do more than an overview of such a vast topic in 30 minutes, but I tried: a brief overview of competitor, customer and similar monitoring using selected online tools, then RSS feeds as a way of keeping up to date, and finally selected futurist sites for anticipating the future – e.g. the Global Business Network (led by Peter Schwartz, author of the excellent The Art of the Long View) and Shaping Tomorrow. (The Art of the Long View is my favourite scenario planning/futures studies book – I list several more on my web pages at www.marketing-intelligence.co.uk/resources/books.htm. OK, I know that is a plug for my site, but this is my blog, so tough – live with it!)

In the afternoon I found time for two sessions on searching, featuring luminaries from both the UK and across the pond in the US: Chris Sherman (of Search Engine Watch); Karen Blakeman of RBA Information Services – one of the top UK-based information search services; Amelia Kassel of MarketingBase, who had joined me a short while earlier as co-leader of a round-table session on competitive intelligence with an international audience from the UK, US, Europe, Egypt…; Mary Ellen Bates of Bates Information Services; the UK's own Phil Bradley; and the aforementioned Marydee Ojala. Could you ask for more?

If all that wasn't enough for one day, I finished off by joining Will Hann of FreePint (the information professional community site – if you don't know FreePint, then this is another one to visit and bookmark now) and friends for after-show drinks and snacks. A great day to start a great show. Today – Wednesday – will finish with the International Online Awards dinner, but before then there will be some more great sessions.

And the show goes on (until Thursday – 1st December 2005, that is!)