Archive

Posts Tagged ‘competitive analysis’

Analysing weak signals for competitive & marketing intelligence

March 5, 2012

I’ve just read an interesting blog post by Philippe Silberzahn and Milo Jones. Their post, “Competitive intelligence and strategic surprises: Why monitoring weak signals is not the right approach”, looks at the problem of weak signals in competitive intelligence and at why organisations can be taken by surprise even when they hold plenty of intelligence.

Silberzahn and Jones point out that it’s not usually the intelligence that is the problem, but the interpretation of the gathered intelligence. This echoes a statement by Isser Harel, the former head of Mossad responsible for capturing the Nazi war criminal Eichmann. Harel was quoted as saying: “We do not deal with certainties. The world of intelligence is the world of probabilities. Getting the information is not usually the most difficult task. What is difficult is putting upon it the right interpretation. Analysis is everything.”

In their post, Silberzahn and Jones argue that monitoring one’s own assumptions and hypotheses about what is happening in the environment matters more than monitoring for weak signals. They give several examples where weak signals were available but intelligence failures happened anyway. Three different types of failure are mentioned:

  • Too much information: the problem faced by the US, which had plenty of information prior to the Pearl Harbor attack of 7 December 1941;
  • Disinformation, as put out by Osama bin Laden to keep people in a high state of alert – dropping clues that “something was about to happen” when nothing was (and, of course, keeping silent when it was);
  • “Warning fatigue” (the crying-wolf syndrome), where constant repetition of weak signals leads to reinterpretation and discounting of threats, as happened prior to the Yom Kippur War.

Their conclusion is that with too much data you can’t sort the wheat from the chaff, and with too little you make analytical errors. Their solution: rather than collect data and then analyse it to uncover its meaning, first come up with hypotheses and use these to drive data collection. They quote Peter Drucker (Management: Tasks, Responsibilities, Practices, 1973), who wrote: “Executives who make effective decisions know that one does not start with facts. One starts with opinions… To get the facts first is impossible. There are no facts unless one has a criterion of relevance.” They emphasise that “it is hypotheses that must drive data collection”.

Essentially this is part of the philosophy behind the “Key Intelligence Topic” or KIT process, as articulated by Jan Herring and viewed as a core technique by many competitive intelligence professionals.

I believe that KITs are an important part of CI: it is important to come up with hypotheses on what is happening in the competitive environment and then test them through data collection. However, this should not detract from general competitive monitoring, including the collection of weak signals.

The problem is how to interpret and analyse weak signals. Ignoring them, or even downplaying them, is NOT the solution in my view – in fact it is highly dangerous. Companies with effective intelligence do not get beaten or lose out through known problems but through unknown ones. It is the unknown that catches a company by surprise, and often it is the weak signals that, in hindsight, gave clues to the unknown. In hindsight their interpretation is obvious; at the time it is often missed, misunderstood, or dismissed as unimportant.

There is an approach to analysing weak signals that can help sort the wheat from the chaff. When you have a collection of weak signals, don’t treat them all the same. Categorise them:

  • Are they about a known target’s capabilities? Put these in box 1.
  • Do they relate to a target’s strategy? These go into box 2.
  • Do they give clues to a target’s goals or drivers? Place these in box 3.
  • Can the weak signal be linked to assumptions about the environment held by the target? These go into box 4.

Anything else goes into box 5. Box 5 holds the real unknowns – unknown target, topic, or subject. You have a signal but don’t know what to link it to.
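To make this concrete, here is a minimal Python sketch of the five-box triage. The Signal record and its field names are my own invention for illustration – the approach prescribes no particular data model – but the box numbering follows the list above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Signal:
    """A piece of raw intelligence (illustrative structure, not prescribed)."""
    text: str
    target: Optional[str] = None   # known competitor, or None if unknown
    aspect: Optional[str] = None   # "capabilities", "strategy", "goals", "assumptions"

ASPECT_TO_BOX = {
    "capabilities": 1,   # box 1: a known target's capabilities
    "strategy": 2,       # box 2: a target's strategy
    "goals": 3,          # box 3: a target's goals or drivers
    "assumptions": 4,    # box 4: the target's assumptions about the environment
}

def assign_box(signal: Signal) -> int:
    """Return the box (1-5) a weak signal belongs in; box 5 is the real unknowns."""
    if signal.target is None or signal.aspect not in ASPECT_TO_BOX:
        return 5         # nothing to link the signal to
    return ASPECT_TO_BOX[signal.aspect]

signals = [
    Signal("Competitor X recruiting data scientists", target="X", aspect="capabilities"),
    Signal("Unattributed rumour of a price war"),   # no target, so box 5
]
for s in signals:
    print(assign_box(s), "-", s.text)
```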

First look at boxes 1–4 and compare each bit of intelligence to the other information you already hold.

  1. Does it fit in? If so, good – you’ve added to the picture.
  2. If it doesn’t, why not?

Consider the source of the information. What’s the chronology? Does the new information suggest a change? If so, what could have caused that change? Compare the other three boxes to see whether any information there backs up the new signal – using the competitor analysis approach sometimes known as four-corners analysis – and whether, together, the pieces build a picture or hypothesis of what is happening. A sketch of this cross-check follows.
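As an illustration of the cross-box comparison, the following sketch (with invented data and names, again assuming the simple box structure above) gathers everything already filed under the other three boxes for the same target – the raw material for a four-corners-style picture.

```python
from collections import defaultdict

# Boxes 1-4 as lists of (target, summary) pairs; the contents are invented.
boxes = {
    1: [("X", "recruiting data scientists")],                # capabilities
    2: [("X", "shifting to subscription pricing")],          # strategy
    3: [],                                                   # goals / drivers
    4: [("X", "appears to assume regulation will loosen")],  # assumptions
}

def corroboration(target: str, new_box: int) -> dict:
    """Collect signals about the same target from the other three boxes."""
    support = defaultdict(list)
    for box, items in boxes.items():
        if box == new_box:
            continue
        for t, summary in items:
            if t == target:
                support[box].append(summary)
    return dict(support)

# A new strategy signal (box 2) about target X: what elsewhere backs it up?
print(corroboration("X", 2))
# {1: ['recruiting data scientists'], 4: ['appears to assume regulation will loosen']}
```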

If you find nothing, go back and look at the source.

  • Is it old information masquerading as new? If so, you can probably discount it.
  • Is it a complete anomaly, not fitting in with anything else at all? Think about why the information became available. Essentially this sort of information is similar to what goes into box 5.
    • Could it be disinformation? If so, what is likely to be the truth? Recognising it as disinformation may point to what is being hidden.
    • Or is it misinformation – which can probably be discounted?
    • What if you can’t tell? Then you have another task: identify other intelligence that would provide further detail and help you evaluate the anomaly. Such weak signals become leads for future intelligence gathering. (A decision-flow sketch of these questions follows.)
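The decision flow for such anomalies can be summarised in a few lines of code. The function below is only a sketch of the questions above – the verdict names are mine, and real judgements are rarely this binary.

```python
from enum import Enum, auto
from typing import Optional

class Verdict(Enum):
    FITS_PICTURE = auto()             # consistent with what you already hold
    DISCOUNT_OLD = auto()             # old information masquerading as new
    PROBE_DISINFORMATION = auto()     # deliberately planted: ask what it hides
    DISCOUNT_MISINFORMATION = auto()  # accidental error: probably ignore
    COLLECTION_LEAD = auto()          # can't tell: becomes a future collection task

def triage_anomaly(fits_existing_picture: bool,
                   is_genuinely_new: bool,
                   looks_deliberate: Optional[bool]) -> Verdict:
    """looks_deliberate is True/False when you can judge intent, None when you can't."""
    if fits_existing_picture:
        return Verdict.FITS_PICTURE
    if not is_genuinely_new:
        return Verdict.DISCOUNT_OLD
    if looks_deliberate is True:
        return Verdict.PROBE_DISINFORMATION
    if looks_deliberate is False:
        return Verdict.DISCOUNT_MISINFORMATION
    return Verdict.COLLECTION_LEAD

print(triage_anomaly(False, True, None))  # Verdict.COLLECTION_LEAD
```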

With box 5, try to work out why the signal ended up there. (It may be, for example, that you have information but no target to pin it to – so you can’t do the above.) As with anomalies, think about why the information became available. You may need to come up with a number of hypotheses to explain the meaning behind the information. These can sometimes (but not always) be tested.

Silberzahn and Jones mention a problem from Nassim Taleb’s brilliant book “The Black Swan: The Impact of the Highly Improbable”. The problem: how do you stop being like a turkey before Thanksgiving? Prior to Thanksgiving the turkey is regularly fed, with lots and lots of food. Life seems good – until the fateful day, just before Thanksgiving, when the food stops and the slaughterer arrives to prepare the turkey for the Thanksgiving meal. For the turkey this is a complete surprise, as all the evidence up to that point suggests that everything is going well. Taleb asks whether a turkey can learn from the events of yesterday what is about to happen tomorrow. Can an unknown future be predicted? In this case, the answer seems to be no.

For an organisation this is a major problem: if it behaves like the turkey, then weak signals become irrelevant. The unknown can destroy it, however much information it holds prior to the unforeseen event. As Harel said, the problem is not information but analysis. The wrong analysis means death!

This is where a hypothesis approach comes in – and why hypotheses are needed to drive competitive intelligence gathering. In the Thanksgiving case, the turkey has lots of consistent information coming in saying “humans provide food”. The key is to look at the source of the information and try to understand it. In other words:

Information: Humans provide food.
Source: observation that humans give food every day – obtained from multiple reliable sources.

You now need to question why this observation is available and what objectives might lie behind it. Come up with hypotheses, test them against the observations, and see what matches. Then choose a strategy based on an assessment of risk. In the case of the turkey there are two potential hypotheses:

  1. “humans like me and so feed me” (i.e. humans are nice)
  2. “humans feed me for some other reason” (i.e. humans may not be nice).

Until other information comes in to justify hypothesis 1, hypothesis 2 is the safer one to adopt: even if hypothesis 1 is true, you won’t get hurt by a strategy predicated on hypothesis 2. (You may not eat so much, and may be called skinny by all the other turkeys near you; however, you are less likely to be killed.)
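The turkey’s choice is essentially a maximin decision: pick the strategy whose worst case is least bad. A toy illustration, with payoff numbers I have invented purely to mirror the reasoning above:

```python
# Payoffs for the turkey (higher is better); the numbers are invented.
PAYOFF = {
    ("humans_nice", "trust"): 1,         # well fed and, as it happens, safe
    ("humans_nice", "distrust"): 0,      # eats less, called skinny, but safe
    ("humans_not_nice", "trust"): -100,  # Thanksgiving
    ("humans_not_nice", "distrust"): 0,  # wary, and survives
}
STATES = ("humans_nice", "humans_not_nice")

def worst_case(strategy: str) -> int:
    """The worst outcome this strategy can produce across possible truths."""
    return min(PAYOFF[(state, strategy)] for state in STATES)

# Maximin: adopt the strategy with the least-bad worst case.
best = max(("trust", "distrust"), key=worst_case)
print(best)                    # distrust (i.e. hypothesis 2)
print(worst_case("trust"))     # -100
print(worst_case("distrust"))  # 0
```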

This approach can be taken with anomalous information in general, and used to handle weak signals. The problem then becomes not the analysis of information but the quantity. With too much information you start to drown and can’t categorise it – this is a human job, not a computer job. One approach is then to apply the process above to a random sample of the information, with the sample size depending on your confidence needs and the quantity of information. This gets into sampling theory – which is another topic, though the sketch below shows the standard calculation.
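For completeness, here is the standard sample-size calculation for estimating a proportion (the usual n = z²p(1−p)/e² formula, with a finite-population correction). The numbers in the example are arbitrary.

```python
import math

def sample_size(population: int,
                z: float = 1.96,    # 95% confidence
                e: float = 0.05,    # +/- 5% margin of error
                p: float = 0.5) -> int:
    """Sample size for estimating a proportion, with finite-population correction.
    p = 0.5 is the conservative (worst-case variance) choice."""
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# e.g. how many of 5,000 collected weak signals to triage by hand
print(sample_size(5000))   # 357
```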

Gun smuggling, airline security and an intelligence failure

January 25, 2011

The headline article in the London Times of 25 January 2011 (print edition), “Gunrunner Security Fiasco”, reports how a security consultant named Steven Greenoe smuggled numerous weapons into the UK, which were subsequently sold to UK criminals and gangs. At least one gun is known to have been used in a drive-by shooting.

This story raises several issues – not least the problem of airport security and how to ensure passenger safety, both on the ground and in the air. The news appeared to break on the same day that a suicide bomber killed three dozen people in the arrivals hall of a Moscow airport.

I’ve often felt that the current paranoia over airport security is “overkill” (pardon the word choice). When I first started flying it was an adventure; since September 2001 it has become more and more unpleasant. The security checks – although necessary – are becoming increasingly intrusive, yet terrorists and criminals continually find new ways around them. Each time they are caught, new barriers are put in front of the innocent travelling public, to the point where the average traveller is now so nervous that it is almost impossible to differentiate between the genuinely nervous innocent and the person whose nervousness stems from a plan to blow up a plane.

Just as an example of how easy it would be to blow up a plane if you really wanted to, I did some quick research before writing this post. For a few hundred US dollars it is possible to purchase a few grams of a chemical and package it in a way that would not arouse suspicion if taken on a plane. With the addition of further chemicals available to every passenger on board, this could be turned into a bomb capable of causing substantial damage. I’m not going to identify the chemicals, for obvious reasons, and not having tested this I can’t say whether such a bomb would be sufficient to blow a hole in the fuselage. However, videos of the two chemicals in combination are available on the Internet, and the reaction is always highly explosive, completely destroying the reaction container. (One described the reaction of just two grams of a similar, less reactive chemical as like letting off a hand grenade in a bathtub – and the resulting video confirmed this, as the bath was destroyed.)

The point is that if you want to kill and cause mayhem, it is possible. The job of security is to spot the people who are acting suspiciously, or whom intelligence suggests may be up to no good. This is how El Al caught Nezar Hindawi after he persuaded his pregnant girlfriend to carry a bomb onto a plane for him. The girlfriend was innocent and knew nothing about the Semtex hidden inside her suitcase. It was only due to excellent intelligence, prior to her reaching check-in, that a massacre was stopped.

The problem today is that everybody is likely to act suspiciously due to nervousness – making the job of picking out the genuine criminal more difficult. I believe this is the first problem with airline security. The second is the laxness of checks at some smaller airports. Both are intelligence failures. The first adds “noise” to the security problem and relies on staff who just go through procedures rather than apply intelligence skills. The second is potentially worse in that it fails to use intelligence at all, and simply hopes that because the airport is small or regional, the risk will be much lower. Of course, any potential terrorist can spot this from a long way off.

The US has long felt relatively safe so long as the terrorist is kept out. As a result, checks on domestic flights are minimal or ineffective, which means it is relatively easy to pack guns in luggage on a domestic flight that is then transferred to an international one. Part of the problem here is the US obsession with gun ownership as a right (with the right saying that guns don’t kill people – people kill people – while ignoring the fact that guns make it easier for people to kill people). As long as the gun is in stowed luggage there is little incentive to stop the passenger, even if it is detected. Steven Greenoe was reportedly stopped on at least one occasion, but managed to justify himself and was allowed to fly rather than being arrested. (I find it strange that in America, driving at 95 mph or smoking cannabis – both generally less dangerous than owning and using a loaded gun – are more likely to result in a criminal record.)

The Times article mentioned that the gun smuggler concerned, Steven Greenoe, described himself as a security consultant. I did a brief search, and up popped Greenoe’s LinkedIn page. Greenoe describes himself as the CEO of Jolie Rouge (which to me sounds rather like the name of the pirate flag, the Jolly Roger – surely not a coincidence). One part of Jolie Rouge’s business appears to be competitive intelligence, although the company doesn’t actually seem to use this term. Nevertheless, Jolie Rouge Consulting states:

JRC uses public and private sources to unearth information critical to accurately valuing business and financial transactions. JRC uses an established network of legal, political, business, and military thought leaders to rapidly compile up-to-date and difficult-to-acquire information. Our clients use JRC’s oral and written reports to validate and sharpen their investment strategies and long-term business planning.

When I first looked at Greenoe’s profile, he’d listed the Business Strategy & Competitive Strategy forum on his LinkedIn profile. When I next looked, this had disappeared. I don’t know whether Greenoe dropped the group, or the group dropped him, scared of adverse publicity linking a gun runner to competitive strategy. Nevertheless, it highlights how important it is for the competitive intelligence community to police its own and ensure that anybody linked to the profession behaves ethically and morally. (This wouldn’t be the first time: there is a well-known and erudite CI consultant and author who, many years ago, got caught up similarly, causing a scandal still remembered by long-time competitive intelligence professionals.) Gun-running – especially where the guns are then sold on illegally – is a lucrative business. (The guns cost $500 each but were reported to be selling at ten times that amount. That implies the consignment he was arrested over – roughly 80 guns, costing a little over $40,000 – would have grossed around $400,000, netting him about $360,000 profit.)

However, the really odd thing about this news story is the date. Although the reports reached the press today (January 2011), Greenoe was first stopped on 3 May 2010 and arrested in July 2010. I wonder why it took six months for the story to hit the headlines. It’s another example of why care is needed when doing competitive intelligence analysis – what looks like a fresh news story may actually be quite old.

The car that hated vanilla ice cream!

November 1, 2010

I was speaking to a colleague today and he commented that the terrorists who tried to send a bomb from the Yemen to a Chicago synagogue were pretty stupid. His view was that any package sent from the Yemen to a synagogue in the US would be suspect – and so the terrorists had to be stupid.

In competitive intelligence it is important not to make assumptions – and assuming that your competitor is stupid is one of the most dangerous assumptions you can make. It is possible that they are stupid. Alternatively, it is also possible that they see things differently from you – and that their viewpoint is rational and logical from their perspective. Effective competitive intelligence should always involve trying to see things from the perspective of your competitor rather than from your own, possibly subjective and biased, standpoint.

I cannot really understand the rationale of the Yemeni terrorists sending their bomb, presumably intended to blow up en route, addressed to a synagogue. It does seem stupid – but then, I am not an Islamist terrorist. Trying to see things from that perspective, however, I could envisage a conversation such as this:

Terrorist 1: So what address shall we use – something that would not be suspicious?
Terrorist 2: How about a synagogue – the Jews control the USA / the world, so they must get lots of mail. Also, they need to print their subversive material, so they won’t suspect our fake printer cartridges packed with explosives.
Terrorist 1: Good idea – which synagogue?
Terrorist 2: Obama came from Chicago. Let’s find the synagogue that he would take orders from…

Of course, belief in a Jewish world conspiracy is nonsense, as is the idea that President Obama takes orders from a Jewish cabal. But that is not the opinion of large parts of the Muslim world, where many sincerely believe this – and that the 9/11 destruction of the Twin Towers was a Jewish plot, and so on. If that is your world view, then sending suspect packages to a synagogue probably is completely logical and rational, and the best way to ensure they don’t raise suspicion.

The point is that even if your enemy IS stupid, they will act on their own warped rationale. To anticipate their actions you need to try to see things as they see them. This is even more important if you are in fact the one who is wrong – in that case, switching viewpoints should let you spot where your mistakes actually lie.

There is a great story that illustrates this point – that what seems crazy may in fact not be. The story is probably apocryphal – though it may be true.

Several years ago, the Pontiac Division of General Motors received a complaint:

     This is the second time that I have written to you. I don’t blame
     you for not answering my first letter, as I must have sounded crazy.

     In our family we have a tradition of having ice cream for dessert after
     dinner each night. Every night, after we’ve eaten, we vote on which
     kind of ice cream to have – and I drive down to our local store to
     buy it. I recently purchased a new Pontiac, and since then I’ve had a
     problem when I go to the ice cream store. Every time I buy vanilla
     ice cream and go back to my car, it won’t start. If I buy any other
     type, it starts first time. I realise this sounds insane, but it’s true.

     Please help me understand what it is that makes my Pontiac fail
     to start when I purchase vanilla ice cream, yet start easily with
     any other type.

The complaints department was naturally sceptical about this letter. However, it was obviously written by somebody educated, who wrote clearly and lucidly; furthermore, the writer came from an affluent area – and a Pontiac is not a cheap car. They decided to take it seriously, and an engineer was sent to investigate. The engineer arranged to meet the man just after dinner time, and the two drove to the ice cream store. That night the vote had been for vanilla – and, just as the man had said, the car wouldn’t start. Bemused, the engineer returned the following night, and the night after that. The car started first time on both occasions; the votes had been for chocolate the first night and strawberry the second. On the fourth night the choice was again vanilla – and the car failed to start.

The engineer now accepted that there was a real problem needing identification and a fix. He began logging what happened from the moment they arrived at the store: arrival time, time taken to make the purchase, and several other factors. Soon he had a clue – purchases of vanilla ice cream took less time than the other flavours. The reason was that the freezer containing vanilla was at the front of the store, near a quick-purchase till, while the other flavours were at the back and required queueing at the checkout.

The engineer quickly realised that this was the answer – not the ice cream flavour, but the time taken. On the short vanilla trips the engine was still hot enough for a vapour lock to persist, preventing the car from restarting. With the other flavours there was sufficient time for the engine to cool, the vapour to dissipate, and the car to restart.

The moral, of course, is that even if something sounds crazy, it may not be. Competitive intelligence analysts should always bear this in mind when they look at a competitor and fail to understand why it is doing something that seems stupid.