How often have you heard something – and not questioned it, as you don’t want to appear stupid, foolish or ignorant?
Too often people accept what they are told and don’t question information. In educational environments this leads to a failure to learn. In business environments, it leads to bad decisions and bad strategy. Received wisdom becomes the operating principle rather than reality – especially when things have changed or are changing.
The reason people don’t question is that they don’t want to look foolish in front of peers, bosses or employees. Rather than highlight something that doesn’t make sense, they prefer to keep quiet so as not to appear stupid. The term for this is “pluralistic ignorance”. It is especially a problem in cultures where “losing face” is an issue. (I wrote about this almost two years ago – see Competitive Intelligence & Culture.) In such cultures, employees find it difficult to question superiors – there is almost a belief that superiors hold their positions because they know more and are better.
“Pluralistic ignorance” is a phenomenon that prevents people from questioning, when they fail to understand something or disagree with an issue, because each feels they are the only one not understanding or agreeing. It leads to “group-think”, whereby a group of people fail to face up to their lack of knowledge, or fail to address false or inaccurate information, because they don’t wish to appear foolish by questioning it.
In business it is important to emphasise communication and openness at all levels – and to encourage questioning. This is especially key for effective competitive intelligence, but pluralistic ignorance can be just as much a problem in CI as in other corporate areas if CI people aren’t looking out for it. For example, in CI there is the risk that a key piece of intelligence is missed because the person who has it (perhaps a sales rep) doesn’t pass it on. They assume the CI team, or senior management, must already know it – and so they don’t want to look stupid by passing it on.
The solution appears easy – build a corporate culture that rewards those who share information, even if it is already known. The difficulty is that such openness often conflicts with other aspects of the corporation, including hierarchy – where information must pass up the chain of command. This leads to problems: the person at the bottom passes information to their superior, who then qualifies it (exaggerating good news and softening bad news) before passing it up – and by the time it reaches the actual decision-maker, the information has been so transformed as to become meaningless and often false.
An example of how pluralistic ignorance works can be seen in this video of a college lecture. This brief (5 minute) video is the first in a course on behavioural economics. The lecturer, Dan Ariely of Duke University Business School (and TED speaker), is aware of the problem and halfway through this lecture shows how it works.
Every morning at around 7.45am, BBC Radio 4 includes a short talk from a religious figure giving listeners a thought to ponder. The daily “Thought for the Day” is given by Christian priests and vicars, rabbis, imams and others.
This morning’s programme (25 February 2013) featured Dr Giles Fraser, priest-in-charge of St Mary’s, Newington. Fraser spoke about Jesus and pointed out that the Western world’s perceptions of what he looked like are likely to be wrong. He referred to classical paintings of Jesus and contrasted these with depictions of Judas. Jesus is often blond, while Judas tends to look much more swarthy, with a longer nose and red or dark hair. Jesus has become an archetypal North European, while Judas reflects stereotypes of how Jews are supposed to look. Of course Jesus was Jewish – born and raised in what is now Israel. So was Judas. Both would have had Semitic physiognomies.
Fraser’s point however has further implications. There is a tendency to put our own preconceptions and views onto others – and expect others to behave and think like we do. In a business context, this can be fatal as it means we see competitors as just reflections of ourselves. When a competitor comes up with something that appears odd, or that we don’t understand, the inclination is to say that the competitor has it wrong – rather than that we have it wrong, which could just as easily be the situation. This error is a classic type of blind spot.
Myers-Briggs Type Indicators
One part in Fraser’s short talk caught my attention. While he was studying to become a priest, he was taught about Myers-Briggs Type Indicators based on work by Carl Jung. Fraser commented that both he and his fellow trainee priests were asked to assess the personality type of Jesus based on what they knew and had learned about him. They were then assessed using the Myers-Briggs test. Most found that the personality type they had given to Jesus was actually a reflection of their own type.
The implication is that people have a tendency to project their own expectations and prejudices onto others – and to judge them accordingly.
In business recruitment, this can mean choosing a candidate who, rather than bringing something fresh to the business, just continues the same old approach. Although this may avoid conflict, it also means that the chance for new, innovative thinking – and the ability to change or challenge current norms – is lost. There is a real risk that recruiting clones leads to the business stultifying and failing to recognise new opportunities and threats.
In research interviewing, any attempt to profile an individual remotely is foolhardy and a key source of interviewer bias, resulting in flawed interviews and erroneous conclusions riddled with misconceptions. Yet there are interviewers who claim to be so expert at psychometric evaluation that they can assess an interviewee within minutes – even though the published Myers-Briggs tests involve dozens of questions that must be answered before an assessment can be made.
In business analysis it can lead to a potentially more serious problem. Some analysts pride themselves on their ability to identify the personality type of business or political leaders without meeting them and with minimal information. Unless there is a vast quantity of information available on an individual – speeches, TV and radio interviews, published articles and opinion pieces, etc. – it is risky to extrapolate and anticipate their behaviour remotely. The danger is that the analyst projects their own typology onto the leader – judging them by reported actions without necessarily understanding the thought processes that lie behind those actions, or even the accuracy of the reporting. Any assessment then rests on prejudice rather than reality, and so leads to poor decisions.
Business research and analysis should depend on accurate and rigorous methodologies, not pop-psychology. Myers-Briggs can be useful when backed by sufficient data, but it should be viewed as an analysis tool requiring detailed insight into the subject. Using it, or similar psychometric approaches, as a basis for complex business decision-making without the full data the process demands is another route to business failure. Treat these tools with care – and treat their advocates even more carefully.
Dyson has discovered a spy for its German rival Bosch working in its high-security inventing department in Malmesbury.
One of the basic principles of business strategy is that competitive advantage comes from differentiating yourself from competitors. This comes from either improving processes or improving products – cost leadership or product/service differentiation.
Competitive Advantage cannot come from a follower-strategy. It comes from proving to customers that you are different and offer something that competitors don’t have.
Copying competitors does not do this – it shows a lack of ideas and a lack of creative, innovative strategy. For a company like Bosch, known for engineering excellence, resorting to corporate espionage – and being suspected of wanting to use another company’s ideas – says that Bosch is in deep trouble.
There is a big difference between competitive intelligence and corporate espionage. Competitive intelligence aims to understand everything about the competitive environment – and why customers choose one company in preference to another. It can also try to understand what a competitor aims to do next – so that clear lines can be drawn between companies. Espionage does something different. It says “we want to do the same as you and want to know your secrets”. That’s a strategy failure – and wrong!
See on www.telegraph.co.uk
I’ve just read an interesting blog post by Philippe Silberzahn and Milo Jones. The post “Competitive intelligence and strategic surprises: Why monitoring weak signals is not the right approach” looked at the problems of weak signals in competitive intelligence and how even though an organisation may have lots of intelligence, they still get surprised.
Silberzahn and Jones point out that it’s not usually the intelligence that is the problem, but the interpretation of the gathered intelligence. This echoes a statement by Isser Harel, the former head of Mossad responsible for capturing the Nazi war criminal Eichmann. Harel was quoted as saying: “We do not deal with certainties. The world of intelligence is the world of probabilities. Getting the information is not usually the most difficult task. What is difficult is putting upon it the right interpretation. Analysis is everything.”
In their post, Silberzahn and Jones argue that more important than monitoring for weak signals, is the need to monitor one’s own assumptions and hypotheses about what is happening in the environment. They give several examples where weak signals were available but still resulted in intelligence failures. Three different types of failure are mentioned:
- Too much information: the problem faced by the US, which had lots of information prior to the Pearl Harbour attack of 7 December 1941,
- Disinformation, as put out by Osama bin Laden to keep people in a high state of alert – dropping clues that “something was about to happen” when nothing was (and, of course, keeping silent when it was),
- “Warning fatigue” (the crying-wolf syndrome), where constant repetition of weak signals leads to reinterpretation and discounting of threats, as happened prior to the Yom Kippur War.
Their conclusion is that with too much data you can’t sort the wheat from the chaff, and with too little you make analytical errors. Their solution is that, rather than collect data and subsequently analyse it to uncover its meaning, you should first come up with hypotheses and use them to drive data collection. They quote Peter Drucker (Management: Tasks, Responsibilities, Practices, 1973), who wrote: “Executives who make effective decisions know that one does not start with facts. One starts with opinions… To get the facts first is impossible. There are no facts unless one has a criterion of relevance.” They emphasise that “it is hypotheses that must drive data collection”.
Essentially this is part of the philosophy behind the “Key Intelligence Topic” or KIT process – as articulated by Jan Herring and viewed as a key CI technique by many Competitive Intelligence Professionals.
I believe that KITs are an important part of CI, and it is important to come up with hypotheses on what is happening in the competitive environment, and then test these hypotheses through data collection. However this should not detract from general competitive monitoring, including the collection of weak signals.
The problem is how to interpret and analyse weak signals. Ignoring them, or even downplaying them, is NOT the solution in my view – in fact it is highly dangerous. Companies with effective intelligence do not get beaten through known problems but through unknown ones. It’s the unknown that catches the company by surprise, and often it is the weak signals that, in hindsight, give clues to the unknown. In hindsight, their interpretation is obvious. At the time, however, the interpretation is often missed, misunderstood or dismissed as unimportant.
There is an approach to analysing weak signals that can help sort the wheat from the chaff. When you have a collection of weak signals don’t treat them all the same. Categorise them.
- Are they about a known target’s capabilities? Put these in box 1.
- Are they relating to a target’s strategy? These go into box 2.
- Do they give clues to a target’s goals or drivers? Place these in box 3.
- Can the weak signal be linked to assumptions about the environment held by the target? These go into box 4.
Anything else goes into box 5. Box 5 holds the real unknowns – unknown target or topic or subject. You have a signal but don’t know what to link it to.
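The five-box categorisation above can be sketched in code. This is only an illustrative sketch: the `target`/`topic` fields and the keyword-to-box mapping are my assumptions about how signals might be labelled, not part of the method as described.

```python
# Sketch of the five-box categorisation of weak signals.
# Box 1: capabilities, 2: strategy, 3: goals/drivers,
# 4: target's assumptions about the environment, 5: real unknowns.

def categorise(signal: dict) -> int:
    """Assign a weak signal to one of the five boxes.

    Assumes each signal carries a `target` (or None if unattributed)
    and a `topic` label. Anything without a known target, or with an
    unrecognised topic, falls into box 5.
    """
    if signal.get("target") is None:
        return 5  # no target to pin the signal to
    topic_to_box = {
        "capabilities": 1,
        "strategy": 2,
        "goals": 3,
        "drivers": 3,
        "assumptions": 4,
    }
    return topic_to_box.get(signal.get("topic"), 5)

# Hypothetical example signals for illustration only.
signals = [
    {"target": "CompetitorX", "topic": "strategy",
     "text": "new pricing model rumoured"},
    {"target": "CompetitorX", "topic": "capabilities",
     "text": "hiring specialist engineers"},
    {"target": None, "topic": None,
     "text": "unattributed patent filing"},
]
for s in signals:
    print(categorise(s), s["text"])
```

The point of the sketch is the triage, not the keyword matching – in practice the labelling is analyst judgement, and box 5 is deliberately the catch-all for anything that cannot yet be linked to a target or topic.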
First look at boxes 1-4 and compare each bit of intelligence to other information.
- Does it fit in? If so good. You’ve added to the picture.
- If it doesn’t, why not?
Consider the source of the information you have. What’s the chronology? Does the new information suggest a change? If so, what could have caused that change? To answer this, compare the other three boxes to see if any information backs up the new signal – using the competitor analysis approach sometimes known as 4-corners analysis – and whether, together, the information creates a picture or hypothesis of what is happening.
If you find nothing, go back and look at the source.
- Is it old information masquerading as new? If so, you can probably discount it.
- Is it a complete anomaly – not fitting in with anything else at all? Think why the information became available. Essentially this sort of information is similar to what goes into box 5.
- Could it be disinformation? If so, what is likely to be the truth? Recognising disinformation may point to what is being hidden.
- Or is it misinformation – which can probably be discounted?
- What if you can’t tell? Then it suggests another task – to try to identify other intelligence that would provide further detail and help you evaluate the anomaly. Such weak signals then become leads for future intelligence gathering.
With box 5, try to work out why the signal ended up there. (It may be, for example, that you have information but no target to pin it to – so you can’t do the above.) As with anomalies, think why the information became available. You may need to come up with a number of hypotheses to explain the meaning behind the information. These can sometimes (but not always) be tested.
Silberzahn and Jones mention a problem from Nassim Taleb’s brilliant book “The Black Swan: The Impact of the Highly Improbable”. The problem is how to avoid being like a turkey before Thanksgiving. Prior to Thanksgiving the turkey is regularly fed and given lots and lots of food. Life seems good – until the fateful day, just before Thanksgiving, when the food stops and the slaughterer enters to prepare the turkey for the Thanksgiving meal. For the turkey this is a complete surprise, as all the evidence up to that point suggests that everything is going well. Taleb poses the question of whether a turkey can learn from the events of yesterday what is about to happen tomorrow. Can an unknown future be predicted? In this case, the answer seems to be no.
For an organisation this is a major problem: if it behaves like the turkey, weak signals become irrelevant. The unknown can destroy it, however much information it holds prior to the unforeseen event. As Harel said, the problem is not information but analysis. The wrong analysis means death!
This is where a hypothesis approach comes in – and why hypotheses are needed for competitive intelligence gathering. In the Thanksgiving case, the turkey has lots of consistent information coming in saying “humans provide food”. The key is to look at the source of the information and try to understand it. In other words:
Information: Humans provide food.
Source: observation that humans give food every day – obtained from multiple reliable sources.
You now need to question the reason behind this observation – why was it available? Come up with hypotheses that could explain the observation and see which match. Then choose a strategy based on an assessment of risk. In the case of the turkey there are two potential hypotheses:
- “humans like me and so feed me” (i.e. humans are nice)
- “humans feed me for some other reason” (i.e. humans may not be nice).
Until other information comes in to justify hypothesis 1, hypothesis 2 is the safer one to adopt: even if hypothesis 1 is true, you won’t get hurt by adopting a strategy predicated on hypothesis 2. (You may not eat as much, and may be called skinny by the other turkeys near you. However, you are less likely to be killed.)
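The risk logic here is essentially a worst-case (maximin) choice, and can be sketched as follows. The payoff numbers are illustrative assumptions of mine, not from the text – the only thing that matters is their ordering.

```python
# Sketch of the turkey's strategy choice as a maximin decision.
# Rows: strategy adopted; columns: which hypothesis turns out to be true.
# Payoffs are illustrative (higher is better).
payoff = {
    ("trust_humans", "humans_nice"):      1,   # well fed, safe
    ("trust_humans", "humans_not_nice"): -10,  # well fed, then slaughtered
    ("be_wary",      "humans_nice"):      0,   # a little hungry, safe
    ("be_wary",      "humans_not_nice"):  0,   # a little hungry, safe
}

def worst_case(strategy: str) -> int:
    """Worst outcome of a strategy across all hypotheses."""
    return min(v for (s, h), v in payoff.items() if s == strategy)

# Pick the strategy whose worst case is least bad (maximin).
best = max(["trust_humans", "be_wary"], key=worst_case)
print(best)  # be_wary
```

This is just the prose argument made explicit: being wary caps the downside at "a little hungry", while trusting leaves open the catastrophic outcome, so the maximin rule picks wariness regardless of which hypothesis is actually true.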
This approach can be taken with anomalous information in general, and used to handle weak signals. The problem then becomes not the analysis of the information but its quantity: with too much information you start to drown and can’t categorise it – it’s a human job, not a computer job. In this case one approach is to apply the above to a random sample of the information, with the sample size depending on your confidence needs and the quantity of information. This gets into sampling theory – which is another topic.
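The sampling idea can be sketched with the standard library. The sizing rule here (roughly 10% of the pool, with a floor of 30) is an illustrative rule of thumb of my own, not a recommendation from the text – proper sample sizing is the sampling-theory question the post defers.

```python
import random

def sample_for_review(signals, fraction=0.1, minimum=30, seed=None):
    """Draw a random subset of weak signals for manual categorisation.

    `fraction` and `minimum` are illustrative defaults; the sample is
    drawn without replacement and capped at the size of the pool.
    """
    rng = random.Random(seed)  # seedable for reproducible review batches
    k = min(len(signals), max(minimum, int(len(signals) * fraction)))
    return rng.sample(signals, k)

# Hypothetical pool of 500 collected signals.
signals = [f"signal-{i}" for i in range(500)]
batch = sample_for_review(signals, seed=42)
print(len(batch))  # 50
```

Each signal in the sampled batch would then go through the five-box categorisation by hand, with the sample size adjusted to the confidence needed.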