Archive

Posts Tagged ‘Competitive Intelligence’

Pluralistic Ignorance

May 13, 2013

How often have you heard something – and not questioned it, as you don’t want to appear stupid, foolish or ignorant?

Too often people accept what they are told and don’t question information. In educational environments this leads to a failure to learn. In business environments, it leads to bad decisions and bad strategy. Received wisdom becomes the operating principle rather than reality – especially when things have changed or are changing.

The reason people don’t question is that they don’t want to look foolish in front of peers, bosses or employees. Rather than highlight something that doesn’t make sense, they prefer to keep quiet so as not to appear stupid. The term for this is “pluralistic ignorance”. It is especially a problem in cultures where “losing face” is an issue. (I wrote about this almost two years ago – see Competitive Intelligence & Culture.) In such cultures, employees find it difficult to question superiors – there is almost a belief that superiors are in their position because they know more and are better.

“Pluralistic ignorance” is a phenomenon that stops people from questioning when they fail to understand something, or when they disagree, because each feels they are the only one not understanding or agreeing. It leads to “group-think”, whereby a group of people fail to face up to their lack of knowledge, or to address false or inaccurate information, because they don’t wish to appear foolish by questioning it.

In business it is important to emphasise communication and openness at all levels – and to encourage questioning. This is especially key for effective competitive intelligence, but it can be just as much a problem in CI as in other corporate areas if CI people aren’t looking out for it. For example, there is the risk that a key piece of intelligence is missed because the person who holds it (perhaps a sales rep) doesn’t pass it on: they are sure the CI team or senior management must already know it, and don’t want to look stupid by repeating it.

The solution appears easy – build a corporate culture that rewards those who share information, even if it is already known. The difficulty is that such openness often contradicts other aspects of the corporation, including its hierarchy – where information must be passed up a chain of command. This leads to problems: the person at the bottom passes information to their superior, who qualifies it (exaggerating good news and softening bad news) before passing it up – and by the time it reaches the actual decision-maker the information has been so transformed as to become meaningless and often false.

An example of how pluralistic ignorance works can be seen in this video of a college lecture. This brief (5 minute) video is the first in a course on behavioural economics. The lecturer, Dan Ariely of Duke University Business School (and TED speaker), is aware of the problem and halfway through this lecture shows how it works.

https://www.youtube.com/watch?v=-9wHttUayMo

Testing perceptions – Myers-Briggs and false appearances

February 25, 2013

Every morning at around 7.45am, BBC Radio 4 includes a short talk from a religious figure giving listeners a thought to ponder. The daily “Thought for the Day” is given by Christian priests and vicars, Rabbis, Imams and others.

The Last Supper (Hans Holbein the Younger, 1524)

This morning’s programme (25 February 2013) featured Dr Giles Fraser, priest-in-charge of St Mary’s, Newington. Fraser spoke about Jesus and pointed out that the Western world’s perceptions of what he looked like are likely to be wrong. He referred to classical paintings of Jesus and contrasted these with depictions of Judas. Jesus is often blond, while Judas tends to be much more swarthy-looking, with a longer nose and red or dark hair. Jesus has become an archetypal North European, while Judas reflects stereotypes of how Jews are supposed to look. Of course, Jesus was Jewish – born and living in what is now Israel – and so was Judas. Both would have had Semitic physiognomies.

Fraser’s point however has further implications. There is a tendency to put our own preconceptions and views onto others – and expect others to behave and think like we do. In a business context, this can be fatal as it means we see competitors as just reflections of ourselves. When a competitor comes up with something that appears odd, or that we don’t understand, the inclination is to say that the competitor has it wrong – rather than that we have it wrong, which could just as easily be the situation. This error is a classic type of blind spot.

Myers-Briggs Type Indicators

One part of Fraser’s short talk caught my attention. While he was studying to become a priest, he was taught about Myers-Briggs Type Indicators, based on the work of Carl Jung. Fraser commented that he and his fellow trainee priests were asked to assess the personality type of Jesus based on what they knew and had learned about him. They were then assessed using the Myers-Briggs test. Most found that the personality type they had given to Jesus was actually a reflection of their own type.

The implication is that people tend to project their own expectations and prejudices onto others – and judge them accordingly.

Myers-Briggs test form

The Jungian Briggs Myers 16-Types Personality Test (JBM16) is designed to measure how you like to look at the world and make decisions.

In business recruitment, this can mean choosing a candidate who, rather than bringing something fresh to the business, just continues the same old approach. Although this may avoid conflict, it also means that the chance for new, innovative thinking – and an ability to change or challenge current norms – is lost. There is a real risk that recruiting clones leads to the business stultifying and failing to recognise new opportunities and threats.

In research interviewing, any attempt to profile an individual remotely is foolhardy and a key source of interviewer bias, resulting in flawed interviews and erroneous conclusions riddled with misconceptions. Yet there are interviewers who claim to be so expert at such psychometric evaluations that they can assess an interviewee within minutes – even though the published Myers-Briggs tests involve dozens of questions that need to be answered before an assessment can be made.

In business analysis it can lead to a potentially more serious problem. Some analysts pride themselves on their ability to identify the personality type of business or political leaders without meeting them and with minimal information. Unless there is a vast quantity of information available on an individual – speeches, TV and radio interviews, published articles and opinion pieces, etc. – it is risky to extrapolate and anticipate their behaviour remotely. The danger is that the analyst projects their own typology onto the leader – judging them by reported actions without necessarily understanding the thought processes that lay behind those actions, or even the accuracy of the reporting. The risk is that any assessment will be based on prejudices rather than reality, and so lead to poor decisions.

Business research and analysis should depend on accurate and rigorous methodologies, not pop-psychology. Myers-Briggs can be useful when backed by sufficient data; it should be viewed as an analysis tool requiring detailed insight into the subject. Using it, or other similar psychometric approaches, as a basis for complex business decision-making without the full data the process demands is another route to business failure. So treat these tools with care – and treat their advocates even more carefully.

Myers-Briggs personality types

Dyson sues after discovering German ‘spy’ on its staff – Telegraph

October 24, 2012

Dyson has discovered a spy for its German rival Bosch working in its high-security inventing department in Malmesbury.

One of the basic principles of business strategy is that competitive advantage comes from differentiating yourself from competitors. This comes from either improving processes or improving products – cost leadership or product/service differentiation.

Competitive advantage cannot come from a follower strategy. It comes from proving to customers that you are different and offer something that competitors don’t have.

Copying competitors does not do this – it shows a lack of ideas and a lack of creative, innovative strategy. For a company like Bosch, known for engineering excellence, resorting to corporate espionage – and being suspected of wanting to use another company’s ideas – says that Bosch is in deep trouble.

There is a big difference between competitive intelligence and corporate espionage. Competitive intelligence aims to understand everything about the competitive environment – including why customers choose one company in preference to another. It can also try to understand what a competitor aims to do next, so that clear lines can be drawn between companies. Espionage does something different. It says “we want to do the same as you and want to know your secrets”. That’s a strategy failure – and wrong!

See on www.telegraph.co.uk

Analysing weak signals for competitive & marketing intelligence

March 5, 2012

I’ve just read an interesting blog post by Philippe Silberzahn and Milo Jones. The post, “Competitive intelligence and strategic surprises: Why monitoring weak signals is not the right approach”, looks at the problem of weak signals in competitive intelligence, and how an organisation can still get surprised even though it may have lots of intelligence.

Silberzahn and Jones point out that it’s not usually the intelligence that is the problem, but the interpretation of the gathered intelligence. This echoes a statement by Isser Harel, the former head of Mossad responsible for capturing the Nazi war criminal Eichmann. Harel was quoted as saying: “We do not deal with certainties. The world of intelligence is the world of probabilities. Getting the information is not usually the most difficult task. What is difficult is putting upon it the right interpretation. Analysis is everything.”

In their post, Silberzahn and Jones argue that more important than monitoring for weak signals is the need to monitor one’s own assumptions and hypotheses about what is happening in the environment. They give several examples where weak signals were available but still resulted in intelligence failures. Three different types of failure are mentioned:

  • Too much information: the problem faced by the US, which had lots of information prior to the Pearl Harbor attack of 7 December 1941;
  • Disinformation, as put out by Osama bin Laden to keep people in a high state of alert – dropping clues that “something was about to happen” when nothing was (and, of course, keeping silent when it was);
  • “Warning fatigue” (the crying-wolf syndrome), where constant repetition of weak signals leads to reinterpretation and discounting of threats, as happened prior to the Yom Kippur War.

Their conclusion is that with too much data you can’t sort the wheat from the chaff, and with too little you make analytical errors. Their solution is that, rather than collect data and subsequently analyse it to uncover its meaning, you should first come up with hypotheses and use them to drive data collection. They quote Peter Drucker (Management: Tasks, Responsibilities, Practices, 1973), who wrote: “Executives who make effective decisions know that one does not start with facts. One starts with opinions… To get the facts first is impossible. There are no facts unless one has a criterion of relevance.” They emphasise that “it is hypotheses that must drive data collection”.

Essentially this is part of the philosophy behind the “Key Intelligence Topic” or KIT process – as articulated by Jan Herring and viewed as a key CI technique by many Competitive Intelligence Professionals.

I believe that KITs are an important part of CI, and it is important to come up with hypotheses on what is happening in the competitive environment and then test them through data collection. However, this should not detract from general competitive monitoring, including the collection of weak signals.

The problem is how to interpret and analyse weak signals. Ignoring them, or even downplaying them, is NOT the solution in my view – in fact it is highly dangerous. Companies with effective intelligence do not get beaten or lose out through known problems but through unknown ones. It is the unknown that catches a company by surprise, and often it is the weak signals that, in hindsight, give clues to the unknown. In hindsight, their interpretation is obvious. At the time, however, the interpretation is often missed, misunderstood or dismissed as unimportant.

There is an approach to analysing weak signals that can help sort the wheat from the chaff. When you have a collection of weak signals don’t treat them all the same. Categorise them.

  • Are they about a known target’s capabilities? Put these in box 1.
  • Are they relating to a target’s strategy? These go into box 2.
  • Do they give clues to a target’s goals or drivers? Place these in box 3.
  • Can the weak signal be linked to assumptions about the environment held by the target? These go into box 4.

Anything else goes into box 5. Box 5 holds the real unknowns – unknown target or topic or subject. You have a signal but don’t know what to link it to.

First look at boxes 1-4 and compare each bit of intelligence to other information.

  1. Does it fit in? If so, good. You’ve added to the picture.
  2. If it doesn’t, why not?

Consider the source of the information you have. What’s the chronology? Does the new information suggest a change? If so, what could have caused that change? For this, compare the other three boxes – using the competitor-analysis approach sometimes known as 4-corners analysis – to see if any other information backs up the new signal and helps build a picture or hypothesis of what is happening.

If you find nothing, go back and look at the source.

  • Is it old information masquerading as new? If so, you can probably discount it.
  • Is it a complete anomaly – not fitting in with anything else at all? Think about why the information became available. Essentially this sort of information is similar to what goes into box 5.
    • Could it be disinformation? If so, what is likely to be the truth? Recognising disinformation may point to what is being hidden.
    • Or is it misinformation – which can probably be discounted?
    • What if you can’t tell? Then you have another task – to identify other intelligence that would provide further detail and help you evaluate the anomaly. Such weak signals then become leads for future intelligence gathering.

With box 5, try to work out why it is in box 5. (It may be that you have information but no target to pin it to, for example – so you can’t do the above.) As with anomalies, think about why the information became available. You may need to come up with a number of hypotheses to explain the meaning behind the information. These can sometimes (but not always) be tested.
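To make the five-box triage concrete, here is a minimal sketch of how it might be modelled in code. It is purely illustrative – the Signal fields, the classify callback and the review logic are my own assumptions, not part of any published CI tool – and the judgement calls themselves remain a human job, as the comments note.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable, Dict, List, Optional


class Box(Enum):
    CAPABILITIES = 1   # box 1: a known target's capabilities
    STRATEGY = 2       # box 2: the target's strategy
    GOALS = 3          # box 3: the target's goals or drivers
    ASSUMPTIONS = 4    # box 4: the target's assumptions about the environment
    UNKNOWN = 5        # box 5: no target, topic or subject to link it to


@dataclass
class Signal:
    text: str
    source: str
    date: str                     # chronology matters when judging change
    target: Optional[str] = None  # None when the signal can't be pinned to a target
    box: Box = Box.UNKNOWN


def categorise(signal: Signal, classify: Callable[[Signal], Box]) -> Box:
    """Assign a weak signal to one of the five boxes.

    `classify` stands in for analyst judgement (a human job, not a
    computer one); anything unlinkable falls through to box 5.
    """
    signal.box = Box.UNKNOWN if signal.target is None else classify(signal)
    return signal.box


def review(new_signals: List[Signal],
           picture: Dict[Box, List[Signal]]) -> List[Signal]:
    """Compare boxes 1-4 against the existing picture; return anomalies.

    Signals that fit corroborating information extend the picture;
    signals that don't fit - or that sit in box 5 - come back as
    anomalies for source-checking and hypothesis generation.
    """
    anomalies = []
    for s in new_signals:
        corroborating = picture.get(s.box, [])
        if s.box is Box.UNKNOWN or not corroborating:
            anomalies.append(s)       # old news? disinformation? a new lead?
        else:
            picture[s.box].append(s)  # it fits: you've added to the picture
    return anomalies
```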

Silberzahn and Jones mention a problem from Nassim Taleb’s brilliant book “The Black Swan: The Impact of the Highly Improbable”: how do you stop being like a turkey before Thanksgiving? Prior to Thanksgiving the turkey is regularly fed and given lots and lots of food. Life seems good – until the fateful day, just before Thanksgiving, when the food stops and the slaughterer enters to prepare the turkey for the Thanksgiving meal. For the turkey this is a complete surprise, as all the evidence up to that point suggests that everything is going well. Taleb poses the question of whether a turkey can learn from the events of yesterday what is about to happen tomorrow. Can an unknown future be predicted? In this case, the answer seems to be no.

For an organisation this is a major problem: if it behaves like the turkey, weak signals become irrelevant. The unknown can destroy it, however much information it holds prior to the unforeseen event. As Harel said, the problem is not information but analysis. The wrong analysis means death!

This is where a hypothesis approach comes in – and why hypotheses are needed for competitive intelligence gathering. In the Thanksgiving case, the turkey has lots of consistent information coming in saying “humans provide food”. The key is to look at the source of the information and try to understand it. In other words:

Information: Humans provide food.
Source: observation that humans give food every day – obtained from multiple reliable sources.

You now need to question the reason or look at the objectives behind this observation. Why was this observation available? Come up with hypotheses that can be used to test the observations and see what matches. Then choose a strategy based on an assessment of risk. In the case of the turkey there are two potential hypotheses:

  1. “humans like me and so feed me” (i.e. humans are nice)
  2. “humans feed me for some other reason” (i.e. humans may not be nice).

Until other information comes in to justify hypothesis 1, hypothesis 2 is the safer one to adopt: even if hypothesis 1 is true, you won’t get hurt by adopting a strategy predicated on hypothesis 2. (You may not eat so much and be called skinny by all the other turkeys near you. However, you are less likely to be killed.)
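The risk assessment can be shown as a tiny decision matrix – a minimal sketch with illustrative payoff numbers of my own (the ranking matters, not the values). Under minimax reasoning, you adopt the strategy whose worst case is least bad.

```python
# Payoffs for each (strategy, hypothesis) pair - illustrative assumptions.
payoffs = {
    ("trust humans", "H1: humans are nice"):     +1,    # well fed and safe
    ("trust humans", "H2: humans aren't nice"):  -100,  # Thanksgiving dinner
    ("stay wary",    "H1: humans are nice"):     0,     # skinny but safe
    ("stay wary",    "H2: humans aren't nice"):  0,     # skinny and alive
}


def worst_case(strategy: str) -> int:
    """Worst outcome for a strategy across all hypotheses."""
    return min(v for (s, _), v in payoffs.items() if s == strategy)


strategies = {s for s, _ in payoffs}
best = max(strategies, key=worst_case)
print(best)  # -> "stay wary": hypothesis 2 is the safer basis for a strategy
```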

This approach can be taken with anomalous information in general, and used to handle weak signals. The problem then becomes not the analysis of information but the quantity. Too much information and you start to drown and can’t categorise it – it’s not a computer job, but a human job. In this case one approach is to do the above with a random sample of information – depending on your confidence needs and the quantity of information. This gets into concepts of sampling theory – which is another topic.
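As a pointer to where that sampling idea leads, here is one standard calculation – Cochran’s sample-size formula with a finite-population correction. The 95% confidence level (z = 1.96) and 5% margin of error are assumptions chosen for illustration; your own confidence needs may differ.

```python
import math


def sample_size(population: int, z: float = 1.96,
                margin: float = 0.05, p: float = 0.5) -> int:
    """Sample size needed to estimate a proportion in a finite population.

    z      - z-score for the confidence level (1.96 ~ 95%)
    margin - acceptable margin of error
    p      - expected proportion (0.5 is the conservative worst case)
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite correction


# e.g. from a backlog of 10,000 collected signals, review roughly:
print(sample_size(10_000))  # -> 370
```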

Internet Explorer is for Dummies! Anatomy of a hoax.

August 7, 2011

Good business intelligence quickly identifies what information is real and what is false – or it should. It’s important that decision-making is based on accurate, factual data, as otherwise bad decisions get made. So how do you tell whether something is real or fake?

Generally, the first rule is to check the source or sources.

  • Are they reputable and reliable?
  • Is the information in the story sensible and reasonable?
  • What’s the background to the story – does it fit in with what’s already known?

The problem is that even if information passes these tests it may still not be true. There are numerous examples of news items that sound true but turn out to be false. One example is a BBC news story from 2002 quoting German researchers who claimed that natural blondes were likely to disappear within 200 years. A similar story appeared in February 2006 in the UK’s Sunday Times, quoting a WHO study from 2002. In fact, there was no such WHO study – the claim was false. The story of blonde extinction has been traced back over 150 years and is periodically reported – always with “scientific” references to imply validity.

The “Internet Explorer users have lower IQs” hoax

Often, the decision to accept a news item depends on whether or not it sounds true. If a story sounds true, especially if it is supported by apparent research, then people think it probably is – and so checks aren’t made. That is why a recent news story suggesting that users of Internet Explorer have lower IQs than users of other browsers was reported so widely. Internet Explorer is often set up as the default browser on Windows computers, and many users are more familiar with Explorer than with other browsers. The suggestion that less technologically adept users (i.e. less intelligent users) would not know how to download or switch to a different browser made sense.

I first read the news story in The Register – an online technical newspaper covering web, computer and scientific news. Apart from The Register, the story appeared on CNN, the BBC, the Huffington Post, Forbes and many other news outlets globally (e.g. the UK’s Daily Telegraph and Daily Mail). Many of these have now either pulled the story completely, reporting just the hoax, or added an addendum showing that it was a hoax. A few admit to being fooled – The Register, for example, explained why it believed the story: because it sounded plausible.

The hoax succeeded, however, not only because the story itself sounded plausible, but also because a lot of work had been put in to make it look real. The hoaxer had built a complete web-site to accompany the news item – including product details, FAQs and even other research reports – implying that the research company concerned was bona fide. The report itself was included as a PDF download.

In fact most pages had been copied from a genuine company, Central Test, headquartered in Paris with offices in the US, UK, Germany and India – as was highlighted in an article in CBR Online.

Red Flags that indicated the hoax

To its credit, the technology magazine Wired.com spotted several red flags suggesting that the story was a hoax, stating that “If a headline sounds too good to be true, think twice.”

Wired commented that the other journalists hadn’t really looked at the data, pointing out that “journalists get press releases from small research companies all the time”. The problem is that it’s one thing to get a press release and another to print it without doing basic journalistic checks and follow-throughs. In this case,

  • the “research company” AptiQuant had no history of past studies – other than on its own web-site;
  • the company address didn’t exist;
  • the average reported IQ for Internet Explorer users (80) was so low as to put them in the bottom 15% of the population (while that for Opera users put them in the top 5%) – scarcely credible considering Internet Explorer’s market share.

After the hoax was exposed, its author, Tarandeep Gill, admitted the story was fake and pointed out several red flags that he felt should have alerted journalists:

1. The domain was registered on July 14th 2011.
2. The test that was mentioned in the report, “Wechsler Adult Intelligence Scale (IV) test” is a copyrighted test and cannot be administered online.
3. The phone number listed on the report and the press release is the same listed on the press releases/whois of my other websites. A google search reveals this.
4. The address listed on the report does not exist.
5. All the material on my website was not original.
6. The website is made in WordPress. Come on now!
7. I am sure, my haphazardly put together report had more than one grammatical mistakes.
8. There is a link to our website AtCheap.com in the footer.

The rationale and the aftermath

Gill is a computer programmer based in Vancouver, Canada, working on a comparison-shopping website, www.AtCheap.com. He became irritated at having to code for earlier versions of Internet Explorer – especially IE 6.0, which is still used by a small percentage of web users. (As of July 2011, 9% of web users use Internet Explorer versions 6.0 and 7.0, with a further 26% using version 8.0. Only 7% of web users have upgraded to the latest version of Internet Explorer – v9.0.)

The problem with IE versions 6.0–8.0 is that they are not compatible with general web standards, making life difficult for web designers, who have to code accordingly and test sites on multiple versions of the same browser – all differing slightly. (As you can’t have all four versions of Internet Explorer, IE6.0–IE9.0, on the same computer, this means operating four separate computers or having four hard-disk partitions – one for each version.)

Gill decided to create something that would encourage IE users to upgrade or switch, and felt that a report using scientific language and looking authentic would do the trick. He designed the web-site, copying material from Central Test, and then put out the press release – never expecting the story to spread so fast or so far. He was sure he’d be found out much more quickly.

The problem was that after one or two reputable news sources published the story, everybody else piled in. Later reports assumed that the earlier ones had verified the story, so nobody did any checks. The Register outlined the position in its mea culpa, highlighting how the story sounded sensible:

Many news outlets are busy flagellating themselves for falling for the hoax. But this seems odd when you consider that these news outlets run stories on equally ridiculous market studies on an almost daily basis. What’s more, most Reg readers would argue that we all know Internet Explorer users have lower IQs than everyone else. So where’s the harm?

The facts are that AptiQuant doesn’t exist and its survey was a hoax. But facts and surveys are very different from the truth. “It’s official: IE users are dumb as a bag of hammers,” read our headline. “100,000 test subjects can’t be wrong.” The test subjects weren’t real. But they weren’t necessarily wrong either.

You may disagree. But we have no doubt that someone could easily survey 100,000 real internet users and somehow prove that we’re exactly right. And wrong.

The real issue is that nobody checked, as the story seemed credible. Competitive intelligence analysis cannot afford to be so lax. If nobody else bothers verifying a news story that turns out to be false, you have a chance to gain competitive advantage; in contrast, those failing to check the story risk losing out. The same lessons that apply to journalists apply to competitive intelligence: just because a news story looks believable, is published in a reputable source and is supported by several other sources doesn’t make it true. The AptiQuant hoax shows this.

Meanwhile the story rumbles on, with threats of lawsuits against Tarandeep Gill by Microsoft (for insulting Internet Explorer users) and, more likely, by Central Test. Neither company is willing to comment, although Microsoft would like users to upgrade Internet Explorer to the latest version – in May 2010 Microsoft’s Australian operation even said using IE6 was like drinking nine-year-old milk. If Gill has managed to get some users to upgrade, he’ll have helped the company. He should also have helped Central Test, as the relatively unknown company has received massive positive publicity as a result of the hoax. If they do sue, it will show a lack of a sense of humour (or a venal desire for money) – and will leave a taste as sour as that nine-year-old milk.

Zanran – a new data search engine

April 21, 2011

I’ve been playing with a new data search engine called Zanran – that focuses on finding numerical and graphical data. The site is in an early beta. Nevertheless my initial tests brought up material that would only have been found using an advanced search on Google – if you were lucky. As such, Zanran promises to be a great addition for advanced data searching.

Zanran.com – Front Page

Zanran focuses on finding what it calls ‘semi-structured’ data on the web: numerical data presented as graphs, tables and charts – whether held as a graph image or table in an HTML file, as part of a PDF report, or in an Excel spreadsheet. This is the key differentiator – essentially, Zanran is not looking for text but for formatted numerical data.

When I first started looking at the site I was expecting something similar to Wolfram Alpha – or perhaps something from Google (e.g. Google Squared or Google Public Data). Zanran is nothing like these – and so brings something new to search. Rather than take data and structure or tabulate it (as with Wolfram Alpha and Google Squared), Zanran searches for data that is already in tables or charts and uses this in its results listing.

Zanran.com Search: “Average Marriage Age”

The site has a nice touch in that hovering the cursor over results gives you the relevant data page – whether a table, a chart or a mix of text, tables or charts.

Zanran.com - Hovering over a result brings up an image of the data.

The advanced search options allow country searching (based on server location), document date and file type, each selectable from a drop-down box, as well as searches on specified web-sites. At the moment only English-speaking countries can be selected (Australia, Canada, Ireland, India, UK, New Zealand, USA and South Africa). The date selections allow for the last 6, 12 or 24 months, and the file type allows for selection based on PDF; Excel; images in HTML files; tables in HTML files; PDF, Excel and dynamic data; and dynamic data alone. PowerPoint and Word files are promised as future options. There are currently no field-search options (e.g. title searches).

My main dislike was that the site doesn’t give the full URLs for the data presented. The top-level domain is given, but not the actual URL, which makes the site difficult to use when full attribution is required for any data found (especially if data gets downloaded rather than opening in a new page or tab).

Zanran.com has been in development since at least 2009, when it was a finalist in the London Technology Fund Competition. The technology behind Zanran is patented and based on open-source software and cloud storage. Rather than searching for text, Zanran searches for numerical content and then classifies it by whether it’s a table or a chart.

Atypically, Zanran is not a Californian Silicon Valley startup but is based in the Islington area of London, in a quiet residential side-street made up of a mixture of small, mostly home-based businesses and flats/apartments. Zanran was founded by two chemists, Jonathan Goldhill and Yves Dassas, who had previously run telecom businesses (High Track Communications Ltd and Bikebug Radio Technologies) from the same address. Funding has come from the London Development Agency and First Capital, among other investors.

Zanran sees its competitors as Wolfram Alpha, Google Public Data and also Infochimps (a database repository enabling users to search for and download a wide variety of databases). The competitor list comes from Google’s cache of Zanran’s Wikipedia page: unfortunately, Wikipedia has deleted the actual page, claiming that the site is “too new to know if it will or will not ever be notable”.

Google Cache of Zanran's Wikipedia entry

I hope that Wikipedia is wrong and that Zanran will become “notable” as I think the company offers a new approach to searching the web for data. It will never replace Google or Bing – but that’s not its aim. Zanran aims to be a niche tool that will probably only ever be used by search experts. However as such, it deserves a chance, and if its revenue model (I’m assuming that there is one) works, it deserves success.

Telling stories – fairy tales, case-studies & scenarios….

April 14, 2011

At the ICI/Atelis competitive intelligence conference that took place last week (April 6-7, 2011) in Bad Nauheim, Germany, there was a panel discussion on story-telling as a method of reporting intelligence. At about the same time, the Association of Independent Information Professionals (AIIP) held its 25th annual conference in Vancouver, Washington, USA, where Mary Ellen Bates described how stories can help information professionals market themselves by showing how their skills can solve client problems. The fact that both conferences looked at story-telling shows how businesses are adopting the technique as a way of addressing complex issues.

Story-telling is an ancient art form that might seem strange as a business tool. However, stories can often be an excellent approach to business questions, as they allow people to look at a situation objectively – removing themselves from the scene and taking an outside view. The trick is to tell the right story, catching the imagination and making people think. During the ICI/Atelis conference I suggested a framework for when different story styles can be used.

The first story type is the “fairy-tale” – the “Once Upon a Time in a Kingdom Far Away” type of story. Fairy-tales are possibly the most abstract example of a story that can be applicable to business. The danger is that they can be seen as childish and far-removed from real-world business realities. In fact, they can be a powerful way of highlighting deep-seated organisational problems, as management refusal to see such problems can be illustrated with stories. Such stories can help managers recognise their own situation, and so identify the problems and think of possible solutions.

Consider a company where the CEO or other senior managers refuse to see that their business has changed. Often such managers grew up in the industry and believe that they know it inside out. Accepting that things have changed is anathema to them. A standard comment from such managers, when asked why things are done in a particular way, is “We’ve always done it that way”. Essentially such management suffers from corporate denial – or what Ben Gilad called a business taboo in his book “Business Blindspots”.

Telling such managers a fairy-tale story can help them see the problem (assuming that you can arrange a session they will be willing to attend).

Once upon a time, in a far-away country there was a king who loved to sing. He loved to sing so much that he made laws that all his people were to learn his favourite songs.

Every Sunday, the people were to gather in the town squares and village greens and sing the songs the king loved. The people were happy, as they also loved the music, and they prided themselves on being the most musical people in the world.

One day, a travelling minstrel sailed into the kingdom from across the sea – singing a new song. Soon, children started to sing this new song, followed by their parents, and word reached the king that the people were no longer singing the king’s songs but were singing something different.

The king flew into a rage, and put the minstrel into a deep and dark dungeon. However this didn’t stop the minstrel singing – and soon the guards started to sing the new song. The king then made laws saying the new song lacked harmony, was discordant, and that anybody caught singing it would be severely punished.

Gradually the people became unhappier. They liked the new song and wanted to sing it along with the old songs. Instead they stopped singing – and the king got angrier and angrier that his songs were no longer being sung. He tried to force people to sing, but they just sang out-of-tune. He made new laws that said they had to sing on Sundays and Mondays, but found that lots of people said they’d lost their voices from singing so much and so couldn’t sing on Sundays or Mondays. And so the king also got unhappier as he no longer heard his songs being sung as in the past….

The basic lesson of a story such as this is to accept and embrace change – rejecting change is likely to be self-defeating. Many companies and industries fail at this – the music industry being a classic example, having lost out by refusing to recognise the impact of music downloading, Napster, iTunes and peer-to-peer file-sharing. A fairy-story can help highlight the problems – although the solution will need to come from full discussion and management acceptance.

The second story-type is the traditional case-study. Case-studies should be used where the organisation knows the problem but not the solution. Finding the solution directly is difficult because management is too close to the situation. The case-study serves as a way of examining the problem dispassionately – by looking at a parallel situation involving a company or organisation from another industry or market. The aim is to analyse the problem, work out appropriate strategies to solve it, and apply them to the real situation. The key is to find a case-study that matches the organisation’s problems. There is a vast bank of case-studies for a range of industries, topics and problems at the Case Study Clearing House.

A third story-type is the future scenario, generally generated as part of a scenario-planning exercise. Such stories attempt to answer “what if” questions by looking at external factors and their correlations and impacts, and then considering how these could play out in the future. It is essential that such scenarios are internally consistent and that there is a clear line of development from the current situation to the future scenario. This allows strategies to be put in place that take into account what could happen. Such strategies need to be adaptable to changing situations and allow organisations to prepare for any eventuality.

As a reporting approach, telling stories is one way of putting across ideas that stimulate the imagination, and so can help organisations develop strategies that lead to success. There is a common theme to all three story types: problem identification, its acceptance and the need for strategies to cope with change. They differ in their perspective on the world. The fairy-tale approach looks at understanding problems and overcoming blindspots that relate to the past imposing on the present; case studies look at solving present problems; scenarios are aimed at preparing organisations for the future.
