Archive

Posts Tagged ‘Competitive Intelligence’

Internet Explorer is for Dummies! Anatomy of a hoax.

August 7, 2011

Good business intelligence quickly distinguishes real information from false – or should. It's important that decision-making is based on accurate, factual data, as otherwise bad decisions get made. So how do you tell whether something is real or fake?

Generally, the first rule is to check the source or sources.

  • Are they reputable and reliable?
  • Is the information in the story sensible and reasonable?
  • What’s the background to the story – does it fit in with what’s already known?

The problem is that even if information passes these tests it may still not be true. There are numerous examples of news items that sound true but turn out to be false. One example is a BBC news story from 2002 quoting German researchers who claimed that natural blondes were likely to disappear within 200 years. A similar story appeared in February 2006 in the UK's Sunday Times, quoting a WHO study from 2002. In fact, there was no such WHO study – the claim was false. The story of blonde extinction has been traced back over 150 years and resurfaces periodically – always with “scientific” references to imply validity.

The “Internet Explorer users have lower IQs” hoax

Often, the decision to accept a news item depends on whether or not it sounds true. If a story sounds true, especially if it is supported by apparent research, people assume that it probably is – and so checks aren't made. That is why a recent news story suggesting that users of Internet Explorer have lower IQs than users of other browsers was reported so widely. Internet Explorer is often set up as the default browser on Windows computers, and many users are more familiar with it than with other browsers. The suggestion that less technologically adept (i.e. less intelligent) users would not know how to download or switch to a different browser made sense.

I first read the news story in The Register – an online technical newspaper covering web, computer and scientific news. Apart from The Register, the story appeared on CNN, the BBC, the Huffington Post, Forbes and many other news outlets globally (e.g. the UK's Daily Telegraph and Daily Mail). Many of these have since either pulled the story completely, replaced it with a report on the hoax, or added an addendum noting that it was a hoax. A few admit to being fooled – The Register, for example, explained why they believed it: because it sounded plausible.

The hoax succeeded, however, not only because the story itself sounded plausible, but also because a lot of work had been put into making it look real. The hoaxer had built a complete web-site to accompany the news item – with product details, FAQs and even other research reports, all implying that the research company concerned was bona fide. The report itself was included as a PDF download.

In fact most pages had been copied from a genuine company, Central Test, headquartered in Paris with offices in the US, UK, Germany and India – as was highlighted in an article in CBR Online.

Red Flags that indicated the hoax

To its credit, the technology magazine Wired.com spotted several red flags suggesting that the story was a hoax, stating that “If a headline sounds too good to be true, think twice.”

Wired commented that the other journalists hadn't really looked at the data, pointing out that “journalists get press releases from small research companies all the time”. The problem is that it's one thing to get a press release and another to print it without doing basic journalistic checks and follow-through. In this case,

  • the “research company” AptiQuant had no history of past studies – other than on its own web-site;
  • the company address didn’t exist;
  • the average reported IQ for Internet Explorer users (80) was so low as to place them in roughly the bottom 10% of the population (while the average for Opera users put them in the top 5%) – scarcely credible considering Internet Explorer's market share.
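A quick sanity check makes the implausibility concrete. Assuming the conventional IQ scale – a normal distribution with mean 100 and standard deviation 15, an assumption on my part, as the hoax report never stated its scoring – the share of the population below the reported average can be computed directly:

```python
from math import erf, sqrt

def normal_cdf(x: float, mean: float = 100.0, sd: float = 15.0) -> float:
    """Fraction of a normally distributed population scoring below x."""
    return 0.5 * (1 + erf((x - mean) / (sd * sqrt(2))))

# The hoax reported an average IQ of 80 for Internet Explorer users:
print(round(normal_cdf(80), 3))    # 0.091 - i.e. roughly the bottom 9%

# A score of 100 is, by construction, the population median:
print(round(normal_cdf(100), 3))   # 0.5
```

In other words, the hoax asked readers to believe that the typical user of the era's most widely used browser scored below more than 90% of the population – a red flag that simple arithmetic would have exposed.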

After the hoax was exposed, its author, Tarandeep Gill, admitted it had been a hoax and listed several red flags that he felt should have alerted journalists:

1. The domain was registered on July 14th 2011.
2. The test that was mentioned in the report, “Wechsler Adult Intelligence Scale (IV) test” is a copyrighted test and cannot be administered online.
3. The phone number listed on the report and the press release is the same listed on the press releases/whois of my other websites. A google search reveals this.
4. The address listed on the report does not exist.
5. All the material on my website was not original.
6. The website is made in WordPress. Come on now!
7. I am sure, my haphazardly put together report had more than one grammatical mistakes.
8. There is a link to our website AtCheap.com in the footer.

The rationale and the aftermath

Gill is a computer programmer based in Vancouver, Canada, working on a comparison shopping website, www.AtCheap.com. Gill became irritated at having to code for earlier versions of Internet Explorer – especially IE 6.0, which is still used by a small percentage of web users. (As of July 2011, 9% of web users use Internet Explorer versions 6.0 and 7.0, with a further 26% using version 8.0. Only 7% of web users have upgraded to the latest version of Internet Explorer – v9.0.)

The problem with IE versions 6.0–8.0 is that they are not compatible with general web standards, making life difficult for web designers, who have to code accordingly and test sites on multiple versions of the same browser – all differing slightly. (As you can't install all four versions of Internet Explorer, IE6.0–IE9.0, on the same computer, this means operating four separate computers or having four hard-disk partitions – one for each version.)
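The version-specific coding described above usually starts with detecting which IE release a visitor is running. A minimal sketch of that kind of server-side check (the user-agent strings are illustrative examples, not taken from the article):

```python
import re

def ie_major_version(user_agent: str):
    """Return Internet Explorer's major version parsed from a user-agent
    string, or None for a non-IE browser. (IE 6.0-9.0 all advertise
    themselves with an 'MSIE x.y' token.)"""
    match = re.search(r"MSIE (\d+)\.\d+", user_agent)
    return int(match.group(1)) if match else None

print(ie_major_version("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))  # 6
print(ie_major_version("Mozilla/5.0 (Windows NT 6.1; rv:5.0) Gecko/20100101 Firefox/5.0"))  # None
```

A site can then branch on the result – serving, say, a simplified stylesheet to IE 6 – though every extra branch adds exactly the testing burden (one machine or partition per IE version) that so irritated Gill.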

Gill decided to create something that would encourage IE users to upgrade or switch, and felt that a report that used scientific language and that looked authentic would do the trick.  He designed the web-site, copying material from Central Test, and then put out the press release – never expecting the story to spread so fast or far. He was sure he’d be found out much more quickly.

The problem was that after one or two reputable news sources published the story everybody else piled in. Later reports assumed that the early ones had verified the news story so nobody did any checks. The Register outlined the position in their mea culpa, highlighting how the story sounded sensible.

Many news outlets are busy flagellating themselves for falling for the hoax. But this seems odd when you consider that these news outlets run stories on equally ridiculous market studies on an almost daily basis. What's more, most Reg readers would argue that we all know Internet Explorer users have lower IQs than everyone else. So where's the harm?

The facts are that AptiQuant doesn’t exist and its survey was a hoax. But facts and surveys are very different from the truth. “It’s official: IE users are dumb as a bag of hammers,” read our headline. “100,000 test subjects can’t be wrong.” The test subjects weren’t real. But they weren’t necessarily wrong either.

You may disagree. But we have no doubt that someone could easily survey 100,000 real internet users and somehow prove that we’re exactly right. And wrong.

The real issue is that nobody checked, as the story seemed credible. Competitive intelligence analysis cannot afford to be so lax. If nobody else bothers to verify a news story that turns out to be false, checking it yourself gives you a chance to gain competitive advantage; those who fail to check risk losing out. The same lessons that apply to journalists apply to competitive intelligence: just because a news story looks believable, is published in a reputable source and is supported by several other sources doesn't make it true. The AptiQuant hoax shows this.

Meanwhile the story rumbles on, with threats of lawsuits against Tarandeep Gill from both Microsoft (for insulting Internet Explorer users) and – more plausibly – Central Test. Neither company is willing to comment, although Microsoft would like users to upgrade Internet Explorer to the latest version. In May 2010 Microsoft's Australian operation even said that using IE6 was like drinking nine-year-old milk. If Gill has persuaded some users to upgrade, he'll have helped the company. He may also have helped Central Test, as the relatively unknown company has received massive publicity as a result of the hoax. If they do sue, it will show a lack of a sense of humour (or a venal desire for money) – and will leave a taste as sour as that nine-year-old milk.

Zanran – a new data search engine

April 21, 2011

I've been playing with a new data search engine called Zanran, which focuses on finding numerical and graphical data. The site is in an early beta. Nevertheless, my initial tests brought up material that would otherwise only have been found using an advanced search on Google – if you were lucky. As such, Zanran promises to be a great addition for advanced data searching.


Zanran.com - Front Page

Zanran focuses on finding what it calls ‘semi-structured’ data on the web. This is defined as numerical data presented as graphs, tables and charts – which could be held in a graph image or table in an HTML file, as part of a PDF report, or in an Excel spreadsheet. This is the key differentiator – essentially, Zanran is not looking for text but for formatted numerical data.

When I first started looking at the site I was expecting something similar to Wolfram Alpha – or perhaps something from Google (e.g. Google Squared or Google Public Data). Zanran is nothing like these – and so brings something new to search. Rather than take data and structure or tabulate it (as with Wolfram Alpha and Google Squared), Zanran searches for data that is already in tables or charts and uses this in its results listing.


Zanran.com Search: "Average Marriage Age"

The site has a nice touch in that hovering the cursor over results gives you the relevant data page – whether a table, a chart or a mix of text, tables or charts.

Zanran.com - Hovering over a result brings up an image of the data.

The advanced search options allow country searching (based on server location), document date and file type, each selectable from a drop-down box, as well as searches on specified web-sites. At the moment only English-speaking countries can be selected (Australia, Canada, India, Ireland, New Zealand, South Africa, the UK and the USA). The date selection allows for the last 6, 12 or 24 months, and the file type allows for selection based on PDF; Excel; images in HTML files; tables in HTML files; PDF, Excel and dynamic data; and dynamic data alone. PowerPoint and Word files are promised as future options. There are currently no field search options (e.g. title searches).

My main dislike is that the site doesn't give the full URLs for the data presented. The top-level domain is given, but not the actual URL, which makes the site difficult to use when full attribution is required for any data found (especially if data gets downloaded rather than opening in a new page or tab).

Zanran.com has been in development since at least 2009, when it was a finalist in the London Technology Fund Competition. The technology behind Zanran is patented and based on open-source software and cloud storage. Rather than searching for text, Zanran searches for numerical content, and then classifies it by whether it's a table or a chart.

Atypically, Zanran is not a Californian Silicon Valley startup but is based in the Islington area of London, in a quiet residential side-street made up of a mixture of small, mostly home-based businesses and flats/apartments. Zanran was founded by two chemists, Jonathan Goldhill and Yves Dassas, who had previously run telecom businesses (High Track Communications Ltd and Bikebug Radio Technologies) from the same address. Funding has come from the London Development Agency and First Capital, among other investors.

Zanran views Wolfram Alpha, Google Public Data and Infochimps (a database repository enabling users to search for and download a wide variety of databases) as competitors. The competitor list comes from Google's cache of Zanran's Wikipedia page, as unfortunately Wikipedia has deleted the actual page – claiming that the site is “too new to know if it will or will not ever be notable”.

Google Cache of Zanran's Wikipedia entry

I hope that Wikipedia is wrong and that Zanran will become “notable” as I think the company offers a new approach to searching the web for data. It will never replace Google or Bing – but that’s not its aim. Zanran aims to be a niche tool that will probably only ever be used by search experts. However as such, it deserves a chance, and if its revenue model (I’m assuming that there is one) works, it deserves success.

Telling stories – fairy tales, case-studies & scenarios….

April 14, 2011

At the ICI/Atelis competitive intelligence conference that took place last week (April 6–7, 2011) in Bad Nauheim, Germany, there was a panel discussion on story-telling as a method of reporting intelligence. At about the same time, the Association of Independent Information Professionals (AIIP) held its 25th annual conference in Vancouver, Washington, USA, where Mary-Ellen Bates described how stories can help information professionals market themselves by showing how their skills can solve client problems. The fact that both conferences looked at story-telling shows how businesses are adopting the technique as a way of addressing complex issues.

Story-telling is an ancient art-form that might seem strange as a business tool. However, stories can be an excellent approach to business questions, as they allow people to remove themselves from the scene, take an outside view and look at a situation objectively. The trick is to tell the right story – one that catches the imagination and makes people think. During the ICI/Atelis conference I suggested a framework for when different story styles can be used.

The first story type is the “fairy-tale” – the “Once Upon a Time in a Kingdom Far Away” type of story. Fairy-tales are possibly the most abstract example of a story that can be applicable to business. The danger is that they can be seen as childish and far-removed from real-world business realities. In fact, they can be a powerful way of highlighting deep-seated organisational problems, as management refusal to see such problems can be illustrated with stories. Such stories can help managers recognise their own situation, and so identify the problems and think of possible solutions.

Consider a company where the CEO or other senior management refuse to see that their business has changed.  Often such management grew up in the industry and believe that they know it inside out. Accepting that things have changed is anathema to them. A standard comment given by such managers when asked why things are done in a particular way is “We’ve always done it that way“. Essentially such management suffers from corporate denial – or what Ben Gilad called a business taboo in his book “Business Blindspots“.

Telling such managers a fairy-tale story can help them see the problem (assuming that you can arrange a session they will be willing to attend).

Once upon a time, in a far-away country there was a king who loved to sing. He loved to sing so much that he made laws that all his people were to learn his favourite songs.

Every Sunday, the people were to gather in the town squares and village greens and sing the songs the king loved.  The people were happy as they also loved the music and they prided themselves as being the most musical people in the world.

One day, a travelling minstrel sailed into the kingdom from across the sea – singing a new song. Soon, children started to sing this new song, followed by their parents, and word reached the king that the people were no longer singing the king's songs but were singing something different.

The king flew into a rage, and put the minstrel into a deep and dark dungeon. However this didn’t stop the minstrel singing – and soon the guards started to sing the new song. The king then made laws saying the new song lacked harmony, was discordant, and that anybody caught singing it would be severely punished.

Gradually the people became unhappier. They liked the new song and wanted to sing it along with the old songs. Instead they stopped singing – and the king got angrier and angrier that his songs were no longer being sung. He tried to force people to sing, but they just sang out-of-tune. He made new laws that said they had to sing on Sundays and Mondays, but found that lots of people said they’d lost their voices from singing so much and so couldn’t sing on Sundays or Mondays. And so the king also got unhappier as he no longer heard his songs being sung as in the past….

The basic lesson of a story such as this is to accept and embrace change – rejecting change is likely to be self-defeating. Many companies and industries fail at this – the music industry being a classic example, having lost out by refusing to recognise the impact of music downloading, Napster, iTunes and peer-to-peer file sharing. A fairy-tale can help highlight the problems – although the solution will need to come from full discussion and management acceptance.

The second story-type is the traditional case-study. Case studies should be used where the organisation knows the problem, but not the solution. Finding the solution directly is difficult as management is too close to the situation. The case-study serves as a way of examining the problem dispassionately, by looking at a parallel situation involving a company or organisation, from another industry, or market. The aim is to analyse the problem and work out appropriate strategies to solve the problem and apply them to the real situation. The key for a case-study is to find one that matches the organisation’s problems. There is a vast bank of case-studies for a range of industries, topics and problems at the Case Study Clearing House.

The third story-type is the future scenario, generally generated as part of a scenario-planning exercise. Such stories attempt to answer “what if” questions by looking at external factors and their correlations and impacts, and then considering how these could play out in the future. It is essential that such scenarios are internally consistent and that there is a clear line of development from the current situation to the future scenario. This allows strategies to be put in place that take into account what could happen. Such strategies need to be adaptable to changing situations and allow organisations to prepare for any eventuality.

As a reporting approach, telling stories is one way of putting across ideas that stimulate the imagination, and so can help organisations develop strategies that lead to success. There is a common theme to all three story types: problem identification, its acceptance and the need for strategies to cope with change. They differ in their perspective on the world. The fairy-tale approach looks at understanding problems and overcoming blindspots that relate to the past imposing on the present; case studies look at solving present problems; scenarios are aimed at preparing organisations for the future.

Competitive Intelligence & Culture

February 21, 2011

My last few posts have strayed from my topic – and the main raison d'être of this blog – competitive intelligence and finding business information. I'm tempted to write about the turmoil currently going on in the Middle East, but echo the reply that Zhou Enlai of China is reputed to have given President Nixon when asked for his views on the 1789 French Revolution: “It's too soon to tell.”

At the same time, any attempt to understand what is happening has to take into account the cultures involved. Too many pundits ignore culture as an influencing factor and expectations that the Middle East will suddenly adopt Western democratic ideals strike me as unlikely. That is not to say that some form of democratic rule won’t appear – it just won’t be the same as found in the US, UK, France, Australia and other countries dominated by Western Christian traditions.

Understanding culture is also important for competitive intelligence professionals – yet it is often ignored. Several years ago I was speaking to a US-based consultant and asked how he went about cross-border research. His response was that all research was done in-house. I then asked how he coped with different languages and time-zones. He replied that he used people fluent in the relevant languages, working in shifts. At no point did he accept that different countries require differing approaches – and that his approach might not always give the best results, and so might not be value for money (or even safe) for his clients.

A few months ago, I led a workshop on competitive intelligence in Indonesia. On the second day I was asked a question I'd never been asked before: when was I going to teach the attendees “unethical” ways of gathering competitor intelligence? The reason given was that the attendees knew their competitors did not follow ethical approaches to gathering competitive intelligence, and they wanted to learn such techniques too. They felt that if they only used the standard ethical approaches followed in America and the UK, they would be at a disadvantage. Unfortunately they were probably correct.

Of course the ideal is to be ethical in all that you do – but if your competitors don't follow US/UK ethical guidelines, especially where ethical norms differ in the country concerned, following them yourself can put you at a disadvantage. I handled the query by spending some time talking about counter-intelligence and what to watch out for – mentioning the sorts of things that could be done against them, and how to detect them. This way, hopefully, my questioners will realise that they can stop their competitors gathering material unethically – and that they can do things the right way themselves. (Of course, they could turn what I told them around and use the same approaches back. That is their decision – and at least they'll know the risk to their reputation if caught.)

The point about the question however was that it illustrated a cultural difference. The standard SCIP code of ethics is essentially an American construct and relates to US business. I personally believe it is correct – but that may be because my cultural background as a Brit is not too dissimilar to American business norms.

As another example, I have had to spell out to a non-European client that I am not willing to recruit “a company insider” to provide a constant flow of information. He kept asking me to locate an employee whom I could periodically pay for the latest information on his employer. My client found it very difficult to understand why this is unethical, as he views it as normal that employees will want to supplement their income by sharing information. When I pointed out that this was industrial espionage he disagreed – as I was not hacking or using bugs, just speaking to somebody willing to provide information on a regular basis and rewarding this person for their time and effort.

Then there was the client who failed to understand why I preferred to interview UK contacts by telephone rather than arrange meetings. In his culture, unlike in the UK, face-to-face contact is crucial for any form of business relationship or transaction. The telephone is used to set up meetings – and the idea that you could ever do business over the telephone instead of in person seemed strange.

Japanese vs. European greetings

These are just some of the ways that cultural differences can impact competitive intelligence practice. There are more. Imagine that you want to find out whether a particular product is about to launch. Within the UK, you might call up a contact and simply ask whether the product will launch within three months. You will get a yes or no answer. Doing the same research in Japan needs a different approach, as the standard response in Japan will be yes even if the answer is no – saying no would be bad form and result in a loss of face. A better approach is to offer an alternative, e.g. “When will the new product launch – in less than three months, or more?” This way you get a real answer, as it is not a yes/no question.

These are extremes. However even within Europe there are differences – traditionally in Germany you would not use a first name to speak to a contact, while within the UK that is generally acceptable. Behaving in the wrong way can put a distance between you and your interviewee – so it is important to know the cultural impact of your approach.

Unfortunately not much appears to have been written on the impact of culture on competitive intelligence practice – and this is a topic ripe for research.
