Archive for March, 2012

Disagreements at the top

March 16, 2012

This week the news reported the departures of two long-standing executives from their companies.

Greg Smith’s departure from Goldman Sachs, after 12 years, has been reported globally. This is not surprising, as everybody loves to hate bankers and investment banks. The claim that Goldman Sachs viewed clients as “muppets” is a delicious image, so it’s no surprise to see a journalistic feeding frenzy following Smith’s resignation letter, published in the New York Times on 14 March 2012.

The real question, however, is whether Smith’s departure matters. I think that it depends on what clients do, and I suspect that the answer will be very little or perhaps nothing. Obviously Goldman Sachs’s aim is to make money. In a testosterone-fuelled environment, bravado – where clients are called muppets and phrases such as “hunt elephants” (referring to getting customers to spend more with you) are common currency – shouldn’t be a surprise. If anything, the discussion will raise again (for a few more weeks) the issue of banker remuneration. It may even have a salutary effect by making firms such as Goldman Sachs emphasise that ethical behaviour in business must be the norm, and that the 1980s dogma that “greed is good” is not an asset after the 2008/9 financial crisis. As Goldman Sachs said in response:

In our view, we will only be successful if our clients are successful. This fundamental truth lies at the heart of how we conduct ourselves.

In fact, as The Economist suggests, the real muppet may be Smith himself, for not realising that clients aren’t stupid and that, if they weren’t getting value from the firm, they’d move elsewhere. I suspect that the real reason for Smith’s resignation was sour grapes – perhaps somebody got a bigger bonus. Whatever the reason, it’s unlikely he’ll find similar work with other banks, as no company will want to employ somebody quite so vocal in their condemnation of a former employer.

The more interesting departure from a strategic perspective, however, was that of Richard Brasher, the UK boss of the supermarket Tesco. Brasher was the most high-profile departure since the new CEO, Philip Clarke, replaced Sir Terry Leahy. Leahy retired from Tesco at the end of February 2011, and since then a number of other senior executives have left or are leaving the firm. These include:

  • David Potts, head of the Asian operations, who will retire from Tesco in June, aged 55;
  • Andy Higginson, head of Tesco Bank and former group finance and strategy director – also aged 55;
  • David Reid, Tesco’s chairman – who was replaced by Sir Richard Broadbent in November 2011;
  • Lucy Neville-Rolfe, Tesco’s director of corporate and legal affairs, who will retire from Tesco in January 2013. Her role is being split into two – neither of which will be a board post;
  • Richard Jones, head of clothing, who has moved to the private Irish supermarket Dunnes in the same role;
  • Laura Wade-Gery, CEO of Tesco.com and Tesco Direct and head of non-food, who has moved to a board-level position with Marks & Spencer.
The news stories reporting Brasher’s departure mentioned Tesco’s poor winter sales, implying that this was the reason for the change. Philip Clarke will take over Brasher’s role, combining the job of UK CEO with that of group Chief Executive. Some reports suggested that deep disagreements existed between the two over strategy for the UK business, which had issued its first profit warning in 20 years. Tesco has not denied this. Although Clarke originally said that there was no rift between the two, he changed his tune after the announcement of Brasher’s departure, saying:

You can’t have two captains in a team

However, it is not just Brasher who seems to have had problems. The number of senior executives – especially long-standing ones – leaving Tesco suggests profound disagreements at the top.

David Reid was expected to retire, and Tesco had been looking to appoint a new chairman to replace him. Potts, Higginson and Neville-Rolfe are also reported to be retiring. Their departures, so close together, suggest unhappiness with Clarke’s management of Tesco, as companies generally try to avoid large-scale boardroom changes in order to ensure continuity.

When a board is split over strategy and cannot agree, continuity is not possible. Management is all about consensus and agreement on the path that should be followed. If this is not possible, there has to be change, with one side or the other leaving. The alternative is chaos, with the company losing share and profitability as the focus moves to internal dispute rather than market growth. This appears to be the situation at Tesco – forcing Philip Clarke to assert his authority. It was either his head or Brasher’s. As Clarke said: there can only be one captain.

 

Note: after writing the above article I came across a great Harvard Business Review blog post looking in depth at Goldman Sachs’s culture and how (and why) it may have changed over the years since Greg Smith started. Worth reading for any Goldman Sachs watcher:

http://blogs.hbr.org/fox/2012/03/greg-smiths-resignation-op-ed.html

How to spot web-site plagiarism and copyright theft with CopyScape

March 11, 2012

AWARE has had a web-site since 1995, and our current domain (www.marketing-intelligence.co.uk) has been active since 1997. When we started there were fewer than 100,000 companies on the web. Google’s founders had not yet met each other, and even venerable search engines such as AltaVista had not yet launched.

Over the years, we’ve made an effort to ensure that our web content was not copied and used on other sites without our permission.

Although doing manual checks by searching for key phrases is one way of checking for plagiarism and copyright theft, there are also a number of dedicated plagiarism-checking sites. One example is Plagium. Plagium’s drawback, shared with several similar services, is that you have to paste in the text you want to test rather than just enter a URL. Such services are generally aimed at helping teachers and college professors detect student cheating.

Although some services (such as Plagium) are free, most are not, and some involve downloading dedicated software. Others only check a limited number of known “essay” sites where students can download essays written by others. (We’ve found some of our content on such sites – evidently students who use them don’t care where they steal their content from. Once an essay has been used successfully, they then try to reuse it by uploading their A+ essay to the site.)
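If you want to automate the manual key-phrase approach before reaching for a dedicated service, a rough sketch in Python is below. It fetches two pages and flags long word sequences they share. This is only an illustration – the suspect URL is a placeholder, the HTML stripping is crude, and a service such as CopyScape will do a far better job.

    # Rough sketch: flag shared word sequences between two web pages.
    # The suspect URL is a placeholder - substitute the page you want to test.
    import re
    import urllib.request

    def page_words(url):
        """Fetch a page and reduce it to a list of lowercase words."""
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
        text = re.sub(r"<[^>]+>", " ", html)  # crude tag stripping
        return re.findall(r"[a-z']+", text.lower())

    def shared_ngrams(words_a, words_b, n=8):
        """Word n-grams appearing in both texts - likely copied phrases."""
        grams = lambda ws: {" ".join(ws[i:i + n]) for i in range(len(ws) - n + 1)}
        return grams(words_a) & grams(words_b)

    original = page_words("http://www.marketing-intelligence.co.uk/")
    suspect = page_words("http://example.com/suspect-page")  # placeholder
    for phrase in sorted(shared_ngrams(original, suspect)):
        print("Possible copy:", phrase)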

CopyScape

Of all the plagiarism detection websites, probably the easiest and best is CopyScape. CopyScape’s aim is not only to help academics detect student cheating: it also allows webmasters to search for copied content in general. It doesn’t require users to paste in the suspect text. Instead, web-site owners simply enter their URLs and get a report on other sites that use similar or identical wording. It’s sufficiently powerful that there is even a flippant web-page on ContentBoss’s website giving advice on how to bypass CopyScape and copy with impunity. (ContentBoss promises to provide unique content for a low monthly fee. Its bypass-CopyScape tool uses a technique that converts content into HTML guaranteed not to be picked up by plagiarism detectors. The catch, as ContentBoss points out, is that using such content also guarantees that the site will be banned by search engines for spam content.)

We’ve used CopyScape periodically over the years; miscreants found included a competitor site that had copied multiple pages from our site. We asked the site owner to change his pages and were ignored. We then took stronger action, and within a couple of days the site was taken down. Another example involved an article published in a professional journal that took, almost verbatim, the content of our brief guide to competitive intelligence. We notified the publisher, who ensured that the payment made to the “author” was recovered and an apology published. The author said he had thought that material published on the web was copyright-free. He was shown to be wrong.

Our most recent trawl for examples of copyright theft from AWARE’s pages turned up further examples where wording we’ve used has been stolen. The following images should show how effective the tool is – while at the same time naming and shaming the companies that are too weak, lazy or incompetent to produce their own copy and have to steal from others. (I’ve named them – but won’t give them the satisfaction of a link as this could help their search engine optimisation efforts – if they have any!)

The first example shows how text that appears on the footer of most of our pages is plagiarised.

This is the original text.

CopyScape found several sites had copied this text almost verbatim – for example Green Oasis Associates based in Nigeria:

or ICM Research from Italy and Pearlex from Virginia in the USA.

The ICM Research example is in fact the worst of these three, as their site has taken content from several other AWARE web-site pages.

The problem is that a company willing to steal content from other businesses is unethical – breaking the rule against misrepresenting who you are. If they are willing to steal content from others, they may also take short-cuts in the services they provide, and as a result should not be trusted to deliver a competent service.

The page that is most often plagiarised is the Brief Guide to Competitive Intelligence page, mentioned above. Clicking on a link found by CopyScape highlights the copied portions, as seen in the following examples from AGResearch, Emisol and Wordsfinder.

Generally sites do not copy whole pages (although this does happen) but integrate chunks of stolen text into their own pages – as seen in the AGResearch example below, where 12% of the page is copied, and the Wordsfinder example, where 13% has been copied.

The Emisol example below stole less – although it copied key parts of the guide page:

Conclusion

Copyright theft is a compliment to the author of the original web-page, as it shows that the plagiarising site views its competitor as top quality. However, the purpose of writing good copy is to stand out and show one’s own capabilities. Sites that steal other sites’ work remove this advantage, as they make the claims seem anodyne and commonplace. They devalue both the copier – who cannot come up with their own material (and so is unlikely to be able to provide a competent service anyway) – and the originator, as most people won’t be able to tell who came first. Fortunately search engines can, and when they detect duplication they are likely to downplay the duplicated material, meaning that such sites are less likely to appear high up in search engine rankings. The danger is that both the originator of the material and the plagiariser may get penalised by search engines – which is another reason to ensure that copyright thieves are caught and stopped. CopyScape is one tool that really works in protecting authors from such plagiarism.

Analysing weak signals for competitive & marketing intelligence

March 5, 2012

I’ve just read an interesting blog post by Philippe Silberzahn and Milo Jones. The post, “Competitive intelligence and strategic surprises: Why monitoring weak signals is not the right approach”, looks at the problem of weak signals in competitive intelligence and how, even though an organisation may have lots of intelligence, it can still be surprised.

Silberzahn and Jones point out that it is not usually the intelligence that is the problem, but the interpretation of the gathered intelligence. This echoes a statement by Isser Harel, the former head of Mossad responsible for capturing the Nazi war criminal Eichmann. Harel was quoted as saying: “We do not deal with certainties. The world of intelligence is the world of probabilities. Getting the information is not usually the most difficult task. What is difficult is putting upon it the right interpretation. Analysis is everything.”

In their post, Silberzahn and Jones argue that monitoring one’s own assumptions and hypotheses about what is happening in the environment matters more than monitoring for weak signals. They give several examples where weak signals were available but intelligence failures still resulted. Three different types of failure are mentioned:

  • Too much information: the problem faced by the US, which had lots of information prior to the Pearl Harbor attack of 7 December 1941;
  • Disinformation, as put out by Osama bin Laden to keep people in a high state of alert – dropping clues that “something was about to happen” when nothing was (and, of course, keeping silent when it was);
  • “Warning fatigue” (the crying-wolf syndrome), where constant repetition of weak signals leads to reinterpretation and discounting of threats, as happened prior to the Yom Kippur War.

Their conclusion is that with too much data you can’t sort the wheat from the chaff, and with too little you make analytical errors. Their solution is that, rather than collect data and subsequently analyse it to uncover its meaning, you should first come up with hypotheses and use these to drive data collection. They quote Peter Drucker (Management: Tasks, Responsibilities, Practices, 1973), who wrote: “Executives who make effective decisions know that one does not start with facts. One starts with opinions… To get the facts first is impossible. There are no facts unless one has a criterion of relevance.” They emphasise that “it is hypotheses that must drive data collection”.

Essentially this is part of the philosophy behind the “Key Intelligence Topic” or KIT process – as articulated by Jan Herring and viewed as a key CI technique by many Competitive Intelligence Professionals.

I believe that KITs are an important part of CI, and that it is important to come up with hypotheses about what is happening in the competitive environment and then test these hypotheses through data collection. However, this should not detract from general competitive monitoring, including the collection of weak signals.

The problem is how to interpret and analyse weak signals. Ignoring them, or even downplaying them, is NOT the solution in my view – in fact it is highly dangerous. Companies with effective intelligence do not get beaten or lose out through known problems, but through unknown ones. It is the unknown that catches a company by surprise, and often it is the weak signals that, in hindsight, gave clues to the unknown. In hindsight their interpretation is obvious; at the time, however, it is often missed, misunderstood or dismissed as unimportant.

There is an approach to analysing weak signals that can help sort the wheat from the chaff. When you have a collection of weak signals, don’t treat them all the same. Categorise them:

  • Are they about a known target’s capabilities? Put these in box 1.
  • Are they relating to a target’s strategy? These go into box 2.
  • Do they give clues to a target’s goals or drivers? Place these in box 3.
  • Can the weak signal be linked to assumptions about the environment held by the target? These go into box 4.

Anything else goes into box 5. Box 5 holds the real unknowns – unknown target, topic or subject. You have a signal but don’t know what to link it to.
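To make the five boxes concrete, here is a minimal Python sketch. The example signals, the target name and the way the analyst’s judgement is encoded are all invented; the point is simply that every signal lands in exactly one box.

    # Minimal sketch of the five-box categorisation of weak signals.
    # Example signals and the 'aspect' judgements are invented.
    BOXES = {
        1: "capabilities",  # known target's capabilities
        2: "strategy",      # target's strategy
        3: "goals",         # target's goals or drivers
        4: "assumptions",   # target's assumptions about the environment
        5: "unknown",       # nothing to link the signal to
    }

    def categorise(signal, known_targets):
        """Place a weak signal into box 1-5."""
        if signal["target"] not in known_targets or signal["aspect"] is None:
            return 5  # real unknown: no target or topic to pin it to
        return signal["aspect"]  # 1-4, as judged by the analyst

    signals = [
        {"text": "Rival hiring 20 logistics staff", "target": "Rival", "aspect": 1},
        {"text": "Rumour of a move into Spain", "target": "Rival", "aspect": 2},
        {"text": "Unattributed patent filing", "target": None, "aspect": None},
    ]
    for s in signals:
        box = categorise(s, known_targets={"Rival"})
        print(f"Box {box} ({BOXES[box]}): {s['text']}")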

First look at boxes 1-4 and compare each bit of intelligence to other information.

  1. Does it fit in? If so, good: you’ve added to the picture.
  2. If it doesn’t, why not?

Consider the source of the information you have. What’s the chronology? Does the new information suggest a change? If so, what could have caused that change? To answer this, compare the other three boxes – using the competitor analysis approach sometimes known as 4-corners analysis – to see if any other information backs up the new signal and helps create a picture or hypothesis of what is happening.

If you find nothing, go back and look at the source.

  • Is it old information masquerading as new? If so, you can probably discount it.
  • Is it a complete anomaly – not fitting in with anything else at all? Think about why the information became available. Essentially this sort of information is similar to what goes into box 5.
    • Could it be disinformation? If so, what is likely to be the truth? Knowing that it may be disinformation may point to what is being hidden.
    • Or is it misinformation – which can probably be discounted?
    • What if you can’t tell? Then it suggests another task – to try and identify other intelligence that would provide further detail and help you evaluate the anomaly. Such weak signals then become leads for future intelligence gathering.
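The triage just described can be summarised in a small decision sketch – the field names are invented, and the branches simply mirror the bullets above:

    # Minimal sketch: triage a signal that doesn't fit the picture.
    # Field names are invented; the logic mirrors the bullets above.
    def triage_anomaly(signal):
        if signal.get("is_old_news"):
            return "discount: old information masquerading as new"
        if signal.get("looks_like_disinformation"):
            return "ask what truth is being hidden"
        if signal.get("looks_like_misinformation"):
            return "probably discount"
        # can't tell: the anomaly becomes a lead for future collection
        return "gather further intelligence to evaluate the anomaly"

    print(triage_anomaly({"looks_like_disinformation": True}))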

With box 5, try and work out why it is in box 5. (It may be that you have information but no target to pin it to, for example – so you can’t do the above.) As with anomalies, think about why the information became available. You may need to come up with a number of hypotheses to explain the meaning behind the information. These can sometimes (but not always) be tested.

Silberzahn and Jones mention a problem from Nassim Taleb’s brilliant book “The Black Swan: The Impact of the Highly Improbable”: how do you stop being like a turkey before Thanksgiving? Prior to Thanksgiving the turkey is regularly fed and given lots and lots of food. Life seems good – until the fateful day, just before Thanksgiving, when the food stops and the slaughterer arrives to prepare the turkey for the Thanksgiving meal. For the turkey this is a complete surprise, as all the evidence up to that point suggests that everything is going well. Taleb poses the question of whether a turkey can learn from the events of yesterday what is about to happen tomorrow. Can an unknown future be predicted? In this case, the answer seems to be no.

For an organisation this is a major problem, because if it behaves like the turkey, weak signals become irrelevant. The unknown can destroy it, however much information it holds prior to the unforeseen event. As Harel said, the problem is not information but analysis. The wrong analysis means death!

This is where a hypothesis approach comes in – and why hypotheses are needed for competitive intelligence gathering. In the Thanksgiving case, the turkey has lots of consistent information coming in saying “humans provide food”. The key is to look at the source of the information and try to understand it. In other words:

Information: Humans provide food.
Source: observation that humans give food every day – obtained from multiple reliable sources.

You now need to question the reason behind this observation. Why was it available? Come up with hypotheses that can be used to test the observations and see what matches. Then choose a strategy based on an assessment of risk. In the case of the turkey there are two potential hypotheses:

  1. “humans like me and so feed me” (i.e. humans are nice)
  2. “humans feed me for some other reason” (i.e. humans may not be nice).

Until other information comes in to justify hypothesis 1, hypothesis 2 is the safer one to adopt: even if hypothesis 1 is true, you won’t get hurt by adopting a strategy predicated on hypothesis 2. (You may not eat as much and may be called skinny by all the other turkeys near you. However, you are less likely to be killed.)
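The turkey’s choice can also be framed as a small worst-case calculation: pick the strategy whose worst outcome across the competing hypotheses is least bad. The pay-off numbers in this sketch are invented purely to show why hypothesis 2 is the safer basis for action.

    # Minimal sketch: choose a strategy by its worst case across hypotheses.
    # Pay-off values are invented for illustration (higher is better).
    payoffs = {
        "act on H1 (trust humans)": {"H1 true": 10, "H2 true": -100},  # slaughtered if wrong
        "act on H2 (stay wary)":    {"H1 true": 5,  "H2 true": 0},     # eats less, survives
    }

    for strategy, outcomes in payoffs.items():
        print(f"{strategy}: worst case {min(outcomes.values())}")

    safest = max(payoffs, key=lambda strategy: min(payoffs[strategy].values()))
    print("Safer strategy:", safest)  # -> act on H2 (stay wary)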

This approach can be taken with anomalous information in general, and used to handle weak signals. The problem then becomes not the analysis of the information but its quantity: with too much information you start to drown and can’t categorise it – it’s not a computer job, but a human one. In this case one approach is to apply the process to a random sample of the information, with the sample size depending on your confidence needs and the quantity of information. This gets into concepts of sampling theory – which is another topic.
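As a minimal sketch of the sampling idea, assume the signals sit in a simple list. The sample-size rule used here (10% with a floor of 30) is an invented rule of thumb – choosing it properly is exactly the sampling-theory topic deferred above.

    # Minimal sketch: review a random sample when there are too many signals.
    # The sample-size rule (10%, floor of 30) is an invented rule of thumb.
    import random

    def sample_for_review(signals, fraction=0.10, floor=30):
        """Pick a manageable random subset for manual categorisation."""
        k = min(len(signals), max(floor, int(len(signals) * fraction)))
        return random.sample(signals, k)

    all_signals = [f"signal-{i}" for i in range(2000)]  # placeholder data
    sample = sample_for_review(all_signals)
    print(f"Reviewing {len(sample)} of {len(all_signals)} signals")
    # each sampled signal would then be put through the box 1-5 process above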
