Phil Fersht on service provider rankings: make the experts accountable, not faceless brands


Vinnie Mirchandani has his latest take on the constant controversy of third-party researchers, consultants and associations compiling rankings of service providers.  This time the IAOP's Global Outsourcing 100 is being questioned.

We've also had some banter about the Black Book of Outsourcing on this site, which made such a noise with its constant rankings of service providers that Datamonitor decided to buy it and hop on the bandwagon.  And we've never even got round to discussing the Global Services 100, or several other rankings that come out periodically.  Moreover, some "analyst" firms make a living ranking service providers while barely bothering to talk to their customers, and sell white papers to the winners so they can flaunt their success (you all know who you are).

Personally, as an analyst and advisor, I find these lists useful – I sometimes learn about a provider I didn't know much about, and they draw attention to who's doing well at the moment.  But that's all I care about.  If these entities produced a directory of service providers, it would be a valuable resource to the industry at large.  But that isn't really the case; why produce free information if you can't sell it as marketing collateral?

What worries me is the following:

  • These entities make the majority (or all) of their revenues from service providers;

  • However accurate these rankings may be, they always contain errors, which can cause buyers to make poor purchasing decisions, or service providers to be unfairly eliminated from a down-select process.

So what can these entities do to make these rankings more credible?

If I am advising a client in an area where I need some additional input or validation, I want to talk to the resident expert in that field.  At the end of the day, it's the recognized experts who have withstood the test of time who can give you the real deal when it comes to rankings.  A true expert in a niche area – for example, application development services – would be in constant contact with all the service providers, know their clients' successes and traumas, and have a unique view on who really is delivering the goods versus who is struggling.  And we don't really want some fluffy top-ten score; we want to know exactly where each service provider has scale, expertise, language, technology and process acumen.  That's the real information customers need.  The only benefit of rankings is for service providers to send out press releases boasting about their "performance" and add more marketing vapourware to PowerPoint decks.  Oh, and they need to pony up significant cash to whoever compiled the list.  But it's cheap point-scoring, so what do they care?

Hence, in my world, a service provider ranking is only really credible when the expert's personal name and brand is attached.  Someone is being held accountable, as opposed to some faceless corporate brand, which masks any real accountability.  At least an analyst firm is attaching an analyst's name to that ranking list, so that analyst has some skin in the game.  A good analyst will normally produce correct performance indicators.  A poor analyst will make mistakes, or simply not have good industry connections or guidance, and likely not last long in this economy.

Strong stuff I know, but it has to be said… your views are more than welcome.

Posted in: Confusing Outsourcing Information


8 comments


  1. Phil – you’re spot-on with these observations. The motive behind these rankings is solely to extract marketing money from service providers, and as you and several others have pointed out, the data is often flawed.

    However, I do think you let some of the analysts off lightly. One of the leading analyst firms recently put out rankings of BPO service providers that were badly misinformed. These providers will use those rankings for their marketing purposes and leave off the name of the analyst responsible.

    Gaurav

  2. Gaurav: I assume you’re referring to that IDC report on Indian BPO providers? If so, that was a perfect example of a firm producing research to exploit marketing budgets, with no focus on delivering value to a buyer. I do hope Datamonitor does not choose to go down this path…

    Stephen

  3. Gaurav,

    Thanks for the input, and I agree with your assessment of some “analysts” being as guilty as the rank-masters of exploiting service provider marketeers (thank goodness only a minority at present).

    I do worry for the direction of the analyst industry when we see deliberate “research” being produced to promote the highest bidder. Analysts are supposed to educate, not exploit.

    I never produce rankings, but I feel I provide my clients with all the data points they need to make decisions. People pay for information and advice they know is credible. If I had to resort to exploiting marketing dollars in such an obvious fashion, I would question whether I was a genuine “analyst” anymore, or merely a sandwich board for vendors…

    PF

  4. Rankings would be useful if there were an unbiased, uncompensated evaluator. Most of the current rankings seem to give the highest ratings to their own subscribers and evaluators…maybe a coincidence, of course. It’s disappointing to see “awards” given to companies with dissatisfied clients and difficult implementations. Who is making these decisions? It’s embarrassing at times.
    PA

  5. Good commentary but too gentle. The 100 list is a joke. Colliers is a broker with little outsourcing. Provide any sticky revenue and suddenly “you are an outsourcer.”

  6. Phil – you’re being far too kind. These list-generators and award-givers are simply whores who feed off the marketing paranoia from vendors. They don’t have the expertise to give real value to buyers, so choose to feed off the dirty vendor dollar instead, as they have no other business model, or clue what to do. The vendors are just playing the game to make themselves look good – however, the highest bidders are buying the better ratings and the top awards. The whole thing stinks. Please publish this comment.

    John D.

  7. Well, it seems we are all in agreement. The unfortunate part is that many analyst firms can’t seem to subsist without taking money from providers. In most industries this would be called out immediately. I have no problem with rankings if they are objective and use a proven, reliable method. But I can’t think of one out there that passes the smell test.

  8. Hi, Phil. Mike Corbett from IAOP here.

    Although no ranking of an industry as diverse as ours is perfect, IAOP’s Global Outsourcing 100 is used widely as a reference tool by our members and has in general been well received for its objective and open process.

    Just some quick background for your readers. Our ranking is essentially a ‘competition’ where service providers decide to participate and are judged based on their response to an online application covering some 18 characteristics including size, growth, third-party recognitions and certifications, customer feedback (including in most cases actual letters of recommendation from their largest clients), employee management and executive experience. The scoring system is published and a panel of judges – all identified and almost all customers with actual buying experience – do the scoring. Highest score = highest ranking, it’s that simple.

    All of this is well documented on our site: http://www.outsourcingprofessional.org/content/23/152/1793/

    Beyond a nominal $350 application fee that every participating company pays – whether they make the list or not – there is no outside sponsorship of the program. IAOP members and non-members participate equally. Only 3 of this year’s top ten are members. A full report is also available for purchase where we detail every company and how it scored in every area judged.

    One interesting way that the program seems to contribute to our field is that many of the companies have told us that the application and its evaluation feedback actually helps them do a better job of collecting and packaging the most important information about their companies in a way that is more consistent and easier for all customers to digest and compare. It’s the ‘process’ as much as the ‘result’ that seems to have a positive impact overall.

    Finally, we’d be glad to add you, Vinnie and/or a volunteer or two from your readers to next year’s judges panel. I can be reached at [email protected].

    Thanks and keep up the good work!
