When did you earn the right to stop learning new skills and abilities?
June 02, 2019 | Phil Fersht and Ollie O’Donoghue

When I have to listen to literally hundreds of people a day spouting advice about reskilling, unlearning, change management, relearning etc., I am going to respond with “great, so what are you doing yourself to stay ahead of today’s digital environment and increase your value as a superstar worker?”  You may love to constantly pontificate weird definitions of digital transformation on Twitter and harp on about today's digital talent needs, but do you truly practice what you preach?

Is it just me, or have we entered an environment where everyone loves to talk about change, but most aren't actually doing anything (themselves) about it?

I mean, if your accountant hadn’t bothered to brush up on the latest tax changes, or your personal trainer didn’t know how to use a Fitbit, you probably would seek to replace those relationships in your life.  So what gives IT professionals the right not to learn Python, or learn how to deploy data management / automation tools?  And what gives business executives the right not to learn how to use non-code analytics tools to help their decision-making, or social media products to help them communicate in the market?  And operations executives the right not to learn low-code automation and AI apps that can help them free up people-hours on work that adds no strategic value to the business?  And who told sales and marketing executives it was fine to ignore really learning the products / services they were selling because all they had to do was to follow a set of pre-defined processes to do their job effectively?

Why have so many of us become so complacent?

It just seems that the majority of workers today think they need only learn to follow a few processes, and that’s all they need do to command a tasty salary and remain employed for years and years…. so few people actually realize that the whole nature of people value is changing for enterprises – they just love to do things the same old way they have always done them, and simply cannot be expected to learn anything new.  "We just don't have the talent in-house to do that" is the constant whine we hear from enterprises; and "our IT managers are project managers, not consultants" is what we hear from service providers.  "Then why don't you train them?" is our agonized response.  Why does everything have to stay paralyzed in this constant vacuum of sameness?

Much depends on the approach our enterprises take to driving change

The biggest problem with enterprise operations today is the simple fact that most firms still run most of their processes exactly the same way as they did decades ago, with the only “innovation” being models like offshore outsourcing and shared service centers, and cloud and digital technologies enabling those same processes to be conducted steadily faster and cheaper.  However, fundamental changes have not been made to intrinsic business processes – most companies still operate with their major functions, such as procurement, customer service, marketing, finance, HR and supply chain, in individual silos, with IT operating as a non-strategic vehicle to maintain the status quo and keep the lights on.

As our Hyperconnected journey illustrates, many industries have now reached a place where they have maximized all their delivery methods for getting processes executed as efficiently and cheaply as possible.  They have tackled the early phases of digital impact by embracing interactive technologies to help them respond to their customer needs as those needs occur, whether electronic or voice.

[Figure: the HFS Hyperconnected Enterprise journey]

In short, most enterprises have been able to keep pace with each other without actually changing the underlying logic of processes.  Simply doing things the same old way has been enough for many, until a competitor comes along with an entirely unique way of servicing your…

Read More »

The Life of Brian: Prettying up a baby that's got a bit ugly
May 11, 2019 | Phil Fersht, Jamie Snowdon and Ollie O’Donoghue

What has happened to the Indian-heritage IT service provider that stoked fear into every Accenture client partner?  “They think like we do” was the declaration one of Accenture’s leaders made at an analyst briefing in 2016.  Well, the slide from grace has been alarming, leading to the appointment of a new leader to stem the bleeding. 

However, when the problems cut this deep, you can’t just apply lipstick to the pig, you need to reconstruct the whole farm, or you can quickly find yourself in the zombie services category alongside the likes of Conduent and DXC, where finding any sort of direction and impetus would be a major accomplishment.

Yes, it could really get this bad, as Cognizant has posted its slowest revenue growth and worst dip in profit margins. Ever. A mere 5% annual revenue growth, when in its heyday it was posting well over 40% (and slipping below double digits was unthinkable until last year). Yes, declining revenue growth is one thing, but declining profit margins is when the panic button gets pressed.

Frank should have left when Elliott came along to poison the well

It’s clear to see why Francisco “Frank” D’Souza, the poster-boy CEO of the emerging power of the Indian IT services industry, jumped ship (or, more accurately, was made to walk the plank, a burnt-out husk, due to the unenviable pressure Elliott Management placed him under to keep the gravy train on the tracks and kick back billions to shareholders).  If anything, Frank should have considered making a move in 2017, as Elliott started squeezing Cognizant’s margins at a time it needed to keep pace with Accenture’s aggressive digital investments.  He’d grown the firm to over $15bn by then and could have exited with a legacy no one could rival in the tech business. 

And in his place comes IT services newbie Brian Humphries – well, we’re sorry to say this, Brian, but the baby you just adopted has got a bit ugly and is screaming for attention. Let’s just look at the numbers – now, we’re going to be generous and forgive Cognizant’s dip in margin, a likely result of reclassifying activity to meet fresh regulations. But the sinking revenue growth is much harder to look past:

[Chart: Cognizant’s revenue growth]

In 2012, Cognizant invented the Digital concept before everyone else jumped on it.  They were that cool...

In a punishingly competitive market, it looks like Cognizant has started to lose traction. Back in the good old days, the firm could do little wrong by challenging Accenture’s strategy – driving a hard-digital bargain and bringing in design consultancies along with their pony-tailed…

Read More »

Quantum set to destroy blockchain by 2021
April 01, 2019 | Phil Fersht, Jamie Snowdon and Ollie O’Donoghue

For all you blockchain aficionados, you'd better get quantum-savvy asap, or you'll find yourself having to re-skill to do something relevant.

This article will discuss some aspects of quantum computing, but - don't worry - we're not going to detail all of its different uses in one initial piece. We won’t describe the inner workings of quantum computing, we’ll avoid words like qubits as much as possible, and we won’t mention quantum supremacy or the theory of quantum entanglement. If you want to know about these things, buy an undergraduate quantum physics textbook and then explore a decent quantum computing book like “Quantum Computing: A Gentle Introduction” by Eleanor Rieffel and Wolfgang Polak – which we are led to believe is only gentle to those with a good undergraduate grasp of maths and physics, although Physics Today described it in a review as a masterpiece.  But for you blockchain followers, we're sure you can quickly redefine your talk-track to wax lyrical about quantum for your next TED Talk.

The difference between quantum and traditional computing is at an eye-wateringly fundamental level, and it requires the knowledge we mention above to have a fighting chance of understanding what it is. But it is something every business leader needs to at least know about, even if only to be able to ignore it with confidence. That’s because quantum computing is potentially a disruptor with as big an impact as digital computing – and it is no exaggeration that it can be used to simulate the very fabric of the universe.

The development of a practical quantum computer could have dire consequences for traditional encryption

However, the question still remains: is practical quantum computing just a theory, an impractical experiment with any stable use decades away? Or is it just around the corner, poised to disrupt the very core of encryption technologies? The doubt is understandable given the not-so-passing resemblance to other over-hyped transformative technologies like nuclear fusion and room-temperature superconductors – all dreamt up in the golden age after the Second World War, without a tangible end-point, and with the seemingly constant promise of a miraculous breakthrough in spite of massive investment. The comparison seems particularly apt given that current quantum computers need superconductors, and the insane supercooling that currently goes with them, to operate – making them, to many, expensive, impractical flights of fancy fuelled by journalistic hyperbole.

So, with that said, is that all you need to know? Is your job just to laugh in the face of any minion who utters the phrase “maybe we should invest in some quantum?” Unfortunately, it is not that simple. The trouble is no one really knows the actual timeframe – even John Preskill, the Richard P. Feynman Professor of Theoretical Physics at Caltech, can’t give you a firm one, with predictions ranging from single to multiple decades and the current wave of “noisy” quantum experiments unlikely to have much practical use. However, this uncertainty needs to be weighed against a serious risk: the development of a practical, or at least partially practical, quantum computer could have dire consequences for traditional encryption.

The first algorithm set to run using a quantum computer could have seismic, rapid implications

Part of the excitement around the prospect of quantum computing is its first real application – the first well-known algorithm set to run on a quantum computer (Shor’s algorithm) could solve the integer factorization problem very quickly. That ability can be used to rapidly break existing methods of encryption like RSA and ECC. So any organization that uses encryption technology needs to understand that there is a potential weakness in current systems, which will need to be replaced or strengthened when practical quantum is available.
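To make the stakes concrete, here’s a toy illustration in Python (deliberately insecure, with a tiny key) of why factoring and RSA are joined at the hip: anyone who can factor the public modulus can reconstruct the private key – which is precisely the job a practical quantum computer running Shor’s algorithm would make fast. This is our own sketch, not anything from a quantum toolkit:

```python
# Toy illustration (NOT real cryptography): RSA's security rests on the
# difficulty of factoring n = p * q. Whoever factors n derives the
# private key -- exactly what Shor's algorithm would make fast.

def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y = g
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    g, x, _ = egcd(a, m)
    assert g == 1
    return x % m

# Tiny, insecure key: in practice p and q are hundreds of digits long.
p, q = 61, 53
n, e = p * q, 17                      # public key (n, e)
d = modinv(e, (p - 1) * (q - 1))      # private exponent; needs p and q

msg = 42
cipher = pow(msg, e, n)               # encrypt with the public key

# An attacker who factors n recovers the private key immediately.
p_found = next(i for i in range(2, n) if n % i == 0)  # trivial for tiny n
q_found = n // p_found
d_cracked = modinv(e, (p_found - 1) * (q_found - 1))
assert pow(cipher, d_cracked, n) == msg  # message recovered
```

The brute-force factoring loop above only works because the key is tiny; for real key sizes it would take longer than the age of the universe classically, which is the whole point – and the guarantee Shor’s algorithm removes.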

And recent experiments from Google and IBM have started to erode confidence in the long-term predictions, bringing them forward from decades to years – with both firms’ recent experiments suggesting that quantum is starting to conform to Moore’s law. Which, if true, means we will have crypto-breaking quantum in two years rather than twenty.

As quickly as 2021, HFS researchers believe, we could see a quantum computer capable of breaking 256-bit RSA encryption – which would have serious implications for blockchain, given this is the level of encryption currently used. According to HFS academy analyst Duncan Matthews-Moore, "If we don't get a handle on the potential speed of quantum soon, we could see the billions of dollars that have gone into blockchain become as quickly wasted as the vast sums Brexit is costing the UK economy."

Bottom Line – Quantum is the one to watch, particularly if you have any ambitions around blockchain.

Forget RPA, forget AI, forget cloud, forget disruptive mortgage processing - and especially forget blockchain.  Because if quantum can deliver real algos, everything tech that happened before is going to be disrupted like Betamax, like CB radio, like Sonic the Hedgehog.

And of course... this was an:

Read More »

Re-platforming the Hyperconnected Enterprise: AI must be led by business operators, not IT traditionalists
March 23, 2019 | Phil Fersht, Ollie O’Donoghue and Tapati Bandopadhyay

If I have to listen to another technologist promoting “AI as a key component of the CIO’s agenda”, I am going to start getting a little irked… AI is not another app that can be installed and rolled out like a Workday, SAP or ServiceNow.  I even had to listen to an IT executive asking me whether he should “leave AI in the hands of SAP as part of their S4 upgrade”.  Not only that, I noticed a well-known analyst firm promoting a webcast last week advising “CIOs how to roll out RPA”.

Re-platforming the enterprise is all about crafting the anticipatory organization

The whole purpose of AI in the enterprise is to have business operations running as autonomously and intelligently as possible, which means we need to build enabling IT infrastructure that supports the business process logic and design.  People are talking about “re-platforming the enterprise”… this is really about redesigning IT to support the business needs, to help the business respond to customer needs as soon as they occur, and to have the intelligence to anticipate the needs of its customers before its competitors can.  

Enterprises need to be as hyperconnected and as autonomous as possible within their business environments if they want to pinpoint where disruption is coming from, where to disrupt and how to keep reinventing themselves in an unforgiving world when we no longer have time to rest on our laurels:

[Figure: the hyperconnected enterprise]

The problem for IT is that AI doesn’t come packaged in a nice box with an instruction guide

I’m sorry to be mildly offensive here, but AI and automation are only effective when they are designed to solve process and business problems, not check another box on the CIO’s resume. While it is important to keep the IT team in the communication loop so that it is ready to provide the right infrastructure and technology stacks required for operationalizing AI solutions, the steering wheel of any business application of AI must be in the hands of the businesses. Smart businesses  know their key pain areas and can identify the most relevant and feasible business cases. They own the data, they know the context, and how a process should run when it is augmented with appropriate AI techniques.  

For many firms, the day they implemented their first ERP was akin to pouring cement into their enterprise

The reality is the ERP system of the last three decades is no longer the system of record for ambitious, hyperconnected enterprises. It is a rigid suite of standard processes that keeps the wheels on a legacy operation.  The emerging system of record is the data lake itself, where business leaders have the ability to extract the data they need to make the right decisions, or have systems that can start to make intelligent decisions for them.

My colleague, Tapati, has been doing some terrific work that looks at the interplay between business and IT in these emerging AI-driven environments and points to 10 prescriptive activities business leaders and IT leaders need to agree on, and put into effect, if they are genuinely to develop AI capability that takes them into this hyperconnected state:

The 10 AI activities the business teams must lead to ensure AI success 

  1. Prioritize use cases based on AI technology availability. The business team must prioritize AI business use cases from the initially identified list of potential AI application opportunities. The team must demonstrate its process knowledge and desired end-state scenario to help the IT team ensure effective project coordination and outcome-setting. Using external consultants at this phase can be very effective to ensure the best business/technology fit.
  2. Develop the AI business case: The most critical step, where the business team must set initial benchmarks, define pre- and post-process improvement metrics, and estimate target benchmarks (a minimal sketch of such a benchmark record follows this list).
  3. AI feasibility analysis and specification development: Business teams must solicit help from IT teams for their expertise with items such as technical feasibility analysis, infrastructure requirement specifications, and technology stack selection. Other areas are technology cost estimation, deployment, and production release.
  4. AI Technology cost estimation: Developing estimates for the cost of technology stacks and solution deployment efforts must be the purview of business teams, but it requires significant and detailed input from the IT team.
  5. AI data preparation and identification: Business teams must ensure success by identifying and preparing the data for training algorithms and building models. The team must solicit assistance from analytics and data warehousing teams.
  6. Coordinate with partners: During the design phase of the target process model, the business team must provide input to implementation partners (both internally and with their consultant/services partners) regarding the ontology of the problem domain and the existing process models and rules. Teaming here with IT is essential, but the business team must define and communicate the business and process needs effectively. 
  7. AI testing: The business team must lead testing of the models against the project goals during the early POC and pilot phases.
  8. Manage effective AI feedback loops: To make use cases fit for production release, the business team must provide detailed, regular feedback on accuracy and performance. Again, they need to work with implementation partners, which may be internal teams from an AI CoE or external partners.
  9. AI Training: The business team must be responsible for budgeting, planning and executing the training for large AI user teams, encompassing all of the staffing resources, external consultant costs, processes and task owners that are involved in the implemented use case.
  10. AI Deployment: Deployment doesn’t end once the use case is in production. The business team must continuously monitor the model’s outcomes, maintenance, and updates during the inferencing phase, and if the problem context changes with new rules or data, the team needs to add new dimensions and models and create new clusters. Users may also require retraining, especially as processes may change over time. There will also be a need to monitor change management issues, and potential legal issues with data privacy / staffing impacts etc.
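To make activities 2 and 8 a little more tangible, here is a minimal, hypothetical sketch (ours, not from the HFS report) of the kind of pre/post benchmark record a business team might own: set the baseline before the AI rollout, log live results, and check them against the business-case target. The names and numbers are purely illustrative:

```python
# Illustrative sketch of a business-owned pre/post benchmark record:
# the baseline is captured before the AI rollout (activity 2), live
# results are logged and compared against the target (activity 8).

from dataclasses import dataclass, field

@dataclass
class UseCaseBenchmark:
    name: str
    baseline_minutes: float   # pre-AI process metric (activity 2)
    target_minutes: float     # business-case target (activity 2)
    observed: list = field(default_factory=list)

    def log(self, minutes: float) -> None:
        # Activity 8: regular feedback on real-world performance
        self.observed.append(minutes)

    def on_track(self) -> bool:
        # Compare the live average against the business-case target
        if not self.observed:
            return False
        return sum(self.observed) / len(self.observed) <= self.target_minutes

# Hypothetical use case: invoice triage, 20 min/case before automation
triage = UseCaseBenchmark("invoice triage", baseline_minutes=20.0, target_minutes=5.0)
for m in (4.0, 4.5, 4.8):
    triage.log(m)
print(triage.on_track())  # True: averaging ~4.4 minutes against a 5.0 target
```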

The Bottom-line: AI is a business issue that must be directed and managed by business executives, supported by technology experts. CIOs who ignore this will fail.

The business team should seek help from IT in terms of infrastructure and tech stack needs, but it needs to own and run the AI projects because it owns the data, context, processes, and rules and understands the pain points.

CIOs will face an existential fight if they don't start genuinely enabling the business. The world where IT was all about mitigating outages and avoiding risk is being replaced by one that demands speed, agility, and a genuine understanding of the business.

Being tech-savvy isn't enough anymore… just knowing where to build a data center is pointless if you don't know what the rest of the business has planned. And then there's the IT obsession with continually trying to upgrade ERP solutions, when most business units these days can handle that themselves. That's the pitfall of the old traditional IT approach - we have to make sure we never get cemented in like that again.

(Weekend rant) Levin: a fading symbol of the legacy analyst industry? Or a hero helping edge those dots up the MQ for misunderstood vendors?
February 02, 2019 | Phil Fersht and Ollie O’Donoghue

There’s nothing more jarring than an ex-Gartner analyst desperate to continue dining off a legacy analyst industry that is actually trying to change. And lo and behold, just before the Christmas break, a blog emerged on LinkedIn with an enviably click-baity title: ‘Is Gartner research quality under threat?’

Simon Levin, a Gartner alum and owner of a boutique business, “The Skills Connection”, that helps tech vendors lobby their way through the Gartner and Forrester MQ and Wave processes, plies his trade on the fact he “knows” how to work his friends at Gartner to help his vendor clients get their dots edged in a more positive direction.  And why not?  If I were a CMO, and lobbing Simon some moolah could help get some sort of leg-up in the process, I’d probably give him a shot.  And however you performed, there is no doubt Simon would claim it would have been worse without him.  It’s like that hair product Rogaine that claims to slow down hair loss… you’d never really know if it actually helped unless you went completely bald…

Simon makes the case that most of his former Gartner buddies he worked with in the ’90s are now being quietly “retired” and replaced with a new breed of youthful analysts, and their…

Read More »

No-Deal Brexit isn't just a British problem: This could wipe $15 trillion off global markets
January 19, 2019 | Phil Fersht, Jamie Snowdon and Ollie O’Donoghue

In 2008 Lehman Brothers nearly took down the global banking system... in 2017 Greece's debts were poised to destroy the European economy... today, we are staring at a stock market that gyrates up and down double-digit percentages in a single day, based on one awkward tariff tweet-up between Xi and Donald...

We're talking about the world's 5th largest economy going into immediate meltdown.  This is more than a UK-only debacle

So... who cares about the world's 5th largest economy potentially plummeting into a complete meltdown? Let's just have a good giggle at those idiotic British politicians hell-bent on destroying the country over a referendum staged 2.5 years ago on a topic no-one actually understands.  Yeah, let's not worry, as they'll be screwed, and we can all make Brit-jokes at parties as those idiots run out of medical supplies and are forced to import frozen butterball turkeys pumped full of ractopamine and several other GMOs... yum.

Here's the bad news - Lehman and Greece are small-time when you consider the potential damage a complete Brexit failure will cause, if - as is possible - the UK government paralyzes itself and lets its economy degenerate into a warzone of regulatory chaos, complete data disaster, supply chain meltdown and political purgatory.  While we have boldly - and positively - predicted (see earlier post) that Brexit won't actually happen, there is also the distinct possibility that Brexit and no-Brexit blindly meander into the nothingness of a "No-Deal" scenario.

We have predicted that - at the end of the day - politicians are surely not that selfish, and voters really aren't that stupid to allow their country to descend into complete economic and social chaos... and madness.  But that's because we, at HFS, have assumed a modicum of intelligence does exist in the world. But, we could be sadly naïve.  However, there is some hope - and that hope is the simple fact that if we Brits commit the ultimate harakiri of a No-Deal Brexit, we take the rest of the global economy down with us.  You thought Lehman Bros was bad?  You've seen nothing yet folks.

Why this could be a $15 trillion global decimation

If we look at similar shocks to the stock market over the last century, it takes relatively little to create a major downturn in global asset values. We don’t need to look too far back this decade to see how even a moderate dip in global stock markets can seriously impact the health of the economy.

If we look at the Asian financial crisis in 1997, for example, we can see just how quickly the collapse of even a relatively small economy can wipe off a huge percentage of global stock values. Now consider the potential consequences for an economy that is not only one of the world’s largest, but also tightly integrated with the global economy – it’s not hard to see how much of an impact this could have on the major stock exchanges. That’s not to mention the major role the UK currently plays in global finance – with some estimates suggesting that the City of London manages over $9 trillion in assets, three times the size of UK GDP.

In a no-deal scenario, almost overnight the UK will no longer be compliant with EU rules and regulations – of which the previously discussed GDPR is just one. Countless other regulations have formed part of the business environment of the United Kingdom, Europe and, by extension, the rest of the global economy, and the gaps are likely to emerge during the real-time stress testing that a no-deal crash-out will trigger.

We can simulate (with the same degree of absolutely no certainty characteristic of the Brexit process) a major tumble in global stock prices by examining how previous shocks to the market have played out. And it’s worth noting that our estimates are generally very conservative compared to other financial crises over the past century.

In the following illustration, we can see how some significant impacts to the value of stock markets could play out – particularly in areas most likely to be impacted by Brexit. In this simulation, the value of the twenty largest stock markets drops by $14.9 trillion as a result of the major market shock of a no-deal Brexit.

 

[Chart: simulated impact of a no-deal Brexit on the world’s twenty largest stock markets]
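For readers who want to sanity-check the order of magnitude, here’s a back-of-the-envelope version in Python. The market capitalizations and shock percentages below are our own illustrative assumptions, not the per-exchange inputs of the HFS simulation – they simply show how moderate drops across the largest markets compound into a loss in the $14-15 trillion range:

```python
# Back-of-the-envelope sanity check. Caps and shocks are illustrative
# assumptions (roughly 2018-era figures), not HFS's actual model.

# (market, approx. cap in $tn, assumed shock)
markets = [
    ("NYSE",            23.0, 0.18),
    ("NASDAQ",          11.0, 0.18),
    ("Japan Exchange",   5.6, 0.15),
    ("London SE",        4.4, 0.30),  # assumed hit hardest in a no-deal scenario
    ("Euronext",         4.3, 0.25),
    ("Rest of top 20",  27.0, 0.18),
]

loss = sum(cap * shock for _, cap, shock in markets)
print(f"Simulated global loss: ${loss:.1f}tn")  # ~ $14tn, in the ballpark of $14.9tn
```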

Bottom Line: A no-deal Brexit has far-reaching consequences, and could knock chunks of value from global stock markets to send us crashing into a serious economic depression

The warnings about the implications of no longer being compliant with GDPR are chicken-feed compared to the true global impact of allowing Britain to hive itself off from the EU with no insulation from the multiple disastrous consequences. In the past, major financial crises have been caused simply by a much smaller and less integrated economy defaulting on its debts; now we’re facing the very real prospect that one of the world’s largest economies will wake up one morning with a completely different rule book, and much more red tape and bureaucracy between it and the rest of the world. It’s not hyperbolic to say the consequences for the global economy could be huge.

In a sick way, maybe this No-Deal scenario is what we all deserve, to open the eyes of the politicians and gullible voters of the world who have lost their grip on reality.  Maybe a period of poverty and hardship will knock us into shape to prepare for the next chapter of economic and political life.

Ugh - we seriously hope it doesn't take a crisis of these immense proportions for everyone to wake up to the world we are shaping, where facts are merely tools to shape opinions and this sense of entitlement that so many people possess is threatening to destroy everything we've worked so hard to create.

There never was a "Brexit deal".  Brexit was all about pissed-off working class people (mainly older folks) sticking it to the rich and to "foreign" people they saw 'stealing' jobs (jobs they were never going to do themselves in any case).  So the only "Brexit" these people wanted was to ruin the economy for the wealthy British middle class and to stop immigrants coming into the country (and kick out the existing ones too).  This is why the situation is such a mess.  The real motives behind Brexit are not the ones being discussed in Parliament or in Brussels.  It's a mess and needs to be somehow reset so the real debate can take place.  Otherwise this never ends.

We all agree at HFS that change can be good, and we must embrace change... but changing to what?  That is the issue right now - what is wrong with the current system, and what is the ideal system we need to move to?  And it's not only the UK grappling with this problem...

Accenture, IBM, Capgemini and Wipro lead the first application services Top 10
January 14, 2019 | Phil Fersht, Jamie Snowdon and Ollie O’Donoghue

So it's now 2019, and HFS' Ollie O'Donoghue and Jamie Snowdon waste no time wielding the feisty new Top 10 methodology, taking no prisoners as they rank how the leading application development and management service providers performed...

The market continues to test and experiment with new frameworks and methodologies. The most notable are DevOps and agile, which are now widely adopted by many of the major IT service providers. Providers are implementing sweeping training and culture redevelopment programs to adopt best practices to support innovation and delivery in the application services space.

Now more than ever, enterprises are looking for providers to help them rationalize and optimize their technology stack, of which business applications is a significant component. In their drive toward the Digital OneOffice, forward-thinking enterprises are engaging with providers that can build innovative solutions that can integrate and unite business applications and in the process break down business siloes.

Given the importance of technology and business applications, enterprises are looking for collaborative partners that are invested in their success. As a result, we’re seeing an increasing reliance on existing relationships to deliver on fresh engagements.

Service providers are also working tirelessly to ensure they are making the most of their talent—driving training and retraining programs to help keep employees’ skillsets up to speed in a changing market.

So let's see how the leading ten service providers shake out, based on interviews with 300 enterprise clients of IT services from the Global 2000, in which we asked specific questions pertaining to the innovation and execution performance of the service providers assessed. The research is augmented with information collected in Q1 and Q2 2018 through provider RFIs, structured briefings, client reference interviews, and publicly available information sources:

Click to view full 22 service provider assessment

 

Key Research Highlights 

Developing talent. Providers are working hard to develop talent internally through retraining programs and bring in the right people by building out innovative talent attraction processes.

Building out partnerships. Providers are developing broader and deeper partnerships to support the increased demand from enterprises for a diverse and complex ecosystem.

Blurring service lines. Traditional service lines, particularly infrastructure and applications, are coming under more pressure as enterprises show less willingness to differentiate between siloes when designing an engagement.

Investment in capability.  Many providers are building out their capability through acquisition of innovative start-ups and boutiques, as well as some major investments in the acquisition or merging of major providers and ISVs already operating in the space.

Q&A with Report Author, Ollie O'Donoghue

"Are the partners who got us here the ones to take us to the next place?"

This is always a tough question to answer, particularly in the application services space where the scope of projects is getting larger and encompassing far more technologies. To thrive in this market there is no perfect route – we see firms like IBM bolster capabilities through acquisition (RedHat being the largest), while firms such as DXC and Accenture pull in capability through partnerships, and the major IT outsourcers try to build up skills and talent organically. At its core, this is to meet the needs of an evolving buyer community that expects the best solutions from a complex array of technologies and practices.

So, what we’re seeing is a large section of the provider community fighting to stay relevant in a rapidly changing market. Honestly, we can expect to see some casualties – there’s just too much to specialise in for some providers to keep pace with, and many are spread too thin to become real specialists. The future in this space belongs to those who can keep layering valuable interfaces between a growing technology stack that includes advanced automation capabilities. For some, this will be through becoming a jack-of-all-trades, and for others, it will be through unique specialisms – all who are in between are vulnerable.

Which of this bunch are going to break out of the pack, based on your recent conversations?

As we’ve mentioned, there's a lot of movement across the leading service providers – but there are four or five that have a lot more going on than many of the others. Let’s start with IBM, which already has scale and differentiation in the space, but has jumped ahead of the pack in open source through the mammoth acquisition of RedHat. We also have Accenture, which continues to be synonymous with innovation and bringing high-quality solutions to clients. The firm has also plugged more digital design and apps agencies into its service lines in recent years, adding more brains and brawn to the rapidly growing market.

It’s also worth highlighting Wipro, which has a strengthening reputation in the application services market – strengthened by the firm’s big bets in digital. This part of the IT services market has always been the core of Wipro’s business, so the firm is able to pull in experience and skills that other firms still need time to develop. We also have Infosys which, with fresh leadership, has started to take the services game seriously again. The firm has done a lot of work to retrain talent and redevelop its strategy. Jumping on the developing push for onshore and nearshore, Infosys is also building out delivery centres, particularly in the US, with plans for more work in Europe. Finally, Capgemini and TCS are gaining ground. The former through capturing more mindshare in Europe for its IT Services heft and expertise – a potential gold mine as businesses grapple with geopolitical pressures and look to local technology experts to help them. And the latter for pushing a fresh narrative on the need for technology in the modern enterprise through its Business 4.0 thought leadership.

As a last note, HCL presents somewhat of a quandary to us since its purchase of IBM assets. It’s difficult to see the acquisition of somewhat legacy assets as a route to breaking out of the pack, but the reality is this could be a platform onto a broader customer base for HCL. All in all, though, we’re withholding judgement until the firm has a clearer strategy for the assets.

Are there any niche firms popping up who can disrupt this space?

It’s a tough market for smaller firms to play in, but for specialists who can corner the market or disrupt business models, there’s plenty of room for manoeuvre. This is the first major IT Services analysis where we’ve included some of the mid-tier players where a lot of the innovation is taking place – simply because these firms have to try much harder to fend off the majors – whether that’s the flexibility and agility of Mphasis or the vertical specialism of LTI.

There are even smaller players starting to challenge in the space – nClouds, an HFS Hot Vendor is an excellent example of a small firm with a compelling track-record in the market, particularly when helping enterprises shift applications and services to the cloud. There’s a vast amount of space opening up for players in the ‘small and cool’ category – the acquisition of RedHat leaves behind a massive gap in independent open source and there is a large portion of the community disillusioned by the acquisition that could be a huge boon to the right company. And with several mid-tier players hoovered up by the majors – notably Syntel and Luxoft - there are gaps in the market waiting to be filled by agile firms.

So, Ollie, which emerging apps services firms are worth keeping an eye out for?

nClouds – In many ways, nClouds is the definition of a company thriving from the increasing blend of application and infrastructure. The firm leverages practices and technologies such as DevOps, Containerization, and public cloud to help clients evolve their technology stack. We were so impressed by client feedback from this firm that they made their way into the first HFS Hot Vendors at the start of 2018.

Trianz – While not necessarily a niche player, Trianz has proven itself more than capable of taking on much larger firms to win deals. The firm has a broad range of services, but its edge seems to be the agility and flexibility it can bring to engagements. The firm has won multiple awards and seems to be benefiting from increased enterprise appetite to diversify engagements amongst many small players, rather than one giant one.

Linium – (acquired by Ness Digital Engineering) – For specialisation, we need to look no further than Linium which has worked tirelessly to carve out chunks of the enterprise service management space. The firm has dedicated practices for core business platforms such as ServiceNow, as well as capabilities in custom application development. The firm was acquired by Ness Digital Engineering in 2018 – bringing with it broader capabilities and access to talent, as well as access to a broader pool of clients.

GAVS Tech – When we covered GAVS Tech in our Q3 Hot Vendors, we concentrated on its zero-incident framework, an approach to reduce the impact of IT issues on end-users. But the firm has used the mantra across other service lines in the space, including a pay-as-you-go DevOps model that focuses on deploying reliable application code and resources. The DevOps platform provides an integrated solution for application development, testing, deployment, scaling and monitoring – not only offering improved speed and quality, but also a degree of simplicity in a complex technology environment.

Bottom-Line: Increasingly scarce talent, combined with a never-ending demand, places real pressure on service providers to keep innovating their delivery models

Simply put, the modern application services market is now so complex it’s not possible to be an expert in everything. Providers are beginning to recognize this and continue to bring in partners to support their delivery capabilities while retraining staff to move them into higher value work.

At the center of this changing market lies a huge question mark around talent. Enterprises are telling us that there are major talent crunches in key areas of the market and for some applications, which is forcing them to push more work over to providers. The challenge is that many of these providers are facing similar pressures. All of the IT services providers assessed in this research have extensive retention and retraining programs in place to ensure they get the most out of their teams. They’re also partnering up with major sources of talent, particularly higher education institutions.

Nevertheless, the market is showing no signs of slowing down to allow providers any breathing space. Enterprise applications are now a major focus area for CIOs and technology leaders to get right. They need help writing off legacy, making sense of extensive technology estates, and finding areas of opportunity for new services and solutions.

Premium HFS subscribers can click here to download their copy of HFS Top 10 Application Development and Management Services 2018

Forget Brexit, RPA could wipe $820m a year of costs from the NHS with a common model across its 207 trusts
December 15, 2018 | Phil Fersht, Jamie Snowdon and Ollie O’Donoghue

What I love about RPA is that it most often has the highest impact where there is a serious amount of IT failure, disorganization and overworked staff. Yes, that's a lot of organizations to consider, and it's one of the reasons why it's hard to find this technology sexy - it's built to fix the murky, dysfunctional stuff that has been squirreled away for decades, buried deep beneath failing ERP projects and conveniently ignored by senior executives who have few political points to score by acknowledging they should actually focus on fixing their broken underbellies.

This has been the failure of operations leaders for decades - simply focusing on layering more garbage over the top, when the real way to fix their inherent problems of dysfunction is to dig deep beneath their navels and address their broken process chains, and - heaven forbid - actually start to do something differently.

And there is no ground more fertile than the hallowed turf of the British National Health Service (NHS), the world's sixth-largest employer with 1.7m staff, where decades of hollow political rhetoric, obscene wastage on "big-bang IT transformations" and big-ticket consultants on the gravy train are bravely held together by a woefully understaffed administration that ends up spending on contract agencies just to keep the wheels turning. Let's face facts: the UK National Health Service makes the basket-case that is Obamacare resemble a slick, well-oiled machine.

Enter RPA: a tool that is reducing GP referral processing time by 75%

But there is renewed hope - and this hope can quite easily become reality if you entertain the idea of using RPA to unify document submissions and scrape data from legacy desktops to speed up GP referral times.  And the real value to be gained here is if the NHS can adopt a common enterprise-wide strategy to deploy a common RPA-as-a-service toolset and methodology across its 207 individual trusts.  It's so simple, I can describe it on the back of an envelope:

Even in these tough times for the institution, many of its leaders are looking optimistically at the opportunities provided by new technologies that can be customized to solve business inefficiencies without the massive complexities of entire system upheavals. One particular example provides insights into how one NHS trust is actively addressing some of these issues: saving the NHS money directly, easing pressure on administrative staff, and providing a better, more consistent service for patients being referred to hospitals. All of these endeavors are in line with the broader objective of ensuring that the NHS meets its overriding objectives to digitize services.

The starting point for this work began at the East Suffolk and North Essex Foundation Trust (ESNEFT). The organization faced many of the same pressures discussed above and, like all healthcare services within the UK, it was directed to enable all GP referrals to be processed via the Electronic Referral Service (eRS) by October 2018. However, the existing system for processing electronic referrals was based on manual processes and was slow – a common challenge.

Essentially, once the GP had made a referral to the Trust, support staff had to find information such as scans, blood tests, and other results, which needed to be manually downloaded and appended to the file. In a process which may seem bizarre to many enterprises, this often meant admin staff were required to print off material and then scan it back into the same computer (using the same printer and scanner) to create a PDF file, navigating bottlenecks between unintegrated systems. The PDF document was then uploaded to the administration system. This process took approximately 20 minutes for each referral and created what the trust described as an avalanche of admin, distracting medical secretaries from their primary task of supporting patients and consultants.

ESNEFT had already started a pilot scheme looking to automate some accounts payable processes with the RPA provider Thoughtonomy, which was showing a great deal of promise. So, the Trust decided to use the system to automate the referral process across five clinical specialties, using “Virtual Workers” (Blue Prism bots), which actively monitor incoming referrals from GP patient appointments in real time, 24 hours a day. Once triggered, the Virtual Worker extracts the reason for referral, referral data, and supporting clinical information and merges the information into a single PDF document. This combined document is then uploaded into the Trust’s administrative systems. The RPA system uses virtual smart card technology for authentication, providing the same level of data security assurance as the old manual process. Overall, the complete task now takes less than five minutes. The Virtual Workforce is able to update all systems instantaneously and extract critical information, which it passes on to the lead consultant for review and grading.
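For the technically curious, the bot’s control flow is simple enough to sketch in a few lines of Python. Every function below is a hypothetical placeholder – Thoughtonomy and Blue Prism deliver this through their own tooling, not a Python API – but the loop mirrors what the Virtual Worker does: watch for referrals, gather the supporting documents, merge them, and upload the result.

```python
# Minimal sketch of the referral automation flow described above.
# All functions are hypothetical stubs, shown only to illustrate the
# control flow of the 'Virtual Worker', not any vendor's real API.

import time

def poll_new_referrals():
    """Return new referrals from the Electronic Referral Service (stub)."""
    return []  # e.g. [{"id": "R123", "specialty": "ENT"}]

def fetch_supporting_docs(referral):
    """Pull scans, blood tests and results from clinical systems (stub)."""
    return []

def merge_to_pdf(referral, docs):
    """Combine the referral and its documents into one PDF (stub)."""
    return b"%PDF-..."

def upload_to_admin_system(referral, pdf):
    """Authenticate (virtual smart card) and file the bundle (stub)."""
    pass

def run_virtual_worker(poll_seconds=60):
    # The 'Virtual Worker' loop: runs 24 hours a day, replacing the
    # print-and-rescan routine with an end-to-end digital pipeline.
    while True:
        for referral in poll_new_referrals():
            docs = fetch_supporting_docs(referral)
            pdf = merge_to_pdf(referral, docs)
            upload_to_admin_system(referral, pdf)
        time.sleep(poll_seconds)
```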

One of the most important aspects of this technology is its ability to work within the current system, regardless of how chaotic and unstructured that may be. It is technology that adapts to the real world and the way people actually behave and work rather than expecting people to miraculously change current tropes and behaviours. This is perhaps the single most important reason RPA works: it provides whatever shaped peg is required, no matter the hole.

RPA negates the need to spend vast amounts on many complex technology integration projects

This first stage has delivered significant cost savings – estimated to be $275,000 in the first year – without removing staff. Crucially, the $275K saving is made up of agency staff and sundry costs such as printing. ESNEFT believes that 500 hours of time were saved thanks to the solution. Plus, it increased the job satisfaction of the admin staff, who could concentrate on more important aspects of their role.

For us, although the top-line cost-saving number is important, what matters is that a technology proof of concept has been deployed successfully (and relatively painlessly) within the NHS. To deliver the outcome required, there was no need to drive an enormous transformation project to align and integrate systems – which, given the lack of appetite for big-bang projects in the NHS, is an achievement in itself. Simply put, the way the technology is used can be fitted into the existing chaos—it’s technology for the real world. It can provide a bottom-up solution to productivity improvements, replacing parts of existing workflows and automating manual and repetitive tasks. It accomplishes these things with the double whammy of removing tasks which are disliked and genuinely improving outcomes for patients, whilst helping to drive efficiency.

Bottom line: The NHS is not alone in facing an unforgivingly complex estate, but with technologies that fit into the chaos of the modern organization, this is only the start 

If we look more broadly at the impact RPA technology could have on the NHS, we can use a simple calculation to estimate the ramifications of this technology. We know that savings of $275K have been made on 2,000 GP referrals per week. But the figure for NHS England as a whole puts GP referrals at 3.5m from April 2018 to June 2018. So, if this were scaled up, we could see savings across NHS England purely for GP referrals of a staggering $38m; if this included all hospital referrals, the figure rises to almost $63m, or around $1.3m per week. To put this in context, this would equate to almost 850 nurses for the GP referrals, or almost 1,400 for all referrals in England (using the average cost of $45,000 per annum for a mid-tier nurse; source: Nuffield Trust).

This is the tip of the iceberg, considering that more than 520m working hours are currently spent on admin and approximately $3.3 billion was spent on agency staff across the NHS as a whole during 2016. There is a great deal of savings to be had. Even if only a quarter of the agency spend is non-medical, that could be $820m per year freed up, with only positive impacts on patient outcomes. 
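The scaling arithmetic is simple enough to reproduce – a quick sketch using only the figures quoted above:

```python
# Reproducing the back-of-the-envelope scaling above, using only the
# figures quoted in the text.

esneft_saving   = 275_000        # $ per year, on ~2,000 GP referrals/week
esneft_weekly   = 2_000
england_quarter = 3_500_000      # GP referrals, April-June 2018
england_weekly  = england_quarter / 13   # ~269k per week

gp_saving = esneft_saving * england_weekly / esneft_weekly
print(f"GP referrals only:  ${gp_saving/1e6:.0f}m per year")   # ~ $37-38m

all_referrals_saving = 63_000_000   # quoted figure incl. hospital referrals
nurse_cost = 45_000                 # $ per annum, mid-tier nurse (Nuffield Trust)
print(f"Nurse equivalents:  {gp_saving/nurse_cost:.0f} (GP only), "
      f"{all_referrals_saving/nurse_cost:.0f} (all referrals)")

agency_spend = 3_300_000_000        # NHS agency staff spend, 2016
print(f"Quarter of agency spend: ${agency_spend*0.25/1e6:.0f}m")  # ~ $820m
```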

Brexit will rip out the underbelly from the British economy - and we'll likely never recover
November 07, 2018 | Phil Fersht, Jamie Snowdon and Ollie O’Donoghue


However you analyze the economic indicators, and whatever your opinion may be regarding Britain's relationship with the European Union (EU), removing the movement of EU labour into the UK will create a perilous shortage of labour, particularly for low-to-mid skilled professions.  If anything, removing the worker base at the lower end of the skills spectrum is worse than at the high end, for the simple reason that it's much harder to entice people into jobs that may be low-paid, unattractive and - in many cases - require hard graft for low wages.  How are our hotels and restaurants going to find 120,000 staff willing to work for the minimum wage; our food factories to renew a third of the workforce that prepares our food; our cleaning firms to backfill 132,000 people willing to mop and scrub for a living? The answer is sadly obvious - many of our industries will be under real threat of implosion because they simply cannot access the people they need to keep them functioning.

And without a thriving working class, the economy will suffer due to less money being spent; our businesses will suffer because of rising hotel costs; our entire society will suffer because of rising food costs; and our commercial and domestic real estate markets will struggle to complete projects.  While professions like education and hi-tech can source talent from elsewhere (and are less reliant on EU people imports), it's those industries that form the underbelly of the economy which will really suffer.  Forget "trickle-down economics" - Brexit will cause a "trickle-up" effect that will be hazardous for the British economy and its mid-to-long-term sustainability.  In the short term, many EU workers in the UK should be able to stay on, but the reliable conveyor belt of workers prepared to roll their sleeves up and support our entire economic underbelly will be permanently halted, and the availability of workers will get progressively worse - and much more expensive with this shrinking supply of people.

So, without further ado, let's dive into the fuller implications of this seemingly masochistic self-flagellation known as "Brexit"... 

Nice try Theresa, but even your dancing can’t make us forget about the increasingly no-win Brexit scenario 

For our fellow Britons, these past few weeks have been a refreshing break from the normal Brexit debates as we became distracted instead by our premier literally dancing for trade agreements. Trade agreements that, even for the most dismally poor mathematicians, don’t stack up when compared to the one we’ll soon be leaving.

 

Brexit has been a topic of heated debate for years now - and I'm sure we all have that friend or relative you daren't mention Brexit in front of, or risk a lecture based on unfounded inferences and sketchy sources. In many ways, it's these long-winded and often inebriated debates that are the problem - we're close to the day we sever ties with Europe and reclaim some sort of democratic freedom that only a nation with several unelected heads of state can find any ironic sense in. And yet we're no closer to understanding what Brexit means - even if we had a clear picture of how awful it will be, at least that's something we can prepare for, instead of this mind-numbingly irritating narrative from British politicians of 'Don't worry, it'll all work out in the end.' Well, unfortunately, we're not an eight-year-old child looking for reassurance from our…

Read More »

IBM / RedHat: A grand play at out-sharing Microsoft’s open source economy
October 29, 2018 | Phil Fersht, Jamie Snowdon and Ollie O’Donoghue

It’s really not about the cloud – not at this massive $34bn price tag.  IBM's ingestion of RedHat, the third largest IT purchase in history, is all about Open Source.

Commentators are already pitching this deal as long-awaited reinforcements to the trench-warfare of the cloud wars. But in reality, we need to look much deeper to understand what persuaded IBM to part with such an exorbitant sum of money for Open Source giant RedHat.

Did we read that right? $34bn? – And what will happen to renegade RedHat?

Even for budding venture capitalists, the princely sum of $34bn is more than enough to make your eyes water – especially when it’s hurled at a firm with annual revenues of just $2.9bn and a headcount that will be just a drop in the Big Blue Ocean. So there must be more to IBM’s thinking than a quick financial return – it’s either a play to kick the other hyperscale players out of the game, or a push to get the upper hand in the increasingly valuable Open Source sharing economy.

If we dig into the financials, it’s clear that RedHat is a profitable firm with a strong track record in the space – describing itself as the leader in Open Source capability. In many ways RedHat…

Read More »