Jamie Snowdon
 
Chief Data Officer 
The Life of Brian: Prettying up a baby that's got a bit ugly
May 11, 2019 | Phil Fersht, Jamie Snowdon, Ollie O’Donoghue

What has happened to the Indian-heritage IT service provider that stoked fear into every Accenture client partner?  “They think like we do” was the declaration one of Accenture’s leaders made at an analyst briefing in 2016.  Well, the slide from grace has been alarming, leading to the appointment of a new leader to stem the bleeding. 

However, when the problems cut this deep, you can’t just apply lipstick to the pig, you need to reconstruct the whole farm, or you can quickly find yourself in the zombie services category alongside the likes of Conduent and DXC, where finding any sort of direction and impetus would be a major accomplishment.

Yes, it could really get this bad, as Cognizant has posted its slowest revenue growth and worst dip in profit margins. Ever. A mere 5% annual revenue growth, when in its heyday it was posting well over 40% (and slipping below double digits was unthinkable until last year). Yes, declining revenue growth is one thing, but declining profit margins is when the panic button gets pressed.

Frank should have left when Elliott came along to poison the well

It’s clear to see why Francisco “Frank” D’Souza, the poster-boy CEO of the emerging power of the Indian IT services industry, jumped ship (or, more accurately, was made to walk the plank as a burnt-out husk, given the unenviable pressure Elliott Management placed on him to keep the gravy train on the tracks and kick back billions to shareholders). If anything, Frank should have considered making a move in 2017, as Elliott started squeezing Cognizant’s margins at a time when it needed to keep pace with Accenture’s aggressive digital investments. He’d grown the firm to over $15bn by then and could have exited with a legacy no one could rival in the tech business. 

And in his place comes IT services newbie Brian Humphries – well, we’re sorry to say this, Brian, but the baby you just adopted has got a bit ugly and is screaming for attention. Let’s just look at the numbers – now, we’re going to be generous and forgive Cognizant’s dip in margin, a likely result of reclassifying activity to meet fresh regulations. But the sinking revenue growth is much harder to look past:


In 2012, Cognizant invented the Digital concept before everyone else jumped on it.  They were that cool...

In a punishingly competitive market, it looks like Cognizant has started to lose traction. Back in the good old days, the firm could do little wrong by challenging Accenture’s strategy – driving a hard digital bargain and bringing in design consultancies along with their pony-tailed, nose-ringed, jean-wearing creatives. In fact, Cognizant can genuinely lay claim to “inventing” digital with its 2012 “SMAC” stack philosophy, which was swiftly followed by Accenture’s 2013 re-branding of the SMAC stack as “digital”.

 

But the market has moved on – away from automation point solutions and funky apps to fend off uberized rivals. It’s now about integrating capabilities and meeting clients with legitimate flexibility, a real willingness to find out what they want to buy, rather than keep pestering them...

Read More »

Forget Glassdoor, it’s Glass Millennials who could really harm a company culture
April 19, 2019 | Phil Fersht, Jamie Snowdon, Ollie O’Donoghue

 

The emerging batch of “Glass Millennials” - a small but influential portion of young professionals born in the early 90s - is proving to be a unique workplace challenge to the age-old work ethic of having a backbone, working hard and making a workplace work for you. This batch is taking all of the stereotypes of millennials to a whole new extreme.

Forget Glassdoor, we need to vet the Glass Millennial

Many of the “older millennials” (born before 1985-90) are clearly a great mix of creative thinking and a desire to learn. And when this group of young, passionate and hardworking professionals gets started, there really is no stopping them. But they’re being let down by this new batch of self-entitled kids who demand bloated paychecks and would never dream of going beyond the 9.00am-5.00pm. 

In fact, having discussed these issues at length with many other employers, we believe many companies could find themselves in real trouble if they do not vet their Millennial hires effectively. Forget “Glassdoor”, if we do not have a good “Glass Millennial” detection process,

Read More »

The mid-cap service providers are killing it and LTI, Virtusa and Mphasis are setting the pace
April 09, 2019 | Phil Fersht, Jamie Snowdon, Martin Gabriel, Sam Duncan

These are unique times for IT services - at the big-ticket end of the spectrum, you have the mega-scale and competitive-cost propositions of the tier 1s vying for greater wallet share within their enterprise clients, while at the other, specific technical needs that warrant a lot of close attention grab the focus of the "mid-caps", which are much more flexible and can operate at smaller scale while turning an attractive profit. 

The mid-caps are catering to the "build" needs of enterprises where the Tier 1s often struggle to deliver top talent

I recall just a couple of years ago how many of the big boys arrogantly called time on the smaller providers, but the exact opposite is transpiring; many clients are less brand-obsessed than they once were and are more focused on accessing the skills they need with the attention they deserve.  Why settle for a B- team when you can get a B+ team that's going to go the extra mile and work with you to figure out how to deliver complex requirements?  And the numbers, simply, do not lie:


All these providers, with the exception of Luxoft, grew their employee base, and 7 out of the leading 10 grew revenues by double digits from 2017 to 2018:


The mid-caps can rely on dynamic personalities to win deals

Remember the good ol' hyper-growth days of IT services, when the likes of Chandra (TCS), Frank (Cognizant), Nandan (Infosys) and Shiv (HCL) would fly around the world to close deals? Well, those days are long gone, as the top-tier providers are simply too large and clients know they can't just pick up the phone to scream at the CEO anymore.

However, they can still do that with most of these mid-caps. We conveniently forget that services is still largely about people and that personal touch from the top is still what most clients really want. One such eye-catching success story has been that of Mphasis, where the impact of CEO Nitin Rakesh (read the interview here) has been nothing short of remarkable:


Bottom-Line: The success of the mid-caps was not in the script... new rules of services are being written

In the last few years, Capgemini acquired IGATE and Atos acquired Syntel. In both cases, the company being acquired was the leading mid-cap on the market, and both provided crucial resources for European-centric service providers lacking strong Indian delivery capability.  However, what has transpired since is the door opening for the next tranche to step up - notably LTI, Virtusa and Mphasis - all of whom have blown past $1 billion. While LTI and Mindtree are embroiled in a less-than-friendly merger and Luxoft has already been bolted into the DXC empire, it would be of little surprise if any of the successful ones in this list are snapped up in the coming months, as enterprises grapple with their needs for close attention to their creaking IT infrastructures and the dire need to develop agile capabilities, take better advantage of automation and AI tools... and find more sophisticated help to sort out their cloud messes.  And as the latest ones are picked off, it's simply time for the next wave to step into the void... firms like Zensar, NIIT and Hexaware are routinely discussed these days as strong providers in their own right (despite decades of heritage), and are also potentially attractive acquisition targets, provided the fit is right.  

These are the new rules of the services game... because the simple fact is that there are no rules and we're all writing new ones as the need for rapid, personalized IT salvation becomes more and more a critical part of the C-Suite agenda.

Quantum set to destroy blockchain by 2021
April 01, 2019 | Phil Fersht, Jamie Snowdon, Ollie O’Donoghue

For all you blockchain aficionados, you'd better get quantum-savvy asap, or you'll find yourself having to re-skill to do something relevant

This article will discuss some aspects of quantum computing, but - don't worry - we're not going to detail all of the different uses in one initial education. It won’t describe the inner workings of quantum, and we shall avoid using words like qubits as much as possible; we won’t mention quantum supremacy or the theory of quantum entanglement. If you want to know about these things, buy an undergraduate quantum physics textbook and then explore a decent quantum computing book like “Quantum Computing: A Gentle Introduction” by Eleanor Rieffel and Wolfgang Polak – which we are led to believe is only gentle to those with a good undergraduate understanding of maths and physics, although Physics Today described it as a masterpiece in a review.  But for you blockchain followers, we're sure you can quickly redefine your talktrack to wax lyrical about quantum for your next TED Talk.

The difference between quantum and traditional computing is at an eye-wateringly fundamental level, and it requires the knowledge we mention above to have a fighting chance of understanding what it is. But it is something every business leader needs to at least know about, even if it is just to be able to ignore it with confidence. This is because quantum computing is potentially a disruptor with as big an impact as digital computing. And it is not an exaggeration to say that it can be used to simulate the very fabric of the universe.

The development of a practical quantum computer could have dire consequences for traditional encryption

However, the question still remains: Is practical quantum computing still just a theory, or an impractical experiment with any stable use decades away? Or is it potentially just around the corner, poised to disrupt the very core of encryption technologies? The doubt is understandable, given the (not passing) resemblance to other over-hyped transformative technologies like nuclear fusion and room-temperature superconductors - all dreamt up in the golden age after the Second World War and without a tangible end-point, with the seemingly constant promise of a miraculous breakthrough in spite of massive investment. The comparison seems particularly relevant given that current quantum computers need superconductors, and the insane supercooling that currently goes with them, to operate - making them, to many, expensive, impractical flights of fancy fuelled by journalistic and research hyperbole.

So, with that said, is that all you need to know? Is your job just to laugh in the face of any minion who utters the phrase “maybe we should invest in some quantum”? Unfortunately, it is not that simple. The trouble is no one really knows the actual timeframe - even John Preskill, the Richard P. Feynman Professor of Theoretical Physics at Caltech, can’t give you a firm one, with predictions ranging from single to multiple decades and the current wave of “noisy” quantum experiments unlikely to have much practical use. However, this uncertainty needs to be weighed against a serious risk: the development of a practical, or at least partially practical, quantum computer could have dire consequences for traditional encryption.

The first algorithm set to run using a quantum computer could have seismic, rapid implications

Part of the excitement around the prospect of quantum computing is the first real application – the first algorithm set to run on a quantum computer could solve the integer factoring problem very quickly. This could be used to break existing methods of encryption like RSA and ECC rapidly. So any organization that uses encryption technology needs to understand that there is a potential weakness in current systems, which will need to be replaced or strengthened when practical quantum is available.
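For the technically curious, here is a minimal, purely illustrative Python sketch (our own, with toy key sizes and invented helper names) of why factoring matters so much: RSA's public modulus is the product of two secret primes, and anyone who can factor it can rebuild the private key. Shor's algorithm on a practical quantum computer would make that factoring step fast even at real-world key sizes.

```python
# Toy illustration (not real cryptography, Python 3.8+): RSA's security rests
# on the difficulty of factoring the public modulus n = p * q.

# A deliberately tiny "key pair" -- real RSA uses primes hundreds of digits long.
p, q = 61, 53
n = p * q                       # public modulus
phi = (p - 1) * (q - 1)         # Euler's totient, derived from the secret primes
e = 17                          # public exponent
d = pow(e, -1, phi)             # private exponent -- only computable if you know p and q

def factor(n):
    """Brute-force factoring -- feasible only for toy n. Shor's algorithm on a
    practical quantum computer would do this step efficiently even for
    2048-bit moduli, which is the whole point of the threat."""
    for candidate in range(2, int(n ** 0.5) + 1):
        if n % candidate == 0:
            return candidate, n // candidate

# An attacker who factors n recovers the private key without ever seeing it:
p2, q2 = factor(n)
d_recovered = pow(e, -1, (p2 - 1) * (q2 - 1))
assert d_recovered == d

message = 42
ciphertext = pow(message, e, n)
print(pow(ciphertext, d_recovered, n))   # prints 42 -- decrypted with the recovered key
```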

And recent experiments from Google and IBM have started to erode confidence in the long-term predictions and to bring the timeline forward from decades to years, with both firms' recent experiments suggesting that quantum is starting to conform to Moore's law. Which, if true, means we will have crypto-breaking quantum in 2 years rather than 20.

As quickly as 2021, HFS researchers believe we could see a quantum computer capable of breaking RSA encryption of 256 bits – which would have serious implications for blockchain, given this is the level of encryption currently used. According to HFS academy analyst Duncan Matthews-Moore, "If we don't get a handle on the potential speed of quantum soon, we could see the billions of dollars that have gone into blockchain become as quickly wasted as the vast sums Brexit is costing the UK economy."

Bottom Line – Quantum is the one to watch, particularly if you have any ambitions around blockchain.

Forget RPA, forget AI, forget cloud, forget disruptive mortgage processing - and especially forget blockchain.  Because if quantum can deliver real algos, everything tech that happened before is going to be disrupted like Betamax, like CB radio, like Sonic the Hedgehog.

And of course... this was an:

Read More »

No-Deal Brexit isn't just a British problem: This could wipe $15 trillion off global markets
January 19, 2019 | Phil Fersht, Jamie Snowdon, Ollie O’Donoghue

In 2008 Lehman Brothers nearly took down the global banking system... in 2017 Greece's debts were poised to destroy the European economy... today, we are staring at a stock market that gyrates up and down double-digit percentages in a single day, based on one awkward tariff tweet-up between Xi and Donald...

We're talking about the world's 5th largest economy going into immediate meltdown.  This is more than a UK-only debacle

So... who cares about the world's 5th largest economy potentially plummeting into a complete meltdown? Let's just have a good giggle at those idiotic British politicians hell-bent on destroying the country over a referendum staged 2.5 years ago on a topic no one actually understands.  Yeah, let's not worry, as they'll be screwed, and we can all make Brit-jokes at parties as those idiots run out of medical supplies and are forced to import frozen butterball turkeys pumped full of ractopamine and several other GMOs... yum.

Here's the bad news - Lehman and Greece are small-time when you consider the potential damage a complete Brexit failure will cause if - as is possible - the UK government paralyzes itself and lets its economy degenerate into a warzone of regulatory chaos, complete data disaster, supply chain meltdown and political purgatory.  While we have boldly - and positively - predicted (see earlier post) that Brexit won't actually happen, there is also the distinct possibility that Brexit and no-Brexit blindly meander into the nothingness of a "No-Deal" scenario.

We have predicted that - at the end of the day - politicians are surely not that selfish, and voters really aren't so stupid as to allow their country to descend into complete economic and social chaos... and madness.  But that's because we, at HFS, have assumed a modicum of intelligence does exist in the world. But, we could be sadly naïve.  However, there is some hope - and that hope is the simple fact that if we Brits commit the ultimate harakiri of a No-Deal Brexit, we take the rest of the global economy down with us.  You thought Lehman Bros was bad?  You've seen nothing yet, folks.

Why this could be a $15 trillion global decimation

If we look at similar shocks to the stock market over the last century, it takes relatively little to create a major downturn in global asset values. We don’t need to look too far back this decade to see how even a moderate dip in global stock markets can seriously impact the health of the economy.

If we look at the Asian financial crisis in 1997, for example, we can see just how quickly the collapse of even a relatively small economy can wipe a huge percentage off global stock values. If we look at the potential consequences for not only one of the world’s largest economies, but one tightly integrated with the global economy, it’s not hard to see how much of an impact this could have on the major stock exchanges. That’s not to mention the major role the UK currently plays in global finance – with some estimates suggesting that the City of London manages over $9 trillion in assets, three times the size of UK GDP.

In a no-deal scenario, almost overnight the UK will no longer be compliant with EU rules and regulations – of which the previously discussed GDPR is just one. There are countless other regulations that have formed part of the business environment of the United Kingdom, Europe and, by extension, the rest of the global economy – and their absence is likely to be felt during the real-time stress testing that a no-deal crash-out will bring.

We can simulate (with the same degree of absolute uncertainty characteristic of the Brexit process) a major tumble in global stock prices by examining how previous shocks have impacted the market. And it’s worth noting that our estimates are generally very conservative compared to other financial crises over the past century.

In the following illustration, we can see how some significant impacts to the value of stock markets could play out – particularly in areas most likely to be impacted by Brexit. In this simulation, the value of the twenty largest stock markets drops by $14.9 trillion as a result of the major market shock of a no-deal Brexit.
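To show the mechanics (and only the mechanics) of such a simulation, here is a minimal sketch: assume a shock percentage for each major exchange, apply it to that exchange's market capitalisation, and sum the losses. Every number in it is a placeholder we have invented for illustration – these are not the inputs behind the $14.9 trillion estimate.

```python
# Illustrative only: apply an assumed shock to each market's capitalisation
# and sum the hit. The caps and shock percentages below are placeholders,
# NOT the inputs behind the $14.9 trillion figure in the chart.
markets = {
    # name: (market cap in $tn, assumed shock as a fraction)
    "NYSE":            (30.0, 0.10),
    "NASDAQ":          (11.0, 0.10),
    "LSE":             ( 4.0, 0.25),   # UK markets assumed hardest hit
    "Euronext":        ( 4.5, 0.15),
    "Deutsche Boerse": ( 2.0, 0.15),
    # ... the remaining exchanges in the top twenty would follow the same pattern
}

total_loss = sum(cap * shock for cap, shock in markets.values())
print(f"Simulated loss across the listed markets: ${total_loss:.1f}tn")
```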

 


Bottom Line: A no-deal Brexit has far-reaching consequences, and could knock chunks of value off global stock markets and send us crashing into a serious economic depression

The warnings about the implications of no longer being compliant with GDPR are chicken-feed compared to the true global impact of allowing Britain to hive itself off from the EU with no insulation from the multiple disastrous consequences. In the past, major financial crises have been caused simply by a much smaller and less integrated economy defaulting on its debts; now we’re facing the very real prospect that one of the world’s largest economies will wake up one morning with a completely different rule book, and much more red tape and bureaucracy between it and the rest of the world. It’s not hyperbolic to say the consequences for the global economy could be huge.

In a sick way, maybe this No-Deal scenario is what we all deserve - something to open the eyes of the politicians and gullible voters of the world who have lost their grip on reality.  Maybe a period of poverty and hardship will knock us into shape to prepare for the next chapter of economic and political life.  

Ugh - we seriously hope it doesn't take a crisis of these immense proportions for everyone to wake up to the world we are shaping, where facts are merely tools to shape opinions and this sense of entitlement that so many people possess is threatening to destroy everything we've worked so hard to create.

There never was a "Brexit deal".  Brexit was all about pissed-off working-class people (mainly older folks) sticking it to the rich and to "foreign" people they saw 'stealing' jobs (which they were never going to do themselves in any case).  So the only "Brexit" these people wanted was to ruin the economy for the wealthy British middle class and to stop immigrants coming into the country (and kick out the existing ones too).  This is why the situation is such a mess.  The real motives behind Brexit are not the ones being discussed in Parliament or in Brussels.  It's a mess and needs to be somehow reset so the real debate can take place.  Otherwise this never ends.  

We all agree at HFS that change can be good, and we must embrace change... but change to what?  That is the issue right now - what is wrong with the current system, and what is the ideal system we need to move to...  and it's not only the UK grappling with this problem...

Accenture, IBM, Capgemini and Wipro lead the first application services Top 10
January 14, 2019 | Phil Fersht, Jamie Snowdon, Ollie O’Donoghue

So it's now 2019, and HFS' Ollie O'Donoghue and Jamie Snowdon waste no time putting the feisty new Top 10 methodology to work, taking no prisoners as they rank how the leading application development and management service providers performed...

The market continues to test and experiment with new frameworks and methodologies. The most notable are DevOps and agile, which are now widely adopted by many of the major IT service providers. Providers are implementing sweeping training and culture redevelopment programs to adopt best practices to support innovation and delivery in the application services space.

Now more than ever, enterprises are looking for providers to help them rationalize and optimize their technology stack, of which business applications is a significant component. In their drive toward the Digital OneOffice, forward-thinking enterprises are engaging with providers that can build innovative solutions that can integrate and unite business applications and in the process break down business siloes.

Given the importance of technology and business applications, enterprises are looking for collaborative partners that are invested in their success. As a result, we’re seeing an increasing reliance on existing relationships to deliver on fresh engagements.

Service providers are also working tirelessly to ensure they are making the most of their talent—driving training and retraining programs to help keep employees’ skillsets up to speed in a changing market.

So let's see how the leading ten service providers shake out, based on interviews with 300 enterprise clients of IT services from the Global 2000, in which we asked specific questions pertaining to the innovation and execution performance of the service providers assessed. The research is augmented with information collected in Q1 and Q2 2018 through provider RFIs, structured briefings, client reference interviews, and publicly available information sources:

Click to view full 22 service provider assessment

 

Key Research Highlights 

Developing talent. Providers are working hard to develop talent internally through retraining programs and bring in the right people by building out innovative talent attraction processes.

Building out partnerships. Providers are developing broader and deeper partnerships to support the increased demand from enterprises for a diverse and complex ecosystem.

Blurring service lines. Traditional service lines, particularly infrastructure and applications, are coming under more pressure as enterprises show less willingness to differentiate between siloes when designing an engagement.

Investment in capability.  Many providers are building out their capability through acquisition of innovative start-ups and boutiques, as well as some major investments in the acquisition or merging of major providers and ISVs already operating in the space.

Q&A with Report Author, Ollie O'Donoghue

"Are the partners who got us here the ones to take us to the next place?"

This is always a tough question to answer, particularly in the application services space where the scope of projects is getting larger and encompassing far more technologies. To thrive in this market there is no perfect route – we see firms like IBM bolster capabilities through acquisition (RedHat being the largest), while firms such as DXC and Accenture pull in capability through partnerships, and the major IT outsourcers try to build up skills and talent organically. At its core, this is to meet the needs of an evolving buyer community that expects the best solutions from a complex array of technologies and practices.

So, what we’re seeing is a large section of the provider community fighting to stay relevant in a rapidly changing market. Honestly, we can expect to see some casualties; there’s just too much to specialise in for some providers to keep pace with, and many are spread too thin to become real specialists. The future in this space belongs to those who can keep layering valuable interfaces onto a growing technology stack that includes advanced automation capabilities. For some, this will be through becoming a jack-of-all-trades, and for others, it will be through unique specialisms – all those in between are vulnerable.

Which of this bunch are going to break out of the pack, based on your recent conversations?

As we’ve mentioned, there's a lot of movement across the leading service providers – but there are four or five that have a lot more going on than many of the others. Let’s start with IBM, which already has scale and differentiation in the space, but has jumped ahead of the pack in open source through the mammoth acquisition of RedHat. We also have Accenture, which continues to be synonymous with innovation and bringing high-quality solutions to clients. The firm has also plugged more digital design and apps agencies into its service lines in recent years, adding more brains and brawn to the rapidly growing market.

It’s also worth highlighting Wipro, which has a strengthening reputation in the application services market – reinforced by the firm’s big bets in digital. This part of the IT services market has always been the core of Wipro’s business, so the firm is able to pull in experience and skills that other firms still need time to develop. We also have Infosys which, with fresh leadership, has started to take the services game seriously again. The firm has done a lot of work to retrain talent and redevelop its strategy. Jumping on the developing push for onshore and nearshore, Infosys is also building out delivery centres, particularly in the US, with plans for more work in Europe. Finally, Capgemini and TCS are gaining ground – the former by capturing more mindshare in Europe for its IT services heft and expertise (a potential gold mine as businesses grapple with geopolitical pressures and look to local technology experts to help them), and the latter by pushing a fresh narrative on the need for technology in the modern enterprise through its Business 4.0 thought leadership.

As a last note, HCL presents somewhat of a quandary to us since its purchase of IBM assets. It’s difficult to see the acquisition of somewhat legacy assets as a route to breaking out of the pack, but the reality is this could be a platform onto a broader customer base for HCL. All in all, though, we’re reserving judgement until the firm has a clearer strategy for the assets.

Are there any niche firms popping up who can disrupt this space?

It’s a tough market for smaller firms to play in, but for specialists who can corner the market or disrupt business models, there’s plenty of room for manoeuvre. This is the first major IT services analysis where we’ve included some of the mid-tier players, where a lot of the innovation is taking place – simply because these firms have to try much harder to fend off the majors, whether that’s through the flexibility and agility of Mphasis or the vertical specialism of LTI.

There are even smaller players starting to challenge in the space – nClouds, an HFS Hot Vendor, is an excellent example of a small firm with a compelling track record in the market, particularly when helping enterprises shift applications and services to the cloud. There’s a vast amount of space opening up for players in the ‘small and cool’ category – the acquisition of RedHat leaves behind a massive gap in independent open source, and there is a large portion of the community disillusioned by the acquisition that could be a huge boon to the right company. And with several mid-tier players hoovered up by the majors – notably Syntel and Luxoft – there are gaps in the market waiting to be filled by agile firms.

So, Ollie, which emerging apps services firms are worth keeping an eye out for?

nClouds – In many ways, nClouds is the definition of a company thriving from the increasing blend of application and infrastructure. The firm leverages practices and technologies such as DevOps, Containerization, and public cloud to help clients evolve their technology stack. We were so impressed by client feedback from this firm that they made their way into the first HFS Hot Vendors at the start of 2018.

Trianz – While not necessarily a niche player, Trianz has proven itself more than capable of taking on much larger firms to win deals. The firm has a broad range of services, but its edge seems to be the agility and flexibility it can bring to engagements. The firm has won multiple awards and seems to be benefiting from increased enterprise appetite to diversify engagements amongst many small players, rather than one giant one.

Linium (acquired by Ness Digital Engineering) – For specialisation, we need look no further than Linium, which has worked tirelessly to carve out chunks of the enterprise service management space. The firm has dedicated practices for core business platforms such as ServiceNow, as well as capabilities in custom application development. The firm was acquired by Ness Digital Engineering in 2018 – bringing with it broader capabilities and access to talent, as well as access to a broader pool of clients.

GAVS Tech – When we covered GAVS Tech in our Q3 Hot Vendors, we concentrated on its zero-incident framework, an approach to reducing the impact of IT issues on end-users. But the firm has applied the mantra across other service lines in the space, including a pay-as-you-go DevOps model that focuses on deploying reliable application code and resources. The DevOps platform provides an integrated solution for application development, testing, deployment, scaling and monitoring – not only offering improved speed and quality, but also a degree of simplicity in a complex technology environment.

Bottom-Line: Increasingly scarce talent, combined with a never-ending demand, places real pressure on service providers to keep innovating their delivery models

Simply put, the modern application services market is now so complex it’s not possible to be an expert in everything. Providers are beginning to recognize this and continue to bring in partners to support their delivery capabilities while retraining staff to move them into higher value work.

At the center of this changing market lies a huge question mark around talent. Enterprises are telling us that there are major talent crunches in key areas of the market and for some applications, which is forcing them to push more work over to providers. The challenge is that many of these providers are facing similar constraints. All of the IT services providers assessed in this research have extensive retention and retraining programs in place to ensure they get the most out of their teams. They’re also partnering up with major sources of talent, particularly higher education institutions.

Nevertheless, the market is showing no signs of slowing down to allow providers any breathing space. Enterprise applications are now a major focus area for CIOs and technology leaders to get right. They need help writing off legacy, making sense of extensive technology estates, and finding areas of opportunity for new services and solutions.

Premium HFS subscribers can click here to download their copy of HFS Top 10 Application Development and Management Services 2018

Forget Brexit, RPA could wipe $820m a year of costs from the NHS with a common model across its 207 trusts
December 15, 2018 | Phil Fersht, Jamie Snowdon, Ollie O’Donoghue

What I love about RPA is that it most often has the highest impact where there is a serious amount of IT failure, disorganization and overworked staff. Yes, that's a lot of organizations to consider, and it's one of the reasons why it's hard to find this technology sexy - it's built to fix the murky, dysfunctional stuff that has been squirreled away for decades, buried deep beneath failing ERP projects and conveniently ignored by senior executives who have few political points to score by acknowledging they should actually focus on fixing their broken underbellies.

This has been the failure of operations leaders for decades - simply focusing on layering more garbage over the top, when the real way to fix their inherent problems of dysfunction is to dig deep beneath their navels and address their broken process chains, and - heaven forbid - actually start to do something differently.

And there is no ground more fertile than the hallowed turf of the British National Health Service (NHS), the world's sixth-largest employer with 1.7m staff, where decades of hollow political rhetoric, obscene wastage on 'big-bang IT transformations' and big-ticket consultants on the gravy train have left an institution bravely held together by a woefully understaffed administration that ends up spending on contract agencies just to keep the wheels turning. Let's face facts: the UK National Health Service makes the basket-case that is Obamacare resemble a slick, well-oiled machine.

Enter RPA: a tool that is reducing GP referral processing time by 75%

But there is renewed hope - and this hope can quite easily become reality if you entertain the idea of using RPA to unify document submissions and scrape data from legacy desktops to speed up GP referral times.  And the real value to be gained here is if the NHS can adopt a common enterprise-wide strategy to deploy a common RPA-as-a-service toolset and methodology across its 207 individual trusts.  It's so simple, I can describe it on the back of an envelope:

Even in these tough times for the institution, many of its leaders are looking optimistically at the opportunities provided by new technologies that can be customized to solve business inefficiencies without the massive complexities of entire system upheavals. One particular example provides insights into how one NHS trust is actively addressing some of these issues - saving the NHS money directly, easing pressure on administrative staff and providing a better, more consistent service for patients being referred to hospitals. All of these endeavors are in line with the broader objective of ensuring that the NHS meets some overriding objectives to digitize services.

The starting point for this work began at the East Suffolk and North Essex Foundation Trust (ESNEFT). The organization faced many of the same pressures discussed above and, like all healthcare services within the UK, was directed to enable all GP referrals to be processed via the Electronic Referral Service (eRS) by October 2018. However, the existing system for processing electronic referrals was based on manual processes and was slow - a common challenge.

Essentially, once the GP had made a referral to the Trust, the support staff had to find information such as scans, blood tests, and other results, which needed to be manually downloaded and appended to the file. In a process which may seem bizarre to many enterprises, this often meant admin staff were required to print off material and then scan it back into the same computer (using the same printer and scanner) to create a PDF file, simply to navigate bottlenecks between unintegrated systems. The PDF document was then uploaded to the administration system. This process took around 20 minutes for each referral and created what the Trust described as an avalanche of admin, distracting medical secretaries from their primary task of supporting patients and consultants.

ESNEFT had already started a pilot scheme to automate some accounts payable processes with the RPA provider Thoughtonomy, which was showing a great deal of promise. So the Trust decided to use the system to automate the referral process across five clinical specialties, using “Virtual Workers” (Blue Prism bots), which actively monitor incoming referrals from GP patient appointments in real time, 24 hours a day. Once triggered, the Virtual Worker extracts the reason for referral, referral data, and supporting clinical information and merges the information into a single PDF document. This combined document is then uploaded into the Trust’s administrative systems. The RPA system uses virtual smart card technology for authentication, providing the same level of data security assurance as the old manual process. Overall, the complete task now takes less than five minutes. The Virtual Workforce is able to update all systems instantaneously and extract critical information, which it passes on to the lead consultant for review and grading.
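The production deployment lives in the vendor's own tooling, but the shape of the workflow is easy to sketch. Below is our own hypothetical Python rendering of the steps described above - gather the supporting information, merge it into a single document, upload it and hand it on for grading - with every function and data structure invented for illustration; none of it is Thoughtonomy or Blue Prism code.

```python
# Hypothetical sketch of the automated referral workflow described above.
# Each function is a stand-in for a step the "Virtual Worker" performs inside
# the Trust's own systems; nothing here is vendor code.
from dataclasses import dataclass, field

@dataclass
class Referral:
    referral_id: str
    reason: str
    supporting_docs: list = field(default_factory=list)  # scans, blood tests, results

def merge_to_single_pdf(sections):
    # Stand-in for the PDF merge the bot performs instead of print-and-rescan.
    return "\n---\n".join(sections)

def upload_to_admin_system(document, referral_id):
    # Stand-in for the authenticated upload (virtual smart card in the real deployment).
    print(f"Uploaded combined document for referral {referral_id} ({len(document)} characters)")

def process_referral(referral):
    """The ~20-minute manual task, compressed into one pass: gather, merge, upload."""
    combined = merge_to_single_pdf([referral.reason] + referral.supporting_docs)
    upload_to_admin_system(combined, referral.referral_id)

# The Virtual Worker monitors the e-Referral Service queue around the clock;
# here a small list stands in for that queue.
incoming = [Referral("R-001", "Suspected cardiac arrhythmia", ["ECG result", "Blood panel"])]
for referral in incoming:
    process_referral(referral)
```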

One of the most important aspects of this technology is its ability to work within the current system, regardless of how chaotic and unstructured that may be. It is technology that adapts to the real world and the way people actually behave and work rather than expecting people to miraculously change current tropes and behaviours. This is perhaps the single most important reason RPA works: it provides whatever shaped peg is required, no matter the hole.

RPA negates the need to spend vast amounts on many complex technology integration projects

This first stage has delivered significant cost savings - estimated to be $275,000 in the first year - without removing staff. Crucially, the $275K saving achieved is made up of agency staff and sundry costs such as printing. ESNEFT believes that 500 hours of time were saved thanks to the solution. Plus, it increased the job satisfaction of the admin staff, who could concentrate on more important aspects of their role.

For us, although the top-line cost saving number is important, it’s the fact that a technology solution proof of concept has been deployed successfully (and relatively painlessly) within the NHS that matters. To deliver the outcome required, there was no need to drive an enormous transformation project to align and integrate systems - which, given the lack of appetite for big-bang projects in the NHS, is an achievement in itself. Simply put, the way the technology is used can be fitted into the existing chaos - it’s technology for the real world. It can provide a bottom-up route to productivity improvements, replacing parts of existing workflows and automating manual and repetitive tasks. It accomplishes these things with the double whammy of removing tasks which are disliked and genuinely improving outcomes for patients, whilst helping to drive efficiency.

Bottom line: The NHS is not alone in facing an unforgivingly complex estate, but with technologies that fit into the chaos of the modern organization, this is only the start 

If we look more broadly at the impact RPA technology could have on the NHS, we can use a simple calculation to estimate the ramifications. We know that savings of $275K have been made on 2,000 GP referrals per week. But the figure for NHS England as a whole puts GP referrals at 3.5m from April 2018 to June 2018. So, if this were scaled up, we could see savings across NHS England purely for GP referrals at a staggering $38m; if this included all hospital referrals, the figure rises to almost $63m, or around $1.3m per week. To put this in context, this would equate to almost 850 nurses for the GP referrals, or almost 1,400 for all referrals in England (using the average cost of $45,000 per annum for a mid-tier nurse, source: Nuffield Trust).

This is the tip of the iceberg, considering that more than 520M working hours are currently spent on admin and approximately $3.3 billion was spent on agency staff across the NHS as a whole during 2016. There is a great deal of savings to be had. Even if only a quarter of the agency spend is non-medical, that is $820M per year that could be freed up, with only positive impacts on patient outcomes. 
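For anyone who wants to sanity-check the back-of-the-envelope sums above, here is a quick sketch that simply re-runs the figures quoted in the two preceding paragraphs (the only assumption we add is treating the April-June quarter as 13 weeks):

```python
# Back-of-the-envelope reproduction using only figures quoted in the text above.
pilot_annual_saving = 275_000          # $ saved at ESNEFT on ~2,000 GP referrals/week
pilot_weekly_referrals = 2_000

england_quarterly_gp_referrals = 3_500_000         # April-June 2018, NHS England
england_weekly_gp_referrals = england_quarterly_gp_referrals / 13  # assumes a 13-week quarter

scaling = england_weekly_gp_referrals / pilot_weekly_referrals
gp_saving = pilot_annual_saving * scaling
print(f"GP referrals only: ~${gp_saving / 1e6:.0f}m a year")       # lands near the ~$38m quoted

# Nurse equivalents, using the article's rounded figures and the $45K nurse cost
gp_quoted, all_quoted = 38_000_000, 63_000_000
nurse_cost = 45_000
print(f"Nurse equivalents: {gp_quoted / nurse_cost:.0f} (GP referrals), "
      f"{all_quoted / nurse_cost:.0f} (all referrals)")            # ~850 and ~1,400

agency_spend_2016 = 3_300_000_000      # NHS agency staff spend, 2016
print(f"A quarter of agency spend: ~${agency_spend_2016 * 0.25 / 1e6:.0f}m a year")  # the ~$820m headline
```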

RPA will reach $2.3bn next year and $4.3bn by 2022... as we revise our forecast upwards
November 30, 2018 | Phil Fersht, Jamie Snowdon

Well... it’s been quite a 2018 in the fantasy world of RPA (RPA plus RDA), where some of the fantasy dollars have magically become real, as the market hit $1.7bn – an increase of $250m on our forecast last year.  So when the more conservative of forecasters (HFS) undershoots the market by 17%, you know RPA has been sneaking down the growth hormones of late.

So why is RPA growing above initial analyst estimates?

  • RPA vendors, in particular UiPath and Automation Anywhere (AA), have been able to recognize more revenues than expected. Bot licenses are being sold and deployed faster than we envisaged, due to effective training programs and aggressive support from third-party services firms;
  • The slowdown in new business process outsourcing engagements is driving more focus from enterprises on discrete strategies to drive efficiencies and digitize processes (and encourage more bots-plus-humans engagements);
  • The shift in the focus of RPA from job elimination to augmenting talent, digitizing processes and extending the life of legacy IT systems has increased the appetite of operations executives to fast-track RPA training programs and invest in broader intelligent automation strategies – even though most enterprises are still in the “tinkering phase”;
  • The initial adoption of "attended RPA", which makes up the majority of RPA and RDA engagements currently in play, will eventually drive more "unattended RPA", where the increased value will be created – with genuine alignment between RPA models proving to be a gateway to broader AI engagements;
  • The ramp up from service providers and consultants to support enterprise adoption has continued unabated, especially with the flattening of outsourcing investments and the waning interest in Global Business Services models. This reliance on third parties has proven to be a key dynamic behind the growth in RPA as solution providers prefer to sell through the services channel for larger enterprise deals and accelerate client training and development. The strong focus from the likes of Accenture, Capgemini, Deloitte, EY and KPMG has given the RPA market immense credibility;
  • Rapid funding of RPA vendors (in addition to rapid revenue growth) has encouraged longer-term investments from many enterprises previously skeptical of investing in very small software boutiques. The largest examples have been AA and UiPath, attaining capital investment rounds as high as $250m-$300m, but also some lesser-known niche RPA tools firms, such as Softomotive, which recently announced a $25m investment round;
  • Increased focus from major ERP / orchestration software vendors, with Pega’s acquisition of Openspan and SAP’s first foray into RPA by adding Contextor.


 

RPA Definition: 

RPA describes a software development toolkit that allows non-engineers to quickly create software robots (known commonly as "bots") to automate rules-driven business processes. Example use-case: automating invoice processing across multiple business applications, handling rule-based exceptions. RPA is different from traditional automation software as it is inherently capable of recognizing and adapting to deviations in data, or exceptions, when confronted by large volumes of data. In effect, it can be intelligently trained to analyze large amounts of data from software processes and translate them into triggers for new actions, responses, and communication with other systems. At its core, an RPA system imitates human interventions to interact with internal IT systems. It is a non-invasive application that requires minimal integration with the existing IT setup, delivering productivity by replacing human effort to complete the task. Any company which has labor-intensive processes, where people are performing high-volume, highly transactional process functions, will boost its capabilities and save money and time with robotic process automation.  Much of RPA is self-triggered (bots pass tasks to humans), but it requires human intervention for judgment-intensive tasks and robust human governance, and to make changes and improvements.

Similarly, RPA offers a real advantage to companies which operate with very few people or a shortage of labor. Both situations offer a welcome opportunity to save on cost as well as to streamline resource allocation by deploying automation. The direct services market includes implementation and consulting services focused on building RPA capabilities within an organization. It does not include wider operational services like BPO, which may include RPA becoming increasingly embedded in their delivery.
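To make the definition concrete, here is a minimal sketch of the kind of rules-driven logic an RPA tool lets non-engineers assemble visually - in this case routing invoices and handing rule-based exceptions back to a human. It illustrates the pattern only; the thresholds, field names and rules are all invented, and no particular vendor's product works exactly this way.

```python
# Minimal illustration of a rules-driven "bot": post straightforward invoices
# automatically and hand rule-based exceptions back to a human queue.
# All thresholds and field names below are invented for illustration.

AUTO_APPROVE_LIMIT = 10_000   # hypothetical business rule

def process_invoice(invoice):
    # Rule 1: reject anything missing a purchase order reference.
    if not invoice.get("po_number"):
        return "exception: missing PO -- routed to human queue"
    # Rule 2: amounts above the limit always need human judgment.
    if invoice["amount"] > AUTO_APPROVE_LIMIT:
        return "exception: over approval limit -- routed to human queue"
    # Otherwise the bot posts it straight into the finance system.
    return f"auto-posted invoice {invoice['id']} for ${invoice['amount']:,}"

invoices = [
    {"id": "INV-001", "amount": 4_250, "po_number": "PO-77"},
    {"id": "INV-002", "amount": 18_900, "po_number": "PO-78"},
    {"id": "INV-003", "amount": 950, "po_number": None},
]
for inv in invoices:
    print(process_invoice(inv))
```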

RDA Definition:

In addition to RPA, the other software toolset which comprises the emerging enterprise robotics software market is termed RDA (Robotic Desktop Automation).  Together with RPA, RDA will help drive the market for enterprise robotic software towards $2.3bn in software and services expenditure in 2019 (with close to three-quarters tied to the services element of strategy, design, transformation and implementation of enterprise robotics).  HfS' new estimates are for the total enterprise robotics software and services market to surpass $4.3 billion by 2022, at a compound annual growth rate of 40%.

Example use-case: automating the transfer of data from one system to another. RDA is essentially surface automation, where desktop screens (whether desktop-based, web-based or cloud-based) are "scraped", scripted and re-programmed to automate the movement of data across systems.  A well-designed RDA solution can automate workflows on several levels, specifically the application, storage, OS and network layers. Workflow automation on these layers requires equally specific technologies but provides advantages of efficiency, reliability, performance and responsiveness. Much of this automation needs to be attended, as it is triggered by humans (humans pass tasks to bots) and data inputs are not always predictable or uniform, but adoption of smart machine learning techniques can reduce the amount of human attendance over time and improve the intelligence of these automated processes.  Similarly to RPA, RDA requires human intervention for judgment-intensive tasks and robust human governance, and to make changes and improvements.
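As a rough illustration of what "surface automation" means in practice, the sketch below copies fields off one (simulated) screen and keys them into another, the way an attended RDA script would on a real desktop. The two dictionaries are stand-ins we have invented so the example runs anywhere; a real deployment would drive live application windows.

```python
# Crude illustration of attended surface automation: read fields off one
# "screen" and type them into another. Real RDA tools drive live application
# windows; these dictionaries are stand-ins so the sketch runs anywhere.

source_screen = {"customer_id": "C-1042", "order_total": "£312.40"}   # e.g. a legacy desktop app
target_screen = {}                                                    # e.g. a web form

def scrape_field(screen, field):
    # A real script would locate the field on screen (by coordinates, label or OCR).
    return screen[field]

def type_into(screen, field, value):
    # A real script would send keystrokes into the target application.
    screen[field] = value

# Triggered by a human ("attended" automation): copy the fields across.
for field in ("customer_id", "order_total"):
    type_into(target_screen, field, scrape_field(source_screen, field))

print(target_screen)   # {'customer_id': 'C-1042', 'order_total': '£312.40'}
```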

The Bottom-Line: Automation and AI have a significant part to play in engineering a touchless and intelligent OneOffice

However which way we spin "digital", the name of the game is enterprises responding to customer needs as and when they occur, and these customers increasingly want to interact with companies without physical interaction.  Moreover, the onus is moving to the most successful digital enterprises being able to anticipate the needs of their customers even before they occur, by accessing data outside of the enterprise - across the supply chain, or economic and market data that can help predict changes in the market, or emerging offerings that customers will want to purchase.

This means manual interventions must be eliminated, data sets converged, and process chains broadened and digitized to cater for the customer.  Hence, entire supply chains need to be designed to meet these outcomes and engage all the stakeholders to service customers seamlessly and effectively.  There is no silver bullet to achieve this, but there is emerging technology available to design processes faster, cheaper and smarter with desired outcomes in mind.  The concept was pretty much the same with business process reengineering two-plus decades ago, but the difference today is that we have emerging tech available to do the real data engineering that is necessary - though if firms rest on their laurels, any market dominance will be short-lived.  Once the digital baseline is created, enterprises need to create more intelligent bots to perform more sophisticated tasks than repetitive data and process loops. Basic digital is about responding to clients as those needs occur, while true OneOffice is where enterprises anticipate customer needs before they happen (see below).  This means having unattended and attended interactions with data sources both inside and outside of the enterprise, such as macroeconomic data, compliance issues, competitive intel, geopolitical issues, supply chain issues, etc.  


In short, every siloed dataset restricts the analytical insight that makes process owners strategic contributors to the business. You can’t create value - or transform a business operation - without converged, real-time data. Digitally-driven organizations must create a Digital Underbelly to support the front office by automating manual processes, digitizing manual documents to create converged datasets, and embracing the cloud in a way that enables genuine scalability and security for a digital organization. Organizations simply cannot be effective with a digital strategy without automating processes intelligently - forget all the hype around robotics and jobs going away, this is about making processes run digitally so smart organizations can grow their digital businesses and create new work and opportunities. This is where RPA and RDA add most value today... however, as more processes become digitized, the more value we can glean from cognitive applications that feed off data patterns to help orchestrate more intelligent, broader process chains that link the front to the back office.  In our view, as these solutions mature, we'll see a real convergence of analytics, RPA and cognitive solutions as intelligent data orchestration becomes the true lifeblood - and currency - of organizations. 

Do take some time to read the HfS Trifecta to understand the real enmeshing of automation, analytics and AI.

Brexit will rip out the underbelly from the British economy - and we'll likely never recover
November 07, 2018 | Phil Fersht, Jamie Snowdon, Ollie O’Donoghue


However which way you analyze the economic indicators, and whatever your opinion may be regarding Britain's relationship with the European Union (EU), removing the movement of EU labour into the UK will create a perilous shortage of labour, particularly for low-to-mid skilled professions.  If anything, removing the worker base at the lower end of the skills spectrum is worse than at the high end, for the simple reason that it's much harder to entice people into jobs that may be low-paid, unattractive and - in many cases - require hard graft for low wages.  How are our hotels and restaurants going to find 120,000 staff willing to work for the minimum wage; our food factories to renew a third of their workforces to prepare our food; our cleaning firms to backfill 132,000 people willing to mop and scrub for a living? The answer is sadly obvious - many of our industries will be under real threat of implosion because they simply cannot access the people they need to keep them functioning.

And without a thriving working class, the economy will suffer due to less money being spent, our businesses will suffer because of rising hotel costs, our entire society will suffer because of rising food costs, and our commercial and domestic real estate markets will struggle to complete projects.  While professions like education and hi-tech can source talent from elsewhere (and are less reliant on EU people imports), it's those industries that form the underbelly of the economy which will really suffer.  Forget "trickle-down economics" - Brexit will cause a "trickle-up" effect that will be hazardous for the British economy and its mid-to-long-term sustainability.  In the short term, many EU workers in the UK should be able to stay on, but the reliable conveyor belt of workers prepared to roll their sleeves up and support our entire economic underbelly will be permanently halted, and the availability of workers will get progressively worse - and much more expensive - with this shrinking supply of people.

So, without further ado, let's dive into the fuller implications of this seemingly masochistic self-flagellation known as "Brexit"... 

Nice try Theresa, but even your dancing can’t make us forget about the increasingly no-win Brexit scenario 

For our fellow Britons, these past few weeks have been a refreshing break from the normal Brexit debates as we became distracted instead by our premier literally dancing for trade agreements. Trade agreements that, even for the most dismally poor mathematicians, don’t stack up when compared to the one we’ll soon be leaving.

 

Brexit has been a topic of heated debate for years now - and I'm sure we all have that friend or relative you daren't mention Brexit in front of, or risk a lecture based on unfounded inferences and sketchy sources. In many ways, it's these long-winded and often inebriated debates that are the problem - we're close to the day we sever ties with Europe and reclaim some sort of democratic freedom that only a nation with several unelected heads of state can find any ironic sense in. And yet we're no closer to understanding what Brexit means - even if we had a clear picture of how awful it will be, at least that's something we could prepare for, instead of this mind-numbingly irritating narrative from British politicians of 'Don't worry, it'll all work out in the end.' Well, unfortunately, we're not an eight-year-old child looking for reassurance from our

Read More »

IBM / RedHat: A grand play at out-sharing Microsoft’s open source economy
October 29, 2018 | Phil Fersht, Jamie Snowdon, Ollie O’Donoghue

It’s really not about the cloud – not at this massive $34bn price tag.  IBM's ingestion of RedHat, the third largest IT purchase in history, is all about Open Source.

Commentators are already pitching this deal as long-awaited reinforcements to the trench-warfare of the cloud wars. But in reality, we need to look much deeper to understand what persuaded IBM to part with such an exorbitant sum of money for Open Source giant RedHat.

Did we read that right? $34bn? – And what will happen to renegade RedHat?

Even for budding venture capitalists, the princely sum of $34bn is more than enough to make your eyes water – especially when it’s hurled at a firm with annual revenues of just $2.9bn and a headcount that will be just a drop in the Big Blue Ocean. So there must be more to IBM’s thinking than a quick financial return – it’s either a play to kick the other hyperscale players out of the game, or a push to get the upper hand in the increasingly valuable Open Source sharing economy.

If we dig into the financials, it’s clear that RedHat is a profitable firm with a strong track-record in the space – describing itself as the leader of Open Source capability. In many ways RedHat

Read More »