Simplify Blockchain by Refusing to Let Interoperability Issues Bog You Down


We’ve previously written about how interoperability will hold back blockchain adoption, at least until we find ways around the problem: the cost and friction of joining multiple blockchains may hinder widespread adoption until we can get them to talk to each other and reduce the cost of joining a blockchain implementation. However, recent thinking suggests there are some shortcuts we can take to make better use of blockchains in the short term, as their development and adoption mature.

 

For example, recently I met with the Deloitte blockchain team, and Principal Eric Piscini disagreed with my premise. He believes that interoperability really isn’t that big of an issue. First, he points out that, today, we have multiple environments that don’t connect to each other and the work still happens effectively. For example, different credit card payment vendors each have unique systems but everyone can still use any of them without an issue.

He also notes that interoperability seems like a bigger issue if you look at the blockchain implementation as needing to do every part of a transaction. However, he thinks of blockchain as having three layers:

  • Recording (actual transcribing of data into a block)
  • Transacting (an activity or transfer, such as moving money from one participant to another)
  • Business logic (the rules and controls of a process coded into the system)

You don’t have to do all three things in blockchain. You can use it for any one of the three, or some combination. As a result, you start to see how it’s possible to use blockchain technology without necessarily having to worry about interoperability. It’s not dissimilar to evaluating automation technology, where you will simply fail if you try to automate everywhere possible – you’d run out of time, money and patience trying. Most experts will tell you to first focus on what not to automate, and the same holds for blockchain: first figure out where you can carry on just fine without all the expense and disruption of a blockchain implementation.
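To make the layering idea concrete, here is a minimal, hypothetical Python sketch of a “recording-only” use of blockchain-style hashing: the payment itself (transacting) and the approval rules (business logic) are assumed to live in existing systems, and only an immutable, hash-linked record of the event is appended to a log. The structure and names are illustrative, not tied to any particular blockchain platform.

```python
# Illustrative sketch only: a toy "recording-only" ledger in plain Python.
# Transacting and business logic are assumed to happen in existing systems;
# this code only records evidence of events in a hash-linked list.
import hashlib
import json
import time


def record_block(chain, payload):
    """Append a hash-linked record of an off-chain event to a local chain."""
    previous_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {
        "index": len(chain),
        "timestamp": time.time(),
        "payload": payload,          # e.g. a reference to a payment settled elsewhere
        "previous_hash": previous_hash,
    }
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    chain.append(block)
    return block


ledger = []
# The payments were executed in an existing system; we only record that they happened.
record_block(ledger, {"payment_ref": "INV-1001", "amount": 250.00})
record_block(ledger, {"payment_ref": "INV-1002", "amount": 75.50})
print(len(ledger), ledger[-1]["hash"][:12])
```

The point is the separation of concerns, not the implementation: a real deployment would use an actual distributed ledger, but the transacting and business logic layers could still remain entirely in today’s systems, which is why interoperability need not block every use case.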

Piscini also believes that, in some instances, firms do not need interoperability so much as a single blockchain per asset class, because it will be nearly impossible to transfer the same value across multiple blockchains.

So, where does this leave us with our interoperability decisions?

1) Blockchain interoperability needs both a technology reason and a business reason to exist. We need to separate the technology of blockchain from the business application of blockchain and from the business model of blockchain-based systems. From a technology perspective, for example, multiple blockchain implementations can exist and drive value even if not connected to other blockchains.

2) Network ownership may be more important than technical interoperability. For networks that are essentially owned and controlled by one party (the credit card example above), where other parties simply access those networks but don’t need to integrate per se, Piscini’s view makes total sense. It also works in situations like Ariba’s, which we’ve written about before, where participants don’t need blockchain implementations themselves to use Ariba’s blockchain. (Ariba also notes that clients can choose to do just recording on the blockchain, further supporting Piscini’s point of separating blockchain into layers.) However, in networks where the peer-to-peer aspect is more important, and no one participant has strong power, we believe interoperability will continue to be a barrier to widespread adoption.

Bottom Line: Clarity around when, if, and how interoperability is really needed is essential for the blockchain market to mature.

We expect that, by the end of this year, as companies continue to tackle implementation challenges like interoperability and the development of common industry standards continues[1], the market will begin to pick winning platforms and technologies.

 

[1] Many consortia are dealing with this issue as we speak, and government agencies are beginning to weigh in. Expect a lot of activity in standards development this year.

Posted in : Blockchain


Once Upon A Time…To Hold Management Attention, Security Execs Became Storytellers


Security is a complex space – changing and emerging threats, multiple interconnected technologies that each do one small piece of the security landscape, and an ever-changing regulatory and legal environment. And frankly, most senior executives don’t have the patience to really understand the threats to their business in great depth.

So what can a smart security executive do to capture and hold management attention on security issues? Become a great storyteller. There are lots of reasons storytelling helps in the security space:

  • People remember stories much more than they remember a bunch of data points or random facts
  • Stories connect emotionally as well as intellectually, making them more impactful, and increasing stakeholders’ investment in the topic
  • Having people re-tell stories is both a great validation of your original point but also a powerful way to make sure that your point is shared throughout the organization so that everyone understands security better

Start by studying storytelling. There are some basic plots for stories, such as boy meets girl, hero vanquishes evil, etc. There’s also a basic narrative structure you can use (see Exhibit 1):

 

So with this structure, you can explain security threats to your executives.

  • Exposition – the threat the business faces, including what part(s) of the business are affected (sales, brand reputation, data, etc.)
  • Rising action – how that threat is evolving
  • Climax – impact on the business if that threat occurs
  • Falling action – steps being taken to address the risk and protect the business
  • Denouement – any residual implications, requests for support or budget, etc.

You leave out the details that would take the focus off the overall story but keep the ones that add color and help people connect with it. So, examples of how other companies are handling the threats can stay, but the reporting spreadsheets of quarantined threats should probably go. This balance of detail is key to effective storytelling. Your team may find deep data invaluable, but it may cause your audience to give up trying to follow your story.

You’ll also save a lot of time. How? Typically, when something happens, you give the details and then try to explain those details in context. If you’ve told a story people understood, then when you have a conversation about details, you can refer back to the story and have the person “get it” faster. You can tell this works when stakeholders start asking more, and more relevant, questions. People who don’t understand a topic don’t ask as many questions.

How will you know the storytelling approach is working? When more people in your organization start to change their behaviors to support your security goals. And when senior executives begin to get more invested in your work.

Bottom line: To really improve security, get outside of security data and details and become a great storyteller.

Posted in : Security and Risk


How Infosys Seizes the Momentum for Change in Oil & Gas


Oil & Gas has gone through a crippling crisis in the last three years. What are service providers doing to help Oil & Gas recover? This question plays a central role in our 2016 Energy Operations Blueprint, HfS’ inaugural report on the services provided to the oil and gas industry. Infosys is an As-a-Service Winner’s Circle provider with strong roots in the oil and gas industry and a clear vision for the services needed to pull the industry out of the slump it has been in since the oil price collapsed in 2014. Since the 2016 Blueprint, the oil price has rebounded and is stabilizing between $50 and $55 per barrel, giving the industry some more breathing room and momentum for change.
Time to have a conversation with Robin Goswami and John Ruddy. Robin heads Infosys’ Energy practice in the Americas. John is the president of Noah Consulting, Infosys’ 2015 acquisition that bolsters its capabilities in Oil & Gas.

Robin Goswami

John Ruddy

Derk Erbé, Research Vice President, Supply Chain, Procurement, and Energy: Robin and John, thanks for sharing your vision for the Oil & Gas industry with our audience at Horses for Sources and candidly discussing the challenges of operating in a struggling industry. Infosys impressed us in the Energy Operations Blueprint, with its vision for the evolution of services that this industry needs to pull itself out of the downward spiral since 2014. Even though this is a very volatile environment with challenging economic circumstances, Infosys has continued to invest. How do you see the current situation in Oil & Gas?

Robin Goswami, Vice President and Head of Energy Practice Americas, Infosys: We are starting to see some recovery but there is a growing realization that this is not going to be a quick recovery, but more of a gradual one, like what we saw in the ’80s downturn.

In 2014, everybody thought this would be a six-month downturn, by early 2015 it looked like a one-year downturn. Only in late 2015, was there a realization that this could last a lot longer and would be much harder to predict. The last ten years now seem more like a spike, with the market having settled to a new normal of $50 a barrel of oil.

Due to the downturn of the last couple of years, companies have stopped most capital projects, whether in IT or the field. They are trying to optimize what they have. At some point, they must start looking at different ways of doing things, including radically different ways of leveraging technology. This is where offerings like Infosys Mana, the knowledge-based artificial intelligence platform, will play a significant role in driving automation and innovation. So far, this has happened in spurts and pockets. We’ve seen a couple of companies try to do it, but most of these have been the smaller to medium-sized ones in a desperate situation. We have not seen the bigger companies do this just yet, but lately we are starting to see a changed mindset and some positive signs as a result.

We’ll see a lot more interest in the things that we’ve been trying to talk about for the last year, year and a half or so. Automation: how can that help to significantly reduce operational expenditures? Analytics: how can you leverage analytics to get a lot more efficient, and do predictive analytics around equipment failure?

John Ruddy, President, Noah Consulting: The industry is learning how to be profitable in a $40 per barrel world, and that the days of anything more than $60 are over. They are learning that they need to be profitable at this price point, and that’s driving much leaner, much more efficient operations and much more reliance on automation, machine learning, and analytics. We’re starting to see modest growth because our clients recognize that this is their direction. They’ve stabilized following the workforce reduction, which was very significant. I believe they’re now starting to become the leaner, more agile organizations that they need to be to survive in today’s market.

Derk: Do you feel they are making the shift in mindset from cutting costs in existing processes and reducing the workforce towards creating value in a different way, creating new value?

John: Now that the price is somewhat stabilized, we see the emergence of a focus on how they become a lean, responsive organization. We see a big focus on operational technologies around real-time data, and the digital oil field wave that happened 10-15 years ago is now re-emerging with IoT being the main catalyst, with even more sensors, even more data and even more automation.

We see a big push into more cloud-based As-a-Service models out there. In fact, the operators are forcing the software providers to move there more quickly than the software providers had anticipated. There is a very strong desire on the demand side for an As-a-Service ecosystem; the operators were at one point reluctant and are now pushing very hard for that type of model to be offered by the vendors.

Derk: We’re on the verge of a very interesting period in oil and gas. Would innovation go slower or faster if the oil price were slightly higher?

Robin: Innovation would go faster if the oil price were slightly higher. Our clients have been focusing on primary goals, ensuring that they stay just cash flow positive. They don’t have the cushion to invest in innovation. If the oil price was slightly higher, I think that the money would be there. I also feel that if the price were to push 80 or 90 a barrel all of this would be forgotten, and we would be back to doing business the way we were before. But if the price consistently stays in the 50s, it will drive some efforts and activity to bring up the level of investment in innovation.

Derk: There is this fine line for innovation, investments and having the ability and willingness to innovate. Where do you think we’ll find the sweet spot?

John: I think $50 to $60 per barrel might be the sweet spot for a lot of innovation, a lot of demand to be addressed, especially with the smaller workforces that are out there and to a certain degree, a refreshment of the workforce regarding the average age coming down. I think you’re going to find more millennials driving automation as well. The voice for innovation will be a little bit louder perhaps than it was pre-downturn. I do think that the $50 to $60 sweet spot would have allowed more innovation to be applied to the clients. There are modest pockets of it happening with prices in the $40 range. $50 to $60 will open the floodgates for people to be innovative. Anything more than that ($60) and people don’t care about innovation anymore.

Derk: The industry has adapted to this ‘new normal’ of $60 per barrel as the peak price. What does that mean for the focus of your oil and gas practice and your competitors?

Robin: Oil and gas is a cyclical industry, and we’re in a down cycle, but we’ve got to continue to invest. I strongly believe that when it does come back, the folks who have invested will reap the rewards of their investments. We continue to focus on oil and gas and are seeing some positive movement. The tough part is the fact that our work is split between OpEx and CapEx, and the CapEx side of the work really came to a halt. We’ve got clients who are doing work on analytics, data lake projects, or initiatives to get more efficient, but it’s small compared to the amount of work pre-downturn. You can count the number of ERP implementations currently happening in the industry on one hand. That was completely different four years ago; at any given point in time, four or five projects were being kicked off. That has not happened recently at all. In terms of competition, we used to have eight or nine competitors bid for the same projects. That has drastically changed; a lot of the competition has re-focused or exited this space.

We have seen a lot of companies that in the past never looked at outsourcing who have now started to approach the market and say, “Let us explore working with outsourcing companies that can do IT a lot more efficiently than we can do it ourselves.” It has opened some opportunities. But the opportunities are still few and small. We are doing well from a perspective of winning them, but the squeeze in capital expenditure has hurt all the service providers.

We acquired Noah Consulting in late 2015. We continue to invest in oil and gas. We see modest growth in some CapEx projects, though not in a big way. The significant change that happened over the last year was people trying to find more efficient ways of doing their expansion. Whether it is the large or the small players, everybody is trying to use this down phase to optimize how their operating expenditure is leveraged.

We are focusing on delivering value by proposing automation (Infosys Mana) and leveraging digital to optimize the operational costs and are starting to see success.

Derk: What is the key to creating more of an innovation-minded culture and boosting As-a-Service adoption in this hundred-year-old industry that doesn’t like to change and, frankly, has lacked the incentive to change most of the time?

John: The key is education. A lot of it is repetition. A lot of it is helping to stimulate some of that demand and get our industry comfortable with new ideas. I’ll give an example. There is a lot of innovation happening in Infosys, for instance in our Palo Alto offices. We were out there a couple of months ago. The first thing you notice when you walk into the lobby is a science lab type experiment set up with plants. There are different basil plants. Each plant has sensors measuring the nutrients in the soil and the amount of light and the amount of water that they’re getting. Each plant is generating a growth curve, and they’re learning from each other. They’re looking at each other’s growth curves and adopting best practices and dropping bad practices, and are using our AI platform Mana.

That’s being used in other industries, but not yet adopted by oil and gas in a large manner; here, though, it’s applied to this little science experiment. That was an inspiration for us. We looked at that, took that same exact concept out to an oilfield, and had pumpers learn from each other. Have the pumpers look at the geoscience strata for that field. They’re from the same field, comparing that pad to the one a quarter mile away, and the pumpers look at the data, learn from each other and look at economic conditions.

We’re test driving those innovative concepts. It’s an industry that avoids disruptions. We’re working towards it, but it’s an educational process. We show them the possibilities and help them get more and more comfortable with innovation. Some clients are first movers, prove the benefits and the rest of the industry will follow. The onus is on us to find that first mover, and that’s what we’re out there doing.

Derk: If you were given the keys to the oil and gas services kingdom and you can rule the services world for a week, what’s the one thing that you would do to change the industry for the better?

Robin: That’s a tough one. John, I’ll let you go first.

John: Having the keys to the kingdom, one very broad-based public relations thing I would do is promote natural gas as a clean fuel alternative. Why aren’t there more compressed natural gas vehicles? Why aren’t there more natural gas power plants? I know there is an uptick in those, but not to the degree there could be. There is a huge environmental and climate change concern, and our industry has the answer to that, and that’s natural gas. As king, I’d be out there getting the public comfortable with natural gas as a clean fuel alternative that should be embraced and not pushed away.

Robin: I would like the companies to look at the industry and say, “Look, we all know oil and gas will be there in our lifetimes. We will come out of the downturn at some point. Let’s leverage this downturn and look to use technology to change our model. This is an opportunity for us to reset our entire cost model, our entire way of operating with technology for the next decade.”

Right from the beginning of the downturn, we’ve had a view from the outside. We are very much part of the oil and gas industry, but we have the luxury of having the bulk of our business focus on IT services, so any impact from oil and gas is well cushioned by the rest of the Infosys business. That gives us the cushion to make acquisitions, invest and enables us to continue what we are doing. That is not an advantage, unfortunately, that most oil and gas companies have. They are unfortunately dealing with the day to day of trying to keep positive cash flows and are forced to react to weekly, monthly, quarterly pressures and none of them have been able to step back and say, “Let’s assume this is a three year or a five-year downturn and let’s try to do things differently”. Definitely not when it hit in 2014.

I saw this as an opportunity in late 2014, for companies to completely change their model, move to other service models, look at analytics, automation, Internet of Things, to radically work differently with technology, not only IT, technology overall. Most of them were unable to take that opportunity. The longer the downturn continues, the more you are, in a sense, in a hole where you are trying to just survive. The amount of cash that companies had in 2014 and 2015 was just not there in 2016. Those initiatives could have been taken on in 2014 and 2015. It has become way more difficult, a lot more challenging, now. And this is the one thing I would like us to do differently.

If I had the keys to the kingdom, that is what I would do. Try to move away from the quarterly, the monthly survival and look at leveraging technology to change the model.

Derk: The saying is “never waste a good crisis,” and they’re wasting a good crisis to change. Would you recommend having a different dialogue with the financial markets, because that’s part of the issue for oil and gas companies? They want to do the same for shareholders as they did when the oil price was $100. Does that need to change, or are they addicted to doing the same as ever before?

Robin: I don’t think we have a choice, Derk. I think we have reached a stage where $80-a-barrel oil is not coming back anytime soon. The boom won’t be back for a while, and we must reset expectations and look to do things differently and leverage technology a lot more.

 

Posted in : Energy


Don’t Look Now but Payroll Services Providers are Embracing Digital — and their Sexiness


 

A product strategy executive at ADP recently told me, “employees generally spend more time picking out a color TV than they do selecting their Benefit plans.” At that moment I knew the conversation would be different from any of my previous chats about payroll services. After all, I’ve never known an organization that cited a well-run Payroll operation as a major source of competitive advantage. In contrast, achieving only modest (e.g. 4%) upticks in employee productivity, perhaps from focusing more on improving employee engagement, can be a pretty big deal. The math: a 2,000-employee company that goes from $150,000 revenue per employee to $156,000 (a 4% improvement) generates $12 million in new value ($6,000 x 2,000 employees).
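For readers who like to see the arithmetic spelled out, here is a tiny, illustrative Python calculation of the example above; the inputs are simply the figures quoted in the example, not survey data.

```python
# Back-of-the-envelope check of the productivity math above.
employees = 2_000
revenue_per_employee = 150_000          # baseline, in dollars
uplift = 0.04                           # 4% productivity improvement

new_revenue_per_employee = revenue_per_employee * (1 + uplift)   # 156,000
new_value = (new_revenue_per_employee - revenue_per_employee) * employees

print(f"Revenue per employee rises to ${new_revenue_per_employee:,.0f}")
print(f"New value created: ${new_value:,.0f}")                   # $12,000,000
```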

Payroll Services in the digital age is where “intelligent automation” and the constancy of innovation empower and enable all participants and customers – ergo, make them more engaged and productive. And for newer readers of HfS fare, we basically define intelligent automation as moving from legacy technologies to a more on-demand environment that enables (as warranted) plug-and-play solutions/services, cognitive/AI elements, impactful analytics, highly engaging user experiences, mobile taps over mouse clicks, etc., often based on design thinking and always involving genuine customer advocacy.

To cut to the chase, my new Blueprint Report “Payroll-as-a-Service: 2017”, being published this July, will delve deeply into how Payroll Services are being transformed based on the best that intelligent automation has to offer. The Report will examine where this market is today and where it’s going, include actionable guidance for services buyers and providers, showcase innovations that are driving real business value, incorporate customer perspectives from around the globe, and of course, utilize our “As-a-Service Winners’ Circle” evaluation framework to separate market leaders from high performers and high potentials – replete with service provider analyses.

Readers wanting to see who offers great pricing for tax filing and reporting services should look elsewhere, but for those interested in digital capabilities in the form of impressive chatbots (for maximum responsiveness), benefit plan cost/benefit optimization and financial wellness support at the individual level, other types of “personalized value adds,” predictive and prescriptive guidance for managers, and other capabilities which take routine tasks out of the daily life of Payroll staff and its internal customers, we will have you covered.

Bottom Line: Payroll Services providers are jumping on the digital bandwagon with gusto, and the results belie that long-time characterization of Payroll not being sexy.

Posted in : HR Outsourcing, HR Strategy


Placing HCM Stewardship Where it Belongs: Outside of HR


In retail, capturing data in real-time at the Point of Sale (POS) leads to better stock replenishment and more informed customer interactions and experiences. Now take that same concept into business operations with HR and employees, where transaction or event participants similarly have the biggest vested interest in achieving maximum data accuracy and transaction processing speed.

The principles of real-time data updates and logical transaction ownership led to a lot of new Employee and Manager Self Service functionality in the early days of HCM systems. Let’s also remember, though, that self-sufficiency — as in not having to deal with the occasional black hole that some HR Departments are identified with — is also directly correlated with stakeholder or customer satisfaction.

All of this “transactional mumbo jumbo” can be boiled down to one phrase: Human Capital Management stewardship … and also perhaps one question: Where should primary HCM ownership lie? The “HR as necessary interloper to keep the company out of trouble” model hasn’t really endeared itself to many outside of those running professional HR organizations. So why keep “workforce management activities to drive enterprise value,” aka HCM, strictly in the hands of the HR Department?  No reason. It’s a stupid waste of resources – both financial and human.

 

HR adds the most value, by far, when it enables line managers to be effective stewards of HCM  

How do you as an HR professional accomplish this?

(1) by truly understanding the business of your internal line manager customers
(2) by being a trusted advisor when it comes to HCM-related opportunities and risks (both — not just risks!)
(3) by syndicating best practices, tools, standards and innovations related to HCM across the organization … whether an HR-borne idea, an internal customer’s idea or something learned at a professional HR organization’s conference.

Business leaders don’t just have P&L responsibility. They interact with their teams every day, in all situations, and they ideally have the “HCM acumen” to know what will drive employee engagement, retention and productivity … or conversely, what will impede these outcomes and how to mitigate those impediments.

Bottom Line: HR Departments must place a huge emphasis on line manager enablement, thereby shifting HCM stewardship to where it belongs – to team leaders, department managers, and senior executives. HR Departments should enable, or get out of the way.

Posted in : Digital Transformation, HR Strategy


WNS and Its HealthHelp Acquisition “Will Not Deny” Health Care


“Denial is not an option.” Contrary to the typical (and here, oversimplified) pre-certification “approve” or “deny” approach to utilization of services in health care, HealthHelp launched a new model of utilization review based on the premise that procedures need not simply be denied, and that utilization management is about collaboration and education. HealthHelp taps into its evidence-based database and network of physicians and academics to review and approve or to recommend alternatives to procedure requests. In tandem, HealthHelp drives studies and education opportunities that lead to better medical and financial outcomes when providing or using health care services. In short, the company that WNS just acquired is building out a patient- and healthcare-provider-centric approach to utilization management designed to match procedure and treatment to the patient’s needs and network.

HealthHelp took root in the founder’s own pain

The HealthHelp approach is tied to the experience of its founder, Cherrill Farnsworth, who found the number of denials and appeals she managed for radiology procedures discouraging and painful. Thinking about “how to do this differently… why do we have to deny?”, Cherrill tapped into her network of people at medical centers and universities, creating a collaborative model on the premise of using data, insights, and education. Instead of review, then approve/deny, the approach is review, then approve and/or educate and/or recommend. The approach uses an increasingly sophisticated system of data, digital technology, and relationships. HealthHelp is taking product development further into the realm of machine learning and artificial intelligence as well.

What gives HealthHelp the “right” to make recommendations to healthcare providers and patients?

An approach like this one—essentially, a break from the norm—depends on the credibility of the data, technology, and people involved. In the early days, HealthHelp faced the challenge of people not being sure that a “non-denial” approach would be effective for containing costs. With 15 years of data, though, the company has been able to ingrain a lot of experience and knowledge into the approach and platform, to the extent that now 75% of prior authorization requests are approved for providers, or are responded to with recommended changes that providers approve, without any human intervention. In about 25% of cases, the request goes to nurses for review, and 6-7% of those are forwarded to doctors; even after that, the provider can still disagree and go ahead with treatment, which happens in under 0.5% of cases.

Results to date show improvement in the quality of care, which impacts Star and HEDIS ratings and reduces the cost of care by making sure the right kind of care is provided, rather than the lowest transaction cost at a point in time. Also, in a fee-for-service model, a healthcare provider gets paid for the procedure regardless of the result. As the industry shifts to value-based care with payments tied to outcomes, approval based on evidence or alternatives becomes more strategic to positively impacting outcomes (and payments). This approach, therefore, seems to have further credibility in the value-based care model and can help healthcare providers move into the new world of healthcare. HealthHelp worked with CMS to get approval to qualify this program under a provider education/quality improvement initiative and thus be included in the 85% Medical Loss Ratio for health plans.

The acquisition by WNS brings a complement of resources to both organizations and its client base

The healthcare industry is so ingrained in a yes/no approach that it took a few years before the model got adoption, primarily with mid-tier healthcare organizations. Joining with WNS gives HealthHelp the opportunity to scale and support a broader range of payers and providers. WNS also has a wealth of analytics capability, talent development and industrialization expertise that is complementary to HealthHelp, with resources that can help expand and develop the services and technology platforms to impact healthcare outcomes more broadly.

The acquisition of HealthHelp is part of the WNS strategy to shift attention from the cost of transactions to the cost of quality care and support—towards patient centricity. To date, WNS’ work in healthcare has been mostly analytical and transactional services: billing, collections, provider network services, and claims processing. HealthHelp brings in clinical and operational expertise to impact medical, as well as administrative outcomes, thus closing the loop. It also brings a human-centered (aka design thinking) approach to solving problems and developing a new business capability that the healthcare industry needs.

Posted in : Healthcare


Ariba And Everledger Want Blockchain To Help Supply Chains Become More Ethical And Make The World Better


Last summer I wrote about my desire to be a superhero – to help companies buy IT products and services ethically and help suppliers create new opportunities for themselves and their people. When people source ethically they can reduce a lot of bad in the world – child labor, human trafficking, working conditions that harm and kill people, and a host of other problems.

Yesterday at SAP Ariba Live, the software company announced that it was partnering with blockchain provenance firm Everledger to explore the use of blockchain across Ariba’s suite of applications. As a first step, the two companies are working on a track and trace (provenance) application.

 

Everledger CEO Leanne Kemp and SAP Ariba Senior Vice President Joe Fox discussed the application and broader blockchain implications at the event, talking about empowering an ethical supply chain. They see a future where using blockchain to track goods from their raw materials through their final delivery would help companies have visibility into the entire supply chain. This would then allow companies to avoid problems such as:

  • Counterfeit goods being swapped in for the original goods at some point in the journey
  • Unintentionally supporting illegal and unethical conduct by suppliers and other third parties involved in conflict minerals like blood diamonds because you couldn’t tell where the diamond originated
  • Being out of compliance with government or industry regulations because, related to the point above, you couldn’t prove that the product was made without conflict minerals or other illegal inputs

Undoubtedly, this announcement is a huge win for blockchain technology. It’s a major software company investing in a specific commercial application. It also reinforces the importance of provenance as a key blockchain “killer app,” coming soon after IBM’s announcement with Maersk that the two firms would work together to trace shipping containers. We’ve written before that provenance will get adopted faster than many fintech blockchain applications. These two deals show movement in that direction.

Even more powerful is the business and human story about making the world a better place. SAP Ariba’s and Everledger’s message of using blockchain to help business work more effectively AND to improve the lives of people is inspiring. It’s what technology is supposed to do, and we’re hoping to see more companies explicitly make corporate social responsibility a key factor in their technology decisions.

Posted in : Blockchain, supply-chain-management


What will The Procurement As-a-Service Provider Landscape look like in 2020?


We have ranked the major service providers in the Procurement As-a-Service market in our 2016 Blueprint grid; see Exhibit 1.

Exhibit 1: HfS Procurement As-a-Service 2016 Blueprint Grid


Looking further into the future, who will dominate the space in 2020? Three providers are set to remain at the helm for the foreseeable future: Accenture, IBM, and GEP.

IBM has a massive supply chain, which it smartly leverages in its procurement offerings. IBM is bullish on cognitive procurement. IBM BPS is morphing into Cognitive Business Solutions. Its own procurement provides a great playground for applying and road testing all the new cognitive procurement solutions, giving it an advantage over providers who don’t manage procurement for their own organization or have less ‘cognitive savvy’ clients.

Towards 2020, IBM will be leading in the cognitive procurement services space. Underpinned by a strong BPaaS platform, it will be the first provider most clients look at when it comes to new cognitive technology-driven services with vastly improved data analytics capabilities. The biggest challenge for IBM to succeed with cognitive procurement is to bring clients along on this journey. The vast majority of procurement organizations perceive themselves as far removed from advanced innovative procurement capabilities – they are fixing the basics, getting procurement technology to work and pondering the opportunities RPA could bring the procurement function. The gap between cognitive procurement and the (perceived) level of maturity and change readiness of procurement is the hurdle IBM needs to clear to make its cognitive ambitions reality, or it risks running too far ahead of the game.

Accenture has a significant head start on all other providers, having invested in and developed capabilities through acquisitions like Procurian and having put technology into every procurement engagement, leveraging one-to-many advantages for years. Now Accenture is betting on modularity to give it sustained advantage with current clients, and on opening markets with medium-sized enterprises, for whom the business case for outsourcing procurement never added up.

Accenture seems to have a more ‘wait and see’ stance when it comes to cognitive procurement, investing in capabilities and use cases, but not willing to bet the farm just yet. Be confident they’ll pounce when the time is right and gobble up any procurement-related cognitive and artificial intelligence capabilities they might lack. We expect this to happen via acquisition, maybe not of the magnitude of Procurian, but an inorganic technology growth strategy makes sense.

GEP plays in an increasingly contentious market, with its procurement BPO brethren gobbling up smaller niche firms and investing heavily in technology and partnerships. As the largest pure-play procurement service provider and a pioneer in procurement technology, the onus is on GEP to maintain its leading position and ‘best in class’ technology. We expect technology and services to converge further, and GEP may emerge as an acquirer of cognitive capabilities as cognitive and AI in procurement are on the rise.

Which are the providers emerging to challenge the leaders?

The early 2017 activity is driven by Indian-heritage providers WNS and Wipro. They are showing their ambitions and taking steps to move up the strategic value chain and incorporate more procurement technology into their service delivery and offerings.

WNS, with the Denali brand as its strategic procurement services arm, will have moved into the As-a-Service Winner’s Circle by 2020. The strong vision and upstream procurement capabilities from Denali, put together with the execution prowess of WNS, lead to cross-selling opportunities and investments in tech-enabled new services. The downstream procurement side of the business will have moved to the procurement platforms of WNS’ partners, with WNS managing the platforms.

Wipro announced an investment and strategic partnership with Tradeshift, which is emerging as a top 3 digital procurement platform and arguably the only real “platform” in the space. This will turn out to be a smart move for Wipro, addressing a technology gap in its procurement offerings and developing on top of the proven platform that is Tradeshift, leveraging an existing and expanding network and adding Wipro Holmes capabilities. On top of this, the partnership with Tradeshift has the potential to help Wipro move up the strategic value chain, with more upstream services and shifting technology-based services to new commercial models faster.

Genpact continues to move up the strategic value chain, and between now and 2020 will have sought to bolster the technology layer in its Procurement As-a-Service offerings, something it lacked compared to other As-a-Service Winner’s Circle providers in 2016. Genpact’s conundrum is choosing between organic growth to add technology prowess to its BPO capabilities and acquisitions to get there faster. With a poor track record with acquisitions – Headstrong comes to mind – we will follow its ability to make the RAGE Frameworks acquisition work and how the newly obtained Artificial Intelligence capabilities are transferred to procurement solutions.

Infosys is building out its procurement practice on AI, analytics and platform technology. ProcureEdge and Mana will continue to converge and bring innovation in downstream and upstream procurement. The lack of enthusiasm for BPO from Infosys’ top brass is a big concern. To make a concerted move in the Procurement As-a-Service space, Infosys needs a bigger commitment to BPO in general and more focus on bringing all the pieces (Mana, ProcureEdge, category management and strategic sourcing talent and capabilities) together.

Proxima leapfrogged incumbent legacy procurement BPO providers with high-value, on-demand As-a-Service offerings, leveraging technology and expertise. Building out technology-led point solutions (beyond the current offering in Commercial Management) and marketing pure subscription-based services, a fairly new area for Proxima, will be the major effort needed to cement a leadership position in Procurement As-a-Service.

All signs point to buyers looking for more modular services – fitting well with Proxima’s focused approach and offerings. However, if the market were to shift back to demanding end-to-end procurement services, Proxima would have to quickly acquire more end-to-end capabilities.

Fading into Obscurity?

In an earlier version, I wrote: “Capgemini will have lost most of its appetite for the BPO side of procurement, while IBX remains a technology asset in the increasing tech-focused procurement services market”. Reality caught up, and Capgemini sold IBX to Tradeshift last week – essentially selling its biggest asset in procurement. Combined with the seeming lack of focus for BPO in Capgemini’s C-suite post iGate acquisition, we can conclude it exited the Procurement As-a-Service market in early 2017, well before 2020. IBX needed significant attention and investment from its parent to compete in the procurement platform market, and Capgemini decided it wouldn’t stomach this.

HPE is now the home of Xchanging, once a force to be reckoned with in procurement outsourcing. Xchanging dropped from the As-a-Service Winner’s Circle in the 2016 Procurement As-a-Service Blueprint and is in danger of sliding down further in this space. Neither ‘interim owner’ CSC nor HPE is big on procurement outsourcing, a market HP neglected in the last decade even though it had the biggest supply chain in the world to service and leverage.

The Bottom Line – From Cost Obsession to Value Creation

Looking into the crystal ball, we expect the Procurement As-a-Service market to continue to build value for providers and service buyers as the value of digital solutions, analytics, procurement tech platforms and cognitive automation takes hold, albeit with a smaller number of providers that have a full-stack approach ranging from upstream strategic capabilities to platform-based execution of transactional procurement, delivering business outcomes on a subscription model.

In 2020 the market will be bifurcated into the ‘Haves’ and ‘Have-nots’, the ‘Haves’ being those providers with technology, platform based delivery and upstream procurement capabilities, offering flexibility, agility, modularity and superior digital customer experience. 

With affordable, modular services making procurement services accessible to midmarket enterprises, a new hunting ground for service providers is gradually emerging. Moreover, emerging digital clients, which may be under $50m in revenues but have high-volume transaction needs, will need to access procurement services. It’s not going to be all about size and scale, but also profitability and transaction volume.

Providers will be venturing, more and more, into direct spend delivery models, supporting clients to drive value and efficiency with cognitive, AI capabilities. As enterprises like to keep control of sourcing of direct materials and services, this will be a collaborative, partnership approach as opposed to full-blown outsourcing.

 

Posted in : Procurement and Supply Chain


Whatever the fate of the ACA, Consumerism in Healthcare is here to stay


While we wait for the new Obamacare “replacement” bill to sink or swim, we can’t help but ponder the implications of either outcome for the healthcare industry and the services ecosystem that supports it (especially since we get asked!). Amid all this uncertainty, one thing that is sure not to change is the consumerism that has taken strong hold within the healthcare industry, which would be the case with or without the ACA. As consumers, we wonder: if I can order merchandise from many different suppliers on Amazon and pay in one place, why can’t I see all my clinical data and lab images and send them from one doctor or clinic to another? If I can send the record of my dog’s shots to a boarding kennel electronically, why not send my children’s immunization record to schools and summer camps just as easily? Yes, we know about interoperability and security issues. However, we have come to expect the same access and convenience in our healthcare experiences as we do in all the other aspects of our lives.

Healthcare providers and payers are challenged to meet these increasing expectations—and are investing accordingly in digital enablement. HfS’ recent state of business operations survey indicated that 42% of healthcare companies are planning a significant investment in analytics to better understand what the issues are and for whom, and what the opportunities are to interact with and impact members, patients and administrative support; and 36% are investing in social/mobile/interactive enablement to redefine, “modernize,” or create the customer experience. Despite all this planning and rhetoric, dealing with the healthcare system often feels like the dark ages rather than a modern customer experience. Our recent research found several examples of service providers and buyers working together that give hope for the experiences to come:

  • Creating the digital customer experience by connecting front and back office: Due to ACA regulations, healthcare payers have needed to adjust to dealing with consumers (versus employers’ HR departments). Many have set up retail storefronts, including mobile centers, where people can come in for enrollment (the majority of visits), questions and bill payment. Teleperformance uses proprietary software, TLSContact, to manage the process and workflow of the customer retail journey. Representatives are able to access the initial application that the customer started online, and the workflow software helps identify the bottlenecks and how to better staff these centers. For example, they can look at and analyze the processes to find out why there are long wait times—enabling clients to improve the process and staff better to meet demand.
  • Developing customer journeys that look “outside the hospital walls” and building solutions that support the journey: Approaching healthcare in a consumer-centric economy drives healthcare organizations to look at how to initiate and keep the customer relationship over an extended period of time, not at a point in time. Emergency rooms are designed to address a “point in time,” but we know that a health incident starts before a person arrives at the ER. VCU Health neurologist Dr. Sherita Chapman Smith is championing an effort to use telemedicine to do assessments on stroke patients while they are in the ambulance, on their way to the hospital. In pilot simulations underway, the hospital is using trained actors to simulate stroke symptoms to test the platform during ambulance rides to the hospital. “Patients” are picked up in an ambulance and connected via teleconference to the neurologist in the hospital, who conducts a remote assessment; when they get to the hospital, they are quickly advanced to the next stage of treatment. The approach creates faster interactions between the points of care and speeds the time to treatment.
  • Using digital technology to make the user’s life easier and interactions with support systems more real-time: A healthcare organization that has partnered with NTT DATA Services described a consulting-led project aimed at the total redesign of the patient’s journey in various medical use cases (e.g., bariatric surgery, knee or hip replacement) in order to personalize that patient’s journey whenever he/she logs into the mobile app or accesses the website. This means drawing together an understanding of that patient’s journey from start to finish, knowing what stage they are in throughout their course of treatment, and what their needs might be. This hospital relied on the provider’s experience-mapping expertise.

It’s clear that healthcare isn’t getting less complicated any time soon. Whatever the fate of the ACA, the current political tone is foreshadowing more complexity and anxiety. Whether people are going to be uninsured or underinsured as critics of the current bill claim, or need to switch plans or providers, we can be sure that activity in the healthcare system will increase. We can also be sure that emotion will be at an all-time high, with the anxiety and fear that come with people being uncertain about what the changes mean for their lives and their loved ones: all the more reason that healthcare organizations need to be more nimble, intuitive and empathetic to the customer experience. Unfortunately, examples like the ones we highlighted above are the exception rather than the norm.

Bottom line: It’s time to think of and treat patients and members as customers you want to attract and retain, whether you are a health care provider or payer or a third party service provider partnering with a healthcare organization. Now we need to roll up our sleeves and partner in the effort to create a healthcare experience that puts the customer at its center.

 

 

Posted in : Healthcare


How Design Thinking plays an integral role in increasing the value of outsourcing, service design, and delivery


In business operations, global shared services, and outsourcing, the mantra has been: centralize, standardize, industrialize, globalize. Traditional shared services and outsourcing contracts have been developed to focus on “lift and shift” and how to make processes increasingly more efficient and effective, measured by service level agreements. But what happens when the SLAs are green but customer or stakeholder satisfaction is level, stale, or down? When you feel that “innovation” is lacking? That the world is shifting to become faster, more flexible, and in-touch—but your business delivery isn’t and you just don’t have the time to think about it?

The answer to those questions is, more and more often, to use Design Thinking as a catalyst for innovation and continuous change. And it is the reason we explored the integration of Design Thinking into business operations and outsourcing design and delivery. Insights on how Design Thinking plays a role in creating a different experience—a different way of working and new insights for operational excellence and expansion—along with profiles of 11 service providers, are in the recently published HfS Blueprint: Design Thinking in the As-a-Service Economy.

All of the service providers in the blueprint we’ve just published, Design Thinking in the As-a-Service Economy, are increasingly incorporating the principles of Design Thinking—human-centered, collaborative, action-oriented—into the way they work. Just like the increasing attention to robotic process automation and cognitive computing, experimentation has been going on for a while now… and so it is no longer “new” to many of them.

Design Thinking is a complement to, not a replacement for, operational excellence and solutioning for service design and delivery

You can use Design Thinking to understand what’s really causing problems or issues or expenses, by better understanding what people are actually doing (or not doing) and feeling. What is their experience? And then working through ideas that may revise, replace, or eliminate a process; that may change what people are doing and how; and that could use current technology better, or new technology. As one shared services executive told us, “we already know how to make something efficient [with Lean Six Sigma] and we required a new way of thinking in some specific areas.” Along these lines, we are not anticipating an end to or replacement of Lean Six Sigma or “operational excellence,” but rather adding a way of stepping outside the process to identify trouble spots and new solutions.

With Design Thinking you focus on understanding who is involved in whatever process or problem you are looking to address, and what are their expectations and needs (the “human” side)? And what is the industry and corporate context, the business outcomes to impact (the “business” side)? And what are the technology enablers? Then bringing it all together in a solution through a series of prototypes and tests. (See Exhibit 1: Incorporating Design Thinking Into Business Context for Shared Services and Outsourcing) Sometimes the solution is a quick fix, like changing the day of the week or where a request from a consumer is directed in a system; and sometimes it will help you identify a new way of working or a new service or solution.

Exhibit 1:

 

Bottom line: By using Design Thinking, we are moving toward a more human-centered, business-outcome-oriented, and questioning approach to defining and delivering services in consulting and outsourcing, just the way the world is doing in general.

Using Design Thinking “was helpful because we make assumptions about people,” said an insurance executive interviewed in our study.  Taking the time to empathize with the end user through interviews and observations “helped us to make sure we understand not just what we do for the consumer but how it makes them feel.”  In other words, it’s not just about what you’re doing but the relevance of it to your end user. If you want your customers and your stakeholders to work with you to reach your business outcomes, then look for ways to make it easier for them to do it – and that means understanding them better, and that’s a role that Design Thinking can play.

Where is the outsourcing services industry on the path to integrate Design Thinking?

We want to help you, through our blueprints, to make the right match for a services engagement – short or long term. While that effort used to be about “we as a buyer post a list of requirements and look for cost reduction” and “we as a service provider will tap into our best practices, tell you about our features and functions, and hire or assign people who can process transactions” … times are changing.

Now, engaging a service provider or providing services to a business means understanding the context, the challenges, the outcomes desired, and how to broker and define a solution that can be flexible and change as the market changes.  We’ve been calling this movement the “As-a-Service Economy.” Design Thinking can play a role in this movement towards a new way of working as partners. But because it is a new way of working, it does take time – trial and error and willingness to work in a new way. It impacts roles, governance, budgets, and contracts. And equally important, you need to have alignment in the expectations and cultures of the partners involved in order to feel as though this way of working can work – to deliver results. 

In the Design Thinking in the As-a-Service Economy Blueprint, we look at the relevance, use and impact of Design Thinking in services engagements as it takes shape as an integral part of business operations and outsourcing solution and service design and delivery.  It includes coverage of the following service providers in terms of Design Thinking integration into the way service buyers and service providers are working together: Accenture, Capgemini, Cognizant, Concentrix, EXL, Genpact, Infosys, Sutherland, Tech Mahindra/BIO Agency, Wipro

Posted in : Business Process Outsourcing (BPO), Design Thinking
