Could the European Data Protection Regulation wreak havoc with Data, Automation and Cognitive solutions?


Whenever the European Commission proposes legislation that relates to Internet companies or data protection, tensions flare and lobbyists have a field day. Suggestions of protectionism and stifling innovation quickly enter the public sphere.

Amid the customary saber-rattling, the General Data Protection Regulation (GDPR) became law on 24 May 2016, but the broader IT industry didn’t take much notice. Yet the implications of the regulation are profound and could dramatically change the way companies deal with cloud services and Artificial Intelligence. As the adoption of Intelligent Automation accelerates, with Cognitive Computing and Artificial Intelligence as critical building blocks, we sat down with lawyers at Squire Patton Boggs to discuss the repercussions for the broader IT industry.

What is the legislation all about?

The key elements as well as implications of the legislation include:

  • The GDPR is the European Union’s (EU) new data protection law; it replaces the Data Protection Directive 95/46/EC.
  • It took effect on 24 May 2016 and becomes enforceable on 25 May 2018.
  • The legislation imposes a uniform data protection law on all EU members, though national governments and Supervisory Authorities (SAs) retain substantial powers.
  • Sanctions and penalties of up to €20 million, or 4% of global turnover, whichever is higher, for a variety of infringements, including:
    • Breaches of core data protection obligations (e.g., transparency, valid justification, accuracy, security)
    • Failure to comply with data subjects’ rights (e.g., to access, object to processing, be forgotten)
    • Failure to comply with rules on transfer of data outside the European Economic Area (EEA)
  • The GDPR regulates not only businesses with operations in the EU but also companies (whether controllers or processors) with no EU presence if they monitor the behavior of individuals in the EU or sell products or services to them.
  • The GDPR regulates data processors directly for the first time. Processors must now maintain adequate documentation regarding all categories of personal data processing activities carried out for a controller. They must also implement appropriate security standards.

Major data breaches, such as the ones we’ve seen with corporate giants like Anthem and eBay or the infidelity website Ashley Madison, are well documented. We will not attempt to dissect the broader implications of this regulation in this blog post. However, two implications jump out.

First, the severity of the penalties, up to 4% of global turnover, means that regulators can come down hard on organizations that breach the legislation.
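The "whichever is higher" rule in the penalty regime is worth spelling out, because for large companies the 4% figure quickly dwarfs the €20 million floor. A minimal sketch (the function name and figures used in the example are ours, not from the regulation's text):

```python
def max_gdpr_fine(global_turnover_eur: float) -> float:
    """Upper bound on a GDPR fine for the most serious infringements:
    the higher of EUR 20 million or 4% of global annual turnover."""
    return max(20_000_000.0, 0.04 * global_turnover_eur)

# A firm with EUR 500 million in turnover is still capped by the EUR 20M floor,
# while a firm with EUR 2 billion in turnover faces up to EUR 80 million.
print(max_gdpr_fine(500_000_000))    # 20000000.0
print(max_gdpr_fine(2_000_000_000))  # 80000000.0
```

Note that these are maximums; the actual fine in any given case is at the discretion of the supervisory authority.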

Second, and much closer to our research agenda, the last bullet point about regulating data processors goes to the heart of the As-a-Service Economy and applies to cloud providers, BPOs and Intelligent Automation providers in equal measure.

Service providers face direct responsibility for the data they are processing

So why is the regulation of data processors important? Because, for the first time, processors are directly liable for the data they are processing. Without wanting to drift into legalese too much, a data processor is an organization that may be engaged by a client to process personal data on their behalf (e.g., as an agent or service provider). In our industry, that could include cloud storage providers but also the burgeoning Artificial Intelligence segment. There are broad legal implications for processors, many of which are difficult to translate into simple English. But here’s a crucial one: processors must implement appropriate technical and organizational measures to ensure a level of security appropriate to the risk involved, which means that the processor must make itself aware of the types of data involved and the associated risk levels.

In addition, processors will have to maintain records for the processing activities under their responsibility. Critically, these records must be made available to the supervisory authority on request. At the same time, processors have to guarantee confidentiality and security. For any breach the processor may be directly liable if it has not complied with the regulation or has acted outside the instructions of its client. The focus of the regulation is all about ensuring that processors assist their clients in protecting the freedoms and the rights of the individual by requiring processors to take responsibility for securing the data that they handle. They must also contractually obligate any sub-processors to do likewise. And they must assist their clients in meeting the requirements of the regulation, including in responding to data breach incidents.
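To make the record-keeping duty concrete, here is a minimal sketch of what a processor's record of processing activities might capture. The class and field names are our own illustration of the categories described above (controller, categories of processing, transfers outside the EEA, security measures), not a schema prescribed by the regulation:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessingRecord:
    """Hypothetical record of one processing activity carried out for a
    controller, to be produced to a supervisory authority on request."""
    controller: str                                  # client on whose behalf data is processed
    categories_of_processing: List[str]              # e.g., payroll, claims handling
    transfers_outside_eea: List[str] = field(default_factory=list)  # destination countries, safeguards
    security_measures: List[str] = field(default_factory=list)      # e.g., encryption at rest

# Example: a BPO provider documenting a payroll engagement
record = ProcessingRecord(
    controller="ExampleCo Ltd",
    categories_of_processing=["payroll", "benefits administration"],
    security_measures=["encryption at rest", "role-based access control"],
)
```

In practice such records would live in a compliance system rather than application code, but the point stands: processors must be able to enumerate, per client, what they process and how they secure it.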

So what does that mean in practical terms? Take the example of UK mobile operator TalkTalk, whose data breach is well documented. Under the new legislation, the fine could be up to 4% of its turnover, which is massive. Similarly, when Wipro employees working on the TalkTalk contract were accused of making scam calls from their call center, Wipro, as a processor, would likely have been directly liable under the new regime.

However, beyond the broader and more generic issues, the most challenging clause for the journey toward the As-a-Service Economy comes from a stipulation on automated processing. It is so important that we are quoting it in all its legal splendor: “The data subject should have the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as automatic refusal of an online credit application or e-recruiting practices without any human intervention.”

The legislation explicitly calls out that such profiling includes analysis of people’s performance at work, their economic situation, health, personal preferences and interests. This strikes at the heart of Artificial Intelligence and Intelligent Automation at large. While fraud and tax-evasion monitoring are excluded, the thrust of Machine Learning could be seriously thwarted. It will probably take the first court cases to determine where statistical analysis by neural networks and Machine Learning ends and where personal data that potentially needs to be presented in court begins. Yet the legal challenges for service providers are obvious: they have to be able to answer individual claims of breach of privacy.
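One practical consequence of the automated-decision clause is that providers may need an explicit human-in-the-loop safeguard before a purely automated decision with significant effects is issued. The sketch below is our own illustration of such a gate; the class and function names are hypothetical, not drawn from the regulation or any vendor's product:

```python
from dataclasses import dataclass

@dataclass
class Decision:
    """A decision about an individual, e.g., the outcome of a credit application."""
    subject_id: str
    approved: bool
    reviewed_by_human: bool = False  # set True once a person has reviewed it

def finalize(decision: Decision, legally_significant: bool) -> Decision:
    """Refuse to issue a decision with legal or similarly significant
    effects unless a human has intervened in the automated process."""
    if legally_significant and not decision.reviewed_by_human:
        raise ValueError("automated decision requires human review before issue")
    return decision

# An automated credit refusal must be routed to a reviewer first:
refusal = Decision(subject_id="A-123", approved=False)
refusal.reviewed_by_human = True  # reviewer confirms the outcome
finalize(refusal, legally_significant=True)
```

Whether such a gate satisfies the "human intervention" requirement in any given case is, of course, a question for the courts, not for the code.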

HR processes and intentions of workforce reduction will be challenged

Let’s apply these stipulations to a couple of scenarios:

HR-related processes will pose the biggest challenges. Many recruitment processes will come under scrutiny because the use of Machine Learning is widespread, and proving that these processes are not fully automated will be almost impossible. Similarly, using cognitive tools for performance management will come under the spotlight.

The use of cognitive and automation tools to assemble evidence for staff reductions can be challenged in court. Beyond HR processes, the impact will also be felt in broad customer onboarding processes, where we already see extensive use of RPA and Machine Learning. Profiling is used not only to enhance the user experience but also to automate a broad set of processes.

Organizations in the US, or in the UK after Brexit, should be warned not to give this legislation short shrift. Because, as noted above, it applies to any company that markets goods or services to EU residents, regardless of whether the company is located in or uses equipment in the EU. To quote a sadly departed UK politician: “We are all in this together!”

Bottom line: The legislation needs to be enforced by national data protection authorities and, ultimately, in court.

Service providers need to evaluate the impact of the GDPR on the way they deliver services. Suffice it to say, enforcement through the courts can be cumbersome and costly. As with any data protection legislation, this will be a fluid process, with continued lobbying and legal challenges. Thus, the legislation will not wreak havoc with the As-a-Service Economy, but it will slow the journey down and make it more complex. And should you have specific questions, we will come armed with our lawyers.

Posted in : Cognitive Computing, Robotic Process Automation, Security and Risk

