Sophie Chase-Borthwick, Director of Data Ethics & Privacy at Calligo, discusses how to apply Privacy by Design to digital transformation projects, without compromising either the business’ objectives or its adherence to privacy legislation.
There remains some hostility among IT professionals towards Privacy by Design becoming a legislative requirement under GDPR. If digital transformation aims to free up as much data, and as much accurate context around the business, as possible, and Privacy by Design looks to ensure confidentiality and pseudonymisation, how can the two co-exist?
When the language of GDPR was first announced and Privacy by Design was, for the first time, to be made a legal requirement for IT projects, it was feared that hundreds of digital transformation projects – many of which had been scoped and designed months, if not years, previously – would be derailed, or worse, reach an indefinite impasse. After all, retrospective Privacy by Design is often impossible, while stopping a project and restarting it from the ground up is deeply unpalatable.
Clearly, those fears were overstated: many companies have successfully completed profitable digital transformation projects that simultaneously respect the privacy of those whose data is being shared. Others, however, have not. Many have found the balance between openness and privacy difficult to strike and have compromised one by over-serving the other. Some have even neglected the need for privacy altogether.
In fairness, the compromise is not easy to find. Nor is it easy to describe. Instead, we have included below some stories from our own experience of companies that have struggled, and shown how they could have avoided trouble altogether. The errors these companies made might seem obvious in hindsight, but you may be surprised at how often these mistakes are made.
Over-ambition gone wrong
A huge digital transformation project for a major European retailer aimed to design a more productive working environment. Using AI, the retailer wanted to analyse employees’ use of physical and digital resources to identify what was being over- and under-used.
What was unusual about this project was that the retailer wanted to integrate this data on resource use with employee HR records. They looked at usage reports, access card data and hardware in the context of employee attendance, training and performance ratings. The idea was that this would provide an objective analysis of which individuals exhibited below-par productivity. The final step was to use this output to determine automatically who would be entitled to pay rises and promotions, removing the risk of the subconscious, or indeed conscious, bias that may have been present in performance reviews.
It goes without saying that this use of data was a very delicate operation. The technical infrastructure was secure and the workflows and machine learning in place were admirable, but the protection of individual privacy rights had not even been a consideration during the project’s design phase.
The most troubling aspect of the project was that employees were subjected to an automated decision-making process without any awareness of it, and therefore without consent, in direct contravention of GDPR. Because of this, the project had to be entirely rethought, incurring both financial and reputational costs.
The first stage of the project – the analysis of the individuals’ productivity in the context of their training and role descriptions – was legitimate and valuable. It was the final step, where the output triggered automated profiling, that led to the handbrake having to be pulled. But how did the project continue for so long before anyone highlighted these problems?
In this instance, the issue was over-ambition, accentuated by a lack of prior communication from the privacy team about the inherent responsibilities of any IT project. This created a dangerous naivety, ultimately leading to the error.
The unsupervised developer team
The second scenario concerns an international medical organisation that produces devices for the healthcare industry. The developer team used IoT technology to monitor the use of every device they created, with the aim of using the data for product development and maintenance.
Due to the medical nature of these devices and the breadth of data collected, this was enormously sensitive. Any healthcare data is classified as a special category under GDPR, which means additional restrictions on its use. Despite this, neither the patients themselves, nor the healthcare professionals, nor even the wider business beyond the developer team, were aware of the collection and use of this sensitive data.
What should have been in place here was a set of project oversight practices ensuring that new projects were run past a privacy or legal expert at an early stage to check for red flags. On top of this, there should have been documentation to record and govern the data’s collection, storage and use. However, neither existed. It was only once the legal team had begun their GDPR preparations and a company-wide audit of data use that they discovered this activity.
Understandably, the project was immediately halted. This left the developer team unable to identify issues or bugs in the devices that needed immediate rectification. The knock-on effects included product development delays, disenfranchised users, unhappy investors and the extra cost of relaunching a similar project without the data issues.
This is a classic example of a well-meaning project which had dramatic consequences simply because privacy had not been incorporated into its design from the beginning.
What could have been done better?
We can clearly see the costly consequences when digital transformation projects are launched without prior consideration of the potential data privacy implications. But a practice has recently emerged among businesses that aims specifically to avoid these ramifications: the use of Privacy Architects.
Privacy Architects are experts in both privacy and technology. They are able to assess a business’ objectives alongside a technical project, while also identifying the privacy legislation that it is subject to and the necessary steps to ensure compliance.
Such a combination of expertise is rare, yet essential. Without the wider context of the realities of technology, the application of the law can be needlessly obstructive. Without knowledge of privacy law, technology projects can create new risks for a business, and the wider effects of those risks go far beyond penalties and fines to the heart of whether customers can trust you.