Since Saturday, when former Cambridge Analytica employee turned whistle-blower Christopher Wylie revealed details of his former workplace, Big Data has once again exploded into the news, and for less than savoury reasons. People are outraged that their personal data was mined through Facebook, used to profile them, and then leveraged to influence others in matters of great importance, all without their knowledge or consent.
It came to light that academic Aleksandr Kogan had created a personality quiz on Facebook, something most app developers do at some stage in their career. What most don't do, however, is harvest the data and sell it on to third parties, in this case Cambridge Analytica, a political consulting firm heavily involved in US and UK political campaigns. The 270,000 people who accepted the terms of this "personality quiz", granting Global Science Research (GSR) access to their personal data, are not the only ones who feel tricked. It is one thing that their data was sold on to Cambridge Analytica, but what about the further 50 million people connected to those 270,000, whose data was harvested and sold by the same method without their ever taking the quiz?
Now, whether the various terms and conditions between Facebook, its users, Cambridge Analytica, and any other third parties make all of this legal is a matter for the lawyers, with investigations pending in both the UK and the US. But in the world of data analytics, consumer privacy sits alongside every piece of work that we at Ecovis do as practitioners, and there is a line that can be crossed without the correct culture and ethical framework. Regardless of the legal terms, the public uproar alone is probably enough to say that the line has been crossed here.
The truth is that companies know more about us as customers than perhaps we realise, and certainly more than ever before. In general, we benefit from this: the communication we receive is more relevant, offers are more tailored to our needs, supermarkets stock more of the products we like, and retailers stop sending us catalogues when we are clearly not interested. Even customer feedback can now be "mined" en masse to uncover themes across the customer base. Problems in the customer experience can be identified and rectified, and companies can spot issues of growing importance to act on, such as a visible commitment to corporate and social responsibility.
While the conduct in the Facebook scandal certainly strikes us as immoral, soon it will also be illegal in the EU, and, following Brexit, is expected to remain illegal in the UK. With the advent of GDPR in May 2018, power is being handed back to the people. Companies must explicitly request and be granted consent to use or process our personal data for any purpose beyond what is necessary to carry out their business (a shipping address to deliver our goods, for example, is "necessary"). We can withdraw our consent at any time, in which case companies are required to permanently delete our data. We will have the "right to be forgotten".
As a firm of professional advisors, Ecovis have access to highly confidential client data and a great deal of personal information. Data protection, privacy, security, and confidentiality are in the Ecovis DNA. Through our data analytics service, we help our clients realise immense value from their company data and change the way their business is carried out. In doing so, Ecovis continue to make sure our moral and ethical obligations are fulfilled.
Data is powerful and financially valuable to companies, allowing businesses to optimise costs, recognise and serve the needs of their customers, and dramatically increase their return on marketing investment. But as Peter Parker was told by his dear uncle, with great power comes great responsibility. It is our choice how we obtain, interrogate, and act on the insights we derive from data, and how we take care not to cross the line.