Rebuilding consumer trust after Cambridge Analytica

Having invested heavily in online services, the last thing brands and public bodies want is to see customers and service users abandon them. J Cromack, chief commercial officer at Consentric, advises organisations on how to navigate the current crisis of confidence.

It’s concerning to see companies still treating people’s personal data as a tradeable commodity – something they can use however they like and sell on for a handsome profit. But all that is about to end.

Many consumers have until now turned a blind eye to routine data collection – largely accepting it as part of the deal for convenient and often free online services. But Cambridge Analytica’s activities – and Facebook’s role in them and its subsequent response[1] – have brought matters to a tipping point[2]. Online service providers are going to have to work hard to win back trust.

Data custodianship is a privilege

Meanwhile, there are just weeks to go until the EU General Data Protection Regulation (GDPR) becomes enforceable, bringing data-privacy controls and organisational obligations up to date with the digital era. It still comes as a surprise to many companies to learn that the data they have collected is not their property. Now they will have to learn fast that they are merely custodians of that information – with a duty to use it circumspectly[3].

Companies should use GDPR as the prompt to kick-start something new: being honest and transparent with people about what they intend to do with their data, and adopting a holistic, easily auditable approach to customer data that transcends individual systems.

Where organisations are seen to use people’s data respectfully, with permission, in ways that provide tangible benefits to an individual – such as more direct access to what interests them and personalised promotions – consumers are much more likely to opt in.

What is changing

Under GDPR, every organisation that captures customer data – from internet giants to banks and retailers, to local authorities, health services and charities – has an obligation to be transparent about the personal data it is collecting. It must seek permission for each specific use case, take appropriate steps to safeguard that data and ensure it is not shared beyond the parameters the customer has knowingly agreed to.

This is nothing new; these principles were enshrined in the existing Data Protection Act[4]. But until now there has been too much emphasis on consumers reading the small print and taking proactive steps to protect themselves. Under GDPR, and certainly in the aftermath of the Cambridge Analytica/Facebook breach of public trust, accountability will shift back to the organisations collecting, storing and using people’s data. They already face an uphill struggle: the latest figures from the Information Commissioner’s Office suggest that, across the UK as a whole, only a fifth of people have trust and confidence in companies and organisations storing their personal information[5].

Reframing GDPR compliance as a priority for consumer trust

All this means, once and for all, that the use and safeguarding of people’s personal data have ceased to be something directors can palm off to compliance/governance and IT departments. It is central to organisations’ relationships with their customers, to how they are perceived in the market, and to whether they will succeed or fail. At best, the way that companies and service providers manage and communicate about their handling of customer data will become a brand differentiator; as a minimum it will be a condition of engagement.

In this light, GDPR compliance takes on a different meaning. Suddenly it is less about how permissions are worded at each customer touchpoint, or the individual security requirements applied to each IT system and cloud application. Instead, it is about being open, honest and true to the principles of data protection.

In time, this should also be something that customers themselves can exert some control over – by being able to review and edit their data permissions more easily, for example. A new survey by Boston Consulting Group found that people were much more likely to share their data willingly if they felt confident that harmful use would be prevented[6].
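As an illustration only – the names and structure below are assumptions, not Consentric’s product nor anything prescribed by the regulation – a per-purpose consent record that a customer could review and edit might look something like this, with each use case carrying its own revocable permission and a timestamped history that keeps decisions auditable:

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# Hypothetical sketch of a per-purpose consent record.
# Field names and structure are illustrative assumptions, not a real API.

@dataclass
class ConsentEvent:
    purpose: str          # e.g. "email_marketing", "analytics", "third_party_sharing"
    granted: bool         # True = opt-in, False = withdrawn
    timestamp: datetime   # when the choice was recorded
    source: str           # where the choice was captured, e.g. "preference_centre"

@dataclass
class CustomerConsent:
    customer_id: str
    events: List[ConsentEvent] = field(default_factory=list)

    def set_permission(self, purpose: str, granted: bool, source: str) -> None:
        """Record a new choice; history is kept so decisions stay auditable."""
        self.events.append(
            ConsentEvent(purpose, granted, datetime.now(timezone.utc), source)
        )

    def is_permitted(self, purpose: str) -> bool:
        """The latest recorded choice for a purpose wins; the default is no consent."""
        for event in reversed(self.events):
            if event.purpose == purpose:
                return event.granted
        return False

# Example: a customer opts in to email marketing, then withdraws via a preference centre.
record = CustomerConsent("customer-123")
record.set_permission("email_marketing", True, "signup_form")
record.set_permission("email_marketing", False, "preference_centre")
print(record.is_permitted("email_marketing"))  # False
print(record.is_permitted("analytics"))        # False (never granted)

The point of the sketch is the shape of the data: consent is held per purpose rather than as a single blanket tick-box, every change is timestamped, and the default is always “no”.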

Avoiding ever-decreasing circles

Above all, what the Cambridge Analytica/Facebook story demonstrates is that it is not the regulatory authorities that companies need to fear; it is the customers, who will vote with their clicks and swipes on where they feel comfortable leaving their digital footprint.

If customers fear they are living out Dave Eggers’s novel The Circle – and are driven off the grid, where they can’t be tracked – no one wins. Organisations need to be seen to be using data with respect, not in ways that create suspicion and mistrust, causing customers to untick all the boxes, refuse cookies and shroud themselves in anonymity.

[1] Facebook’s floundering response to scandal is part of the problem, FT.com, March 19, 2018

[2] Tech world experiencing a major 'trust crisis,' futurist warns, CNBC, March 20, 2018

[3] Cambridge Analytica and Facebook – did they put the individual at the heart of their data strategy?, Consentric blog, March 19, 2018

[4] The rights of individuals, Data Protection Act, Principle 6

[5] ICO survey shows most UK citizens don’t trust organisations with their data, ICO/ComRes research, November 2017

[6] Generating Trust Increases Access to Data by at Least Five Times (exhibit 2), Bridging the Trust Gap in Personal Data, Boston Consulting Group, March 2018