IBM and customer loyalty company Aimia, which runs the Nectar scheme, are ethical exemplars of big data governance, the ICO said this morning.
At the launch of its annual report earlier this month, when it argued for stronger powers to police breaches of the Data Protection Act and other information laws, the ICO said it would publish guidelines for organisations on using big data in compliance with data protection law.
Today’s report re-emphasised the rules that companies must abide by when using personal information on a mass scale to improve customer services, target marketing and enhance rewards. It aired the best and worst examples of big data usage amongst organisations.
Aimia has developed an ethical big data model called TACT, which stands for Transparency, Added value, Control and Trust.
“Their research showed high levels of concern amongst consumers about privacy and a desire for control over their personal data, and, contrary to a commonly expressed view, this was shared by consumers aged 19-29,” the report stated.
IBM was equally lauded for its ethical big data analytics framework.
The report added that the risk of bad PR alone will motivate companies to change the way they collect and record customer-based business intelligence.
“It would harm a company’s reputation if it were the subject of a media story about the misuse of personal data, while consumers can also publicise their views to the world instantly,” the report stated.
“This is an important consideration in a competitive world. There may well be a competitive advantage in being seen as a responsible and trustworthy custodian of customer data.”
The ICO referred to information law academic Paul Ohm’s comment that Google's infamous Flu Trends project "breached a wall of trust" by mining search data for correlations between search terms and recorded cases of flu. It also highlighted retailer Target, which found a correlation between women's purchases and their due dates, enabling it to predict pregnancies and send relevant offers.
Customer access to data
Customers should be granted easy access to data held by companies so they can re-use it or sell it to other organisations, the ICO added.
“The proposed EU regulation includes a provision on data portability that would enable data subjects, under certain conditions, to obtain their personal data in ‘electronic and structured format which is commonly used’ and transfer it to other systems,” the ICO said.
“Our subject access code of practice supports this by encouraging data controllers, when responding to subject access requests, to provide personal data in open reusable formats where it is practicable to do so.”
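The ICO does not mandate a particular format. As a minimal sketch of what an "open reusable" export in response to a subject access request might look like, the hypothetical helper below (the records and the `export_subject_data` function are invented for illustration) serialises a customer's data as JSON or CSV, two commonly used structured formats:

```python
import csv
import io
import json

def export_subject_data(records, fmt="json"):
    """Serialise one data subject's records into an open, reusable format.

    `records` is a list of dicts with identical keys. Both JSON and CSV
    are machine-readable and easy to transfer to other systems.
    """
    if fmt == "json":
        return json.dumps(records, indent=2)
    if fmt == "csv":
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=sorted(records[0]))
        writer.writeheader()
        writer.writerows(records)
        return buf.getvalue()
    raise ValueError("unsupported format: " + fmt)

# Hypothetical loyalty-scheme records for one customer
records = [
    {"date": "2014-07-01", "merchant": "Example Store", "points": 120},
    {"date": "2014-07-15", "merchant": "Example Cafe", "points": 40},
]
print(export_subject_data(records, "json"))
```

The point of the ICO's encouragement is the format choice: either output here can be re-used by the customer or imported into another organisation's systems without proprietary software.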
An ex-Google employee recently spoke about disrupting big data operations like Google and social media.
"I think there is a huge opportunity for micropayment-based models to disrupt what we have now", said James Whittaker, who is now a technology lead at Microsoft.
"I'm really interested in monetisation, because information isn't really free. With Google and all technology you are all data - your data belongs to someone else."
Privacy impact assessments
The ICO stated that companies should carry out privacy impact assessments to set out what information is being used and for what purpose. The assessment should outline any plans to keep personal information for longer than usual and state the reasons why.
Companies cannot hold on to information just to see whether it will be useful in the future, if they are to abide by data protection laws, the ICO said.
Further, customers must be given a “privacy notice” which states when their personal information is being recorded and what it is being analysed for.
Companies that buy datasets are responsible for ensuring that customers were shown a privacy notice at the time their information was recorded.
“If an organisation is relying on people’s consent as the condition for processing their personal data, then that consent must be freely given, specific and informed. This means people must be able to understand what the organisation is going to do with their data and there must be a clear indication that they consent to it”, the ICO said.
“If an organisation has collected personal data for one purpose and then decides to start analysing it for completely different purposes (or to make it available for others to do so) then it needs to make its users aware of this.”
This could affect organisations like Facebook, which recently came under fire for allowing researchers to carry out a mood experiment on a sample of users.
“This is particularly important if the organisation is planning to use the data for a purpose that is not apparent to the individual because it is not obviously connected with their use of a service. For example, if a social media company were selling on the wealth of personal data of its users to another company for other purposes,” the ICO said.
Big data complexity is no excuse
The report is a reminder to companies that they cannot hide behind the complexity of big data if they want to stay on the right side of the law.
Customers are well within their rights to make a subject access request to find out what information a company holds on them, the ICO said.
It said: “It may be thought that the volume and variety of big data and the complexity of the analytics makes it more difficult for organisations to meet this obligation. However, these cannot be an excuse for not meeting legal obligations."
Steve Wood, the ICO’s Head of Policy Delivery, was keen to reassure organisations that the guidance was not a limitation on what can be achieved with big data.
He said: “What we’re saying in this report is that many of the challenges of compliance can be overcome by being open about what you’re doing. Organisations need to think of innovative ways to tell customers what they want to do and what they’re hoping to achieve.”