In 2018, a major data leak came to light: personal information had been harvested from almost 90 million Facebook users without their knowledge or consent. The key players in this notorious privacy event were Cambridge Analytica (a political consulting firm that worked on the 2016 Trump election campaign) and Facebook (the platform from which the data was harvested).
The Facebook-Cambridge Analytica data breach is one of the most notorious data privacy scandals in recent years — we’ve even referred to it a few times in previous posts. Interestingly, the scandal unfolded around the same time that the EU General Data Protection Regulation (GDPR) was about to come into effect.
Of course, it was too little too late for all those Facebook users, but we wanted to explore the event in light of how the world regards privacy today, and how the GDPR could have helped prevent such a large-scale breach of trust.
First, we need to understand how millions of people’s personal data made its way into the hands of Cambridge Analytica. That means going back to the early 2010s, when Facebook released its Open Graph API.
Focused on data portability, the API allowed third-party (external) developers to extract data from users and integrate their own apps with the increasingly popular social networking website. This is how a personality quiz called This Is Your Digital Life, launched by University of Cambridge data scientist Dr. Aleksandr Kogan, found a legitimate way into Facebook’s data ecosystem.
Unbeknownst to Facebook at the time, there was a loophole in the API: app developers like Dr. Kogan could access not only a person’s data but also the data of that person’s friends, all without the data owners’ knowledge or consent.
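To illustrate the mechanics, here is a minimal sketch of the data flow the old v1.0 Graph API permitted. The endpoint paths reflect the since-retired v1.0 API as commonly documented; the access token and friend ID are placeholders, and no real request is made — the sketch only builds the URLs an app would have called.

```python
# Sketch of the data flow enabled by Facebook's Graph API v1.0 (retired in 2015).
# Endpoint paths follow the deprecated v1.0 conventions; ACCESS_TOKEN is a
# placeholder for the token granted when one user authorised the quiz app.
from urllib.parse import urlencode

GRAPH_ROOT = "https://graph.facebook.com/v1.0"
ACCESS_TOKEN = "QUIZ_APP_USER_TOKEN"  # placeholder, not a real token

def friends_url(token: str) -> str:
    """URL that listed the consenting user's friends under v1.0."""
    return f"{GRAPH_ROOT}/me/friends?" + urlencode({"access_token": token})

def friend_likes_url(friend_id: str, token: str) -> str:
    """URL that exposed a *friend's* Likes -- data that friend never agreed
    to share with the app. Closing off this friend-data access was the key
    change in later API versions."""
    return f"{GRAPH_ROOT}/{friend_id}/likes?" + urlencode({"access_token": token})

print(friends_url(ACCESS_TOKEN))
print(friend_likes_url("1234567890", ACCESS_TOKEN))
```

The crucial point is in the second function: one user’s consent yielded a token that could be turned against that user’s entire friends list, which is how roughly 300,000 quiz-takers became a dataset of tens of millions of profiles.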
Kogan paid around 300,000 users to take the quiz, which collected reams of personal data that could be used to build psychographic profiles about each user and, consequently, millions of their friends.
From there, in direct violation of his agreement with Facebook, Kogan sold this data to Cambridge Analytica.
The loophole in this earlier version of the Open Graph API left the door wide open for third parties to harvest huge amounts of users’ personal data. When Facebook learned that Kogan had sold user data to Cambridge Analytica, they requested that the firm delete all the data it had received.
Which, evidently, Cambridge Analytica didn’t end up doing:
Wylie said he received a letter from Facebook in 2016 stating that they knew he still had data from Cambridge Analytica’s harvesting program, and asking him to delete it.
“It requested that if I still had the data to delete it and sign a certification that I no longer had the data,” Wylie said. “It did not require a notary or any sort of legal procedure. So I signed the certification and sent it back, and they accepted it.”
Had the GDPR been in effect when this happened, Facebook would have been considered a data controller, obligated under the EU privacy law to ensure that third parties like Kogan honour their data sharing contract, i.e. that they only use personal data for the purposes outlined in the contract.
In retrospect, Facebook’s “light standard of scrutiny” would have been considered insufficient, failing to enforce any real accountability from Dr. Kogan and Cambridge Analytica.
Equally concerning is how much sensitive personal data about Facebook users was captured and shared. Information like political opinions, religious beliefs, and sexual orientation could easily be gleaned from users’ Likes on Facebook and their personal ‘About’ sections.
Under the GDPR, sensitive personal data is subject to stricter regulations around collection and usage, given that such information could be used in ways that could potentially endanger, blackmail, or discriminate against the individual in question. Perhaps Facebook might have reconsidered how easily accessible they made this information to third parties if the GDPR’s harsh penalties for unlawful data processing were in place then.
Lacking tough privacy requirements such as those imposed by the GDPR, Facebook did not have a Data Protection Officer (DPO) to ensure user privacy was a company priority. They did not notify users that their personal information had been breached, instead choosing to keep users in the dark for years until a “whistleblower” finally brought the affair into the public eye.
The GDPR illuminated many privacy issues that had too often been swept under the rug: failing to inform users of their rights over their data, putting profits ahead of privacy, undermining the importance of privacy in data sharing, and cleverly worded privacy disclosures that neglect the interests of natural persons.
Laws like the GDPR could have held Facebook to account much earlier in the unfolding of these violations.
The social media giant has since fallen into line with GDPR compliance requirements, implementing significant changes to their platform design and communications around user privacy.
In an era where privacy is now at the forefront of consumers’ concerns online, companies must evolve their practices and processes, lest they risk the same fall from grace that Facebook is still working to recover from.