In a post on his Facebook page, Mark Zuckerberg acknowledges that the company made a major mistake, and he takes responsibility:
I started Facebook, and at the end of the day I’m responsible for what happens on our platform. I’m serious about doing what it takes to protect our community.
He does what any leader should do when there is a catastrophe of this size, but an apology will not be enough; this is a big deal. The data analytics firm that worked with Donald Trump’s election team and the winning Brexit campaign harvested millions of Facebook profiles of US voters, in one of the tech giant’s biggest ever data breaches, and used them to build a powerful software program to predict and influence choices at the ballot box. Documents seen by the Observer, and confirmed by a Facebook statement, show that by late 2015 Facebook had found out that information had been harvested on an unprecedented scale.
One of the most questioned parts of Zuckerberg’s apology statement is whether the team at Facebook learned about the enormity of the breach beforehand and chose to stay silent. Zuckerberg says the company learned about it from journalists at The Guardian, The New York Times, and Channel 4, who have been investigating Cambridge Analytica since 2015. Many do not believe he was totally unaware, and they say he should testify before Congress. In an exclusive TV interview with CNN last night, Zuckerberg said he will testify before Congress if it is “the right thing to do.” As for calls for stiffer regulation of Facebook, he suggested that he is not opposed to it, saying, “I’m not sure we shouldn’t be regulated.” Instead, Zuckerberg said the focus should be on what the right regulation is.
In the meantime, Zuckerberg has already started putting new policies in place at Facebook to win back the community’s trust. Based on his statement, the broken trust is both between Facebook and its users and between Facebook and Cambridge Analytica. He outlines the events leading to the Cambridge Analytica breach, writing that he has been “working to understand exactly what happened”.
That timeline starts in 2007, when Facebook first launched the Facebook Platform, which gave developers permission to create third-party apps that could be integrated with the site. In 2013, a Cambridge University researcher created a personality quiz app that gained access not only to the data of the few hundred thousand people who installed it, but also to that of all their friends. In 2014, Facebook changed its policy so that third-party apps could no longer access information about a person’s friends unless those friends also gave permission, but the damage had already been done.
Zuckerberg writes that Facebook learned the Cambridge researcher had violated the company’s policies in 2015, when journalists revealed that the data had been shared with voter profiling company Cambridge Analytica. At that time, the researcher’s app was banned and both he and Cambridge Analytica provided certification showing they had deleted all user data. It was only last week, Zuckerberg says, that Facebook learned this may not have been the case. “This was a breach of trust between Kogan, Cambridge Analytica and Facebook,” Zuckerberg writes. “But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.”
So what’s next?
Zuckerberg says Facebook is working with regulators to conduct a forensic audit of Cambridge Analytica, and the company is taking more steps to bolster security. It is investigating other apps that were created before the 2014 policy revision and will audit any suspicious activity and alert users who might be affected; it is restricting third-party apps’ access to data; it will automatically delete long-dormant apps; and it is creating a new tool to help users learn more about what permissions they are giving apps.
“We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you.”