Hacking your vote from inside your head

by Anton Janik ([email protected])

We recently learned some important lessons about the dangers inherent in losing control of your customer data, whether through hacking, internal theft, or poorly designed controls governing what your business associates may access or share. The case in point is Cambridge Analytica.

On March 16, 2018, Facebook suspended the accounts of Strategic Communication Laboratories (SCL) and its affiliate Cambridge Analytica. (SCL created Cambridge Analytica, which was funded by Robert Mercer and had Steve Bannon on its board of directors.) Back in 2015, Cambridge Analytica began pooling Facebook user profile data from the myPersonality app, which Cambridge professor David Stillwell was using to understand and measure personality traits. Stillwell was able to track scored personality traits across the app’s user base and correlate those scores back to each user’s pattern of Facebook likes.

The Guardian reports this was groundbreaking in the way it revealed correlations between personality traits and measurable behavior in the form of likes; that is, by knowing the pattern of a user’s likes, you could infer that user’s personality traits.
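To make the mechanics concrete, here is a minimal sketch of how such a likes-to-traits model might work, assuming a binary user-by-page matrix of likes and numeric trait scores of the kind the myPersonality research collected. The synthetic data, dimensions, and choice of ridge regression are illustrative assumptions, not the actual research pipeline.

```python
# Illustrative sketch only: predicting a personality-trait score from a
# binary matrix of Facebook likes. All data here is synthetic; this is
# not the myPersonality or Cambridge Analytica pipeline.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_users, n_pages = 1000, 200                         # hypothetical corpus size
likes = rng.integers(0, 2, size=(n_users, n_pages))  # 1 = user liked the page

# Pretend a handful of pages carry signal about one trait (say, openness),
# mimicking the kind of correlation the research uncovered in real like data.
true_weights = np.zeros(n_pages)
true_weights[:10] = rng.normal(0, 1, 10)
openness = likes @ true_weights + rng.normal(0, 0.5, n_users)

X_train, X_test, y_train, y_test = train_test_split(
    likes, openness, random_state=0)
model = Ridge(alpha=1.0).fit(X_train, y_train)
print(f"held-out R^2: {model.score(X_test, y_test):.2f}")
```

The point is simply that once likes and trait scores line up in a single table, recovering the correlation the article describes becomes a routine modeling exercise.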

Cambridge Analytica co-founder Christopher Wylie took that idea one step further. While studying for his doctorate, he had come across a paper discussing how personality traits could be a precursor to political behavior, i.e., determinative of likely voting behavior. One needed only to acquire a data set covering enough of the voting pool, along with a known set of likes identifying those likely to vote a particular way.

After graduation, Wylie was hired by SCL, which was in the process of gathering that large Facebook dataset. Reports have revealed that the myPersonality app had been granted special permissions to gather not just the personal data of consenting users, but also the same data for all of those users’ Facebook friends, without that access ever being sought from or granted by the friends themselves. Wylie said the effort swept up data from 50 to 60 million Facebook accounts, including users’ status updates, likes, and potentially their private messages.

In a fascinating interview with The Guardian, Wylie revealed how Cambridge Analytica manufactured content outside Facebook to induce behavior among those Facebook users. Through data mining, the firm had learned what kinds of messaging each user would be susceptible to; the framing, topics, content, tone, and threat level that messaging needed; where the user would consume it; and how many times Cambridge Analytica would need to “touch” the user with that messaging in order to change how the user thought about an issue.
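In engineering terms, the per-user parameters Wylie describes amount to a targeting record. The sketch below is hypothetical; every field name is inferred from the interview’s description rather than taken from any known Cambridge Analytica schema.

```python
# Hypothetical per-user targeting profile paraphrasing the parameters
# Wylie describes; the field names are illustrative, not an actual schema.
from dataclasses import dataclass

@dataclass
class TargetingProfile:
    user_id: str
    susceptible_topics: list[str]  # issues this user responds to
    framing: str                   # e.g. "economic" vs. "security" framing
    tone: str                      # e.g. "fearful", "hopeful"
    threat_level: str              # how alarming the messaging should be
    channels: list[str]            # where the user consumes content
    touches_required: int          # exposures needed to shift opinion

profile = TargetingProfile(
    user_id="u123",
    susceptible_topics=["immigration"],
    framing="security",
    tone="fearful",
    threat_level="high",
    channels=["facebook_feed", "blogs"],
    touches_required=7,
)
print(profile)
```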

Cambridge Analytica then put its data scientists, psychologists, strategists, and creative team of designers, videographers, and photographers to work creating content, including blog postings and websites, which was passed to a targeting team that injected it into the internet. As Wylie says, “whatever we think this target will be receptive [to], we’ll create content on the internet for them to find, and they will see that and click on it and go down the rabbit hole until they start to think something differently.” The strategy was so successful, Wylie points out, because the data could be mined so granularly that “you are whispering into the ear of each and every voter; you may be whispering one thing to this voter and something different to that one,” all of which induces the particular intended outcome.

This isn’t simply an issue of leading a mouse with cheese. Through that active creation of misinformation, false narratives, and false support, Cambridge Analytica built a bubble of false reality designed to induce a predetermined outcome in the real world. When asked why Cambridge Analytica would undertake such a task, Wylie stated that “if you want to fundamentally change society, you first have to break it. And it’s only when you break it that you can remold the pieces into your vision of a new society. This was the weapon that Steve Bannon wanted to build to fight his culture war.”

The Cambridge Analytica lesson is important because it illustrates just how damaging data losses can be. Data stolen from one or more sources can be aggregated, segmented, and weaponized in ways that the originating data repositories could never have anticipated. And once lost, that data cannot easily be recaptured. In the coming days, we may well learn that this data loss was just the tip of the iceberg, or of a fleet of icebergs, and that other businesses have acquired similarly wide swathes of data without user consent. It may turn out that the acquired data has not been properly secured since acquisition and has continued to disseminate across the web, or even across the dark web.

So this brings us back to root precautions: pay attention to what data you have, how it is held, and who has access to it. Encrypt your data in motion and at rest. Penetration-test your repositories and the tools that access them. Monitor your access rights: to whom they are granted, the scope of data that may be retrieved, and what must be done with that data at the end of the project.
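As a concrete illustration of the encrypt-at-rest precaution, here is a minimal sketch using the Fernet recipe from Python’s cryptography package. The file names are hypothetical and the key handling is deliberately simplified; in practice the key belongs in a key-management service, never alongside the data it protects.

```python
# Minimal sketch: encrypting a data file at rest with Fernet
# (authenticated symmetric encryption from the `cryptography` package).
# Paths and key storage are illustrative assumptions only.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in production, fetch this from a KMS/HSM
fernet = Fernet(key)

with open("customer_records.csv", "rb") as f:
    plaintext = f.read()

with open("customer_records.csv.enc", "wb") as f:
    f.write(fernet.encrypt(plaintext))

# Later, an authorized process holding the key recovers the data:
with open("customer_records.csv.enc", "rb") as f:
    assert fernet.decrypt(f.read()) == plaintext
```

Fernet is a reasonable default here because it is authenticated encryption: any tampering with the stored ciphertext is detected at decryption time rather than silently yielding corrupted data.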

Make sure your business associate agreements require compliance with these controls, and audit your business associates for that compliance.
––––––––––––––
Editor’s note: Anton Janik is an attorney at Mitchell, Williams, Selig, Gates & Woodyard, P.L.L.C. The opinions expressed are those of the author.