How Facebook likes could profile voters for manipulation

Barbara Ortutay and Anick Jesdanun, AP Technology Writers | 3/20/2018, 6:36 a.m.
Facebook "likes" can tell a lot about a person. Maybe even enough to fuel a voter-manipulation effort like the one ...
A former employee of a Trump-affiliated data-mining firm says it used algorithms that "took fake news to the next level" using data inappropriately obtained from Facebook. (AP Photo/Thibault Camus, File)

NEW YORK — Facebook "likes" can tell a lot about a person. Maybe even enough to fuel a voter-manipulation effort like the one a Trump-affiliated data-mining firm stands accused of — and which Facebook may have enabled.

The social network is under fire after The New York Times and The Guardian newspaper reported that former Trump campaign consultant Cambridge Analytica used data, including user likes, inappropriately obtained from roughly 50 million Facebook users to try to influence elections.

Monday was a wild roller coaster ride for Facebook, whose shares plunged 7 percent in its worst one-day decline since 2014. Officials in the EU and the U.S. sought answers, while Britain's information commissioner said she will seek a warrant to access Cambridge Analytica's servers because the British firm had been "uncooperative" in her investigation. The first casualty of that investigation was an audit of Cambridge that Facebook had announced earlier in the day; the company said it "stood down" that effort at the request of British officials.

Adding to the turmoil, The New York Times reported that Facebook security chief Alex Stamos will step down by August following clashes over how aggressively Facebook should address its role in spreading disinformation. In a tweet, Stamos said he's still fully engaged at Facebook but that his role has changed.

It would have been a quieter day had Facebook likes not turned out to be so revealing. In a 2013 study, researchers found that likes on hobbies, interests and other topics can predict personal attributes such as sexual orientation and political affiliation. Computers analyze such data for patterns that might not be obvious, such as a link between a preference for curly fries and higher intelligence.
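As an illustration of how such predictions work, consider a minimal sketch with synthetic data (not the study's actual pipeline): a classifier is fit on a binary user-by-page matrix of likes, using the users whose attribute is already known as training labels.

```python
# Minimal sketch: predicting a personal attribute from "likes".
# All data here is synthetic; the 2013 study used logistic regression on a
# dimensionality-reduced user-by-like matrix, which this approximates.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 200

# Sparse 0/1 matrix: each row is a user, each column a likeable page.
likes = (rng.random((n_users, n_pages)) < 0.05).astype(float)

# Wire a few pages to the attribute purely for demonstration, standing in
# for real-world correlations like curly fries and intelligence.
signal = likes[:, [3, 17, 42]].sum(axis=1)
trait = rng.random(n_users) < 1.0 / (1.0 + np.exp(0.5 - 2.0 * signal))

X_train, X_test, y_train, y_test = train_test_split(
    likes, trait, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

# The largest coefficients identify which likes drive the prediction,
# the kind of non-obvious pattern the study described.
top_pages = np.argsort(-np.abs(model.coef_[0]))[:5]
print("most predictive pages:", top_pages)
```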

Chris Wylie, a Cambridge co-founder who left in 2014, said the firm used such techniques to learn about individuals and create an information cocoon to change their perceptions. In doing so, he said, the firm "took fake news to the next level."

"This is based on an idea called 'informational dominance,' which is the idea that if you can capture every channel of information around a person and then inject content around them, you can change their perception of what's actually happening," Wylie said Monday on NBC's "Today." It's not yet clear exactly how the firm might have attempted to do that.

Late Friday, Facebook said Cambridge improperly obtained information from 270,000 people who downloaded an app described as a personality test. Those people agreed to share data with the app for research — not for political targeting. And the data included who their Facebook friends were and what they liked — even though those friends hadn't downloaded the app or given explicit consent.

Cambridge got only limited information on the friends, but machines can use detailed answers from smaller groups to make good inferences about the rest, said Kenneth Sanford of the data science company Dataiku.
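In concrete terms, that inference is an ordinary train-then-predict step: fit a model on the small group that supplied detailed answers, then score everyone for whom only like data exists. A hedged sketch, again with synthetic data rather than anything resembling the firm's actual system:

```python
# Sketch of the inference Sanford describes: a model trained on a small,
# fully profiled group (app users who answered the personality test) can
# score a far larger group (their friends) known only by their likes.
# All data below is synthetic and purely mechanical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n_pages = 200

# Small labeled group: like vectors plus a survey-derived label.
app_user_likes = (rng.random((5_000, n_pages)) < 0.05).astype(float)
app_user_trait = rng.integers(0, 2, 5_000)

# Much larger unlabeled group: friends with like data only.
friend_likes = (rng.random((50_000, n_pages)) < 0.05).astype(float)

model = RandomForestClassifier(n_estimators=50, random_state=1)
model.fit(app_user_likes, app_user_trait)

# Probability estimates for every friend, none of whom took the survey.
friend_scores = model.predict_proba(friend_likes)[:, 1]
print(friend_scores[:5])
```

Scores like these are how people who never installed the app, and never consented, could still end up profiled.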

Cambridge was backed by the conservative billionaire Robert Mercer, and at one point employed Stephen Bannon, later President Donald Trump's campaign chief executive and White House chief strategist, as a vice president. The Trump campaign paid Cambridge roughly $6 million, according to federal election records, although officials have more recently played down that work.