On a crisp morning in early September, Christopher Wylie is waiting opposite Google’s London headquarters. It’s a brightly colored building that stands out–not unlike Wylie himself, with his pink hair and nose ring. Six months after revelations that shook the world’s biggest social-media companies and cast doubt on the legitimacy of the 2016 U.S. election, Wylie isn’t holed up in an embassy or in exile; he’s living freely in London.
The week before we meet, Google declined to send a senior executive to testify before the Senate on Sept. 5 about the role of tech companies in election meddling by foreign actors. At the hearing, Twitter chief executive Jack Dorsey and Facebook chief operating officer Sheryl Sandberg testified that their companies were taking sufficient action to protect the November midterms from foreign interference. “We are learning from what happened, and we are improving,” Sandberg said.
“I call bullsh-t on that,” says Wylie, 29, who has become a cheerleader for greater regulation online. “The idea that we should trust the security of our digital spaces to private companies that have no accountability except to themselves is ridiculous.”
Wylie speaks with a certain authority on the matter. In March, he publicly revealed how Cambridge Analytica, a political consultancy he helped found in 2013, used illegally obtained Facebook data to psychologically profile voters for electoral campaigns. Canadian-born Wylie had been the brains behind the company’s methodology but left in fall 2014, reportedly unhappy with his bosses’ willingness to work with right-wing politicians.
Now he has given evidence to the Senate Judiciary Committee about Cambridge Analytica’s contacts with Russia and work with Donald Trump’s campaign and to a U.K. inquiry investigating the role of fake news in the 2016 E.U. referendum. His revelations kicked off a debate about the untrammeled power of tech giants and the vulnerability of Western democracies to disinformation–concerns that, Wylie says, social-media companies have failed to address as U.S. midterms approach. “They have been completely obstructionist,” he says.
Their reluctance may not come as a surprise. Facebook’s stock price fell 7% after Wylie revealed that data obtained from a quiz app on its site was being used by political campaigns to identify people’s specific characteristics–such as openness, conscientiousness and neuroticism. Cambridge Analytica used this data to target voters with political content tailored to their individual psychological profile.
More damaging to Facebook, the app was allowed to scrape the personal details of the friends of anybody who had used it. In all, Cambridge Analytica took data from 30 million to 87 million profiles and combined it with data from sources like TV set-top boxes and credit cards to build an incredibly detailed picture of segments of the U.S. electorate.
Trump campaign officials said Cambridge Analytica only “provided limited staffing” in 2016 and that none of the Facebook data was used. But Trump’s son-in-law Jared Kushner had previously boasted of the company’s involvement, while CEO Alexander Nix was recorded saying, “Our data informed all the strategy.” Without that data, Wylie believes Trump’s victory might not have happened. “I think about it a lot,” he says.
After leaving school in Canada at 16, Wylie–who was diagnosed with attention-deficit/hyperactivity disorder and dyslexia as a teenager–threw himself into learning about the political uses of data. By 20, he found himself in London juggling his university studies with work on voter targeting for the U.K.’s centrist Liberal Democrats. Then, in 2013, Wylie stumbled across some research funded by the U.S. military agency DARPA into psychological profiling using social data. He took these findings out of a laboratory context and set about applying them at the national level.
Though Wylie left Cambridge Analytica well before his work was adapted for the Trump campaign, the damage was done. Wylie had already shown his profiling tool to billionaire Robert Mercer–who partly owned and funded Cambridge Analytica and went on to become a Trump mega-donor; he had also shared it with Trump’s future campaign chief Stephen Bannon.
Wylie says his real failure was not realizing the “potential misuse” of his research earlier. “You’re going to get a situation where you’ve created an atomic bomb,” he says, pausing before carrying on more softly. “And that harms a lot of people.”
Leaving Google’s office behind, we make our way past Cambridge Analytica’s old headquarters, on New Oxford Street. Wylie barely looks at the polished revolving doors through which, six months earlier, his former boss was bundled by aides to avoid waiting journalists. The company attempted to ride out the storm caused by Wylie’s revelations but eventually found itself forced to close in May after several clients bailed.
Wylie is keen to leave that chapter of his life behind. He says he has tried to redeem himself by raising awareness of just how vulnerable social-media users are to exploitation, whether by foreign actors, shady political consultancies or the companies themselves. It’s a task made harder by the fact he’s still banned from Facebook, a company that he says has unique, unregulated power over public discourse. (Facebook declined to comment.)
“It’s very hard to participate in society when you can’t talk to people on the medium that they talk to other people on,” he says. If social-media platforms are going to call themselves communities, he adds, “there should be transparency and oversight, which is a role for government.”
He finds himself, awkwardly, in a similar position to Alex Jones, who runs the conspiracy-theorist website InfoWars. Jones has been permanently banned from Facebook, Twitter and other social-media platforms in recent months following a drawn-out debate over whether the sites should give his often untrue and inflammatory content a platform.
But Wylie is at pains to point out a crucial difference between himself and the Texan: his own revelations were truthful. “I’m sorry, Alex Jones, but the content that InfoWars has been putting out is just flat-out lies. You are entitled to say whatever it is that you want, that is fine; you are not entitled to a megaphone if that information is false.”
But he does agree that the social-media companies themselves should not be the ones making the decision to “remove somebody from society,” as he puts it. “You’ve got social media companies that do not have to provide a fair process, a due process, and can unilaterally get rid of people because they disagree with what’s being said. I think there’s a real risk, long term, of what happens when you’ve got a company who decides to start doing that.”
Chief among his other concerns is that Western democracies are sleepwalking into yet more elections without taking sufficient action to reduce the chances of another Cambridge Analytica or another Trump-Russia saga. He compares the unregulated social-media landscape to heavily regulated industries like food, electricity and air travel. “When you get into an airplane, do you feel safe?” He doesn’t wait for an answer. “Most people would say yes, and that’s because there are rules in place. You might get on a plane once or twice a year. You check your phone 150 times a day. This impacts people’s lives on an hour-by-hour basis, and there are no rules.” But how can real regulation happen in a cyberspace that transcends national borders? “It is possible to create some common ground rules on the Internet in our international framework of nation states,” he says. “The infrastructure of our democracy is just open season right now.”
Although he continues pushing for tech regulation, Wylie has left political consulting to work in fashion, using data to spot trends. That informs a political metaphor he’s been thinking about lately. “I think about Donald Trump like I think about Crocs, the hideous shoes. There’s a period of time where no one would ever wear them. Then all of a sudden everyone wears Crocs, and a couple of years later people look at photos of themselves and go, ‘What the f-ck was I just wearing–that was hideous.’” It’s a lesson that also applies online, he suggests. “People are willing to do really ugly things if lots of other people are also doing them.”
This appears in the October 1, 2018 issue of TIME.