LONDON, UK – The 2018 Facebook-Cambridge Analytica incident opened the world’s eyes to how much private user data was being collected, shared and sold. But experts worry that the ramifications of the scandal go far beyond Facebook, sowing distrust in the election process as a whole.
Jamie Bartlett, senior fellow and former director of the Centre for the Analysis of Social Media at Demos, said that Cambridge Analytica gave the U.S. a glimpse of how voters’ data can be used in an ultra-personalized way to sway them during elections.
“Our elections are radically changing because of data analytics,” he said during a Wednesday keynote at Infosecurity Europe. “Elections are becoming a question of data science, microtargeting smaller groups of people – that’s not what elections should be about. The risk is, people stop trusting elections.”
The incident involving Cambridge Analytica, the U.K.-based company that helped political parties target voters with specific messages based on information gleaned from data mining and analysis, thrust Facebook into the spotlight in 2018 over its data-protection policies.
The consulting firm, which was linked to Donald Trump’s presidential campaign and to the Leave.EU campaign for the U.K.’s withdrawal from the European Union, was able to access data from up to 50 million Facebook users.
But beyond the outrage over data collection, Bartlett warned that the Cambridge Analytica incident offered a dire glimpse into the future of elections and how companies will use mined data to sway voters – especially given that many more companies are also collecting massive amounts of highly personalized data.
That data, gleaned from personality quizzes on Facebook, was used to build psychographic profiles of quiz-takers, matching each to a personality type and then tailoring campaign messages to their scores.
And while Bartlett said he didn’t believe that Cambridge Analytica swayed the 2016 elections, he does think that “lots more people are doing the same thing as Cambridge Analytica… it’s pretty industry standard.”
“Focusing on the 2016 election, people are not thinking of what’s really going on, which is a change in how we think about elections,” he said. “I don’t think that Cambridge Analytica worked [in swaying the election] but I think one day however it will work… and that’s what we should worry about.”
Exacerbating the issue, data collection is becoming ever more embedded in people’s personal lives. Consider the slew of IoT devices gathering data straight from the privacy of consumers’ homes, he said.
To show just how personal campaigns could get based on IoT data, Bartlett outlined an example in which data mined from smart refrigerators reveals when consumers eat. A politician could then time a charged-up campaign message to reach a consumer who has not yet had dinner – and who, being hungry, may be more apt to get riled up by it.
“How can you effectively run an election that people trust when it’s coming through personalized, possibly automated messages through your smart fridge?” he asked. “No one is thinking about that message. There’s no rules for running elections – that’s the problem with the Cambridge Analytica story, that regulators can’t monitor it.”
Trust Is Gone
Cambridge Analytica’s impact on elections is a “perfect illustration” of the issues that tech is sparking, said Bartlett.
The advent of new tech is colliding with older, traditional systems, and those systems aren’t ready for the unprecedented security, privacy and legal issues that technology is bringing to the modern world – including election security, bots, data collection and more, said Bartlett.

Illegal marketplaces, meanwhile, are growing more agile, effectively lowering the barriers to entry for cybercrime, he said. At the same time, cybercriminals are becoming more sophisticated, adopting ways to automate crime (including automated exploitation tools like AutoSploit) and applying artificial intelligence – with more stolen data being sold illicitly online.
“This extremely vibrant, rapidly-changing marketplace for stolen information isn’t going away, it’s one of the reasons why the problem is getting worse,” said Bartlett. “Part of the problem is that there’s no easy way to stop these marketplaces from stealing data.”
Because of these issues, there is growing concern that people will begin to distrust the processes around them – from elections to marketplaces to tech itself, said Bartlett.
“The closer that tech gets in our lives, home, and political systems, the more people notice when they are impacted,” Bartlett said.
Despite these worries about trust, Bartlett remained optimistic that the infosec community “can overcome this.”
“All of these concepts together are a fundamental threat to people’s relationships with technology,” he said. “The way that can be avoided is by people continuing to see that there are solutions available to them.”