On the eve of the United States presidential election, traditional news media were reporting a closely contested race, with many polls pointing to a photo finish. One thing, however, has become clear from this election: not all polls are created equal. The pollsters using the latest data processing and analysis techniques were the most successful in predicting the outcome. For those who had the stamina to watch the campaign unfold over 22 long months, it became not just a battle of ideologies and campaign issues, but also a rivalry between old media pundits and new media analysts.
Three different “polling aggregators” used different methodologies to predict the outcome of the election. One of them, Drew Linzer, posted on his website in June 2012 that the election would be won with 332 electoral votes for Obama and 206 for Romney. Over the months that followed, that prediction didn’t change, even as new information came in. And with Florida only called at the weekend, the final electoral count came in at... 332 votes for Obama, 206 for Mitt Romney.
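One simple way an aggregator can beat any single poll is to pool them, weighting each result by its sample size. The sketch below illustrates the idea only; the poll figures are hypothetical, and real aggregators such as Linzer’s used far more sophisticated statistical models.

```python
# A minimal, hypothetical sketch of poll aggregation: a sample-size
# weighted average of several polls. The numbers are invented and do
# not come from any 2012 poll.
def aggregate(polls):
    """polls: list of (candidate_share_pct, sample_size) tuples."""
    total_respondents = sum(n for _, n in polls)
    return sum(share * n for share, n in polls) / total_respondents

polls = [(51.0, 800), (49.5, 1200), (50.5, 1000)]
estimate = aggregate(polls)  # pooled estimate of the candidate's share
```

Larger polls carry more weight, so one outlier with a small sample moves the estimate far less than it moves the headlines.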
What is remarkable here is not only that savvy pollsters are using predictive analytics to determine outcomes – but just how accurate these predictions have proved, particularly compared with old-school Washington methodologies of going on a hunch or following your instinct. These results show that we have now got better at predicting the future, which is no small feat – but how about influencing the outcome? It turns out this was also the election where that happened.
President Obama’s campaign ran an extremely sophisticated and relentless digital operation that threw out the rule book and took no assumption for granted. This was not an election won with a clever advertising campaign – that is too 90s, and in fact that is what the Republicans did. This campaign was masterminded by data analysts who left nothing to chance. They revived the virtual campaign centre mybarackobama.com from the ’08 election (highlighting the benefits of “owning” your data) and encouraged supporters to volunteer personal information, post comments, photos and videos, and donate funds. But this was only the starting point. In a multi-pronged engagement strategy, webmasters used supporters’ content to galvanise others and drive traffic to other campaign sites such as Obama’s Facebook page (33 million “likes”) and YouTube channel (240,000 subscribers and 246 million page views).
The Romney campaign struggled to keep up in this digital arms race, resorting to old-fashioned conservative rhetoric posted on blogs and sympathetic TV and radio stations. This attracted headlines and roused followers, but Romney’s YouTube channel attracted only 23,700 subscribers and 26 million page views.
What this tells us is that data mining is changing politics: the Obama campaign micro-targeted potential supporters. Take this example: to raise funds, a contest to dine with Sarah Jessica Parker in her New York home targeted a small selection of people who enjoy competitions, like small dinners, and are attracted to celebrities. Such a group exists – and has deep pockets. Everything about a person that could be measured was measured and, combined with predictive analytics, allowed the campaign not only to find voters but also to determine what sorts of messages would get their attention and what types of people would be persuaded by certain types of messages.
The entire volunteer system of the Obama campaign, a not inconsiderable number of people, was also carefully parsed, and call lists were allocated based on the likelihood of a match. Call lists ranked names in order of persuadability; 75% of the data covered basics such as gender, age, address and voting record, but an additional 25% of consumer data allowed them to predict who was going to make a donation online, who would do it by mail, and who would become a volunteer.
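A persuadability-ranked call list can be sketched as a simple scoring model. The feature names and weights below are invented for illustration – the campaign’s actual models and data are not public – but the shape is typical: a logistic model maps a voter’s known attributes to a 0–1 score, and the list is sorted by that score.

```python
# Hypothetical sketch of ranking a call list by persuadability.
# Features and weights are invented; a real model would be fitted to
# historical response data (e.g. with logistic regression).
import math

WEIGHTS = {"age_18_29": 0.8, "voted_2008": 0.5,
           "donor_history": 1.2, "magazine_subscriber": 0.3}
BIAS = -1.0

def persuadability(features):
    """Map a voter's binary features to a 0-1 score via a logistic model."""
    z = BIAS + sum(w for f, w in WEIGHTS.items() if features.get(f))
    return 1 / (1 + math.exp(-z))

voters = [
    ("A. Smith", {"age_18_29": 1, "donor_history": 1}),
    ("B. Jones", {"voted_2008": 1, "magazine_subscriber": 1}),
]
# Most persuadable names rise to the top of the call list.
call_list = sorted(voters, key=lambda v: persuadability(v[1]), reverse=True)
```

The 75/25 split described above maps naturally onto the feature set: voter-file basics supply most of the columns, while purchased consumer data supplies the rest.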
When the campaign felt it had sufficient momentum in galvanising supporters, it turned to fundraising, and then to voter turnout. Every single night, the team ran 66,000 computer simulations of how the election might pan out in order to find the optimum breakdown. Such exercises pointed them to avenues they hadn’t considered, such as Reddit, a social news website.
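Those nightly runs were Monte Carlo simulations: given an estimated win probability for each battleground state, simulate the election many times and count how often a candidate reaches 270 electoral votes. The sketch below shows the technique only – the three states, their probabilities and the safe-vote baseline are hypothetical, not the campaign’s actual figures.

```python
# Hedged sketch of a nightly Monte Carlo election simulation.
# State probabilities and the safe-vote baseline are invented.
import random

STATES = {"Ohio": (18, 0.60), "Florida": (29, 0.50), "Virginia": (13, 0.55)}
BASE_VOTES = 237  # electoral votes assumed safe, for illustration

def simulate(n_runs=66_000, seed=0):
    """Estimate the probability of reaching 270 electoral votes."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_runs):
        votes = BASE_VOTES + sum(ev for ev, p in STATES.values()
                                 if rng.random() < p)
        wins += votes >= 270
    return wins / n_runs

win_probability = simulate()
```

Re-running the simulation each night with fresh polling data shows which states swing the outcome most – exactly the kind of exercise that can flag an undervalued channel worth investing in.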
This was also the first time that cloud computing played a big role in a campaign: the Obama team ran their data mainly on Amazon Web Services and used open source software and Amazon services to inexpensively write or tailor their own programmes.
Team Obama’s experience tells you that David beat Goliath – it wasn’t the super-expensive ads on the national networks that won the election, but the very careful micro-targeting of messages tailored to each reader. Campaign messages were even directed differently at siblings and spouses.
And this is something that corporate marketers and CIOs can replicate very easily. If predictive analytics and data processing are taking some of the magic out of democracy, will the same concepts applied to consumers mean the end of free will?
About the Author:
Mike Lynch is the founder and former CEO of UK software company Autonomy.