Presidential election a victory for quants
Nate Silver, other quantitative analysts show how statistical modeling can be used to predict election outcomes with surprising accuracy
Computerworld - If there was one lesson for political pundits from last week's presidential election, it was that basic statistical modeling techniques can be used to predict election outcomes with stunning accuracy.
Quantitative analyst and New York Times blogger Nate Silver garnered a lot of attention for calling the results accurately in every single state and in the District of Columbia.
Silver's repeated predictions that President Obama would win 332 electoral college votes and that challenger Mitt Romney would garner 206 were initially dismissed as biased by many who insisted the race was a virtual tossup. Silver's spot-on forecasts proved his critics wrong and focused unprecedented attention on the work that several quants, as quantitative analysts are known, have been doing to forecast election outcomes more reliably.
As far back as June, Drew Linzer, an assistant professor of political science at Emory University, predicted that Obama would win reelection and secure at least 52% of the popular vote. Like Silver, Linzer also had Obama winning 332 electoral college votes and Romney taking home 206.
Even as political pundits breathlessly forecast a tight race, Linzer's blog site Votamatic consistently had the president winning the election by a small but comfortable margin.
"I never saw it as being a close race" Linzer said speaking with Computerworld this week. "When I started producing my forecast in late May, the historical model that I was using showed that Obama would get about 52% of the major party vote."
Despite minor fluctuations in support for both candidates in the weeks leading up to the elections, such as immediately after the first debate, the data always showed Obama winning in the end, he said.
Linzer, like Silver, made his forecasts by aggregating state-level poll data with economic indicators and data from previous elections. He started by constructing a baseline forecast for each state using a statistical model developed by Alan Abramowitz, a fellow Emory professor, who used the model to predict the outcome of the 2008 elections.
The model, called Time-For-Change, predicts the incumbent party candidate's national vote share by looking at factors such as the president's approval rating in June, the percentage change in gross domestic product in the first two quarters of the year, and the number of years the incumbent party has held the presidency, Linzer said.
Historical data shows that these measures are especially useful indicators of how a first-term president will fare in the elections, Linzer said. For instance, since 1948, presidents that have been popular in June have been much more likely to get reelected in November, he noted.
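The Time-for-Change approach described above is essentially a linear regression on a handful of fundamentals. The sketch below shows the general shape of such a model; the coefficient values are illustrative placeholders, not Abramowitz's published estimates.

```python
# Sketch of a Time-for-Change-style fundamentals model.
# All coefficients below are ILLUSTRATIVE, not Abramowitz's actual values.

def incumbent_vote_share(net_approval_june, q2_gdp_growth, terms_in_office):
    """Predict the incumbent party's share of the two-party vote (%).

    net_approval_june: president's June approval minus disapproval (pct pts)
    q2_gdp_growth:     annualized real GDP growth in Q2 of the election year
    terms_in_office:   consecutive terms the incumbent party has held office
    """
    baseline = 47.0          # assumed intercept
    w_approval = 0.10        # assumed weight on June net approval
    w_gdp = 0.50             # assumed weight on Q2 GDP growth
    first_term_bonus = 4.0   # assumed advantage when seeking a second term

    share = baseline + w_approval * net_approval_june + w_gdp * q2_gdp_growth
    if terms_in_office == 1:
        share += first_term_bonus
    return share

# Hypothetical inputs: even approval, modest growth, first-term incumbent.
print(round(incumbent_vote_share(0.0, 1.5, 1), 2))
```

With these placeholder weights, a first-term incumbent with flat approval and 1.5% growth lands at about 51.75% of the two-party vote, which is in the same neighborhood as the roughly 52% baseline Linzer describes.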
As the weeks progressed, Linzer began basing his forecasts increasingly on state-level opinion poll data and less on the historical data he had used to build his baseline model. "When I started off in May and June, the forecasts were based on long-term fundamental economic and political variables," he said, because little poll data was available at the time.
As more poll data became available, it was thrown into the mix. "The basic idea is that on Election Day or the weeks leading up to Election Day, the polls are the best indicator" of an outcome, he said.
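One simple way to implement the shift Linzer describes, from a fundamentals baseline toward the polls as Election Day nears, is a time-varying weighted average. This is a minimal illustration of that idea, not Linzer's actual (Bayesian) method; the linear weighting scheme and the 180-day horizon are assumptions.

```python
def blended_forecast(baseline, poll_average, days_until_election, horizon=180):
    """Blend a fundamentals baseline with the current poll average.

    The weight on polls rises linearly from 0 (at `horizon` days out)
    to 1 (on Election Day). Purely illustrative weighting scheme.
    """
    w_polls = max(0.0, min(1.0, 1.0 - days_until_election / horizon))
    return w_polls * poll_average + (1.0 - w_polls) * baseline

# Hypothetical numbers: 52.0% fundamentals baseline, 50.5% poll average.
print(blended_forecast(52.0, 50.5, days_until_election=180))  # all baseline
print(blended_forecast(52.0, 50.5, days_until_election=0))    # all polls
```

Early in the race the forecast sits on the fundamentals baseline; in the final weeks it converges to the poll average, mirroring the "polls are the best indicator" logic quoted above.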