Thursday, September 12, 2013

Crossing Party Lines with Predictive Modeling

With the rise of Nate Silver and the emergence of mainstream data science, we've seen many uses for predictive analytics, including the entrance of predictive modeling into the political arena. Although predicting election results is a booming business now, the practice has been around for quite some time.

I recently got the chance to talk to Matt Hennessy, Managing Director at Tremont Public Advisors, about a campaign he worked on for Joe Lieberman in 2006, and how they implemented predictive modeling for a successful Senate election. For those who are interested, we'll be discussing this and other examples of predictive modeling in action in a webinar on Tuesday, September 17th. 

Can you give us some background on the 2006 Senate election?

In 2006 in Connecticut, Joe Lieberman was up for reelection to the Senate as a Democrat. He had been the Vice Presidential nominee in the 2000 election and had taken a position supporting the Iraq war, which upset a lot of the Democratic base. He wound up losing the Democratic primary to Ned Lamont, who won on a big anti-war push. Once Lieberman lost the primary, he lost access to a considerable amount of infrastructure – union support, door-to-door field workers, and all of the other boots on the ground he would have had were gone. He lost most of his staff, except for the people who had been there for a decade or two. He needed to figure out how to replace some of the advantages he’d had with other resources.

As someone advising him, I saw that we had a problem: without a field operation and all of those bodies, we didn’t know exactly whose votes we needed to get out or who the likely Lieberman voters were. We had a very expensive polling operation going, using conventional methods to reach conclusions about which demographics were most likely to vote, but we decided that we needed something more.

How was the decision made to use predictive analytics in the campaign?

The resources that normally would be used for get-out-the-vote efforts or direct voter contact were gone the day after the primary. Usually we’d go out and try to visit all of the potential voters, but that just wasn’t possible anymore. We needed to work smarter to compensate for the new lack of resources. We wondered whether there was a way to determine which characteristics indicated a likelihood of voting for Lieberman, so that we could figure out exactly who to pull out on Election Day. After a conversation with Mike Laracy about performing this type of analysis, we decided to give predictive modeling a try. Our goal was to score every registered voter on their likelihood of voting for Lieberman, and we used Rapid Insight to build a model to do that.

We knew what data we had – the voter file – and determined what additional information we would need to build a model, like demographic information. Then we hired a polling company to call about 10,000 named voters, pulled at random, so that we’d have a statistically meaningful sample. The poll itself was very simple: how likely each voter was to vote, and who they planned on voting for. We weren’t trying to persuade people at this point; all of this polling was meant to inform the field side, not the messaging side. This approach was different from what we’d been doing before because we were polling named voters – people who actually existed, were registered to vote, and had demographic information that we could attach to them. Using this poll, we scored each of the 1.9 million registered voters in Connecticut on their likelihood of voting for Senator Lieberman.
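
To make the technique concrete, here is a minimal sketch of that workflow in Python with scikit-learn. It is an illustration only – the campaign used Rapid Insight's software rather than hand-written code, and every file name and column below is a hypothetical stand-in.

```python
# Minimal sketch of the scoring workflow described above. The campaign used
# Rapid Insight's software, not hand-written code; the file names, feature
# columns, and poll-response field here are all hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Polled sample: ~10,000 named voters with demographics joined from the voter
# file, plus the poll response (1 = plans to vote for the candidate).
polled = pd.read_csv("polled_voters.csv")
features = ["age", "party_code", "town_code", "voted_2004"]  # hypothetical
polled = polled.dropna(subset=features + ["supports_candidate"])

# One-hot encode the categorical fields so the model can use them.
X = pd.get_dummies(polled[features], columns=["party_code", "town_code"])
y = polled["supports_candidate"]

model = LogisticRegression(max_iter=1000)
model.fit(X, y)

# Score every registered voter in the state on the same features.
voters = pd.read_csv("registered_voters.csv").dropna(subset=features)
X_all = pd.get_dummies(voters[features], columns=["party_code", "town_code"])
X_all = X_all.reindex(columns=X.columns, fill_value=0)  # align dummy columns
voters["support_score"] = model.predict_proba(X_all)[:, 1]

# Rank the file so Election Day resources go to the highest-propensity voters.
voters.sort_values("support_score", ascending=False).to_csv(
    "scored_voters.csv", index=False
)
```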

How did predictive analytics help the campaign?

Predictive modeling allowed us to optimize our limited resources. Instead of working from pure assumptions, we now had an actual score attached to each individual voter, which allowed us to spend our resources on the voters with the highest propensity to vote for Lieberman. At the time, it was quite a cutting-edge use of analytics – it was the first time anyone had scored an entire state’s voters for the purposes of an election.

Another thing the predictive model did was disprove some of our assumptions about who was likely to vote for Lieberman. Some of the key indicators that we were getting from the traditional pollsters were proven incorrect by the model results. Based on this, we changed some of our campaign messaging. The model allowed us to re-allocate our resources more efficiently, and it challenged some of the notions we held. In the end, the model did a good job of predicting who the voters would be.

Do you think predictive modeling affected the outcome of the election?

It’s difficult to say, but I can say that the resources deployed based on the predictive model were effective. Once we started deploying based on the modeling – toward the end of the race, which is when the model was implemented – the polling margins started to increase. I think it increased the margin of victory. The polling was showing a very tight race, but the predictive model showed that a margin of victory for Lieberman was already there; in this case, the model was actually ahead of the polling.

How do you see predictive modeling being used in future elections?

If you look at what the Obama campaign did with predictive modeling – taking different factors and a complex web of data points to pinpoint individuals who are likely to vote – it’s here. Predictive modeling is here now; it’s the future of elections. The complexity of the work they’re doing in this field is truly amazing. I don’t think it will be as prominent in many of the smaller races – those below governor – but it can be very, very effective. I think this last election confirmed that it’s a major part of any political campaign being conducted at scale. This is here to stay.

This example of predictive modeling in action is one of three that we'll be co-presenting in a webinar with Tableau on Tuesday, September 17th, "Turbocharge your Predictive Models with Visualizations". For more information, or to register, click here.

*
Matt Hennessy has over two decades of experience in federal, state, and city government. He has built a reputation as a trusted and effective advisor to leading elected officials on public policy, communications, and campaign issues. He has served as a political advisor and fundraiser for candidates and campaigns ranging from mayor to U.S. Senator to President. Matt is an alumnus of Harvard Business School and the Kennedy School of Government, where he was a Wasserman Fellow. He also holds degrees from the Catholic University of America and Trinity College in Hartford.

Wednesday, September 4, 2013

#TCC13

In honor of the upcoming Tableau Customer Conference and our recent partnership, I sat down with Rapid Insight President & COO, Ric Pratte, to talk about the partnership, what to expect from Rapid Insight at the conference, and how predictive analytics and data visualization go together. 

Can you talk about the partnership between Rapid Insight and Tableau?

We noticed that a number of our most successful clients were using Tableau, and that observation led us to build a partnership. We have strengths in massaging, analyzing, and predicting from data, and we’ve gained a partner who helps communicate that data to executives and decision makers in a way that’s easily understood. It’s a very complementary relationship. We share a common focus on empowering business users to make data-driven decisions by using their data to look forward. Essentially, we’re working together to help people visualize the future.

We now have a Tableau page on our website that we’re constantly updating so that visitors can continue to learn about the power of combining predictive modeling and visualization, and that’s a great place to get more information.

How do the two products interface?

Actually, our interfaces follow the same methodology – no coding; the user manipulates graphical objects, places them where they need to go, and literally connects the dots to perform an analysis. We’re able to natively connect the output of a data analysis from our tools directly into Tableau as a .tde file, and we can also output to the cloud. The process is very smooth.
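
As a rough illustration of what that handoff can look like programmatically, here is a hedged sketch using the `dataextract` Python module from Tableau's Data Extract API of that era. Rapid Insight's .tde export is native to its products, so the column names and values below are purely hypothetical.

```python
# Hypothetical sketch: writing model output to a .tde file with Tableau's
# Data Extract API (the "dataextract" Python module). This only shows the
# general shape of the handoff; Rapid Insight's .tde export is built in.
import dataextract as tde

extract = tde.Extract("scored_output.tde")

# Define the columns Tableau will see.
table_def = tde.TableDefinition()
table_def.addColumn("RecordID", tde.Type.CHAR_STRING)
table_def.addColumn("Score", tde.Type.DOUBLE)

# A .tde holds a single table, which must be named "Extract".
table = extract.addTable("Extract", table_def)

# Insert one row per scored record (values here are made up).
for record_id, score in [("A-0001", 0.91), ("A-0002", 0.34)]:
    row = tde.Row(table_def)
    row.setCharString(0, record_id)
    row.setDouble(1, score)
    table.insert(row)

extract.close()  # the .tde file can now be opened directly in Tableau
```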

How can visualization enhance predictive analytics work?

Visualization provides the end user a better way to see the results of a predictive model applied in the context of a business problem.

For example, a heat map overlaid onto a geographic region showing the customers who are most likely to renew, purchase, or enroll tells a much more powerful story than summary statistics or a table containing the same information. Visualizations are great for storytelling and for actually seeing things in your data that you may have missed in a more black-and-white analysis. By adding a visual component to their predictive analytics work, users can make data-driven decisions faster.
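
As a simple sketch of that idea – assuming a scored customer file with latitude, longitude, and a modeled renewal probability, all hypothetical names – the plot could be built in Python like this:

```python
# Hypothetical sketch of a geographic "heat map" of model scores: customers
# plotted by location and colored by predicted probability of renewal.
import pandas as pd
import matplotlib.pyplot as plt

customers = pd.read_csv("scored_customers.csv")  # lat, lon, renew_prob

plt.scatter(customers["lon"], customers["lat"],
            c=customers["renew_prob"], cmap="YlOrRd", s=12)
plt.colorbar(label="Predicted probability of renewal")
plt.xlabel("Longitude")
plt.ylabel("Latitude")
plt.title("Customers most likely to renew")
plt.show()
```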

What value does predictive analytics add to your visualizations?

It’s one thing to know where your current customers are, but it’s a different thing to know where your future customers will be coming from. If your visualizations are based on traditional data analytics, you can think of them as looking at what’s in your rearview mirror. Useful, but driving your car while looking in your rearview may not get you where you want to go. Using your data to look at what’s coming down the road will help you set a clear path based on data-driven decisions. Once you’ve predicted the probability of future outcomes, you can focus your resources accordingly.

What can attendees expect to see from Rapid Insight at TCC13?

We’ll be doing some short sessions with attendees so that they can see the entire process – all the way from data extraction and federation, through data cleanup and modeling, and ending with how to bring the data into Tableau for visualization. We’ll show the contrast between predictive and non-predictive visual outputs to demonstrate the power of predictive analytics. We’ll have a few examples and datasets to play with.

We’ll be sending part of our executive team – Mike, Sheryl, and myself – and our booth will be fully stocked with chocolate, both great reasons to stop by.

Mike Laracy is co-presenting with Yale at TCC13. What do you expect from their presentation?

Yale has lots and lots of data and needed to find the most efficient way to analyze it to predict donor behavior. Their presentation, "Fusing Predictive Analytics and Data Visualization", will be on Tuesday, September 10th at 3pm in Annapolis 3-4. They’ll be presenting a case study on their successful use of predictive modeling and discussing how they share and communicate the results through visualization. It will be another expanded example of the full process, with the added viewpoint of the end customer: their experience getting started, the iterations they’ve gone through, and how they’ve reached success.

How can I learn about the partnership if I’m not attending the conference?

After the conference, we have a follow-up webinar on September 17th for everyone who can’t attend. It’s co-hosted by Rapid Insight and Tableau, and our data analysts will show some new examples of the process. It will be a good way to understand how you can put predictive analytics to use to gain a competitive advantage for your business. [For more information, or to register, click here.]

*
Ric Pratte, President and COO of Rapid Insight, is a longtime entrepreneur with a history of building innovative software companies. He was previously the CEO/Co-founder of JitterJam, a pioneer of Social CRM that was acquired by the Meltwater Group in 2011. He is a father of two, an avid skier and backpacker, and devotes time and energy to numerous non-profit organizations including Girls, Inc. and the Boy Scouts. You can follow him on Twitter at @ricpratte.