AASP: First steps in predictive modeling
Mark Mathyer, Tre Geoghegan and Pamela Jenkins are presenting on an analytics project that the Museum of Science and Industry undertook. The Museum had a big gap between $250 and $1,000 -- not many donors were giving in that range. ($250 was the highest membership level, and $1K was the lowest donor recognition society level.) Surprise: people weren't giving at that level because they weren't being asked at that level.
The Museum also realized that there was a distinction between members (buying access) and donors (supporting specific projects). And -- that the opportunity was to upgrade members, not to downgrade donors! When the Museum created new higher-level memberships, they were successful in upgrading members to new levels of giving.
Prior to this point, the Museum was relying on vendors for predictive analytics. They decided to create an in-house analytics program, starting with predicting the likelihood of members giving to the annual fund.
What does it take to do your own predictive modeling? You do not necessarily need to be an expert, but you do need to be data-savvy. Ideally, build a team with both prospect research experience and statistics experience.
Project steps: Define the project. Collect the data. Analyze the data (cleanse, transform, model). Use statistics to validate and test your hypothesis. Deploy the model (make business decisions).
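For my own reference, here's a rough sketch of what those steps could look like in code. It's purely an illustration in Python with made-up file and column names -- MSI used Rapid Insight and SPSS, not anything like this -- but it shows the collect/analyze/validate/deploy flow in miniature:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Collect: load the data you already have (hypothetical extract).
members = pd.read_csv("members.csv")

# Analyze: cleanse/transform a few illustrative predictors.
features = ["years_as_member", "event_attendance", "avg_home_value"]
X = members[features].fillna(0)
y = members["gave_to_annual_fund"]  # 1 = gave, 0 = did not

# Validate: hold out data to test the model rather than trusting the fit.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Deploy: score everyone so the annual giving team can segment the mailing.
members["likelihood_score"] = model.predict_proba(X)[:, 1]
```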
When defining your project, determine what question you are trying to answer. For example, are you looking at prospects likely to make a major gift? Prospects likely to respond to email? Constituents likely to attend a fundraising event?
In data collection, figure out what data you already have. Are the records accurate? Do you need to do a data append first (e.g. buy addresses or age data)? Look at the consistency of the data -- has data entry changed over time? Is there useful data living outside your constituent database, i.e. "shadow systems"?
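This is the step I always underestimate, so here's a quick sketch of the kind of audit I'd run first (hypothetical field names again): how complete is each field, and has coding drifted over the years?

```python
import pandas as pd

constituents = pd.read_csv("constituents.csv")  # hypothetical extract from the database of record

# Completeness: which fields are sparse enough to justify a purchased append?
print(constituents[["birth_date", "home_address", "email"]].isna().mean())

# Consistency: has data entry for a field changed over time?
by_year = constituents["record_created"].str[:4]   # assumes a YYYY-MM-DD string
print(constituents.groupby(by_year)["membership_level"].value_counts().head(20))
```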
What team members should you partner with? Consider involving your data team, your annual giving team, your prospect research team.
What are the tools you need? Analytics software, unrestricted access to data, and plenty of time.
Advantages of in-house models: you can answer a variety of questions with the same data, and your model can be dynamic (that is, you can update scoring as needed, rather than relying on static results returned by a vendor).
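The "dynamic" part is what appeals to me most. As a sketch (hypothetical file and column names), keeping a fitted model on hand means you can re-score every time the data is refreshed instead of waiting on a static score file from a vendor:

```python
import joblib
import pandas as pd

model = joblib.load("annual_fund_model.joblib")    # model fitted and saved earlier
fresh = pd.read_csv("members_refreshed.csv")       # this month's data extract

features = ["years_as_member", "event_attendance", "avg_home_value"]
fresh["likelihood_score"] = model.predict_proba(fresh[features].fillna(0))[:, 1]
fresh.to_csv("scored_members.csv", index=False)    # hand the refreshed scores back to annual giving
```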
The tools used by the Museum: Rapid Insight -- helpful providers of hands-on assistance. Now the Museum is transitioning to SPSS. There are a ton of tools out there, and these both have a great reputation. Tre recommends that, when you are starting out with modeling, you consider how much support the vendor can provide to walk you through your first projects.
So, what were the results? They solicited as they normally would (all constituents), and their Tier 1 constituents (those identified as most likely to give) performed much better than the norm. Twenty-eight percent of Tier 1 gave, while only 1% to 4% of the other tiers gave.
If the Museum had decided not to mail to those who were rated least likely by the model, they could have actually saved a lot of money on direct mail. If they had sent to only the top 5 tiers, they would have captured 91% of donors. This is a great demonstration of the power of predictive modeling!
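For anyone who wants to reproduce this kind of tier analysis on their own scored file, here is roughly how I would compute it (hypothetical columns: a likelihood score and a 0/1 "gave" flag; the 28% and 91% figures above are MSI's results, not the output of this sketch):

```python
import pandas as pd

scored = pd.read_csv("scored_members.csv")

# Cut the scores into 10 tiers, with Tier 1 = most likely to give.
scored["tier"] = pd.qcut(scored["likelihood_score"], 10,
                         labels=range(10, 0, -1)).astype(int)

by_tier = scored.groupby("tier")["gave"].agg(["mean", "sum"]).sort_index()
by_tier["cum_pct_of_donors"] = by_tier["sum"].cumsum() / by_tier["sum"].sum()
print(by_tier)  # response rate per tier, and donors captured by mailing tiers 1..n
```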
Tre recommends budgeting for data appends if you are going to do analytics. She also discussed the importance of partnering with other staff to really understand the data -- if you don't know how the data relates to the business processes, then it's impossible to draw meaningful conclusions.
Pamela is sharing some of the factors that were predictive of giving. It's interesting to me that the external data they appended (average home value, purchasing power, age, ethnicity) was very helpful. The only models I've done thus far have relied on in-house data only, so I hope I can find a guinea pig organization that will be willing to buy external data before I model!
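In case it's useful, the mechanics of folding an append into the modeling file are simple -- it's budgeting for and buying the data that's the hard part. A sketch with hypothetical files and fields:

```python
import pandas as pd

inhouse = pd.read_csv("constituents.csv")     # in-house giving history and membership data
append = pd.read_csv("vendor_append.csv")     # purchased demographics, keyed by constituent ID

modeling_file = inhouse.merge(
    append[["constituent_id", "avg_home_value", "purchasing_power", "age"]],
    on="constituent_id",
    how="left",                               # keep everyone, even constituents the vendor couldn't match
)
```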
Be sure to allot sufficient time for modeling -- not only your time, but that of the partners within your organization! And generally, when you start an analytics program, the work will come on top of what is already being done, so plan on some long hours while you are proving the concept.
Getting the green light from management to build an analytics program can be a challenge -- sometimes you have to start modeling and show your results before you can make the case for more investment.
Very interesting: MSI is looking at building a model for corporate giving. Fascinating! I hope to see this team present on that project at next year's Summit.