How a Single Source of Data Truth Can Improve Business Decisions
One of the great things about writing MarketingSherpa case studies is the opportunity to interview marketing peers who are doing, well, just cool stuff. Being able to highlight challenges that can help readers improve their marketing efforts is a big perk as well.
A frustrating part of the process is that during our interviews, we get a lot of incredible insights that end up on the cutting room floor as we craft our case studies. Luckily for us, some days we can share those insights that didn’t survive the case study edit right here in the MarketingSherpa Blog.
Today is one of those times.
Setting the stage
A recent MarketingSherpa Email Marketing Newsletter article — Marketing Analytics: How a drip email campaign transformed National Instruments’ data management — detailed a marketing analytics challenge at National Instruments, a global B2B company with a customer base of 30,000 companies in 91 countries.
The data challenge grew out of a drip email campaign centered on National Instruments’ signature product, after the measured conversion rate dropped at each stage: from the beta test, to the global rollout, to the results recalculated by a new analyst.
The drip email campaign tested several of National Instruments’ key markets, and after the beta test was completed, the program was rolled out globally.
The data issues that came up when the team looked into the conversion metrics were:
- The beta test converted at 8%
- The global rollout was at 5%
- The new analyst determined the conversion rate to be 2%, after parsing the data set without any documentation of how the 5% figure had been calculated
Read the entire case study to find out how the team reacted to that marketing challenge to improve its entire data management process.
For the case study, I interviewed Ellen Watkins, Manager, Global Database Marketing Programs; Stephanie Logerot, Database Marketing Specialist; and the new analyst who calculated the 2% conversion rate, Jordan Hefton, Global Database Marketing Analyst; all of National Instruments at the time.
Watkins’ current title is Global Marketing Ops Training Manager, and both Logerot and Hefton have moved on to other opportunities.
In this blog post, we are going to provide some insights into how the team turned this challenge into a process for establishing a “single source of truth” for its data, and how the reporting built on that process informed business decisions.
Use reporting data to influence business decisions
When the reported conversion rate for the global rollout dropped from 5% to 2%, the underlying problem became apparent: data analysts were working in silos when parsing the numbers, with no documentation and no reporting on how those numbers were calculated.
When Hefton came on board as a new analyst, she didn’t have access to the process on how that 5% conversion was achieved. Without any documentation, she conducted her own analysis on the same data set and determined conversion was actually 2%.
“When I first saw the number change, I was a bit freaked out,” Logerot said, recalling her reaction to the inconsistent analyses.
After digging into the process, the team understood that Hefton’s analysis more accurately described the ROI of the campaign through what Watkins described as a “tighter” view of the numbers.
After clearly communicating the new analytics process to the database marketing team, and then across the enterprise, the team established a single source of data truth for National Instruments’ marketing campaigns, shared through company-wide reports.
“I think one of the most useful things about the report is that we used the data to make decisions on things like which email touches are doing well and the ones that are not, and what do we need to change to [improve results],” Watkins said.
Having a single source of data truth is key
Watkins said having the single version of truth within the data allows the team to look back across multiple quarters to see what works and what doesn’t.
This is one of the main ways numbers mislead companies. When you see only the 8%, 5% and 2% conversion rates that National Instruments faced, it’s human nature to compare them. After all, they are all single-digit percentages; they seem to belong together.
Data does not equal metrics
Just as lumping all fruit together overlooks the differences between apples and oranges, treating all numbers as interchangeable overlooks the calculations behind them and the danger of comparing two entirely different metrics, even when both are called conversion rates.
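To see how this plays out in practice, here is a purely hypothetical sketch: the case study does not describe which formulas National Instruments actually used, and all figures below are invented. The point is only that the same raw campaign data can yield very different “conversion rates” depending on which denominator an analyst chooses.

```python
# Hypothetical illustration: one campaign, one data set, three different
# "conversion rates" depending on the denominator. All numbers are made up.

emails_sent = 10000       # every send the campaign attempted
emails_delivered = 9000   # sent minus bounces
unique_recipients = 4000  # deduplicated people (many received several touches)
conversions = 200         # tracked conversion events

rate_vs_sent = conversions / emails_sent
rate_vs_delivered = conversions / emails_delivered
rate_vs_unique = conversions / unique_recipients

print(f"vs. emails sent:      {rate_vs_sent:.1%}")       # 2.0%
print(f"vs. emails delivered: {rate_vs_delivered:.1%}")  # 2.2%
print(f"vs. unique people:    {rate_vs_unique:.1%}")     # 5.0%
```

Without documentation of which denominator was used, a 5% figure and a 2% figure can both be “correct” for the same campaign, which is exactly why an undocumented calculation invites confusion.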
Previously, the team had multiple ways to pull the numbers on key performance indicators, which made it difficult to correlate changes in campaigns with changes in the outcomes reflected in the raw data.
Watkins explained, “I have two quarters of data now, so I can say, ‘OK, this is turning down. Let’s go make some changes to the CTAs,’ or whatever. And then the next quarter I can go look and see, and I can probably pretty definitely say whether or not those content changes are the reason the metrics went up or down.”
This consistency gives the team confidence in the campaign content and the marketing decisions. Watkins added it also gives the stakeholders, at all levels at National Instruments, confidence as well.
You might also like
Now, Go and Do: Create standardized, well-understood metrics for your organization [More from the blogs]
Analytics: How metrics can help your inner marketing detective [More from the blogs]
Marketing Research Chart: Top data analysis challenges for landing page optimization [MarketingSherpa Chart of the Week]
Marketing Analytics: 4 tips for productive conversations with your data analyst [More from the blogs]
Marketing Analytics: 20% of marketers lack data [More from the blogs]