Apptimize
May 20, 2014
Twenty years ago, marketing meant expensive television commercials or PR pieces in trade magazines. Measuring return on investment (ROI) was practically impossible. But online businesses today can quantify the number of impressions, clicks, purchases, and almost anything else surrounding their marketing efforts.
Measuring ROI seems like it should be simple given the explosion of data available to us. But this is far from true. Too many businesses are using models for ROI that are misleading and make incorrect assumptions. Here’s a scary thought: you might be running marketing campaigns where every dollar you spend ends up earning only 50 cents, but you don’t even realize it because of flaws in your attribution model or how you’re measuring customer lifetime value (LTV).
In this article, we’ll look at three common issues with measuring ROI, and what you can do to address them.
ROI often comes down to two simple numbers: customer lifetime value (LTV) and customer acquisition cost (CAC). On average, if you earn more from a customer than it cost to acquire that customer, you’re making money.
But it’s not always that simple. You may have a good understanding of your business and have a solid number for overall customer LTV. That doesn’t necessarily mean that a given marketing campaign will deliver customers who match your existing customer base. For example, it’s possible your AdWords campaign targets keywords that only small business owners search for, instead of the midsize and large businesses that make up most of your revenue.
Takeaway: When using LTV to determine ROI of a campaign, make sure that you’re using an LTV number that closely resembles the target customer segment, not just your overall business. This isn’t always easy to predict, and you may need to iterate on your model quickly as you start seeing the results of a campaign.
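To make the takeaway concrete, here’s a minimal sketch with hypothetical numbers: the same campaign can look profitable against a blended, business-wide LTV and unprofitable against the LTV of the segment it actually reaches.

```python
# Hypothetical numbers for illustration: compare a campaign's ROI computed
# with an overall-business LTV versus the LTV of the segment it targets.

def campaign_roi(ltv, cac):
    """ROI as profit per customer divided by acquisition cost per customer."""
    return (ltv - cac) / cac

overall_ltv = 400   # blended LTV across all customers
smb_ltv = 150       # LTV of the small-business customers this campaign reaches
cac = 200           # cost to acquire one customer through the campaign

print(campaign_roi(overall_ltv, cac))  # 1.0   -> looks like you double your money
print(campaign_roi(smb_ltv, cac))      # -0.25 -> actually losing 25 cents per dollar
```

The numbers themselves are made up; the point is that the ROI calculation only changes in which LTV you feed it, and that one input flips the campaign from profitable to unprofitable.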
One of the major problems in measuring the success of a marketing channel is attribution: the problem of figuring out which marketing channel actually resulted in the conversion.
Here’s an example to illustrate this problem. Let’s say you have a content marketing site, and you see that a certain blog post results in a certain number of new customers. Some of those customers are long-time readers of your blog who only just now converted. Others are people who had clicked through your search ads before but didn’t convert. Still others are people who’d heard about you through word of mouth but only just now visited your website.
How do we account for all these new customers? Do we attribute those conversions to the blog post, since it was the last thing they saw before converting (last touch attribution)? Or do we attribute those conversions to wherever the customer first heard about us (first touch attribution)? Or should we develop a more complicated model that takes into account multiple touches?
There’s a lot of research and debate on the best ways to measure attribution. How we choose to measure attribution has a large impact on the ROI we assign to a given marketing campaign or distribution channel.
Takeaway: Attribution is difficult to measure, so pick an attribution model and apply it consistently. It won’t be perfect, but it will avoid problems like attributing a conversion to multiple sources (and artificially inflating your ROI). You can try to increase the sophistication of your attribution model down the road, but make sure you have solid reasons for doing so. Also, use qualitative data to better understand attribution: talk to your customers about what got them interested in your product.
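The first-touch and last-touch models described above can be sketched in a few lines, assuming each conversion carries an ordered list of the marketing touches that preceded it (the channel names here are hypothetical):

```python
# A minimal sketch of first-touch vs. last-touch attribution.
from collections import Counter

def attribute(conversions, model="last"):
    """Credit each conversion to one channel under a single, consistent model."""
    credit = Counter()
    for touches in conversions:
        if not touches:
            continue  # conversion with no recorded touches gets no credit
        channel = touches[0] if model == "first" else touches[-1]
        credit[channel] += 1
    return credit

conversions = [
    ["search_ad", "blog_post"],       # clicked an ad earlier, converted via the blog
    ["blog_post"],                    # long-time blog reader
    ["word_of_mouth", "blog_post"],   # heard about you, then read the post
]

print(attribute(conversions, "last"))   # the blog post gets all 3 conversions
print(attribute(conversions, "first"))  # credit splits across all three channels
```

Note how the same three conversions produce very different channel totals under the two models, which is exactly why switching models mid-stream makes ROI numbers incomparable.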
Let’s say you’ve done a good job measuring ROI so far. You haven’t fallen victim to any of the mistakes outlined above, and you have an ROI model that you trust. But despite the soundness of your model, it might not be a good predictor of future results.
Here’s a simple example that illustrates why. If you put a billboard up on the US 101 highway, it will generate a certain number of leads and revenue. But putting up five billboards on 101 won’t necessarily yield five times the return on investment. A lot of the impressions on those additional billboards are going to be from people who would’ve seen the first billboard anyway.
The same logic applies to web-based marketing channels, even though it may not seem like it. If you blindly double your AdWords budget, for example, it doesn’t necessarily mean you’ll get twice as many leads of the same quality. It could instead mean that your [Ad Rank](https://support.google.com/adwords/answer/1752122?hl=en) goes up, or that you bid enough to rank for keywords where you have a lower [quality score](https://support.google.com/adwords/answer/2454010?hl=en), depending on how you manage that increased budget.
Takeaway: Don’t assume past campaigns will scale up (or down) in a predictable fashion. It’s reasonable to start off with that assumption, but as data comes in, be prepared to update your model aggressively.
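One simple way to encode the assumption that spend doesn’t scale linearly is a saturation curve: leads grow with budget but flatten out as you exhaust the audience. This is just an illustrative model with made-up parameters, not a claim about any real channel.

```python
# A sketch of diminishing returns: model leads as a saturating function of
# budget, so doubling spend yields less than double the leads.

def expected_leads(budget, max_leads=1000, half_saturation=5000):
    """Leads approach max_leads as budget grows; half_saturation is the
    budget (in dollars) at which you reach half of max_leads."""
    return max_leads * budget / (budget + half_saturation)

base = expected_leads(5000)      # 500.0 leads at $5,000
doubled = expected_leads(10000)  # ~666.7 leads at $10,000, not 1000

print(base, doubled, doubled / base)
```

As data comes in from scaled-up campaigns, you would refit the curve’s parameters (or replace the curve entirely) rather than assume the linear extrapolation holds.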
ROI is difficult to measure and inexact at best. Start with simple assumptions and a consistent model, but iterate as you get more data and better learn how customers are responding to your marketing efforts.
These are just three of the common problems with measuring ROI, but it’s by no means a complete list. If you have any comments, let us know on Twitter. Also, if you want to see more data-driven business insights, check out the Heap data blog.
———-
Guest Author Bio: Ravi Parikh is a founder of Heap, a web and iOS analytics product that tracks every user interaction automatically.