Conversion Attribution When Cross-Device Behavior Is Rampant

The toughest question for marketers to answer is how many incremental sales their ad campaigns generate, and the biggest frustration is cross-device behavior. Nothing breaks attribution models faster.

This post is not about solving for cross-device behavior. No matter how many steps you take to tie users across devices - and despite ad platforms’ best efforts - you cannot eliminate this problem entirely. Incognito and private browsers, frequent device-switching, and cookie-clearing (among other things) will stymie your attempts. Not to mention, privacy concerns threaten to make cross-device tracking even more challenging.

Your best option is to recognize when your data is distorted by cross-device behavior and use that knowledge to optimize the performance of your ad campaigns (both online and offline).

A Telltale Sign Of Cross-Device Behavior

Imagine you run marketing for a bank and you launch a six-figure ad campaign to increase online applications for home equity loans. Based on experience, you anticipate that most applications will come from users on desktop and tablets, but you spend the majority of your budget on mobile devices, where your target audience spends a lot of time.

To measure the campaign performance, you create a goal in Google Analytics that is triggered when users fill out a loan application. A few days into your campaign, you see that the majority of conversions are coming via desktop and tablet from “new” users who visit your site directly.

In other words, these are people who, on their very first visit, opened a desktop computer or tablet, typed in your bank's URL, and converted. Seems bogus, right? That's like driving to a store for the first time without looking up the address and then making a big-ticket, impulse purchase. For high-consideration products with a long average time to purchase, this scenario is even more mystifying.

The most probable explanations for this scenario include:

  1. Google Analytics is unable to recognize the users as previous site visitors.

  2. The users have visited your site before on mobile, but not on desktop or tablet.

  3. There are people viewing ads and converting, but they never clicked an ad. Google Analytics does not track view-through conversions.

  4. Offline marketing has exposed these new users to your brand, leading them to directly type in your site URL on their first visit.

Whatever the explanation, the impact of your ad spend is concealed by the disconnect between where your audience engages with your ads and where users take action. Depending on how often this behavior occurs, you could be flying blind, without a strong handle on the levers driving ad performance.

In Google Analytics, go to Audience => Mobile => Overview and add “User Type” as a secondary dimension. Then isolate the Direct Traffic segment as the traffic source. Finally, select the appropriate goal to see how Direct Traffic conversions break down by device and by new vs. returning visitor.

Connecting The Dots On Cross-Device Behavior

Media mix modeling can paint a clearer picture of how ad spend in one channel influences sales through another channel. That said, few companies have the resources to run these types of advanced studies, which are time-consuming and often become obsolete when variables like ad budget fluctuate. Media mix models are also typically built once or twice per year, not on a per-campaign basis.

A quicker method for determining the magnitude of cross-device conversions involves finding correlations between mobile ad spend metrics and direct-new-desktop/tablet (DNDT) conversions.

In Facebook Ads Manager, you can filter performance by placement and device. Go to Breakdown => Delivery and then select “Placement & Device.”

Quick acknowledgement: Do we realize that correlation does not equal causation? Yes. Is this imperfect? Yes. Will it help? Not every time. Should you let perfect be the enemy of better? No. 

OK, getting back to the bank example, gather the campaign data and segment the performance by device. Some of the mobile metrics to pay attention to are: impressions, reach, engagement (video ThruPlays, likes, comments, shares, etc.), and clicks. If you want to get more granular, segment the mobile metrics by ad channel as well.

If you know that the average consideration lag is 7-10 days for loan applications*, chart DNDT conversions against some of the aforementioned mobile metrics to see if spikes in DNDT conversions are preceded by increases in any of those metrics a week earlier. For instance, you may notice that increases in mobile ad engagement in one week have a high correlation with DNDT conversions one week later.

The goal is to identify leading indicators of DNDT conversions so that you can better understand which levers to pull to improve performance. Something is causing users to visit your site directly and convert on their first visit. You suspect it’s mobile users converting via desktop or tablet. You just need to identify the most likely signals of this behavior and measure their estimated impact.

*You can figure this out in Google Analytics and by surveying loan applicants.

Presenting Your Findings

This method of analysis requires a large number of conversions in order to have a high confidence level in the results. We suggest that you compare conversions from the DNDT segment against total conversions during the campaign period to see what percentage comes from those users. If it’s a small portion of conversions (e.g., less than 10%), then it will be harder to draw strong conclusions.
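The share check itself is simple arithmetic. A quick sketch, with hypothetical counts:

```python
# Hypothetical campaign-period totals (assumed numbers, for illustration only)
dndt_conversions = 38
total_conversions = 210

share = dndt_conversions / total_conversions
print(f"DNDT share of all conversions: {share:.0%}")

if share < 0.10:
    print("DNDT segment is small; treat any correlations as weak evidence.")
```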

A lot of data can be explained by randomness, and even if you have a correlation coefficient above 0.8, external factors could be driving DNDT conversions. For instance, if you’re running out-of-home (OOH) ad campaigns during the same period, they may influence the results. Your best bet is to acknowledge these factors in your analysis.

What you’re doing is sacrificing precision for a quicker feedback loop. You want to be directionally correct and make decisions informed by data and supported by logic and common sense.

One final suggestion: Let’s assume that mobile ad engagement on Spotify (or another channel) is highly correlated with DNDT conversions. Armed with this knowledge, do not automatically assume that spiking mobile ad spend on Spotify will cause hockey-stick growth in DNDT loan applications. Run some sensitivity analysis with your budget to find out when performance flattens. Each ad channel has a point of diminishing returns, so be methodical in your approach.

Need help with this type of analysis? Reach out to us at hello@3rdandlamar.com. Also, sign up for emails below to get notified about future blog posts.