What We've Learned About Customer Acquisition on TV

Our beta participant had already run TV advertising before joining the D2Cx.com beta, working with a vendor that operates as a traditional, full-service agency and specializes in direct response (DR). They had seen enough success in that first campaign to believe that TV could work.

That's when they decided to give D2Cx.com a try.

Click here to download PDF of the full Beta Participant case study.


D2Cx.com by Simulmedia launched with a simple premise: with the right recipe of software and data science, growth marketers could turn TV advertising into a customer acquisition channel. Designed so that its tools and user experience would resemble search and social, D2Cx.com launched as a beta in October 2018.

But is it really possible to optimize linear TV advertising toward a customer acquisition cost (CAC) metric?

We now have enough information to answer this question. It’s safe to assume we wouldn’t be writing this if we didn’t have something good to say. But, we also learned a lot in the process. We intend, therefore, to document the entire journey that one of D2Cx.com’s early participants has experienced.

The client experience we’ll showcase here is indicative of other clients in the D2Cx.com beta. Here, we’ll take a deep dive into campaign goals, strategy, targeting, measurement and attribution. In this way, we hope to help advertisers accelerate their learning in the channel that remains a more powerful driver of sales growth than any other. We also want to acknowledge what growth marketers have learned all too well: rarely is the path to efficient growth at scale in any channel a straight, up-and-to-the-right line.


Key Takeaways

  • Define the conversion event to optimize, e.g., website visits, app installs, app usage, or purchases.
  • Pixel accordingly. Some first-time TV advertisers don’t place a pixel in all the places a customer may go after seeing their ads, resulting in an incomplete view of performance.
  • Develop a perspective, backed by observable data, on both the short and long term effects attributable to TV.
  • Target an audience specifically enough to be able to value some programs over others, but not so narrowly that the CPMs get too high.
  • Get access to the best TV audience forecasting data possible. TV inventory always sells out days, if not weeks or even months, before shows and ads air. In order to maximize exposure with any given audience, advertisers need to be able to reliably forecast what their audiences will watch in the future.
  • Develop the right buying strategy: is it a dispersed, pattern buy, or a more concentrated burst?
  • Get access to as many networks as possible. This ensures that advertisers can activate with few if any restrictions based on the programs they forecast their audience will be watching.
  • Cheaper spots don’t necessarily result in better CPIs. Sometimes, context, including time of day and the nature of the show itself, may affect results.
  • Measure both those who are unexposed to any other marketing, to isolate TV’s unique impact, and those who are exposed to multiple channels, to get a read on the effects TV and other channels have on one another. In this way, marketers can determine their optimal media mix.
  • Monitor and adjust the spend in other channels during a TV test, especially branded search. That data can help explain the results of the TV test and help avoid spending more than is needed.
  • Run a mix of :30 and :15 spots. :30s may result in a superior conversion rate (CR), but :15s are so much cheaper - as much as half the cost - that the savings can more than compensate for declines in CR.

Why Are Growth Marketers Drawn to TV Advertising in the First Place?

The penetration of TV is nearly 120 million households, which is over 95% of U.S. households.1 By comparison, according to Pew Research, 76% of suburbanites, 73% of urban-dwellers, and just 63% of those in rural areas are home broadband users. Similarly, when it comes to smartphone usage, the numbers are only marginally better: 79% of suburbanites, 77% of urban-dwellers, and just 67% of those in rural areas own one and presumably can receive data on their devices.2 This means a significant portion of the country is shut out of digital advertising.

TV is 100% premium and fraud-free:
Unlike digital, TV inventory is finite and therefore supply-constrained. It’s also immune to the fraud that affects digital advertising. These are just some of the reasons why so many advertisers consider TV to be the most premium of all media channels.3

People watch a lot of TV:
Think about how much time Adults 18+ spend on their phones, tablets, and PCs every day. Now add all those times together. People still watch more TV than that. Even millennials (Adults 18-34) watch about 3 hours of linear TV a day.4 That’s more time than they spend eating, shopping, and using social media—combined.

Resilient viewing behavior:
In spite of the headlines about cord cutting, over 80% of all American households subscribe to cable, which is about as many as own washers and dryers.5

What Is D2Cx.com?

D2Cx.com is the world’s only biddable marketplace that connects TV advertisers and sellers of inventory. Designed to enable growth marketers to go “hands on keys” with software that resembles what they use for search and social, it lets brands take a ‘test, learn, and scale’ approach to using TV advertising to acquire customers at or below their CAC tolerances. To make launching the first campaign easier, D2Cx.com has no minimum investment requirement.

Meet the Featured Beta Participant

Like many D2C brands, this company considers its approach to customer acquisition to be proprietary and an asset worth protecting. For that reason, we’re not going to share their identity here. Instead, we’ll call them BC, short for Beta Client. The brand offers a mobile app that enables users to order products that can be used to beautify any room or office space. For this reason, BC first must encourage consumers to download its app. Then, BC's marketing must encourage users to purchase items through that app.

The Problems BC is Trying to Solve

BC's primary marketing objective is customer acquisition. As is often the case with other mobile-first companies, to date BC has spent the majority of its marketing budget on app install ads via paid social. The company wants to minimize its reliance on any one channel, especially one that, like social, is subject to significant outside forces and changes in the marketplace.


Not BC's First Rodeo When It Comes to TV Advertising

BC had already run TV advertising before joining the D2Cx.com beta, working with a vendor that operates as a traditional, full-service agency and specializes in direct response (DR). By the time that first campaign had ended, BC could see that some subset of spots, networks, days and dayparts were working, meaning they had satisfied the company’s objectives for cost-per-install (CPI) and CPA. On the other hand, some parts weren’t working. In fact, the campaign overall fell short on these two metrics.

Complicating matters, the vendor considered the campaign’s overall CPI to be successful because it fell within a range of CPIs the vendor had achieved for other clients. This approach to the analysis disregarded BC's own cost tolerances. In addition, the vendor would not give BC access to the raw campaign data. As a result, BC could not verify the vendor’s claims or offer any input on how to optimize the campaign.

BC had seen enough success in that first campaign to believe that TV could work. Plus, they could see other mobile app and e-commerce companies scaling on TV. With a vendor that held such a different view of campaign performance, though, BC decided to explore other alternatives to TV.

They had two conditions for their next attempt: they needed access to the raw campaign data, and they had to have a way to optimize campaign performance based on that data. That’s when they decided to give D2Cx.com a try.

Why BC Selected D2Cx.com:

  • Simulmedia’s previous work with broadcast networks and their ‘tune-in’ campaigns. These campaigns feature closed-loop measurement, which appealed to BC.
  • In addition, D2Cx.com provides BC access to campaign data.
  • D2Cx.com offers IP address matching, which would improve campaign measurement precision.

The First Campaign: Methodology

1 - Establishing Pre-Campaign Baseline Performance

Simulmedia’s Data Science team analyzed the performance of BC's previous TV campaign and affirmed the results that the company’s former vendor had reported. Next, the team established a baseline that showed what a typical day of app downloads looked like, down to the minute of each day.

Next, Simulmedia wanted to see how many existing BC app downloaders matched households in Simulmedia’s panel, which is comprised of more than 5 million households (HHs) and is balanced to the U.S. Census. Using IP addresses, Simulmedia discovered that approximately 6,000 app downloaders were also in its panel.

This match population was large enough to construct a model that would measure TV’s short term effects - defined here to be actions that took place within five minutes of the ad airing and that exceeded the expected baseline for the advertiser at a given day and time. It also was large enough to build a long term effects model, which we define here to mean an improvement of KPIs that can take as long as several weeks to materialize. Since TV is known for both short and long term impact, both models are needed in order to establish what is and isn’t working. We’ll touch more on measurement later.

Building these models is hard for two reasons. First, the matched population must be large enough to detect effects above noise. Second, there must be sufficient omnichannel campaign data (which may or may not include TV) with which to build a baseline model. This requires delivering enough impressions in that channel to see results above the norms (e.g. site traffic, app downloads, app usage) expected on any given day.


2 - Targeting and Planning

Based on BC’s existing customer base, they targeted broadly, selecting an audience comprised of Women 25-54. We prescribe that as advertisers get started on TV, they set wider targets so they can detect signal. Then, they can start refining their campaigns by day, daypart, creative and target audience.

In addition, D2Cx.com accounted for the volatility of each inventory unit. Here, volatility represents a measure of how predictably this audience watches during a particular ad break. D2Cx.com also captures the index for the target audience.6 As a result, D2Cx.com removed as possible inventory choices all spots that were predicted to deliver fewer than 75,000 impressions for P2+ because they were not likely to reach enough of BC's target audience to produce measurable spikes in downloads of BC's app.

Finally, we referenced BC's previous TV campaign. Using the platform’s short term attribution model, D2Cx.com estimated which programs worked best in terms of app installs. This informed how BC bought in the marketplace.

3 - The First Media Buy

BC spent $50,000 on a campaign that included 112 spots, spread over a one week period. D2Cx.com provides access to 130 national networks; the first campaign ran on 12 of them. All of the budget went to media, the result of a promotion open to D2Cx.com beta participants to receive fee-free media. BC used just one :30-second creative. The result was a campaign that delivered 16.5 million impressions and reached 10.6 million people in the target audience, for an average frequency of 1.55.

4 - Measurement and Attribution

D2Cx.com’s short-term and long-term effects models both rely on IP addresses. It maintains an IP address book that tracks how the IP address of each TV device changes over time. Outliers, such as hotels, that have hundreds of devices or whose devices change IP addresses every few seconds are discarded.

Next, we looked at app install data and anonymously matched BC’s customer IDs with a TV device ID using the IP address and a time stamp, all in a completely privacy-safe and secure way. This ensured the protection of all personally identifiable information. (The work here resembles how digital ads can use pixel tracking to anonymously match app users to people who viewed a campaign.) Once we knew which customer owned which TV, it was fairly easy to determine if they had been exposed to a TV campaign.

We looked at people who were exposed to the TV campaign exactly once and were not exposed to paid social. Next, we counted installs per minute after ad exposure within 24 hours. From this result, we created a long term effects factor: the number by which you would multiply the installs attributed in the short-term, five minute window to get the total attributed installs in the 24 hour window. We’ll share these actual results shortly.
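The window-counting logic described above can be sketched in a few lines. This is an illustrative sketch, not Simulmedia's actual pipeline: the function name and sample data are invented, and we assume each install has already been matched to its single TV ad exposure.

```python
from datetime import datetime, timedelta

def long_term_effects_factor(exposure_times, install_times,
                             short_window_min=5, full_window_hr=24):
    """Count installs attributed in the short-term window vs. the full
    24-hour window, and return the multiplier between the two counts."""
    short = full = 0
    for exp, inst in zip(exposure_times, install_times):
        delta = inst - exp
        if timedelta(0) <= delta <= timedelta(hours=full_window_hr):
            full += 1
            if delta <= timedelta(minutes=short_window_min):
                short += 1
    return full / short if short else float("nan")

# Illustrative data: 2 installs inside 5 minutes, 3 more within 24 hours
t0 = datetime(2019, 3, 1, 20, 0)
exposures = [t0] * 5
installs = [t0 + timedelta(minutes=m) for m in (2, 4, 30, 120, 600)]
print(long_term_effects_factor(exposures, installs))  # → 2.5
```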

What About Deterministic Matching?

The absence of sell-side market concentration in TV makes deterministic matching harder because it’s more difficult to get a significant volume of matches that link viewing to purchase. Sophisticated TV advertisers take a blended approach, calculating attribution deterministically for a portion of all viewers, typically via a panel or viewing graph, and then scaling those results to the entire TV viewing universe in a probabilistic way.
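A minimal sketch of the blended approach described above, assuming a balanced panel whose response rate mirrors the full exposed universe. The function name and all numbers are illustrative, not actual campaign figures.

```python
def project_panel_to_universe(panel_attributed, panel_exposed_hh,
                              universe_exposed_hh):
    """Scale installs attributed deterministically within a matched panel
    up to the full exposed TV universe, assuming the panel is balanced
    (i.e., response rates in the panel mirror the universe)."""
    response_rate = panel_attributed / panel_exposed_hh
    return response_rate * universe_exposed_hh

# Illustrative: 90 attributed installs among 6,000 matched panel
# households, projected to 10.6 million exposed households
print(round(project_panel_to_universe(90, 6_000, 10_600_000)))  # → 159000
```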

What We Thought Would Happen

Our extensive experience with TV had us feeling confident that we’d be able to measure the long term effects of the campaign on app downloads, installs and in-app purchases. In addition, we anticipated we’d be able to capture the short term effects of TV.

Results of the First Campaign

We saw clear and significant short-term results from BC’s first TV campaign that was targeted, planned and executed via D2Cx.com. Beyond the five-minute window, BC saw some long-term effects, too, though those effects weren’t strong enough to attribute at the spot level. With just $50K in campaign spend, BC was able to get a good read on which days, dayparts and networks were the most effective. It’s worth noting that the more an advertiser wants to parse out which other aspects of their campaign, e.g. creative and audience segments, drive performance, the more media spend it will require.

Impact on App Installs

DMA Test: BC ran an experiment outside of D2Cx.com by only buying TV ads in two DMAs over one week. This allowed us to control for other types of spending and isolate TV’s medium and long term effects. Combining both short and long term modeling, we found that there were 2.5 times as many total installs attributable to the combination of short, medium and long term effects of TV, compared to installs that happened just within the short term, five minute window after an ad aired. To illustrate this multiplier, take an example of 1000 installs attributed to TV via the short term, five minute spike analysis. The total number of installs attributable to the campaign overall would be 2500, distributed as follows:

  • 1000 that occurred in the short term, five minute window
  • 1000 from the campaign’s lift outside the five minute window (medium term lift)
  • 500 from the residual lift that occurs from the time the campaign ends until 52 days after the campaign (long term lift).

IP Matching Test: For the IP addresses we matched, we saw that 37% of installs happened within the first five minutes of seeing the ad. The remaining 63% happened between five minutes and 24 hours of seeing the ad. So 1.7x as many downloads happened outside the short term effects window as within it. The long term effects factor, therefore, was 2.7, similar to the result obtained from the DMA test.
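The arithmetic behind these two figures is easy to verify: if 37% of matched installs land inside the five-minute window, the ratio of late to early installs is 0.63/0.37 and the 24-hour total is 1/0.37 times the short-term count.

```python
short_share = 0.37             # installs within 5 minutes of exposure
late_share = 1 - short_share   # installs between 5 minutes and 24 hours

ratio_outside_to_inside = late_share / short_share   # installs outside vs. inside the window
long_term_factor = 1 / short_share                   # multiplier from short-term to 24-hour installs

print(round(ratio_outside_to_inside, 1))  # → 1.7
print(round(long_term_factor, 1))         # → 2.7
```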

Impact on App Opens: We were able to show that the overall campaign drove an average of 2.9 app opens per day after install over a 55 day post-exposure duration. Further testing is needed to determine whether this is a constant that can be applied to all optimized TV campaigns or spots. In addition, we found that TV drove purchases in the range of $25K (conservatively) to $85K (aggressively).

What We Learned

In certain ways, our hypotheses were both right and wrong. For example, the short term effect model detected some spikes in app downloads, but for many of the spots the effect was below the threshold of detectability.

In addition, we learned that the original baseline performance model in D2Cx.com needed to be more flexible. That’s because in the midst of the first campaign, BC significantly increased its Facebook advertising. As a result, the original model gave too much credit to TV. We therefore rebuilt our baseline performance model using local regression (lowess). The new model is more conservative in predicting TV’s impact and is able to flex in the event the marketer suddenly changes spending in other channels or doesn’t follow proper experimental design.
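For illustration only, here is a minimal local-regression baseline in the spirit of lowess. It is not D2Cx.com's actual model; a production system would use an established implementation (such as the lowess in statsmodels), and the data below is synthetic, simulating a sudden level shift like BC's mid-campaign Facebook increase.

```python
import numpy as np

def lowess_baseline(x, y, frac=0.5):
    """Minimal local (lowess-style) regression: fit a weighted linear
    model over each point's nearest neighbors using tricube weights."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    k = max(2, int(frac * n))            # neighborhood size
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]          # k nearest neighbors
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3  # tricube weights
        sw = np.sqrt(w)
        # weighted least squares fit of y ~ a + b*x on the neighborhood
        A = np.column_stack([np.ones(k), x[idx]])
        beta = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)[0]
        fitted[i] = beta[0] + beta[1] * x[i]
    return fitted

# Synthetic installs-per-day series with a sudden level shift (e.g. a
# mid-campaign jump in Facebook spend); the local fit tracks the shift
# instead of over-crediting TV for it.
days = np.arange(20)
installs = np.concatenate([np.full(10, 5.0), np.full(10, 12.0)])
baseline = lowess_baseline(days, installs, frac=0.3)
print(baseline.round(1))
```

Because the fit is local, the baseline follows the new level within a few days of the shift, which is what makes the model conservative about crediting TV for step changes driven by other channels.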

Here’s what worked better than we expected:

  • We were able to see significant spikes in many of the spots BC bought through D2Cx.com.
  • When TV was live, BC's Facebook advertising performance improved. Results showed that BC's Facebook CPI decreased by 10% while TV ran. Though we have more analysis to do, we preliminarily conclude that TV plus Facebook is better and more cost efficient than Facebook alone. Essentially, a $50K TV campaign plus Facebook is just as effective as spending $55K more on Facebook alone. Thus, it’s better to diversify your channel spend.

Similarly, here’s what fell short of our expectations:

  • While a significant portion of the $50K performed, not all of it did.
  • Certain networks we thought would perform well actually failed to show detectable spikes. We could simplify this to read, “not all impressions are created equal.”

Here’s a summary of the key learnings:

  • Spots with fewer than 75,000 impressions delivered lift that was harder to detect.
  • Overnight and other spots that D2Cx.com selected in part because they had a lower CPM had a lower conversion rate, or CR, defined here as attributed installs divided by total impressions. We think this is because there is less engagement and lower overall viewership overnight.
  • CRs varied substantially by networks and dayparts.
Exhibit A: An example of a spot that worked - detectable spike in red.
Exhibit B: An example of a spot that did not work. Response during spots in red; note absence of a spike.7

The Second Campaign

In preparing for Campaign 2, we built a new buying strategy into D2Cx.com, one that starts new-to-TV brands with spots that provide a large amount of the target audience on average but are still cost-efficient.8 D2Cx.com also set out to deliver more impressions per day to simulate what a larger budget would give, as opposed to spreading the buy and impressions over more days.

For its second campaign, BC placed bids on inventory that resulted in a plan that cost $80,000. As with its first campaign, 100% of the spend went to media (no fees). It included 102 spots, 18 networks, and one creative (15 seconds in length). The time frame was one full weekend. The result was a campaign that delivered 47.9 million impressions, reached 23.1 million people in the target audience, and achieved an average frequency of 2.07.

Because some spots in the first campaign had underperformed and failed to produce detectable spikes in the app install time series, BC changed its approach for the second campaign as follows:

  • Increase the percentage of the campaign’s budget that went toward spots that showed detectable app install spikes.
  • Choose spots that had low CPI based on the observed performance by network and daypart from the previous campaign.

As the table below shows, BC achieved both these goals.

Campaign Spending On “Significant” Spots and CPIs

             Percentage           Percentage of Budget Spent
             Significant Spots    on Significant Spots          CPI
Campaign 1   21%                  61%                           $8.78
Campaign 2   44%                  72%                           $5.17

The biggest improvement was in the CPI. We think this happened for two reasons:

  1. An improved inventory selection process, driven by D2Cx.com’s performance solve: With more data, D2Cx.com’s modeling improved, enabling it to more accurately predict which programs, dayparts and networks would deliver the best performance.
  2. Switch to 100% :15 second spots: Based on the CR, the :15 second creative was almost as effective as the :30 second spot used in the first campaign at driving installs for BC. The CR declined by 12% when going from :30 to :15 second spots, but this was more than overcome by the fact that :15 second spots are typically half the cost of :30 second spots.
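The trade-off above can be checked with simple arithmetic. The 12% CR decline and the half-price assumption come from the campaigns described here; the absolute cost-per-impression and CR values below are purely illustrative.

```python
def cpi(cost_per_impression, conversion_rate):
    """Cost per install = spend per impression / installs per impression."""
    return cost_per_impression / conversion_rate

cr_30 = 0.0010               # illustrative CR for a :30 spot
cr_15 = cr_30 * (1 - 0.12)   # CR declines 12% going from :30 to :15
cost_30 = 0.010              # illustrative cost per impression for a :30
cost_15 = cost_30 / 2        # :15s are typically half the cost of :30s

print(round(cpi(cost_30, cr_30), 2))  # → 10.0
print(round(cpi(cost_15, cr_15), 2))  # → 5.68
```

Even with the lower CR, the :15 spot's CPI comes out well below the :30's, which matches the direction of the observed improvement.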

In terms of purchases, we found that TV drove purchases in the range of $70K (conservatively) to $114K (aggressively). This is significantly higher than the first campaign.

Here’s what worked better than we expected:

  • The percentage of spots with detectable spikes more than doubled. Simulmedia is working on the models that underpin D2Cx.com to increase this percentage even more.
  • While most of the spots BC bought in its second campaign also appeared in the plan for its first campaign, they also tried new networks, some of which performed very well. For example, ION was all-new in the second campaign, and it accounted for about 40% of all visits with only 27% of the total budget spend.

What We Learned Following the Second Campaign

  • Within a testing plan, advertisers should set aside 15-30% of their campaign budgets to try new networks, programs and dayparts.
  • While TV is running, advertisers should turn off or turn down branded search and focus more on SEO. During the second campaign, BC kept its branded search steady. After viewers saw the ad, many searched for BC's products, and search engines offered up branded sponsored results, which users clicked or tapped. Each one of those clicks needlessly increased BC's spending on SEM, when it’s likely that their products and company name would have come up first in search results even without spending on branded search.


As this report demonstrates, TV can be transformed into a growth channel, with CPAs that portend profitable customer acquisition. Achieving this result likely requires an optimal TV media plan that’s unique to each advertiser.

The approach to developing this optimal plan, however, is something every advertiser can repeat. In closing, here’s a summary of BC’s first and second campaigns:

Summary of Campaigns Inputs and Outputs

                            Campaign 1           Campaign 2
Budget                      $50,000              $80,000
Duration                    7 days               3 days
Target audience             Women 25-54          Women 25-54
Spot length                 30 seconds           15 seconds
Number of spots             112                  102
Number of networks          12                   18
Reach                       10.6 million         23.1 million
Frequency                   1.55                 2.07
Impressions                 16.5 million         47.9 million
Cost per install            $8.78                $5.17
Purchases driven by TV
(conservative and
aggressive estimates)       $25,000 - $85,000    $70,000 - $114,000



  • 1 Nielsen.

  • 2 From the Pew Research Center.

  • 3 Supply-constrained inventory also changes the way advertisers have to think about planning and buying. Simply put, TV inventory always sells out days, weeks and sometimes even months in advance. Advertisers need to place their bets much, much earlier than when they execute real-time bidding nanoseconds before a digital ad appears in front of a target customer.

  • 4 Nielsen Total Audience Report for Q2 2018

  • 5 Money magazine.

  • 6 An index is a measure of a show’s audience composition, relative to the average on TV for that audience.

  • 7 These exhibits should not be interpreted as a generalization about the performance of any given network or daypart. Many factors affect campaign performance, including creative and target audience, that can make one brand’s strategy inappropriate for another brand.

  • 8 This requires superior forecasting. D2Cx.com shares a common software core with Simulmedia’s Transparent TV product, which is used by marketers who work at television broadcasters to drive ratings of their programs. This shared core includes dynamic forecasting technology, which enables D2Cx.com to more accurately predict which programs will most efficiently deliver the most impressions within any given target audience.

Join the World's Fastest Growing Brands

The Beta is currently closed to new participants but we'd still love to show you around. Complete the form below to have one of our experts walk you through the D2Cx.com application.