In the World of Programmatic, Data is Continually Evolving. Are We Evolving Fast Enough?

We’ve come a long way in the programmatic space.

Brands are running far more successful and efficient campaigns today than ever before, with private marketplaces and hyper-targeted audiences playing critical roles in the collective growth of programmatic channels. Today, we can recognize and target not just audiences but also highly curated environments, and we can approach planning and evaluation through a significantly more strategic lens.

But the programmatic space is in constant evolution, thanks to the increasingly sophisticated data we now find at the center of it all. Once-common practices are changing as data becomes more abundant and the balance of supply and demand shifts. Measurement, layering, value and pricing all take on new meaning in this more complex ecosystem.

It makes for a challenging time for advertisers, yet simultaneously the most important time to start pushing the boundaries of what’s possible. Even with GDPR still on the minds of many, data availability and usage show no signs of receding.

But pushing those very boundaries of what’s possible requires a new kind of talent, mindset and process when it comes to seeing the complete picture of data in programmatic. The faster our opportunities evolve, the faster we must evolve along with them.

The Changing Data Environment & the Limitations of Third-Party Data

The ‘mainstream’ process for leveraging data in programmatic buys typically takes one of two forms: using pre-arranged data segments within a DSP interface, or working with a data provider directly to access an assembled data segment for use in whichever programmatic platform you prefer. The latter frequently performs better because the data is more closely matched to the desired target, but data bought this way is also more expensive (roughly three times the cost on average).
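
To make that trade-off concrete, here’s a minimal sketch with purely hypothetical numbers (the CPMs and conversion rates below are assumptions for illustration, not benchmarks): the directly bought segment carries roughly three times the data cost, yet can still win on cost per conversion if the match quality is meaningfully better.

```python
# Hypothetical comparison of the two buying methods. Every figure here is an
# assumption chosen to illustrate the trade-off, not a real market rate.

def effective_cpm(media_cpm: float, data_cpm: float) -> float:
    """Total cost per thousand impressions once the data fee is layered on."""
    return media_cpm + data_cpm

def cost_per_conversion(ecpm: float, conversions_per_thousand: float) -> float:
    """What each conversion actually costs at a given conversion rate."""
    return ecpm / conversions_per_thousand

# Method 1: pre-arranged segment inside the DSP (cheaper data, looser match).
dsp_ecpm = effective_cpm(media_cpm=4.00, data_cpm=1.00)
dsp_cpa = cost_per_conversion(dsp_ecpm, conversions_per_thousand=2.0)

# Method 2: segment bought directly from the provider (~3x the data cost,
# but more closely matched, so we assume a higher conversion rate).
direct_ecpm = effective_cpm(media_cpm=4.00, data_cpm=3.00)
direct_cpa = cost_per_conversion(direct_ecpm, conversions_per_thousand=3.5)

print(f"DSP segment:    eCPM ${dsp_ecpm:.2f}, cost per conversion ${dsp_cpa:.2f}")
print(f"Direct segment: eCPM ${direct_ecpm:.2f}, cost per conversion ${direct_cpa:.2f}")
```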

Both methods get advertisers involved in the world of third-party data, but as many of us know, that also comes with limitations. Most notably, when we’re dealing in third-party data sets, we may not always be getting exactly what we think we are, and accuracy has proven to be a problem more times than we’d like to admit.

Our internal testing has shown that most third-party data providers’ accuracy is low, sometimes as low as 40%. That’s not unusual to see, and it’s also not optimal for modeling. In one personal experience, when I looked at my own profile within one of the larger data providers, I was classified as Single, Married, Divorced, Separated and Widowed. I’m divorced. Technically, some of those classifications are correct, but I certainly wouldn’t call that accuracy.
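
As a rough illustration of how that kind of accuracy figure can be spot-checked, here’s a minimal sketch that compares a provider’s segment labels against a first-party truth set (say, declared CRM attributes). The users, labels and resulting number are hypothetical, not our actual test data.

```python
# Hypothetical spot-check of a third-party provider's labels against a
# first-party truth set. Names and values are illustrative only.

first_party_truth = {
    "user_1": "Divorced",
    "user_2": "Married",
    "user_3": "Single",
}

provider_labels = {
    "user_1": {"Single", "Married", "Divorced", "Separated", "Widowed"},  # everything at once
    "user_2": {"Married"},
    "user_3": {"Married", "Single"},
}

def label_precision(truth: dict, labels: dict) -> float:
    """Share of all labels the provider assigned that match the known attribute."""
    total_labels = sum(len(tags) for tags in labels.values())
    correct_labels = sum(1 for user, tags in labels.items() if truth.get(user) in tags)
    return correct_labels / total_labels if total_labels else 0.0

print(f"Provider label precision: {label_precision(first_party_truth, provider_labels):.0%}")
# -> roughly 38% in this made-up example: lots of labels, few of them accurate.
```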

What’s going on here? Every day, more data comes into the marketplace as data exhaust: the data footprint left behind by our online activity and proximity-tracked movements is growing at a faster pace than the ad space itself. What it ultimately means is that advertisers need to be even savvier about negotiations, and more discriminating about quality.

When it comes to third-party data, layering multiple data sources should be standard practice rather than relying on a single source for targeting. Adding data into Facebook, for example, can create a highly qualified audience via Custom Audiences (note that this data isn’t exportable, so insights are somewhat limited; it’s a walled garden), and transactional data can likewise be combined with behavioral and content segments. The same can be done on the programmatic side, and I’ll get into this more later.
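
To sketch what layering can look like in practice, here’s a minimal example that treats each data source as a set of user or device IDs and builds the target audience from their overlap rather than trusting any single source. The segment names and IDs are hypothetical.

```python
from collections import Counter

# Hypothetical segments from three different sources; IDs are illustrative.
transactional = {"id_001", "id_002", "id_003", "id_004"}  # e.g., past purchasers
behavioral = {"id_002", "id_003", "id_005"}               # e.g., in-market browsing
content = {"id_002", "id_003", "id_004", "id_006"}        # e.g., contextual affinity

# Strict layering: only IDs confirmed by every source.
core_audience = transactional & behavioral & content

# Looser layering: IDs confirmed by at least two of the three sources.
counts = Counter(uid for segment in (transactional, behavioral, content) for uid in segment)
expanded_audience = {uid for uid, n in counts.items() if n >= 2}

print("Core audience:", sorted(core_audience))          # ['id_002', 'id_003']
print("Expanded audience:", sorted(expanded_audience))  # ['id_002', 'id_003', 'id_004']
```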

Data Engineering in Programmatic – A New Discipline on the Rise

Right now, perhaps more than any other component, talent has become an essential part of the equation for succeeding with data in programmatic.

Media planning was, and continues to be, one of the most highly sought-after skill sets, as the industry’s need for people who simultaneously understand ad channels, audiences, messaging and MarTech/AdTech only keeps growing. For a long time, programmatic was a convenient way to buy display quickly without much media planning: just set a pixel on your site, and the algorithm would find the likeliest audiences regardless of where they were. But I think most of us have learned the hard way that chasing the cheapest impression is not brand-safe, let alone on-brand. Today, there is a return to direct buys or highly selective inventory within curated publisher lists. First-price auctions and header bidding are adding media planning elements back into digital marketing. But data is where planning will truly be needed.

Where does the data come from, how fresh is it, who certifies it; in short, “what am I actually buying?” These are critical questions marketers need to be asking. Who else has this kind of data, what should I be paying for it, and what should I expect in return for adding it? These questions need to be planned for and answered.

An Analogy to Consider…

Think of programmatic display like the TV show Chopped. The data planners we need are the cooks in the kitchen, tasked with assembling a great food experience in a set time, using only the select ingredients available to them, and knowing the result will be judged critically. But unlike in the Chopped kitchen, every new episode in our world brings new quantities, varieties and qualities of ingredients: more targetable audience reach, a broader range of good and bad quality across many different price points, and more places the ingredients are sourced from.

Facebook, Google and Amazon are the kings of data and programmatic media, each providing a walled-garden experience that protects the value of their programmatic ad buying but also demonstrates the value of a highly accurate data append within a programmatic environment. To succeed outside these walled gardens, and to “win the game,” we need more innovative chefs who can think their way through the changing complexity and control the presentation to ultimately serve up the best experience.

What is the right talent doing to make it happen?

Identifying the data sources and the companies that hold this data is one step to making this happen. Negotiating price is next, and it’s something advertisers aren’t doing enough of. A formal RFP structure works very well for understanding what the real marketplace for data is. Even after a test has concluded, tracking the data cost as its own ratio against the incremental success of a program is key. Pricing models are also on the table, as data companies are pressured to stay competitive: percent-of-media and even performance-based (CPA) pricing is now available through companies like GroundTruth or The Trade Desk’s data alliance.
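
To show what that kind of tracking might look like, here’s a minimal sketch that expresses data cost as its own ratio against a program’s incremental results and compares the three pricing models side by side. Every figure is an assumption for illustration and doesn’t reflect actual rates from GroundTruth, The Trade Desk or anyone else.

```python
# Hypothetical comparison of data pricing models against incremental results.
# All rates and results below are illustrative assumptions.

def data_cost_ratio(data_spend: float, incremental_revenue: float) -> float:
    """Data spend as a share of the incremental revenue it helped drive."""
    return data_spend / incremental_revenue if incremental_revenue else float("inf")

impressions = 2_000_000
media_spend = 10_000.00
conversions = 400
incremental_revenue = 30_000.00

flat_cpm_fee = (impressions / 1_000) * 1.50  # assumed $1.50 data CPM
pct_of_media_fee = media_spend * 0.15        # assumed 15% of media
cpa_fee = conversions * 5.00                 # assumed $5 per conversion

for name, fee in [("Flat CPM", flat_cpm_fee),
                  ("Percent of media", pct_of_media_fee),
                  ("Performance (CPA)", cpa_fee)]:
    ratio = data_cost_ratio(fee, incremental_revenue)
    print(f"{name:18s} data cost ${fee:,.2f} -> {ratio:.1%} of incremental revenue")
```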

As header bidding and first-price auctions allow for more premium placements within premium sites, getting the data and audience-building piece of programmatic display right will be how smart marketers stay competitive and add a comparable channel to a media mix that includes the giants of Google, Facebook and Amazon.
