GUEST COLUMNIST JANE CLARKE
This article first appeared in AdExchanger:
Certainly, transparency and data accuracy were already dominant
topics in 2017, but as the industry moves toward ever more granular
forms of targeting and measurement, while combining differing data sets,
the need for assurance that the information represents what it purports
to represent will become even more critical as the year progresses.
2017 saw solid progress toward addressing this.
CIMM and ARF proposed a data labeling initiative to create a
“nutritional label” for data, enabling users of third-party data to
understand its source and composition. A similar effort is being
undertaken by the IAB.
The initiatives provide a critical first step in ensuring confidence
and trust in the data that’s used in increasingly granular and effective
targeted advertising. They, in effect, help marketers better
understand the impact of their advertising efforts and the role that the
data played in those outcomes. Innovation in targeting would stall if
uncertainty about data – the “raw ingredient” used – persisted.
While the approaches being undertaken by these associations differ,
they do complement each other, and each has merits and challenges. The
CIMM-ARF initiative is broader in scope than the IAB’s initiative, which
is limited to digital media and doesn’t include a TV focus. CIMM and
ARF want to make the labeling appropriate to audience-based buying in
TV, and also bring more buyers into the process.
But as is sometimes the case with our industry, methodology and
technology are not the main impediments to addressing challenges –
consensus and unanimity are.
At this stage, what is needed is not the perfect solution but a
first step taken together. The best technology or methodology alone
will not achieve the objective; all facets of the industry must also
agree that the end goal is important enough that we move forward
together. Working together, the best approach for ensuring transparency
on data quality will emerge.
Certainly, an agreement on a universal data identification protocol
benefits everyone. The stakeholders for such a protocol are primarily
the data owners, but the benefits are widespread. And while the
initiative is focused on disclosure, there is also a validation aspect
to the project.
It promises to successfully link together all current and future big
data sets used to accurately measure the consumer journey across
platforms. Data labeling, which would be an actual listing of the
components, sources and characteristics that were compiled to create
targets, would raise overall data quality because of its focus on
transparency. But it also requires a uniform, agreed-upon structure so
that all interested parties can participate, and an implementation
process that fits into both standard and proprietary systems.
There are inherent challenges in gaining consensus on technically
complicated problems and navigating the natural forces of
competitiveness that define our industry and make it great. For example,
as the information for the labels would first be self-reported, vendors
could, of course, provide misleading or incorrect information. But that
misinformation would ultimately be revealed through third-party validation.
At the moment, we don’t even know how many lists and providers would
participate. But through this process, undoubtedly some of the
poorer-quality lists and providers would be found out as third-party
validation and the experience of matching reality against reported information take hold.
Never has the need for consensus and placing cooperation over
competition been greater than in finding systems to advance data
quality. All that we want to accomplish as an industry, in terms of
better targeting and accurate measurement, depends on confidence in the
data on which we base insights and decisions.
I am interested in people’s ideas, approaches and their “aha” moments
for how to best ensure data quality. As 2018 gets underway, let’s take
that first step to this goal together as a statement on how even when
dealing with the most challenging of issues, consensus can emerge as the
greatest trend in the year ahead.
Follow CIMM (@CIMM_NEWS) and AdExchanger (@adexchanger) on Twitter.
At this time of great disruption in the media industry, I find it interesting to look back and realize that disruption in media was always a constant. The systems we used to measure content continually changed, improved and even disrupted our ways of doing business. I recall that, when I was an intern at NBC years ago, I was impressed that my computer did not require punch cards. Am I dating myself? Probably.
As we embark on 2018, I asked others in the industry to answer the question: “When you first started in the industry, what was the most amazing device/application/program/aspect/item at the time?” One person noted that in the 1990s when she was at Discovery “it was PCs and the internet. That technology changed everything.” For others, it was a range of other advancements:
Arlene Manos, President Emeritus, AMC Networks: When I started at A&E, we did a lot by spreadsheet. Someone I hired as an intern recently mentioned in an article that he shared a computer with me, since computers were scarce. The first system we were on was Columbine, followed by a Nesbitt system for planning and posting. I don’t remember any more than that.
Mitch Oscar, Advanced TV Strategist, USIM: In 1999 it was the introduction of TiVo. The inventor came to my office to talk to me about advertising and TiVo’s functionality. At about the same time, the head of IPG called me and said, “So advertising is dead?” TiVo was momentous because everyone was worried about the impact of two functionalities – the recording of programming and the ability to fast-forward through commercials. We wondered whether the fast-forward speed would be slow enough to still see the brand messaging.
Kathy Newberger, Advanced Advertising Consultant: I was working in local ad sales at the time and we said it was going to be digital ad insertion. We were going from six networks that were inserted using tape decks to sixteen networks using digital equipment. We thought that was going to be amazing … and it was. Now it’s amplified by 500 times more – every network is insert-able. And on top of that is OTT.
Brad Adgate, Independent Media Researcher: I think the most important introduction early in my career was spreadsheets. Long-gone products like Lotus 1-2-3 and, later, Quattro Pro were being used. Before that, workers used those large green accounting pads and calculators to fill in the data, which took a lot longer and was more error-prone.
Dave Morgan, CEO and Founder, Simulmedia: In early 1993, I was working in "new media," helping newspaper companies develop ad and content strategies for early online services and partnerships with telcos and cable companies, and had a chance to play with the Mosaic browser. It was pretty clear, even then, that a user-managed rendering engine like the browser would change the media industry, particularly for print companies with text and still photos, which rendered well even without high-speed internet. It certainly did.
Jane Clarke, CEO, Managing Director, CIMM: Back in 1982, we were analyzing clickstream data from set top boxes in a Pilot Test for Time Teletext, which was a text and graphic service similar to the early AOL, but delivered via the Vertical Blanking Interval (VBI) of a channel on Time Warner’s cable system! I never thought it would take this long to get to nationally representative samples of Return Path Data!
Sheryl Feldinger, Media Consultant: I often comment to my 16-year-old that the biggest difference between growing up today versus the 1970s is the pace of life. Everything happens so much faster today. The pace of communication, especially, flies at warp speed. Confession: early in my career, fax machines were a game changer. They revolutionized the workplace. No longer could you tell the client, "We will messenger it to your office first thing tomorrow." The new retort was, "Why wait? You can fax it tonight!" It didn't matter that the edges of the thermal paper curled. All of a sudden, deadlines got pushed up and we all had to work faster.
Next article – Looking Ahead to 2018.
This article first appeared in www.Mediapost.com
One of the more challenging aspects of advertising sales is calculating ROI. This is made even more complex by the proliferation of content platforms and consumer devices. What really contributed to sales uplift? ABC, in addressing this issue, just released Phase 2 of an attribution analysis conducted by Accenture Strategy. Phase 2 is the follow-up to a custom study completed in 2016 and used four big data sources to prove the role and value of content in context in driving sales ROI.
Cindy Davis, Executive Vice President Consumer Experience, Disney ABC Television Group, took me through the study and how it contributes to their overall research strategy. “Our objective in this study is to measure what matters and there was industry pressure to measure ROI,” she stated. To that end, “we found a very interesting connection between engaged audiences and their content and the sales and ROI we can drive to clients who participate.”
To achieve that goal, Disney|ABC commissioned Accenture Strategy, which, according to Davis, “had a robust database of marketing spend. This year in our Phase 2 of the study, we examined 26 national brands over six industries and their corresponding sales data representing $25 billion in marketing spend, with $11 billion in television spending.”
Mike Chapman, Managing Director, Accenture Strategy, global lead for Media and Entertainment Strategy Practice added that his company provided three years of data, which provided a “closed loop view of advertising ROI – types of impressions delivered, how many, which channels, what prices were paid and the impacts from those impressions delivered on incremental sales week over week.”
In addition to Accenture’s marketing data, Davis asked Accenture Strategy to incorporate four other datasets in their recent study – Nielsen ratings, E-Poll, Nielsen Social and Magid’s Emotional DNA, which Davis described as, “intriguing because we are in the business of connecting with viewers emotionally and Magid’s DNA work speaks to that.”
Phase 1 Takeaways
Davis and her team, focusing on the impact of multi-platform TV (premium long-form video across screens and devices) and how advertisers can leverage that impact, discovered three major takeaways from the Accenture Strategy 2016 study:
1. There is a halo effect on sales with multi-platform television. “This doesn’t get talked about a lot,” Davis noted. “You hear about last click attribution in digital advertising. But TV goes a long way to establish and amplify the impacts of all media.”
2. Multi-platform is undervalued, under-attributed and under-represented in the industry. Eighteen percent of all of the ROI impact is traditionally attributed to digital but it should actually be attributed to multi-platform television. “Television has been traditionally undervalued and digital over-valued,” Davis concluded.
3. Multi-platform TV has a long-term amplification impact. The study compared sales lift over years and found that by year two or three, you no longer see a sales lift impact from digital. But the study proved that there is a long-term effect on sales lift with TV.
Phase 2 Takeaways - Drivers of ROI
Davis highlighted three key drivers to ROI that were identified in the Phase 2 study.
1. Audience size matters. “Higher-rated programs deliver more ROI than lower-rated programs by 2X,” she stated, “so not all programs are created equal, which makes sense.” And notably, these higher-rated programs deliver more ROI than their cost premium, indicating that higher-rated and therefore more expensive programming is worth the cost in greater sales lift ROI. This is because these programs have a greater footprint and greater social amplification, and therefore have the ability to reach people beyond a narrowly defined target audience.
2. Consumers’ commitment to the content matters. “We looked at both the expressed and the observed commitment to the content,” Davis stated, “and found that the greater the effort to watch, the greater the ROI.” And there is 2X the ROI with Magid’s Intentionality measurement.
3. Content quality matters. Davis’ group examined perceived quality, as defined by the viewer, and quantified quality indicators using Magid’s emotional dimensions. They found that the higher the perceived quality of the content, the greater the ROI. And, using Magid, the three most impactful dimensions for higher ROI were Smarts (programs that are informative, real and inspiring), Edge (unpredictable, outrageous and funny) and Relatability (original, suspenseful and intelligent). “There is a direct connection between the emotions viewers feel about a show and the benefits advertisers get in terms of greater ROI,” Davis concluded.
“We are already starting to have good conversations with clients as to what this means for them,” Davis stated. “It goes without saying that not all GRPs are created equal and now we can prove that. Yes, higher-rated shows command a premium but they deliver even greater ROI at that level.” Adding the impact of social connection and emotional dimensions to all of this insight, ABC is poised to help their clients take advantage of the best that multi-platform TV can offer.
As retail becomes increasingly digital, greater pressure is being felt on the brick and mortar side of the business. But even online retailers face challenges, according to Esteban Ribero, Senior Vice President, Planning & Insights, Performics. His company recently fielded a study using the Digital Satisfaction Index™ (DSI) to measure online consumer attitudes. “We executed a retail-specific DSI surveying 1500 respondents that compared digital satisfaction for retailers in general, as well as for specific brands,” he explained.
Retailers, whether online or in-store, need to be able to deliver the goods to consumers in terms of quality, service and value while also finding the careful balance between personalization and privacy. I sat down with Ribero and asked him the following questions:
Charlene Weisler: What do you mean by digital satisfaction? What are the most important drivers in digital satisfaction?
Esteban Ribero: There are all kinds of studies done around consumer satisfaction, but there has never been a retail-specific DSI. We wanted to see what drives customer engagement in this area. We found that there are four components of digital satisfaction:
1. How useful the experience is. Can people accomplish what they set out to do when they visit your site? How easy is that to do?
2. How secure is your online environment? There are still a lot of concerns about privacy where people have to feel comfortable about sharing their personal information online.
3. Trust, which is different from privacy. Retailers have to make sure that the information they are giving online is truthful, accurate and reliable, especially in the context of fake news. It is more important than ever now.
4. How social is the experience? How much customers can get a peek into other people’s lives to create a more engaged experience, how much they can read reviews and comment on those reviews.
Weisler: What do shoppers generally think about the user interface of retail websites and apps? Is there a constant? Do some retailers do it better and if so, what do they do to stand out?
Ribero: We were surprised to find out that consumers were very satisfied already with the utility of their retail websites and apps. We thought that perhaps some consumers would find sites clunky or not very human but the research shows that people find the experience positive. Of the three retailers in our study (Lululemon, Gap, H&M), Lululemon customers were the most satisfied with landing page and app experience, and Gap customers were the least satisfied. This could be due to Lululemon offering a more modern digitized experience.
Weisler: What is the balance between privacy and personalization? Is there a concern about the ultimate use/sale of personal data?
Ribero: This was the most interesting takeaway from the study. There is a trade-off between privacy and personalization. We go with the assumption that consumers want more personalization, and the industry strives to ascertain ahead of time what consumers may want in order to ensure greater personalization. However, as we have done that, consumers may push back because we have been tracking them using information they did not give us explicit permission to use. So they feel more concerned about privacy and all the information we gather about them. But at the same time, they say that they want more personalized experiences. The struggle is that they want more personalization without giving us any information to do that. What we need to do as an industry is be more open with the consumer as to what information we use to track them. When we don’t do this, it tends to backfire on us. The trick is to make the message feel generic while finding a way to tailor it to the consumer. Don’t put the consumer’s name on it – it feels creepy.
Weisler: What is showrooming? What kind of shoppers are most likely to showroom, and how does showrooming fit into driving digital satisfaction for retailers if at all?
Ribero: Showrooming is the ability of consumers to experience the merchandise without having to actually order online. When consumers are shopping in a brick and mortar store, they may at the same time use their cell phone’s mobile apps to browse products for that same store or competitors. That behavior is here to stay and we are seeing it more and more.
Weisler: Based on what you have seen in your research, where do you see the future of retail in the next 3-5 years?
Ribero: We always dream about that moment where, as in the movie Minority Report, Tom Cruise enters a store and they know all about him – his preferences, his past purchasing. I think we are getting to that, but in a way that consumers see as more controlled in terms of their environment and their choices. At the end of the day, that is where I see the future headed: consumers taking control of their experiences, of the events they want, of the way they want to engage with brands. And I see continued merging between the digital space and the brick and mortar space. Brands will continue to transform their stores into showrooms to create a seamless way for people to interact with the brand.
For many in marketing, deploying multi-touch attribution is more of an aspiration than a current reality, but that doesn’t mean we should defer efforts to find a complete attribution solution. There are many considerations when constructing a workable model, including variations in the consumer journey based on products and categories, the impact of unmeasurable factors like word of mouth and the ceaselessly expanding choice of datasets — some valuable and some, depending on the advertiser and category, not so much.
I had the opportunity to sit down with Matthew Krepsik, Global Head of Analytics for Nielsen, to talk more about facing the challenge of attribution.
Charlene Weisler: Nielsen has always been proud that they own their data. Can you talk about the new reality where Nielsen will need to go beyond their own data and partner with other data suppliers who, in turn, control their own datasets?
Matthew Krepsik: From our point of view, we have a lot of unique and valuable data, just like other suppliers with walled gardens have incredibly valuable data. What we see as the biggest opportunity is our ability to co-mingle those data sets. If we think about our constituents and our users, whether marketers or other business executives, what they really want is greater intelligence. We understand the complementary nature of all of these datasets and we can bring them together for specific use cases and help in enabling desired outcomes. This is where we generate the most value. As we think about the next generation of growth and innovation as a company, we think that the innovation stems from building across different partners. We have opened up access to our data with partners to make it easier to use and more permissible for marketers and brand owners across the value chain and in places where we don’t operate.
Weisler: How do you manage the de-duplication of data when you use other walled garden datasets?
Krepsik: There are a couple of dimensions of de-duplication around identity, around devices and around the cookie itself. So if you think about the ad model side, the ad tech really starts with the cookie. Your phone right now probably has more than 50 or 100 cookies on it. All of those cookies roll into a device. That device has to roll into other devices. So de-duplication is improving right now. Is it perfect? Not at all. But the first challenge is getting from cookies to devices and devices to people. I would say that we are getting more robust at the device level. The challenge is that they keep reinventing and updating the device. Every customer is getting a new phone. They are getting new laptops. They are getting new devices at home. Most households get new devices about every six months. So the challenge is that we have gone past the period of “build” and we have to constantly reinvest in the updating. The technology we have makes it easier to do this than ever before. The bigger challenge is de-duplicating your on-boarding data. That gives us a tremendous opportunity to get better at connecting to a digital consumer from an offline consumer. There is still a lot of work to do there.
Weisler: Looking at multi-touch attribution, where do you see it today, grading it from A to F?
Krepsik: I am going to answer this question along two dimensions. When I think about the overall need of a marketer or a CMO, I would give most attribution models a grade of D. If I think about the overall need for digital media planning, I think attribution models today are approaching a B+ / A- grade.
I say this because, for a digital media manager, what you really want to know is whether this creative, or this site, or this device, or this placement is working better than another one. Is this audience working better than another one? How do I make decisions and trade-offs across all of the possibilities out there? Today’s attribution models are really good at allowing digital owners to understand the cornucopia of media channels and how they can get more improvement out of them.
That being said, from an overall CMO standpoint, out of every revenue dollar that goes to the cash register, 25 cents is spent back on some form of marketing investment. Digital is only two cents of that quarter. So what they want to know is “should I be spending three cents on digital or should I be spending one cent on digital?” If you think about most attribution models today, they are not expressly measuring incrementality. They are not taking into account a lot of the “last mile” problems.
Weisler: Where do you see attribution going in the next couple of years?
Krepsik: Where I think the attribution industry has a chance to grow, and where the marketing mix industry can take a step forward, is in bringing those two pieces together. Attribution models bring speed and granularity, and the marketing mix world offers scale and coverage. Where I see the industry going in the next two years is that both of those pieces come together: leveraging the technology, real-time nature and granularity of the attribution model with the scale, coverage and sophistication of marketing mix modeling.