Ads That Don’t Overstep


The internet has dramatically expanded the modern marketer’s tool kit, in large part because of one simple but transformative development: digital data. With users regularly sharing personal data online and web cookies tracking every click, marketers have been able to gain unprecedented insight into consumers and serve up solutions tailored to their individual needs. The results have been impressive. Research has shown that digital targeting meaningfully improves the response to advertisements and that ad performance declines when marketers’ access to consumer data is reduced.

But there is also evidence that using online “surveillance” to sell products can lead to a consumer backlash. The research supporting ad personalization has tended to study consumers who were largely unaware that their data dictated which ads they saw. Today such naïveté is increasingly rare. Public outcry over company data breaches and the use of targeting to spread fake news and inflame political partisanship have, understandably, put consumers on alert. And personal experiences with highly specific ads (such as one for pet food that begins, “As a dog owner, you might like…”) or ads that follow users across websites have made it clear that marketers often know exactly who is on the receiving end of their digital messages. Now regulators in some countries are starting to mandate that firms disclose how they gather and use consumers’ personal information.

This throws a whole new dynamic into the mix: How will targeted ads fare in the face of increased consumer awareness? On one hand, awareness could increase ad performance if it makes customers feel that the products they see are personally relevant. Supporters of cookies and other surveillance tools say that more-relevant advertising leads to a more valuable, enjoyable internet experience. On the other hand, awareness could decrease ad performance if it activates concerns about privacy and provokes consumer opposition.

The latter outcome seems more likely if marketers continue with a business-as-usual approach. One study revealed that when, in 2013, the Netherlands began enforcing a law requiring websites to inform visitors of covert tracking, advertisement click-through rates dropped. Controlled experiments have found similar results.

Some firms have done better than others in anticipating how customers will react to personalization. Amazon features shopping ads throughout its site, making product recommendations based explicitly—and often conspicuously—on individual users’ search data, without seeming to draw any consumer ire whatsoever. However, in a now-infamous example, when Target followed a similar practice by creating promotions that were based on individual shoppers’ consumption data, the response was not so benign. The retailer sent coupons for maternity-related products to women it inferred were pregnant. They included a teenager whose father was incensed—and then abashed to discover that his daughter was, in fact, expecting. When the New York Times reported the incident, many consumers were outraged, and the chain had a PR problem on its hands. Similarly, Urban Outfitters walked back the gender-based personalization of its home page after customers complained. “We saw customer frustration at being targeted outweigh any benefit,” Dmitri Siegel, the marketing executive in charge of the initiative, concluded in an interview with the Times.

For the consumer who prefers relevant ads over irrelevant ones (an ad-free experience is not realistic in today’s ad-supported web landscape), it’s important that marketers get the balance right. Digital marketers need to understand when the use of consumer data to personalize ads will be met with acceptance or annoyance so that they can honor consumers’ expectations about how their information should be used. The good news is that social scientists already know a lot about what triggers privacy concerns off-line, and new research that we and others have performed demonstrates that these norms can inform marketers’ actions in the digital sphere. Through a series of experiments, we have begun to understand what causes consumers to object to targeting and how marketers can use personalization while respecting people’s privacy.

The Privacy Paradox
People don’t always behave logically when it comes to privacy. For example, we often share intimate details with total strangers while we keep secrets from loved ones. Nevertheless, social scientists have identified several factors that predict whether people will be comfortable with the use of their personal information. One of these factors is fairly straightforward—the nature of the information. Common sense holds that the more intimate it is (data on sex, health, and finances is especially sensitive), the less comfortable people are with others knowing it.

A second, more nuanced factor involves the manner in which consumers’ personal information changes hands—what social scientists call “information flows.” One such norm is, to put it colloquially, “Don’t talk about people behind their backs.” While people may be comfortable disclosing personal information directly (what scientists call “first-person sharing”), they may become uneasy when that information is passed along without their knowledge (what we term “third-party sharing”). If you learned that a friend had revealed something personal about you to another, mutual friend, you’d probably be upset—even though you might have no problem with both parties knowing the information. It can also be taboo to openly infer information about someone, even if those inferences are accurate. For example, a woman may inform a close colleague of her early-term pregnancy, but she’d likely find it unacceptable if that coworker told her he thought she was pregnant before she’d disclosed anything.

In our recent studies we learned that those norms about information also apply in the digital space. In our first study, we collected a list of common ways in which Google and Facebook use consumers’ personal data to generate ads. We then asked consumers to rate how acceptable they found each method to be, and—employing a statistical technique called factor analysis—identified clusters of practices that consumers tended to dislike, which mirrored practices that made people uncomfortable off-line:

- obtaining information outside the website on which an ad appears, which is akin to talking behind someone’s back
- deducing information about someone from analytics, which is akin to inferring information.
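
To make that analysis step concrete, here is a minimal sketch of how such a factor analysis might be run, assuming a matrix of acceptability ratings (respondents by practices). The practice labels and ratings below are simulated for illustration only; they are not the survey’s actual items or data.

```python
# A minimal, hypothetical sketch of the kind of factor analysis described above:
# respondents rate the acceptability of several data-collection practices, and
# we look for clusters of practices that load on the same underlying factor.
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Illustrative practice labels (not the actual survey items).
practices = [
    "tracked on this site",         # first-party collection
    "tracked on other sites",       # third-party collection ("behind your back")
    "information you entered",      # first-person sharing
    "traits inferred by analytics", # inference
]

rng = np.random.default_rng(0)
n_respondents = 500
# Simulated 1-7 acceptability ratings, one row per respondent.
ratings = rng.integers(1, 8, size=(n_respondents, len(practices))).astype(float)

fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(ratings)

# Practices with large loadings on the same factor form a cluster that
# respondents evaluate similarly (e.g., the disliked information flows).
for name, loadings in zip(practices, fa.components_.T):
    print(f"{name:<30} loadings: {np.round(loadings, 2)}")
```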
Next, we wanted to see what effect adherence to—or violation of—privacy norms would have on ad performance. So we divided participants in our study into three groups. In a simulation of acceptable, first-person sharing, one group first browsed a website; on that same site we later displayed an ad accompanied by the disclosure “You are seeing this ad based on the products you clicked on while browsing our website.” In a simulation of unacceptable, third-party sharing, another group browsed a website and then visited a second site, where we displayed an ad accompanied by the disclosure “You are seeing this ad based on the products you clicked on while browsing a third-party website.” The final group served as a control; like the other groups, these participants engaged in a browsing task and were then shown a targeted ad, but without a message. In all groups, we measured interest in purchasing the advertised product as well as the likelihood that participants would visit the advertiser’s website. Additionally, to understand how these three ad scenarios affected consumers’ attitudes, we asked all participants which they valued more: the personalization of ads or the privacy of their data.

If people dislike the way their information is shared, purchase interest drops.

We found that when unacceptable, third-party sharing had occurred, concerns about privacy outweighed people’s appreciation for ad personalization. Those attitudes in turn predicted interest in purchasing, which was approximately 24% lower in the group exposed to unacceptable sharing than in both the first-person sharing and the control groups—a clear indication of backlash.
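
As a rough illustration of how a comparison like that 24% figure is computed, the sketch below contrasts mean purchase-interest ratings across three hypothetical groups. The numbers are simulated and chosen only to mirror the pattern described above; they are not the study’s data.

```python
# Hypothetical group comparison: mean purchase interest in the third-party
# condition versus the first-person and control conditions, and the relative drop.
import numpy as np

rng = np.random.default_rng(1)
# Simulated purchase-interest ratings (1-7 scale) for each condition.
first_person = rng.normal(4.2, 1.0, size=200)
control      = rng.normal(4.2, 1.0, size=200)
third_party  = rng.normal(3.2, 1.0, size=200)

for name, group in [("first-person", first_person),
                    ("control", control),
                    ("third-party", third_party)]:
    print(f"{name:<12} mean purchase interest: {group.mean():.2f}")

# Relative drop in the third-party condition vs. the pooled comparison groups.
baseline = np.concatenate([first_person, control]).mean()
drop = (baseline - third_party.mean()) / baseline
print(f"relative drop: {drop:.0%}")
```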
