How CNET Learns Its Users’ Behavior to Double Banner Clicks

CNET segments and targets users based on their cognitive style

Recently, the MIT Center for Digital Business Marketing Group led a study to test the real-world effectiveness of morphing, their term for a banner ad that changes dynamically to match a user’s cognitive-style segment. The results are impressive: an order of magnitude higher than the lift seen in earlier content-matching studies.


Hauser, Urban, Liberali & Braun proposed the idea of morphing in 2009. Essentially, if a web site can know the cognitive style of a user, it can automatically adapt to present content in a congruent format. Someone who prefers analytical stimuli can be shown a chart, whereas another user with a preference for more visual, holistic appeal can be shown an image and pitched with text associated with feelings or overall impressions. But on a largely anonymous internet, knowing someone’s cognitive style ahead of time is essentially impossible.

Using the procedures developed in 2009, Urban, Liberali, MacDonald, Bordley, & Hauser (2013) tested that work-around in the real world through a field study conducted in cooperation with the CNET web site.

Identifying a user’s cognitive style is an iterative process. First, the researchers brought subjects into a behavioral lab and evaluated their cognitive styles using established psychological techniques; namely, they gave the subjects questionnaires and scored the results. Using this information, the researchers segmented the subjects into four groups based on their preferred cognitive processing style:

  • deliberative-holistic: 9% of participants
  • deliberative-analytic: 42%
  • impulsive-holistic: 23%
  • impulsive-analytic: 27%

The researchers then observed the participants as they interacted with the CNET web site and tracked their click patterns. This gave them two major pieces of information to employ: 1) the base rates to expect for each segment, and 2) the click patterns one would find more likely among members of each segment.

Using the lab data to make improvements in the field

Armed with this information, the researchers used a sophisticated algorithm that monitored each user’s clicks and, via Bayesian estimation, returned the probability that the user belonged to each segment. After only five clicks, the algorithm could segment users better than base-rate chance. As more clicks registered, it continually re-evaluated and updated its estimate of the user’s segment membership.
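To make the idea concrete, below is a minimal Python sketch of that kind of Bayesian updating. The four base rates are the lab figures listed above; the click-event names and the per-segment likelihood values are hypothetical placeholders for illustration, not the study’s actual estimates.

# Hypothetical sketch: Bayesian inference of a user's cognitive-style segment
# from observed clicks. Base rates come from the lab phase; the likelihood
# numbers below are illustrative, not the paper's values.

# Prior: base rates for each segment (from the lab study).
prior = {
    "deliberative-holistic": 0.09,
    "deliberative-analytic": 0.42,
    "impulsive-holistic":    0.23,
    "impulsive-analytic":    0.27,
}

# Likelihoods: P(observed click | segment). Illustrative placeholders.
likelihood = {
    "clicked_spec_table": {
        "deliberative-holistic": 0.10,
        "deliberative-analytic": 0.55,
        "impulsive-holistic":    0.05,
        "impulsive-analytic":    0.30,
    },
    "clicked_photo_gallery": {
        "deliberative-holistic": 0.50,
        "deliberative-analytic": 0.10,
        "impulsive-holistic":    0.55,
        "impulsive-analytic":    0.20,
    },
}

def update(posterior, observed_click):
    """One Bayesian update: weight each segment's probability by the
    likelihood of the observed click, then renormalize."""
    unnormalized = {
        seg: p * likelihood[observed_click][seg] for seg, p in posterior.items()
    }
    total = sum(unnormalized.values())
    return {seg: p / total for seg, p in unnormalized.items()}

# Start from the base rates and update after each observed click.
posterior = dict(prior)
for click in ["clicked_spec_table", "clicked_spec_table", "clicked_photo_gallery"]:
    posterior = update(posterior, click)

print(max(posterior, key=posterior.get), posterior)

Each click multiplies the current segment probabilities by how likely that click would be for each segment and then renormalizes, so the estimate sharpens as more clicks arrive.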

Once the algorithm had a reasonable estimate of a user’s cognitive-style segment, it began to serve banner ads calculated to be more effective on that particular segment. The banner-ad targeting was itself an algorithmically controlled process: the system continually evaluated which ads were most effective for each segment.
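The paper frames this as a formal explore/exploit problem; the sketch below uses simple Thompson sampling instead, so it only illustrates the general idea of continually re-evaluating each banner’s performance per segment. The banner names and helper functions are hypothetical, not the authors’ implementation.

import random

# Hypothetical sketch: per-segment banner selection via Thompson sampling.
SEGMENTS = ["deliberative-holistic", "deliberative-analytic",
            "impulsive-holistic", "impulsive-analytic"]
BANNERS = ["analytic_chart_banner", "holistic_image_banner"]  # illustrative names

# Running click statistics, kept separately for every (segment, banner) pair.
stats = {(s, b): {"clicks": 0, "shows": 0} for s in SEGMENTS for b in BANNERS}

def choose_banner(segment):
    """Sample a plausible click rate for each banner from its Beta posterior
    and serve the banner with the highest sampled rate."""
    def sample(b):
        st = stats[(segment, b)]
        return random.betavariate(st["clicks"] + 1, st["shows"] - st["clicks"] + 1)
    return max(BANNERS, key=sample)

def record_outcome(segment, banner, clicked):
    """Update the running statistics after the impression."""
    stats[(segment, banner)]["shows"] += 1
    if clicked:
        stats[(segment, banner)]["clicks"] += 1

# Example: serve a banner to a user estimated to be deliberative-analytic,
# then record whether they clicked it.
banner = choose_banner("deliberative-analytic")
record_outcome("deliberative-analytic", banner, clicked=True)

Because the statistics are updated after every impression, the selection keeps adapting as evidence about each segment’s preferences accumulates.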

Results

Past research has shown that matching the contents of an ad to the contents of the web page on which it’s displayed boosts the ad’s click-through rate (Iyer, Soberman, & Villas-Boas, 2005; Kenny & Marshall, 2000). This practice lifts banner ad performance by around 3.2–3.3%. Another common technique is behavioral targeting based on a user’s browsing history, which can yield a lift of 16–26% (Chen, Pavlov, & Canny, 2009). The banner morphing technique described here yielded a lift of 83%. The table below has the complete results.

from "Morphing Banner Advertising"  Urban, Liberali, MacDonald, Bordley, & Hauser (2013)
from “Morphing Banner Advertising” Urban, Liberali, MacDonald, Bordley, & Hauser (2013)
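For readers unfamiliar with the metric, “lift” here is the relative improvement in click-through rate (CTR) over a baseline. The CTR values below are made-up numbers chosen only to show how a figure like 83% is computed; the paper reports lift, not these raw rates.

# Illustrative lift calculation with hypothetical CTRs.
baseline_ctr = 0.0010   # hypothetical CTR for untargeted banners
morphed_ctr  = 0.00183  # hypothetical CTR for morphed banners

lift = (morphed_ctr - baseline_ctr) / baseline_ctr
print(f"lift = {lift:.0%}")  # -> lift = 83%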

What’s Great About This Study

This type of research represents a fantastic opportunity to apply consumer behavior theory. The world of marketing is becoming increasingly enamored with the possibilities of “big data.” However, like any actuarial model, machine learning comes with natural limitations. The one I’d like to address here is sometimes called the “flashlight effect”: a computer learning scenario presumes that the appropriate, relevant data are being measured, and it can’t see anything that isn’t being measured.

This study doesn’t solve that problem. However, by incorporating the lessons of consumer behavior research, the programmers of computer learning algorithms are suddenly able to shine the flashlight in directions that were previously dark. As we see here, learning from the right data can mean dramatic improvement in targeting and ad performance.

Image Source: CNET web site
