A/B Testing or the HIPPO: Your Choice

Confession: prior to diving into the world of digital marketing, and specifically digital analytics, I was terrified by the idea of kicking our class off with Google Analytics. The material was dense, the exam was looming, and understanding the concepts was fundamental. However, after taking the deep dive into the world of digital analytics, I have come to understand that it really is the backbone behind many other digital marketing concepts, and A/B testing is no exception.

What is A/B Testing?
Optimizely technically defines A/B testing as “a simple way to test changes to your page against the current design and determine which ones produce positive results. It is a method to validate that any new design or change to an element on your webpage is improving your conversion rate before you make that change to your site code.” In other words, A/B testing allows organizations to “test out” changes to digital touch points such as a website (or even an email campaign) and track whether the changes are leading to greater conversion rates or helping to attain the organization’s goals. While Optimizely provides a suitable definition of what A/B testing is, I feel it is missing one key element: the how.

  • “A/B testing is a simple way to test changes to your page against the current design and determine which ones produce positive results” via the use of analytics/experimentation tracking technology. In Optimizely’s explanation of how A/B testing works, they offered the simple example of tracking how many visitors saw the new red button they had implemented and how many clicked on it. They also tracked how many people landed on their “Thank You” confirmation page after seeing either button. In Microsoft’s article, which also explores A/B testing in depth, they explain the statistics behind proving whether one version of a site is statistically significantly different from another. While Microsoft offers a comprehensive (admittedly very confusing) explanation of the math, I think it is important to emphasize that the data for such statistical calculations would not be available without analytics/tracking technology. Via web analytics, for example, companies can track the click rates of two different buttons and how many people land on a certain page from two different versions of a site, making A/B testing and comparisons possible (see the sketch just below this list). A more complete definition of A/B testing, then, includes the use of analytics or experimentation to test and track changes on certain pages and determine whether those changes are leading to greater conversion rates. Here is a good blog that further explores the use of analytics in A/B testing and why you should be “listening” to your analytics.
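To make the math a little more concrete, here is a minimal sketch of the kind of significance check the Microsoft article describes, written in Python using only the standard library. The visitor and conversion counts below are made-up numbers for illustration; the calculation itself is a standard two-proportion z-test, not Microsoft’s exact implementation.

```python
# Two-proportion z-test: is version B's conversion rate
# significantly different from version A's?
from statistics import NormalDist

def ab_significance(conv_a, visitors_a, conv_b, visitors_b):
    """Return the z statistic and two-sided p-value for an A/B test."""
    rate_a = conv_a / visitors_a
    rate_b = conv_b / visitors_b
    # Pooled conversion rate under the "no difference" hypothesis
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    std_err = (pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (rate_b - rate_a) / std_err
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical numbers: 10,000 visitors saw each version of the button
z, p = ab_significance(conv_a=230, visitors_a=10_000,
                       conv_b=298, visitors_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 -> difference is significant
```

The point is simply that the analytics technology supplies the raw counts (visitors and conversions per version), and the statistics turn those counts into a yes-or-no answer about whether the new design actually performed better.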

Another important aspect of A/B testing that makes it increasingly valuable is that it allows “quantitative data to speak for itself” and, hopefully, limits the power of the HIPPO. For those of you who don’t know (like myself walking through the doors of digital marketing on day one), HIPPO stands for the highest-paid person’s opinion. Because A/B testing matches actual conversion data/metrics to the changes being made on a site (or other digital mediums), executives no longer have much room to say that something won’t work simply because it’s their opinion. Claude Hopkins, quoted in the Microsoft article, puts it best: “Almost any question can be answered cheaply, quickly, and finally by a test campaign. And that’s the way to answer them – not by arguments around the table.” Because A/B testing utilizes a control and a test page, questions about changes can be answered “finally” with the data to back them up. The best technique marketers can apply is to let the customers “do the talking” and show us which changes are prompting them to purchase, download, click, or whatever the goal may be.

By listening to the customers and making decisions based on data, A/B testing can improve an organization’s business in several ways, beginning with increasing revenue and ROI. For example, if an organization is going to invest money in optimizing certain areas of its website in order to increase ecommerce sales, it will want to test which changes are prompting more people to purchase. By utilizing A/B testing, and analytics, an organization can track which changes are leading to a greater increase in sales, thus increasing revenue and ROI. Along the same lines, organizations can use A/B testing to discover whether changes to a digital touch point may actually hurt them instead of optimizing their site. Microsoft raises a strong caution about implementing changes simply because managers think the changes “do not hurt.” Microsoft mentions that sometimes A/B testing experiments don’t have the “power” needed to produce actionable results, and implementing changes to the site because you think they “can’t hurt” may actually produce the opposite effect. Finally, using A/B testing and frequently making attempts to optimize a page can lead to overall better customer satisfaction, which then opens the door to leads and revenue.
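To give a rough sense of what “power” means in practice, here is a back-of-the-envelope sample-size estimate in Python. The baseline conversion rate and the lift you hope to detect are illustrative assumptions, not figures from any of the articles; the formula is a standard normal approximation for comparing two proportions.

```python
# Rough sample size per variant needed to detect a given lift in
# conversion rate at a chosen significance level (alpha) and power.
from statistics import NormalDist

def sample_size_per_variant(baseline, lift, alpha=0.05, power=0.80):
    """Approximate visitors needed in each group (two-sided test)."""
    p1, p2 = baseline, baseline + lift
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    n = ((z_alpha + z_power) ** 2 * 2 * p_bar * (1 - p_bar)) / (p2 - p1) ** 2
    return int(n) + 1

# Hypothetical: 2% baseline conversion, hoping to detect a 0.5% lift
print(sample_size_per_variant(baseline=0.02, lift=0.005))
# Detecting a small lift on a low baseline can demand over ten thousand
# visitors per variant, which is why an underpowered test may not
# produce actionable results.
```

An experiment run on far fewer visitors than this simply cannot distinguish a real improvement from random noise, which is exactly the trap Microsoft warns about when teams ship “can’t hurt” changes without the data to support them.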

Obama Campaign:
A very interesting and notable application of A/B testing came from the famous Obama campaign emails that flooded inboxes during Obama’s presidential election campaign. In fact, those of you reading may have received some of the emails yourself and thought for a brief second, “Oh cool, Obama sent me an email!” And then, especially for those of you in marketing, probably realized that someone else developed the content of that email and, unfortunately, Obama was probably not reaching out to you personally. Bummer. Still, having an email from Obama in your inbox with a subject line that read “Hey” probably caught your attention, and the attention of millions of others. What you might not realize is the endless testing, formatting, rewriting, and effort that went into the email behind that simple subject line of “Hey.”

Amelia Showalter, director of digital analytics for the Obama 2012 campaign, was one of the many masterminds involved in the A/B testing of various email campaigns. Showalter explains, “We did extensive A/B testing not just on the subject lines and the amount of money we would ask people for, but on the messages themselves and even the formatting.” From the articles, it is clear that the effort put in by Showalter and her team was extensive, but it was also very necessary. Organizations can learn a few valuable lessons about A/B testing and optimization from this campaign, beginning with the insight that this type of experimentation brings:

  • In the article published by Business Week about Showalter and the email campaigns, there was one standout lesson that all organizations should be able to take away: you never really know as much as you think you know. This certainly proved to be the case for Showalter and her team as they tested various subject lines, fonts, and formats, and soon came to realize that people were receptive to plain-Jane emails, exactly the opposite of what they expected. In the article, Showalter explained “[how] bad they were at predicting the subject lines that would be most effective” because people were interested in different things than they expected. The key takeaway here is that Showalter and her team would have had no idea that citizens wouldn’t be receptive to their content unless they had tested out various forms. This lesson overlaps with the Microsoft article in addressing the HIPPO. While there is no HIPPO per se in this situation, there is still a group of analysts who thought they knew what content/subject lines would be the most influential, and they were proven wrong. A statement in the Tech President article says it best: “the driving principle, based on interviews with several people who worked in or around the campaign, was to take an evidence based approach to person-to-person contact. The core of the campaign was not flashy or even particularly innovative except in the willingness of senior staff to listen to numbers people rather than to consultants acting on old fashioned political intuition.” So, to all organizations out there that think they know how customers will respond: test everything and let the “data do the talking.” Organizations will benefit from testing because it will ensure they are spending their money in the right places and making changes that will increase revenue and ROI.
  • While it is clear that the main takeaway from each of the articles on the Obama campaign is to test anything and everything, another valuable lesson organizations can learn from is the frequency with which the team was testing. Showalter, in Business Week and Tech President, speaks about the need for “constant” and regular testing because after a while the “novelty” of certain email campaigns would run out. In the Tech President article Showalter states, “For example, for a fundraising email, the digital department decided to try emphasizing text with highlighting that was ugly on purpose. It outperformed other emails, so the campaign kept using it; the formatting trick seemed to draw the eye. When the novelty wore off and that tactic stopped performing better than other ones, the campaign dropped it and moved on.” The important lesson organizations can take from Showalter and her team here is that after they implemented a campaign, they still had to test and track the campaign’s performance. They could not simply discover a technique that worked for them and turn a blind eye to it. Analysts had to perform “constant testing” so that they were aware when a campaign had indeed “lost its novelty” and “stopped performing better than other ones.” Conversely, A/B testing could have revealed to Showalter that a campaign was still working efficiently and there was no point in spending money to develop a replacement for something that wasn’t broken. Organizations can take this tidbit and apply it to their own business by acknowledging the importance of performing A/B testing on a regular basis. For example, if you implement a new button on your website that is meant to capture the attention of visitors, you will want to regularly test and run conversion metrics on that change to ensure that its “novelty” is still proving successful. Organizations will benefit from testing on a consistent basis because they will be aware when something is no longer working, or, on the opposite end of the spectrum, not spend money adjusting an aspect of a page that is still producing adequate conversion rates.
  • An aspect of A/B testing that organizations cannot learn from Showalter and her email campaign, but that is nonetheless important to emphasize, is that sometimes it can be better to “test one factor at a time.” Clearly the analysts who worked on the Obama campaign had many elements of the email worth testing. That being said, Microsoft cautions against testing more than one factor at a time when doing A/B testing. For some research, testing multiple factors at once is the correct technique, and for others, testing a single factor is best (Microsoft suggests the single-factor technique when one wants to “gain insights”). Organizations should analyze the goals they want to achieve with A/B testing, and then choose the proper technique (a sketch of how a simple single-factor split might work follows this list).
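As referenced in the last bullet, here is a minimal sketch of how a single-factor test might split visitors between a control and a variant. Hash-based assignment is a common industry approach rather than something from any of the articles above; its advantage is that the same visitor always lands in the same group on every visit, which keeps the comparison clean.

```python
# Deterministic 50/50 assignment for a single-factor A/B test.
# Hashing the visitor ID means the same person always sees the
# same version, visit after visit.
import hashlib

def assign_variant(visitor_id: str, experiment: str) -> str:
    """Return 'A' (control) or 'B' (variant) for this visitor."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Example: bucket a few hypothetical visitor IDs
for vid in ["user-101", "user-102", "user-103"]:
    print(vid, "->", assign_variant(vid, "red-button-test"))
```

Because only one element differs between the two groups, any statistically significant gap in conversion rate can be attributed to that single change, which is exactly the “insight” Microsoft says single-factor testing is good for.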

Similar to how Showalter used A/B testing for subject lines, several publishing companies are now using A/B testing to experiment with the “best headlines” for their articles. One publishing company called The Dodo, a client of RebelMouse, uses A/B testing to “fine tune headlines and find the ones that will resonate the most on social media.” The publishing company found that including the word “Epic” brought greater attention to an article. Click here to read more about how publishing organizations use A/B testing to optimize their business.

All quotes and statistics used in this post came from the following sources:
“What is A/B Testing?” by Optimizely
Academic journal article by Microsoft
Obama campaign article by Business Week
“Obama Campaign’s Legacy” by Tech President
Publishing companies using A/B testing

 
