
With A/B Testing Strive for Insight, Not Just Data

June 4, 2021

Business executives often use the terms data and insight interchangeably, but statisticians make a big deal of the distinction, for good reason. Data is the raw, unprocessed set of facts captured from a wide range of inputs, whereas insight is the meaningful analysis of context and situational factors that helps us make better decisions. “Data-driven marketing” may be the flavor of the month, but insight-led decision making is what businesses need to get closer to the voice of their customers and deliver sustained growth.

Yet, as W. Edwards Deming rightly put it, “without data, you are just another person with an opinion.” With A/B testing, the dance between data and insight is an elegant one. It requires intent, connected infrastructure, and open interrogation so leaders can derive business insight from their iterative A/B tests. In other words, A/B testing alone does not give you answers; perspective and analysis do.

A/B testing is the process of comparing variations in marketing assets to a control (a version that does not change) to determine the right combination of formats, copy, visuals, headlines, call-to-action buttons, channels, and so on. The differences in performance are measured to enhance user experiences and motivate audiences to respond to your message. This step is highly recommended before conducting a full campaign, as it helps identify the top-performing messages, assets, and tactics that deliver the right strategic outcomes for your business.
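To make that comparison concrete, here is a minimal sketch in Python of how a variant’s conversion rate can be tested against a control’s with a standard two-proportion z-test. The function name and the visitor and click numbers are hypothetical, not taken from any particular platform.

```python
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under the null hypothesis
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided p-value
    return p_a, p_b, z, p_value

# Hypothetical numbers: control (A) vs. variant (B) clicks on a call-to-action button
p_a, p_b, z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}  p = {p:.4f}")
# Only declare a winner if p falls below the significance level chosen before the test (e.g., 0.05).
```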

While digital platforms offer opportunities for real-time data and continuous learning, here are some common mistakes that prevent teams from designing A/B tests in a manner that translates data into insight.

  1. Lack of clarity in understanding and defining the target persona.
  2. Rush to confirm previously held biases that stifle the discovery of new ideas.
  3. Time pressures that encourage mindless activity over thoughtful frameworks and design.
  4. Unfamiliarity with foundational statistical concepts: types of hypotheses, types of errors, sampling methodologies, confidence levels, significance, etc. (see the sketch after this list).
  5. Not defining standards or measures of success prior to running experiments.
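As an illustration of points 4 and 5, the sketch below uses the standard two-proportion sample-size approximation, with hypothetical baseline figures, to show how significance level (Type I error), power (Type II error), and a pre-defined minimum detectable lift translate into the number of visitors each variant needs before the experiment starts.

```python
from statistics import NormalDist

def sample_size_per_variant(p_baseline, min_lift, alpha=0.05, power=0.80):
    """Approximate visitors needed per variant to detect an absolute lift in
    conversion rate at the given significance level (alpha) and power."""
    p1, p2 = p_baseline, p_baseline + min_lift
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # guards against Type I errors (false positives)
    z_beta = NormalDist().inv_cdf(power)            # guards against Type II errors (false negatives)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p2 - p1) ** 2) + 1

# Hypothetical: 5% baseline conversion rate, and we care about a 1-point absolute lift
print(sample_size_per_variant(p_baseline=0.05, min_lift=0.01))  # roughly 8,000+ visitors per variant
```

Running this kind of calculation up front is one way to define the measure of success before the test, rather than stopping the experiment whenever the numbers happen to look favorable.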

With a stream of continuous demand signals that customers, consumers, guests, shoppers, or stakeholders send our way, businesses should be making deeper connections. Avoid being left with a pile of numbers that don’t tell you anything valuable about your business. Think through the following steps before you start:

  • Define the real problem. Successful experiments start with a clear articulation of the root cause. Resist the temptation to address the symptom rather than the cause, or the problem will persist. The “5 Whys” technique is a useful way to get to the underlying reason for the surface problem1.
  • Articulate the hypothesis. The hypothesis is a carefully crafted assumption proposed to solve the real problem. A simple way to get started is to write down a series of statements in the format “changing this to that will yield measurable success” (a sketch pulling these design elements together follows this list).
  • Identify the independent and dependent variables. Simply put, the independent variable is what we change through the experiment, while the dependent variable is what we measure. You may include more than one independent or dependent variable in a study, but each should be associated with a different problem or research statement. The more we test, the more we learn, but it is important to design experiments in ways that isolate variables to avoid conflating their effects.
  • Define your benchmark. You need a standard or point of reference against which the results of the experiment can be compared or assessed. A true understanding of a gap or improvement is only possible when benchmarks are clearly articulated; without one, it is exceedingly difficult to take meaningful action for improvement.
  • Measure what matters. Marketing in a digital world means that we can measure pretty much anything. It is imperative to sift through data points to determine which ones create noise and confusion versus those that move the business forward. While your marketing team may want to look at leading indicators2 to optimize their mix, the ultimate proof of marketing efficacy should be anchored in metrics that move the business: brand equity3 and sales measures4.
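Pulling these design decisions together, the following sketch (hypothetical names and numbers throughout) shows one way to write down the problem, hypothesis, variables, benchmark, and success threshold before launch, so the eventual result is judged against standards defined up front rather than after the fact.

```python
from dataclasses import dataclass

@dataclass
class ExperimentPlan:
    """Hypothetical record of the design decisions above, written down before launch."""
    problem: str               # the root cause, not the symptom
    hypothesis: str            # "changing this to that will yield measurable success"
    independent_variable: str  # what we change
    dependent_variable: str    # what we measure
    benchmark: float           # point of reference for the dependent variable
    success_threshold: float   # minimum absolute lift that justifies acting on the result

plan = ExperimentPlan(
    problem="Low trial sign-ups despite healthy landing-page traffic",
    hypothesis="Changing the CTA copy from 'Learn more' to 'Start free trial' will lift sign-up rate",
    independent_variable="CTA button copy",
    dependent_variable="sign-up conversion rate",
    benchmark=0.05,            # current sign-up rate
    success_threshold=0.01,    # absolute lift required before rolling out the change
)

def worth_acting_on(observed_rate: float, plan: ExperimentPlan) -> bool:
    """Compare the measured result to the benchmark and threshold defined before the test."""
    return observed_rate - plan.benchmark >= plan.success_threshold
```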

In a nutshell, information is data with meaning. When we can find patterns in this information, we create knowledge. The application and understanding of this knowledge repeatedly generate the awareness of the underlying essence or truth that all business leaders seek – Insight.

1. The 5 Whys Problem-Solving Method

2. Leading Indicators: such as open rates, bounce rates, and response rates (e.g., cart inclusions or downloads).
3. Brand Equity Measures: awareness, relevance, consideration, uniqueness, loyalty, advocacy.
4. Sales: response to lead ratio, qualified lead, yield, upsell value, cross-sell value, average order value, market share, customer lifetime value.
