Nearly everyone with a website uses A/B testing tools; they are the not-so-secret weapons of website design and...
The role of data in testing different messages played a large role in both of President Barack Obama's successful bids for the White House. Large web commerce sites have entire teams devoted to testing changes. And even mom and pop operations can perform sophisticated tests by connecting to simple APIs offered through content management systems like Drupal and WordPress.
It's a strategy for quantifying seemingly subjective opinions about what works in page layout and functionality. But just because anyone can perform A/B tests doesn't mean they are done right. There's a lot that goes into a successful test -- from effective planning to accurate analytics -- and it's easy to misstep along the way.
In a webinar on the subject, Seth Hutchings, director of multichannel marketing at digital marketing agency Axis41, offered some advice on how to get the most out of A/B testing tools. Here are five of the simple tips he shared:
1. Don't test without a strategy. Hutchings said he sees clients who want to run tests just to run them. But they don't have a clear goal or specific website features they want to optimize. This nullifies the benefits of A/B testing.
"What happens when you do that is you get obvious results that just support user experience best practices," Hutchings said.
2. Make everything data-driven. Tests can't be random; changes should be based on a problem identified in your data. For example, Hutchings said he had a client that wanted to test different options for its site's add-to-cart button, figuring the change would lead to more sales. But after looking at the data, Hutchings and his team saw that page visitors were adding products to their carts at a healthy rate; they just weren't checking out. Changing the cart button wouldn't have solved that problem, so he recommended testing different checkout options instead.
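A quick funnel calculation makes this kind of diagnosis concrete. The sketch below uses hypothetical counts (not figures from the article) to show how step-by-step conversion rates point to the checkout step, not the cart button, as the real drop-off:

```python
# Hypothetical funnel counts -- illustrative numbers, not data from the article.
funnel = [
    ("visited product page", 10000),
    ("added to cart", 3500),
    ("started checkout", 700),
    ("completed purchase", 630),
]

# Conversion rate of each step relative to the step before it.
for (prev_step, prev_n), (step, n) in zip(funnel, funnel[1:]):
    print(f"{prev_step} -> {step}: {n / prev_n:.0%}")
```

With these numbers, 35% of visitors add to cart and 90% of checkout starters finish, but only 20% of cart adders ever start checkout, so the checkout step is the one worth testing.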
3. Follow the test's recommendations. Often, people perform good A/B tests and then sit on the results, Hutchings said. If you're not listening to what the tests tell you and implementing beneficial changes, there's no point in testing at all. To get changes implemented, make sure whoever has responsibility for page changes is on board and ready to implement new approaches.
4. Patience is a virtue. As with many areas of analytics, people want quick wins from A/B testing tools so they can show ROI to their supervisors. But A/B testing produces better results as part of a long-range strategy, which rarely delivers quick results. Pushing for quick wins can diminish the statistical reliability of tests, Hutchings said.
"The problem with being impatient is you run into problems like sample sizes, statistical significance [and] misaligned goals," he said.
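To see why sample size matters, here is a minimal sketch of a standard two-proportion z-test (a common way to check A/B significance, not a method the article specifies) applied to the same observed lift at two hypothetical sample sizes:

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Pooled two-proportion z-test: returns z score and two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Same observed lift (4% -> 5%) at two sample sizes:
print(ab_significance(8, 200, 10, 200))        # small sample: p well above 0.05
print(ab_significance(400, 10000, 500, 10000)) # larger sample: p below 0.05
```

Ending the small test early would mean declaring a winner on a difference that could easily be noise; the same lift only becomes statistically significant once enough visitors have been through the test.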
5. Don't test bad designs. This should be obvious, but Hutchings said he often sees clients who roll out tests of design changes that go against established best practices for user experience. These designs will always perform more poorly than the better designs. Hutchings recommended getting designers to sign off on any changes before they're rolled out in A/B tests so that you're always testing good designs against good designs.
"I'm always going to my designers," he said. "They need to see it first. And then we can go out and do the test."