For eCommerce businesses, A/B testing (also known as split or bucket testing) is a powerful way to identify purchasing triggers, optimize your website for conversions, and improve the user experience.
A/B Testing in a Nutshell
A/B testing lets you experiment on your customers to learn what incentivizes them to make a purchase on your website. It works like this: take two versions of the web page you want to test, known as the control and the alternative. The control is the original page, while the alternative is a replica with one changed element you want to test (e.g. a different hero image, a different color for the call-to-action button, different title text). Traffic is split between the two versions to see which one performs better.
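The traffic split described above can be sketched in a few lines of code. Here is a minimal, illustrative example that assumes each visitor carries a stable ID; the function name, experiment label, and 50/50 split are assumptions for the sake of the sketch, not taken from any particular tool:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a visitor into 'control' (A) or 'alternative' (B).

    Hashing the visitor ID together with the experiment name keeps the
    assignment stable across visits, so the same person always sees the
    same version of the page.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    # Map the first 8 hex digits to a number in [0, 1).
    bucket = int(digest[:8], 16) / 0x100000000
    return "control" if bucket < split else "alternative"

# The same visitor always lands in the same bucket for a given experiment.
print(assign_variant("visitor-42", "cta-color"))
```

Real testing tools such as Google Optimize or Optimizely handle this bucketing (plus cookies, targeting, and reporting) for you; the point of the sketch is simply that the split must be random across visitors but consistent per visitor.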
The most popular A/B testing tools used by thousands of marketers across the globe are Google Optimize (formerly Google Analytics Experiments), Optimizely and Omniconvert.
What A/B tests to run in eCommerce
You can split test virtually everything on your eCommerce site and see how customers respond to the tweaks and alterations. You can play around with home page elements, navigation structure, site layout, product descriptions, information visibility, pricing, promos, and much more. Sometimes even the slightest difference in, say, image placement or button size may affect conversions.
Not sure whether a certain page or element communicates the right message to your buying audience? Wondering which landing page headline, product description, or call-to-action button design speaks better to buyers? Verify your assumptions with an A/B test.
Real-life A/B test results you may leverage
Let’s take a look at real-life A/B test scenarios that eCommerce businesses from around the world have run. We hope these experiments inspire you to create some valuable tests of your own.
Two types of email capture experience were tested. The gamified opt-in (version B, see the pictures below), offering a ‘mysterious discount’ of 15%, won over a standard two-step email capture that directly offered the same discount in exchange for visitors’ email addresses. Click rates increased by 48.9%, the email submission rate grew by 25%, and the conversion rate rose by 12.7%.
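Lifts like these are only meaningful when the sample is large enough to rule out random noise. As an illustration (the visitor and conversion counts below are hypothetical, not from this case study), a standard two-proportion z-test can check whether a measured lift is statistically significant:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Compare conversion rates of control (A) and alternative (B).

    Returns the z statistic and a two-sided p-value; a small p-value
    (commonly < 0.05) suggests the difference is unlikely to be noise.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

# Hypothetical counts: 120/2000 control conversions vs 156/2000 alternative.
z, p = two_proportion_z_test(120, 2000, 156, 2000)
lift = (156 / 2000 - 120 / 2000) / (120 / 2000) * 100   # relative lift in %
print(f"lift={lift:.1f}%  z={z:.2f}  p={p:.4f}")
```

With these made-up counts the relative lift is 30% and the p-value falls well under 0.05, so the difference would likely be judged real; with only a few dozen visitors per variant, the same lift could easily be chance. The A/B testing tools mentioned earlier run this kind of significance check for you.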
Color psychology claims that different hues have different psychological properties: they affect human behavior and evoke specific emotions. Designing a button based on color science might be worth it. This case shows the results of a split test meant to determine which color makes the ‘Add to Cart’ call-to-action (CTA) button more eye-catching and clickable. Simply changing the color of the CTA button to emerald (version B) boosted sales conversions by 14%.
The product page with an embedded next-day-delivery countdown timer (version B) led to more purchases. In fact, this variation converted 8.6% better than the alternative, which offered only free delivery.
With customer segmentation via CRM software, it’s possible to communicate the right marketing message to the right audience. As the A/B test for this case revealed, a precisely segmented email campaign based on a customer’s purchase history (version B) turned out to be far more effective than a general marketing offer (version A). Segmented promotions increased customer clicks by 61% and boosted monthly subscriptions to bundled products.
A longer, detailed quote form placed on the left side of a B2B eCommerce page (version B) saw an astonishing 88.7% increase in form submissions, resulting in more sales. Further analysis of these pages showed that the shorter variation actually brought buyers to another page containing more data fields. This experience may have frustrated website visitors, resulting in more incomplete form submissions.
Just like in the previous experiment, buyers in this test case responded positively to detailed information about goods and ordering. The version with shipping costs specified under the call-to-action button (version B) stimulated sales by 60.1%. Another test displayed both delivery details and shipping costs; interestingly, the version showing only the shipping costs won again.
This A/B test examined whether a comment or review on a product page leads to more sales. It clearly did: purchase volumes grew by 69% and revenue skyrocketed by 99%.
Just like out-of-stock pages, nonexistent product pages on eCommerce sites are full of potential. When tested, the 404 page recommending items related to the missing product (version B) delivered impressive results. Compared with the plain ‘We’re sorry, but the page you’re looking for is no longer available’ version, the page with recommendations increased purchases by new clients, improved add-to-shopping-bag rates, and increased page visits.
Don’t let customers remove products from their carts without confirming the action. A “talking” shopping cart can suggest that they save those items for later, giving them time to reconsider in case they change their minds. According to the test results, the triggered overlay asking customers whether they were sure they wanted to remove an item from the cart (the winning version B) resulted in a 4.4% lift in sales.
Labelling a CTA checkout button is as important as designing it. Is it self-explanatory? Does it tell the buyer what comes next in the checkout process? This A/B test showed that the “Go to Payment Options” button (version B) achieved an 87.5% lift in click-throughs to the subsequent page.
This test produced an interesting outcome. UK Tool Center, a business selling power and hand tools, created an alternative product category page with filtering options to help visitors navigate through hundreds of products. But tests showed that the version with the filtering options (version B) reduced the number of clicks by 27%! While it may seem reasonable that filtering options would improve the customer experience and lead to more page visits, they actually distracted buyers. In addition, removing the filter allowed more products to appear above the fold.
The major points that we wanted you to take away from this write-up are:
- There are no strict rules about what to A/B test. Depending on your business objectives, you can test any concept or on-page element to see how it impacts traffic, changes customer behaviour, and drives conversions.
- Never forget that you are not the target audience, so the design elements and content that you might find appealing may not have the same appeal to your website visitors. Always verify your improvements and changes with an A/B test.
- There’s always an opportunity to improve your website conversion rates. Even your 404 pages can bring you additional revenue.
- Most importantly, never stop testing. A/B testing isn’t a one-shot project; done correctly, it is a process you repeat continually.
So never stop improving your website. In case you are stuck, just do what we do – go and check WhichTestWon to pick up some fresh A/B testing ideas.
Happy experimenting! And feel free to share any other fascinating A/B test outcomes that you’ve seen in the comments below.