Recently, we took a look at A/B testing in eCommerce and shared some split testing ideas to get you started. Playing around with B2C sites to identify which page version or element gets a better response is fun. It’s like trying different displays in a shop window to see which one attracts more visitors. In B2B, you don’t have a shop window, but that doesn’t mean you can’t experiment. The fact that B2B buyers are less emotional, or that the products are highly specialized or technical, doesn’t mean you can’t try out new designs or formats to optimize conversions. Need inspiration? Here are nine A/B testing scenarios that prove split testing in B2B just takes a little creativity.
Real World A/B tests for B2B
The following are real-life examples of A/B tests that demonstrate the various site elements companies from different industries have tested. We hope these examples will prompt you to split test your own site and increase your conversions.
In the business environment, conventional thinking says auto-play and auto-sound videos are not effective. This test case proves otherwise. The best-converting version of the page (image B) contains an auto-play animation that explains the company’s service. This animation boosted leads by 38.78%. Using on-site animation was a bold move, but it worked, not to mention it added a dynamic element to the page layout.
Customers and prospects want the private data they share to be kept secure and confidential. Will a privacy guarantee displayed on the newsletter sign-up page help get more subscriptions (version B), or will the message fall through the cracks? As the results of this A/B test show, site visitors responded to the privacy guarantee, and form submissions increased by 35%.
A/B testing best practices warn us against making too many changes to the control page. Numerous changes make it difficult to determine which element is the major player when conversion rates grow or drop. But sometimes you just have a feeling that changing one element won’t be enough. A complete redesign can work when done properly. For instance, this shredding and storage company ran an A/B test to determine whether a radically redesigned landing page would impact their PPC results. It did! The new version (version B), displaying an attention-grabbing headline, light colors, and imagery that visualized their services, increased the number of form submissions by a stunning 107%.
At first glance, the form versions don’t appear very different. However, the best-performing variation (image B) increased form fill-out rates by an unbelievable 368.5%. The difference lies in the eye-catching red call-to-action button, an actionable heading, images that call attention to benefits, and neatly arranged fields with precise labeling.
Well, it’s not about a gender war but rather a curious observation. DHL Express, a world-famous delivery operator with typically male couriers, tested whether using a female character in the header of a discount code request form (version B) would make any difference to the results of their PPC campaign. It was a real eye-opener. The female variant won, lifting the lead conversion rate by 108% at a 95% confidence level. Sometimes all it takes is a tweak to an image to make all the difference.
As this A/B test reveals, an awards badge placed next to a signup form (version B) achieves a higher form fill rate, in this case a 29.2% increase. Along with security seals and social proof indicators, awards badges are anxiety-reducing page elements that help earn visitors’ trust and prompt them to act as desired.
Consumers buy benefits. So it would seem to make sense that communicating benefits should be a good move. But should you identify benefits on a registration form? In this particular case, the page version without benefits was the clear winner (version B). The A/B test showed an 11% increase in form submissions at a 99% confidence level. Sometimes less is more, and it’s important to keep pages from becoming cluttered.
This test case compared two dedicated PPC landing pages with completely different layouts. The traffic to both pages was generated by PPC ads. Version B offers a simple, clutter-free structure with a smart, fresh design. Over the course of testing, this variation increased lead generation by 320%.
Customer behavior isn’t always predictable. That’s exactly what this A/B test proved. The control and alternate homepage versions differ by a single element: the copy on the call-to-action button. Which version will get more clicks, the one offering a free consultation or the one encouraging collaboration? The winner is version B. The action-oriented “Work with us” button label increased clicks by 171% at a 98.6% confidence level.
A/B testing wouldn’t be necessary if human behavior were always predictable. Even though we design sites and landing pages based on our perception of customer behavior and needs, we often miss the mark. By running A/B tests, we can replace assumptions with facts and improve business outcomes.
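Several of the cases above quote a “confidence level” alongside the conversion lift. If you’re curious how that number comes about, here is a minimal sketch of a standard two-proportion z-test in Python (using only the standard library; the function name and the visitor/conversion figures are made up for illustration, not taken from any of the tests above):

```python
from statistics import NormalDist

def ab_test_confidence(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-proportion z-test: how confident can we be that
    variation B converts better than control A?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis of "no difference"
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    # Standard error of the difference between the two proportions
    se = (p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (p_b - p_a) / se
    # One-sided probability that B's true rate exceeds A's
    confidence = NormalDist().cdf(z)
    return p_a, p_b, confidence

# Hypothetical example: 2,000 visitors per variation
p_a, p_b, conf = ab_test_confidence(2000, 60, 2000, 85)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  confidence that B beats A: {conf:.1%}")
```

The practical takeaway: a big lift on a small sample can still be noise, which is why testing tools wait for a confidence threshold (commonly 95%) before declaring a winner.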
Studies show that regular A/B testing helps B2B businesses generate about 30-40% more leads. There are no limitations on what can be tested. Depending on your current goals and challenges, you can test against different key performance indicators, such as the number of page views per session, bounce rate, average time on site, button clicks, and form fills. It pays to experiment with page titles, copy, images, lead generation form layouts, and other design elements. While B2B buyers are more rational and less emotional, we all react to specific styles and colors in design.
We hope these cases have inspired you to run at least a couple of A/B tests to improve your website’s conversions.
Are there any astonishing A/B test outcomes you never expected? Let us know in the comments below.