Business leaders already know they’re doomed to fail if they aren’t constantly evolving to meet the needs of their customers. What some may not realize is that they can often anticipate those needs effectively and easily, staying ahead of the competition.
In the e-commerce industry, we use A/B testing to gain insight into consumer preferences and tailor our e-commerce stores to increase interest, engagement and, ultimately, sales. At its simplest, A/B testing means measuring the popularity and performance of two slightly different versions of the same thing.
For an e-commerce startup like Engine, A/B testing may mean making a “buy” button a different color on some users’ devices, then comparing the results and adopting the version that drew the most clicks. The next day’s A/B test may be sending an email newsletter with varying subject lines, then using open rates to decide how the next subject line should read.
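The mechanics behind a test like the buy-button example are simple: each visitor is consistently bucketed into one version or the other, and clicks are tallied per bucket. Here is a minimal sketch of one common bucketing approach, hashing a user ID so the same visitor always sees the same variant; the function and test names are illustrative, not Engine’s actual implementation.

```python
import hashlib

def assign_variant(user_id: str, test_name: str = "buy-button-color") -> str:
    """Deterministically bucket a user into variant A or B.

    Hashing the user ID (rather than choosing randomly on each visit)
    keeps the assignment stable, so a returning visitor always sees
    the same button color.
    """
    digest = hashlib.sha256(f"{test_name}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Because the hash is effectively uniform, roughly half of all visitors land in each bucket, which is what makes the click counts comparable.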
Brick-and-mortar stores use a type of A/B testing every time they rearrange their layouts. Apple stores do it often. So do some local retailers. When you find a store that has an identical layout to one in another city, you’re probably walking through the results of a successful A/B test.
At Engine, we are always testing. Testing allows you to learn deeply from your customer base. Testing challenges assumptions and forces you to prove them out before implementing any changes.
Our CEO at Engine, John James, founded Acumen Brands in 2011. He would not have been able to grow Acumen’s primary e-commerce store, Country Outfitter, from $1 million that year to more than $15 million in a month in 2012 without using A/B testing. Country Outfitter’s Facebook likes grew to 7 million in just four months.
Of course, Facebook itself is a notorious A/B tester. Just compare your version of the Facebook app to someone else’s. Chances are, there’s a difference. Facebook couldn’t make $9.3 billion in a quarter without consistently testing what works.
Likewise, Google is constantly experimenting with its search algorithms and ad layouts to gain a competitive advantage and drive revenue for customers using AdSense. Those global powerhouses have turned A/B testing into an integral part of their business, and they have countless analysts who scour the data gleaned from such testing. If there’s any risk to an A/B test, it’s that some businesses may make quick and incorrect decisions based on tests that aren’t statistically significant.
It’s not enough to run a test for just one afternoon and then overhaul your store design or website based on the results. Gut feelings aren’t the answer here. Effective tests run long enough to reach statistical significance. Even if 25 of 1,000 consumers see your test, and all 25 make the same decision, that’s still not a large enough sample size for a business to make the change.
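The sample-size point above can be made concrete with a standard two-proportion z-test, one common way to check whether a difference in conversion rates is statistically significant. This is a minimal sketch using only the standard library; the numbers in the usage example are hypothetical, not Engine’s data.

```python
import math

def ab_test_significance(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: is the difference between the variants'
    conversion rates statistically significant?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled rate under the null hypothesis that the variants are identical
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Same 12% vs. 18% conversion lift, two very different sample sizes:
_, p_small = ab_test_significance(12, 100, 18, 100)      # not significant
_, p_large = ab_test_significance(120, 1000, 180, 1000)  # significant
```

With 100 visitors per variant the p-value comes out well above the conventional 0.05 threshold, so the apparent lift could easily be noise; with 1,000 per variant the same lift clears the threshold comfortably. That is exactly why an afternoon of traffic is rarely enough to justify a change.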
Fortunately, businesses do not have to measure their testing success on their own. Online testing services like Optimizely enable anyone to test on their website. There are multiple testing options out there, meaning that businesses have no excuse at all for stagnancy.
At Engine, successful tests for us are those that lead to more products in a shopping cart. For a restaurant, it may mean finding that new menu item that makes critics rave. For retail stores, it could mean finding that strategic location in the store where a customer is more likely to make that “impulse buy.”
Whatever the sector, whatever the strategy, businesses that fail to test are doing a disservice to their customers and their bottom lines.
Editor’s note: Blake Puryear is product lead at Engine, a Fayetteville-based e-commerce platform. The opinions expressed are those of the author.