This list compiles the most counterintuitive, and therefore most interesting, test results from the multivariate and A/B (or split) tests I was involved in over the past year. I thought this would be more valuable, a better read, and more modest than a ranked list of the tests with the highest lift in conversion metrics.
These results were surprising either to me or to clients. I realize it’s hard to evaluate these results without the context of what they were tested against. Many of these tests, or the ideas behind them, have been blogged about over the past year, and I’ve provided those links where I can. So, in no particular order, here are the ten most interesting results of 2007, with notes on the type of test and where it ran.
1. The single biggest factor influencing conversion rate was the presence of global navigation (MVT – landing page). More info on this test
2. Different call-to-action button colors, shapes, and locations performed better on different site pages (MVT – product pages, category pages, checkout flow). More info on this test
3. Three bullets and a call-to-action button increased conversion rate vs. a content-focused page describing the service in detail (A/B – homepage).
4. A single product increased conversion rate vs. a choice of three products (A/B – landing page). More info on this test
5. A long-scrolling, text-heavy page (à la direct mail) performed better than shorter, more concise pages (A/B – landing page). More info on this test
6. Making each radio button for additional search results opt-out instead of opt-in increased overall query volume (MVT – search engine interface). More info on this test
7. Replacing benefits content with a larger image of the product increased conversion rates (A/B – landing page/product page).
8. Replacing marketing content with a search query field ("Googleizing" the page) increased almost all performance metrics (A/B – homepage).
9. Moving the small grey print that the company didn’t want people to see out of the footer and putting it front and center improved conversion rates (A/B – landing page).
10. The client elected not to implement the winning homepage because the optimized user experience reduced page views.
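As an aside for readers who want to check lifts like these themselves: before calling any of the variants above a "winner," it helps to confirm the difference in conversion rate is statistically significant. A common approach is a two-proportion z-test. Here is a minimal sketch in Python; the traffic and conversion numbers are purely hypothetical, not from any of the tests listed.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a/n_a: conversions and visitors for the control.
    conv_b/n_b: conversions and visitors for the variant.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: control converts 200 of 5,000 visitors (4%),
# variant converts 250 of 5,000 (5%) — a 25% relative lift.
z, p = two_proportion_z(200, 5000, 250, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the p-value comes out below 0.05, so the lift would usually be treated as significant; with a tenth of the traffic, the same 25% relative lift would not be.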
Making this list was more difficult than I anticipated; so many results from testing over the past year were counterintuitive. Even as expert marketers, we cannot predict how people will respond to creative and messaging. So don’t take these results as recommendations. They are, however, all great test ideas for 2008.
For me, the main takeaway from this list (though one result on it refutes this) is that visitors know much more about your product or service than marketers give them credit for. Focus more on making it easier for someone who is ready to convert than on persuading people who are still in the consideration stage. Chances are those consideration-stage folks are making their decisions based on content gathered elsewhere, not on your landing page or site. That thought is probably worthy of its own post.
It’s been a great year for testing. Many happy returns in 2008!