Measuring and Optimizing Engagement: The Audible.com Case Study

https://s3.amazonaws.com:443/slideshare/ssplayer.swf?id=9259&doc=adtech-ny-next-generation-strategies-for-ecommerce-the-audiblecom-case-study-9259-9996

You can measure engagement. More importantly, you can optimize it. Yesterday at ad:tech New York, I delivered the presentation above on work we just completed at OTTO Digital for Audible.com, on the panel Next Generation Strategies for eCommerce, moderated by David Schatsky, President of JupiterKagan. The case study shows how we used multivariate testing and targeted content delivery to optimize engagement and drive an incremental revenue gain of 55% for a key user segment.

Multivariate testing breaks your messaging and brand down into components and mines them for business intelligence. Each element of the engagement is measured for its influence on user behavior. Which parts of your page influence engagement: the images, headline, offer, call to action, or content? We'll find out. Actually, we do one better: we find out the influence of each element on the big picture, your bottom line. All our work optimizes one thing: the ROI and brand value of your digital initiatives.

[Image: Audible.com logo]
Audible.com has been at the forefront of subscription-based e-commerce for a while. I remember them because they were one of the biggest New Jersey .coms (naturally a hardscrabble collection) before the bubble burst. Audible weathered the storm, likely because of its business model, and has been given even more life by that greatest icon of the new millennium, the iPod.

The Thinking

Audible had an issue many sites face: the homepage/landing page for anonymous (un-cookied) users was seeing a big drop-off. For some reason, people new to Audible were not engaging with the business. So we sat down with Audible to learn about their business and came up with some ideas as to why this was happening.

Our hypotheses included:

• The main image on the site was turning people off

• The headline was not enticing enough

• Internal search was drawing people away from the call to action

• There was no confidence or trust messaging

• There were no images conveying the purpose or idea of the site

Due to time constraints, we decided to forgo A/B split testing on layout and design strategies for the page (usually a recommended first step) and forge ahead within the confines of the existing experience. Since this page got plenty of traffic, we decided to run a 7 x 2 multivariate (MVT) test. A 7 x 2 MVT is an A/B test on steroids: we A/B test seven elements of the page at once, which yields 2^7 = 128 possible combinations, and through the Taguchi methodology employed by the Offermatica tool we determine a best predicted recipe (mix of tested elements) and expected lift (which turned out to be 100% accurate) from a small fraction of those combinations.
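How does Taguchi get from 128 combinations down to 8 recipes? It uses an orthogonal array, a design where every element variant appears equally often and every pair of elements is balanced, so main effects can be estimated from a handful of runs. Offermatica's actual design generator is proprietary; as a minimal sketch, the classic L8 array for seven two-level factors can be built like this:

```python
def l8_orthogonal_array():
    """Build the L8(2^7) orthogonal array: 8 runs covering 7 two-level factors.

    Row i, column j (j = 1..7) is the parity of the bits of i selected by the
    bits of j. Each column ends up with four 0s and four 1s, and every pair of
    columns is balanced -- the property that lets main effects be separated.
    """
    runs = []
    for i in range(8):                         # 8 runs indexed by 3 basis bits
        bits = [(i >> k) & 1 for k in range(3)]
        row = []
        for j in range(1, 8):                  # 7 columns = all nonzero 3-bit masks
            jbits = [(j >> k) & 1 for k in range(3)]
            row.append(sum(b * c for b, c in zip(bits, jbits)) % 2)
        runs.append(row)
    return runs

# Each row is one page recipe: 0 = variant A, 1 = variant B for each element
for recipe in l8_orthogonal_array():
    print(recipe)
```

Eight recipes matching the eight-recipe distribution described in the test below, instead of all 128.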

Test 1

It took us one week to set up the test and create the element variations. We started by sending 10% of the traffic to the test page, but quickly increased to 40% after two days and 80% after four days. By day 5 we were at 100%, distributed across 8 page recipes.
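The ramp-up above can be sketched as a simple allocation rule: a growing share of visitors enters the test and is split evenly across the recipes, while everyone else sees the existing page. The schedule and recipe names here are illustrative, not Offermatica's implementation:

```python
import random

# Hypothetical ramp schedule: day -> share of traffic entering the test
RAMP = {1: 0.10, 2: 0.10, 3: 0.40, 4: 0.40, 5: 1.00}

def assign_recipe(day, n_recipes=8):
    """Assign a visitor to a test recipe or the existing (control) page."""
    share = RAMP.get(day, 1.00)        # after day 5, everyone is in the test
    if random.random() >= share:
        return "control"               # visitor stays on the current page
    return f"recipe_{random.randrange(n_recipes)}"  # even split across recipes

print(assign_recipe(day=1))
print(assign_recipe(day=5))
```

Ramping this way limits the business risk of a bad recipe while the early data comes in.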

The results were profound, and since we had lots of traffic it took only two weeks to reach confidence (provided by the Offermatica algorithm based on the amount of traffic and actions). Our best recipe provided a 24% lift in engagement, measured by getting people to click the call to action and reach Audible.com's "Pick a Plan" page, where they are presented with subscription options (and one of the pages we are currently testing). We were also able to measure drop-off at every other step in the Audible.com purchase funnel and resolve cost data into AOV and RPV increases.
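Offermatica's confidence calculation isn't public, but a standard stand-in for "did this recipe beat control?" is a two-proportion z-test on the click-through rate to the "Pick a Plan" page. The visitor and conversion counts below are made up for illustration:

```python
from math import sqrt, erf

def lift_confidence(conv_a, n_a, conv_b, n_b):
    """One-sided confidence that recipe B's conversion rate beats control A's.

    Standard two-proportion z-test: pool the rates, compute the standard
    error, and map the z-score through the normal CDF.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)               # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))      # standard error
    z = (p_b - p_a) / se
    return 0.5 * (1 + erf(z / sqrt(2)))               # normal CDF of z

# Hypothetical counts: 5.0% control vs 6.2% challenger (a ~24% relative lift)
confidence = lift_confidence(500, 10_000, 620, 10_000)
print(f"{confidence:.4f}")
```

With traffic at this scale, a lift of that size clears 99% confidence quickly, which is why two weeks was enough.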

The slide below shows the contributing factors to engagement, with the headline being the overriding factor. What we learned about the other elements, and why I'm passionate about the business intelligence the tool provides, was even more interesting.

[Slide: contributing factors to engagement, by element]

Having the email address and country field present before the call to action was actually a positive contributing factor. This was completely counterintuitive based on most of my experience; so much for "best practices." Other elements, like the testimonial and trust messaging, also factored in. The fact that their influence levels were low was not important; they could have been low due to a number of factors, like placement or creative. What was important was that they had any influence at all. This is why our methodology is to iterate on test results: these were elements we felt could be further optimized. On to test 2.
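The per-element "influence" numbers on the slide come from main-effect estimates: for each element, compare the average conversion rate of the recipes that ran variant B against those that ran variant A. A sketch with entirely hypothetical conversion rates, using an L8-style design matrix:

```python
# Hypothetical L8 orthogonal-array design: 8 recipes x 7 elements
# (0 = variant A, 1 = variant B for each element), plus the conversion
# rate observed for each recipe. All numbers are made up.
design = [
    [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1], [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 0, 1, 0, 1], [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1, 0], [1, 1, 0, 1, 0, 0, 1],
]
rates = [0.050, 0.054, 0.061, 0.058, 0.052, 0.049, 0.063, 0.060]

def main_effects(design, rates):
    """Per-element effect: mean rate with variant B minus mean rate with A."""
    effects = []
    for col in range(len(design[0])):
        b = [r for row, r in zip(design, rates) if row[col] == 1]
        a = [r for row, r in zip(design, rates) if row[col] == 0]
        effects.append(sum(b) / len(b) - sum(a) / len(a))
    return effects

for i, e in enumerate(main_effects(design, rates)):
    print(f"element {i}: {e:+.4f}")
```

Even a small positive effect, like the email field here, is a signal worth iterating on, which is exactly the reasoning behind test 2.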

Test 2

Since we were now getting 24% more users engaged in the funnel, for the second test we wanted to see whether we could use the intelligence we had gained about each element's contribution to engagement to produce an attributable lift in sales.

In my opinion there were a few problems with the original test that we corrected in the second. The trust messaging wasn't as close to the call to action as I would have liked, and it looked off-center. The other thing I wasn't happy about in test 1 was that the testimonial sat at the bottom of the page, below the fold. I had wanted to put it in the header, but winning that prominent position would have taken time and politics. I also would have liked to have the VeriSign seal coupled with the TRUSTe badge.


In the second test we did just that: set it up as I liked and made the trust elements un-clickable. No one clicks on these things anyway (and if your site's visitors do, please let me know and I will stand corrected... for your site). We got the testimonial not into the header but near the call to action; not the best spot, but better than below the fold. We also thought a more iconic treatment of the center-well image might work better, so we created one. The test took one week to set up and ran for two weeks. The results speak for themselves.

The trust messages had a 98% influence on conversion and raised Average Order Value (AOV) and Revenue per Visitor (RPV) by 30%. The compounded impact on RPV from both tests was 55%!
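RPV is what ties the two tests together: revenue per visitor is conversion rate times average order value, so a lift in engagement and a later lift in order value compound multiplicatively rather than adding. The post doesn't break out the exact inputs behind its 55%, so the numbers below are purely illustrative:

```python
def rpv(conversion_rate, aov):
    """Revenue per visitor = conversion rate x average order value."""
    return conversion_rate * aov

def compound_lift(*lifts):
    """Multiplicative compounding of sequential lifts: (1+a)(1+b)... - 1."""
    total = 1.0
    for lift in lifts:
        total *= 1 + lift
    return total - 1

# Illustrative only: e.g. a 24% lift followed by a 25% lift compounds to 55%
print(round(compound_lift(0.24, 0.25), 2))   # prints 0.55
```

The point is that sequential optimizations multiply, which is why iterating beats a single big test.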


Summary

What we found out about Audible.com's business was startling. While the changes were as simple as revising a headline and adding more trust messaging, they were the result of strategic insight into Audible's business and prospective customer base, and of our testing and optimization methodology of speed, frequency, and iteration. These are insights Audible can use across media, since prospects will likely have the same discovery needs at any touchpoint with the brand.

The key takeaways:

• Segment Your Audience

• Test & Measure the Factors for Engagement

• Optimize those Factors with Iterative Tests

Currently Audible.com is running tests on other High Impact Segments like paid search and Amazon referrers. More great results should be on the way. What is your business doing?

5 thoughts on "Measuring and Optimizing Engagement: The Audible.com Case Study"

  1. I enjoyed this article, as it clearly showed the testing process. One thing I'm confused about is how you worked out which factors "engaged" the customer before moving on to test them. Will you elaborate, please?


  2. I love this article. I get tired of other sites that just talk fluff.
    I also have two questions.
    1. When measuring a call to action, such as a conversion rate (e.g. which is currently at 1.5%), what sort of numbers are needed to reach a high level of confidence? You said that Omniture told you what to do. Can you share? I mean, all the multi-variate testing in the world is useless without knowing this.
    2. I have potentially one of the coolest test ideas you can do: simply do testing between all the trust logos out there including: hackersafe, controlscan, a couple different ssl seals, and even a made up one. these companies are making wild claims. I would even be willing to help contribute.
    If interested, contact me through my blog.

