Fine Tuning Premium Products Through A/B Testing

October 27, 2015

This is the first in a two-part series about how LinkedIn’s different business units use A/B testing and XLNT, our A/B testing software platform, to build better products for users. Today, we’ll discuss how experimentation benefits the way we build and iterate on our premium and paid products. In part two, we’ll explore how A/B testing helped us tune our email communication to give users what they need.

LinkedIn offers a number of subscription products, and we know that some members have been overwhelmed by our numerous Premium offerings. Last year, we embarked on a project to simplify our portfolio and align our paid products with member needs. This required a major overhaul of our acquisition flow and self-service purchase experience, spanning both design and information architecture. While we were excited about the product changes, we wanted to be mindful of the business impact a change of this size could bring: Premium subscriptions is a large business that accounted for 18 percent of LinkedIn’s revenue in Q2 2015. With our A/B testing platform, XLNT, by our side, we were able to understand a number of member behaviors and business impacts. Here are a few learnings you might find helpful as you think about leveraging experimentation frameworks.

Less is more

One key simplification we made for Premium subscriptions was reducing the number of available offerings from twelve to four (yes, a 67 percent reduction in options). The goal was to offer Premium membership plans aligned to specific objectives, namely job seeking, professional networking, sales, and recruiting. We wanted members to have a clear understanding of the plans available to them and the value they get from each. In order to understand the impact on business growth, we tracked a number of metrics, including traffic through the acquisition funnel and sign-up rates. With a single A/B test, we were able to monitor these metrics at the aggregate level as well as at the plan level. For example, the treatment bucket showed a lift in total sign-ups, which helped validate our hypothesis that simplicity leads to a better member experience. In addition, we observed a shift towards sign-ups for the Job Seeker subscription. In the previous experience, members were not able to discover this plan effectively, but with the new information architecture, more members were able to self-select this option. This was a key learning and drove many future optimization decisions.
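The aggregate-plus-per-plan readout described above can be sketched in a few lines. This is an illustrative example only, not XLNT code, and all counts below are made-up numbers:

```python
# Illustrative sketch (not LinkedIn's actual XLNT code): comparing
# sign-up rates between control and treatment buckets of one A/B test,
# both per plan and in aggregate. All counts are hypothetical.

def signup_rate(signups, visitors):
    """Fraction of visitors who signed up."""
    return signups / visitors

def relative_lift(control_rate, treatment_rate):
    """Relative change of treatment vs. control, e.g. 0.05 == +5%."""
    return (treatment_rate - control_rate) / control_rate

# Hypothetical per-plan (signups, visitors) results from a single test.
results = {
    "Job Seeker": {"control": (400, 10000), "treatment": (520, 10000)},
    "Business":   {"control": (300, 10000), "treatment": (310, 10000)},
}

# Plan-level view: did the new information architecture shift sign-ups?
for plan, buckets in results.items():
    c = signup_rate(*buckets["control"])
    t = signup_rate(*buckets["treatment"])
    print(f"{plan}: {relative_lift(c, t):+.1%} lift")

# Aggregate view: total sign-up lift across all plans.
c_signups = sum(b["control"][0] for b in results.values())
c_visits = sum(b["control"][1] for b in results.values())
t_signups = sum(b["treatment"][0] for b in results.values())
t_visits = sum(b["treatment"][1] for b in results.values())
agg = relative_lift(c_signups / c_visits, t_signups / t_visits)
print(f"Aggregate: {agg:+.1%} lift")
```

Monitoring both views from the same experiment is what surfaces plan-level shifts (like the Job Seeker discovery effect) that an aggregate number alone would hide.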

Show me the details

Once members select a Premium plan, they are taken to the checkout page. In line with the simplification theme, we removed a number of components from the checkout page and made it as clean as possible. As part of this effort, we tested the impact of an FAQ module. This module answers basic questions about the subscription plan, e.g. length of subscription, next billing date, ways to cancel, etc. We A/B tested the module in two placements: in-line and at the bottom of the page. In-line placement took extra space and made the page look busy, while the bottom placement kept the page simple. Contrary to our initial thinking, the in-line placement showed a material lift in conversion rates, i.e. more members upgraded when they were shown the FAQs in-line. A key learning was to address the top-of-mind questions a member has before committing to the purchase; it was a mistake to give these FAQs any less real estate on the page. With the help of experimentation, we were able to assist customers and improve checkout page conversion rates.
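Before calling a lift like this "material," you want to rule out noise. A standard way to do that for a conversion-rate comparison is a two-proportion z-test. The sketch below uses made-up counts and is not XLNT code, just an illustration of the statistical check behind such a decision:

```python
# Illustrative two-proportion z-test for an A/B test on conversion
# rate (e.g. bottom-of-page FAQ as control vs. in-line FAQ as
# treatment). Counts are hypothetical; not LinkedIn's XLNT code.
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for H0: the two rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # P(|Z| > z), two-sided
    return z, p_value

# Hypothetical: 5% conversion with FAQ at the bottom vs. 6% in-line.
z, p = two_proportion_z_test(conv_a=450, n_a=9000, conv_b=540, n_b=9000)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Lift is statistically significant at the 5% level.")
```

With these example numbers the difference is significant, which is the kind of evidence that would justify shipping the busier-looking in-line placement despite the design preference for a cleaner page.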

Bold, bright, subtle or soft

Experimentation can also be very helpful in identifying the right design choice. Prior to launch, we debated endlessly on the size of the calls-to-action (CTAs) in the acquisition flow. There was a strong belief that a bolder CTA would drive more sign-ups. Based on feedback from our design team, we decided to test the bold CTA against a more subtle one. The design team won this contest: the subtle CTA performed better and resulted in an improved acquisition flow. We quickly dropped the big blue button!

XLNT gave us the ability to measure the impact of these and many more experiments on key product and business metrics, helping us make data-driven decisions for changes that ranged from small tweaks to large structural modifications. Because it requires minimal involvement from the development team, this tool helps us iterate constantly at a fast pace.

Simplification continues to be a big focus for us. As we build new premium features and further simplify our offering, we will be relying heavily on our experimentation toolkit to guide product ramp and business decisions.