Introducing Phenom’s Built-In A/B Testing for Optimizing Career Sites

If you’ve ever designed a career site, you know you make hundreds of decisions in the process. From the video on your homepage to the type of search bar to the location of your call-to-action button, each choice can impact the overall effectiveness of your site. So how do you know which choices are the right ones? With Phenom A/B testing, you can find out.

Introducing Phenom A/B Testing

At Phenom People, we understand you want your candidate experience to be informative, enjoyable, and seamless. We also realize how important reliable data is to your career site’s success. Our new A/B testing platform allows us to test any change on your site so you can deliver the best user experience, with real numbers to back up your decisions.

Real-World Examples

One Phenom customer increased their conversion rate by 3.5% just by changing their hero image. Small wins like this one may not sound remarkable, but over thousands of visitors, a single change can pay off by generating substantially more leads and applies.

Even something as seemingly simple as your site’s color scheme can influence conversions. Don’t believe me? Check out the color sets below. Which one is better at increasing ad revenue at Bing? Does such a minor color variation really make a difference?

[Image: Bing’s two color schemes side by side; the left (control) is lighter than the right (treatment)]

You can make decisions like these based on your gut, coworker feedback, or focus groups. But why not eliminate the guesswork and let conclusive data do the talking?

That is exactly what Microsoft did before choosing Bing’s color scheme. They set up an A/B test so they could compare the two versions at the same time to make the right decision. And it’s a good thing they did: the treatment color scheme (on the right) increased Bing’s revenue by $10M per year!

This is the power of A/B testing. It allows you to quickly and reliably test different versions of the same element so you can put real data behind your decisions.

What exactly is an A/B test?

An A/B test, or split test, is a way to compare two versions of a single variable on your website at the same time. Essentially, you get a side-by-side comparison of how the two versions are performing so you can quantify the impact of the differences between them. You can also validate whether the factor that distinguishes one version from the other is an improvement or a detriment. Companies use this type of testing to analyze and optimize the elements on their websites.

Why bother with A/B testing?

You’re probably wondering if an A/B test is worth the effort. Can’t you just make the change and measure the performance of the old version versus the new one? Put simply, yes. You could make the change and then see whether your conversion rate goes up or down. However, it’s common to see fluctuations in user behavior and conversion rates based on many different factors: the day of the week, economic uncertainty, even the weather! So if you see an increase in conversions, how can you be sure it’s because of your change and not some outside factor?

A/B tests eliminate the possibility of outside factors disrupting your results: both versions run at the same time, and each user is randomly assigned the variation they will receive.
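
To make this concrete, here is a minimal sketch of how a testing platform might assign variants. The function name, the hash-based bucketing, and the 50/50 split are illustrative assumptions rather than Phenom’s actual implementation; the key idea is that each visitor is hashed into a bucket, so assignment is effectively random across users but stable for any one user.

```python
import hashlib

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a visitor into variant A or B.

    Hashing (experiment, user_id) spreads visitors evenly across
    variants, yet always returns the same variant for the same
    visitor, so repeat visits show a consistent experience.
    """
    key = f"{experiment}:{user_id}".encode("utf-8")
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % 100
    return "A" if bucket < 50 else "B"  # 50/50 traffic split

# The same visitor always lands in the same variant:
print(assign_variant("visitor-123", "hero-image-test"))
print(assign_variant("visitor-123", "hero-image-test"))  # identical output
```

Because both variants run over the same time window, day-of-week effects, economic news, and even the weather hit A and B equally, so any difference in conversion can be attributed to the change itself.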

Which changes should be tested?

All of them! Or at least as many as possible. Changes may not have the effects you think they will, and even adjustments perceived as small can have big impacts on conversion. It’s crucial to ensure any changes you’re implementing are actually making your site better, not worse. In fact, it’s not uncommon for an experiment to have an unexpectedly negative effect: one-third of Microsoft’s tests produce negative results, but Microsoft still learns from them. This is why many companies run hundreds, even thousands, of experiments every day.

Some examples of valuable A/B tests include updates to your hero images, new content in the form of videos, blogs, and articles, and the ordering of your widgets.

How do you measure success?

When reviewing the results of an A/B test, it’s important to look at the number of “impressions,” or the number of times each version has been shown. There must be enough impressions for the result to be statistically significant before you can be confident which version is the better option. Most A/B testing tools report a confidence level, the percent certainty that one version is the optimal choice. In general, anything above 95% confidence is enough to declare a winner.
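
As an illustration, here is one common way such a confidence number can be computed: a one-sided two-proportion z-test on the conversion rates of the two variants. The figures below are made up, and real testing tools may use different statistical methods under the hood.

```python
from math import sqrt, erf

def confidence_b_beats_a(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """One-sided two-proportion z-test: how confident are we that B > A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 0.5 * (1 + erf(z / sqrt(2)))                 # normal CDF of z

# Hypothetical test: 10,000 impressions per variant,
# 500 conversions on A (5.0%) versus 560 on B (5.6%).
print(f"{confidence_b_beats_a(500, 10_000, 560, 10_000):.1%}")  # ~97.1%
```

With these made-up numbers the tool would report roughly 97% confidence that B is the winner, clearing the 95% bar. With only 1,000 impressions per variant, the same conversion rates would fall well short of that bar, which is why impression counts matter.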

You must also look at other metrics, including bounce rate, exit rate, and engagement, since some changes may have unintended consequences. For example, moving your talent community widget to the top of the page may increase the number of people who join the community but simultaneously reduce the number of applies. Therefore, it’s important to review all of your metrics before declaring one version better than another.
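
One simple way to encode that discipline is to treat secondary metrics as guardrails that a winning variant must not regress. The metric names, numbers, and the 5% tolerance in this sketch are illustrative assumptions, not a Phenom feature.

```python
# Hypothetical per-variant metrics for the talent-community example:
# variant B lifts community signups but hurts applies.
metrics_a = {"signup_rate": 0.040, "apply_rate": 0.0300, "bounce_rate": 0.52}
metrics_b = {"signup_rate": 0.055, "apply_rate": 0.0255, "bounce_rate": 0.53}

# Guardrails the challenger may not regress by more than 5% relative to control.
GUARDRAILS = {"apply_rate": "higher_is_better", "bounce_rate": "lower_is_better"}

def passes_guardrails(control: dict, challenger: dict, tolerance: float = 0.05) -> bool:
    """Return True only if the challenger respects every guardrail metric."""
    for metric, direction in GUARDRAILS.items():
        if direction == "higher_is_better" and challenger[metric] < control[metric] * (1 - tolerance):
            return False
        if direction == "lower_is_better" and challenger[metric] > control[metric] * (1 + tolerance):
            return False
    return True

# B wins on signups but drops applies by 15%, so it fails the guardrail check.
print(passes_guardrails(metrics_a, metrics_b))  # False
```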

Eager to optimize your site to its full potential? Request a demo or contact your Account Manager today.

Dan is a product manager at Phenom People, where he focuses on creating engaging career sites. He enjoys staying active and competing in triathlons.
