15 Oct Follow the Numbers: A/B Testing for Best Performance
A friendly encounter with a real-life customer prompted a great testimonial. Well, it was great until we realized the testimonial was actually hurting overall website performance.
I approached him, eager to hear his impression of the brand. My enthusiasm was matched by his, and we quickly began a dialogue about the quality machines that Wright produces. I asked for an interview, snapped a few pictures, and recorded the whole conversation. I knew this story would be great on Wright’s website, so we put it in the header for all visitors to see.
This was going to be great. Except for one thing: It didn’t work.
What do you mean, “it didn’t work”? The testimonial actually ended up hurting the performance of the website: it was converting fewer visitors into leads. We use a tool for A/B testing called Optimizely. By running this test we were able to show two different versions of the site to our incoming traffic: one version with the testimonial and one without. We found that the version with the testimonial generated 18% fewer leads than the version without it. The loss was pretty significant.
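Optimizely does this math for you, but the idea behind “the loss was pretty significant” can be sketched with a standard two-proportion z-test. The visitor and lead counts below are hypothetical (the post doesn’t give raw numbers); they’re chosen only to illustrate an ~18% drop in conversion:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled rate under the null hypothesis that the variants convert equally.
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical traffic split: 5,000 visitors per variant,
# 250 leads without the testimonial vs. 205 with it (~18% fewer).
z = two_proportion_z(250, 5000, 205, 5000)
print(round(z, 2))  # |z| > 1.96 means significant at the 95% level
```

With these made-up numbers the statistic clears the usual 1.96 threshold, which is the kind of check an A/B tool runs before declaring a winner.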
So why didn’t it work? I think mainly because life happens. We had a great story that engaged readers, and frankly that engagement distracted them from converting into leads. We gave people more content than they could absorb, and eventually the dog needed to be let out, the kids were ready for dinner, or the game was about to start. They got lost on the site and never became leads. If we don’t keep potential customers moving through the site, life will happen and they’ll get pulled away from the flow of the page.
The bottom line is: We don’t really know why it didn’t work, and we don’t really need to. We tested two different versions (among thousands of users), and one of them didn’t perform as well. If the numbers tell me the testimonial didn’t work, then I trust the numbers.