I’m pulling back the curtain and showing you my 7-step process to increase conversion rates. And believe me, it IS a process. It’s not a quick fix where you do one thing and you’re done. But taking the time to do it right makes all the difference in the world.
People who simply learn the mechanics of running A/B or multivariate tests without following a process end up with meaningless tests, such as testing to see whether a green button or an orange button gets more clicks. Normally, the button color is NOT the site’s biggest problem, so this kind of test is often inconclusive.
And when that happens, people claim that A/B testing doesn’t work. But the problem is this: they only knew HOW to test, not WHAT to test.
95% of the work comes from figuring out what to test.
Learning a testing tool and running an A/B test is the easy part in comparison.
So I’m going to walk you through my own step-by-step process for figuring out what to test and show you a case study.
Step 1: Understand Your Site’s Visitors
Since you cannot be everything to everyone, you need to define exactly who your target market is. Do this by creating customer personas and sharing them with everyone involved in your marketing efforts: your SEO, Social Media Specialist, graphic designer, copywriter, web developer, PPC agency, and your CEO. Here’s my process for building customer personas.
If you don’t do this, you might as well write a love letter addressed “To Whom It May Concern.”
Step 2: Define What Your Site’s Visitors Are Trying To Do
Next, take your customer personas and keyword research and determine what people are trying to do when they come to your site. What problems are they trying to solve? What would make their life better? What action do they want to take, and what information do they need to see on your site in order to make a decision?
Step 3: Define What YOU Want Them To Do
Note: what your customers want to do is not always the same thing as what you want them to do. Your job, then, is to give them what they want and still get what you want. You must define the specific actions that you want them to take on your site, such as:
- Calling you
- Filling out a form
- Chatting with a sales rep
- Subscribing to your blog or newsletter
- Buying something directly from your site, etc.
Configure these as goals in analytics, so you can measure conversions.
Step 4: Find Out What’s Preventing Them from Completing the Goal(s)
When you configure your goals in analytics, you’ll be able to see WHERE your problem pages are, but not WHY. For example, the funnel report below from analytics shows us that 625 people came to the Education Page and 618 left without continuing on to the Webinars Page, which was the conversion goal. And all 7 people who did reach the Webinars Page continued the process and signed up for a webinar. This tells us to prioritize our efforts on the Education Page. Not on the Webinars Page. Not on getting people TO the Education Page. Just on getting more people from the Education Page to the Webinars Page.
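The funnel math above can be sketched in a few lines. This is a minimal illustration using the counts from the report (625 reached the Education Page, 7 continued to the Webinars Page, and all 7 signed up); the step names are just labels:

```python
# Funnel steps and the number of visitors who reached each one,
# taken from the analytics report described above.
funnel = [
    ("Education Page", 625),
    ("Webinars Page", 7),
    ("Webinar Signup", 7),
]

def continuation_rates(steps):
    """Rate at which visitors move from each step to the next."""
    return {
        name: next_count / count
        for (name, count), (_, next_count) in zip(steps, steps[1:])
    }

rates = continuation_rates(funnel)
# Education Page -> Webinars Page: 7/625, about 1.1%
# Webinars Page -> Signup: 7/7, i.e. 100%
```

The numbers make the priority obvious: the Webinars Page converts everyone who reaches it, so every bit of effort belongs on the Education Page leak.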
To understand WHY there’s a problem on that page, you’ll need to dig deeper with user data. Here are some examples of data that you can collect, which will give you a ton of insight without breaking the bank:
- User Testing: gives random, anonymous people tasks to complete on your site, such as finding a certain product and adding it to their shopping cart. They record their screen while clicking around your site and they explain where they’re getting confused or frustrated. You’ll be amazed at what you find! Something that looks so obvious to you, like your big orange button, might be completely missed by people who are new to your site. It’s like watching a horror movie where you yell at the girl not to go in the basement. You’ll want to yell at the screen recording and tell the person where your button is. But the tester can’t hear you. And more importantly, neither can the rest of your site’s visitors.
- Short Surveys: Wondering what people want to do on your site and whether they run into problems doing it? Ask them! Tastefully done, short surveys can give you a wealth of knowledge without annoying your site’s visitors.
- Mouse Click Maps: allow you to see what people are actually clicking on. You might learn that they think a certain element is a clickable link, but it’s not. So maybe it should be.
After collecting all this data from analytics and user feedback, your eyes will open to a whole heap of problems you never knew you had. You’ll probably want to jump in and make a slew of changes on your site to fix them.
Resist that temptation!
After all, you might end up fixing one problem and creating 2 more. Instead, go to step 5.
Step 5: Define What You Think Will Fix the Problem
This is an important step in the testing process. It requires articulating the problem and what you believe the solution to be. You’ll also want to document your expected outcome.
Step 6: Run the Test
A/B and multivariate testing used to be prohibitively expensive for all but Fortune 500 companies. Thankfully, this technology has now trickled down to all price and ability levels, so there’s no excuse not to test. Google Analytics even has a free testing tool called Content Experiments built right into the interface. On the other end are enterprise-level testing tools like Monetate or Adobe Test and Target. In between, there are several easy-to-use and affordable tools, such as Optimizely, Visual Website Optimizer, and Convert.
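Under the hood, these tools all do roughly the same thing: assign each visitor to a variant deterministically, so a returning visitor always sees the same version. A minimal sketch of that idea (the experiment name and visitor IDs here are made up for illustration, not taken from any particular tool):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "quote-form-test") -> str:
    """Hash a stable visitor ID into one of two buckets for a 50/50 split."""
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number from 0 to 99
    return "original" if bucket < 50 else "variation"

# The same visitor always lands in the same bucket:
assert assign_variant("visitor-123") == assign_variant("visitor-123")
```

Hashing on the experiment name as well as the visitor ID means each test gets its own independent split, so running multiple experiments doesn’t bias any one of them.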
Step 7: Repeat Steps 4 – 6
If you followed steps 1 – 6 above, you most likely had a winning test and increased your conversion rate.
Hooray! Buy a round of drinks and celebrate your success!
But when the hangover wears off, you’ll realize that you’re not done. After all, unless your site has a 100% conversion rate (I wouldn’t believe you if you said it did), you still have plenty of people leaving your site without converting.
So what do you do for them?
Go back to step 4 and repeat the process to uncover more problems, and come up with more ideas of how to fix them.
Case Study: See It in Action
Here’s a case study of how we figured out WHAT to test for a client, which resulted in a big win.
This was a B2B client whose primary goal was to have visitors fill out a Request a Quote form. Once submitted, leads went to the sales team.
This was their original Request a Quote Form:
A click map shows that people were reluctant to enter their personal information, but eager to share their needs.
Note: Google Analytics showed that most people’s screen sizes didn’t allow the “Solution Details” section to be seen without scrolling down the page.
An attention map shows that the blue header soaked up most of the attention. As a result, the form got hardly any!
Our alternate page.
Based on these and other findings, we created an alternate page that featured the following improvements:
- Asked the questions in the order people wanted to answer them.
- Asked fewer questions: only the ones the sales team actually needed.
- Removed the big header, which drew attention away from the form and pushed it too far down the page. Now people can see the entire form without scrolling.
- Added badges of the client’s credentials, along with logos and a testimonial, all meant to instill more trust and credibility.
Notice that this isn’t merely a new design. It reflects a better understanding of how visitors use the page and what factors influence conversions.
Now for the fun part.
We tested the two pages. Half the visitors saw the original version and half saw the new variation. We measured which one got more people to fill out the form.
Result: the new page showed an improvement of 70.93% over the original (at 97% statistical confidence). That means the sales team got 71% MORE LEADS. Booyah!
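For the curious, lift and statistical confidence figures like these come from a standard two-proportion z-test, which every testing tool runs for you behind the scenes. Here’s a sketch using only Python’s standard library; the visitor and conversion counts below are hypothetical, since the case study doesn’t publish the raw figures:

```python
from math import erf, sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (lift of B over A, two-sided confidence) from a pooled z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    confidence = erf(abs(z) / sqrt(2))  # two-sided confidence level
    return (p_b - p_a) / p_a, confidence

# Hypothetical counts: 86 of 1,000 visitors converted on the original,
# 147 of 1,000 on the variation.
lift, conf = two_proportion_z(86, 1000, 147, 1000)
```

With these made-up numbers the lift works out to roughly 71%, and the confidence is well above the threshold most tools use before declaring a winner.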
You can see that this is not a quick fix, and that most of the brainpower goes into finding your site’s problems by gathering and interpreting data, then defining what you think will fix them. Running an A/B or multivariate test is just a small part of the process.
Do you think we would have gotten a 71% lift if we had just tested a green button vs. an orange button?
Not a chance.