Testing for Increased Quantity of High-Quality Leads
Client: B2C lead-gen site in the luxury travel industry
Primary conversion goal: get high-quality leads to fill out a long form.
Background on qualifying leads: They had a two-part system for filtering leads for quality. First, automatic filtering kicked out the low-quality leads; then an employee manually scrubbed the remaining leads. All leads that passed both screenings were sent to the company’s travel partners: travel agents, tour operators, etc., who would compete to create the best itinerary for the traveler. The traveler would then choose which company to hire and pay them directly. Our client got paid on the back end.
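The two-stage screening can be pictured as a simple pipeline. This is only a sketch under invented assumptions: the case study doesn't describe the client's actual filtering rules, so the fields (`budget`, `approved_by_reviewer`) and thresholds here are hypothetical stand-ins.

```python
from dataclasses import dataclass

@dataclass
class Lead:
    email: str
    budget: int                       # hypothetical field: estimated trip budget (USD)
    approved_by_reviewer: bool = False  # hypothetical stand-in for the manual scrub

# Stage 1: automatic filtering knocks out obviously low-quality leads.
def auto_filter(leads):
    return [l for l in leads if "@" in l.email and l.budget >= 5000]

# Stage 2: manual scrub, simulated here by a reviewer-approval flag.
def manual_scrub(leads):
    return [l for l in leads if l.approved_by_reviewer]

leads = [
    Lead("a@example.com", budget=8000, approved_by_reviewer=True),
    Lead("bad-email", budget=9000, approved_by_reviewer=True),   # fails stage 1
    Lead("c@example.com", budget=1000, approved_by_reviewer=True),  # fails stage 1
]

# Only leads that survive both stages go on to the travel partners.
matched_candidates = manual_scrub(auto_filter(leads))
```

The point of the structure is that each stage only sees what the previous one passed, which is why optimizing raw form submissions alone can be misleading: extra submissions that die in either stage add no value.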
So they didn’t want to merely increase the number of visitors who filled out the form on their site; they wanted to increase the number of visitors who ended up getting matched to a travel partner. They labeled this group “matched leads”.
They had been running A/B tests for several years before hiring us. Although they had scored a few small wins, they had never found a winning variation on their most important page – the form – on desktop.

We used form analytics to learn that the question with the highest abandonment rate (44%) was “Stage in Planning”.


We also used best practices to make numerous changes to the form:
- Breaking up the long form into 4 pages
- Removing the colorful (and visually distracting) image, and changing the placement and format of the testimonials
- Removing links to news stories, which take the user off this critical page — right when they’re ready to convert
- Changing the headline to emphasize the traveler’s benefit, and matching it to the final CTA
- Removing all-caps text, which is slower to read
- Following rules of thumb for using radio buttons, dropdowns, text fields, and sliders, depending on quantity and format of answer choices
- Appropriate label alignment for faster readability

Because the goal was to increase matched leads rather than just leads, we tested with Google Optimize. The client had already created a custom goal in Google Analytics to measure matched leads, so Google Optimize could use matched leads as the experiment’s success metric.
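One common way to feed a back-office outcome like “matched lead” into a Google Analytics goal is to record it as an event hit via the (Universal Analytics) Measurement Protocol. The case study doesn't say how the client wired this up, so the sketch below is an assumption; the tracking ID, client ID, and event names are placeholders.

```python
from urllib.parse import urlencode

def matched_lead_payload(tracking_id, client_id):
    """Build a Measurement Protocol event payload marking a lead as matched."""
    return urlencode({
        "v": "1",            # protocol version
        "tid": tracking_id,  # GA property ID (placeholder)
        "cid": client_id,    # anonymous client ID tying the hit to the visitor
        "t": "event",
        "ec": "lead",        # event category (assumed name)
        "ea": "matched",     # event action (assumed name)
    })

# The payload would be POSTed to https://www.google-analytics.com/collect.
# A GA goal then counts these events, and Google Optimize can use that goal
# as the experiment objective.
payload = matched_lead_payload("UA-XXXXX-Y", "555")
```

The key design point is that the event fires when the lead is *matched*, not when the form is submitted, so the experiment optimizes the metric the client is actually paid on.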
If we had simply tested for the number of people filling out the form, our test would have been inconclusive: the new variation showed only a 1.21% improvement in form submissions, which wasn’t statistically significant.
However, measuring matched leads told a different story: the new variation produced 6.83% more matched leads, with 96.4% statistical significance.
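The contrast can be illustrated with a standard two-proportion z-test. The case study doesn't publish sample sizes, so the visitor and conversion counts below are invented to roughly reproduce the two lift percentages; with those made-up numbers, the 1.21% lift in submissions is nowhere near significant while the 6.83% lift in matched leads clears the usual threshold.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """One-tailed z-test for an increase in conversion rate (B over A)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))  # 1 - normal CDF
    return z, p_value

# Hypothetical traffic: 30,000 visitors per arm.
# Submissions: 8.26% vs 8.36%  -> a 1.21% relative lift.
z_subs, p_subs = two_proportion_z(2478, 30000, 2508, 30000)

# Matched leads: 4.10% vs 4.38% -> a 6.83% relative lift.
z_match, p_match = two_proportion_z(1230, 30000, 1314, 30000)
```

Because matched leads are rarer than submissions, the same traffic yields a noisier estimate, which is why the larger relative lift is what makes the matched-leads result significant while the submissions result is not.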
Theresa Baiocco-Farr
April 12, 2019