There's more to meaningful test results than running one A/B test after another. Success in conversion rate optimization (CRO) requires efficient test management built on a streamlined process for developing hypotheses.
Understanding your user is paramount: the user's friction points are what fuel idea and hypothesis generation. This is why Jorrin Quest, UX and conversion specialist at Online Dialogue, notes that "the gap between user-centered design — UX — and psychology-powered revenue-driven growth — CRO — is being bridged."
Building a partnership with your user experience team now for ongoing user research can streamline the testing process, deliver quicker and bigger wins, and reduce time and costs.
The reason Quest sees the gap closing is that the goals of the UX and optimization teams do not differ greatly. Optimizers inherently aim to improve key performance indicators (KPIs) focused on the business rather than the user. But to reach sustained growth in those KPIs, the user must be considered as well.
Therefore, understanding who the user is, what the user wants and in what context — exactly what a UX researcher works to identify — can expedite test success and tangible results.
Choosing the right UX research methods is half the battle
In the process of CRO, UX research should be proactive and address three goals:
- identify problem areas of the site
- identify potential reasons for site problems
- identify potential solutions to test
UX research has a variety of methods for better understanding the user, but determining the best research method or combination of methods is where the challenge lies. Often research methods are discussed in the framework of quantitative and qualitative data sets, or more plainly, as web analytics and voice of customer studies.
Quantitative methods
Web analytics, a more passive and indiscriminate route for user research, looks at aggregated views of the customer and can be a strong method for identifying the scope of an issue. It takes the form of session and user tracking, heat mapping of clicks or scrolling, and automated session recordings.
These tools are often an easy option for marketers and product designers because of their low cost and ease of implementation. Still, web analytics software offers little visibility into the behavioral and psychological factors behind the numbers it reports.
Abnormally low page-level conversion rates or increased exit rates may indicate a user frustration worth investigating. Additional data from a usability test or customer interview could define the problem at hand and potential solutions to test. Likewise, a heat map of user clicks may suggest a need to reprioritize content, but that hunch should be confirmed through conversations with real users and customers.
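To make the quantitative side concrete, here is a minimal sketch of how those page-level signals might be computed from exported session data. The `pages` and `converted` fields are hypothetical names used for illustration, not the schema of any particular analytics tool:

```python
from collections import defaultdict

def page_metrics(sessions):
    """Rough page-level exit and conversion rates from exported session data.

    Assumes each session is a dict with an ordered `pages` list and a
    `converted` flag; these are hypothetical field names for illustration.
    """
    seen = defaultdict(int)         # sessions that viewed the page
    exits = defaultdict(int)        # sessions whose last page view was this page
    conversions = defaultdict(int)  # converting sessions that viewed the page

    for session in sessions:
        for page in set(session["pages"]):
            seen[page] += 1
            if session["converted"]:
                conversions[page] += 1
        exits[session["pages"][-1]] += 1

    return {
        page: {
            "exit_rate": exits[page] / seen[page],
            "conversion_rate": conversions[page] / seen[page],
        }
        for page in seen
    }

# Toy data: a page with a high exit rate and a low conversion rate (here, /cart)
# becomes a candidate for qualitative follow-up such as a usability test.
sample = [
    {"pages": ["/", "/product", "/cart"], "converted": False},
    {"pages": ["/", "/product", "/cart", "/checkout"], "converted": True},
]
print(page_metrics(sample))
```

A script like this only flags where sessions stall; the why still comes from talking to users.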
Qualitative methods
Conversely, qualitative voice of customer studies more actively seek to uncover the customer's reasons behind product or site issues. Qualitative UX studies focus on understanding behavioral and psychological dynamics.
"With a better understanding of the user," says Chris Callaghan, UX & Optimisation Director at McCann Manchester, "tests are less about hoping your variation works and more about validating your architected variation, risk mitigation and ROI."
These research methods include attitudinal studies like interviews, diary studies or ethnography, and behavioral studies like eye tracking or physiological monitoring. The former helps verbalize the conflict between user and product, while the latter identifies the unconscious motivators behind decision-making.
But both seek to answer the why behind customers' willingness to convert or their stressors inhibiting it.
Considering both types of studies in UX research is valuable for comprehensively understanding the mentality of the user. Distinguishing the behavioral and attitudinal variants in a tool set is important because a UX researcher must also consider how a user's vocalized response may differ from their physiological response. An increased heart rate might not elicit verbalized feedback during a usability test, but it may still signal a notable moment of stress that points to a friction point in the conversion funnel.
However, five minutes spent searching Google for conversion optimization methods will show that qualitative research doesn't dominate the CRO space. It is overshadowed by arbitrary tests and oversimplified metrics like bounce rates. Qualitative research, and thorough UX studies in general, are pricier to implement and require more technical experience.
Understanding the user expedites results
Pairing both types of UX research is key to understanding the scope of a customer's problems. With aggregated metrics that define a problem's scope and individual data that explains why the issue exists, testing becomes a matter of confirming results rather than hoping for them. For an optimizer focused on lean processes, UX research streamlines the testing pipeline toward more tailored and meaningful hypotheses.
Dominic Hurst, digital transformation consultant at Valtech, defines research as the catalyst to a sound hypothesis and assures that in A/B testing, UX drives quality over quantity.
"User research, for me, is a fundamental step before optimization," Hurst says. "If you have no logic to why you're doing the tests, then it's time potentially wasted."
When optimizers get ready to test, the result they ultimately want is to relieve a user's pain point and, in turn, improve their KPI — whether it's a conversion rate, number of qualified leads or revenue. Entering the testing phase with data that creates a smarter, more focused test reduces the chance of testing failure or retesting and helps accomplish goals more quickly.
Ultimately, this speed is key to reducing waste and becoming a more efficient optimizer.
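To make "testing failure" concrete, here is a minimal sketch of the kind of significance check that typically decides whether a variation's lift is real or just noise. The two-proportion z-test and the numbers plugged into it are illustrative assumptions, not the built-in method of any particular testing platform:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided z-test for the difference between two conversion rates."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / std_err
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return rate_b - rate_a, p_value

# Made-up numbers: control converts 200 of 5,000 visitors, variation 245 of 5,000.
lift, p = two_proportion_z_test(200, 5000, 245, 5000)
print(f"absolute lift: {lift:.3%}, p-value: {p:.3f}")
```

A variation aimed at a pain point users actually reported is more likely to clear a check like this on the first run, which is where the time and cost savings come from.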
Case in point: Els Aerts, managing partner at AGConsult, knows a better user experience can't be built with best practices alone.
"We saw a great example of this on an e-commerce client's site that sells sunglasses and watches," Aerts said. "There was a lot of drop-off on the cart overview page. We fixed everything that was objectively wrong with the page: removed the clutter, emphasized the USPs, made the payment options clear — you know, all the standard stuff.
"We ran an A/B test, and it did nothing: zero difference. Because that wasn't what was holding people back from buying. A targeted exit survey where we literally asked people that question, 'What's holding you back from buying from us today?' taught us people weren't buying because they couldn't choose.
"This brand had so many different cool sunglasses and watches people were struck by choice paralysis. We added a single line of copy to the cart overview page to take away that fear: 'You've made an excellent choice.' We ran another A/B test, and sure enough, we saw sales go up. Stuff like that you can only get from user research."
The contributions of a UX team to optimizers are highly beneficial but often get overlooked. User research can help define a CRO's next test or determine its test variants.
Ultimately, it supplies a better understanding of the individuals whose problems your product solves and makes it easier to help them get the most out of it.