Save thousands of euros with UX research by getting people to do tests in exchange for chocolate
What do you do when you need to decide between two design solutions? A/B testing is probably one of the first research methods that comes to mind. But in some cases a poorly prepared A/B test can lose you thousands of crispy euros every day.
Liligo is a search engine that specialises in travel product comparisons, allowing millions of users to find the cheapest price for their journeys. We are currently working with Liligo to design an app interface that makes it easier for users to choose from hundreds of flight options and motivates them to buy tickets on Liligo's partners' sites.
We all know that choice can be an attractive thing, but as Sheena Iyengar has discussed, when users are faced with too many options they don't know what to do, and are burdened by the responsibility of distinguishing good decisions from bad.
An abundance of choice does not necessarily lead to greater motivation, especially when making a truly informed comparison between the alternatives demands substantial time and effort.
So our UX company set out to improve Liligo's business performance with a transparent option list that gives users enough information to decide, and in turn raises the conversion rate. In this case study I concentrate on just one phase of the user journey: the moment when users have already chosen their trip and need a clear summary of their selection.
This is what the original version looked like:
We always create 2 or 3 different solutions for a problem and test them on real people.
Version A (tabs): We separated the outbound and return travel information into two tabs.
Version B (scrolling): We put the outbound and return travel information on the same page and let users scroll up and down.
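To make the two directions concrete, here is a minimal sketch of both variants, assuming a React/TypeScript prototype; the component and type names (TripSummaryTabs, TripSummaryScroll, Leg) are illustrative, not Liligo's actual code.

```tsx
// A minimal sketch of the two prototype variants. All names here are
// illustrative assumptions, not Liligo's production code.
import React, { useState } from "react";

type Leg = {
  label: string;     // "Outbound" or "Return"
  departure: string; // display string for departure airport and time
  arrival: string;   // display string for arrival airport and time
  airline: string;
};

// Version A: the outbound and return information sit behind two tabs.
export function TripSummaryTabs({ outbound, inbound }: { outbound: Leg; inbound: Leg }) {
  const [active, setActive] = useState<"outbound" | "return">("outbound");
  const leg = active === "outbound" ? outbound : inbound;
  return (
    <div>
      <button onClick={() => setActive("outbound")}>Outbound</button>
      <button onClick={() => setActive("return")}>Return</button>
      <LegDetails leg={leg} />
    </div>
  );
}

// Version B: both legs on a single page; the user simply scrolls.
export function TripSummaryScroll({ outbound, inbound }: { outbound: Leg; inbound: Leg }) {
  return (
    <div style={{ maxHeight: "100vh", overflowY: "auto" }}>
      <LegDetails leg={outbound} />
      <LegDetails leg={inbound} />
    </div>
  );
}

function LegDetails({ leg }: { leg: Leg }) {
  return (
    <section>
      <h3>{leg.label}</h3>
      <p>{leg.departure} to {leg.arrival}, {leg.airline}</p>
    </section>
  );
}
```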
Both versions looked good to us, but we wanted to know which worked better for users. As I am sure you will be aware, a two-week A/B test on millions of users can lose thousands of euros if one version does not work well and users cannot find what they are looking for. We had to find a quick and cheap way to make sure the versions we put into the A/B test were appropriate.
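For context, a live A/B test means splitting real traffic between the two versions, so every user who lands in the weaker variant is a potential lost booking. Here is a minimal sketch of how such a traffic split is typically implemented, assuming each user has a stable ID; the hash function and the 50/50 split are illustrative choices, not Liligo's actual setup.

```ts
// A minimal sketch of deterministic A/B bucketing. The hash and the
// 50/50 split are illustrative assumptions, not Liligo's real setup.
type Variant = "A_tabs" | "B_scroll";

// Simple djb2-style string hash; any stable hash works here.
function hashId(id: string): number {
  let h = 5381;
  for (let i = 0; i < id.length; i++) {
    h = (h * 33) ^ id.charCodeAt(i);
  }
  return h >>> 0; // force an unsigned 32-bit result
}

// The same user always gets the same bucket, so their experience stays
// consistent for the whole duration of the test.
export function assignVariant(userId: string): Variant {
  return hashId(userId) % 2 === 0 ? "A_tabs" : "B_scroll";
}
```

The point is that once such a test is live, half of the real traffic is exposed to whichever variant turns out to be weaker, and that is exactly the money we wanted to avoid burning on an untested design.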
Guerrilla tests proved to be the best solution in this case because:
- the product was not too specific,
- the test took no longer than 3 minutes,
- and we could ask a lot of people at the same time.
So we left our safe and cozy office, bought the cheapest ticket we could find and went to Budapest Airport to do the test on real travellers who were waiting for their flights. We also visited a busy shopping centre.
We observed that with the tab version, users didn't instinctively tap the return tab. They tried to scroll down or swipe back and forth to find the return information, and some of them didn't find it at all. We concluded that the winner was Version B, with its long, scrollable page.
Our takeaway? Think twice about which versions you test on your users! Instead of a long testing period that risks money and plays with users' patience, we spent only two days on user testing (and it cost us only some chocolate) to get the feedback we needed to make a usable app.
You can read more about the best UX research methods in one of our previous blog posts.