Nice to have or need to have
If you want to work out what someone thinks about something, you start by asking them what they think. Quantitative and qualitative data can help you understand how people feel about a particular policy, program, paint colour, or trans-Tasman argument over who invented pavlova…
But when you’re asking people what they think, how do you know they understand your questions?
Key innovation principles
- take opportunities when they arise
- test your assumptions with actual users — the results are often surprising
Enforced remote working gave researchers from the Department of Agriculture, Water and the Environment an excuse to try a new way of testing their survey questions on the general public.
Cale Hubble and Kate Boucaut from the department’s Behavioural Analysis team were working on developing insights for the National Waste Policy. They wanted to find out the public’s attitude towards supermarket products made with recycled content or recycled packaging.
Little specific research existed, so they designed a survey to fill the gap. Cale explains that after creating the survey, they’d normally ask a few colleagues if they understood the questions.
‘The usual way to test a survey is to run the questions past a few mock participants in the department to make sure they work well. This gives us assurance that the questions make sense and that people are answering them in the “right way”,’ says Cale.
But COVID-19 put any in-person testing out of the question.
So they tried a different testing method, one that had previously been considered above and beyond.
Nice to have
Enter stage left — cognitive interviewing online.
Cognitive interviewing is a way of trialling the quality of the questions, not just whether they make sense. Interviewers guide survey testers through the questions, asking them to explain the thought processes behind why they’re answering in a particular way.
This then gives the survey designers, like Cale and Kate, insight into whether or not people actually understand what the survey is asking them.
The pair said they’d always wanted to try the method, but had skipped the extra step because of a lack of funding and resources.
Low risk pilot
Enter stage right — a trial to trial things.
Fortuitously, while designing the survey, the team got the opportunity to test out a beta video-interview function being run by Askable, a Brisbane-based research company.
‘The company provides survey testers plucked from the general public, something we’ve always struggled to access. The testers are usually used for testing products before they hit the market, but with the new video feature we thought it was worth exploring for our purposes,’ explains Cale.
Kate says they took advantage of a trial credit they had received at a conference (pre-pandemic).
‘It was the perfect storm really. We needed to get this survey out and it’s difficult to access a wide range of survey participants at the best of times, let alone during a pandemic. The free credits were about to run out and, with a bit of time up our sleeve, we just thought, why not?’ explains Kate.
‘There was no stress on the budget or time pressures with a procurement process, since it was free. With virtually no risk, it seemed like a no-brainer.’
Not only were Cale and Kate moving their whole process online, they were also piloting the online service, trialling the company’s beta version video function, and giving cognitive interviewing a go for the first time.
The pair say while it would have been nice to test out the new ways of working one at a time, rather than all at once, the experiment was a total success.
‘We’d always had this gut feeling we should try cognitive interviewing,’ says Cale. ‘It plays a really powerful role in improving the quality of your data and legitimising it, and the service allowed us to test the user experience and question our assumptions.’
Need to have
The cognitive testing results led to an epiphany.
‘Our assumptions were completely busted. We expected people to think like we do and to understand the recycling terms we use. But this wasn’t the case — they were interpreting our questions differently,’ says Cale.
In any survey, misunderstandings can potentially skew the dataset.
‘We looked at the project from a completely different perspective after the results came back,’ says Kate.
‘We’ll definitely continue with cognitive testing after this. Until we were able to test it out, we didn’t even realise we weren’t happy with the veracity of the old method. We now think of it as old school.’
Cale and Kate’s advice to others is: ‘Don’t be scared of new ideas or new technology — just give it a go.’
And you might just find out that nice-to-have was necessary all along.
- Read our case study on hosting a virtual cafe
- Read our case study on an online workshop by the BizLab team
- See more public sector innovation case studies