Response rates in Australian market research


Author(s): Bednall, David; Spiers, Micaela; Ringer, Allison; Vocino, Andrea
Date(s)

01/08/2013

Abstract

RICA commissioned Deakin University to "establish whether response rates are in decline in the Australian market research industry and to identify, as far as possible, the reasons for these declines if they exist. This is likely to involve a review of previous research, a literature review and collection of data on response rates provided on a confidential basis and with the assistance of AMSRO to facilitate data provision."

Attempts were made to contact all listed market research companies in Australia, including all major internet panel companies. Although industry co-operation with the study was not high, sufficient data were provided to depict current response rates and to show how they have declined over time. Because of low contactability, this Report proposes better methods for computing the reliability of survey estimates, methods that take account of past survey results.

The literature review revealed a wealth of recent studies, with the main emphasis on telephone and internet surveys. This review produced 34 evidence-based guidelines for social researchers. While some of these reflect current practice, the emergence of the internet as the main survey method raises a number of disclosure and sampling issues. Esomar (2012) has produced 28 issues to be raised with providers, which sets the basis for good industry practice. This suggests an opportunity for the industry to adopt these standards as its own and to conduct training courses for major clients and suppliers. There are many panel providers, some of whom are not AMSRO members. AMSRO may need to examine what role it can play in mandating or encouraging adherence to these standards as a way of promoting the industry.

Talks with key industry people, as well as the literature, have revealed the importance of blended surveys, in which multiple contact and response mechanisms are used.
Particularly where an internet panel is used as one source, this poses representativeness and weighting issues that are difficult to resolve. The Report recommends that where blended survey methods are used, steps be taken to measure contactability in the other contact media, along with more sophisticated weighting schemes. The industry should examine its training courses to ensure that its expertise keeps pace with these developments.

Summary of Results

The results focus on two main collection methods: the telephone and the internet. As far as the telephone is concerned, response rates have been in gradual decline over the last decade. This trend is hard to detect because the data show considerable fluctuation from one survey wave to the next. Among cold-calling surveys, telephone response rates are typically below 10% across a range of topics and survey types. Co-operation rates (the ratio of obtained interviews to refusals) are typically below 0.2, that is, below one interview to five refusals. Telephone interviews with clients have a higher response rate, typically above 20%, with co-operation rates above 1.0. Some topics, such as financial services, appear to induce a lower level of co-operation. Government-sponsored surveys have higher response rates, at times over 50%, but even here a sharp decline over time was observed for one long-running monitor. Co-operation rates were also higher in government-sponsored surveys.

One long data series from a telephone omnibus suggested that the "Do Not Call Register", which began in May 2007, had some positive effects for the industry. Initially there was a spike in both response and co-operation rates. Although this was relatively short-lived, response rates thereafter declined more slowly, and co-operation rates were somewhat higher and remained stable.
These conclusions should be regarded as tentative, as more data series would be needed to establish whether similar trends occur elsewhere.

As far as the internet is concerned, panel response rates are around the 20% mark and appear to have been relatively stable over the last few years. Here the gross response rate is the number of interviews divided by the number of invitations sent. As the number of invitations may reflect the need to fill a survey quickly, this should be considered a gross indicator of response. To capture this phenomenon, a further measure has been devised, termed the "attempt rate", which is the percentage of invitees who attempt to participate once sent the invitation. The available data suggest that it is relatively stable, although it is somewhat sensitive to how long the survey is left open. Finally, a co-operation rate was also calculated: the ratio of completed to terminated interviews, typically at least five interviews to each termination, but often much higher. This measure is not directly comparable with the co-operation rate in telephone surveys because it cannot account for eligible panellists who open the invitation, see which company is running the survey or how long it is, and decide not to take part. For internet client studies, response rates were typically somewhat higher than for the panels, but with marked variability.

There was only one study of intercept interviews provided. It showed response rates of over 60% and co-operation rates of nearly two interviews per refusal. A strength, and a weakness, of intercept interviewing is the ability to be selective about who is asked to participate. As for mail, one government-sponsored mail survey from 2010 is reported, with a response rate over 50%.
The previous review contains more data, as mail appears to be infrequently used within the industry for commercial surveys.

While surveys remain a major and highly effective tool for the industry and its clients, issues with contactability and co-operation mean that even closer attention to survey design, sampling, weighting and analysis is needed than was previously the case.
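The rate definitions used in the abstract are simple ratios. The sketch below restates them as code; the function names and the worked figures are illustrative assumptions of mine, not data from the Report — only the formulas and the benchmark thresholds (response below 10%, co-operation below 0.2, panel response around 20%, at least five completes per termination) come from the text above.

```python
# Illustrative restatement of the rate definitions in the abstract.
# Function names and the numbers below are hypothetical examples;
# only the formulas follow the Report's definitions.

def response_rate(interviews: int, contacts: int) -> float:
    """Completed interviews as a share of people contacted or invited."""
    return interviews / contacts

def cooperation_rate(interviews: int, refusals: int) -> float:
    """Telephone co-operation: obtained interviews per refusal."""
    return interviews / refusals

def attempt_rate(attempts: int, invitations: int) -> float:
    """Internet panels: share of invitees who attempt the survey."""
    return attempts / invitations

def panel_cooperation_rate(completes: int, terminations: int) -> float:
    """Internet co-operation: completed per terminated interview."""
    return completes / terminations

# A hypothetical cold-calling telephone survey: 80 interviews from
# 1,000 contacts, with 500 refusals.
print(response_rate(80, 1000))          # 0.08 -> below the 10% mark
print(cooperation_rate(80, 500))        # 0.16 -> below 0.2 (under 1 interview per 5 refusals)

# A hypothetical internet panel: 2,000 invitations, 500 attempts,
# 400 completes, 50 terminations.
print(response_rate(400, 2000))         # 0.2  -> around the 20% mark
print(attempt_rate(500, 2000))          # 0.25
print(panel_cooperation_rate(400, 50))  # 8.0  -> well above 5 completes per termination
```

As the abstract notes, the telephone and panel co-operation rates are not directly comparable: the panel denominator counts only people who started and then abandoned the survey, not invitees who silently declined.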

Identifier

http://hdl.handle.net/10536/DRO/DU:30065463

Language(s)

eng

Publisher

Deakin University, School of Management and Marketing

Relation

http://dro.deakin.edu.au/eserv/DU:30065463/bednall-researchstatement-2013.pdf

http://dro.deakin.edu.au/eserv/DU:30065463/bednall-responserates-evid-2013.pdf

http://dro.deakin.edu.au/eserv/DU:30065463/bednall-responserates-post-2013.pdf

Type

Report