A comparison of general practitioner response rates to electronic and postal surveys in the setting of the National STI Prevention Program

In May 2009, the Australian Government launched its National Sexually Transmissible Infections (STI) Prevention Program (NSPP). The program aimed to promote safe sex and to encourage young people aged 16–29 years to see their doctor about appropriate STI testing. The major component of the NSPP was a national social marketing campaign, the first wave of which ran for one month in June 2009. To support general practitioners (GPs) during the period of campaign activity, a number of resources were developed. These resources were designed to inform GPs about the campaign, provide information about STIs, advise on appropriate management and support patient communication. Following the first wave of campaign activity, an evaluation of the GP support materials was undertaken. The evaluation took the form of a questionnaire delivered both online and by post.

Very little Australian research has been published to indicate the best method of recruiting GPs for evaluation work in the field of sexual health. Temple‐Smith et al (1998) described methods to maximise response rates to a postal survey about chlamydia; however, no published work describes response rates to online surveys on this topic. A Cochrane review of randomised controlled trials, including 56 studies targeting GPs or other medical practitioners, identified numerous methods of increasing response rates for both postal and electronic surveys. For postal surveys these included follow‐up, assurances of confidentiality and monetary incentives; for online surveys, non‐monetary incentives were more effective. For both delivery methods, personalised questionnaires, interesting topics and shorter surveys were all beneficial. This paper discusses the response rates for each delivery method in the setting of the NSPP and their implications for future evaluations of similar support materials.
Methods

A survey was developed to evaluate the support materials for GPs that formed part of the NSPP. It was based on similar surveys used in recent Australian studies of chlamydia testing in general practice. The final survey was trialled with a small number of peers and colleagues (n=4), some of whom work in general practice. Two survey delivery methods were used: online and postal.

The online version targeted all practising GPs in Australia (approximately 20,000). The survey was created in Survey Monkey and GPs were directed to it through a link included in one of two electronic GP newsletters. The first was the Australian General Practice Network (AGPN) weekly newsletter to the Divisions of general practice, sent during August 2009 and circulated via email to member GPs. The second was the RACGP weekly newsletter, Friday Facts, sent during October 2009; this newsletter is emailed to GPs and is also available to read online on the RACGP website. At the time of recruitment, around 20,000 individual contacts received the RACGP newsletter: almost 17,000 via email and around 3,500 via fax.

The postal version of the survey was sent to 509 GPs across Australia. It was distributed by National Mailing and Marketing, with GP contact details provided by the Department of Health and Ageing (DoHA). The GP sample was chosen by a process designed to capture a range of demographic attributes: all GPs (approximately 23,000) were listed according to MBS benefits activity and every 46th GP on the list was selected, giving a final sample of 509. This process provided a relatively random selection of practitioners; however, some elements of bias must be considered.
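The every-46th-GP selection described above is a standard systematic sample. A minimal sketch, using stand-in data (the actual MBS activity-ordered list was not available, so the list contents and size here are assumptions):

```python
def systematic_sample(ordered_list, step=46):
    """Select every `step`-th entry from an ordered list (systematic sampling).

    Starting from the first entry, this yields roughly len(ordered_list) / step
    selections, as in the every-46th-GP process described above.
    """
    return ordered_list[::step]

# Stand-in for the ~23,000 GPs ordered by MBS benefits activity.
gps = [f"GP-{i:05d}" for i in range(23_000)]

sample = systematic_sample(gps)
print(len(sample))  # 23,000 / 46 -> a sample of 500
```

The paper's final sample of 509 suggests the real list was slightly longer than 23,000, or the count started from a different offset; the mechanism is the same either way.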
Researchers were not given access to the GP contact details due to the confidentiality requirements of the DoHA data. This meant that follow‐up of the initial mail‐out was not possible. Surveys were sent with a return-addressed envelope; however, postage was not prepaid. The mail‐out took place during the first week of November 2009 and respondents were given up to two months to return completed questionnaires. Goodness‐of‐fit χ² calculations were used to compare the demographic characteristics of respondents with the Australian GP population as a whole.

Results

Of the roughly 20,000 GPs targeted by the online survey, a total of 20 responses were received (response rate <0.1%): eight (0.04%) following the AGPN newsletter wave and a further 12 (0.06%) following the RACGP newsletter wave. Given the very low response rate, further statistical analysis was not performed on this data set. A total of 63 postal surveys were returned of the 509 sent out (response rate 12.4%). One survey was excluded from further analysis as the respondent did not work in general practice. There was a good general spread in the demographic characteristics of responders to the postal survey, roughly representative of GPs across Australia (Table 1). Almost one-third of responders (31%) were very interested in sexual health.

Table 1: Demographic characteristics of respondents.
Demographic                 Survey % (n)    National %*
Sex
  Male                      50 (31)         62
  Female                    45 (28)         38
  No answer                 (3)
Age
  31–35                     3 (2)
  36–40                     16 (10)
  41–45                     3 (2)
  46–50                     21 (13)
  51–55                     15 (9)
  56–60                     16 (10)
  61–65                     11 (7)
  65+                       15 (9)
  (national average age 49.8 years)
State/territory
  ACT                       3 (2)           2
  NSW                       23 (14)         30
  NT                        3 (2)           1
  QLD                       23 (14)         18
  SA                        3 (2)           8
  TAS                       5 (3)           3
  VIC                       24 (15)         26
  WA                        16 (10)         12
Sessions per week
  1–2                       3 (2)
  3–4                       19 (12)
  5–6                       15 (9)
  7–8                       31 (19)
  9–10                      27 (17)
  10+                       5 (3)
  (survey average 7.2; national average 8.6)
Region
  Metropolitan              68 (42)         69
  Regional                  19 (12)         29
  Rural                     13 (8)          2
* Data from the Australian Institute of Health and Welfare.

Goodness‐of‐fit χ² analysis showed little in the way of statistically significant difference between the responders and the national GP population (Table 2). It is worth noting that when rural and regional practitioners are combined, the study sample reflects the national population even more closely.

Table 2: Goodness‐of‐fit χ² calculations comparing the study sample to the national population of GPs.

Demographic                                 χ²       p value
Sex                                         2.24     0.13
State/territory                             7.78     0.35
Region                                      38.86    <0.01
Region (metropolitan vs rural/regional)     0.05     0.83

Discussion

In 2008, this journal published a brief report by Aitken et al that considered the use of an online questionnaire to survey medical practitioners. They described both their own poor response rate (8.7%) and numerous other studies that achieved better response rates from postal surveys than from online surveys. As such, similar results were expected here. Given the potential to reach all GPs in Australia (around 20,000), it was felt that the actual number of respondents to the electronic survey would still be very high and would therefore provide useful information to aid the evaluation of the campaign. What was unexpected, however, was the very low response rate seen for both methods of recruitment.
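The final row of Table 2 can be reproduced from the Table 1 counts with a standard goodness-of-fit calculation. A minimal stdlib-only sketch (for one degree of freedom, the χ² p value equals erfc(√(x/2)), so no statistics library is needed):

```python
import math

def chi2_gof(observed, expected_props):
    """Goodness-of-fit chi-squared statistic for observed counts against
    expected proportions (here, the national AIHW percentages)."""
    n = sum(observed)
    expected = [p * n for p in expected_props]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def chi2_sf_df1(x):
    """Survival function (p value) of the chi-squared distribution with
    one degree of freedom: P(X > x) = erfc(sqrt(x / 2))."""
    return math.erfc(math.sqrt(x / 2))

# Table 1: 42 metropolitan respondents, 12 + 8 = 20 rural/regional;
# national split 69% metropolitan, 31% rural/regional.
stat = chi2_gof([42, 20], [0.69, 0.31])
p = chi2_sf_df1(stat)
print(round(stat, 2), round(p, 2))  # 0.05 0.83, matching Table 2
```

The same function applied to the three-category region split (df=2, requiring a general χ² survival function such as `scipy.stats.chi2.sf`) reproduces the significant 38.86 statistic driven by the over-representation of rural respondents.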
The extremely low response of around 0.1% for the online survey was much lower than in other studies using similar methods. There is little published literature that specifically looks at the use of online surveys to evaluate support materials for GPs. Other online surveys of medical practitioners often use personalised recruitment methods via email or letter and have achieved better response rates (Panettieri et al, 2009: 15%; Aitken et al, 2007: 8.7%). This more personalised recruitment is likely to have been more effective than our approach of placing a general recruitment call in a widely circulated electronic newsletter, a conclusion supported by the Cochrane review mentioned above. It is perhaps the impersonal nature of these electronic newsletters, as well as the inclusion of numerous unrelated news items, that was detrimental to the electronic recruitment seen here. Many GPs are too busy to read a weekly newsletter and would not have been aware of the online survey. Previous Australian postal surveys of GPs examining their knowledge of and attitudes towards chlamydia have had relatively good response rates. The study mentioned in the introduction by Temple‐Smith et al (1998) had an impressive response rate of 85%. This followed an intensive process that included two telephone prompts and a follow‐up letter, along with incentives in the form of QA&CPD points, entry into a prize draw and a reply-paid envelope. Hocking et al (2006) compared GP response rates to telephone interviews with response rates to a postal survey for a questionnaire about chlamydia. Their postal survey also included two follow‐up rounds and incentives. Its response rate of 59.9% was lower than that seen by Temple‐Smith et al but still higher than the response rate to their telephone interviews (40%).
This decline in response rate over an eight‐year period is perhaps a symptom of the increased workload our GPs are experiencing, along with an increasing number of requests to complete surveys as part of an ever‐growing research culture. Further to this, our response rate no doubt suffered from time and financial restrictions that meant we were unable to offer incentives such as a postage-paid envelope or QA&CPD points. GP contact details were obtained from DoHA as this was efficient and incurred no additional cost. It did, however, prevent any follow‐up of the initial recruitment, and this would also have significantly affected response rates. Other, more open sources of GP contact details should be considered in the future. The initial sampling of the 509 GPs we approached was designed to produce a balanced cross‐section of the GP population. As this was based on MBS benefits activity, the possibility of selection bias may have had some impact on the sample. Fortunately, despite a response rate of only 12.4%, the demographic characteristics of our respondents were broadly similar to those of the Australian GP population as a whole, so further analysis of the survey responses is likely to bear some relation to the total population. It is important to note, however, that around one-third of respondents were very interested in sexual health. Those with a strong interest in sexual health would have been more likely to respond to our survey, and this may have influenced the outcome of the evaluation. An important factor to consider in the overall low response rates for this study was the emergence in 2009 of the H1N1 influenza pandemic. This pandemic placed considerable strain on GPs across Australia and may have contributed to the poor responses seen here. The pandemic was at its peak in July and August 2009, during the first wave of recruitment for the online survey.
It is possible that delaying distribution of the postal survey until November 2009 played a part in the vast difference in response rates between the two methods.

Conclusion

In the setting of an evaluation of support materials for GPs on the subject of sexual health, postal surveys are likely to achieve a much better response rate than online surveys. This may in part be due to the more personal nature of receiving a letter in the post compared with a request placed among the numerous items of a weekly newsletter. Furthermore, this study highlights the need for further rounds of follow‐up after initial recruitment, and the benefit of incentives, if response rates to postal surveys are to be maximised. External factors, such as the H1N1 pandemic, may affect GPs' willingness to spend time completing surveys. Finally, it remains to be seen whether GPs are growing tired of requests to complete surveys and what impact this may have on future research in all fields of public health.



Publisher
Wiley
Copyright
© 2011 The Authors. ANZJPH © 2011 Public Health Association of Australia
ISSN
1326-0200
eISSN
1753-6405
DOI
10.1111/j.1753-6405.2011.00687.x
PMID
21463418


Journal
Australian and New Zealand Journal of Public Health (Wiley)

Published: Apr 1, 2011
