Listly by Erika Yigzaw
Surveys are required for accreditation and are an integral part of any Institutional Research and Improvement Plan. However, there is a big gap between a good survey and a bad one. Many people tasked with creating surveys have no formal training and are unaware of the many best practices for designing a good, useful survey that will generate high response rates. Here is a collection of my favorite tools and references for anyone putting together a survey, particularly in an education setting. Please vote!
WebSM is a leading global site for web survey methodology, online survey software, internet-mediated research, and a bibliography of e-social science methods.
S. R. Porter, "Raising Response Rates: What Works," New Directions for Institutional Research, No. 121, Spring 2004, Wiley Periodicals.
www.atn.edu.au/docs/Raising%20Response%20Rates.pdf
Survey Research Methods Section of the American Statistical Association
The Journal of Official Statistics publishes articles on statistical methodology and theory, with an emphasis on applications. Published by Statistics Sweden.
Response Rates in Organizational Science, 1995–2008: A Meta-analytic Review and Guidelines for Survey Researchers. J Bus Psychol (2010) 25:335–349.
Abstract
Purpose
This study expands upon existing knowledge of response rates by conducting a large-scale quantitative review of published response rates. This allowed a fine-grained comparison of response rates across respondent groups. Other unique features of this study are the analysis of response enhancing techniques across respondent groups and response rate trends over time. In order to aid researchers in designing surveys, we provide expected response rate percentiles for different survey modalities.
Design
We analyzed 2,037 surveys, covering 1,251,651 individual respondents, published in 12 journals in I/O Psychology, Management, and Marketing during the period 1995–2008. Expected response rate levels were summarized for different types of respondents and use of response enhancing techniques was coded for each study.
Findings
First, differences in mean response rate were found across respondent types with the lowest response rates reported for executive respondents and the highest for non-working respondents and non-managerial employees. Second, moderator analyses suggested that the effectiveness of response enhancing techniques was dependent on type of respondents. Evidence for differential prediction across respondent type was found for incentives, salience, identification numbers, sponsorship, and administration mode. When controlling for increased use of response enhancing techniques, a small decline in response rates over time was found.
Implications
Our findings suggest that existing guidelines for designing effective survey research may not always offer the most accurate information available. Survey researchers should be aware that they may obtain lower or higher response rates depending on the respondent type surveyed, and that some response enhancing techniques may be less or more effective in specific samples.
Originality/value
This study, analyzing the largest set of published response rates to date, offers the first evidence for different response rates and differential functioning of response enhancing techniques across respondent types.
Methods for Improving Response Rates in Two-Phase Mail Surveys
Development and Validation of an Instrument for Assessing Distance Education Learning Environments in Higher Education: The Distance Education Learning Environments Survey (DELES)