
E-research - selected issues concerning Internet-based attitude and opinion research

Konrad Kulikowski

Internet research on attitudes and opinions has become extremely popular, not only among researchers but also in business and education. This is mainly due to society's increased access to the Internet and the availability of affordable, easy-to-use online survey tools. It is therefore important to summarise contemporary knowledge about creating and conducting research in the online environment and to present clear recommendations and best practices. The author of this article tries to answer five common questions that arise in online research practice: How can the number of completed questionnaires be maximised? How should people be invited to participate in an Internet study? Should suggestive questions be used? How can non-response error be minimised? How should questions about sensitive issues be asked? Additionally, the importance of pilot studies is highlighted. To find valid and accurate answers to these questions, a review was conducted of the scientific literature reporting the results of empirical studies on Internet research methodology. This review made it possible to put forward specific recommendations, as well as heuristic guidelines for Internet research supported by empirical findings. This paper can thus contribute to improving the quality of data collected in studies conducted via the Internet. The final result of online research depends on a number of factors appearing at the stages of creating, distributing, filling out and returning the questionnaire. Although there is no simple way to predict and solve all possible problems, following the recommendations presented in this article may significantly reduce the number of potential issues in online studies.
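One concrete answer to the sensitive-questions problem discussed in the article is the randomized response technique (RRT), covered by several of the works listed in the bibliography (e.g. Coutts and Jann; Lensvelt-Mulders et al.). As an illustrative sketch only, and not part of the original article, the forced-response variant of RRT can be simulated as follows; all function names and parameter values here are hypothetical choices for the demonstration.

```python
import random

def simulate_rrt(true_prevalence, n, p_truth=0.7, seed=42):
    """Simulate the forced-response randomized response design.

    Each respondent privately operates a randomizer: with probability
    p_truth they answer the sensitive question truthfully; otherwise
    they are forced to answer "yes" regardless of their true status.
    No individual answer therefore reveals whether the respondent
    actually has the sensitive trait, which encourages honesty.
    """
    rng = random.Random(seed)
    yes_count = 0
    for _ in range(n):
        has_trait = rng.random() < true_prevalence
        if rng.random() < p_truth:
            yes_count += has_trait  # truthful answer (1 if trait, 0 if not)
        else:
            yes_count += 1          # forced "yes"
    observed_yes = yes_count / n
    # P(yes) = p_truth * pi + (1 - p_truth), so solving for pi gives:
    return (observed_yes - (1 - p_truth)) / p_truth

# Recover a 20% prevalence from the deliberately noisy answers
estimate = simulate_rrt(true_prevalence=0.2, n=100_000)
```

With a large enough sample, the estimator recovers the population prevalence even though no single "yes" can be attributed to the respondent's true status; the price, as the RRT literature notes, is higher variance than direct questioning.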

Bibliography

  • Alessi E.J., Martin J.I., Conducting an Internet-based Survey: Benefits, Pitfalls, and Lessons Learned, „Social Work Research”, 2010, Vol. 34, No. 2, s. 122-128, http://dx.doi.org/10.1093/swr/34.2.122.
  • Anseel F., Lievens F., Schollaert E., Choragwicka B., Response Rates in Organizational Science, 1995-2008: A Meta-analytic Review and Guidelines for Survey Researchers, „Journal of Business and Psychology” 2010, Vol. 25, No. 3, s. 335-349, http://dx.doi.org/10.1007/s10869-010-9157-6.
  • Batorski D., Polacy wobec technologii cyfrowych - uwarunkowania dostępności i sposobów korzystania. Diagnoza Społeczna 2013. Warunki i jakość życia Polaków - raport, „Contemporary Economics” 2013, Vol. 7, s. 317-341, http://dx.doi.org/10.5709/ce.1897-9254.114.
  • Bentley D., College P., Randomized Response, https://www.dartmouth.edu/~chance/teaching_aids/RResponse/RResponse.html.
  • Blair G., Imai K., Zhou Y.Y., Design and Analysis of the Randomized Response Technique, http://imai.princeton.edu/research/files/randresp.pdf.
  • Buchanan E., Hvizdak E.E., Online survey tools: ethical and methodological concerns of human research ethics committees, „Journal of Empirical Research on Human Research Ethics” 2009, Vol. 4, No. 2, s. 37-48, http://dx.doi.org/10.1525/jer.2009.4.2.37.
  • Cook C., Heath F., Thompson R.L., A meta-analysis of response rates in Web or Internet-based survey, „Educational and Psychological Measurement” 2000, Vol. 60, No. 6, s. 821-836, http://dx.doi.org/10.1177/00131640021970934.
  • Couper M.P., Conrad F.G., Tourangeau R., Visual context effects in web survey, „Public Opinion Quarterly” 2007, Vol. 71, No. 4, s. 623-634, http://dx.doi.org/10.1093/poq/nfm044.
  • Couper M.P., Kapteyn A., Schonlau M., Winter J., Noncoverage and nonresponse in an Internet survey, „Social Science Research” 2007, Vol. 36, No. 1, s. 131-148, http://dx.doi.org/10.1016/j.ssresearch.2005.10.002.
  • Coutts E., Jann B., Sensitive Questions in Online Surveys: Experimental Results for the Randomized Response Technique (RRT) and the Unmatched Count Technique (UCT), „Sociological Methods & Research” 2011, Vol. 40, No. 1, s. 169-193, http://dx.doi.org/10.1177/0049124110390768.
  • Crawford S.D., Couper M.P., Lamias M.J., Web surveys: Perceptions of burden, „Social Science Computer Review” 2001, Vol. 19, No. 2, s. 146-162, http://dx.doi.org/10.1177/089443930101900202.
  • Dillman D.A., Tortora R.D., Bowker D., Principles for constructing Web surveys, „SESRC Technical Report” 1998, http://www.sesrc.wsu.edu/dillman/papers/2001/thewebquestionnairechallenge.pdf.
  • Fan W., Yan Z., Factors affecting response rates of the web survey: A systematic review, „Computers in Human Behavior” 2010, Vol. 26, No. 2, s. 132-139, http://dx.doi.org/10.1016/j.chb.2009.10.015.
  • Giles W.F., Feild H.S., Effects of amount, format, and location of demographic information on questionnaire return rate and response bias of sensitive and nonsensitive items, „Personnel Psychology” 1978, Vol. 31, No. 3, s. 549-559, http://dx.doi.org/10.1111/j.1744-6570.1978.tb00462.x.
  • Göritz A.S., Incentives in Web Studies: Methodological Issues and a Review, „International Journal of Internet Science” 2006, Vol. 1, No. 1, s. 58-70.
  • Göritz A.S., Stieger S., The impact of the field time on response, retention, and response completeness in list-based Web survey, „International Journal of Human-Computer Studies” 2009, Vol. 67, No. 4, s. 342-348, http://dx.doi.org/10.1016/j.ijhcs.2008.10.002.
  • Göritz A.S., The impact of material incentives on response quantity, response quality, sample composition, survey outcome, and cost in online access panels, „International Journal of Market Research” 2004, Vol. 46, No. 3, s. 327-345.
  • Gosling S.D., Vazire S., Srivastava S., John O.P., Should we trust web-based studies? A comparative analysis of six preconceptions about internet questionnaires, „American Psychologist” 2004, Vol. 59, No. 2, s. 93-104, http://dx.doi.org/10.1037/0003-066X.59.2.93.
  • Groves R.M., Peytcheva E., The impact of nonresponse rates on nonresponse bias: A meta-analysis, „Public Opinion Quarterly” 2008, Vol. 72, No. 2, s. 167-189, http://dx.doi.org/10.1093/poq/nfn011.
  • Hall J., Brown V., Nicolaas G., Lynn P., Extended Field Efforts to Reduce the Risk of Non-response Bias: Have the Effects Changed over Time? Can Weighting Achieve the Same Effects?, „Bulletin of Sociological Methodology/Bulletin de Méthodologie Sociologique” 2013, Vol. 117, No. 1, s. 5-25, http://dx.doi.org/10.1177/0759106312465545.
  • Harrison Ch., Tip Sheet On Survey Sampling, Coverage And Nonresponse http://psr.iq.harvard.edu/files/psr/files/PSRTipSheetSamplingCoverageNonresponse_1_0.pdf.
  • Joinson A.N., Self-disclosure in computer-mediated communication: The role of self-awareness and visual anonymity, „European Journal of Social Psychology” 2001, Vol. 31, No. 2, s. 177-192, http://dx.doi.org/10.1002/ejsp.36.
  • Kraut R., Olson J., Banaji M., Bruckman A., Cohen J., Couper M., Psychological Research Online: Report of Board of Scientific Affairs' Advisory Group on the Conduct of Research on the Internet, „American Psychologist” 2004, Vol. 59, No. 2, s. 106.
  • Krumpal I., Determinants of social desirability bias in sensitive surveys: a literature review, „Quality & Quantity” 2013, Vol. 47, No. 4, s. 2025-2047, http://dx.doi.org/10.1007/s11135-011-9640-9.
  • Lensvelt-Mulders G.J.L.M., Hox J.J., van der Heijden P.G., Maas C.J.M., Meta-analysis of randomized response research: 35 years of validation studies, „Sociological Methods and Research” 2005, Vol. 33, No. 3, s. 319-348, http://dx.doi.org/10.1177/0049124104268664.
  • Mühlenfeld H.U., Differences between 'talking about' and 'admitting' sensitive behaviour in anonymous and non-anonymous web-based interviews, „Computers in Human Behavior” 2004, Vol. 21, No. 6, s. 993-1003, http://dx.doi.org/10.1016/j.chb.2004.02.023.
  • Munoz-Leiva F., Sánchez-Fernández J., Montoro-Ríos F.J., Ibánez-Zapata J.A., Improving the response rate and quality in Web-based surveys through the personalization and frequency of reminder mailings, „Quality and Quantity” 2010, Vol. 44, No. 5, s. 1037-1052, http://dx.doi.org/10.1007/s11135-009-9256-5.
  • Porter S.R., Whitcomb M.E., E-mail subject lines and their effect on web survey viewing and response, „Social Science Computer Review” 2005, Vol. 23, No. 3, s. 380-387, http://dx.doi.org/10.1177/0894439305275912.
  • Rains S., The Nature of Psychological Reactance Revisited: A Meta-Analytic Review, „Human Communication Research” 2013, Vol. 39, No. 1, s. 47-73.
  • Rogelberg S.G., Spitzmüller C., Little I.S., Reeve C.L., Understanding response behavior to an online special topics organizational satisfaction survey, „Personnel Psychology” 2006, Vol. 59, No. 4, s. 903-923, http://dx.doi.org/10.1111/j.1744-6570.2006.00058.x.
  • Sánchez-Fernández J., Munoz-Leiva F., Montoro-Ríos F. J., Improving retention rate and response quality in Web-based survey, „Computers in Human Behavior”, 2012, Vol. 28, No. 2, s. 507-514, http://dx.doi.org/10.1016/j.chb.2011.10.023.
  • Sauermann H., Roach M., Increasing web survey response rates in innovation research: An experimental study of static and dynamic contact design features, „Research Policy” 2013, Vol. 42, No. 1, s. 273-286, http://dx.doi.org/10.1016/j.respol.2012.05.003.
  • Toepoel V., Das M., Van Soest A., Design of web questionnaires: The effects of the number of items per screen, „Field Methods” 2009, Vol. 21, No. 1, s. 200-213, http://dx.doi.org/10.1177/1525822X08330261.
  • Tourangeau R., Couper M.P., Conrad F.G., Spacing, Position, and Order: Interpretive Heuristics for Visual Features of Survey Questions, „Public Opinion Quarterly” 2004, Vol. 68, No. 3, s. 368-393, http://dx.doi.org/10.1093/poq/nfh035.
  • Villar A., Callegaro M., Yang Y., Where Am I? A Meta-Analysis of Experiments on the Effects of Progress Indicators for Web Surveys, „Social Science Computer Review” 2013, Vol. 31, No. 6, s. 744-762, http://dx.doi.org/10.1177/0894439313497468.
  • Wouters K., Maesschalck J., Peeters C.F.W., Roosen M., Methodological Issues in the Design of Online Surveys for Measuring Unethical Work Behavior: Recommendations on the Basis of a Split-Ballot Experiment, „Journal of Business Ethics” 2013, Vol. 120, No. 2, s. 275-289, http://dx.doi.org/10.1007/s10551-013-1659-5.
  • Zając J.M., Batorski D., Jak skłonić do udziału w badaniach internetowych: zwiększanie realizacji próby, „Psychologia Społeczna” 2007, t. 2, nr 3-4, s. 234-248.
AUTHOR

Konrad Kulikowski

Jagiellonian University


About the article

DOI: 10.15219/em62.1211

The article appears in the printed version on pages 21-28.

The full article is available in Polish.

Citation

K. Kulikowski, E-badania - analiza wybranych problemów internetowych badań postaw i opinii, „e-mentor” 2015, nr 5(62), s. 21-28, http://dx.doi.org/10.15219/em62.1211.