Individual Submission Summary

Survey Professionalism: New Evidence from Browsing Data

Fri, September 6, 2:00 to 3:30pm, Pennsylvania Convention Center (PCC), 104A

Abstract

Online panels have become an essential resource for low-cost survey research across many fields. As scholars increasingly use them for data collection, the recruiting methods of many commercial providers raise urgent questions about data quality, such as a lack of representativeness and low attentiveness among participants (Cornesse & Blom, 2023; Krupnikov et al., 2021). A complicated web of panel companies, which often recruit via third parties and even from one another, makes it hard for researchers to know where responses actually come from (Enns & Rothschild, 2022), and the goals of commercial panel providers may be at odds with standards of research transparency (Jerit & Barabas, 2023).

From the panelist’s perspective, the sheer number of platforms offering payment makes it attractive to become a “survey professional,” that is, a panelist who visits many online surveys and spends substantial time answering them. However, little is known about the extent of survey professionalism. Our article provides a novel empirical strategy to address this issue. We use behavioral measures constructed from web browsing data across three different online recruitment strategies and link these behavioral data to survey responses. This allows us to identify the prevalence of survey professionalism and its consequences for researchers.

We examine three different U.S. samples from 2018/2019 for which we have both survey responses and browsing data for multiple waves, recruited through Facebook (n = 707), Lucid (n = 2,222), and YouGov (n = 554). The browsing data of these three samples comprise over 95 million web visits in total. We identify survey taking with three approaches: first, we rely on existing lists of questionnaire platforms; second, we use regular expressions to identify web addresses likely to constitute survey sites; third, we manually code the most frequent websites in our data.
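To illustrate the second approach, the following sketch shows how visits to likely survey sites could be flagged in a browsing log with regular expressions. This is not the authors' code; the domain patterns, column names, and the pandas-based setup are assumptions made for the example.

```python
import re
import pandas as pd

# Illustrative patterns for URLs that likely belong to survey platforms
# (the actual pattern list used in the study may differ).
SURVEY_PATTERNS = [
    r"qualtrics\.com",
    r"surveymonkey\.com",
    r"survey",
    r"questionnaire",
]
survey_re = re.compile("|".join(SURVEY_PATTERNS), flags=re.IGNORECASE)

def flag_survey_visits(visits: pd.DataFrame, url_col: str = "url") -> pd.DataFrame:
    """Add a boolean column marking visits whose URL matches a survey pattern."""
    out = visits.copy()
    out["is_survey_visit"] = out[url_col].str.contains(survey_re, na=False)
    return out

# Toy browsing log: the share of flagged visits approximates survey-taking activity.
log = pd.DataFrame({"url": [
    "https://www.qualtrics.com/jfe/form/SV_abc123",
    "https://www.google.com/search?q=weather",
]})
print(flag_survey_visits(log)["is_survey_visit"].mean())
```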

Our research focuses on four primary outcomes. First, we report estimates of the extent of survey professionalism across the three samples. Second, we compare survey professionals and non-professionals on various sociodemographic and political outcomes. Third, to shed light on issues of response quality, we compare the two groups in terms of straightlining, speeding, and over-time response stability. Last, we explore whether any subjects take the same questionnaire multiple times, and how typical this behavior is.
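For concreteness, here is a minimal sketch of how two of these response-quality indicators, straightlining and speeding, could be computed from a respondent-level data frame. The column names, grid items, and the fixed speeding cutoff are assumptions for illustration, not the authors' operationalization.

```python
import pandas as pd

# Hypothetical grid battery: straightlining means giving the identical
# answer to every item in the battery.
GRID_ITEMS = ["grid_q1", "grid_q2", "grid_q3", "grid_q4"]

def flag_quality(responses: pd.DataFrame, speed_cutoff_sec: float = 180.0) -> pd.DataFrame:
    """Flag straightlined and speeded interviews (illustrative cutoffs)."""
    out = responses.copy()
    # Straightlining: only one distinct value across all grid items.
    out["straightlined"] = out[GRID_ITEMS].nunique(axis=1).eq(1)
    # Speeding: completing the questionnaire faster than the cutoff,
    # e.g. a fraction of the median interview duration.
    out["speeded"] = out["duration_sec"] < speed_cutoff_sec
    return out
```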

We present evidence that survey-taking makes up a substantial part of participants' online activity, namely 54.3% of all visits in the Lucid sample, 24.5% in the YouGov sample, and 9.8% in the Facebook-recruited sample. In both the Lucid and YouGov samples, this is much more than visits to, for example, google.com (6.5% and 11.2%, respectively). When comparing professionals and non-professionals, the most pronounced demographic differences appear in the Lucid sample, in which professionals are older, more highly educated, and more likely to be white. Considering political outcomes, we describe a tendency for professionals to be more right-wing, although this difference remains statistically significant across methods only in the Lucid sample. Professionals also tend to feel more positive towards out-partisans; in the YouGov sample, we find this difference to be statistically significant regardless of how a professional is defined.

To conclude, even though we report a high prevalence of survey professionals across the three samples, particularly in the Lucid case, we do not find substantive differences in the quality of survey responses between professionals and non-professionals. While professionals consistently show signs of speeding through the questionnaires, we do not find that they show a higher incidence of straightlining on grid questions or lower over-time stability (higher heterogeneity) in answers to questions asked repeatedly across waves.

Authors