Representative vs Convenience Samples

New collection of papers released in AEPP

Survey results are only as good as the sample from which responses were taken. In a new set of papers featured in Applied Economic Perspectives and Policy, researchers explore the differences between various survey samples and how these differences manifest themselves in valuation surveys. The Whitehead et al. paper compares a probability-based internet sample (KnowledgePanel) to an opt-in internet sample for a contingent-valuation survey on passive-use losses in Florida stemming from the Deepwater Horizon oil spill. The Penn, Petrolia, and Fannin paper compares a probability-based internet sample (KnowledgePanel) to a "snowball" convenience sample generated through word of mouth and social media for a contingent-valuation survey of willingness to pay for an app that provides information on Gulf Coast beach conditions. The Sandstrom-Mistry et al. paper compares a probability-based sample (drawn from USPS mailing addresses) to two opt-in samples (MTurk and Qualtrics) for a contingent-valuation survey on improved water quality among Michigan residents. The Goodrich et al. paper documents recent survey efforts that were inundated with "bots" and offers guidance for mitigating such fraudulent responses. "Bots" are automated responses generated by computer software that seeks out and completes online surveys offering participant incentives; they produce income or other benefits for the programmer but meaningless responses to the survey.

Articles in the collection:

Estimating the Benefits to Florida Households from Avoiding Another Gulf Oil Spill Using the Contingent Valuation Method: Internal Validity Tests with Probability-based and Opt-in Samples

  • John C. Whitehead, Appalachian State University
  • Andrew Ropicki, University of Florida/Florida Sea Grant
  • John Loomis, Colorado State University
  • Sherry Larkin, University of Florida
  • Tim Haab, Ohio State University
  • Sergio Alvarez, University of Central Florida

Hypothetical Bias Mitigation in Representative and Convenience Samples

  • Jerrod M. Penn, Louisiana State University
  • Daniel R. Petrolia, Mississippi State University
  • J. Matthew Fannin, Louisiana State University

Comparing Water Quality Valuation Across Probability and Non-Probability Samples

  • Kaitlynn Sandstrom-Mistry, Michigan State University
  • Frank Lupi, Michigan State University
  • Hyunjung Kim, Michigan State University
  • Joseph A. Herriges, Michigan State University

Battling Bots: Experiences and strategies to mitigate fraudulent responses in online surveys

  • Brittney Goodrich, University of California, Davis
  • Marieke Fenton, University of California, Davis
  • Jerrod Penn, Louisiana State University
  • John Bovay, Virginia Tech
  • Travis Mountain, University of Alabama

If you are interested in setting up an interview, please contact Allison Ware in the AAEA Business Office.

ABOUT AAEA: Established in 1910, the Agricultural & Applied Economics Association (AAEA) is the leading professional association for agricultural and applied economists, with 2,500 members in more than 60 countries. Members of the AAEA work in academic or government institutions as well as in industry and not-for-profit organizations, and engage in a variety of research, teaching, and outreach activities in the areas of agriculture, the environment, food, health, and international development. The AAEA publishes three journals, the Journal of the Agricultural and Applied Economics Association (an open access journal), the American Journal of Agricultural Economics and Applied Economic Perspectives & Policy, as well as the online magazine Choices and the online open access publication series Applied Economics Teaching Resources. To learn more, visit

Contact: Allison Ware
Senior Communications & Membership Manager
(414) 918-3190