How to spot and stop fraudsters when collecting research data online

Dec 11, 2019 | Our Blog

Online survey platforms are becoming an increasingly popular method for both quantitative and qualitative data collection. But there can be some risks involved in collecting data online.

Online surveys (hosted on platforms like SmartSurvey and SurveyMonkey) can provide researchers with multiple benefits, such as:

  • Access to large numbers of people (survey links can be distributed widely)
  • Cost efficiency (some survey software is even free to use)
  • Anonymity and confidentiality for participants, often leading to more honest and candid responses (Teitcher et al., 2015).

However, if preventative steps are not taken, collecting data online can open the door to fraudsters submitting fake responses. A study by Bauermeister et al. (2012) identified several patterns of fraud in online research:

  1. Eligible participants who take studies twice, usually without malicious intent;
  2. Eligible participants who take studies multiple times, in order to receive compensation (e.g. money or vouchers);
  3. Ineligible individuals who fabricate responses, once or multiple times (often via bots) in order to gain compensation.
What is a bot?

Bots are internet software programs designed to imitate or replace human behaviour, typically carrying out repetitive tasks at a much faster rate than a human user could. Bots are often designed to complete surveys en masse.

What effect can this have on my research?

Duplicate and fake responses can not only compromise the quality and validity of research data but can also affect research budgets if fraudulent answers are not picked up on before any compensation is sent (Teitcher et al., 2015).

For example, if there were only enough money to compensate 50 participants and two responses were found to be fraudulent, the researcher is not only out of pocket but can now only recruit 48 genuine participants. This is especially concerning when research is funded by charities, where money is extremely tight.

How can you prevent fraudulent activity in online survey platforms?
  • CAPTCHA. This test is designed to determine whether an online user is really human and not a bot that might be programmed to take a survey multiple times. You have probably seen these on websites, but may not know that CAPTCHA stands for ‘Completely Automated Public Turing test to tell Computers and Humans Apart’. Many online survey platforms will let you add one as a question type. Add one to the first page of your survey to fend off bots.
  • Prevent multiple responses from the same IP address. Many survey platforms will let you tick a setting so that participants cannot take the survey more than once from the same computer.
  • Back button. In ‘Survey Options’ you can enable a back button, which lets participants change their responses on reflection. However, this could also make it easier for them to re-take the survey multiple times. Think carefully about the pros and cons of adding or removing a back button.
  • Change the order of questions. In most survey platforms you can set the questions to appear in a random order. This can help expose bots, because responding coherently to randomised questions requires a level of human attention. This may not always be possible, depending on your methodology.
  • Make the survey invite only and add a password: Instead of distributing an anonymous link on study advertisements, make the survey ‘Invite Only’ and ask potential participants to email you for password access. This adds another layer of protection, as you can ask participants where they saw the study advertised and check their language, grammar and email address for fraudulent red flags (a rough sketch of generating single-use access codes follows this list).
  • Consent form: Add a line about fraud to your consent form, such as, “I understand that I will not be compensated if my responses are suspected to be fraudulent”. This may deter potential fraudsters, and will also cover you if they try to claim compensation.
  • Pay respondents nothing, pay less, or run a prize draw: De-incentivise fraud by reducing the compensation on offer. However, this may also reduce uptake of the survey.
  • Include an interview stage in the study: This is not always possible or appropriate. However, an interview stage may deter fraudsters from participating and adds another means of detecting fabricated responses.
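
As a rough, hypothetical illustration of the ‘invite only’ approach above, the sketch below generates a batch of single-use access codes that you could email to participants and then enter into your platform’s password or access-code setting. The code length, file name and CSV layout are assumptions for illustration, not features of any particular survey platform.

    # Minimal sketch: generate single-use access codes for an invite-only survey.
    # Hypothetical helper; adapt the code length and output file to your needs.
    import csv
    import secrets
    import string

    def generate_codes(n, length=8):
        """Return n unique random access codes, e.g. 'K7QX2MPH'."""
        alphabet = string.ascii_uppercase + string.digits
        codes = set()
        while len(codes) < n:
            codes.add("".join(secrets.choice(alphabet) for _ in range(length)))
        return sorted(codes)

    if __name__ == "__main__":
        codes = generate_codes(50)  # one code per expected participant
        with open("invite_codes.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["code", "issued_to"])
            for code in codes:
                writer.writerow([code, ""])  # record the participant's email when you issue a code
        print(f"Wrote {len(codes)} codes to invite_codes.csv")

Issuing one code per participant, and noting who each code was sent to, also gives you a simple audit trail if a response later looks suspicious.
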
How can you detect fraudulent activity in online survey data?
  • Screen data for inconsistent responses: This is easier to do with qualitative research, as you can check that the answers are coherent, relevant and written in the expected language. With quantitative research, check for inconsistent answers. Make sure to confirm suspected fraudulent answers with other researchers or supervisors.
  • Speed checks: You should have some idea of how long it would realistically take a human to complete your survey. Some software (such as Qualtrics) will show you this estimate. When checking answers, keep a close eye on any responses completed in drastically less time than expected, or substantially more (see the sketch after this list for one way to flag these automatically).
  • Look for strange email addresses: Fraudsters will often create batches of email addresses with random letters and numbers preceding ‘@xxx.com’.
  • If unsure, send an email to the participant: Once you have checked with others and are still unsure about potential fraud, you could email the participant to explain the situation and request more information, such as where they saw the study advertised.
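
As a minimal sketch of the speed, email and duplicate checks above, assuming your responses are exported to a CSV with columns named duration_seconds, email and ip_address (the column names, thresholds and file name will differ by platform and are assumptions here), you could shortlist responses for manual review like this:

    # Minimal screening sketch using pandas; flags responses for manual review only.
    import re
    import pandas as pd

    # Rough pattern for machine-generated addresses, e.g. 'xk93847@example.com'.
    SUSPICIOUS_EMAIL = re.compile(r"^[a-z]*\d{3,}[a-z0-9]*@", re.IGNORECASE)

    def screen_responses(path, min_plausible_seconds=120):
        df = pd.read_csv(path)

        # Speed check: far quicker than the median, or below an absolute floor.
        median = df["duration_seconds"].median()
        df["too_fast"] = (df["duration_seconds"] < min_plausible_seconds) | (
            df["duration_seconds"] < 0.25 * median
        )

        # Email check: addresses that look like random letters and numbers.
        df["odd_email"] = df["email"].fillna("").str.contains(SUSPICIOUS_EMAIL)

        # Duplicate check: repeat submissions from the same IP address.
        df["duplicate_ip"] = df["ip_address"].duplicated(keep=False)

        # Keep only the rows that trip at least one flag.
        return df[df[["too_fast", "odd_email", "duplicate_ip"]].any(axis=1)]

    if __name__ == "__main__":
        flagged = screen_responses("survey_export.csv")
        print(flagged[["email", "ip_address", "duration_seconds"]])

None of these flags proves fraud on its own; they simply shortlist responses for the manual checks described above.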

All of these preventative and reactive steps have their pros and cons, and not all will be possible due to methodological and ethical implications.

How can we help?

Evolving Communities can help you set up and run your online surveys, ensuring that they are protected from any bots or fraudulent activity.

We can also check data for fraudulent responses. Contact the research team at research@evolvingcommunities.co.uk for more details.

Bauermeister, J. A., Pingel, E., Zimmerman, M., Couper, M., Carballo-Dieguez, A., & Strecher, V. J. (2012). Data quality in HIV/AIDS web-based surveys: Handling invalid and suspicious data. Field Methods, 24(3), 272-291.

Teitcher, J. E., Bockting, W. O., Bauermeister, J. A., Hoefer, C. J., Miner, M. H., & Klitzman, R. L. (2015). Detecting, preventing, and responding to “fraudsters” in internet research: Ethics and tradeoffs. The Journal of Law, Medicine & Ethics, 43(1), 116-133.
