This paper argues that online survey experiments are robust against experimenter demand effects (EDEs), a form of bias in which participants respond strategically based on their perception of the study's intent. To test this claim, the authors analyze over 12,000 responses across five political science studies, replicating findings from diverse subfields such as public opinion and voting behavior.
Participants were randomly assigned to receive information about the researchers' expectations, to test whether they would adjust their answers accordingly. Even when told the study's purported intent, or offered financial incentives to comply with it, response patterns did not change significantly.
Participants' limited willingness or ability to act on demand cues has important implications: it suggests that survey-experiment findings may be more reliable than previously thought.
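
The core analysis implied here is a comparison of outcomes between respondents who received a demand-effect treatment (information about researcher intent, or an incentive to comply) and an uninformed control group. The sketch below illustrates that kind of check; it is not the authors' actual code, and the data, sample sizes, and variable names are hypothetical.

```python
# Minimal sketch of a demand-effect check (hypothetical data, not the paper's analysis):
# compare responses from participants told the researchers' expectation against
# an uninformed control group via a difference-in-means test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical 7-point outcome scale for two randomly assigned groups.
control = rng.integers(1, 8, size=500)    # no information about study intent
informed = rng.integers(1, 8, size=500)   # told the researchers' expectation

# Two-sample Welch t-test: under the paper's finding, the difference in means
# should be small and statistically indistinguishable from zero.
t_stat, p_value = stats.ttest_ind(informed, control, equal_var=False)
print(f"mean difference = {informed.mean() - control.mean():.3f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

A null result in such a test (a near-zero difference with a large p-value) is what would support the paper's conclusion that revealing or incentivizing the study's intent does not meaningfully shift responses.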






