Retrieved from http://www.uh.edu/engines/epi2765.htm
Note. MTurk = Mechanical Turk.
Table 1. Number of participants and mean age in years (in parentheses) for each experiment

Sample | 1: Stroop  | 2: Flanker | 3: Switching | 4: Simon   | 5: Posner
UNIV   | 100 (19.8) | 100 (19.8) | 100 (19.8)   | 100 (19.9) | 100 (19.8)
CW     | 99 (37.7)  | 99 (38.0)  | 100 (38.7)   | 100 (37.5) | 97 (38.5)