VizWiz: the user presses a button to begin recording the question and again to stop. System architecture: local client (user's side), web server, database, remote services, and worker interface.
Step 1: the user posts a photo and a question ("Which can is the corn?").
Step 2: people outside the system answer ("It's the rightmost can").
J. P. Bigham et al.: VizWiz: Nearly Real-time Answers to Visual Questions. In Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology (UIST), 2010.
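The ask-and-answer loop above can be sketched as a toy in-memory queue — a hypothetical stand-in for the web server, database, and worker interface, not VizWiz's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    photo: str          # path or URL of the user's photo
    text: str           # the user's spoken question, transcribed
    answers: list = field(default_factory=list)

class VizWizQueue:
    """Toy question queue: users post, remote workers answer."""
    def __init__(self):
        self.pending = []

    def post(self, photo, text):
        q = Question(photo, text)
        self.pending.append(q)
        return q

    def next_for_worker(self):
        # Workers pull the oldest unanswered question.
        return self.pending[0] if self.pending else None

    def answer(self, q, worker_answer):
        q.answers.append(worker_answer)
        self.pending.remove(q)
        return q.answers

queue = VizWizQueue()
q = queue.post("cans.jpg", "Which can is the corn?")
task = queue.next_for_worker()
answers = queue.answer(task, "the rightmost can")
```

The point of the design is the near-real-time loop: the question waits in a queue until any available human pulls and answers it.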
Zensors: "How many glasses need a refill?" … or "I can't tell." (Image: https://www.dreamstime.com/empty-white-wine-glasses-table-restaurant-bar-setting-close-up-alcohol-image215682647) Images are periodically sent to humans, who are asked the sensor question.
Later, artificial intelligence is used instead of humans. G. Laput et al.: Zensors: Adaptive, Rapidly Deployable, Human-Intelligent Sensor Feeds. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI), 2015.
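The crowd-to-classifier handoff can be sketched roughly as follows. Everything here — the function names, the `MajorityModel`, and the agreement-window rule for deciding when to stop paying the crowd — is a hypothetical illustration, not Zensors' actual algorithm:

```python
class MajorityModel:
    """Trivial stand-in classifier: always predicts the most common label seen."""
    def __init__(self):
        self.counts = {}
    def predict(self, frame):
        return max(self.counts, key=self.counts.get) if self.counts else None
    def update(self, frame, label):
        self.counts[label] = self.counts.get(label, 0) + 1

def zensors_handoff(stream, crowd_label, model, agree_threshold=0.9, window=20):
    """For each frame: ask the crowd until the model's recent agreement with
    crowd labels reaches the threshold, then hand off to the model."""
    recent = []   # 1 if the model agreed with the crowd on that frame, else 0
    labels = []
    use_model = False
    for frame in stream:
        pred = model.predict(frame)
        if use_model:
            labels.append(pred)
            continue
        truth = crowd_label(frame)       # paid human answer
        labels.append(truth)
        model.update(frame, truth)       # train on the crowd's answer
        recent.append(1 if pred == truth else 0)
        recent = recent[-window:]
        if len(recent) == window and sum(recent) / window >= agree_threshold:
            use_model = True             # stop asking humans

    return labels

calls = []
def crowd_label(frame):
    calls.append(frame)
    return "yes"

labels = zensors_handoff(range(30), crowd_label, MajorityModel(),
                         agree_threshold=1.0, window=5)
```

In this run the crowd is queried only for the first few frames; once the model matches the crowd for a full window, the remaining frames cost nothing.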
reCAPTCHA: the solver is instructed to recognize two words, one whose answer is known and one whose answer is unknown. (Note: the solver is not told which is which.)
(1) Known word correct → human; incorrect → bot.
(2) Only the answers of "humans" are adopted as recognition results for the unknown word.
L. von Ahn et al.: reCAPTCHA: Human-Based Character Recognition via Web Security Measures. Science, Vol. 321, Issue 5895, 2008.
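The control-word logic above can be sketched as follows (function and variable names are hypothetical; the real system also waits for agreement among several solvers before accepting a transcription):

```python
def grade_recaptcha(response_known, response_unknown, known_answer, votes):
    """One reCAPTCHA round: the solver types two words without knowing
    which one is the control word. If the control word is answered
    correctly, treat the solver as human and record their reading of
    the unknown word as a vote; otherwise discard both answers."""
    if response_known.lower() != known_answer.lower():
        return False          # likely a bot (or a typo): reject
    key = response_unknown.lower()
    votes[key] = votes.get(key, 0) + 1
    return True

votes = {}
grade_recaptcha("morning", "overlooks", "morning", votes)
grade_recaptcha("morning", "overlooks", "morning", votes)
grade_recaptcha("evening", "zzz", "morning", votes)   # wrong control word: rejected
best = max(votes, key=votes.get)   # consensus reading of the unknown word
```

Because the bot's wrong control word disqualifies its whole submission, garbage answers for the unknown word never enter the vote tally.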
Foldit (https://fold.it/): players alter the protein structure, aiming for a high score. S. Cooper et al.: Predicting Protein Structures with a Multiplayer Online Game. Nature, Vol. 466, No. 7307, 2010.
Answer model (y_ij: worker j's answer to task i):
Pr[y_ij | t_i = 1] = α_j^{y_ij} (1 − α_j)^{1 − y_ij}
Pr[y_ij | t_i = 0] = β_j^{1 − y_ij} (1 − β_j)^{y_ij}
An answer model is built and used to predict the true answers.
α_j: probability that worker j answers YES when the true answer is YES.
β_j: probability that worker j answers NO when the true answer is NO.
Worker reliability parameters (confusion matrix):

                 answer YES    answer NO
  true = YES     α_j           1 − α_j
  true = NO      1 − β_j       β_j

A. P. Dawid and A. M. Skene: Maximum Likelihood Estimation of Observer Error-Rates Using the EM Algorithm. Journal of the Royal Statistical Society, Series C (Applied Statistics), Vol. 28, No. 1, 1979.
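The two-coin model above can be fit with EM. A minimal sketch for binary tasks, assuming a uniform class prior and majority-vote initialization (the function name is mine; Dawid and Skene's original formulation handles multi-class labels and also estimates the class prior):

```python
def dawid_skene_binary(answers, n_iter=50):
    """EM for the two-coin Dawid-Skene model on binary tasks.
    answers[i][j] = 1 (YES) or 0 (NO): worker j's answer to task i.
    Returns posterior Pr[t_i = 1] per task and (alpha, beta) per worker."""
    n_tasks = len(answers)
    n_workers = len(answers[0])
    # Initialize the truth posterior with the majority vote.
    p = [sum(row) / n_workers for row in answers]
    alpha = [0.7] * n_workers
    beta = [0.7] * n_workers
    for _ in range(n_iter):
        # M-step: re-estimate each worker's reliabilities.
        pos = sum(p)                     # expected number of true-YES tasks
        neg = n_tasks - pos
        for j in range(n_workers):
            alpha[j] = sum(p[i] * answers[i][j]
                           for i in range(n_tasks)) / max(pos, 1e-9)
            beta[j] = sum((1 - p[i]) * (1 - answers[i][j])
                          for i in range(n_tasks)) / max(neg, 1e-9)
        # E-step: posterior over the true label under a uniform prior.
        for i in range(n_tasks):
            like1 = like0 = 1.0
            for j in range(n_workers):
                y = answers[i][j]
                like1 *= alpha[j] ** y * (1 - alpha[j]) ** (1 - y)
                like0 *= beta[j] ** (1 - y) * (1 - beta[j]) ** y
            p[i] = like1 / (like1 + like0 + 1e-12)
    return p, alpha, beta

# Two reliable workers and one adversarial worker; true labels: 1,1,0,0,1,0.
answers = [[1, 1, 0], [1, 1, 0], [0, 0, 1], [0, 0, 1], [1, 1, 0], [0, 0, 1]]
p, alpha, beta = dawid_skene_binary(answers)
```

Unlike plain majority voting, EM discovers that the third worker is systematically wrong and flips that worker's votes rather than counting them.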
Quality assurance by combining tasks: Find–Fix–Verify. When the crowd is finished, Soylent calls out the edited sections with a purple dashed underline. If the user clicks on the error, a drop-down menu explains the problem and offers a list of alternatives. By clicking on the desired alternative, the user replaces the incorrect text with an option of the crowd's choosing. Figure 2. Crowdproof is a human-augmented proofreader. The drop-down explains the problem (blue title) and suggests fixes (gold selection). M. S. Bernstein et al.: Soylent: A Word Processor with a Crowd Inside. Communications of the ACM, Vol. 58, Issue 8, 2015.
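The Find–Fix–Verify pattern can be sketched as a three-stage pipeline. The worker interfaces below are hypothetical callables, and the "flagged by at least two Find workers" rule is one simple way to keep a single noisy worker from derailing the result:

```python
from collections import Counter

def find_fix_verify(paragraph, find_workers, fix_workers, verify_workers,
                    min_agree=2):
    """Toy Find-Fix-Verify pipeline.
    Find:   independent workers flag problem spans; keep spans flagged
            by at least `min_agree` workers.
    Fix:    other workers propose rewrites for each kept span.
    Verify: a third group votes; the top-voted rewrite is applied."""
    flags = Counter()
    for find in find_workers:                 # Find stage
        for span in find(paragraph):
            flags[span] += 1
    patches = {}
    for span, n in flags.items():
        if n < min_agree:
            continue
        # Fix stage: candidate rewrites, plus "leave unchanged".
        candidates = [fix(span) for fix in fix_workers] + [span]
        # Verify stage: each verifier votes for one candidate.
        votes = Counter(verify(span, candidates) for verify in verify_workers)
        patches[span] = votes.most_common(1)[0][0]
    for span, best in patches.items():
        paragraph = paragraph.replace(span, best)
    return paragraph

# Simulated workers (deterministic stand-ins for real crowd workers).
find1 = lambda p: ["The the", "signifcant"]
find3 = lambda p: []                          # one worker finds nothing
fix = lambda s: {"The the": "The", "signifcant": "significant"}.get(s, s)
verify = lambda span, cands: Counter(cands).most_common(1)[0][0]

result = find_fix_verify("The the result is signifcant.",
                         [find1, find1, find3], [fix, fix],
                         [verify, verify, verify])
```

Splitting the edit into three small tasks is what gives the quality guarantee: no single worker both identifies and silently applies a change.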
The MIT team built a platform for viral collaboration that used recursive incentives to align the public's interest with the goal of winning the Challenge. This approach was inspired by the work of Peter S. Dodds et al. [5], which found that success in using social networks to tackle widely distributed search problems depends on individual incentives. The work of Mason and Watts [7] also informed the use of financial incentives to motivate crowdsourcing productivity. The MIT team's winning strategy was to use the prize money as a financial incentive structure rewarding not only the people who correctly located balloons but also those connecting the finder to the MIT team. Should the team win, they would allocate $4,000 in prize money to each balloon. They promised $2,000 per balloon to the finder. Figure 1. Locations in the DARPA Red Balloon Challenge. Figure 2. Example recursive incentive-structure process for the MIT team. J. C. Tang et al.: Reflecting on the DARPA Red Balloon Challenge. Communications of the ACM, Vol. 54, Issue 4, 2011.
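The recursive payout halves at each step up the referral chain ($2,000 to the finder, $1,000 to the finder's inviter, $500 to that person's inviter, and so on), so the total per balloon is bounded by the geometric series 2000 + 1000 + 500 + … < 4000. A sketch with hypothetical names:

```python
def balloon_payouts(referral_chain, finder_reward=2000.0):
    """Payouts for one balloon under the MIT team's recursive incentive.
    referral_chain lists people from the finder up through the chain of
    inviters; each step up the chain earns half the previous reward."""
    payouts = {}
    reward = finder_reward
    for person in referral_chain:
        payouts[person] = reward
        reward /= 2
    return payouts

# Dave found the balloon; Carol invited Dave; Bob invited Carol;
# Alice invited Bob.
p = balloon_payouts(["Dave", "Carol", "Bob", "Alice"])
```

Because the series converges below the $4,000 budgeted per balloon, the team could recruit arbitrarily deep referral chains without risking overspending.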