Step or Not: Automatic Fake-Step Detection in User-generated Recipes

▪ Cookpad
 • Largest recipe sharing service in Japan and the most viewed recipe sharing service in the world
 • Users can upload and search original, user-generated recipes
 • Community platform for people to share recipe ideas and cooking tips
 • Available in 68 countries and 22 languages

          Monthly users   Recipes
 Japan    55.6 million    2.95 million
 Global   36.3 million    1.65 million

▪ Step or Not: automatic step validation
 • Recipe steps
  • The main part of a recipe (besides the title and ingredients)
  • Include text and/or photos
  • Also used by authors for communication ("fake steps")
 • Fake steps
  • Some of the "steps" are not actually part of the cooking process:
  • Advertisements for the recipe (e.g., "Introduced on TV")
  • Comments (e.g., "Thanks for the many messages")
  • Advice (e.g., "Also good with rice")
  • Arrangements (e.g., "Using it as pasta sauce is good")
 • Fake steps cause problems when
  • Recipes are read aloud by devices such as smart speakers
  • Recipes are indexed for search
 • Task setting
  • Distinguish fake steps from the steps actually used for cooking
  • In our data, 41% of steps are fake and 59% are true

[Figure: example recipe steps, showing a fake step ("This recipe became a hot recipe on Oct. 9, 2010. Thank you to everyone who cooked it!") alongside a true step ("Put all seasonings and mix.")]
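The task setting above can be sketched as binary text classification. A minimal rule-based sketch follows, using hypothetical English clue words purely for illustration (the actual service classifies Japanese step texts, and the poster's "Dict" baseline uses manually selected clue words that are not listed here):

```python
# Minimal sketch of the Step-or-Not task as binary classification.
# The clue words below are hypothetical English stand-ins, not the
# actual dictionary used in the poster's "Dict" baseline.
FAKE_CLUES = {"thank", "tv", "introduced", "messages", "also good"}

def is_fake_step(step_text: str) -> bool:
    """Flag a step as fake if it contains any clue word."""
    text = step_text.lower()
    return any(clue in text for clue in FAKE_CLUES)

steps = [
    "Put all seasonings and mix.",         # true cooking step
    "This recipe was introduced on TV!",   # advertisement (fake)
    "Thanks for the many messages!",       # comment (fake)
]
labels = [is_fake_step(s) for s in steps]
# labels -> [False, True, True]
```

As the results table below shows, a clue-word dictionary of this kind performs only slightly better than chance (51.2% accuracy), which motivates the learned models.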
EMNLP 2018, Brussels, Belgium

▪ Approach: LSTM with pre-trained word vectors

 • Model architecture
  • Word Embedding → LSTM → Dropout → LeakyReLU → Sigmoid
 • Preprocessing
  • Split sentences into words (Japanese tokenization); e.g., the step "Put all seasonings and mix." (in Japanese) is segmented into its individual words
  • No normalization is applied (e.g., inflected forms are not reduced to dictionary forms)
 • Word embedding
  • Skip-gram word2vec, pre-trained on our whole recipe-step dataset (over 20 million steps)
  • 218,408-word vocabulary, 100-dimensional vectors

▪ Experiments
 • Data
  • Collected 20,000 distinct steps from Cookpad
  • Manually annotated by humans
  • Positive/negative rate is balanced
  • Only text information is used (no position, timestamp, etc.)
 • Methods
  • Dict: manually selected clue words
  • Non-neural models (input: TF-IDF vectors)
   • LogReg: logistic regression
   • RForest: random forest
   • XGB: XGBoost
  • Neural models (input: pre-trained skip-gram vectors)
   • LSTM: LSTM
   • AVG: average of word vectors

 Method    Accuracy  Precision  Recall
 Dict      51.2      52.2       53.3
 XGB       84.8      87.4       79.5
 RForest   89.3      86.1       92.3
 LogReg    92.4      91.1       93.1
 LSTM      95.1      94.6       95.0
 AVG       93.3      93.1       92.9

▪ Results and Discussion
 • The best neural model (LSTM) achieves over 95% accuracy
 • Non-neural models are easy to deploy but insufficient for our service
 • The LSTM-based model misclassifies
  • Steps that include special words such as "finished" or "completed"
  • Steps that are advice but include cooking vocabulary
 • Example results
  • False positive: "Put in coffee. You will get sweet, delicious milk coffee. That's all."
  • False negative: "Bake for about 15 minutes in an oven preheated to 230 °C."

▪ Future Work
 • Improve the score by using not only textual features
 • Deploy and apply the model in our services
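The architecture described above (Embedding → LSTM → Dropout → LeakyReLU → Sigmoid) can be sketched in PyTorch. The vocabulary size (218,408) and embedding dimension (100) come from the poster; the hidden size, dropout rate, and use of the final hidden state are assumptions, and in the poster the embedding layer would be initialized from the pre-trained skip-gram vectors rather than at random:

```python
# Sketch of the poster's Step-or-Not classifier, assuming a hidden size
# of 128 and dropout of 0.5 (not stated in the poster). The embedding
# is randomly initialized here; the poster initializes it from skip-gram
# word2vec vectors pre-trained on ~20M recipe steps.
import torch
import torch.nn as nn

class StepOrNot(nn.Module):
    def __init__(self, vocab_size=218408, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.dropout = nn.Dropout(0.5)
        self.act = nn.LeakyReLU()
        self.out = nn.Linear(hidden_dim, 1)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) word indices
        emb = self.embed(token_ids)
        _, (h_n, _) = self.lstm(emb)          # final hidden state
        h = self.act(self.dropout(h_n[-1]))
        return torch.sigmoid(self.out(h)).squeeze(-1)  # score in (0, 1)

model = StepOrNot().eval()
batch = torch.randint(0, 218408, (2, 8))  # two steps of 8 tokens each
with torch.no_grad():
    probs = model(batch)                   # shape: (2,)
```

The sigmoid output is a per-step score that is thresholded to decide fake vs. true; the model would be trained with binary cross-entropy on the 20,000 annotated steps.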