Using millions of emoji occurrences to learn any-domain representations for detecting sentiment, emotion and sarcasm
Presentation for the literature review session on 2018/12/17
Yuto Kamiwaki
December 16, 2018
Transcript
Using millions of emoji occurrences to learn any-domain representations for detecting sentiment, emotion and sarcasm
Nagaoka University of Technology
Yuto Kamiwaki
Literature Review
Literature
• Using millions of emoji occurrences to learn any-domain representations for detecting sentiment, emotion and sarcasm
• Bjarke Felbo, Alan Mislove, Anders Søgaard, Iyad Rahwan, Sune Lehmann
• EMNLP 2017
Abstract
• Achieves SoTA on eight benchmarks covering sentiment analysis, emotion analysis, and sarcasm classification
• Confirms that the diversity of the emotion labels yields larger performance gains than previous distant supervision approaches
Introduction
• For NLP tasks, annotated data (labeled with emotions) is scarce.
• Some studies achieve SoTA using distant supervision.
• Distant supervision (http://web.stanford.edu/~jurafsky/mintz.pdf): a method that uses cues from labeled data to generate labeled training examples from entirely separate unlabeled data, and then trains a model on them (see the sketch below).
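As a concrete illustration of distant supervision in this setting, the following minimal sketch treats an emoji occurring in a tweet as a noisy label for that tweet; the tweets and the small emoji set are illustrative assumptions, not the paper's actual data.

```python
# Minimal sketch of distant supervision with emoji as noisy labels.
# The tweets and the small emoji set below are illustrative only.

TARGET_EMOJI = ["😂", "😍", "😭", "😡"]  # emoji treated as (noisy) class labels

def make_distant_examples(tweets):
    """Turn raw tweets into (text, label) pairs using the emoji they contain."""
    examples = []
    for tweet in tweets:
        for label, emoji in enumerate(TARGET_EMOJI):
            if emoji in tweet:
                # Strip the emoji out of the input so the model cannot read the
                # label directly, and use the emoji's index as the training label.
                text = tweet
                for e in TARGET_EMOJI:
                    text = text.replace(e, "")
                examples.append((text.strip(), label))
    return examples

if __name__ == "__main__":
    raw = ["just missed my train 😭", "this show is amazing 😍😍"]
    print(make_distant_examples(raw))
```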
Related work
• Manual classification based on emotion theories such as Ekman and Plutchik
  ◦ Understanding emotions is difficult and time-consuming.
• Embedding methods based on official emoji tables (Eisner et al., 2016)
  ◦ Do not take into account how emoji are actually used.
• Multi-task learning
  ◦ Problematic from a data storage perspective.
Pretraining
• Tweet data containing emoji, from January 2013 to June 2017
• Only English tweets without URLs are used for the pretraining dataset.
• All tweets are tokenized on a word-by-word basis (a minimal sketch follows).
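A minimal sketch of the kind of filtering and word-level tokenization described above; the regex tokenizer and the stubbed-out language check are assumptions, not the authors' exact pipeline.

```python
import re

URL_PATTERN = re.compile(r"https?://\S+")
TOKEN_PATTERN = re.compile(r"\w+|[^\w\s]")  # words or single non-space symbols (incl. emoji)

def keep_tweet(text, is_english):
    """Keep only tweets that contain no URL and are judged to be English."""
    if URL_PATTERN.search(text):
        return False
    return is_english(text)

def tokenize(text):
    """Simple word-by-word tokenization; the paper's tokenizer may differ."""
    return TOKEN_PATTERN.findall(text.lower())

if __name__ == "__main__":
    tweet = "loving this weather 😍 so much"
    if keep_tweet(tweet, is_english=lambda t: True):  # language detection stubbed out here
        print(tokenize(tweet))
```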
Model
Transfer Learning (Chain-thaw)
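This slide refers to the paper's "chain-thaw" transfer-learning procedure: fine-tune the newly added output layer first, then unfreeze and fine-tune each pretrained layer one at a time, and finally train the whole network. Below is a rough PyTorch-style sketch of that schedule, assuming the model is exposed as an ordered list of layers and that a generic `train_one_pass` training helper exists.

```python
def chain_thaw(model_layers, train_one_pass):
    """Chain-thaw schedule (sketch): fine-tune layers one at a time, then all.

    model_layers: ordered list of modules; the last one is the new output layer.
    train_one_pass: assumed callable that trains for a few epochs with the
                    current requires_grad settings.
    """
    def freeze_all_except(active):
        for layer in model_layers:
            for p in layer.parameters():
                p.requires_grad = layer is active

    # 1) Train only the newly added output layer.
    freeze_all_except(model_layers[-1])
    train_one_pass()

    # 2) Train each pretrained layer individually, from the first to the last.
    for layer in model_layers[:-1]:
        freeze_all_except(layer)
        train_one_pass()

    # 3) Finally, unfreeze everything and train the whole model.
    for layer in model_layers:
        for p in layer.parameters():
            p.requires_grad = True
    train_one_pass()
```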
Emoji Prediction
Benchmarking
• 8 benchmarks (3 tasks, 5 domains)
Importance of emoji diversity
• Pos/Neg emoji: 8 types; DeepMoji: 64 types
• The diversity of the emotion labels is important.
• The model learns the fine-grained nuances of the 64 emoji (see the figure on the next slide).
Importance of emoji diversity (figure)
Model architecture
• At pretraining time, there is no difference between the variants.
• On the benchmarks, the variant with attention achieves higher accuracy.
• Easy access to low-level features.
• No vanishing gradient; the network remains trainable.
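The architecture being compared here is, roughly, an embedding layer followed by two bidirectional LSTM layers, with an attention layer that attends over the concatenation of the embedding and both LSTM outputs; these skip connections are what give the "easy access to low-level features" noted above. A hedged PyTorch sketch, with layer sizes chosen for illustration rather than copied from the paper:

```python
import torch
import torch.nn as nn

class DeepMojiLike(nn.Module):
    """Sketch of a DeepMoji-style encoder: embedding -> 2 BiLSTMs -> attention -> logits.

    Hidden sizes are illustrative; the paper's exact dimensions may differ.
    """
    def __init__(self, vocab_size, n_classes, emb_dim=256, hidden=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm1 = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
        self.lstm2 = nn.LSTM(2 * hidden, hidden, bidirectional=True, batch_first=True)
        feat_dim = emb_dim + 2 * hidden + 2 * hidden  # skip connections: all layers feed attention
        self.attn = nn.Linear(feat_dim, 1, bias=False)
        self.out = nn.Linear(feat_dim, n_classes)

    def forward(self, tokens):                      # tokens: (batch, seq_len) of word ids
        e = self.embed(tokens)                      # (batch, seq, emb_dim)
        h1, _ = self.lstm1(e)                       # (batch, seq, 2*hidden)
        h2, _ = self.lstm2(h1)                      # (batch, seq, 2*hidden)
        feats = torch.cat([e, h1, h2], dim=-1)      # concat embedding and both LSTM layers
        scores = torch.softmax(self.attn(feats).squeeze(-1), dim=-1)  # (batch, seq)
        pooled = torch.bmm(scores.unsqueeze(1), feats).squeeze(1)     # attention-weighted sum
        return self.out(pooled)                     # class logits (softmax applied in the loss)
```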
Analyzing the effect of pretraining
• Pretraining + chain-thaw increases the vocabulary, which improves word coverage.
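Word coverage here can be read as the fraction of tokens in a target dataset that are present in the model's vocabulary; a tiny sketch of one common way to measure it (the vocabulary and tweets are illustrative):

```python
def word_coverage(dataset_tokens, vocab):
    """Fraction of tokens in the dataset that appear in the model vocabulary."""
    total = sum(len(tokens) for tokens in dataset_tokens)
    covered = sum(1 for tokens in dataset_tokens for t in tokens if t in vocab)
    return covered / total if total else 0.0

# Illustrative usage:
vocab = {"the", "weather", "is", "great"}
tweets = [["the", "weather", "is", "lovely"], ["great", "vibes"]]
print(word_coverage(tweets, vocab))  # 4 of 6 tokens covered -> 0.666...
```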
Comparing with human-level agreement
• Human: 76.1%; DeepMoji: 82.4%
• DeepMoji is more accurate than the human raters (see the paper for the details of the experiment).
Conclusion
• Achieves SoTA on eight benchmarks covering sentiment analysis, emotion analysis, and sarcasm classification
• Confirms that the diversity of the emotion labels yields larger performance gains than previous distant supervision approaches
• The pretrained model is publicly released
  ◦ Demo: https://deepmoji.mit.edu/