Slide 1

Slide 1 text

Applications of BERT to Medical Natural Language Processing: BioBERT, ClinicalBERT, and Beyond
xpaperchallenge BERT Applications Study Group
Yuta Nakamura

Slide 2

Slide 2 text

About me: Yuta Nakamura
- Radiology resident, scheduled to take the board certification examination
- PhD student, Graduate School of Medicine, The University of Tokyo: image diagnosis support using natural language processing
- Currently on a domestic research stay at the Aramaki Lab, NAIST
@iBotamon

Slide 3

Slide 3 text

How we got to BERT (overview)
2014.9: Attention mechanism, introduced as an add-on to RNN/LSTM models
2017.6: Transformer, a model built from attention alone
2018.10: BERT, improved to take bidirectional (left and right) context into account
→ The arrival of BERT transformed the NLP landscape; the phrase "After BERT" is sometimes used informally.
https://github.com/thunlp/PLMpapers

Slide 4

Slide 4 text

Q: Is medical NLP already "After BERT"?
↓
A: On the technical side, work has appeared that goes beyond mere domain-specific BERT models, so in that sense the field may already be "After BERT".

Slide 5

Slide 5 text

Where BERT stands
- BERT has also permeated biomedical NLP.
- However, its flow into research closer to the clinical front line has only just begun.
[Diagram: axis from development of component technologies to real-world contribution, spanning ML/NLP → medical informatics and social medicine → clinical medicine; results on NLP tasks vs. results on biomedical NLP tasks; statistical machine learning → CNN, RNN, LSTM → Transformer, BERT]

Slide 6

Slide 6 text

Where BERT stands
- Deep learning methods applied to clinical records
- Among 48 papers from January to April 2019:
  - RNN/LSTM 50%, CNN 35%
  - Methods using attention: 18%
Wu S et al. Deep learning in clinical natural language processing: a methodical review. J Am Med Inform Assoc 27(3): 457-470, 2020

Slide 7

Slide 7 text

Where BERT stands
- Number of BERT papers as of May 2020: 18 in PubMed, 45 in PMC

Slide 8

Slide 8 text

1. Medical domain-specific BERT
2. Beyond medical domain-specific BERT
3. Closing remarks

Slide 9

Slide 9 text

Domain-specific BERT: BioBERT
- A pre-trained BERT further trained on medical literature (see the sketch below)
- BERT: BooksCorpus + English Wikipedia
- BioBERT: the above + PubMed abstracts ± PMC full text
Lee J et al. BioBERT: a pre-trained biomedical language representation model for biomedical text mining. Bioinformatics 36(4): 1234-1240, 2020
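To make "further trained on medical literature" concrete, here is a minimal sketch of domain-adaptive pre-training with the Hugging Face transformers library: start from a general-domain BERT checkpoint and continue masked-language-model training on biomedical text. The corpus file name and all hyperparameters are placeholder assumptions, not the BioBERT authors' actual configuration.

```python
# Minimal sketch of domain-adaptive pre-training: continue MLM training on medical text.
# Model name, file path, and hyperparameters are illustrative, not BioBERT's real setup.
from transformers import (BertTokenizerFast, BertForMaskedLM,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from datasets import load_dataset

tokenizer = BertTokenizerFast.from_pretrained("bert-base-cased")
model = BertForMaskedLM.from_pretrained("bert-base-cased")     # start from general-domain BERT

# Hypothetical text file of PubMed abstracts, one document per line.
dataset = load_dataset("text", data_files={"train": "pubmed_abstracts.txt"})["train"]
dataset = dataset.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=128),
                      batched=True, remove_columns=["text"])

collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="biobert-like", num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=dataset,
    data_collator=collator,
)
trainer.train()   # continued pre-training on biomedical text
```

BioBERT itself was trained at far larger scale; the point of the sketch is only that the model and objective stay the same and the training text changes.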

Slide 10

Slide 10 text

Domain-specific BERT: ClinicalBERT
- A BERT pre-trained on de-identified ICU clinical notes (MIMIC-III) from Beth Israel Deaconess Medical Center
Huang K et al. ClinicalBERT: Modeling Clinical Notes and Predicting Hospital Readmission. arXiv preprint arXiv:1904.05342, 2019

Slide 11

Slide 11 text

Domain-specific BERT: EhrBERT
- A BERT initialized from BioBERT and further trained on 500,000 to 1,000,000 clinical notes from UMass Memorial Medical Center (not publicly released)
Li F et al. Fine-Tuning Bidirectional Encoder Representations From Transformers (BERT)-Based Models on Large-Scale Electronic Health Record Notes: An Empirical Study. JMIR Med Inform 7(3): e14830, 2019

Slide 12

Slide 12 text

What changed with BioBERT and ClinicalBERT?
- BioBERT × medical literature
  - Named entity recognition + disease name normalization (see the sketch below)
  - Question answering
- NCBI disease corpus (disease name extraction + normalization): F1 score 0.572 (MetaMap) → 0.886 (previous SOTA) → 0.897 (BioBERT)
- PubMedQA (for papers whose title is a question, predict the conclusion of the abstract from the rest of the abstract)
Lee J et al. BioBERT: a pre-trained biomedical language representation model for biomedical text mining. Bioinformatics 36(4): 1234-1240, 2020
Jin Q, Dhingra B, Liu Z, Cohen W, Lu X: PubMedQA: A Dataset for Biomedical Research Question Answering. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP): 2567-2577, 2019
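As a rough idea of how disease-mention extraction looks on top of a BioBERT encoder, the sketch below frames it as BIO token classification. The checkpoint identifier and label set are assumptions for illustration (a publicly released BioBERT checkpoint on the Hugging Face Hub is assumed), and the classification head is random until fine-tuned on NCBI-disease.

```python
# Sketch of disease-mention extraction as BIO token classification with a BioBERT encoder.
# Checkpoint name and label set are illustrative assumptions, not the paper's exact setup.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-Disease", "I-Disease"]   # NCBI disease corpus annotates a single entity type
tokenizer = AutoTokenizer.from_pretrained("dmis-lab/biobert-v1.1")
model = AutoModelForTokenClassification.from_pretrained(
    "dmis-lab/biobert-v1.1", num_labels=len(labels))

text = "The patient was diagnosed with familial adenomatous polyposis."
enc = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**enc).logits           # shape: (1, sequence_length, num_labels)

for token, pred in zip(tokenizer.convert_ids_to_tokens(enc["input_ids"][0]),
                       logits.argmax(-1)[0]):
    print(token, labels[int(pred)])        # untrained head: fine-tune on NCBI-disease before trusting this
```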

Slide 13

Slide 13 text

What changed with BioBERT and ClinicalBERT?
- BioBERT × medical literature
  - Summarization + multimodal
- FigSum (gathers sentences from the body text that describe a figure and summarizes them)
Saini N, Saha S, Bhattacharyya P, Tuteja H: Textual Entailment-Based Figure Summarization for Biomedical Articles. ACM Trans. Multimedia Comput. Commun. Appl. DOI: 10.1145/3357334, April 2020

Slide 14

Slide 14 text

What changed with BioBERT and ClinicalBERT?
- ClinicalBERT × clinical notes
  - Document classification (see the sketch below)
  - Named entity recognition + relation extraction
- MIMIC-III ICU readmission prediction: AUC 0.67
- i2b2 2010 (relations among symptoms, tests, and treatments): F1 score 0.737 (2010) → 0.878
Alsentzer E, Murphy J, Boag W, Weng WH, Jindi D, Naumann T, McDermott M: Publicly Available Clinical BERT Embeddings. Proceedings of the 2nd Clinical Natural Language Processing Workshop: 72-78, 2019
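A minimal sketch of the document-classification setting (for example, readmission as a binary label on a discharge note), assuming a publicly available clinical BERT checkpoint; the model name, the toy note, and the label are illustrative, and real MIMIC-III text requires credentialed access.

```python
# Sketch of fine-tuning a clinical BERT encoder for readmission prediction (document classification).
# Checkpoint name and toy example are assumptions; MIMIC-III notes require credentialed access.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
model = AutoModelForSequenceClassification.from_pretrained(
    "emilyalsentzer/Bio_ClinicalBERT", num_labels=2)      # 0 = not readmitted, 1 = readmitted

note = "Discharge summary: admitted with CHF exacerbation, diuresed, discharged on furosemide."
enc = tokenizer(note, truncation=True, max_length=512, return_tensors="pt")
labels = torch.tensor([1])                                # hypothetical readmission label

out = model(**enc, labels=labels)                         # passing labels returns a cross-entropy loss
out.loss.backward()                                       # backpropagate; an optimizer step would follow
print(float(out.loss))
```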

Slide 15

Slide 15 text

What changed with BioBERT and ClinicalBERT?
- ClinicalBERT × clinical notes
  - Natural language inference (recognizing textual entailment; see the sketch below)
- MedNLI (entailment between two medical statements): accuracy 0.735 (baseline) → 0.827
- Example: "the patient cannot speak" → does this entail aphasia? True
- Example: "six prior episodes of diabetic ketoacidosis and HbA1c of 13.3%" → does this entail good glycemic control? False
Alsentzer E, Murphy J, Boag W, Weng WH, Jindi D, Naumann T, McDermott M: Publicly Available Clinical BERT Embeddings. Proceedings of the 2nd Clinical Natural Language Processing Workshop: 72-78, 2019
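The MedNLI task can be sketched as sentence-pair classification: premise and hypothesis are packed into one input separated by [SEP], and a three-way head predicts entailment, neutral, or contradiction. The checkpoint name and the example pair below are illustrative assumptions; MedNLI itself is distributed under a data use agreement.

```python
# Sketch of MedNLI-style natural language inference as sentence-pair classification.
# Checkpoint name and label order are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

labels = ["entailment", "neutral", "contradiction"]
tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")
model = AutoModelForSequenceClassification.from_pretrained(
    "emilyalsentzer/Bio_ClinicalBERT", num_labels=len(labels))

premise = "The patient is unable to speak."
hypothesis = "The patient has aphasia."
enc = tokenizer(premise, hypothesis, return_tensors="pt")   # the pair is joined with [SEP] internally
with torch.no_grad():
    pred = model(**enc).logits.argmax(-1).item()
print(labels[pred])   # meaningful only after fine-tuning on MedNLI-style pairs
```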

Slide 16

Slide 16 text

1. Medical domain-specific BERT
2. Beyond medical domain-specific BERT
3. Closing remarks

Slide 17

Slide 17 text

BEHRT: applying BERT to non-linguistic medical data
- A study that applies BERT to non-linguistic (structured) medical data
- Trained on patient data from CPRD, a UK clinical record database
Li Y, Rao S, Solares JRA et al. BEHRT: Transformer for Electronic Health Records. Sci Rep 10, 7155 (2020). https://doi.org/10.1038/s41598-020-62922-y

Slide 18

Slide 18 text

BEHRT: applying BERT to non-linguistic medical data
- BERT: the token sequence of each sentence is fed in, delimited by [SEP]
  - Tokens are drawn from a vocabulary (vocabulary size ≈ 32,000 for BERT_BASE)
- BEHRT: the diagnoses and patient age at each visit are fed in, delimited by [SEP] (see the sketch below)
  - Diagnoses are drawn from ICD-10 codes ("vocabulary size" 301)
[Figure: example input sequences for BERT vs. BEHRT]
Li Y, Rao S, Solares JRA et al. BEHRT: Transformer for Electronic Health Records. Sci Rep 10, 7155 (2020). https://doi.org/10.1038/s41598-020-62922-y
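To illustrate the difference in input format, here is a small, self-contained sketch of how a BEHRT-style sequence might be assembled: the "tokens" are diagnosis codes, one visit per segment, with the patient's age carried in a parallel sequence. The helper function, the code choices, and the special tokens are illustrative assumptions, not the paper's exact implementation; the real model also combines these with segment and position embeddings.

```python
# Sketch of a BEHRT-style input: diagnosis codes per visit, separated by [SEP],
# plus a parallel age sequence. All names and values are illustrative assumptions.

visits = [
    {"age": 52, "diagnoses": ["I10", "E11"]},    # visit 1: hypertension, type 2 diabetes
    {"age": 54, "diagnoses": ["I25"]},           # visit 2: chronic ischaemic heart disease
]

def build_behrt_input(visits):
    """Flatten visits into diagnosis-token and age sequences separated by [SEP]."""
    tokens, ages = ["[CLS]"], [visits[0]["age"]]
    for v in visits:
        tokens += v["diagnoses"] + ["[SEP]"]
        ages += [v["age"]] * (len(v["diagnoses"]) + 1)
    return tokens, ages

tokens, ages = build_behrt_input(visits)
print(tokens)  # ['[CLS]', 'I10', 'E11', '[SEP]', 'I25', '[SEP]']
print(ages)    # [52, 52, 52, 52, 54, 54]
```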

Slide 19

Slide 19 text

BEHRT: applying BERT to non-linguistic medical data
- Pre-training: masked language model (see the sketch below)
- BERT: some input tokens are replaced with [MASK], and the model is trained to restore them
- BEHRT: some diagnoses are replaced with [MASK], and the model is trained to restore them
Li Y, Rao S, Solares JRA et al. BEHRT: Transformer for Electronic Health Records. Sci Rep 10, 7155 (2020). https://doi.org/10.1038/s41598-020-62922-y
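A minimal sketch of the masking step, applied to a diagnosis sequence like the one above: a fraction of non-special tokens is hidden and kept as the prediction target. The 15% rate follows BERT's convention; the full BERT recipe also sometimes keeps or randomly replaces the selected token, which is omitted here for brevity.

```python
# Sketch of the masked-language-model objective on a diagnosis sequence:
# some diagnosis tokens are replaced with [MASK] and become prediction targets.
import random

def mask_tokens(tokens, mask_prob=0.15, special=("[CLS]", "[SEP]")):
    """Return (masked input, labels); labels are None where no prediction is required."""
    masked, labels = [], []
    for tok in tokens:
        if tok not in special and random.random() < mask_prob:
            masked.append("[MASK]")
            labels.append(tok)          # the model is trained to recover this token
        else:
            masked.append(tok)
            labels.append(None)         # ignored in the loss
    return masked, labels

random.seed(0)
tokens = ["[CLS]", "I10", "E11", "[SEP]", "I25", "[SEP]"]
print(mask_tokens(tokens))
```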

Slide 20

Slide 20 text

BEHRT: applying BERT to non-linguistic medical data
- Performance
- Predicting whether a patient will newly develop a disease they have no prior history of → achieved AUC 0.904 to 0.907
Li Y, Rao S, Solares JRA et al. BEHRT: Transformer for Electronic Health Records. Sci Rep 10, 7155 (2020). https://doi.org/10.1038/s41598-020-62922-y

Slide 21

Slide 21 text

TAPER: integrating linguistic and non-linguistic medical data
- Uses the clinical record data in MIMIC-III
- (1) Linguistic data: clinical notes are summarized into a vector with BioBERT + a BiGRU
- (2) Non-linguistic data: ICD codes, drug codes, and procedure codes are vectorized with word2vec → transformer encoder
- (3) Patient attributes such as sex and age are vectorized
- → Combining (1)+(2)+(3) yields a vector representation of the admission itself (see the sketch below)
S. Darabi, M. Kachuee, S. Fazeli and M. Sarrafzadeh, "TAPER: Time-Aware Patient EHR Representation," in IEEE Journal of Biomedical and Health Informatics, doi: 10.1109/JBHI.2020.2984931.
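The integration idea can be sketched very simply: each source is reduced to a vector and the vectors are combined into one representation of the admission, on which a small classifier is trained. The dimensions, the plain concatenation, and the linear head below are illustrative assumptions rather than TAPER's exact architecture.

```python
# Sketch of combining text, code, and demographic vectors into one admission representation.
# Dimensions and the concatenation are illustrative assumptions, not TAPER's exact design.
import torch

text_vec = torch.randn(768)    # e.g. clinical notes encoded by BioBERT (+ BiGRU)
code_vec = torch.randn(128)    # e.g. ICD / drug / procedure codes via embeddings + transformer encoder
demo_vec = torch.tensor([1.0, 67.0])   # e.g. sex flag and age

admission_vec = torch.cat([text_vec, code_vec, demo_vec])   # one vector for the whole admission
print(admission_vec.shape)     # torch.Size([898])

# A downstream task (readmission, mortality, long stay) can then be a simple classifier:
classifier = torch.nn.Linear(admission_vec.numel(), 2)
logits = classifier(admission_vec)
```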

Slide 22

Slide 22 text

TAPER: integrating linguistic and non-linguistic medical data
- Uses the clinical record data in MIMIC-III
- Outperformed ClinicalBERT on retrospectively predicting ICU patients' readmission, mortality, and prolonged hospital stay
S. Darabi, M. Kachuee, S. Fazeli and M. Sarrafzadeh, "TAPER: Time-Aware Patient EHR Representation," in IEEE Journal of Biomedical and Health Informatics, doi: 10.1109/JBHI.2020.2984931.

Slide 23

Slide 23 text

1. Medical domain-specific BERT
2. Beyond medical domain-specific BERT
3. Closing remarks

Slide 24

Slide 24 text

A Japanese domain-specific BERT
- Released by the Department of Medical AI Development, The University of Tokyo
- A BERT pre-trained on roughly 100 million lines of Japanese clinical records
- Word segmentation: MeCab + Byte Pair Encoding (see the sketch below)
- Dictionaries: mecab-ipadic-neologd + the MANBYO disease-name dictionary
- Vocabulary size: 25,000
https://ai-health.m.u-tokyo.ac.jp/uth-bert
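A minimal sketch of the two-stage tokenization described above: segment Japanese text into morphemes with MeCab, then learn a byte-pair-encoding vocabulary over the segmented text. The sample sentence, dictionary setup, and one-sentence training corpus are illustrative assumptions; UTH-BERT's actual pipeline uses its own dictionaries (mecab-ipadic-neologd and the MANBYO dictionary) and a 25,000-token vocabulary.

```python
# Sketch of MeCab word segmentation followed by BPE, in the spirit of UTH-BERT's pipeline.
# Sample text, dictionary choice, and the tiny training corpus are illustrative assumptions.
import MeCab                                 # pip install mecab-python3 plus a dictionary (e.g. unidic-lite)
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.trainers import BpeTrainer
from tokenizers.pre_tokenizers import Whitespace

tagger = MeCab.Tagger("-Owakati")            # "-Owakati" outputs space-separated morphemes
text = "右肺上葉に結節影を認める"              # "A nodular shadow is seen in the right upper lobe"
wakati = tagger.parse(text).strip()          # e.g. "右 肺 上葉 に 結節 影 を 認める"

# Train a small BPE vocabulary over the word-segmented corpus (here just one sentence for brevity).
bpe = Tokenizer(BPE(unk_token="[UNK]"))
bpe.pre_tokenizer = Whitespace()
trainer = BpeTrainer(vocab_size=25000, special_tokens=["[UNK]", "[CLS]", "[SEP]", "[MASK]", "[PAD]"])
bpe.train_from_iterator([wakati], trainer)

print(bpe.encode(wakati).tokens)             # sub-word pieces that would be fed to BERT pre-training
```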

Slide 25

Slide 25 text

Summary
- There are now several pre-trained BERT models for medical documents, and a Japanese one has also been released
- Trends emerging at the leading edge of the medical domain:
  - Integration of linguistic and non-linguistic data with Transformers
  - Application of BERT to non-linguistic data
- There is still some distance to the clinical front line, but we can hope it will gradually narrow from here