Contrastive Learning

Self-Supervised Learning

"We believe that self-supervised learning is one of the most promising ways to build such background knowledge and approximate a form of common sense in AI systems."
— Yann LeCun (楊⽴昆) / Ishan Misra, 2021, Self-supervised Learning: The Dark Matter of Intelligence
Word2Vec

T. Mikolov, K. Chen, G. Corrado, J. Dean. Efficient Estimation of Word Representations in Vector Space. Proceedings of Workshop at ICLR, 2013.

Once trained, the embeddings support some impressive tricks, such as word analogies:
Paris − France + Italy ≈ Rome; King − Man + Woman ≈ Queen
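The analogy trick above amounts to vector arithmetic in embedding space followed by a nearest-neighbor search. A minimal sketch with hypothetical toy 2-D embeddings (real word2vec vectors are learned and typically 100–300 dimensional):

```python
import numpy as np

# Toy 2-D embeddings with made-up values, chosen only to illustrate
# the analogy Paris - France + Italy ≈ Rome.
emb = {
    "Paris":  np.array([2.0, 9.0]),
    "France": np.array([1.0, 8.0]),
    "Italy":  np.array([1.2, 3.0]),
    "Rome":   np.array([2.2, 4.0]),
    "King":   np.array([8.0, 2.0]),
}

def analogy(a, b, c):
    """Return the vocabulary word closest (by cosine similarity) to a - b + c."""
    target = emb[a] - emb[b] + emb[c]

    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

    # Exclude the query words themselves from the candidates.
    candidates = [w for w in emb if w not in (a, b, c)]
    return max(candidates, key=lambda w: cos(emb[w], target))

print(analogy("Paris", "France", "Italy"))  # → Rome
```

With trained embeddings the same arithmetic recovers King − Man + Woman ≈ Queen; only the lookup table changes.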
ELMo

M.E. Peters, M. Neumann, M. Iyyer, M. Gardner, C. Clark, K. Lee, L. Zettlemoyer. Deep contextualized word representations. NAACL 2018. arXiv:1802.05365v2. AI2

ELMo launched the "Sesame Street era" of NLP!
BERT: Leading a New Era of NLP

J. Devlin, M.W. Chang, K. Lee, K. Toutanova. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv:1810.04805v2. Google
Transformer

Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., ... & Polosukhin, I. (2017). Attention is all you need. In Advances in Neural Information Processing Systems (pp. 5998-6008).

Uses self-attention, avoiding the drawbacks of RNNs!
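The core of the Transformer is the scaled dot-product attention from Vaswani et al. (2017): Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal numpy sketch (single head, no masking, random toy inputs):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                            # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))   # 3 queries of dimension 4
K = rng.normal(size=(5, 4))   # 5 keys
V = rng.normal(size=(5, 4))   # 5 values
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one output vector per query
```

Because every query attends to every key in one matrix product, there is no sequential recurrence, which is exactly how the Transformer sidesteps the RNN bottleneck.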
Even Better: Triplet Loss

F. Schroff, D. Kalenichenko, J. Philbin (Google). FaceNet: A Unified Embedding for Face Recognition and Clustering. arXiv:1503.03832.

[Figure: two CNNs producing embeddings ŷ1, ŷ2, …, ŷn for a positive sample and a negative sample]
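The FaceNet triplet loss pulls an anchor toward a positive sample (same identity) and pushes it away from a negative sample (different identity) by at least a margin: L = max(‖a − p‖² − ‖a − n‖² + α, 0). A minimal single-triplet sketch with made-up toy embeddings:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """FaceNet triplet loss: max(||a - p||^2 - ||a - n||^2 + margin, 0)."""
    d_pos = np.sum((anchor - positive) ** 2)  # squared distance to positive
    d_neg = np.sum((anchor - negative) ** 2)  # squared distance to negative
    return max(d_pos - d_neg + margin, 0.0)

a = np.array([0.0, 0.0])
p = np.array([0.1, 0.0])   # same identity: embedded close to the anchor
n = np.array([1.0, 1.0])   # different identity: embedded far from the anchor
print(triplet_loss(a, p, n))  # 0.0 — the margin is already satisfied
```

When the negative sits closer than the positive plus the margin, the loss is positive and its gradient reshapes the embedding; in FaceNet this is averaged over many mined triplets per batch.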