構築”. In Proceedings of the 26th Annual Meeting of the Association for Natural Language Processing (NLP2020).
• [Devlin et al. 2019] Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. In NAACL, volume 1, pages 4171–4186, 2019.
• [Sun et al. 2019] Chi Sun, Xipeng Qiu, Yige Xu, and Xuanjing Huang. How to Fine-Tune BERT for Text Classification? arXiv preprint arXiv:1905.05583, 2019. http://arxiv.org/abs/1905.05583
• [Erickson et al. 2020] Nick Erickson, Jonas Mueller, Alexander Shirkov, Hang Zhang, Pedro Larroy, Mu Li, and Alexander Smola. AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data. arXiv preprint arXiv:2003.06505, 2020. https://arxiv.org/abs/2003.06505