J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova, BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, arXiv:1810.04805 [cs], May 2019.
M. E. Peters, M. Neumann, M. Iyyer, M. Gardner, C. Clark, K. Lee, and L. Zettlemoyer, Deep contextualized word representations, CoRR, vol. abs/1802.05365, 2018.
N. Reimers and I. Gurevych, Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks, arXiv:1908.10084 [cs], Aug. 2019.
R. Keraron, G. Lancrenon, M. Bras, F. Allary, G. Moyse, T. Scialom, E.-P. Soriano-Morales, and J. Staiano, Project PIAF: Building a Native French Question-Answering Dataset, in Proceedings of The 12th Language Resources and Evaluation Conference, Marseille, France, pp. 5481-5490, European Language Resources Association, May 2020.