Slide 17
References
- [Sajjad+22] Analyzing Encoded Concepts in Transformer Language Models, NAACL 2022
- [Belinkov+20] On the Linguistic Representational Power of Neural Machine Translation Models, CL 2020
- [Lepori+20] Picking BERT's Brain: Probing for Linguistic Dependencies in Contextualized Embeddings Using Representational Similarity Analysis, COLING 2020
- [Michael+20] Asking without Telling: Exploring Latent Ontologies in Contextual Representations, EMNLP 2020
- [Marvin+18] Targeted Syntactic Evaluation of Language Models, EMNLP 2018
- [Linzen+16] Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies, TACL 2016
- [Elazar+21] Amnesic Probing: Behavioral Explanation with Amnesic Counterfactuals, TACL 2021
- [Luu+22] Time Waits for No One! Analysis and Challenges of Temporal Misalignment, NAACL 2022