• Smith, Nathaniel J., and Roger Levy. 2013. "The effect of word predictability on reading time is logarithmic." Cognition 128 (3): 302–19.
• Levy, Roger. 2008. "Expectation-based syntactic comprehension." Cognition 106 (3): 1126–77.
• Genzel, Dmitriy, and Eugene Charniak. 2002. "Entropy rate constancy in text." In Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics, 199–206.
• Jaeger, T. Florian, and Roger Levy. 2007. "Speakers optimize information density through syntactic reduction." In Advances in Neural Information Processing Systems, edited by B. Schölkopf, J. Platt, and T. Hoffman, 19:849–56. MIT Press.
• Hale, John. 2016. "Information-theoretical Complexity Metrics." Language and Linguistics Compass 10 (9): 397–412.
• Futrell, Richard, Peng Qian, Edward Gibson, Evelina Fedorenko, and Idan Blank. 2019. "Syntactic Dependencies Correspond to Word Pairs with High Mutual Information." In Proceedings of the Fifth International Conference on Dependency Linguistics (Depling, SyntaxFest 2019), 3–13. Paris, France: Association for Computational Linguistics.
• Mollica, Francis, Matthew Siegelman, Evgeniia Diachek, Steven T. Piantadosi, Zachary Mineroff, Richard Futrell, Hope Kean, Peng Qian, and Evelina Fedorenko. 2020. "Composition Is the Core Driver of the Language-Selective Network." Neurobiology of Language 1 (1): 104–34.