Most Popular Papers *
Learning Argument Structures with Recurrent Neural Network Grammars
Ryo Yoshida and Yohei Oseki
How Well Do LSTM Language Models Learn Filler-gap Dependencies?
Satoru Ozaki, Dan Yurovsky, and Lori Levin
Learning Stress Patterns with a Sequence-to-Sequence Neural Network
Brandon Prickett and Joe Pater
Masked language models directly encode linguistic uncertainty
Cassandra Jacobs, Ryan J. Hubbard, and Kara D. Federmeier
Linguistic Complexity and Planning Effects on Word Duration in Hindi Read Aloud Speech
Sidharth Ranjan, Rajakrishnan Rajkumar, and Sumeet Agarwal
* Based on the average number of full-text downloads per day since the paper was posted.
Updated as of 02/28/22.