Most Popular Papers *
Where New Words Are Born: Distributional Semantic Analysis of Neologisms and Their Semantic Neighborhoods
Maria Ryskina, Ella Rabinovich, Taylor Berg-Kirkpatrick, David R. Mortensen, and Yulia Tsvetkov
Do language models know how to be polite?
Soo-Hwan Lee and Shaonan Wang
Unbounded Recursion in Two Dimensions, Where Syntax and Prosody Meet
Edward P. Stabler and Kristine M. Yu
What do you mean, BERT? Assessing BERT as a Distributional Semantics Model
Timothee Mickus, Denis Paperno, Mathieu Constant, and Kees van Deemter
Subject-verb Agreement with Seq2Seq Transformers: Bigger Is Better, but Still Not Best
Michael A. Wilson, Zhenghao Zhou, and Robert Frank
What Code-Switching Strategies are Effective in Dialogue Systems?
Emily Ahn, Cecilia Jimenez, Yulia Tsvetkov, and Alan Black
Language Models Can Learn Exceptions to Syntactic Rules
Cara Su-Yi Leong and Tal Linzen
Look at that! BERT can be easily distracted from paying attention to morphosyntax
Rui P. Chaves and Stephanie N. Richter
CANDS: A Computational Implementation of Collins and Stabler (2016)
Satoru Ozaki and Yohei Oseki
Rethinking Representations: A Log-bilinear Model of Phonotactics
Huteng Dai, Connor Mayer, and Richard Futrell
* Based on the average number of full-text downloads per day since the paper was posted.
Updated as of 01/18/24.