Author ORCID Identifier
https://orcid.org/0000-0001-8496-4445
Access Type
Open Access Dissertation
Document Type
Dissertation
Degree Name
Doctor of Philosophy (PhD)
Degree Program
Linguistics
Year Degree Awarded
2020
Month Degree Awarded
September
First Advisor
Gaja Jarosz
Second Advisor
Joe Pater
Third Advisor
John Kingston
Fourth Advisor
Brendan O'Connor
Subject Categories
Computational Linguistics | Phonetics and Phonology
Abstract
This dissertation shows how a theory of grammatical representations and a theory of learning can be combined to generate gradient typological predictions in phonology, predicting not only which patterns are expected to exist, but also their relative frequencies: patterns which are learned more easily are predicted to be more typologically frequent than those which are more difficult to learn.
In Chapter 1 I motivate and describe the specific implementation of this methodology in this dissertation. Maximum Entropy grammar (Goldwater & Johnson 2003) is combined with two agent-based learning models, the iterated and the interactive learning model, each of which mimics a type of learning dynamic observed in natural language acquisition.
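To make the methodology concrete, the sketch below is a minimal, illustrative reconstruction, not the dissertation's own code: a Maximum Entropy grammar assigns each candidate a probability proportional to the exponential of its harmony (the negative weighted sum of its constraint violations), and an iterated learning chain has each new agent fit constraint weights to data sampled from the previous agent's grammar. The toy tableau, initial weights, learning rate, and sample size are all invented for illustration; the models' actual specifications are given in Chapter 1.

import math
import random

def maxent_probs(tableau, weights):
    """MaxEnt probability of each candidate: exp(harmony) / Z, where
    harmony = -(weighted sum of that candidate's constraint violations)."""
    harmonies = [-sum(w * v for w, v in zip(weights, viols)) for viols in tableau]
    z = sum(math.exp(h) for h in harmonies)
    return [math.exp(h) / z for h in harmonies]

def learn(samples, tableau, weights, rate=0.1):
    """One pass of stochastic gradient ascent on the log-likelihood:
    each update nudges the weights toward matching expected violations
    to the violations of the observed candidate."""
    for observed in samples:                      # index of the observed candidate
        probs = maxent_probs(tableau, weights)
        for i in range(len(weights)):
            expected = sum(p * viols[i] for p, viols in zip(probs, tableau))
            weights[i] = max(weights[i] + rate * (expected - tableau[observed][i]), 0.0)
    return weights

# Toy tableau: rows are candidates, columns are violation counts per constraint.
tableau = [[1, 0],   # candidate A violates constraint 1
           [0, 1]]   # candidate B violates constraint 2

# Iterated learning: each generation learns from data sampled from its predecessor.
weights = [2.0, 0.5]                                        # initial "teacher" grammar
for generation in range(5):
    probs = maxent_probs(tableau, weights)                  # teacher's output distribution
    data = random.choices(range(len(tableau)), weights=probs, k=100)
    weights = learn(data, tableau, [0.0, 0.0])              # fresh learner, zero weights
    print("generation", generation, "learned weights:", [round(w, 2) for w in weights])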
In Chapter 2 I illustrate how this system works using a simplified, abstract example typology, and show how the models generate a bias away from patterns which rely on cumulative constraint interaction ("gang effects"), and a bias away from variable patterns. Both of these biases match observed trends in natural language typology and psycholinguistic experiments.
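A gang effect arises in weighted-constraint grammars such as Maximum Entropy grammar because penalties add: two constraints that are each outweighed by an opposing constraint can jointly outweigh it. The constraint labels and weights below are invented purely for illustration and are not drawn from the dissertation.

# Hypothetical gang effect: two "weak" markedness constraints jointly outweigh
# a "strong" faithfulness constraint, even though neither does so on its own.
w_m1, w_m2, w_faith = 2.0, 2.5, 3.0

print(w_m1 < w_faith)            # True: violating M1 alone is cheaper than repairing
print(w_m2 < w_faith)            # True: likewise for M2 alone
print(w_m1 + w_m2 > w_faith)     # True: violating both is costlier, so the repair wins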
Chapter 3 further explores the models' bias away from cumulative constraint interaction using an empirical test case: the typology of possible patterns of contrast between two fricatives. This typology yields five possible patterns, the rarest of which results from a gang effect. Simulations performed with both models produce a bias against the gang effect pattern.
Chapter 4 further explores the models' bias away from variation using evidence from artificial grammar learning experiments, in which human participants show a bias away from variable patterns (e.g. Smith & Wonnacott 2010). This test case was also chosen to disambiguate between variable behavior within a lexical item (variation) and variable behavior across lexical items (exceptionality). The results of simulations performed with both learning models are consistent with the observed bias away from variable patterns in humans.
The results of the iterated and interactive learning models presented in this dissertation provide support for the use of this methodology in investigating the typological predictions of linguistic theories of grammar and learning, as well as in addressing broader questions regarding the source of gradient typological trends, and whether certain properties of natural language must be innately specified, or might emerge through other means.
DOI
https://doi.org/10.7275/19049538
Recommended Citation
Hughto, Coral, "Emergent Typological Effects of Agent-Based Learning Models in Maximum Entropy Grammar" (2020). Doctoral Dissertations. 2028.
https://doi.org/10.7275/19049538
https://scholarworks.umass.edu/dissertations_2/2028
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.