Open Access Dissertation
Doctor of Philosophy (PhD)
Electrical and Computer Engineering
Machine learning is the study of computer algorithms that analyze and interpret patterns and structures in data. It has been applied successfully across many areas of computer science, achieving state-of-the-art results that enable learning, reasoning, and decision-making without human interaction. This research develops innovative data-parallel frameworks that adapt to available computing resources in order to parallelize machine learning and deep learning algorithms and speed up training. To that end, this dissertation explores three frameworks: (1) a sync-on-the-fly framework for gradient descent algorithms on transient resources; (2) an Asynchronous Proactive Data Parallel framework for both gradient descent and Expectation-Maximization algorithms; and (3) a Cohesive Mini-batches framework for graph convolutional networks.
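As background for the frameworks listed above, the following is a minimal sketch of the baseline they build on: synchronous data-parallel gradient descent, where each worker computes a gradient on its own data shard and all workers apply the averaged update. The worker count, least-squares objective, and function names here are illustrative assumptions, not details taken from the dissertation itself.

```python
import numpy as np

def shard_gradient(w, X, y):
    """Least-squares gradient computed on one worker's data shard."""
    n = len(y)
    return 2.0 / n * X.T @ (X @ w - y)

def data_parallel_gd(X, y, num_workers=4, lr=0.1, steps=100):
    """Synchronous data-parallel gradient descent (illustrative sketch).

    Each step: every worker computes a gradient on its shard, the
    gradients are averaged at a synchronization barrier, and all
    workers apply the same update to the shared model.
    """
    w = np.zeros(X.shape[1])
    X_shards = np.array_split(X, num_workers)
    y_shards = np.array_split(y, num_workers)
    for _ in range(steps):
        grads = [shard_gradient(w, Xs, ys)
                 for Xs, ys in zip(X_shards, y_shards)]
        w -= lr * np.mean(grads, axis=0)  # all workers wait here
    return w

# Toy problem: recover known weights from noiseless linear data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w
w = data_parallel_gd(X, y)
```

The synchronization barrier in the update step is exactly what a sync-on-the-fly or asynchronous framework would relax, since on transient or heterogeneous resources the slowest worker otherwise stalls every iteration.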
Zhao, Guoyi, "Data Parallel Frameworks for Training Machine Learning Models" (2022). Doctoral Dissertations. 2585.
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.