
Data Parallel Frameworks for Training Machine Learning Models

Abstract
Machine learning is the study of computer algorithms that analyze and interpret patterns and structures in data. It has been successfully applied to many areas of computer science and has achieved state-of-the-art results, enabling learning, reasoning, and decision-making without human interaction. This research aims to develop innovative data parallel frameworks that adapt to the available computing resources to parallelize different machine learning and deep learning algorithms and speed up training. To that end, we explore three frameworks in this dissertation: (1) a Sync-on-the-fly framework for gradient descent algorithms on transient resources; (2) an Asynchronous Proactive Data Parallel framework for both gradient descent and Expectation-Maximization algorithms; and (3) a Cohesive Mini-batches framework for graph convolutional networks.
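The common pattern underlying the frameworks above is data parallelism: each worker computes a gradient on its own shard of the data, and the gradients are aggregated into a single parameter update. The sketch below illustrates only this general synchronous pattern on a least-squares model; the function names and the simulated sharding are illustrative and are not taken from the dissertation's frameworks.

```python
import numpy as np

def local_gradient(w, X, y):
    """Gradient of the mean squared error 0.5*||Xw - y||^2 / n on one shard."""
    n = len(y)
    return X.T @ (X @ w - y) / n

def data_parallel_gd(shards, w0, lr=0.1, steps=200):
    """Synchronous data-parallel gradient descent (sequentially simulated)."""
    w = w0.copy()
    for _ in range(steps):
        # Each shard's gradient could be computed on a separate worker.
        grads = [local_gradient(w, X, y) for X, y in shards]
        # Synchronous aggregation: average, then apply one shared update.
        w -= lr * np.mean(grads, axis=0)
    return w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
X = rng.normal(size=(400, 2))
y = X @ true_w
# Partition the rows across 4 simulated workers.
shards = [(X[i::4], y[i::4]) for i in range(4)]
w = data_parallel_gd(shards, np.zeros(2))
```

An asynchronous variant, as in the second framework above, would instead apply each worker's gradient to the shared parameters as soon as it arrives, trading gradient staleness for the removal of the synchronization barrier.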
Type
Open access dissertation
License
http://creativecommons.org/licenses/by/4.0/