Off-campus UMass Amherst users: To download campus access dissertations, please log into our proxy server with your UMass Amherst user name and password.

Non-UMass Amherst users: Please talk to your librarian about requesting this dissertation through interlibrary loan.

Dissertations that have an embargo placed on them will not be available to anyone until the embargo expires.

Date of Award

9-2012

Access Type

Campus Access

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Degree Program

Mathematics and Statistics

First Advisor

John Staudenmayer

Second Advisor

Michael Lavine

Third Advisor

Daeyoung Kim

Subject Categories

Statistics and Probability

Abstract

Trees have long been used as a flexible way to build regression and classification models for complex problems. They can accommodate nonlinear response-predictor relationships as well as interactions among predictors. Tree-based models handle data sets with predictors of mixed types, both ordered and categorical, in a natural way. The tree-based regression model can also serve as the base model for additive models, the most prominent of which are gradient boosted trees and random forests. Classical training algorithms for tree-based models are deterministic greedy algorithms: they are fast, but they are usually not guaranteed to find an optimal tree.
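As a minimal illustration of the greedy strategy described above (not the dissertation's own code), the sketch below exhaustively searches the split thresholds of a single ordered predictor and keeps the one that most reduces the sum of squared errors; function names are illustrative:

```python
def sse(ys):
    """Sum of squared errors around the mean prediction at a node."""
    if not ys:
        return 0.0
    mean = sum(ys) / len(ys)
    return sum((y - mean) ** 2 for y in ys)

def best_split(xs, ys):
    """Greedy best split on one ordered predictor.

    Returns (threshold, sse_after_split), or (None, sse_before)
    when no split reduces the error.
    """
    base = sse(ys)
    best = (None, base)
    for t in sorted(set(xs))[:-1]:  # candidate thresholds
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        total = sse(left) + sse(right)
        if total < best[1]:
            best = (t, total)
    return best

# Two clear clusters in the response: the greedy search splits between them.
xs = [1, 2, 3, 10, 11, 12]
ys = [1.0, 1.1, 0.9, 5.0, 5.1, 4.9]
t, err = best_split(xs, ys)
```

A full tree learner applies this search recursively to each resulting child node; the greediness is exactly why the final tree can be suboptimal globally even though each individual split is locally best.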

In this dissertation, we discuss a Bayesian approach to building tree-based models. In Bayesian tree models, each tree is assigned a prior probability based on its structure, and standard Markov chain Monte Carlo (MCMC) algorithms can be used to explore the posterior distribution. This thesis aims to improve the computational efficiency and performance of Bayesian tree-based models. We introduce new proposals, or "moves", into the MCMC algorithm to improve its efficiency, and we use temperature-based algorithms to help the sampler escape local optima and move toward the global optimum of the posterior distribution. Moreover, we develop semiparametric Bayesian additive tree models in which some predictors enter the model parametrically. Technical details on using parallel computing to shorten the computing time are also discussed.
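The temperature idea mentioned in the abstract can be sketched with a single tempered Metropolis-Hastings step on a toy state space; this is a generic illustration under assumed names (`log_post`, `propose`), not the dissertation's algorithm or its tree proposals:

```python
import math
import random

def metropolis_step(state, log_post, propose, temp=1.0, rng=random):
    """One Metropolis-Hastings step at temperature `temp`.

    Accepts the proposed state with probability
    min(1, exp((log_post(new) - log_post(old)) / temp)).
    A higher temperature flattens the posterior, making downhill
    moves more likely to be accepted and helping the chain escape
    local modes.
    """
    candidate = propose(state)
    log_ratio = (log_post(candidate) - log_post(state)) / temp
    if math.log(rng.random()) < log_ratio:
        return candidate
    return state

# Toy bimodal log-posterior on the integers, with modes near -2 and +2.
def log_post(s):
    return -min((s - 2) ** 2, (s + 2) ** 2 + 0.5)

def propose(s):
    return s + random.choice([-1, 1])

random.seed(0)
state = -2  # start in the shallower mode
for _ in range(2000):
    state = metropolis_step(state, log_post, propose, temp=2.0)
```

In the Bayesian tree setting, `state` would be a tree (or forest), `propose` would be a structural move such as grow, prune, or change-split, and `log_post` the log posterior of the tree; the tempered acceptance rule is what lets the chain cross low-probability regions between modes.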

DOI

https://doi.org/10.7275/0xy2-t025
