Document Type

Open Access Dissertation

Degree Name

Doctor of Philosophy (PhD)

Degree Program

Computer Science

Year Degree Awarded

2015

Month Degree Awarded

September

First Advisor

David Jensen

Subject Categories

Artificial Intelligence and Robotics

Abstract

Over the past twenty-five years, a large number of algorithms have been developed to learn the structure of causal graphical models. Many of these algorithms learn causal structures by analyzing the implications of observed conditional independence among variables that describe characteristics of the domain being analyzed. They do so by applying inference rules (data analysis operations, such as conditional independence tests), each of which can eliminate large parts of the space of possible causal structures. Results show that the sequence of inference rules used by PC, a widely applied algorithm for constraint-based learning of causal models, is effective but not optimal. This is because algorithms such as PC ignore the probability of the outcomes of these inference rules. We demonstrate how an alternative algorithm can reliably outperform PC by taking into account the probability of inference rule outcomes. Specifically, we show that an informed search that bases the order of causal inference on a prior probability distribution over the space of causal constraints can generate a flexible sequence of analyses that efficiently identifies the same results as PC. This class of algorithms is able to outperform PC even under uniform or erroneous priors.
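The idea in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical toy, not the dissertation's algorithm: it runs a PC-style skeleton search over four variables, using a hand-coded d-separation oracle in place of statistical conditional independence tests, and compares a fixed test ordering against an "informed" ordering that runs the tests most likely to report independence first. Both orderings recover the same skeleton, but the informed ordering removes an edge earlier and so skips a redundant test. All variable names, the example graph (the chain A → D → C with B isolated), and the prior values are illustrative assumptions.

```python
from itertools import combinations

VARIABLES = ["A", "B", "C", "D"]

# Toy d-separation oracle for the chain A -> D -> C with B isolated.
# (Hypothetical stand-in: real constraint-based learners estimate these
# independencies with statistical tests on data.)
INDEPENDENCIES = {
    (frozenset("AB"), frozenset()),
    (frozenset("BC"), frozenset()),
    (frozenset("BD"), frozenset()),
    (frozenset("AC"), frozenset("D")),  # D blocks the A -> D -> C path
}

def ci_oracle(x, y, cond):
    """True iff x and y are conditionally independent given cond."""
    return (frozenset({x, y}), frozenset(cond)) in INDEPENDENCIES

def learn_skeleton(variables, prior=None, max_cond=1):
    """PC-style skeleton search: delete edge x-y as soon as some
    conditioning set renders x and y independent.  If `prior` is given,
    it maps each candidate test to the believed probability that the
    test will report independence, and higher-probability tests run
    first (an informed ordering over constraint outcomes)."""
    edges = {frozenset(p) for p in combinations(variables, 2)}
    tests_run = 0
    for size in range(max_cond + 1):
        # Enumerate candidate tests at this conditioning-set size.
        candidates = []
        for edge in sorted(edges, key=sorted):
            others = [v for v in variables if v not in edge]
            for cond in combinations(others, size):
                candidates.append((edge, frozenset(cond)))
        if prior is not None:
            candidates.sort(key=lambda t: -prior(t))  # stable sort
        for edge, cond in candidates:
            if edge not in edges:
                continue  # edge already removed; this test is redundant
            x, y = sorted(edge)
            tests_run += 1
            if ci_oracle(x, y, cond):
                edges.discard(edge)
    return edges, tests_run

# Uninformed (fixed) ordering vs. a prior that correctly expects
# A and C to be separated by {D}.
informed = lambda t: 0.9 if t == (frozenset("AC"), frozenset("D")) else 0.1
skel_fixed, n_fixed = learn_skeleton(VARIABLES)
skel_inf, n_inf = learn_skeleton(VARIABLES, prior=informed)
```

Under this toy oracle both runs return the true skeleton {A-D, C-D}, while the informed ordering performs fewer conditional independence tests, mirroring the abstract's claim that ordering inference rules by the probability of their outcomes can identify the same results as PC more efficiently.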
