
Informed Search for Learning Causal Structure

Abstract
Over the past twenty-five years, a large number of algorithms have been developed to learn the structure of causal graphical models. Many of these algorithms learn causal structure by analyzing the implications of conditional independence observed among the variables that describe the domain being analyzed. They do so by applying inference rules (data-analysis operations such as conditional independence tests), each of which can eliminate large parts of the space of possible causal structures. Results show that the sequence of inference rules used by PC, a widely applied algorithm for constraint-based learning of causal models, is effective but not optimal, because algorithms such as PC ignore the probability of the outcomes of these inference rules. We demonstrate how an alternative algorithm can reliably outperform PC by taking into account the probability of inference rule outcomes. Specifically, we show that an informed search that bases the order of causal inference on a prior probability distribution over the space of causal constraints can generate a flexible sequence of analyses that efficiently identifies the same results as PC. This class of algorithms can outperform PC even under uniform or erroneous priors.
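To make the idea of an informed test ordering concrete, the following is a minimal Python sketch, not the paper's implementation: it enumerates the same candidate conditional independence tests a constraint-based learner would consider, but orders them by a prior probability over constraint outcomes rather than by PC's fixed schedule. The function name, the prior's key format, the default probability, and the "most-probable constraint first" policy are all illustrative assumptions.

```python
from itertools import combinations


def informed_test_order(variables, prior, default=0.5):
    """Return candidate CI tests (X, Y, Z) ordered by a prior over constraints.

    `prior` maps (frozenset({X, Y}), frozenset(Z)) to the prior probability
    that X is independent of Y given Z; unknown constraints fall back to
    `default`. PC would work through these tests in a fixed order; here the
    prior decides which constraints to check first.
    """
    tests = []
    for x, y in combinations(variables, 2):
        others = [v for v in variables if v not in (x, y)]
        for size in range(len(others) + 1):
            for z in combinations(others, size):
                key = (frozenset((x, y)), frozenset(z))
                tests.append(((x, y, z), prior.get(key, default)))
    # One simple policy: run the tests the prior deems most likely to hold
    # (and hence to prune many candidate structures) first.
    tests.sort(key=lambda item: item[1], reverse=True)
    return tests


if __name__ == "__main__":
    # Hypothetical prior: A and C are probably independent given B.
    prior = {(frozenset(("A", "C")), frozenset(("B",))): 0.9}
    for (x, y, z), p in informed_test_order(["A", "B", "C"], prior):
        print(f"test {x} _||_ {y} | {set(z) or '{}'}  (prior {p:.2f})")
```

With a uniform prior (an empty `prior` dictionary), the ordering degenerates to an arbitrary fixed sequence, which mirrors the abstract's claim that the informed search still matches PC's results even when the prior carries no information.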