Gaussian Process Networks

Nir Friedman, Iftach Nachman

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper we address the problem of learning the structure of a Bayesian network in domains with continuous variables. This task requires a procedure for comparing different candidate structures. In the Bayesian framework, this is done by evaluating the marginal likelihood of the data given a candidate structure. This term can be computed in closed form for standard parametric families (e.g., Gaussians), and can be approximated, at some computational cost, for some semi-parametric families (e.g., mixtures of Gaussians). We present a new family of continuous variable probabilistic networks that are based on Gaussian Process priors. These priors are semi-parametric in nature and can learn almost arbitrary noisy functional relations. Using these priors, we can directly compute marginal likelihoods for structure learning. The resulting method can discover a wide range of functional dependencies in multivariate data. We develop the Bayesian score of Gaussian Process Networks and describe how to learn them from data. We present empirical results on artificial data as well as on real-life domains with non-linear dependencies.
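The abstract's key computational idea is that a GP prior gives the marginal likelihood of a child variable given its candidate parents in closed form, so candidate structures can be scored directly. The following is a minimal illustrative sketch of that idea (not the authors' code): it scores one node under two candidate parent sets using the standard GP log marginal likelihood log N(y | 0, K + σ²I); the RBF kernel and its hyperparameters are assumptions for illustration.

```python
# Sketch: scoring candidate parent sets for one node of a Gaussian Process
# network. Under a GP prior, the marginal likelihood of the child's values y
# given its parents' values X is a closed-form Gaussian density.
# Kernel choice (RBF) and hyperparameters are illustrative assumptions.
import numpy as np

def gp_log_marginal_likelihood(X, y, length_scale=1.0, signal_var=1.0, noise_var=0.1):
    """log N(y | 0, K + noise_var * I) for an RBF kernel over parent values X."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = signal_var * np.exp(-0.5 * sq_dists / length_scale ** 2)
    K[np.diag_indices_from(K)] += noise_var
    L = np.linalg.cholesky(K)                       # K + noise*I = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    n = len(y)
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * n * np.log(2.0 * np.pi))

# Compare two candidate structures for node Y: true parent X vs. unrelated Z.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=50)
z = rng.uniform(-2, 2, size=50)                    # independent of y
y = np.sin(x) + 0.1 * rng.standard_normal(50)      # Y is a noisy function of X
score_x = gp_log_marginal_likelihood(x[:, None], y)
score_z = gp_log_marginal_likelihood(z[:, None], y)
```

In a structure-learning loop, such per-node scores would be summed over the network and compared across candidate parent sets; the non-parametric kernel is what lets the score reward non-linear dependencies like the sine relation above.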
Original language: English
Title of host publication: Proceedings of the Sixteenth Conference on Uncertainty in Artificial Intelligence
Place of publication: San Francisco, CA, USA
Publisher: Morgan Kaufmann Publishers, Inc.
Pages: 211–219
ISBN (Print): 1558607099
State: Published - 2000
Event: The Sixteenth Conference on Uncertainty in Artificial Intelligence - Stanford University, Stanford, United States
Duration: 30 Jun 1999 – 3 Jul 1999
Conference number: 16

Publication series

Name: UAI'00
Publisher: Morgan Kaufmann Publishers Inc.

Conference

Conference: The Sixteenth Conference on Uncertainty in Artificial Intelligence
Country/Territory: United States
City: Stanford
Period: 30/06/99 – 3/07/99
