Select Lab Publications


Residual Splash for Optimally Parallelizing Belief Propagation (2009)

By: Joseph Gonzalez, Yucheng Low, and Carlos Guestrin

Abstract: As computer architectures move towards multicore, we must build a theoretical understanding of parallelism in machine learning. In this paper we focus on parallel inference in graphical models. We demonstrate that the natural, fully synchronous parallelization of belief propagation is highly inefficient. By bounding the achievable parallel performance in chain graphical models, we develop a theoretical understanding of the parallel limitations of belief propagation. We then provide a new parallel belief propagation algorithm that achieves optimal performance. Using two challenging real-world tasks, we empirically evaluate the performance of our algorithm on large cyclic graphical models, where we achieve near-linear parallel scaling and outperform alternative algorithms.
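
The residual-scheduling idea underlying the abstract can be illustrated outside the parallel setting: update first the message whose value would change the most. Below is a minimal, single-threaded Python sketch of residual-prioritized belief propagation on a pairwise Markov random field. All names (residual_bp, psi_node, psi_edge, and so on) are illustrative assumptions rather than names from the paper's implementation, and the residual of a pending message is approximated by the largest incoming change instead of being recomputed exactly.

# Minimal single-threaded sketch of residual-scheduled belief propagation
# on a pairwise MRF with sum-product messages. Illustrative only; not the
# paper's implementation.
import heapq
import numpy as np

def residual_bp(nodes, edges, psi_node, psi_edge, max_updates=10000, tol=1e-6):
    """nodes: iterable of variable ids.
    edges: list of undirected (i, j) pairs.
    psi_node[i]: length-k unary potential.
    psi_edge[(i, j)]: k-by-k pairwise potential scoring (x_i = a, x_j = b)."""
    nbrs = {i: [] for i in nodes}
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    k = len(next(iter(psi_node.values())))

    # Directed messages m[(i, j)] from i to j, initialized uniform.
    m = {}
    for i, j in edges:
        m[(i, j)] = np.full(k, 1.0 / k)
        m[(j, i)] = np.full(k, 1.0 / k)

    def compute_message(i, j):
        # Product of i's unary potential and incoming messages, except j's.
        prod = psi_node[i].copy()
        for u in nbrs[i]:
            if u != j:
                prod = prod * m[(u, i)]
        pot = psi_edge[(i, j)] if (i, j) in psi_edge else psi_edge[(j, i)].T
        out = pot.T @ prod  # marginalize out x_i
        return out / out.sum()

    # Priority queue keyed by negated residual; every message starts at 1.0
    # so each is updated at least once. Stale heap entries are skipped.
    residual = {e: 1.0 for e in m}
    pq = [(-1.0, e) for e in m]
    heapq.heapify(pq)

    updates = 0
    while pq and updates < max_updates:
        neg_r, (i, j) = heapq.heappop(pq)
        if -neg_r != residual[(i, j)]:
            continue  # stale entry
        if -neg_r < tol:
            break     # largest residual is below tolerance: converged
        new = compute_message(i, j)
        delta = float(np.abs(new - m[(i, j)]).max())
        m[(i, j)] = new
        residual[(i, j)] = 0.0
        updates += 1
        # The change at j raises the (approximate) residual of j's other
        # outgoing messages; reprioritize them.
        for u in nbrs[j]:
            if u != i and delta > residual[(j, u)]:
                residual[(j, u)] = delta
                heapq.heappush(pq, (-delta, (j, u)))

    # Beliefs: unary potential times all incoming messages, normalized.
    beliefs = {}
    for i in nodes:
        b = psi_node[i].copy()
        for u in nbrs[i]:
            b = b * m[(u, i)]
        beliefs[i] = b / b.sum()
    return beliefs

A small usage example: residual_bp([0, 1, 2], [(0, 1), (1, 2)], {i: np.ones(2) for i in range(3)}, {e: np.array([[2.0, 1.0], [1.0, 2.0]]) for e in [(0, 1), (1, 2)]}) returns the node marginals of a three-variable chain. The paper's Splash operation builds on this ordering in the parallel setting: roughly, each processor repeatedly grows a small tree rooted at a high-residual vertex and runs an exact sweep over it, rather than updating one message at a time.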



Download Information
Joseph Gonzalez, Yucheng Low, and Carlos Guestrin (2009). "Residual Splash for Optimally Parallelizing Belief Propagation." In Artificial Intelligence and Statistics (AISTATS).
BibTeX citation

@inproceedings{Gonzalez+al:aistats09paraml,
  title       = {Residual Splash for Optimally Parallelizing Belief Propagation},
  author      = {Joseph Gonzalez and Yucheng Low and Carlos Guestrin},
  booktitle   = {Artificial Intelligence and Statistics (AISTATS)},
  year        = {2009},
  month       = {April},
  address     = {Clearwater Beach, Florida},
  wwwfilebase = {aistats2009-gonzalez-low-guestrin},
  wwwtopic    = {Parallel Learning},
}


