The code on this webpage was developed by researchers of the SELECT Lab. Permission is granted to use this code for research purposes, although we request that you acknowledge these researchers in any published work, or equivalent distribution, using the bibtex entries provided. We make no warranties as to the quality or accuracy of any of the code linked from this page.
This MATLAB/C++ library contains an efficient parallel solver for the Lasso and logistic regression, based on the parallel coordinate descent algorithm described in the paper:
- Joseph K. Bradley, Aapo Kyrola, Danny Bickson, and Carlos Guestrin. "Parallel Coordinate Descent for L1-Regularized Loss Minimization." International Conference on Machine Learning (ICML 2011).
Download source code: shotgun.zip
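The core of the solver is the classic coordinate-descent ("shooting") update for the Lasso, which Shotgun runs on many coordinates in parallel. The sequential sketch below illustrates the soft-thresholding update on a dense design matrix; the function names are our own illustration, not Shotgun's API.

```cpp
#include <cmath>
#include <vector>

// Soft-thresholding operator used in each coordinate update.
double soft_threshold(double x, double lambda) {
    if (x > lambda)  return x - lambda;
    if (x < -lambda) return x + lambda;
    return 0.0;
}

// Sequential coordinate descent for
//   min_w 0.5 * ||X w - y||^2 + lambda * ||w||_1.
// Shotgun's contribution is running the inner loop over
// coordinates in parallel; this sketch keeps it sequential.
std::vector<double> lasso_cd(const std::vector<std::vector<double>>& X,
                             const std::vector<double>& y,
                             double lambda, int iters) {
    size_t n = X.size(), d = X[0].size();
    std::vector<double> w(d, 0.0), r(y);  // residual r = y - X w
    for (int it = 0; it < iters; ++it) {
        for (size_t j = 0; j < d; ++j) {
            double xx = 0.0, xr = 0.0;
            for (size_t i = 0; i < n; ++i) {
                xx += X[i][j] * X[i][j];
                xr += X[i][j] * r[i];
            }
            double w_new = soft_threshold(w[j] + xr / xx, lambda / xx);
            double delta = w_new - w[j];
            if (delta != 0.0)
                for (size_t i = 0; i < n; ++i) r[i] -= delta * X[i][j];
            w[j] = w_new;
        }
    }
    return w;
}
```

With an orthonormal design the update reduces to soft-thresholding the least-squares solution, which makes the behavior easy to check by hand.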
This MATLAB/C++ library contains multi-core implementations of the Chromatic and Splash Gibbs samplers described in the paper:
- Joseph Gonzalez, Yucheng Low, Arthur Gretton, Carlos Guestrin. "Parallel Gibbs Sampling: From Colored Fields to Thin Junction Trees." AIStats, 2011.
The library is implemented on top of our GraphLab framework and therefore requires Pthreads and Boost. Unfortunately, due to the complexity of supporting threading models across platforms, we currently support only Linux and Mac.
- For recent versions of Matlab on Mac and 64-bit Linux platforms, we have pre-compiled binaries available here: pgibbs.tar.gz. The main interface to the parallel sampler is pgibbs/matlab/gibbs_sampler.m, which calls the C++ code via the MEX interface.
- We also have released the parallel Gibbs sampling tools as part of the GraphLab API under the path demoapps/pgibbs.
This C++ library is an implementation of the GraphLab abstraction described in the paper:
- Yucheng Low, Joseph Gonzalez, Aapo Kyrola, Danny Bickson, Carlos Guestrin, and Joseph M. Hellerstein. "GraphLab: A New Parallel Framework for Machine Learning." UAI, 2010.
GraphLab is a parallel algorithm design framework optimized for machine learning. The GraphLab abstraction generalizes the popular MapReduce abstraction, whose map operations must be independent, to support algorithms with overlapping, dependent, and iterative computation. In essence, GraphLab fills the gap between the high-level MapReduce abstraction and low-level parallel abstractions like message passing and threads.
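To give a feel for the model, the sketch below mimics, in heavily simplified form, the idea of an update function that reads only a vertex's scope (the vertex and its neighbors). This is our own illustration and not GraphLab's actual API.

```cpp
#include <functional>
#include <vector>

// A vertex update function: given a vertex id, the current vertex
// data, and the adjacency structure, it returns the vertex's new
// value. In GraphLab, updates of non-overlapping scopes may run in
// parallel; this sketch applies them in one synchronous sweep.
using UpdateFn = std::function<double(
    size_t v,
    const std::vector<double>& data,
    const std::vector<std::vector<size_t>>& adj)>;

void sweep(std::vector<double>& data,
           const std::vector<std::vector<size_t>>& adj,
           const UpdateFn& update) {
    std::vector<double> next(data.size());
    for (size_t v = 0; v < data.size(); ++v)
        next[v] = update(v, data, adj);
    data = next;
}
```

A typical update, such as averaging a vertex with its neighbors, touches only that vertex's scope, which is exactly the dependency structure GraphLab exploits for safe parallel execution.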
This MATLAB toolbox provides implementations of algorithms for maximizing and minimizing submodular set functions. It includes Queyranne's algorithm, Fujishige's minimum-norm-point algorithm, Zhao et al.'s recursive splitting procedure, Nemhauser et al.'s greedy algorithm, Krause et al.'s Saturate algorithm, Goldengorin et al.'s Data Correcting algorithm, and Narasimhan and Bilmes's submodular-supermodular procedure, among others. It also provides a detailed tutorial script applying the toolbox to several machine learning problems, such as image denoising, clustering, and experimental design. See also our tutorial materials.
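As a taste of what the toolbox computes, here is a sketch of Nemhauser et al.'s greedy algorithm (the toolbox itself is MATLAB; this C++ version and its names are our own). For a monotone submodular function under a cardinality constraint, greedily adding the element with the largest marginal gain achieves a (1 - 1/e) approximation guarantee.

```cpp
#include <algorithm>
#include <functional>
#include <vector>

// Greedy maximization of a set function F over ground set {0..n-1},
// subject to |S| <= k: at each step, add the element whose marginal
// gain F(S + e) - F(S) is largest.
std::vector<int> greedy_max(
    int n, int k,
    const std::function<double(const std::vector<int>&)>& F) {
    std::vector<int> S;
    for (int step = 0; step < k; ++step) {
        double base = F(S);
        int best = -1;
        double best_gain = -1e300;
        for (int e = 0; e < n; ++e) {
            if (std::find(S.begin(), S.end(), e) != S.end()) continue;
            std::vector<int> T = S;
            T.push_back(e);
            double gain = F(T) - base;
            if (gain > best_gain) { best_gain = gain; best = e; }
        }
        S.push_back(best);
    }
    return S;
}
```

A coverage function (the size of the union of covered items) is a standard monotone submodular example for exercising this procedure.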
This C++ library provides an efficient parallel implementation of the ResidualSplash algorithm for discrete Markov Random Fields, described in Residual Splash for Optimally Parallelizing Belief Propagation. In addition, the library provides basic implementations of Synchronous, Round-Robin, and Residual Belief Propagation (BP), along with supporting objects for representing factors, Markov Random Fields, and message schedules. An easy-to-use Matlab interface is also provided. Read the README file for compilation and usage instructions.
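Residual scheduling prioritizes the update whose last change (its residual) was largest, rather than sweeping all updates round-robin. The sketch below shows that scheduling idea on a toy fixed-point iteration rather than on actual BP messages; everything here is our own illustration, not the library's code.

```cpp
#include <cmath>
#include <queue>
#include <utility>
#include <vector>

// Residual scheduling in miniature: keep a priority queue keyed by
// each update's last change, always re-run the highest-residual
// update, and re-queue a vertex's neighbors whenever it changes by
// more than the tolerance. The "update" here is a toy contraction
//   x[v] <- 0.5 * mean(neighbors of v) + b[v],
// standing in for a BP message recomputation.
void residual_solve(std::vector<double>& x,
                    const std::vector<double>& b,
                    const std::vector<std::vector<size_t>>& adj,
                    double tol) {
    auto update = [&](size_t v) {
        double m = 0.0;
        for (size_t u : adj[v]) m += x[u];
        if (!adj[v].empty()) m /= adj[v].size();
        return 0.5 * m + b[v];
    };
    std::priority_queue<std::pair<double, size_t>> pq;
    for (size_t v = 0; v < x.size(); ++v) pq.push({1e300, v});
    while (!pq.empty()) {
        size_t v = pq.top().second;
        pq.pop();
        double nv = update(v);
        double change = std::fabs(nv - x[v]);
        x[v] = nv;
        if (change > tol)
            for (size_t u : adj[v]) pq.push({change, u});
    }
}
```

Because the toy update is a contraction, the residuals shrink geometrically and the queue drains once every change falls below the tolerance; the Splash algorithm generalizes this idea by growing a spanning-tree "splash" around the highest-residual vertex.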
This C++ library provides a distributed MPI implementation of the Belief Residual Splash algorithm described in Distributed Parallel Inference on Large Factor Graphs. It provides an inference program which takes as input a factor graph described in a text file, and outputs estimated variable marginals. In addition, we provide a lifted version which operates on lifted factor graphs. A thin Matlab wrapper is also included. Read the README file for compilation and usage instructions.
This C++ library provides code for learning tree structures for Conditional Random Fields (CRFs), as well as code for parameter learning, inference, evaluation, and generative structure learning via Chow-Liu. It was used for the experiments in the paper Learning Tree Conditional Random Fields and is described in more detail on the project page.
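The generative Chow-Liu step weights each pair of variables by their empirical mutual information and then takes a maximum-weight spanning tree over those weights. Below is a small self-contained sketch for binary variables (our own function names, not the library's API), using Prim's algorithm for the tree.

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Empirical mutual information between binary columns a and b of a
// data matrix (one sample per row, entries in {0, 1}).
double mutual_info(const std::vector<std::vector<int>>& data,
                   int a, int b) {
    double n = data.size();
    double pa[2] = {0, 0}, pb[2] = {0, 0}, pab[2][2] = {{0, 0}, {0, 0}};
    for (const auto& row : data) {
        pa[row[a]] += 1;
        pb[row[b]] += 1;
        pab[row[a]][row[b]] += 1;
    }
    double mi = 0.0;
    for (int i = 0; i < 2; ++i)
        for (int j = 0; j < 2; ++j)
            if (pab[i][j] > 0)
                mi += (pab[i][j] / n) *
                      std::log((pab[i][j] / n) / ((pa[i] / n) * (pb[j] / n)));
    return mi;
}

// Chow-Liu: maximum-weight spanning tree under pairwise mutual
// information, built with Prim's algorithm. Returns the tree edges.
std::vector<std::pair<int, int>> chow_liu(
    const std::vector<std::vector<int>>& data) {
    int d = (int)data[0].size();
    std::vector<bool> in_tree(d, false);
    std::vector<double> best(d, -1.0);
    std::vector<int> parent(d, -1);
    in_tree[0] = true;
    for (int v = 1; v < d; ++v) {
        best[v] = mutual_info(data, 0, v);
        parent[v] = 0;
    }
    std::vector<std::pair<int, int>> edges;
    for (int step = 1; step < d; ++step) {
        int u = -1;
        for (int v = 0; v < d; ++v)
            if (!in_tree[v] && (u < 0 || best[v] > best[u])) u = v;
        in_tree[u] = true;
        edges.push_back({parent[u], u});
        for (int v = 0; v < d; ++v)
            if (!in_tree[v]) {
                double w = mutual_info(data, u, v);
                if (w > best[v]) { best[v] = w; parent[v] = u; }
            }
    }
    return edges;
}
```

On data where two variables are perfectly correlated and a third is independent of both, the recovered tree links the correlated pair, as expected.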