Advances in Neural Information Processing Systems
We develop a Bayesian "sum-of-trees" model, named BART, in which each tree is constrained by a prior to be a weak learner. Fitting and inference are accomplished via an iterative backfitting MCMC algorithm. The model is motivated by ensemble methods in general, and boosting algorithms in particular. As in boosting, each weak learner (i.e., each weak tree) contributes a small amount to the overall fit. However, our procedure is defined by a statistical model (a prior and a likelihood), whereas boosting is defined by an algorithm. This model-based approach enables a full and accurate assessment of uncertainty in model predictions, while remaining highly competitive in terms of predictive accuracy.
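The backfitting idea underlying the fitting algorithm can be illustrated with a deliberately simplified, non-Bayesian sketch: each weak tree (here, a single-split stump) is repeatedly refit to the partial residual left by the other trees, so every learner contributes only a small piece of the overall fit. The actual BART procedure instead draws each tree from its posterior within an MCMC sweep; the `fit_stump`/`backfit` helpers below are illustrative names, not part of the paper.

```python
import numpy as np

def fit_stump(x, r):
    """Best single-split stump (split point, left mean, right mean)
    minimizing squared error against the partial residual r."""
    best = (np.inf, None, float(r.mean()), float(r.mean()))
    for s in np.unique(x)[:-1]:
        left, right = r[x <= s], r[x > s]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, s, float(left.mean()), float(right.mean()))
    return best[1], best[2], best[3]

def predict_stump(stump, x):
    s, left_val, right_val = stump
    if s is None:  # degenerate stump: constant fit
        return np.full_like(x, left_val, dtype=float)
    return np.where(x <= s, left_val, right_val)

def backfit(x, y, m=10, sweeps=20):
    """Deterministic backfitting over m stumps: refit tree j to the
    residual y minus the contributions of all other trees."""
    fits = np.zeros((m, len(x)))
    for _ in range(sweeps):
        for j in range(m):
            partial_residual = y - fits.sum(axis=0) + fits[j]
            stump = fit_stump(x, partial_residual)
            fits[j] = predict_stump(stump, x)
    return fits.sum(axis=0)
```

Even with such weak learners, the ensemble recovers a smooth target: backfitting 20 stumps to `y = sin(2*pi*x)` yields a small mean squared error, mirroring how many small contributions combine into an accurate sum-of-trees fit.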
Chipman, H. A., George, E. I., & McCulloch, R. E. (2006). Bayesian Ensemble Learning. Advances in Neural Information Processing Systems, 19. Retrieved from https://repository.upenn.edu/statistics_papers/458
Date Posted: 27 November 2017