Born in a BARN: Bayesian Additive Regression Networks

Abstract

We apply Bayesian Additive Regression Tree (BART) principles to training an ensemble of small neural networks for regression tasks. Using Markov chain Monte Carlo, we sample from the space of single hidden layer neural networks conditioned on how well they fit the data. To form an ensemble of these, we apply Gibbs sampling to update each network against the residual target value (i.e., subtracting the effect of the other networks). We demonstrate the effectiveness of this technique on several benchmark regression problems, comparing it to single neural networks with an equivalent neuron count as well as BART with an equivalent tree count. Our Bayesian Additive Regression Networks (BARN) provide more consistent and often more accurate results, at the cost of significantly greater computation time.
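The core loop described above is a backfitting-style Gibbs update, where each small network is refit against the residual left by the rest of the ensemble. The sketch below illustrates that loop only; the names (barn_backfit_sketch, n_networks, n_gibbs_steps) and the use of scikit-learn's MLPRegressor are assumptions for illustration, and the actual BARN sampler updates each network via MCMC proposals (including architecture changes) rather than a plain refit.

```python
# Minimal sketch of the Gibbs-style backfitting loop over an ensemble of
# small single-hidden-layer networks. Illustrative only; not the paper's
# exact sampler, which uses MCMC proposals to update each network.
import numpy as np
from sklearn.neural_network import MLPRegressor

def barn_backfit_sketch(X, y, n_networks=10, n_gibbs_steps=20, hidden_units=3, seed=0):
    rng = np.random.default_rng(seed)
    # One small network per ensemble member, each initially predicting zero.
    nets = [MLPRegressor(hidden_layer_sizes=(hidden_units,), max_iter=500,
                         random_state=int(rng.integers(1 << 31)))
            for _ in range(n_networks)]
    preds = np.zeros((n_networks, len(y)))

    for _ in range(n_gibbs_steps):
        for k, net in enumerate(nets):
            # Residual target: subtract the effect of all other networks.
            residual = y - (preds.sum(axis=0) - preds[k])
            net.fit(X, residual)   # stand-in for the MCMC network update
            preds[k] = net.predict(X)

    # Ensemble prediction is the sum of the individual networks' outputs.
    return lambda X_new: sum(net.predict(X_new) for net in nets)

# Usage (with training arrays X_train, y_train and test array X_test):
#   predict = barn_backfit_sketch(X_train, y_train)
#   y_hat = predict(X_test)
```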