Student Brown Bag Seminar

BARNstorm and Transform: Theory and Modeling with Bayesian Additive Regression Networks

When

1 p.m., March 31, 2023

Where

Math, 402 and Zoom: Link https://arizona.zoom.us/j/83541348598  Password: BB2022

Abstract

Harnessing the demonstrated effectiveness of Bayesian Additive Regression Tree (BART) principles, we develop a fully Bayesian procedure to train an ensemble of small neural networks for regression tasks. We describe how BART samples from a Bayesian posterior over ensembles of decision trees, then adapt this method to neural networks. Using Markov chain Monte Carlo, Bayesian Additive Regression Networks (BARN) samples from the space of single-hidden-layer neural networks conditioned on their fit to the data. To create an ensemble of networks, we apply Gibbs sampling to update each network against the residual target value (i.e., the target with the effect of the other networks subtracted). We examine the test performance of BARN on several benchmark regression tasks, comparing it to single neural networks with an equivalent neuron count as well as to BART with an equivalent tree count. We also use BARN to model an applied problem and compare it against state-of-the-art modeling methods for that specific domain. BARN provides more consistent and often more accurate results, with a mean root mean square error just 5% higher than that of the best (or next-best) method across 9 datasets, each with a different "best method". This comes at the cost of significantly greater computation time (minutes vs. seconds), but that may be surmountable with more careful programming, and errors may shrink further with a hyperparameter grid search.
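The residual-update step described in the abstract can be illustrated with a minimal sketch. This is not the speaker's BARN implementation: it substitutes plain scikit-learn `MLPRegressor` refits for BARN's MCMC posterior sampling of networks, and the dataset is synthetic. It only shows the Gibbs-style backfitting loop, where each small network is retrained against the target minus the other networks' contributions.

```python
# Illustrative sketch of ensemble backfitting against residuals, in the
# spirit of BARN's Gibbs sampling step. Hypothetical stand-in: ordinary
# MLPRegressor refits replace the MCMC sampling of single-hidden-layer
# networks used in the actual method.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.1, size=200)

n_nets = 5  # ensemble of small single-hidden-layer networks
nets = [
    MLPRegressor(hidden_layer_sizes=(4,), max_iter=500, random_state=k)
    for k in range(n_nets)
]
preds = np.zeros((n_nets, len(y)))  # current contribution of each network

for sweep in range(3):  # Gibbs-style sweeps over the ensemble
    for k in range(n_nets):
        # Residual target: y minus the combined effect of the other networks
        resid = y - (preds.sum(axis=0) - preds[k])
        nets[k].fit(X, resid)  # stand-in for sampling a new network
        preds[k] = nets[k].predict(X)

ensemble_pred = preds.sum(axis=0)
rmse = np.sqrt(np.mean((ensemble_pred - y) ** 2))
```

After a few sweeps the summed ensemble fits the training signal noticeably better than any single four-neuron network could, which is the core intuition behind the additive construction.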