PAC-Bayes theory is a framework for deriving some of the tightest generalization bounds available. Many well-established learning algorithms can be justified, and even improved, within the PAC-Bayes framework. PAC-Bayes bounds were originally developed for classification, but over the last few years the theory has been extended to regression, density estimation, and problems with non-i.i.d. data. The theory is well established within a small part of the statistical learning community and has now matured to a level where it is relevant to a wider audience. The workshop will include tutorials on the foundations of the theory as well as recent findings presented through peer-reviewed talks.
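For readers new to the framework, a representative PAC-Bayes bound (a McAllester-style form; the exact constants and logarithmic terms differ across refinements in the literature, so this is illustrative rather than the sharpest known statement) is the following. With probability at least 1 - \delta over an i.i.d. sample S of size n, simultaneously for all "posterior" distributions Q over hypotheses,

$$
\mathbb{E}_{h \sim Q}\, L(h) \;\le\; \mathbb{E}_{h \sim Q}\, \hat{L}_S(h) \;+\; \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{2\sqrt{n}}{\delta}}{2n}},
$$

where P is a data-independent "prior" over hypotheses, L denotes the true risk, \hat{L}_S the empirical risk on S, and \mathrm{KL} the Kullback-Leibler divergence. The key feature is that the bound holds uniformly over Q, so Q may be chosen after seeing the data, with the complexity of that choice charged through \mathrm{KL}(Q \| P).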
Workshop topics
PAC-Bayes theory and its applications, in particular applications to:
- regression
- density estimation
- hypothesis testing
- structured density estimation
- non-i.i.d. data
- sequential data
Workshop Organisers
- Jean-Yves Audibert, IMAGINE, Ecole des Ponts ParisTech / CSTB, Université Paris Est
- Matthew Higgs, University College London
- Steffen Grünewälder, University College London
- François Laviolette, Université Laval, Canada
- John Shawe-Taylor, University College London