The theoretical analysis of systems that learn from data has long been an important topic of study in statistics, machine learning, and information theory. Each of these fields has developed its own methods for handling inference when the models under consideration can be arbitrarily large. Recently, there has been a fruitful cross-fertilisation of ideas and proof techniques. To give but one example, minimax optimal convergence rates for the information-theoretic MDL method were recently proved using ideas from the PAC-Bayesian paradigm, which originates in computational learning theory, combined with empirical process techniques from statistics. The goal of this workshop is to bring together leading theoreticians and allow them to debate, compare, and cross-fertilise ideas from these distinct inductive principles. At the workshop, we will establish a PASCAL special interest group on `merging computational and information-theoretic learning with statistics'.