Practical session - Introduction to Bayesian deep learning in Python

Charles Ollion and Sylvain Le Corff (CMAP) offered us a practical session on Bayesian deep learning.

Deep learning is widely used for regression and classification problems, but the simplest architectures do not capture model uncertainty: the most widespread solutions return single point estimates in regression problems instead of predictive or posterior distributions. Several approaches connect deep learning with Bayesian modelling, such as Bayes by Backprop, and practical techniques such as Monte Carlo Dropout. During this session we propose a hands-on tutorial on these approaches, using Python (TensorFlow / PyTorch). The workshop can be followed on Google Colab, with no installation required. After an in-depth description of the models and the datasets, we will take time to understand the models thoroughly and focus on implementation details.
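To give a flavour of Monte Carlo Dropout ahead of the session, here is a minimal NumPy sketch (not the tutorial's code; the toy network and all names are illustrative). The key idea is to keep dropout active at prediction time and average several stochastic forward passes; the spread of those passes acts as a rough model-uncertainty estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer regression network with fixed (here random, in practice
# pretrained) weights -- purely illustrative.
W1 = rng.normal(size=(1, 32)); b1 = np.zeros(32)
W2 = rng.normal(size=(32, 1)); b2 = np.zeros(1)

def forward(x, p_drop=0.5):
    """One stochastic forward pass: dropout stays ACTIVE at test time."""
    h = np.maximum(x @ W1 + b1, 0.0)        # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop     # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)           # inverted-dropout scaling
    return h @ W2 + b2

def mc_dropout_predict(x, n_samples=100):
    """Average n_samples stochastic passes; the std across passes is a
    crude proxy for model uncertainty."""
    preds = np.stack([forward(x) for _ in range(n_samples)])
    return preds.mean(axis=0), preds.std(axis=0)

x = np.array([[0.5]])
mean, std = mc_dropout_predict(x)
```

In a real TensorFlow or PyTorch model the same effect is obtained by leaving the dropout layers in training mode during prediction.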

Level: basic / intermediate. Prerequisite: some experience in Python.

Link to the tutorial and the GitHub repository

Papers

- Weight Uncertainty in Neural Networks (a.k.a. Bayes by Backprop)
- Dropout as a Bayesian Approximation (a.k.a. Monte Carlo Dropout)
- Evaluating predictive distributions
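As a rough illustration of the Bayes by Backprop idea from the first paper, here is a minimal NumPy sketch of the reparameterised weight sampling and the KL complexity term (a toy single-weight setup; all names are illustrative, not from the tutorial):

```python
import numpy as np

rng = np.random.default_rng(1)

# Variational posterior over one weight: q(w) = N(mu, sigma^2),
# with sigma = softplus(rho) so the std stays positive (toy setup).
mu, rho = 0.0, -3.0

def softplus(x):
    return np.log1p(np.exp(x))

def sample_weight(mu, rho):
    """Reparameterisation trick: w = mu + softplus(rho) * eps, eps ~ N(0, 1).
    In a real TensorFlow/PyTorch implementation, gradients w.r.t. mu and
    rho flow through this sample via automatic differentiation."""
    eps = rng.standard_normal()
    return mu + softplus(rho) * eps

def kl_to_standard_normal(mu, rho):
    """Closed-form KL(q(w) || N(0, 1)): the complexity term added to the
    data-fit loss in the variational objective."""
    sigma = softplus(rho)
    return 0.5 * (sigma**2 + mu**2 - 1.0) - np.log(sigma)

samples = np.array([sample_weight(mu, rho) for _ in range(10_000)])
```

Training then minimises the expected negative log-likelihood under sampled weights plus this KL term, updating `mu` and `rho` by gradient descent.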