Bayesian Methods for Machine Learning

National Research University Higher School of Economics
Coursera
  • Approximately 38 hours to complete
  • Advanced level
  • English, Korean
Note: This course is provided jointly by Coursera and Linkshare; because conditions on the hosting platform may change, the start dates above are for reference only.

Course Overview

People apply Bayesian methods in many areas: from game development to drug discovery. They give superpowers to many machine learning algorithms: handling missing data, extracting much more information from small datasets. Bayesian methods also allow us to estimate uncertainty in predictions, which is a desirable feature for fields like medicine.

When applied to deep learning, Bayesian methods allow you to compress your models a hundredfold and automatically tune hyperparameters, saving you time and money.
In six weeks we will discuss the basics of Bayesian methods: from how to define a probabilistic model to how to make predictions from it. We will see how one can automate this workflow and how to speed it up using some advanced techniques.
We will also see applications of Bayesian methods to deep learning and how to generate new images with them. We will see how new drugs that cure severe diseases can be found with Bayesian methods.

Do you have technical problems? Write to us: coursera@hse.ru

Syllabus

Introduction to Bayesian methods & Conjugate priors

Welcome to the first week of our course! Today we will discuss what Bayesian methods are and what probabilistic models are. We will see how they can be used to model real-life situations and how to draw conclusions from them. We will also learn about conjugate priors, a class of models where all the math becomes really simple.
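
A minimal sketch of conjugacy in NumPy (not from the course materials): with a Beta prior on a coin's bias and Bernoulli observations, the posterior is again a Beta distribution obtained by simple count updates. The prior hyperparameters and the coin flips below are made up for illustration.

```python
import numpy as np

# Beta(a, b) prior over the coin's bias theta (hyperparameters chosen for illustration).
a, b = 2.0, 2.0

# Observed coin flips: 1 = heads, 0 = tails (toy data).
flips = np.array([1, 0, 1, 1, 1, 0, 1])

# Conjugacy: Beta prior + Bernoulli likelihood => Beta posterior with count updates.
a_post = a + flips.sum()                  # add the number of heads
b_post = b + len(flips) - flips.sum()     # add the number of tails

posterior_mean = a_post / (a_post + b_post)
print(f"Posterior: Beta({a_post:.0f}, {b_post:.0f}), mean = {posterior_mean:.3f}")
```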

Expectation-Maximization algorithm

This week we will cover a central topic in probabilistic modeling: latent variable models and how to train them, namely with the Expectation-Maximization algorithm. We will see models for clustering and dimensionality reduction where the Expectation-Maximization algorithm can be applied as is. In weeks 3, 4, and 5 we will discuss numerous extensions of this algorithm that make it work for more complicated models and scale to large datasets.
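
A hand-rolled sketch of EM for a two-component one-dimensional Gaussian mixture, assuming only NumPy; the synthetic data, initial parameters, and 50-iteration budget are illustrative choices, not the course's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: samples from two 1-D Gaussians (parameters chosen for illustration).
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])

# Initial guesses for mixture weights, means, and variances.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

def gaussian_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: responsibility of each component for each data point.
    resp = pi * gaussian_pdf(x[:, None], mu, var)      # shape (N, 2)
    resp /= resp.sum(axis=1, keepdims=True)

    # M-step: re-estimate parameters from the responsibilities.
    nk = resp.sum(axis=0)
    pi = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print("weights:", pi, "means:", mu, "variances:", var)
```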

Variational Inference & Latent Dirichlet Allocation

This week we will move on to approximate inference methods. We will see why we care about approximating distributions and study variational inference, one of the most powerful methods for this task. We will also look at the mean-field approximation in detail and apply it to a text-mining algorithm called Latent Dirichlet Allocation.
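
As a quick illustration (a tooling assumption, not the course's own code), scikit-learn's LatentDirichletAllocation fits LDA by variational Bayes; the tiny corpus and the choice of two topics below are made up for the example.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy corpus; real applications would use thousands of documents.
docs = [
    "bayesian inference prior posterior likelihood",
    "neural network training gradient weights",
    "posterior sampling prior likelihood evidence",
    "network layers weights gradient descent",
]

vectorizer = CountVectorizer()
counts = vectorizer.fit_transform(docs)            # bag-of-words counts

# LDA fitted by (batch) variational inference.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)             # per-document topic proportions

# Top words per topic, read off the fitted topic-word weights.
vocab = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_words = vocab[topic.argsort()[::-1][:3]]
    print(f"topic {k}:", ", ".join(top_words))
```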

Markov chain Monte Carlo

This week we will learn how to approximate training and inference with sampling and how to sample from complicated distributions. This will allow us to build a simple method to deal with LDA and with Bayesian neural networks: neural networks whose weights are random variables themselves, so instead of training (finding the best value for the weights) we sample from the posterior distribution over the weights.
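
A minimal random-walk Metropolis-Hastings sampler in NumPy; the bimodal unnormalized target, the proposal scale, and the burn-in length are toy choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def unnorm_log_p(x):
    # Unnormalized log-density of a toy bimodal target: mixture of N(-2, 1) and N(2, 1).
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

samples = []
x = 0.0
for _ in range(20000):
    proposal = x + rng.normal(scale=1.0)          # random-walk proposal
    log_accept = unnorm_log_p(proposal) - unnorm_log_p(x)
    if np.log(rng.uniform()) < log_accept:        # Metropolis acceptance rule
        x = proposal
    samples.append(x)                             # keep the current state either way

samples = np.array(samples[5000:])                # discard burn-in
print("mean ~", samples.mean(), " std ~", samples.std())
```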

Variational Autoencoder

Welcome to the fifth week of the course! This week we will combine many ideas from the previous weeks and add some new ones to build the Variational Autoencoder, a model that can learn a distribution over structured data (like photographs or molecules) and then sample new data points from the learned distribution, hallucinating new photographs of non-existing people. We will also apply the same techniques to Bayesian Neural Networks and see how this can greatly compress the weights of the network without reducing accuracy.
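
A compressed sketch of a variational autoencoder in PyTorch (assuming torch is installed); the layer sizes, the 784-dimensional input standing in for flattened images, and the single optimization step are illustrative, not the course's exact model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    """Minimal VAE: Gaussian encoder q(z|x), Bernoulli decoder p(x|z)."""

    def __init__(self, x_dim=784, z_dim=20, h_dim=200):
        super().__init__()
        self.enc = nn.Linear(x_dim, h_dim)
        self.enc_mu = nn.Linear(h_dim, z_dim)
        self.enc_logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(), nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        # Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I).
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.dec(z), mu, logvar

def neg_elbo(x, logits, mu, logvar):
    # Reconstruction term plus KL(q(z|x) || N(0, I)): the negative ELBO.
    rec = F.binary_cross_entropy_with_logits(logits, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl

# One illustrative training step on random data standing in for a batch of images.
model = VAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(32, 784)
logits, mu, logvar = model(x)
loss = neg_elbo(x, logits, mu, logvar)
opt.zero_grad()
loss.backward()
opt.step()
```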

Gaussian processes & Bayesian optimization

Welcome to the final week of our course! This time we will look at nonparametric Bayesian methods. Specifically, we will learn about Gaussian processes and their application to Bayesian optimization, which allows one to perform optimization in scenarios where each function evaluation is very expensive: oil probing, drug discovery, and neural network architecture tuning.
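
A NumPy sketch of Gaussian-process regression with an RBF kernel, plus one Bayesian-optimization-style acquisition step (an upper confidence bound); the kernel length scale, noise level, and toy sine objective are assumptions for illustration.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    # Squared-exponential kernel between two sets of 1-D points.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# A few "expensive" evaluations of a toy objective.
x_train = np.array([-3.0, -1.0, 0.5, 2.0])
y_train = np.sin(x_train)
noise = 1e-4

# GP posterior mean and variance on a grid of candidate points.
x_test = np.linspace(-4.0, 4.0, 200)
K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
K_s = rbf(x_test, x_train)
mean = K_s @ np.linalg.solve(K, y_train)
var = 1.0 - np.sum(K_s * np.linalg.solve(K, K_s.T).T, axis=1)

# Bayesian optimization evaluates next wherever the acquisition is highest.
ucb = mean + 2.0 * np.sqrt(np.clip(var, 0.0, None))
print("next point to evaluate:", x_test[np.argmax(ucb)])
```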

Final project

In this module you will apply the methods you learned in this course to the final project.
