
Machine Learning Techniques

1,138 views
National Taiwan University
Coursera
  • Approx. 64 hours to complete
  • Intermediate
  • Chinese
Note: this course is provided jointly by Coursera and Linkshare; because of changing factors on the hosting platform, any listed start dates are for reference only.

Course Overview

The course extends the fundamental tools from “Machine Learning Foundations” into powerful and practical models along three directions: embedding numerous features, combining predictive features, and distilling hidden features.

Syllabus

Lecture 1: Linear Support Vector Machine

more robust linear classification solvable with quadratic programming
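For reference, the hard-margin problem behind this lecture is the quadratic program (standard textbook form, not quoted from the slides)

\min_{b,w}\ \frac{1}{2}\, w^T w \quad \text{s.t.}\quad y_n \left( w^T x_n + b \right) \ge 1,\quad n = 1,\dots,N

Minimizing w^T w maximizes the margin 1/\lVert w \rVert, which is what makes the resulting linear classifier "more robust" than plain perceptron-style separation.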

Lecture 2: Dual Support Vector Machine

another QP form of SVM with valuable geometric insights and almost no dependence on the dimension of transformation
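In standard form, the dual trades the primal variables (one per transformed dimension, plus the bias) for N Lagrange multipliers, so the QP size no longer depends on the transform dimension:

\min_{\alpha}\ \frac{1}{2}\sum_{n=1}^{N}\sum_{m=1}^{N}\alpha_n \alpha_m y_n y_m z_n^T z_m \;-\; \sum_{n=1}^{N}\alpha_n \quad \text{s.t.}\quad \sum_{n=1}^{N} y_n \alpha_n = 0,\ \ \alpha_n \ge 0

Examples with \alpha_n > 0 are the support vectors; geometrically, they are the points sitting on the margin boundary.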

Lecture 3: Kernel Support Vector Machine

kernel as a shortcut to (transform + inner product): allowing a spectrum of models ranging from simple linear ones to infinite dimensional ones with margin control
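The "shortcut" can be checked numerically. A minimal sketch (the helper name poly2_transform is made up for illustration): the degree-2 polynomial kernel (1 + x·x′)² gives the same number as an explicit transform followed by an inner product, without materializing the transformed vectors:

```python
import numpy as np

def poly2_transform(x):
    # explicit 2nd-order feature map whose inner product equals (1 + x.x')^2
    return np.concatenate(([1.0], np.sqrt(2.0) * x, np.outer(x, x).ravel()))

rng = np.random.default_rng(0)
x, xp = rng.standard_normal(5), rng.standard_normal(5)

shortcut = (1.0 + x @ xp) ** 2                        # kernel evaluation: O(d) work
explicit = poly2_transform(x) @ poly2_transform(xp)   # transform + inner product: O(d^2) features
print(np.isclose(shortcut, explicit))                 # True
```

The same trick with the Gaussian kernel corresponds to an infinite-dimensional transform, which is why the lecture can speak of a "spectrum" of models.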

Lecture 4: Soft-Margin Support Vector Machine

a new primal formulation that allows some penalized margin violations, which is equivalent to a dual formulation with upper-bounded variables
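In standard notation (again not quoted from the slides), the new primal adds a slack \xi_n per example with a penalty parameter C:

\min_{b,w,\xi}\ \frac{1}{2}\, w^T w + C \sum_{n=1}^{N} \xi_n \quad \text{s.t.}\quad y_n \left( w^T z_n + b \right) \ge 1 - \xi_n,\ \ \xi_n \ge 0

Dualizing gives exactly the hard-margin dual with each multiplier capped, 0 \le \alpha_n \le C, the "upper-bounded variables" above.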

Lecture 5: Kernel Logistic Regression

soft-classification by an SVM-like sparse model using two-level learning, or by a "kernelized" logistic regression model using representer theorem
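By the representer theorem the optimal weights can be written as w = \sum_n \beta_n z_n, so the "kernelized" route (standard formulation, which should match the lecture up to notation) solves the unconstrained problem

\min_{\beta}\ \frac{\lambda}{N}\sum_{n=1}^{N}\sum_{m=1}^{N}\beta_n \beta_m K(x_n, x_m) + \frac{1}{N}\sum_{n=1}^{N}\ln\Big(1 + \exp\Big(-y_n \sum_{m=1}^{N}\beta_m K(x_m, x_n)\Big)\Big)

with any gradient-based optimizer; unlike SVM, the resulting \beta is generally dense, which motivates the SVM-like sparse alternative.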

Lecture 6: Support Vector Regression

kernel ridge regression via ridge regression + representer theorem, or support vector regression via regularized tube error + Lagrange dual
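For reference, kernel ridge regression admits a closed form: with kernel matrix K and label vector y, combining ridge regression with the representer theorem (w = \sum_n \beta_n z_n) gives

\min_{\beta}\ \frac{\lambda}{N}\,\beta^T K \beta + \frac{1}{N}\,\lVert y - K\beta \rVert^2 \quad\Longrightarrow\quad \beta = (\lambda I + K)^{-1}\, y

Support vector regression instead keeps the QP route, trading the dense \beta of kernel ridge regression for a sparse one.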

Lecture 7: Blending and Bagging

blending known diverse hypotheses uniformly, linearly, or even non-linearly; obtaining diverse hypotheses from bootstrapped data
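A minimal sketch of the bagging half (function names are made up; labels assumed to be ±1; trees stand in for any base learner):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, n_estimators=25, seed=0):
    # each hypothesis sees a bootstrap sample: n draws with replacement
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(X), size=len(X))
        models.append(DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx]))
    return models

def bagging_predict(models, X):
    votes = np.stack([m.predict(X) for m in models])  # (n_estimators, n_samples)
    return np.sign(votes.mean(axis=0))                # uniform blending: majority vote
```

The bootstrap injects the diversity that uniform blending then averages away as variance.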

Lecture 8: Adaptive Boosting

"optimal" re-weighting for diverse hypotheses and adaptive linear aggregation to boost weak algorithms

Lecture 9: Decision Tree

recursive branching (purification) for conditional aggregation of simple hypotheses
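A compact illustration of recursive branching with Gini impurity as the purifying criterion (a toy sketch for 0/1 integer labels, not the course's C&RT implementation):

```python
import numpy as np

def gini(y):
    # impurity of a node; 0 means perfectly pure
    p = np.bincount(y, minlength=2) / len(y)
    return 1.0 - (p ** 2).sum()

def grow(X, y, depth=0, max_depth=3):
    if depth == max_depth or len(np.unique(y)) == 1:
        return int(np.bincount(y).argmax())          # leaf: majority class
    best = None
    for j in range(X.shape[1]):                      # try every feature...
        for t in np.unique(X[:, j]):                 # ...and every threshold
            left = X[:, j] <= t
            if left.all() or not left.any():
                continue
            score = (left.sum() * gini(y[left]) + (~left).sum() * gini(y[~left])) / len(y)
            if best is None or score < best[0]:
                best = (score, j, t, left)
    if best is None:
        return int(np.bincount(y).argmax())
    _, j, t, left = best                             # branch on the most purifying split
    return (j, t, grow(X[left], y[left], depth + 1, max_depth),
                  grow(X[~left], y[~left], depth + 1, max_depth))

def predict_one(tree, x):
    while isinstance(tree, tuple):                   # walk until a leaf (an int) is reached
        j, t, l, r = tree
        tree = l if x[j] <= t else r
    return tree
```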

Lecture 10: Random Forest

bootstrap aggregation of randomized decision trees with automatic validation
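The "automatic validation" is out-of-bag (OOB) estimation: each bootstrapped tree never sees roughly a third of the data, so those examples validate it for free. With scikit-learn (used here only as an illustration):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, random_state=0)
forest = RandomForestClassifier(n_estimators=200, oob_score=True,
                                random_state=0).fit(X, y)
print(forest.oob_score_)  # OOB accuracy: validation without a held-out set
```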

Lecture 11: Gradient Boosted Decision Tree

aggregating trees from functional + steepest gradient descent subject to any error measure
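For squared error, the functional-gradient view reduces to fitting each new tree to the current residuals; a minimal sketch (function names and hyper-parameters are illustrative):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gbdt_fit(X, y, T=100, eta=0.1):
    F = np.zeros(len(y))                   # current ensemble prediction
    trees = []
    for _ in range(T):
        residual = y - F                   # negative functional gradient of squared error
        tree = DecisionTreeRegressor(max_depth=3).fit(X, residual)
        F += eta * tree.predict(X)         # shrunken steepest-descent step in function space
        trees.append(tree)
    return trees

def gbdt_predict(trees, X, eta=0.1):
    return eta * sum(t.predict(X) for t in trees)
```

Other error measures swap in their own negative gradient in place of the residual.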

Lecture 12: Neural Network

automatic feature extraction from layers of neurons with the back-propagation technique for stochastic gradient descent
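A minimal one-hidden-layer sketch of back-propagation inside stochastic gradient descent (tanh hidden units, squared error, no bias terms for brevity; train_nn is a made-up name):

```python
import numpy as np

def train_nn(X, y, hidden=8, epochs=200, eta=0.1, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.1, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.1, size=(hidden, 1))
    for _ in range(epochs):
        for i in rng.permutation(len(X)):            # stochastic: one example per step
            x, t = X[i:i + 1], y[i]
            h = np.tanh(x @ W1)                      # forward pass through the hidden layer
            out = float(h @ W2)
            d_out = out - t                          # output "delta" for squared error
            d_h = d_out * W2.T * (1.0 - h ** 2)      # delta back-propagated through tanh
            W2 -= eta * d_out * h.T                  # gradient steps on both layers
            W1 -= eta * x.T * d_h
    return W1, W2
```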

Lecture 13: Deep Learning

an early and simple deep learning model that pre-trains with denoising autoencoder and fine-tunes with back-propagation
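A rough sketch of that recipe, with tied weights (decoding with W.T) as a simplifying assumption; every name and hyper-parameter here is illustrative, not the course's code. The idea: corrupt the input, train the weights to reconstruct the clean version, then reuse W to initialize one layer before supervised back-propagation fine-tuning:

```python
import numpy as np

def pretrain_denoising_autoencoder(X, hidden=16, epochs=50, eta=0.05, noise=0.3, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=0.1, size=(d, hidden))          # tied weights: decode with W.T
    b, c = np.zeros(hidden), np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            x = X[i]
            x_noisy = x + noise * rng.standard_normal(d) # denoising: corrupt the input
            h = np.tanh(x_noisy @ W + b)                 # encode
            r = h @ W.T + c                              # linear decode
            err = r - x                                  # reconstruct the clean input
            grad_h = (err @ W) * (1.0 - h ** 2)
            W -= eta * (np.outer(x_noisy, grad_h) + np.outer(err, h))
            b -= eta * grad_h
            c -= eta * err
    return W, b   # initializes one layer of the deep network before fine-tuning
```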

Lecture 14: Radial Basis Function Network

linear aggregation of distance-based similarities to prototypes found by clustering
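A sketch of the full recipe: clustering finds the prototypes, Gaussian similarities to them become features, and ordinary least squares learns the linear aggregation (scikit-learn's KMeans stands in for the clustering step; function names are made up):

```python
import numpy as np
from sklearn.cluster import KMeans

def rbf_network_fit(X, y, M=10, gamma=1.0, seed=0):
    centers = KMeans(n_clusters=M, n_init=10, random_state=seed).fit(X).cluster_centers_
    Z = np.exp(-gamma * ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1))
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)   # linear regression on RBF features
    return centers, beta

def rbf_network_predict(centers, beta, gamma, X):
    Z = np.exp(-gamma * ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1))
    return Z @ beta
```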

Lecture 15: Matrix Factorization

linear models of items on extracted user features (or vice versa) jointly optimized with stochastic gradient descent for recommender systems
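A minimal SGD sketch for rating prediction: visit one (user, item, rating) triple at a time and nudge both factor vectors along the prediction error (mf_sgd and its hyper-parameters are illustrative):

```python
import numpy as np

def mf_sgd(ratings, n_users, n_items, k=10, eta=0.05, lam=0.01, epochs=20, seed=0):
    # ratings: list of (user_index, item_index, value) triples
    rng = np.random.default_rng(seed)
    U = rng.normal(scale=0.1, size=(n_users, k))   # per-user feature vectors
    V = rng.normal(scale=0.1, size=(n_items, k))   # per-item feature vectors
    for _ in range(epochs):
        rng.shuffle(ratings)
        for u, i, r in ratings:
            err = r - U[u] @ V[i]                  # prediction error on this rating
            U_old = U[u].copy()
            U[u] += eta * (err * V[i] - lam * U[u])    # joint SGD update on both factors
            V[i] += eta * (err * U_old - lam * V[i])
    return U, V
```

Unseen (user, item) pairs are then scored as U[u] @ V[i].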

Lecture 16: Finale

summary from the angles of feature exploitation, error optimization, and overfitting elimination towards practical use cases of machine learning

