Support Vector Machines


陈傲天 / Published / 2070 views
Turning nonlinear problems into linear ones by expanding into high-dimensional feature spaces. The dual representation of linear classifiers: weight training points, not features. Observation: in the dual representation, only inner products of vectors matter. The kernel trick: kernel functions let us compute inner products in feature spaces without computing the features. Some bounds on the generalization error of linear classifiers based on "margin" and the number of training points with non-zero weight ("support vectors"). Learning support vector machines by trading in-sample performance against bounds on over-fitting.
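A minimal numerical sketch of the kernel trick and the dual decision function described above, assuming NumPy and a degree-2 polynomial kernel. The names poly2_kernel, poly2_features, and dual_predict are illustrative, not from the lecture; the point is only that the kernel value equals the inner product in the expanded feature space, so the features never need to be computed, and that the dual classifier weights training points rather than features.

```python
import numpy as np

def poly2_kernel(x, z):
    """Degree-2 polynomial kernel: K(x, z) = (x . z)^2."""
    return np.dot(x, z) ** 2

def poly2_features(x):
    """Explicit degree-2 feature map phi(x): all pairwise products x_i * x_j.

    Shown only for comparison; the kernel trick exists so this
    d^2-dimensional vector never has to be built.
    """
    return np.outer(x, x).ravel()

rng = np.random.default_rng(0)
x, z = rng.standard_normal(5), rng.standard_normal(5)

# K(x, z) equals the inner product of the expanded feature vectors.
assert np.isclose(poly2_kernel(x, z), poly2_features(x) @ poly2_features(z))

def dual_predict(x_new, support_vectors, alphas, labels, bias):
    """Dual-form decision function: a weighted sum of kernel evaluations
    against the training points with non-zero weight (the support
    vectors), rather than one weight per feature."""
    score = sum(a * y * poly2_kernel(sv, x_new)
                for a, y, sv in zip(alphas, labels, support_vectors))
    return np.sign(score + bias)
```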