1 Logistic Regression
1.1 Step 1: Function Set
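The function set of logistic regression is indexed by the parameters (w, b): each choice gives f(x) = σ(w·x + b), interpreted as P(C1 | x). A minimal sketch (function names are my own):

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function, squashing any real z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def f_wb(x, w, b):
    """P(C1 | x) = sigma(w . x + b); the function set is indexed by (w, b)."""
    return sigmoid(np.dot(w, x) + b)
```

A point is assigned to class 1 when f_wb(x, w, b) ≥ 0.5, i.e. when w·x + b ≥ 0.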
1.2 Step 2: Goodness of a Function

Cross entropy measures how close two distributions are to each other.
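For a binary target ŷ ∈ {0, 1} and model output f = f(x), the per-example loss is the cross entropy between the two Bernoulli distributions, C(f, ŷ) = −[ŷ ln f + (1 − ŷ) ln(1 − f)]. A sketch (the clipping constant is my own choice for numerical safety):

```python
import numpy as np

def cross_entropy(y_hat, f):
    """Cross entropy between the target Bernoulli(y_hat) and the model's
    Bernoulli(f); it is minimized when the two distributions match."""
    eps = 1e-12  # clip to avoid log(0)
    f = np.clip(f, eps, 1 - eps)
    return -(y_hat * np.log(f) + (1 - y_hat) * np.log(1 - f))
```

The total loss is the sum of this quantity over all training examples.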

1.3 Step 3: Find the Best Function
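Differentiating the cross-entropy loss gives ∂L/∂w_i = Σ_n (f(x^n) − ŷ^n) x_i^n, so the gradient-descent update has exactly the same form as in linear regression. A minimal sketch of one step:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(X, y, w, b, lr=0.1):
    """One gradient-descent step on the cross-entropy loss.
    dL/dw = sum_n (f(x^n) - y^n) x^n -- the same update form as linear regression."""
    f = sigmoid(X @ w + b)   # model outputs for all n examples
    err = f - y              # (f(x^n) - y^n)
    w = w - lr * (X.T @ err)
    b = b - lr * err.sum()
    return w, b
```

Repeating this step drives the summed cross entropy down until the parameters converge.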

1.4 Review

2 Why Not Logistic Regression + Square Error?
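With square error L = (f(x) − ŷ)², the chain rule gives ∂L/∂z = 2(f − ŷ)·f·(1 − f) at the logit z, and the extra factor f(1 − f) vanishes whenever the sigmoid saturates, even when the prediction is completely wrong. A small sketch comparing the two gradients:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grads_at(z, y_hat):
    """d(loss)/dz at logit z for target y_hat.
    Square error:  2 (f - y) f (1 - f)  -- vanishes when f saturates
    Cross entropy: (f - y)              -- stays large while f is wrong"""
    f = sigmoid(z)
    return 2 * (f - y_hat) * f * (1 - f), (f - y_hat)
```

At z = −10 with target ŷ = 1 (far from the goal), the square-error gradient is nearly zero while the cross-entropy gradient is close to −1: square error leaves the optimizer stuck on a flat plateau far from the minimum.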


3 Discriminative vs. Generative

The logistic (discriminative) model makes no assumptions about the data, whereas a generative model assumes the samples follow some probability distribution: Gaussian, naive Bayes, etc.
3.1 which one is better

The discriminative model is usually considered to perform better than the generative one.

As the figure above shows, Naive Bayes does not consider the correlation between different dimensions, so the probability distribution assumed by the generative model may "imagine" dependencies in the data that are not actually there.
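A toy dataset (assumed here purely for illustration, in the spirit of the lecture's example) makes this concrete: one class-1 sample (1, 1), and twelve class-2 samples of which four each are (1, 0), (0, 1), (0, 0). Naive Bayes then assigns the test point (1, 1) to class 2, even though the only (1, 1) in training data belongs to class 1, because the independence assumption spreads class-2 probability mass onto the unseen combination (1, 1):

```python
# Toy counts (illustrative assumption): 1 class-1 sample = (1, 1);
# 12 class-2 samples = four each of (1, 0), (0, 1), (0, 0).
p_c1, p_c2 = 1 / 13, 12 / 13

# Per-dimension likelihoods under the naive (independence) assumption:
p_x1_c1 = p_x2_c1 = 1.0     # both features are 1 in the only class-1 sample
p_x1_c2 = p_x2_c2 = 4 / 12  # each feature is 1 in 4 of the 12 class-2 samples

# Posterior for the test point (1, 1):
num1 = p_c1 * p_x1_c1 * p_x2_c1
num2 = p_c2 * p_x1_c2 * p_x2_c2
posterior_c1 = num1 / (num1 + num2)  # = 3/7, below 0.5 -> predict class 2
```

The independence assumption is what lets the model put nonzero class-2 probability on (1, 1) despite never having seen that combination in class 2.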
3.2 Benefits of the Generative Model

4 Multi-class Classification
Using 3 classes as an example.
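With three classes, each class i gets its own (w_i, b_i) producing a logit z_i = w_i·x + b_i, and the softmax turns the logits into a probability distribution over the classes. A minimal sketch:

```python
import numpy as np

def softmax(z):
    """Softmax over class logits z_i = w_i . x + b_i.
    Outputs are in (0, 1) and sum to 1, so they act as class probabilities."""
    z = z - z.max()  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()
```

Training then minimizes the cross entropy between this distribution and the one-hot target, just as in the binary case.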


5 Limitation of Logistic Regression

5.1 Feature Transformation
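Logistic regression can only draw a linear boundary, so XOR-style data (class 1 at (0,0) and (1,1), class 2 at (0,1) and (1,0)) is not separable in the raw feature space. One hand-crafted transform, used here as an illustrative assumption rather than the only option, maps each point to its distances from (0,0) and (1,1):

```python
import numpy as np

def transform(x):
    """Hand-crafted feature transform (illustrative choice):
    x1' = distance to (0, 0), x2' = distance to (1, 1)."""
    return np.array([np.linalg.norm(x - np.array([0.0, 0.0])),
                     np.linalg.norm(x - np.array([1.0, 1.0]))])
```

After the transform, both class-2 points land on (1, 1) while the class-1 points lie closer to the origin along x1' + x2', so a single line in the new space separates the classes.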

5.2 Cascading Logistic Regression Models
Let the machine learn to find a good feature transformation by itself.
This way, the machine learns the feature transformation on its own: it maps x_1, x_2 to x'_1, x'_2, and then classifies using the transformed features.
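The cascade above can be sketched as two logistic-regression units producing (x'_1, x'_2), followed by one more logistic unit that classifies the transformed features. The weights below are hand-picked to solve XOR and are purely hypothetical; in practice all of them would be learned jointly by gradient descent:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cascade(x, W1, b1, w2, b2):
    """Layer 1: two logistic units compute the transform (x1', x2');
    layer 2: one logistic unit classifies the transformed features."""
    h = sigmoid(W1 @ x + b1)  # (x1', x2')
    return sigmoid(w2 @ h + b2)
```

For example, with first-layer units approximating OR and NAND and a second-layer unit approximating AND, the cascade computes XOR, which a single logistic regression cannot.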

And that is exactly where the Neural Network comes in.
