<Supervised and Unsupervised Learning> Introduction to Machine Learning

Definition

  • Machine learning is the field of study that gives computers the ability to learn without being explicitly programmed.

Machine Learning Algorithms

  • Supervised learning
  • Unsupervised learning
  • Recommender system
  • Reinforcement learning

Supervised Learning

Basic Concept

  • Train the model on inputs paired with their correct output labels, then test it on brand-new inputs

  • Example:

  • Types (see the sketch after this list)

    • Regression: a type of supervised learning that predicts a number from infinitely many possible outputs

    • Classification: predicts a category from a finite set of possible outputs (there may be many classes/categories, and likewise more than one input feature)
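
A minimal sketch of the two types above, using scikit-learn on made-up toy data (the house-price and tumor numbers are illustrative assumptions, not real datasets):

```python
# Toy illustration of the two supervised-learning types (assumed data).
from sklearn.linear_model import LinearRegression, LogisticRegression

# Regression: predict a number from infinitely many possible outputs,
# e.g. a house price from its size.
sizes  = [[650], [785], [1015], [1200], [1400]]   # input feature x (square feet)
prices = [178, 215, 232, 280, 311]                # target y (price in $1000s)
reg = LinearRegression().fit(sizes, prices)
print(reg.predict([[900]]))                       # outputs a number

# Classification: predict a category from a small, finite set of outputs,
# e.g. benign (0) vs. malignant (1) from tumor size.
tumor_sizes = [[0.5], [1.0], [1.5], [3.0], [3.5], [4.0]]
labels      = [0, 0, 0, 1, 1, 1]                  # two possible classes
clf = LogisticRegression().fit(tumor_sizes, labels)
print(clf.predict([[2.8]]))                       # outputs a class label, 0 or 1
```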

Linear Regression Model

  • Terminology
    • x = "input" variable = feature
    • y = "output" variable = "target" variable
    • m = number of training examples
    • (x,y) = single training example
    • w,b = parameters = coefficients = weights
    • w is the slope while b is the y-intercept
  • The process of supervised learning

    • Univariate linear regression = linear regression with one variable
  • Cost function ------ used to find w and b (the extra division by 2 is there so the 2 cancels when taking derivatives for gradient descent later, keeping the expression cleaner)
    • Squared error cost function: J(w,b) = (1/(2m)) · Σᵢ (f_w,b(x⁽ⁱ⁾) − y⁽ⁱ⁾)², where f_w,b(x) = wx + b; it measures how far the predictions are from the targets for each choice of w and b (a code sketch follows this section)

    • For linear regression with the squared error cost function, the cost surface always ends up bowl-shaped (a hammock shape).


    • The difference between f_w(x) and J(w)

      • f_w(x) is a function of the input x (with w held fixed), while J(w) is a function of the parameter w: it scores each choice of w
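
A minimal sketch of the model f_w,b(x) = wx + b and the squared error cost J(w,b); the toy dataset and the function names (predict, compute_cost) are illustrative assumptions:

```python
# Model and squared error cost for univariate linear regression (assumed data).
import numpy as np

x_train = np.array([1.0, 2.0, 3.0, 4.0])             # m = 4 training examples
y_train = np.array([300.0, 500.0, 700.0, 900.0])     # targets y

def predict(x, w, b):
    """Model f_w,b(x) = w*x + b: a function of the input x (w and b held fixed)."""
    return w * x + b

def compute_cost(x, y, w, b):
    """Squared error cost J(w,b) = (1/(2m)) * sum((f_w,b(x_i) - y_i)^2).
    A function of the parameters w and b: it scores each choice of (w, b)."""
    m = x.shape[0]
    errors = predict(x, w, b) - y
    return np.sum(errors ** 2) / (2 * m)

print(compute_cost(x_train, y_train, w=200.0, b=100.0))  # 0.0     (perfect fit)
print(compute_cost(x_train, y_train, w=100.0, b=100.0))  # 37500.0 (worse fit)
```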

Gradient descent

  • A method for finding values of w and b that minimize J(w,b)
  • At each step, look around 360 degrees, take a small step in the steepest downhill direction (the one that lowers J the most), then repeat from the new point until you cannot go down any further
  • process (so called "Batch" gradient descent)
    • start with some w,b (set w=b=0)
    • keep changing w,b to reduce J(w,b)
    • Until we settle at or near a minimum
  • If gradient descent reaches different minima depending on the starting point, these different results are called local minima
  • Gradient descent algorithm (a code sketch follows at the end of this section)
    • Repeat until convergence: w := w − α·∂J(w,b)/∂w and b := b − α·∂J(w,b)/∂b
    • α = learning rate (usually a small positive number between 0 and 1): decides how large a step you take when going down the hill
    • ∂J(w,b)/∂w determines the direction in which you take that step

    • Stopping condition: w and b no longer change much with each additional step you take

    • Tip: w and b must be updated simultaneously (compute both derivatives from the old values before updating either parameter)

    • Why the updates make sense: if the derivative ∂J/∂w is positive, w decreases; if it is negative, w increases, so either way w moves toward a minimum of J

    • Learning rate α

      • Problem 1: when α is too small, gradient descent still works but is very slow
      • Problem 2: when α is too big, the steps may overshoot and never reach the minimum of J(w)
      • Problem 3: when the starting point is already at a local minimum, gradient descent stays there
      • Even with a fixed α, gradient descent can reach a local minimum: the effective step size α·∂J/∂w shrinks automatically as the slope flattens near the minimum
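
A minimal sketch of batch gradient descent for univariate linear regression, with the simultaneous update of w and b; the toy data, learning rate, and iteration count are illustrative assumptions, not recommendations:

```python
# Batch gradient descent for univariate linear regression (assumed toy data,
# learning rate, and iteration count).
import numpy as np

x_train = np.array([1.0, 2.0, 3.0, 4.0])
y_train = np.array([300.0, 500.0, 700.0, 900.0])

def compute_gradients(x, y, w, b):
    """Partial derivatives dJ/dw and dJ/db of the squared error cost."""
    m = x.shape[0]
    errors = (w * x + b) - y
    dj_dw = np.sum(errors * x) / m
    dj_db = np.sum(errors) / m
    return dj_dw, dj_db

def gradient_descent(x, y, w=0.0, b=0.0, alpha=0.01, num_iters=10_000):
    """Repeat w := w - alpha*dJ/dw, b := b - alpha*dJ/db until near convergence."""
    for _ in range(num_iters):
        dj_dw, dj_db = compute_gradients(x, y, w, b)   # both from the OLD (w, b)
        w = w - alpha * dj_dw                          # simultaneous update:
        b = b - alpha * dj_db                          # neither gradient saw the new w
    return w, b

w, b = gradient_descent(x_train, y_train)
print(w, b)   # approaches w ≈ 200, b ≈ 100 on this toy data
```

With this toy data, a much larger α (say 1.0) makes each step overshoot and the values blow up (Problem 2 above), while a tiny α still works but needs far more iterations (Problem 1).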

Linear Regression Algorithm

  • For the squared error cost function in linear regression, the cost is bowl-shaped (convex), so there is only one minimum: the global minimum (see the quick check below)
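
A small check of the single-minimum claim on the same kind of toy data: sweeping w with b held at an assumed fixed value traces out one bowl with a single lowest point.

```python
# Sweep w (b held fixed at an assumed value) to see a single bowl-shaped cost.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([300.0, 500.0, 700.0, 900.0])

def cost(w, b):
    return np.sum((w * x + b - y) ** 2) / (2 * len(x))

for w in range(0, 401, 100):
    print(w, cost(w, b=100.0))
# Prints: 0 150000.0, 100 37500.0, 200 0.0, 300 37500.0, 400 150000.0
# The cost falls, bottoms out once (at w = 200 here), and rises again.
```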

Unsupervised Learning

  • Finding something interesting in unlabeled data: the data comes only with inputs x, not output labels y, so the algorithm has to find structure in the data on its own
  • Types (see the clustering sketch after this list)
    • Clustering: group similar data points together

    • Anomaly detection: find unusual data points

    • Dimensionality reduction: compress data using fewer numbers
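
A minimal clustering sketch using scikit-learn's KMeans on made-up 2-D points: no labels y are provided, and the algorithm still finds the two groups.

```python
# Clustering unlabeled points with k-means (assumed toy data).
import numpy as np
from sklearn.cluster import KMeans

X = np.array([[1.0, 1.1], [1.2, 0.9], [0.8, 1.0],    # one tight group
              [8.0, 8.2], [7.9, 8.1], [8.3, 7.8]])   # another tight group

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
cluster_ids = kmeans.fit_predict(X)   # only inputs X are given, no labels y
print(cluster_ids)                    # e.g. [0 0 0 1 1 1]; numbering may differ
```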
