Machine Learning ---- Gradient Descent

Table of Contents

[1. The concept of gradient](#1-the-concept-of-gradient)

[① In a univariate function](#-in-a-univariate-function)

[② In multivariate functions](#-in-multivariate-functions)

[2. Introduction of gradient descent cases](#2-introduction-of-gradient-descent-cases)

[3. The gradient descent formula and a simple understanding of it](#3-the-gradient-descent-formula-and-a-simple-understanding-of-it)

[4. Formula operation precautions](#4-formula-operation-precautions)


1. The concept of gradient

① In a univariate function

The gradient is simply the derivative of the function; it represents the slope of the tangent line to the function at a given point.

② In multivariate functions

The gradient is a vector with a direction, and the direction of the gradient is the direction in which the function increases fastest at a given point.
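For reference, the standard way to write this down is to collect all partial derivatives of the function into one vector (added here in LaTeX as a compact summary of the idea above):

```latex
% Gradient of a scalar function f of n variables:
% each component is the partial derivative with respect to one variable.
\nabla f(x_1, \dots, x_n) =
\left(
  \frac{\partial f}{\partial x_1},\;
  \frac{\partial f}{\partial x_2},\;
  \dots,\;
  \frac{\partial f}{\partial x_n}
\right)
```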

2. Introduction of gradient descent cases

Do you remember the golf course from Tom and Jerry? In the cartoon it appears as a landscape of rolling hills.

Picture that scene: you can easily see the distant hills. We can take it as the most typical example, and the golf course can be abstracted into a coordinate surface.

On this surface, we map the horizontal coordinates (x, y) to the parameters (w, b) and the height to the cost J(w, b). Gradient descent starts from a point where J(w, b) is large, i.e. a peak in the red region of the plot.

First, standing at the highest point, we turn a full circle and look for the direction with the steepest slope, then take a small step downhill in that direction. We choose this direction precisely because it is the steepest: for the same step length it gives the largest drop in height, so we move toward the lowest point (a local minimum) as fast as possible. After each step we look around again and choose a new direction, and step by step this traces out a path that finally reaches a local minimum, point A. Is this the only minimum we could reach? Of course not:

Starting from a different point, we may instead reach point B, which is also a local minimum. This is the intuitive process of gradient descent; a small numerical sketch of this idea follows below, and after that we will make it precise through its mathematical formula.
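To make the hill-walking picture concrete, here is a minimal Python sketch. The function below is made up purely for demonstration (it is not the cost of any particular model): it has two valleys, and plain gradient descent started from two different points settles into two different local minima, just like ending at point A or point B.

```python
# Minimal gradient descent sketch: one function, two starting points,
# two different local minima. Illustrative only.

def J(w):
    # A one-dimensional function with two valleys (two local minima).
    return w**4 - 6 * w**2 + w

def dJ_dw(w):
    # Derivative of J with respect to w (the gradient in one dimension).
    return 4 * w**3 - 12 * w + 1

def gradient_descent(w_start, learning_rate=0.01, steps=500):
    w = w_start
    for _ in range(steps):
        w = w - learning_rate * dJ_dw(w)  # take a small step downhill
    return w

w_a = gradient_descent(-3.0)   # settles in the left valley, like point A
w_b = gradient_descent(+3.0)   # settles in the right valley, like point B
print(w_a, J(w_a))             # w close to -1.77
print(w_b, J(w_b))             # w close to +1.69
```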

3. The gradient descent formula and a simple understanding of it

We first give the gradient descent update formula:

$$
w = w - \alpha \frac{\partial}{\partial w} J(w, b)
\qquad
b = b - \alpha \frac{\partial}{\partial b} J(w, b)
$$

In the formula, $\alpha$ is what we call the learning rate, and the equal sign works like the assignment operator in computer program code: the right-hand side is computed first and then stored back into the variable. J(w, b) is the cost function from the regression equation blog in the previous section. How to choose the learning rate will be shared next time; here we first work through the meaning of the formula:

First, let's simplify the formula by taking b equal to 0 as an example. This lets us understand its meaning in an ordinary two-dimensional Cartesian coordinate system:

With b fixed at 0, J(w, b) reduces to J(w), whose graph is a quadratic curve in w, and the partial derivative $\frac{\partial}{\partial w} J(w, b)$ can be treated as the ordinary derivative $\frac{d}{dw} J(w)$. Now look at the curve: when w lies in the right half of the parabola, the derivative $\frac{d}{dw} J(w) > 0$, i.e. the slope is positive. The update then subtracts a positive number from w, so w moves to the left, toward the minimum, which is the optimal solution. Likewise, when w is in the left half, the derivative is negative, so subtracting it moves w to the right, again toward the minimum. The size of each move is $\alpha \frac{d}{dw} J(w)$.
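The short sketch below illustrates this behaviour on a made-up quadratic cost J(w) = (w - 3)^2, chosen only for demonstration and not tied to any particular regression problem: starting to the right of the minimum, w steps left; starting to the left, w steps right; both end up at the minimum.

```python
# Gradient descent with b fixed at 0, on a simple quadratic cost
# J(w) = (w - 3)**2, whose minimum is at w = 3 and whose derivative is 2 * (w - 3).

def dJ_dw(w):
    return 2 * (w - 3)

def run(w, learning_rate=0.1, steps=50):
    for _ in range(steps):
        # Positive slope (right half) -> w decreases (moves left).
        # Negative slope (left half)  -> w increases (moves right).
        w = w - learning_rate * dJ_dw(w)
    return w

print(run(10.0))  # starts on the right half, ends near 3
print(run(-4.0))  # starts on the left half, ends near 3
```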

This is a simple understanding of the gradient descent formula.


4. Formula operation precautions

When applying the formula, one detail deserves attention: w and b must be updated simultaneously. Both partial derivatives are computed with the current values of w and b, and only then are the new values assigned, just like this:

$$
\begin{aligned}
\text{tmp}_w &= w - \alpha \frac{\partial}{\partial w} J(w, b) \\
\text{tmp}_b &= b - \alpha \frac{\partial}{\partial b} J(w, b) \\
w &= \text{tmp}_w \\
b &= \text{tmp}_b
\end{aligned}
$$

The following is an incorrect order of operations that should be avoided, because w has already been overwritten when the derivative for b is computed:

$$
\begin{aligned}
\text{tmp}_w &= w - \alpha \frac{\partial}{\partial w} J(w, b) \\
w &= \text{tmp}_w \\
\text{tmp}_b &= b - \alpha \frac{\partial}{\partial b} J(w, b) \\
b &= \text{tmp}_b
\end{aligned}
$$
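The same precaution can be sketched in code. The functions dJ_dw and dJ_db below are hypothetical placeholders for the partial derivatives of whatever cost J(w, b) is being minimized:

```python
# Correct: simultaneous update -- both derivatives are evaluated
# at the old (w, b) before either parameter is overwritten.
def step_simultaneous(w, b, alpha, dJ_dw, dJ_db):
    tmp_w = w - alpha * dJ_dw(w, b)
    tmp_b = b - alpha * dJ_db(w, b)
    return tmp_w, tmp_b

# Incorrect: w is overwritten first, so the derivative for b
# is evaluated at the new w instead of the old one.
def step_sequential(w, b, alpha, dJ_dw, dJ_db):
    w = w - alpha * dJ_dw(w, b)
    b = b - alpha * dJ_db(w, b)  # already sees the updated w
    return w, b
```

For costs where the derivative with respect to b actually depends on w (as in linear regression), the two versions take different steps, so only the simultaneous form matches the formula above.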

This is our understanding of the gradient descent formula and how the algorithm is carried out. As for the code implementation, we will continue to explain it in future articles.

Previous post: Machine Learning ---- Cost function (CSDN blog)
