[Machine Learning] Exam3

Logistic regression on a linearly separable dataset

The data points fall into two classes on either side of a boundary. Following the course material, it is enough to implement the sigmoid function and gradient descent.

A linear model is used, so the decision boundary is a straight line.
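
For reference, these are the standard logistic-regression formulas that the code below implements: the sigmoid hypothesis, the cross-entropy cost, and its gradients.

$$f_{w,b}(x) = \sigma(w \cdot x + b), \qquad \sigma(z) = \frac{1}{1 + e^{-z}}$$

$$J(w,b) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log f_{w,b}(x^{(i)}) + \left(1 - y^{(i)}\right) \log\left(1 - f_{w,b}(x^{(i)})\right) \right]$$

$$\frac{\partial J}{\partial w_j} = \frac{1}{m} \sum_{i=1}^{m} \left( f_{w,b}(x^{(i)}) - y^{(i)} \right) x_j^{(i)}, \qquad \frac{\partial J}{\partial b} = \frac{1}{m} \sum_{i=1}^{m} \left( f_{w,b}(x^{(i)}) - y^{(i)} \right)$$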

Python code:
import copy

import numpy as np
import pandas as pd
from matplotlib import pyplot as plt

# Sigmoid (logistic) function: maps any real value into the interval (0, 1)
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Cross-entropy cost over the whole training set
def compute_cost_logistic(X, y, w, b):
    m = X.shape[0]
    cost = 0.0
    for i in range(m):
        z_i = np.dot(X[i], w) + b
        f_wb_i = sigmoid(z_i)
        cost += -y[i] * np.log(f_wb_i) - (1 - y[i]) * np.log(1 - f_wb_i)
    cost = cost / m
    return cost
    
# Gradients of the cross-entropy cost with respect to w and b
def compute_gradient_logistic(X, y, w, b):
    m, n = X.shape
    dj_dw = np.zeros(n)
    dj_db = 0.
    for i in range(m):
        f_wb_i = sigmoid(np.dot(X[i], w) + b)   # model prediction for sample i
        err_i = f_wb_i - y[i]                   # prediction error
        for j in range(n):
            dj_dw[j] += err_i * X[i][j]
        dj_db += err_i
    return dj_dw / m, dj_db / m

# Batch gradient descent with learning rate eta
def gradient_descent(X, y, w, b, eta, num_iter):
    w = copy.deepcopy(w)  # work on a copy so the caller's array is not modified
    for i in range(num_iter):
        dj_dw, dj_db = compute_gradient_logistic(X, y, w, b)
        w = w - eta * dj_dw
        b = b - eta * dj_db
        # Uncomment to monitor convergence:
        # if i % 1000 == 0:
        #     print(compute_cost_logistic(X, y, w, b))

    return w, b

if __name__ == '__main__':
    # Note: ex2data1.txt normally has no header row; if so, pd.read_csv(..., header=None)
    # would keep the first sample. The call below follows the original post.
    data = pd.read_csv(r'D:\BaiduNetdiskDownload\data_sets\ex2data1.txt')
    # Min-max normalization (the 0/1 label column is left unchanged by this)
    data = (data - data.min()) / (data.max() - data.min())
    # Split into training features X and labels y
    X_train = data.iloc[:, 0:-1].to_numpy()
    y_train = data.iloc[:, -1].to_numpy()

    w_tmp = np.zeros_like(X_train[0])
    b_tmp = 0.
    alpha = 0.1
    iters = 10000
    w_out, b_out = gradient_descent(X_train, y_train, w_tmp, b_tmp, alpha, iters)
    print(w_out, b_out)
    print(w_out, b_out)
    # Plot the decision boundary w0*x0 + w1*x1 + b = 0 over the normalized data
    x = np.linspace(0, 1, 100)
    k = (-b_out - w_out[0] * x) / w_out[1]

    plt.plot(x, k, color='blue')
    plt.scatter(X_train[:, 0], X_train[:, 1], c=y_train)
    plt.show()

    # Training accuracy: predict 1 when sigmoid(w.x + b) >= 0.5
    m = X_train.shape[0]
    count = 0
    for i in range(m):
        f_wb = sigmoid(np.dot(X_train[i], w_out) + b_out)
        prediction = 1 if f_wb >= 0.5 else 0
        if prediction == y_train[i]:
            count += 1
    print('Accuracy: {:.0%}'.format(count / m))
    print(f"\nupdated parameters: w:{w_out}, b:{b_out}")
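
As a side note, the per-sample Python loops above can be replaced by vectorized NumPy operations. A minimal sketch under the same assumptions (X is an (m, n) array, y an (m,) vector of 0/1 labels); the names compute_cost_vec and compute_gradient_vec are just illustrative:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Vectorized cost: same value as compute_cost_logistic, without the Python loop
def compute_cost_vec(X, y, w, b):
    f = sigmoid(X @ w + b)                      # shape (m,)
    return -np.mean(y * np.log(f) + (1 - y) * np.log(1 - f))

# Vectorized gradients: same values as compute_gradient_logistic
def compute_gradient_vec(X, y, w, b):
    m = X.shape[0]
    err = sigmoid(X @ w + b) - y                # shape (m,)
    return X.T @ err / m, np.sum(err) / m       # dJ/dw, dJ/db
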
Some plots

Regression line (decision boundary) and dataset:

Expected results:

w: [9.24150506 8.78629869] b: -8.125896329768265

Accuracy:88%
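
As an optional sanity check (not part of the original exercise), scikit-learn's LogisticRegression can be trained on the same min-max-normalized data and its training accuracy compared with the hand-written implementation. This sketch assumes the same file path and column layout as the script above; note that scikit-learn applies L2 regularization by default, so the learned w and b will differ slightly:

import pandas as pd
from sklearn.linear_model import LogisticRegression

data = pd.read_csv(r'D:\BaiduNetdiskDownload\data_sets\ex2data1.txt')
data = (data - data.min()) / (data.max() - data.min())
X = data.iloc[:, 0:-1].to_numpy()
y = data.iloc[:, -1].to_numpy()

clf = LogisticRegression().fit(X, y)
print(clf.coef_, clf.intercept_)      # compare against w_out, b_out above
print('sklearn accuracy:', clf.score(X, y))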
