An example of regression training in machine learning

Working through the example with perming

```shell
pip install "perming>=1.9.2"
pip install "polars[pandas]"
```

Download the dataset

Data cleaning and preprocessing

```python
import numpy
import pandas

# The data file is whitespace-delimited, so pass an explicit separator
# and use the python parsing engine
df = pandas.read_csv('../data/uci_gbm_data.txt', sep='   ', engine='python')
df.head()
```
```
Lever position (lp) [ ]	Ship speed (v) [knots]	Gas Turbine shaft torque (GTT) [kN m]	Gas Turbine rate of revolutions (GTn) [rpm]	Gas Generator rate of revolutions (GGn) [rpm]	Starboard Propeller Torque (Ts) [kN]	Port Propeller Torque (Tp) [kN]	HP Turbine exit temperature (T48) [C]	GT Compressor inlet air temperature (T1) [C]	GT Compressor outlet air temperature (T2) [C]	HP Turbine exit pressure (P48) [bar]	GT Compressor inlet air pressure (P1) [bar]	GT Compressor outlet air pressure (P2) [bar]	Gas Turbine exhaust gas pressure (Pexh) [bar]	Turbine Injecton Control (TIC) [%]	Fuel flow (mf) [kg/s]	GT Compressor decay state coefficient.	GT Turbine decay state coefficient.
0	1.138	3.0	289.964	1349.489	6677.380	7.584	7.584	464.006	288.0	550.563	1.096	0.998	5.947	1.019	7.137	0.082	0.95	0.975
1	2.088	6.0	6960.180	1376.166	6828.469	28.204	28.204	635.401	288.0	581.658	1.331	0.998	7.282	1.019	10.655	0.287	0.95	0.975
2	3.144	9.0	8379.229	1386.757	7111.811	60.358	60.358	606.002	288.0	587.587	1.389	0.998	7.574	1.020	13.086	0.259	0.95	0.975
3	4.161	12.0	14724.395	1547.465	7792.630	113.774	113.774	661.471	288.0	613.851	1.658	0.998	9.007	1.022	18.109	0.358	0.95	0.975
4	5.140	15.0	21636.432	1924.313	8494.777	175.306	175.306	731.494	288.0	645.642	2.078	0.998	11.197	1.026	26.373	0.522	0.95	0.975
```

Convert the data to NumPy arrays

```python
df = df.to_numpy()
values = df[:, -1]     # target: GT Turbine decay state coefficient (last column)
features = df[:, :-1]  # the remaining 17 columns serve as input features
features.shape, values.shape
```
```
((11934, 17), (11934,))
```
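The raw columns span several orders of magnitude (shaft torque in the thousands, pressures near 1 bar), and neural-network regressors usually train more stably on standardized inputs. This walkthrough skips that step, but a minimal sketch with NumPy looks like the following (the two illustrative columns stand in for the real 17-column array):

```python
import numpy as np

# Illustrative feature matrix: one large-scale and one small-scale column
rng = np.random.default_rng(0)
features = rng.normal(loc=[1000.0, 1.0], scale=[300.0, 0.1], size=(100, 2))

# Standardize each column to zero mean and unit variance
mean = features.mean(axis=0)
std = features.std(axis=0)
features_scaled = (features - mean) / std

print(features_scaled.mean(axis=0))  # each column is now ~0 on average
print(features_scaled.std(axis=0))   # each column now has unit spread
```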

Set up the machine learning model

```python
import perming
main = perming.Box(17, 1, (30,), criterion='MSELoss', batch_size=4, activation='relu', inplace_on=True, solver='adam', learning_rate_init=0.01)
# main = perming.Regressier(17, (30,), batch_size=4, activation='relu', solver='adam', learning_rate_init=0.01)
# main = perming.COMMON_MODELS['Regression'](17, (30,), batch_size=4, activation='relu', solver='adam', learning_rate_init=0.01)
main.print_config()
```
```
MLP(
  (mlp): Sequential(
    (Linear0): Linear(in_features=17, out_features=30, bias=True)
    (Activation0): ReLU(inplace=True)
    (Linear1): Linear(in_features=30, out_features=1, bias=True)
  )
)
OrderedDict([('torch -v', '1.7.1+cu101'),
             ('criterion', MSELoss()),
             ('batch_size', 4),
             ('solver',
              Adam (
              Parameter Group 0
                  amsgrad: False
                  betas: (0.9, 0.99)
                  eps: 1e-08
                  lr: 0.01
                  weight_decay: 0
              )),
             ('lr_scheduler', None),
             ('device', device(type='cuda'))])
```
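The printed architecture (17 inputs, one hidden layer of 30 units, one output, all with biases) is small, and its parameter count is easy to verify by hand:

```python
# Parameters of the two Linear layers shown by print_config()
linear0 = 17 * 30 + 30  # weights + bias of Linear0: 540
linear1 = 30 * 1 + 1    # weights + bias of Linear1: 31
total = linear0 + linear1
print(total)  # 571
```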

Load the dataset into the DataLoader

```python
main.data_loader(features, values, random_seed=0)
```
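The training log below shows 2387 steps per epoch and a test set of 1196 samples, which is consistent with a roughly 8:1:1 train/validation/test split of the 11934 rows at batch size 4. The exact split ratio is an assumption about perming's defaults, but the step count can be checked arithmetically:

```python
import math

total, batch_size = 11934, 4
train_size = int(0.8 * total)  # ~9547 samples assumed for training
steps_per_epoch = math.ceil(train_size / batch_size)
print(steps_per_epoch)  # 2387, matching the log
```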

Training with accelerated validation

```python
main.train_val(num_epochs=2, interval=100, early_stop=True)
```
```
Epoch [1/2], Step [100/2387], Training Loss: 23.0912, Validation Loss: 24.5740
Epoch [1/2], Step [200/2387], Training Loss: 291.9099, Validation Loss: 6.7348
Epoch [1/2], Step [300/2387], Training Loss: 5637.1328, Validation Loss: 1480.3076
Epoch [1/2], Step [400/2387], Training Loss: 1211.0406, Validation Loss: 210.9741
Epoch [1/2], Step [500/2387], Training Loss: 90.4388, Validation Loss: 23.6573
Epoch [1/2], Step [600/2387], Training Loss: 67.0454, Validation Loss: 24.6701
Epoch [1/2], Step [700/2387], Training Loss: 1253.5343, Validation Loss: 1144.0096
Epoch [1/2], Step [800/2387], Training Loss: 39.3887, Validation Loss: 257.6939
Epoch [1/2], Step [900/2387], Training Loss: 0.9986, Validation Loss: 1.1887
Epoch [1/2], Step [1000/2387], Training Loss: 30.2453, Validation Loss: 9.7175
Epoch [1/2], Step [1100/2387], Training Loss: 264.4302, Validation Loss: 19.0528
Epoch [1/2], Step [1200/2387], Training Loss: 5.2984, Validation Loss: 8.8709
Epoch [1/2], Step [1300/2387], Training Loss: 0.0152, Validation Loss: 0.3077
Epoch [1/2], Step [1400/2387], Training Loss: 0.0118, Validation Loss: 0.0014
Epoch [1/2], Step [1500/2387], Training Loss: 0.3608, Validation Loss: 0.3265
Epoch [1/2], Step [1600/2387], Training Loss: 5616.9810, Validation Loss: 54.1350
Epoch [1/2], Step [1700/2387], Training Loss: 1.0014, Validation Loss: 0.3494
Epoch [1/2], Step [1800/2387], Training Loss: 0.0025, Validation Loss: 0.0249
Epoch [1/2], Step [1900/2387], Training Loss: 0.0008, Validation Loss: 0.0195
Epoch [1/2], Step [2000/2387], Training Loss: 0.0041, Validation Loss: 0.0234
Epoch [1/2], Step [2100/2387], Training Loss: 0.2388, Validation Loss: 0.0222
Process stop at epoch [1/2] with patience 10 within tolerance 0.001
```
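The final log line reports early stopping with patience 10 and tolerance 0.001: training halts once the validation loss has failed to improve by more than the tolerance for 10 consecutive checks. A minimal sketch of that rule (not perming's actual implementation):

```python
def early_stop_index(val_losses, patience=10, tolerance=1e-3):
    """Return the check index at which training would stop, or None."""
    best = float('inf')
    bad_checks = 0
    for i, loss in enumerate(val_losses):
        if loss < best - tolerance:  # a meaningful improvement resets the counter
            best = loss
            bad_checks = 0
        else:
            bad_checks += 1
            if bad_checks >= patience:
                return i
    return None

# A loss curve that improves, then plateaus within tolerance
losses = [1.0, 0.5, 0.2] + [0.1999] * 12
print(early_stop_index(losses))  # 12: the 10th consecutive non-improving check
```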

Testing with the trained parameters

```python
main.test()
```
```
loss of Box on the 1196 test dataset: 0.14259785413742065.
OrderedDict([('problem', 'regression'),
             ('loss',
              {'train': 0.18060052394866943,
               'val': 0.025247152894735336,
               'test': 0.14259785413742065})])
```
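Since the criterion is `MSELoss`, the reported numbers are mean squared errors: the average squared gap between predicted and true decay coefficients. The metric itself is easy to reproduce with NumPy (illustrative arrays, not the model's actual predictions):

```python
import numpy as np

y_true = np.array([0.975, 0.980, 0.990])
y_pred = np.array([0.970, 0.985, 0.995])

# Mean squared error: average of squared residuals
mse = np.mean((y_pred - y_true) ** 2)
print(mse)
```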

Saving and loading model parameters

```python
main.save(False, '../models/ucigbm.ckpt')
```

```python
main.load(False, '../models/ucigbm.ckpt')
```