A CNN for Diabetes Prediction in PyTorch: A Complete Walkthrough from Data Loading to Model Deployment [N-Fold Cross-Validation, Free Download at the End]

This article walks through building a convolutional neural network (CNN) with PyTorch to predict diabetes, covering the full code implementation, the technical details, and visualization of the results.

1. Project Overview

This project uses the classic Pima Indians Diabetes dataset and trains a 1D CNN model with 5-fold cross-validation to predict diabetes.

The main technology stack includes:

  • PyTorch 1.0+ (deep learning framework)

  • scikit-learn (KFold data splitting)

  • Matplotlib (visualization)

  • Pandas (data handling)

2. Environment Setup and Data Preparation

2.1 Hardware Configuration

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")

This snippet automatically detects the available compute device and prefers GPU acceleration; on Colab or any machine with an NVIDIA GPU it will select CUDA.

2.2 Data Loading

data = pd.read_csv('diabetes.csv')
X = data.iloc[:, :-1].values  # features (8 medical measurements)
y = data.iloc[:, -1].values   # labels (0/1)

X_tensor = torch.FloatTensor(X).unsqueeze(1)  # add a channel dimension
y_tensor = torch.LongTensor(y)

Key points of the data preprocessing:

  1. The raw data contains 8 features and 1 binary label

  2. unsqueeze(1) changes the shape from [N, 8] to [N, 1, 8], which matches the [batch, channels, length] layout Conv1d expects

  3. No feature scaling is applied here; in practice adding a MinMaxScaler is recommended (a sketch follows this list)
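
As a hedged illustration of that last point, scaling could be added with scikit-learn's MinMaxScaler before the tensors are built (this is a sketch, not part of the original script):

from sklearn.preprocessing import MinMaxScaler

# Illustrative only: scale every feature to [0, 1] before building the tensors.
# Strictly, the scaler should be fit on the training folds only, to avoid
# leaking validation statistics.
scaler = MinMaxScaler()
X_scaled = scaler.fit_transform(X)

X_tensor = torch.FloatTensor(X_scaled).unsqueeze(1)   # [N, 1, 8]
y_tensor = torch.LongTensor(y)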

3. CNN Model Architecture

3.1 Network Structure

class DiabetesCNN(nn.Module):
    def __init__(self, input_features):
        super(DiabetesCNN, self).__init__()
        self.conv1 = nn.Conv1d(1, 16, kernel_size=3, stride=1, padding=1)
        self.conv2 = nn.Conv1d(16, 32, kernel_size=3, stride=1, padding=1)
        self.pool = nn.MaxPool1d(kernel_size=2, stride=2)
        self.fc1 = nn.Linear(32 * (input_features//4), 64)
        self.fc2 = nn.Linear(64, 2)
        self.dropout = nn.Dropout(0.2)
        self.relu = nn.ReLU()

Architecture details:

  • Conv1d layers: process the 8 features as a 1D sequence

    • First layer: 16 filters, kernel_size=3, length-preserving (padding=1)

    • Second layer: 32 filters, same configuration

  • MaxPool1d: downsampling factor 2, so the sequence length goes 8 → 4 → 2

  • Fully connected layers:

    • First layer: input dimension 32 * (8 // 4) = 64 → 64

    • Output layer: 64 → 2 (binary classification)

3.2 Forward Pass

def forward(self, x):
    x = self.relu(self.conv1(x))
    x = self.pool(x)
    x = self.relu(self.conv2(x))
    x = self.pool(x)
    x = x.view(x.size(0), -1)  # flatten
    x = self.dropout(x)
    x = self.relu(self.fc1(x))
    x = self.fc2(x)
    return x

Shape changes along the data flow (a quick sanity check follows the list):

  1. Input: [batch, 1, 8]

  2. Conv1 → [batch, 16, 8]

  3. Pool → [batch, 16, 4]

  4. Conv2 → [batch, 32, 4]

  5. Pool → [batch, 32, 2]

  6. Flatten → [batch, 64]

  7. FC → [batch, 2]
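
To confirm this bookkeeping, a quick sanity check (not part of the original script) is to push a random dummy batch through the model and print the output shape:

# Assumed sanity check: a random batch of 4 samples, each with 8 features.
model = DiabetesCNN(input_features=8)
dummy = torch.randn(4, 1, 8)       # [batch, channels, features]
print(model(dummy).shape)          # expected: torch.Size([4, 2])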

4. Training Pipeline Implementation

4.1 Training Function

def train_model(model, train_loader, val_loader, criterion, optimizer, epochs=50):
    # initialize the best accuracy and the metric history lists
    best_val_acc = 0.0
    train_losses, val_losses = [], []
    train_accs, val_accs = [], []
    
    for epoch in range(epochs):
        # training phase
        model.train()
        running_loss = 0.0
        train_correct = 0
        train_total = 0
        
        for inputs, labels in train_loader:
            # move the batch to the selected device
            inputs, labels = inputs.to(device), labels.to(device)
            
            # the classic training steps: zero grads, forward, loss, backward, update
            optimizer.zero_grad()
            outputs = model(inputs)
            loss = criterion(outputs, labels)
            loss.backward()
            optimizer.step()
            
            # accumulate metrics
            running_loss += loss.item()
            _, predicted = torch.max(outputs.data, 1)
            train_total += labels.size(0)
            train_correct += (predicted == labels).sum().item()

Key training details:

  • Adam optimizer (learning rate 0.001)

  • Cross-entropy loss (nn.CrossEntropyLoss applies log-softmax internally, so the model outputs raw logits)

  • Loss and accuracy are recorded for every epoch

  • Training and validation are handled in separate phases

4.2 Validation Phase

        # validation phase
        model.eval()
        val_correct = 0
        val_total = 0
        val_loss = 0.0
        
        with torch.no_grad():  # disable gradient tracking
            for inputs, labels in val_loader:
                inputs, labels = inputs.to(device), labels.to(device)
                outputs = model(inputs)
                loss = criterion(outputs, labels)
                val_loss += loss.item()
                _, predicted = torch.max(outputs.data, 1)
                val_total += labels.size(0)
                val_correct += (predicted == labels).sum().item()

Characteristics of the validation phase (the epoch-end bookkeeping the snippets omit is sketched after this list):

  • model.eval(): switches off training-only behavior such as Dropout

  • torch.no_grad(): saves memory and speeds up computation

  • No backpropagation is performed
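
The snippets in 4.1 and 4.2 stop before the end of the epoch loop, yet the cross-validation code in section 5.2 expects train_model to return best_val_acc plus the four history lists. A minimal sketch of the omitted epoch-end bookkeeping, continuing the indentation of the snippets above (the exact averaging is an assumption):

        # Assumed epoch-end bookkeeping: average the loss over batches, convert
        # the counters into percentages, then track the best validation accuracy.
        train_losses.append(running_loss / len(train_loader))
        val_losses.append(val_loss / len(val_loader))
        train_accs.append(100.0 * train_correct / train_total)
        val_accs.append(100.0 * val_correct / val_total)
        if val_accs[-1] > best_val_acc:
            best_val_acc = val_accs[-1]

    # after the epoch loop, hand everything back to the caller
    return best_val_acc, train_losses, val_losses, train_accs, val_accs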

5. Cross-Validation Implementation

5.1 KFold Configuration

kfold = KFold(n_splits=5, shuffle=True, random_state=42)
batch_size = 32
input_features = X_tensor.shape[2]  # 8
epochs = 50

# storage for per-fold results
fold_results = []
all_train_losses, all_val_losses = [], []
all_train_accs, all_val_accs = [], []

Advantages of 5-fold cross-validation (a stratified variant is sketched after this list):

  • Makes full use of a small dataset

  • Gives a more reliable estimate of model performance

  • Helps detect overfitting
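
Because the Pima dataset is class-imbalanced, a stratified split often gives steadier fold-to-fold results. A hedged alternative using scikit-learn's StratifiedKFold (using it here is an assumption; the article itself uses plain KFold):

from sklearn.model_selection import StratifiedKFold

# Assumed alternative: keep the 0/1 class ratio roughly equal in every fold.
skfold = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for fold, (train_ids, val_ids) in enumerate(skfold.split(X, y)):
    pass  # the per-fold DataLoader/model/training code below stays unchanged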

5.2 Training Loop

for fold, (train_ids, val_ids) in enumerate(kfold.split(X_tensor)):
    # split the data for this fold
    X_train, X_val = X_tensor[train_ids], X_tensor[val_ids]
    y_train, y_val = y_tensor[train_ids], y_tensor[val_ids]
    
    # create DataLoaders
    train_data = TensorDataset(X_train, y_train)
    val_data = TensorDataset(X_val, y_val)
    train_loader = DataLoader(train_data, batch_size=batch_size, shuffle=True)
    val_loader = DataLoader(val_data, batch_size=batch_size, shuffle=False)
    
    # initialize a fresh model for this fold
    model = DiabetesCNN(input_features).to(device)
    criterion = nn.CrossEntropyLoss()
    optimizer = optim.Adam(model.parameters(), lr=0.001)
    
    # train and record the results
    best_val_acc, train_losses, val_losses, train_accs, val_accs = train_model(
        model, train_loader, val_loader, criterion, optimizer, epochs)
    
    fold_results.append(best_val_acc)
    # store this fold's metric histories...

Key DataLoader parameters:

  • shuffle=True: shuffles the training set every epoch

  • batch_size=32: mini-batch training

  • num_workers: multi-process loading can be added (not shown in the code; see the sketch below)
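
The last bullet mentions num_workers; a hedged example of what that could look like (the worker count and pin_memory are illustrative, not part of the original code):

# Assumed variant: load batches in 2 worker processes; pin_memory can speed up
# host-to-GPU copies when training on CUDA.
train_loader = DataLoader(train_data, batch_size=batch_size, shuffle=True,
                          num_workers=2, pin_memory=True)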

6. Visualization and Analysis

6.1 Plotting the Training Curves

plt.figure(figsize=(15, 10))

# loss curves
plt.subplot(2, 2, 1)
for i in range(len(all_train_losses)):
    plt.plot(all_train_losses[i], label=f'Fold {i+1} Train')
plt.title('Training Loss Across Folds')
plt.xlabel('Epoch')
plt.ylabel('Loss')

# accuracy curves
plt.subplot(2, 2, 3)
for i in range(len(all_train_accs)):
    plt.plot(all_train_accs[i], label=f'Fold {i+1} Train')
plt.title('Training Accuracy Across Folds')
plt.xlabel('Epoch')
plt.ylabel('Accuracy (%)')

What to look for in the curves (a sketch completing the figure follows the list):

  • Check whether the training/validation curves converge

  • Look for signs of overfitting (training loss keeps falling while validation loss fluctuates or rises)

  • Compare consistency across folds
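
The snippet above only fills two of the four subplots and never calls legend or show. A hedged completion (the subplot layout and file name are assumptions) that adds the validation curves and renders the figure, assuming all_val_losses and all_val_accs were filled in the fold loop:

# Assumed completion: validation curves in the remaining subplots, plus legends.
plt.subplot(2, 2, 2)
for i in range(len(all_val_losses)):
    plt.plot(all_val_losses[i], label=f'Fold {i+1} Val')
plt.title('Validation Loss Across Folds')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.legend()

plt.subplot(2, 2, 4)
for i in range(len(all_val_accs)):
    plt.plot(all_val_accs[i], label=f'Fold {i+1} Val')
plt.title('Validation Accuracy Across Folds')
plt.xlabel('Epoch')
plt.ylabel('Accuracy (%)')
plt.legend()

plt.tight_layout()
plt.savefig('training_curves.png')   # file name is an assumption
plt.show()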

7. Model Saving and Deployment

7.1 Training on the Full Dataset

full_train_loader = DataLoader(TensorDataset(X_tensor, y_tensor), 
                             batch_size=batch_size, shuffle=True)
final_model = DiabetesCNN(input_features).to(device)
final_optimizer = optim.Adam(final_model.parameters(), lr=0.001)
train_model(final_model, full_train_loader, full_train_loader, 
           criterion, final_optimizer, epochs)

7.2 Saving the Model

torch.save(final_model.state_dict(), 'diabetes_cnn_model_final.pth')
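
If training may need to be resumed later, a common PyTorch pattern is to save a checkpoint dictionary rather than just the weights. A hedged sketch (the file name and contents are assumptions):

# Assumed variant: bundle weights, optimizer state, and metadata in one file.
torch.save({
    'model_state_dict': final_model.state_dict(),
    'optimizer_state_dict': final_optimizer.state_dict(),
    'input_features': input_features,
}, 'diabetes_cnn_checkpoint.pth')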

Deployment tips:

  1. Load the model:

    model = DiabetesCNN(8)
    model.load_state_dict(torch.load('diabetes_cnn_model_final.pth'))
    model.eval()
  2. Predict on new data (the input shaping is sketched after this list):

    with torch.no_grad():
        output = model(new_data)
        prediction = torch.argmax(output, dim=1)
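
Here new_data must be a float tensor of shape [batch, 1, 8] on the same device as the model, preprocessed the same way as the training CSV. A minimal sketch (the sample simply reuses the first training row as a placeholder):

# Assumed example: reuse the first row of the feature matrix as a stand-in
# for a new patient; real inputs must be preprocessed like the training data.
new_data = torch.FloatTensor(X[:1]).unsqueeze(1).to(device)   # shape [1, 1, 8]
model = model.to(device)

with torch.no_grad():
    output = model(new_data)
    prediction = torch.argmax(output, dim=1)
    print(prediction.item())   # 0 or 1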

The full code provides all the key pieces of the implementation, and readers can adjust and extend it to their own needs. The project demonstrates how deep learning can be applied to a small medical dataset and offers a reusable template for similar binary classification problems.


Dataset:

-0.294118,0.487437,0.180328,-0.292929,0,0.00149028,-0.53117,-0.0333333,0
-0.882353,-0.145729,0.0819672,-0.414141,0,-0.207153,-0.766866,-0.666667,1
-0.0588235,0.839196,0.0491803,0,0,-0.305514,-0.492741,-0.633333,0
-0.882353,-0.105528,0.0819672,-0.535354,-0.777778,-0.162444,-0.923997,0,1
0,0.376884,-0.344262,-0.292929,-0.602837,0.28465,0.887276,-0.6,0
-0.411765,0.165829,0.213115,0,0,-0.23696,-0.894962,-0.7,1
-0.647059,-0.21608,-0.180328,-0.353535,-0.791962,-0.0760059,-0.854825,-0.833333,0
0.176471,0.155779,0,0,0,0.052161,-0.952178,-0.733333,1
-0.764706,0.979899,0.147541,-0.0909091,0.283688,-0.0909091,-0.931682,0.0666667,0
-0.0588235,0.256281,0.57377,0,0,0,-0.868488,0.1,0
-0.529412,0.105528,0.508197,0,0,0.120715,-0.903501,-0.7,1
0.176471,0.688442,0.213115,0,0,0.132638,-0.608027,-0.566667,0
0.176471,0.396985,0.311475,0,0,-0.19225,0.163962,0.2,1
-0.882353,0.899497,-0.0163934,-0.535354,1,-0.102832,-0.726729,0.266667,0
-0.176471,0.00502513,0,0,0,-0.105812,-0.653288,-0.633333,0
0,0.18593,0.377049,-0.0505051,-0.456265,0.365127,-0.596072,-0.666667,0
-0.176471,0.0753769,0.213115,0,0,-0.117735,-0.849701,-0.666667,0
-0.882353,0.0351759,-0.508197,-0.232323,-0.803783,0.290611,-0.910333,-0.6,1
-0.882353,0.155779,0.147541,-0.393939,-0.77305,0.0312965,-0.614859,-0.633333,0
-0.647059,0.266332,0.442623,-0.171717,-0.444444,0.171386,-0.465414,-0.8,1
-0.0588235,-0.00502513,0.377049,0,0,0.0551417,-0.735269,-0.0333333,1
-0.176471,0.969849,0.47541,0,0,0.186289,-0.681469,-0.333333,0
0.0588235,0.19598,0.311475,-0.292929,0,-0.135618,-0.842015,-0.733333,0
0.176471,0.256281,0.147541,-0.474747,-0.728132,-0.0730253,-0.891546,-0.333333,0
-0.176471,0.477387,0.245902,0,0,0.174367,-0.847139,-0.266667,0
-0.882353,-0.0251256,0.0819672,-0.69697,-0.669031,-0.308495,-0.650726,-0.966667,1
0.529412,0.457286,0.344262,-0.616162,-0.739953,-0.338301,-0.857387,0.2,1
-0.411765,0.175879,0.508197,0,0,0.0163934,-0.778822,-0.433333,1
-0.411765,0.0954774,0.229508,-0.474747,0,0.0730254,-0.600342,0.3,1
-0.647059,0.58794,0.245902,-0.272727,-0.420804,-0.0581222,-0.33988,-0.766667,0
-0.647059,-0.115578,-0.0491803,-0.777778,-0.87234,-0.260805,-0.838599,-0.966667,1
-0.294118,-0.0753769,0.508197,0,0,-0.406855,-0.906063,-0.766667,1
0.176471,0.226131,0.278689,-0.373737,0,-0.177347,-0.629377,-0.2,1
-0.529412,0.0351759,-0.0163934,-0.333333,-0.546099,-0.28465,-0.241674,-0.6,1
0.294118,0.386935,0.245902,0,0,-0.0104321,-0.707942,-0.533333,1
0.0588235,0.0251256,0.245902,-0.252525,0,-0.019374,-0.498719,-0.166667,0
-0.764706,-0.0954774,0.114754,-0.151515,0,0.138599,-0.637062,-0.8,0
-0.529412,0.115578,0.180328,-0.0505051,-0.510638,0.105812,0.12041,0.166667,0
-0.647059,0.809045,0.0491803,-0.494949,-0.834515,0.0134128,-0.835184,-0.833333,1
-0.176471,0.336683,0.377049,0,0,0.198212,-0.472246,-0.466667,1
-0.176471,0.0653266,0.508197,-0.636364,0,-0.323398,-0.865927,-0.1,1
0.0588235,0.718593,0.803279,-0.515152,-0.432624,0.353204,-0.450897,0.1,0
-0.176471,0.59799,0.0491803,0,0,-0.183308,-0.815542,-0.366667,1
0,0.809045,0.0819672,-0.212121,0,0.251863,0.549957,-0.866667,0
-0.882353,0.467337,-0.0819672,0,0,-0.114754,-0.58497,-0.733333,1
-0.764706,-0.286432,0.147541,-0.454545,0,-0.165425,-0.566183,-0.966667,1
-0.176471,0.0351759,0.0819672,-0.353535,0,0.165425,-0.772844,-0.666667,0
-0.176471,0.0552764,0,0,0,0,-0.806149,-0.9,1
-0.882353,0.0351759,0.311475,-0.777778,-0.806147,-0.421759,-0.64731,-0.966667,1
-0.882353,0.0150754,-0.180328,-0.69697,-0.914894,-0.278688,-0.617421,-0.833333,1
-0.411765,-0.115578,0.0819672,-0.575758,-0.945626,-0.272727,-0.774552,-0.7,1
-0.0588235,0.768844,0.47541,-0.313131,-0.29078,0.004471,-0.667805,0.233333,0
-0.176471,0.507538,0.0819672,-0.151515,-0.191489,0.0342773,-0.453459,-0.3,1
-0.882353,-0.266332,-0.180328,-0.79798,0,-0.314456,-0.854825,0,1
-0.176471,0.879397,0.114754,-0.212121,-0.281324,0.123696,-0.849701,-0.333333,0
0,0.00502513,0.442623,0.212121,-0.739953,0.394933,-0.24509,-0.666667,1
0,0.467337,0.344262,0,0,0.207154,0.454313,-0.233333,1
0,0.0552764,0.0491803,-0.171717,-0.664303,0.23696,-0.918873,-0.966667,1
-0.764706,-0.155779,0,0,0,0,-0.807003,0,1
-0.0588235,0.336683,0.180328,0,0,-0.019374,-0.836038,-0.4,0
-0.411765,-0.557789,0.0163934,0,0,-0.254843,-0.565329,-0.5,1
-0.764706,0.417085,-0.0491803,-0.313131,-0.6974,-0.242921,-0.469684,-0.9,1
-0.176471,0.145729,0.0819672,0,0,-0.0223547,-0.846285,-0.3,0
-0.411765,-0.00502513,0.213115,-0.454545,0,-0.135618,-0.893254,-0.633333,1
0,0.0954774,0.442623,-0.393939,0,-0.0312965,-0.336465,-0.433333,0
-0.764706,0.0954774,0.508197,0,0,0.272727,-0.345004,0.1,1
-0.882353,-0.0452261,0.0819672,-0.737374,-0.910165,-0.415797,-0.781383,-0.866667,1
-0.529412,0.467337,0.393443,-0.454545,-0.763593,-0.138599,-0.905209,-0.8,1
-0.764706,0.00502513,0.0819672,-0.59596,-0.787234,-0.019374,-0.326217,-0.766667,0
-0.411765,0.396985,0.0491803,-0.292929,-0.669031,-0.147541,-0.715628,-0.833333,1
0.529412,0.266332,0.47541,0,0,0.293592,-0.568745,-0.3,0
-0.529412,0.296482,0.409836,-0.59596,-0.361702,0.0461997,-0.869342,-0.933333,1
-0.882353,-0.20603,0.229508,-0.393939,0,-0.0461997,-0.728437,-0.966667,1
-0.882353,0,-0.213115,-0.59596,0,-0.263785,-0.947054,-0.966667,1
-0.176471,-0.376884,0.278689,0,0,-0.028316,-0.732707,-0.333333,1
-0.411765,-0.0452261,0.180328,-0.333333,0,0.123696,-0.75064,-0.8,1
0,0.316583,0,0,0,0.28763,-0.836038,-0.833333,0
-0.764706,0.125628,0.0819672,-0.555556,0,-0.254843,-0.804441,-0.9,1
-0.647059,0.135678,-0.278689,-0.737374,0,-0.33234,-0.947054,-0.966667,1
-0.764706,-0.256281,0,0,0,0,-0.979505,-0.966667,1
-0.176471,-0.165829,0.278689,-0.474747,-0.832151,-0.126677,-0.411614,-0.5,1
0,0.0150754,0.0655738,-0.434343,0,-0.266766,-0.864219,-0.966667,1
-0.411765,0.376884,0.770492,0,0,0.454545,-0.872758,-0.466667,0
-0.764706,0.105528,0.213115,-0.414141,-0.704492,-0.0342771,-0.470538,-0.8,1
0.529412,0.0653266,0.180328,0.0909091,0,0.0909091,-0.914603,-0.2,1
-0.764706,0.00502513,0.114754,-0.494949,-0.832151,0.147541,-0.789923,-0.833333,1
0.764706,0.366834,0.147541,-0.353535,-0.739953,0.105812,-0.935952,-0.266667,0
-0.882353,0.0753769,0.114754,-0.616162,0,-0.210134,-0.925705,-0.9,1
-0.882353,-0.19598,-0.0983607,0,0,-0.4307,-0.846285,0,1
-0.529412,0.236181,0.311475,-0.69697,-0.583924,-0.0461997,-0.688301,-0.566667,1
-0.176471,-0.18593,0.278689,-0.191919,-0.886525,0.391952,-0.843723,-0.3,1
-0.529412,0.346734,0.180328,0,0,-0.290611,-0.83006,0.3,0
-0.764706,0.427136,0.344262,-0.636364,-0.8487,-0.263785,-0.416738,0,1
-0.294118,0.447236,0.180328,-0.454545,-0.460993,0.0104323,-0.848847,-0.366667,1
-0.764706,-0.0753769,0.0163934,-0.434343,0,-0.0581222,-0.955594,-0.9,1
-0.882353,-0.286432,-0.213115,-0.636364,-0.820331,-0.391952,-0.790777,-0.966667,1
-0.294118,-0.0653266,-0.180328,-0.393939,-0.8487,-0.14456,-0.762596,-0.933333,1
-0.882353,0.226131,0.47541,0.030303,-0.479905,0.481371,-0.789069,-0.666667,0
-0.882353,0.638191,0.180328,0,0,0.162444,-0.0230572,-0.6,0
-0.882353,0.517588,-0.0163934,0,0,-0.222057,-0.913749,-0.966667,1
0,0.256281,0.57377,0,0,-0.329359,-0.842869,0,1
-0.882353,-0.18593,0.180328,-0.636364,-0.905437,-0.207153,-0.824936,-0.9,1
-0.764706,-0.145729,0.0655738,0,0,0.180328,-0.272417,-0.8,1
-0.882353,0.266332,-0.0819672,-0.414141,-0.640662,-0.14456,-0.382579,0,1
-0.882353,-0.0351759,1,0,0,-0.33234,-0.889838,-0.8,1
-0.529412,0.447236,-0.0491803,-0.434343,-0.669031,-0.120715,-0.82152,-0.466667,1
-0.647059,-0.165829,-0.0491803,-0.373737,-0.957447,0.0223547,-0.779675,-0.866667,1
0,-0.0452261,0.393443,-0.494949,-0.914894,0.114754,-0.855679,-0.9,0
-0.647059,0.718593,0.180328,-0.333333,-0.680851,-0.00745157,-0.89667,-0.9,0
-0.0588235,0.557789,0.0163934,-0.474747,0.170213,0.0134128,-0.602904,-0.166667,0
-0.882353,-0.105528,0.245902,-0.313131,-0.91253,-0.0700447,-0.902647,-0.933333,1
-0.529412,-0.236181,0.0163934,0,0,0.0134128,-0.732707,-0.866667,1
-0.176471,0.60804,-0.114754,-0.353535,-0.586288,-0.0909091,-0.564475,-0.4,0
-0.529412,0.467337,0.508197,0,0,-0.0700447,-0.606319,0.333333,0
-0.411765,0.246231,0.213115,0,0,0.0134128,-0.878736,-0.433333,0
-0.411765,-0.21608,-0.213115,0,0,0.004471,-0.508113,-0.866667,1
-0.529412,-0.0251256,-0.0163934,-0.535354,0,-0.159463,-0.688301,-0.966667,1
-0.529412,-0.00502513,0.245902,-0.69697,-0.879433,-0.308495,-0.876174,0,1
0,0.628141,0.245902,0.131313,-0.763593,0.585693,-0.418446,-0.866667,0
-0.294118,0.115578,0.0491803,-0.212121,0,0.0193741,-0.844577,-0.9,1
-0.764706,0.0753769,0.213115,-0.393939,-0.763593,0.00149028,-0.721605,-0.933333,1
-0.411765,0.326633,0.311475,0,0,-0.201192,-0.907771,0.6,1
0,0.135678,0.245902,0,0,-0.00745157,-0.829206,-0.933333,0
-0.882353,-0.115578,-0.508197,-0.151515,-0.765957,0.639344,-0.64304,-0.833333,0
-0.647059,0.20603,0.147541,-0.393939,-0.680851,0.278689,-0.680615,-0.7,1
-0.882353,0.18593,-0.0491803,-0.272727,-0.777778,-0.00745157,-0.843723,-0.933333,1
-0.882353,0.175879,0.442623,-0.515152,-0.65721,0.028316,-0.722459,-0.366667,0
0,0.0552764,0.377049,0,0,-0.168405,-0.433817,0.366667,0
-0.529412,0.738693,0.147541,-0.717172,-0.602837,-0.114754,-0.758326,-0.6,0
0.0588235,0.226131,-0.0819672,0,0,-0.00745157,-0.115286,-0.6,0
-0.647059,0.708543,0.0491803,-0.252525,-0.468085,0.028316,-0.762596,-0.7,0
-0.0588235,-0.155779,0.213115,-0.373737,0,0.14158,-0.676345,-0.4,1
-0.764706,-0.0351759,0.114754,-0.737374,-0.884161,-0.371088,-0.514091,-0.833333,1
-0.764706,0.256281,-0.0163934,-0.59596,-0.669031,0.00745157,-0.99146,-0.666667,1
0,0.00502513,0.147541,-0.474747,-0.881797,-0.0819672,-0.556789,0,1
0,-0.0653266,-0.0163934,-0.494949,-0.782506,-0.14456,-0.612297,-0.966667,1
0,0.296482,0.311475,0,0,-0.0700447,-0.466268,-0.733333,1
-0.411765,0.0552764,0.180328,-0.414141,-0.231678,0.0998511,-0.930828,-0.766667,1
-0.647059,0.286432,0.278689,0,0,-0.371088,-0.837746,0.133333,1
-0.411765,0.0653266,0.344262,-0.393939,0,0.177347,-0.822374,-0.433333,1
-0.764706,0.0854271,-0.147541,-0.474747,-0.851064,-0.0312965,-0.795047,-0.966667,1
0.176471,0.0854271,0.0819672,0,0,-0.0342771,-0.83433,-0.3,0
-0.529412,0.547739,0.0163934,-0.373737,-0.328605,-0.0223547,-0.864219,-0.933333,1
0,0.0251256,0.229508,-0.535354,0,0,-0.578138,0,1
0.0588235,-0.427136,0.311475,-0.252525,0,-0.0223547,-0.984629,-0.333333,1
-0.764706,0.0653266,0.0491803,-0.292929,-0.718676,-0.0909091,0.12895,-0.566667,1
-0.411765,0.477387,0.278689,0,0,0.004471,-0.880444,0.466667,1
-0.764706,-0.0954774,0.147541,-0.656566,0,-0.186289,-0.994022,-0.966667,1
-0.882353,0.366834,0.213115,0.010101,-0.51773,0.114754,-0.725875,-0.9,1
-0.529412,0.145729,0.0655738,0,0,-0.347243,-0.697694,-0.466667,1
0.0588235,0.567839,0.409836,-0.434343,-0.63357,0.0223547,-0.0512383,-0.3,0
-0.882353,0.537688,0.344262,-0.151515,0.146572,0.210134,-0.479932,-0.933333,1
-0.0588235,0.889447,0.278689,0,0,0.42772,-0.949616,-0.266667,0
-0.176471,0.527638,0.442623,-0.111111,0,0.490313,-0.778822,-0.5,0
-0.764706,-0.00502513,-0.147541,-0.69697,-0.777778,-0.266766,-0.52263,0,1
-0.882353,0.0954774,-0.0819672,-0.575758,-0.680851,-0.248882,-0.355252,-0.933333,1
-0.764706,-0.115578,0.213115,-0.616162,-0.874704,-0.135618,-0.87105,-0.966667,1
1,0.638191,0.180328,-0.171717,-0.730496,0.219076,-0.368915,-0.133333,0
-0.529412,0.517588,0.47541,-0.232323,0,-0.114754,-0.815542,-0.5,1
-0.176471,0.0251256,0.213115,-0.191919,-0.751773,0.108793,-0.8924,-0.2,1
0,0.145729,0.311475,-0.313131,-0.326241,0.317437,-0.923997,-0.8,1
-0.764706,0.00502513,0.0491803,-0.535354,0,-0.114754,-0.752348,0,1
0,0.316583,0.442623,0,0,-0.0581222,-0.432109,-0.633333,0
-0.294118,0.0452261,0.213115,-0.636364,-0.631206,-0.108793,-0.450043,-0.333333,0
-0.647059,0.487437,0.0819672,-0.494949,0,-0.0312965,-0.847993,-0.966667,1
-0.529412,0.20603,0.114754,0,0,-0.117735,-0.461144,-0.566667,1
-0.529412,0.105528,0.0819672,0,0,-0.0491803,-0.664389,-0.733333,1
-0.647059,0.115578,0.47541,-0.757576,-0.815603,-0.153502,-0.643894,-0.733333,1
-0.294118,0.0251256,0.344262,0,0,-0.0819672,-0.912895,-0.5,0
-0.294118,0.346734,0.147541,-0.535354,-0.692671,0.0551417,-0.603757,-0.733333,0
-0.764706,-0.125628,0,-0.535354,0,-0.138599,-0.40649,-0.866667,1
-0.882353,-0.20603,-0.0163934,-0.151515,-0.886525,0.296572,-0.487617,-0.933333,1
-0.764706,-0.246231,0.0491803,-0.515152,-0.869976,-0.114754,-0.75064,-0.6,1
-0.0588235,0.798995,0.180328,-0.151515,-0.692671,-0.0253353,-0.452605,-0.5,0
-0.294118,-0.145729,0.278689,0,0,-0.0700447,-0.740393,-0.3,1
0,0.296482,0.803279,-0.0707071,-0.692671,1,-0.794193,-0.833333,0
-0.411765,0.437186,0.278689,0,0,0.341282,-0.904355,-0.133333,1
-0.411765,0.306533,0.344262,0,0,0.165425,-0.250213,-0.466667,0
-0.294118,-0.125628,0.311475,0,0,-0.308495,-0.994876,-0.633333,1
0,0.19598,0.0491803,-0.636364,-0.782506,0.0402385,-0.447481,-0.933333,1
-0.882353,0,0.213115,-0.59596,-0.945626,-0.174367,-0.811272,0,1
-0.411765,-0.266332,-0.0163934,0,0,-0.201192,-0.837746,-0.8,1
-0.529412,0.417085,0.213115,0,0,-0.177347,-0.858241,-0.366667,1
-0.176471,0.949749,0.114754,-0.434343,0,0.0700448,-0.430401,-0.333333,0
-0.0588235,0.819095,0.114754,-0.272727,0.170213,-0.102832,-0.541418,0.3,0
-0.882353,0.286432,0.606557,-0.171717,-0.862884,-0.0461997,0.0614859,-0.6,0
-0.0588235,0.0954774,0.245902,-0.212121,-0.730496,-0.168405,-0.520068,-0.666667,0
-0.411765,0.396985,0.311475,-0.292929,-0.621749,-0.0581222,-0.758326,-0.866667,0
-0.647059,0.115578,0.0163934,0,0,-0.326379,-0.945346,0,1
0.0588235,0.236181,0.147541,-0.111111,-0.777778,-0.0134128,-0.747225,-0.366667,1
-0.176471,0.59799,0.0819672,0,0,-0.0938897,-0.739539,-0.5,0
0.294118,0.356784,0,0,0,0.558867,-0.573015,-0.366667,0
-0.0588235,-0.145729,-0.0983607,-0.59596,0,-0.272727,-0.95047,-0.3,1
-0.411765,0.58794,0.377049,-0.171717,-0.503546,0.174367,-0.729291,-0.733333,0
-0.882353,0.0552764,-0.0491803,0,0,-0.275708,-0.906917,0,1
-0.647059,0.0753769,0.0163934,-0.737374,-0.886525,-0.317437,-0.487617,-0.933333,0
-0.529412,0.0954774,0.0491803,-0.111111,-0.765957,0.0372578,-0.293766,-0.833333,0
-0.529412,0.487437,-0.0163934,-0.454545,-0.248227,-0.0789866,-0.938514,-0.733333,0
0,0.135678,0.311475,-0.676768,0,-0.0760059,-0.320239,0,1
-0.882353,0.386935,0.344262,0,0,0.195231,-0.865073,-0.766667,1
0,0.0854271,0.114754,-0.59596,0,-0.186289,-0.394535,-0.633333,1
-0.764706,-0.00502513,0.147541,-0.676768,-0.895981,-0.391952,-0.865927,-0.8,1
-0.294118,0.0351759,0.180328,-0.353535,-0.550827,0.123696,-0.789923,0.133333,1
-0.411765,0.115578,0.180328,-0.434343,0,-0.28763,-0.719044,-0.8,1
-0.0588235,0.969849,0.245902,-0.414141,-0.338061,0.117735,-0.549957,0.2,0
-0.411765,0.628141,0.704918,0,0,0.123696,-0.93766,0.0333333,0
-0.882353,-0.0351759,0.0491803,-0.454545,-0.794326,-0.0104321,-0.819812,0,1
-0.176471,0.849246,0.377049,-0.333333,0,0.0581222,-0.76345,-0.333333,0
-0.764706,-0.18593,-0.0163934,-0.555556,0,-0.174367,-0.818958,-0.866667,1
0,0.477387,0.393443,0.0909091,0,0.275708,-0.746371,-0.9,1
-0.176471,0.798995,0.557377,-0.373737,0,0.0193741,-0.926558,0.3,1
0,0.407035,0.0655738,-0.474747,-0.692671,0.269747,-0.698548,-0.9,0
0.0588235,0.125628,0.344262,-0.353535,-0.586288,0.0193741,-0.844577,-0.5,0
0.411765,0.517588,0.147541,-0.191919,-0.359338,0.245902,-0.432963,-0.433333,0
-0.411765,0.0954774,0.0163934,-0.171717,-0.695035,0.0670641,-0.627669,-0.866667,0
-0.294118,0.256281,0.114754,-0.393939,-0.716312,-0.105812,-0.670367,-0.633333,1
-0.411765,-0.145729,0.213115,-0.555556,0,-0.135618,-0.0213493,-0.633333,0
-0.411765,0.125628,0.0819672,0,0,0.126677,-0.843723,-0.333333,0
0,0.778894,-0.0163934,-0.414141,0.130024,0.0312965,-0.151153,0,0
-0.764706,0.58794,0.47541,0,0,-0.0581222,-0.379163,0.5,0
-0.176471,0.19598,0,0,0,-0.248882,-0.88813,-0.466667,1
-0.176471,0.427136,-0.0163934,-0.333333,-0.550827,-0.14158,-0.479932,0.333333,1
-0.882353,0.00502513,0.0819672,-0.69697,-0.867612,-0.296572,-0.497865,-0.833333,1
-0.882353,-0.125628,0.278689,-0.454545,-0.92435,0.0312965,-0.980359,-0.966667,1
0,0.0150754,0.245902,0,0,0.0640835,-0.897523,-0.833333,1
-0.647059,0.628141,-0.147541,-0.232323,0,0.108793,-0.509821,-0.9,0
-0.529412,0.979899,0.147541,-0.212121,0.758865,0.0938898,0.922289,-0.666667,1
0,0.175879,0.311475,-0.373737,-0.874704,0.347243,-0.990606,-0.9,1
-0.529412,0.427136,0.409836,0,0,0.311475,-0.515798,-0.966667,0
-0.294118,0.346734,0.311475,-0.252525,-0.125296,0.377049,-0.863365,-0.166667,0
-0.882353,-0.20603,0.311475,-0.494949,-0.91253,-0.242921,-0.568745,-0.966667,1
-0.529412,0.226131,0.114754,0,0,0.0432191,-0.730145,-0.733333,1
-0.647059,-0.256281,0.114754,-0.434343,-0.893617,-0.114754,-0.816396,-0.933333,1
-0.529412,0.718593,0.180328,0,0,0.299553,-0.657558,-0.833333,0
0,0.798995,0.47541,-0.454545,0,0.314456,-0.480786,-0.933333,0
0.0588235,0.648241,0.377049,-0.575758,0,-0.0819672,-0.35696,-0.633333,0
0,0.0452261,0.245902,0,0,-0.451565,-0.569599,-0.8,1
-0.882353,-0.0854271,0.0491803,-0.515152,0,-0.129657,-0.902647,0,1
-0.529412,-0.0854271,0.147541,-0.353535,-0.791962,-0.0134128,-0.685739,-0.966667,1
-0.647059,0.396985,-0.114754,0,0,-0.23696,-0.723313,-0.966667,0
-0.294118,0.19598,-0.180328,-0.555556,-0.583924,-0.19225,0.058924,-0.6,0
-0.764706,0.467337,0.245902,-0.292929,-0.541371,0.138599,-0.785653,-0.733333,1
0.0588235,0.849246,0.393443,-0.69697,0,-0.105812,-0.030743,-0.0666667,0
0.176471,0.226131,0.114754,0,0,-0.0700447,-0.846285,-0.333333,1
0,0.658291,0.47541,-0.333333,0.607565,0.558867,-0.701964,-0.933333,1
0.0588235,0.246231,0.147541,-0.333333,-0.0496454,0.0551417,-0.82579,-0.566667,1
-0.882353,0.115578,0.409836,-0.616162,0,-0.102832,-0.944492,-0.933333,1
0.0588235,0.0653266,-0.147541,0,0,-0.0700447,-0.742101,-0.3,1
-0.764706,0.296482,0.377049,0,0,-0.165425,-0.824082,-0.8,1
-0.764706,-0.0954774,0.311475,-0.717172,-0.869976,-0.272727,-0.853971,-0.9,1
0,-0.135678,0.114754,-0.353535,0,0.0670641,-0.863365,-0.866667,1
0.411765,-0.0753769,0.0163934,-0.858586,-0.390071,-0.177347,-0.275833,-0.233333,0
-0.882353,0.135678,0.0491803,-0.292929,0,0.00149028,-0.602904,0,0
-0.647059,0.115578,-0.0819672,-0.212121,0,-0.102832,-0.590948,-0.7,1
-0.764706,0.145729,0.114754,-0.555556,0,-0.14456,-0.988044,-0.866667,1
-0.882353,0.939698,-0.180328,-0.676768,-0.113475,-0.228018,-0.507259,-0.9,1
-0.647059,0.919598,0.114754,-0.69697,-0.692671,-0.0789866,-0.811272,-0.566667,1
-0.647059,0.417085,0,0,0,-0.105812,-0.416738,-0.8,0
-0.529412,-0.0452261,0.147541,-0.353535,0,-0.0432191,-0.54398,-0.9,1
-0.647059,0.427136,0.311475,-0.69697,0,-0.0342771,-0.895816,0.4,1
-0.529412,0.236181,0.0163934,0,0,-0.0461997,-0.873612,-0.533333,0
-0.411765,-0.0351759,0.213115,-0.636364,-0.841608,0.00149028,-0.215201,-0.266667,1
0,0.386935,0,0,0,0.0819672,-0.269855,-0.866667,0
-0.764706,0.286432,0.0491803,-0.151515,0,0.19225,-0.126388,-0.9,1
0,0.0251256,-0.147541,0,0,-0.251863,0,0,1
-0.764706,0.467337,0,0,0,-0.180328,-0.861657,-0.766667,0
0.176471,0.0150754,0.409836,-0.252525,0,0.359165,-0.0964987,-0.433333,0
-0.764706,0.0854271,0.0163934,-0.353535,-0.867612,-0.248882,-0.957301,0,1
-0.647059,0.226131,0.278689,0,0,-0.314456,-0.849701,-0.366667,1
-0.882353,-0.286432,0.278689,0.010101,-0.893617,-0.0104321,-0.706234,0,1
0.529412,0.0653266,0.147541,0,0,0.0193741,-0.852263,0.0333333,1
-0.764706,0.00502513,0.147541,0.0505051,-0.865248,0.207154,-0.488471,-0.866667,1
-0.176471,0.0653266,-0.0163934,-0.515152,0,-0.210134,-0.813834,-0.733333,0
0,0.0452261,0.0491803,-0.535354,-0.725768,-0.171386,-0.678907,-0.933333,1
-0.411765,0.145729,0.213115,0,0,-0.257824,-0.431255,0.2,1
-0.764706,0.0854271,0.0163934,-0.79798,-0.34279,-0.245902,-0.314261,-0.966667,1
0,0.467337,0.147541,0,0,0.129657,-0.781383,-0.766667,0
0.176471,0.296482,0.245902,-0.434343,-0.711584,0.0700448,-0.827498,-0.4,1
-0.176471,0.336683,0.442623,-0.69697,-0.63357,-0.0342771,-0.842869,-0.466667,1
-0.176471,0.61809,0.409836,0,0,-0.0938897,-0.925705,-0.133333,0
-0.764706,0.0854271,0.311475,0,0,-0.195231,-0.845431,0.0333333,0
-0.411765,0.557789,0.377049,-0.111111,0.288416,0.153502,-0.538002,-0.566667,1
-0.882353,0.19598,0.409836,-0.212121,-0.479905,0.359165,-0.376601,-0.733333,0
-0.529412,-0.0351759,-0.0819672,-0.656566,-0.884161,-0.38003,-0.77626,-0.833333,1
-0.411765,0.0854271,0.180328,-0.131313,-0.822695,0.0760059,-0.842015,-0.6,1
0,-0.21608,0.442623,-0.414141,-0.905437,0.0998511,-0.695986,0,1
0,0.0753769,0.0163934,-0.393939,-0.825059,0.0909091,-0.420154,-0.866667,0
-0.764706,0.286432,0.278689,-0.252525,-0.56974,0.290611,-0.0213493,-0.666667,0
-0.882353,0.286432,-0.213115,-0.0909091,-0.541371,0.207154,-0.543126,-0.9,0
0,0.61809,-0.180328,0,0,-0.347243,-0.849701,0.466667,1
-0.294118,0.517588,0.0163934,-0.373737,-0.716312,0.0581222,-0.475662,-0.766667,1
-0.764706,0.467337,0.147541,-0.232323,-0.148936,-0.165425,-0.778822,-0.733333,0
0,0.266332,0.377049,-0.414141,-0.491726,-0.0849478,-0.622545,-0.9,1
0.647059,0.00502513,0.278689,-0.494949,-0.565012,0.0909091,-0.714774,-0.166667,0
-0.0588235,0.125628,0.180328,0,0,-0.296572,-0.349274,0.233333,1
0,0.678392,0,0,0,-0.0372578,-0.350128,-0.7,0
-0.764706,0.447236,-0.0491803,-0.333333,-0.680851,-0.0581222,-0.706234,-0.866667,0
-0.411765,-0.226131,0.344262,-0.171717,-0.900709,0.0670641,-0.93339,-0.533333,1
-0.411765,0.155779,0.606557,0,0,0.576751,-0.88813,-0.766667,0
-0.647059,0.507538,0.245902,0,0,-0.374069,-0.889838,-0.466667,1
-0.764706,0.20603,0.245902,-0.252525,-0.751773,0.183309,-0.883006,-0.733333,1
0.176471,0.61809,0.114754,-0.535354,-0.687943,-0.23994,-0.788215,-0.133333,0
0,0.376884,0.114754,-0.717172,-0.650118,-0.260805,-0.944492,0,1
0,0.286432,0.114754,-0.616162,-0.574468,-0.0909091,0.121264,-0.866667,0
-0.764706,0.246231,0.114754,-0.434343,-0.515366,-0.019374,-0.319385,-0.7,0
-0.294118,-0.19598,0.0819672,-0.393939,0,-0.219076,-0.799317,-0.333333,1
0,0.0653266,0.147541,-0.252525,-0.650118,0.174367,-0.549957,-0.966667,1
-0.764706,0.557789,0.213115,-0.656566,-0.77305,-0.207153,-0.69684,-0.8,0
-0.647059,0.135678,-0.180328,-0.79798,-0.799054,-0.120715,-0.532024,-0.866667,1
-0.176471,0.0954774,0.311475,-0.373737,0,0.0700448,-0.104184,-0.266667,0
-0.764706,0.125628,0.114754,-0.555556,-0.777778,0.0163934,-0.797609,-0.833333,1
-0.647059,-0.00502513,0.311475,-0.777778,-0.8487,-0.424739,-0.824082,-0.7,1
-0.647059,0.829146,0.213115,0,0,-0.0909091,-0.77199,-0.733333,0
-0.647059,0.155779,0.0819672,-0.212121,-0.669031,0.135618,-0.938514,-0.766667,1
-0.294118,0.949749,0.278689,0,0,-0.299553,-0.956447,0.266667,0
-0.529412,0.296482,-0.0163934,-0.757576,-0.453901,-0.180328,-0.616567,-0.666667,1
-0.647059,0.125628,0.213115,-0.393939,0,-0.0581222,-0.898377,-0.866667,0
0,0.246231,0.147541,-0.59596,0,-0.183308,-0.849701,-0.5,0
0.529412,0.527638,0.47541,-0.333333,-0.931442,-0.201192,-0.442357,-0.266667,0
-0.764706,0.125628,0.229508,-0.353535,0,0.0640835,-0.940222,0,1
-0.882353,0.577889,0.180328,-0.575758,-0.602837,-0.23696,-0.961571,-0.9,1
-0.882353,0.226131,0.0491803,-0.353535,-0.631206,0.0461997,-0.475662,-0.7,0
0.176471,0.798995,0.147541,0,0,0.0461997,-0.895816,-0.466667,1
-0.764706,0.0251256,0.409836,-0.272727,-0.716312,0.356185,-0.958155,-0.933333,0
-0.294118,0.0552764,0.147541,-0.353535,-0.839243,-0.0819672,-0.962425,-0.466667,1
-0.0588235,0.18593,0.180328,-0.616162,0,-0.311475,0.193851,-0.166667,1
-0.764706,-0.125628,-0.0491803,-0.676768,-0.877069,-0.0253353,-0.924851,-0.866667,1
-0.882353,0.809045,0,0,0,0.290611,-0.82579,-0.333333,0
0.411765,0.0653266,0.311475,0,0,-0.296572,-0.949616,-0.233333,1
-0.882353,-0.0452261,-0.0163934,-0.636364,-0.862884,-0.28763,-0.844577,-0.966667,1
0,0.658291,0.245902,-0.131313,-0.397163,0.42772,-0.845431,-0.833333,1
0,0.175879,0,0,0,0.00745157,-0.270709,-0.233333,1
-0.411765,0.155779,0.245902,0,0,-0.0700447,-0.773698,-0.233333,0
0.0588235,0.527638,0.278689,-0.313131,-0.595745,0.0193741,-0.304014,-0.6,0
-0.176471,0.788945,0.377049,0,0,0.18927,-0.783945,-0.333333,0
-0.882353,0.306533,0.147541,-0.737374,-0.751773,-0.228018,-0.663535,-0.966667,1
-0.882353,-0.0452261,0.213115,-0.575758,-0.827423,-0.228018,-0.491887,-0.5,1
-0.882353,0,0.114754,-0.292929,0,-0.0461997,-0.734415,-0.966667,1
-0.411765,0.226131,0.409836,0,0,0.0342773,-0.818958,-0.6,1
-0.0588235,-0.0452261,0.180328,0,0,0.0968703,-0.652434,0.2,1
-0.0588235,0.266332,0.442623,-0.272727,-0.744681,0.147541,-0.768574,-0.0666667,1
-0.882353,0.396985,-0.245902,-0.616162,-0.803783,-0.14456,-0.508113,-0.966667,1
-0.647059,0.165829,0,0,0,-0.299553,-0.906917,-0.933333,1
-0.647059,-0.00502513,0.0163934,-0.616162,-0.825059,-0.350224,-0.828352,-0.833333,1
-0.411765,0,0.311475,-0.353535,0,0.222057,-0.771136,-0.466667,0
-0.529412,-0.0753769,0.311475,0,0,0.257824,-0.864219,-0.733333,1
-0.529412,0.376884,0.377049,0,0,-0.0700447,-0.851409,-0.7,1
-0.647059,-0.386935,0.344262,-0.434343,0,0.0253354,-0.859095,-0.166667,1
-0.882353,-0.0954774,0.0163934,-0.757576,-0.898345,-0.18927,-0.571307,-0.9,1
-0.647059,-0.0954774,0.278689,0,0,0.272727,-0.58924,0,1
0.0588235,0.658291,0.442623,0,0,-0.0938897,-0.808711,-0.0666667,0
-0.882353,0.256281,-0.180328,-0.191919,-0.605201,-0.00745157,-0.24509,-0.766667,0
0.529412,0.296482,0,-0.393939,0,0.18927,-0.5807,-0.233333,0
0.411765,-0.115578,0.213115,-0.191919,-0.87234,0.052161,-0.743809,-0.1,1
-0.882353,0.969849,0.245902,-0.272727,-0.411348,0.0879285,-0.319385,-0.733333,0
-0.411765,0.899497,0.0491803,-0.333333,-0.231678,-0.0700447,-0.568745,-0.733333,0
-0.411765,0.58794,0.147541,0,0,-0.111773,-0.889838,0.4,1
-0.411765,0.0351759,0.770492,-0.252525,0,0.168405,-0.806149,0.466667,1
-0.529412,0.467337,0.278689,0,0,0.147541,-0.622545,0.533333,0
-0.529412,0.477387,0.213115,-0.494949,-0.307329,0.0402385,-0.737831,-0.7,1
-0.411765,-0.00502513,-0.114754,-0.434343,-0.803783,0.0134128,-0.640478,-0.7,1
-0.294118,0.246231,0.180328,0,0,-0.177347,-0.752348,-0.733333,0
0,0.0150754,0.0491803,-0.656566,0,-0.374069,-0.851409,0,1
-0.647059,-0.18593,0.409836,-0.676768,-0.843972,-0.180328,-0.805295,-0.966667,1
-0.882353,0.336683,0.672131,-0.434343,-0.669031,-0.0223547,-0.866781,-0.2,0
-0.647059,0.738693,0.344262,-0.030303,0.0992908,0.14456,0.758326,-0.866667,0
0,0.18593,0.0491803,-0.535354,-0.789598,0,0.411614,0,1
0,-0.155779,0.0491803,-0.555556,-0.843972,0.0670641,-0.601196,0,1
-0.764706,0.0552764,-0.0491803,-0.191919,-0.777778,0.0402385,-0.874466,-0.866667,1
-0.764706,0.226131,-0.147541,-0.131313,-0.626478,0.0789866,-0.369769,-0.766667,1
0.411765,0.407035,0.344262,-0.131313,-0.231678,0.168405,-0.615713,0.233333,0
0,-0.0150754,0.344262,-0.69697,-0.801418,-0.248882,-0.811272,-0.966667,1
-0.882353,-0.125628,-0.0163934,-0.252525,-0.822695,0.108793,-0.631939,-0.966667,1
-0.529412,0.567839,0.229508,0,0,0.439642,-0.863365,-0.633333,0
0,-0.0653266,0.639344,-0.212121,-0.829787,0.293592,-0.194705,-0.533333,1
-0.882353,0.0753769,0.180328,-0.393939,-0.806147,-0.0819672,-0.3655,-0.9,1
0,0.0552764,0.114754,-0.555556,0,-0.403875,-0.865073,-0.966667,1
-0.882353,0.0954774,-0.0163934,-0.838384,-0.56974,-0.242921,-0.257899,0,1
-0.882353,-0.0954774,0.0163934,-0.636364,-0.86052,-0.251863,0.0162254,-0.866667,1
-0.882353,0.256281,0.147541,-0.515152,-0.739953,-0.275708,-0.877882,-0.866667,1
-0.882353,0.19598,-0.114754,-0.737374,-0.881797,-0.33532,-0.891546,-0.9,1
-0.411765,0.165829,0.213115,-0.414141,0,-0.0372578,-0.502989,-0.533333,0
-0.0588235,0.0552764,0.639344,-0.272727,0,0.290611,-0.862511,-0.2,0
-0.411765,0.447236,0.344262,-0.474747,-0.326241,-0.0461997,-0.680615,0.233333,0
-0.647059,0.00502513,0.114754,-0.535354,-0.808511,-0.0581222,-0.256191,-0.766667,1
-0.882353,0.00502513,0.0819672,-0.414141,-0.536643,-0.0461997,-0.687447,-0.3,1
-0.411765,0.668342,0.245902,0,0,0.362146,-0.77626,-0.8,0
-0.882353,0.316583,0.0491803,-0.717172,-0.0189125,-0.293592,-0.734415,0,1
-0.529412,0.165829,0.180328,-0.757576,-0.794326,-0.341282,-0.671221,-0.466667,1
-0.529412,0.58794,0.278689,0,0,-0.019374,-0.380871,-0.666667,0
-0.764706,0.276382,-0.0491803,-0.515152,-0.349882,-0.174367,0.299744,-0.866667,1
-0.647059,-0.0351759,-0.0819672,-0.313131,-0.728132,-0.263785,-0.260461,-0.4,1
0,0.316583,0.0819672,-0.191919,0,0.0223547,-0.899231,-0.966667,0
-0.647059,-0.175879,0.147541,0,0,-0.371088,-0.734415,-0.866667,1
-0.647059,0.939698,0.147541,-0.373737,0,0.0402385,-0.860803,-0.866667,0
-0.529412,-0.0452261,0.0491803,0,0,-0.0461997,-0.92912,-0.666667,0
-0.411765,0.366834,0.377049,-0.171717,-0.791962,0.0432191,-0.822374,-0.533333,0
0.0588235,-0.276382,0.278689,-0.494949,0,-0.0581222,-0.827498,-0.433333,1
-0.411765,0.688442,0.0491803,0,0,-0.019374,-0.951324,-0.333333,0
-0.764706,0.236181,-0.213115,-0.353535,-0.609929,0.254843,-0.622545,-0.833333,1
-0.529412,0.155779,0.180328,0,0,-0.138599,-0.745517,-0.166667,0
0,0.0150754,0.0163934,0,0,-0.347243,-0.779675,-0.866667,1
-0.0588235,0.979899,0.213115,0,0,-0.228018,-0.0495303,-0.4,0
-0.882353,0.728643,0.114754,-0.010101,0.368794,0.263785,-0.467122,-0.766667,0
-0.294118,0.0251256,0.47541,-0.212121,0,0.0640835,-0.491033,-0.766667,1
-0.882353,0.125628,0.180328,-0.393939,-0.583924,0.0253354,-0.615713,-0.866667,1
-0.882353,0.437186,0.377049,-0.535354,-0.267139,0.263785,-0.147737,-0.966667,1
-0.882353,0.437186,0.213115,-0.555556,-0.855792,-0.219076,-0.847993,0,1
0,0.386935,-0.0163934,-0.292929,-0.605201,0.0312965,-0.610589,0,0
-0.647059,0.738693,0.377049,-0.333333,0.120567,0.0640835,-0.846285,-0.966667,0
-0.882353,-0.0251256,0.114754,-0.575758,0,-0.18927,-0.131512,-0.966667,1
-0.529412,0.447236,0.344262,-0.353535,0,0.147541,-0.59351,-0.466667,0
-0.882353,-0.165829,0.114754,0,0,-0.457526,-0.533732,-0.8,1
-0.647059,0.296482,0.0491803,-0.414141,-0.728132,-0.213115,-0.87959,-0.766667,0
-0.882353,0.19598,0.442623,-0.171717,-0.598109,0.350224,-0.633646,-0.833333,1
-0.764706,-0.0552764,0.114754,-0.636364,-0.820331,-0.225037,-0.587532,0,1
0,0.0251256,0.0491803,-0.0707071,-0.815603,0.210134,-0.64304,0,1
-0.764706,0.155779,0.0491803,-0.555556,0,-0.0819672,-0.707088,0,1
-0.0588235,0.517588,0.278689,-0.353535,-0.503546,0.278689,-0.625961,-0.5,0
-0.529412,0.849246,0.278689,-0.212121,-0.345154,0.102832,-0.841161,-0.666667,0
0,-0.0552764,0,0,0,0,-0.847993,-0.866667,1
-0.882353,0.819095,0.0491803,-0.393939,-0.574468,0.0163934,-0.786507,-0.433333,0
0,0.356784,0.540984,-0.0707071,-0.65721,0.210134,-0.824082,-0.833333,1
-0.882353,-0.0452261,0.344262,-0.494949,-0.574468,0.0432191,-0.867635,-0.266667,0
-0.764706,-0.00502513,0,0,0,-0.338301,-0.974381,-0.933333,1
-0.647059,-0.105528,0.213115,-0.676768,-0.799054,-0.0938897,-0.596072,-0.433333,1
-0.882353,-0.19598,0.213115,-0.777778,-0.858156,-0.105812,-0.616567,-0.966667,1
-0.764706,0.396985,0.229508,0,0,-0.23696,-0.923997,-0.733333,1
-0.882353,-0.0954774,0.114754,-0.838384,0,-0.269747,-0.0947908,-0.5,1
0,0.417085,0,0,0,0.263785,-0.891546,-0.733333,0
0.411765,0.407035,0.393443,-0.333333,0,0.114754,-0.858241,-0.333333,1
-0.411765,0.477387,0.229508,0,0,-0.108793,-0.695986,-0.766667,1
-0.882353,-0.0251256,0.147541,-0.69697,0,-0.457526,-0.941076,0,1
-0.294118,0.0753769,0.442623,0,0,0.0968703,-0.445773,-0.666667,1
0,0.899497,0.704918,-0.494949,0,0.0223547,-0.695132,-0.333333,0
-0.764706,-0.165829,0.0819672,-0.535354,-0.881797,-0.0402384,-0.642186,-0.966667,1
-0.529412,0.175879,0.0491803,-0.454545,-0.716312,-0.0104321,-0.870196,-0.9,1
-0.0588235,0.0854271,0.147541,0,0,-0.0909091,-0.251067,-0.6,0
-0.529412,0.175879,0.0163934,-0.757576,0,-0.114754,-0.742101,-0.7,0
0,0.809045,0.278689,0.272727,-0.966903,0.770492,1,-0.866667,0
-0.882353,0.00502513,0.180328,-0.757576,-0.834515,-0.245902,-0.504697,-0.766667,1
0,-0.0452261,0.311475,-0.0909091,-0.782506,0.0879285,-0.784799,-0.833333,1
0,0.0452261,0.0491803,-0.252525,-0.8487,0.00149028,-0.631085,-0.966667,0
0,0.20603,0.213115,-0.636364,-0.851064,-0.0909091,-0.823228,-0.833333,1
-0.882353,-0.175879,0.0491803,-0.737374,-0.775414,-0.368107,-0.712212,-0.933333,1
-0.764706,0.346734,0.147541,0,0,-0.138599,-0.603757,-0.933333,0
0,-0.0854271,0.114754,-0.353535,-0.503546,0.18927,-0.741247,-0.866667,1
-0.764706,0.19598,0,0,0,-0.415797,-0.356106,0.7,1
-0.764706,0.00502513,-0.114754,-0.434343,-0.751773,0.126677,-0.641332,-0.9,1
0.647059,0.758794,0.0163934,-0.393939,0,0.00149028,-0.885568,-0.433333,0
-0.882353,0.356784,-0.114754,0,0,-0.204173,-0.479932,0.366667,1
-0.411765,-0.135678,0.114754,-0.434343,-0.832151,-0.0998509,-0.755764,-0.9,1
0.0588235,0.346734,0.213115,-0.333333,-0.858156,-0.228018,-0.673783,1,1
0.0588235,0.20603,0.180328,-0.555556,-0.867612,-0.38003,-0.440649,-0.1,1
-0.882353,-0.286432,0.0163934,0,0,-0.350224,-0.711358,-0.833333,1
-0.0588235,-0.256281,0.147541,-0.191919,-0.884161,0.052161,-0.46456,-0.4,1
-0.411765,-0.115578,0.278689,-0.393939,0,-0.177347,-0.846285,-0.466667,1
0.176471,0.155779,0.606557,0,0,-0.28465,-0.193851,-0.566667,1
0,0.246231,-0.0819672,-0.737374,-0.751773,-0.350224,-0.680615,0,1
0,-0.256281,-0.147541,-0.79798,-0.914894,-0.171386,-0.836892,-0.966667,1
0,-0.0251256,0.0491803,-0.272727,-0.763593,0.0968703,-0.554227,-0.866667,1
-0.0588235,0.20603,0,0,0,-0.105812,-0.910333,-0.433333,0
-0.294118,0.547739,0.278689,-0.171717,-0.669031,0.374069,-0.578992,-0.8,1
-0.882353,0.447236,0.344262,-0.191919,0,0.230999,-0.548249,-0.766667,1
0,0.376884,0.147541,-0.232323,0,-0.0104321,-0.921435,-0.966667,1
0,0.19598,0.0819672,-0.454545,0,0.156483,-0.845431,-0.966667,1
-0.176471,0.366834,0.47541,0,0,-0.108793,-0.887276,-0.0333333,1
-0.529412,0.145729,0.0491803,0,0,-0.138599,-0.959009,-0.9,1
0,0.376884,0.377049,-0.454545,0,-0.186289,-0.869342,0.266667,1
-0.764706,0.0552764,0.311475,-0.0909091,-0.548463,0.004471,-0.459436,-0.733333,0
-0.176471,0.145729,0.245902,-0.656566,-0.739953,-0.290611,-0.668659,-0.666667,1
-0.0588235,0.266332,0.213115,-0.232323,-0.822695,-0.228018,-0.928266,-0.4,1
-0.529412,0.326633,0.409836,-0.373737,0,-0.165425,-0.708796,0.4,1
-0.647059,0.58794,0.147541,-0.393939,-0.224586,0.0581222,-0.772844,-0.533333,0
0,0.236181,0.442623,-0.252525,0,0.0491804,-0.898377,-0.733333,1
-0.529412,-0.145729,-0.0491803,-0.555556,-0.884161,-0.171386,-0.805295,-0.766667,1
0,-0.155779,0.344262,-0.373737,-0.704492,0.138599,-0.867635,-0.933333,1
0,0.457286,0,0,0,0.317437,-0.528608,-0.666667,0
0,0.356784,0.114754,-0.151515,-0.408983,0.260805,-0.75491,-0.9,0
-0.882353,0.396985,0.0163934,-0.171717,0.134752,0.213115,-0.608881,0,1
0,0.738693,0.278689,-0.353535,-0.373522,0.385991,-0.0768574,0.233333,1
-0.529412,-0.00502513,0.180328,-0.656566,0,-0.23696,-0.815542,-0.766667,1
-0.0588235,0.949749,0.311475,0,0,-0.222057,-0.596072,0.533333,1
-0.764706,-0.165829,0.0655738,-0.434343,-0.843972,0.0968703,-0.529462,-0.9,1
-0.764706,-0.105528,0.47541,-0.393939,0,-0.00149028,-0.81725,-0.3,1
-0.529412,-0.00502513,0.114754,-0.232323,0,-0.0223547,-0.942784,-0.6,1
-0.529412,0.256281,0.147541,-0.636364,-0.711584,-0.138599,-0.089667,-0.2,0
-0.647059,-0.19598,0,0,0,0,-0.918019,-0.966667,1
-0.294118,0.668342,0.213115,0,0,-0.207153,-0.807003,0.5,1
-0.411765,0.105528,0.114754,0,0,-0.225037,-0.81725,-0.7,1
-0.764706,-0.18593,0.180328,-0.69697,-0.820331,-0.102832,-0.599488,-0.866667,1
-0.176471,0.959799,0.147541,-0.333333,-0.65721,-0.251863,-0.927412,0.133333,0
-0.294118,0.547739,0.213115,-0.353535,-0.543735,-0.126677,-0.350128,-0.4,1
-0.764706,0.175879,0.47541,-0.616162,-0.832151,-0.248882,-0.799317,0,1
-0.647059,-0.155779,0.180328,-0.353535,0,0.108793,-0.838599,-0.766667,1
-0.294118,0,0.114754,-0.171717,0,0.162444,-0.445773,-0.333333,0
-0.176471,-0.0552764,0.0491803,-0.494949,-0.813239,-0.00745157,-0.436379,-0.333333,1
-0.647059,-0.0351759,0.278689,-0.212121,0,0.111773,-0.863365,-0.366667,1
0.176471,-0.246231,0.344262,0,0,-0.00745157,-0.842015,-0.433333,1
0,0.809045,0.47541,-0.474747,-0.787234,0.0879285,-0.798463,-0.533333,0
-0.882353,0.306533,-0.0163934,-0.535354,-0.598109,-0.147541,-0.475662,0,1
-0.764706,-0.155779,-0.180328,-0.535354,-0.820331,-0.0938897,-0.239966,0,1
-0.0588235,0.20603,0.278689,0,0,-0.254843,-0.717336,0.433333,1
0.411765,-0.155779,0.180328,-0.373737,0,-0.114754,-0.81298,-0.166667,0
0,0.396985,0.0163934,-0.656566,-0.503546,-0.341282,-0.889838,0,1
0.0588235,-0.0854271,0.114754,0,0,-0.278688,-0.895816,0.233333,1
-0.764706,-0.0854271,0.0163934,0,0,-0.186289,-0.618275,-0.966667,1
-0.647059,-0.00502513,-0.114754,-0.616162,-0.79669,-0.23696,-0.935098,-0.9,1
-0.647059,0.638191,0.147541,-0.636364,-0.751773,-0.0581222,-0.837746,-0.766667,0
0.0588235,0.457286,0.442623,-0.313131,-0.609929,-0.0968703,-0.408198,0.0666667,0
0.529412,-0.236181,-0.0163934,0,0,-0.0223547,-0.912895,-0.333333,1
-0.294118,0.296482,0.47541,-0.858586,-0.229314,-0.415797,-0.569599,0.3,1
-0.764706,-0.316583,0.147541,-0.353535,-0.843972,-0.254843,-0.906917,-0.866667,1
-0.647059,0.246231,0.311475,-0.333333,-0.692671,-0.0104321,-0.806149,-0.833333,1
-0.294118,0.145729,0,0,0,0,-0.905209,-0.833333,1
0.0588235,0.306533,0.147541,0,0,0.0193741,-0.509821,-0.2,0
-0.647059,0.256281,-0.0491803,0,0,-0.0581222,-0.93766,-0.9,1
-0.647059,-0.125628,-0.0163934,-0.636364,0,-0.350224,-0.687447,0,1
-0.882353,-0.0251256,0.0491803,-0.616162,-0.806147,-0.457526,-0.811272,0,1
-0.647059,0.165829,0.213115,-0.69697,-0.751773,-0.216095,-0.975235,-0.9,1
0,0.175879,0.0819672,-0.373737,-0.555556,-0.0819672,-0.645602,-0.966667,1
0,0.115578,0.0655738,0,0,-0.266766,-0.502989,-0.666667,1
-0.764706,0.226131,-0.0163934,-0.636364,-0.749409,-0.111773,-0.454313,-0.966667,1
0,0.0753769,0.245902,0,0,0.350224,-0.480786,-0.9,1
-0.882353,-0.135678,0.0819672,0.0505051,-0.846336,0.230999,-0.283518,-0.733333,1
-0.294118,-0.0854271,0,0,0,-0.111773,-0.63877,-0.666667,1
-0.882353,-0.226131,-0.0819672,-0.393939,-0.867612,-0.00745157,0.00170794,-0.9,1
-0.529412,0.326633,0,0,0,-0.019374,-0.808711,-0.933333,0
0,0.0552764,0.47541,0,0,-0.117735,-0.898377,-0.166667,1
0,-0.427136,-0.0163934,0,0,-0.353204,-0.438941,0.533333,1
0,0.276382,0.311475,-0.252525,-0.503546,0.0819672,-0.380017,-0.933333,1
-0.647059,0.296482,0.508197,-0.010101,-0.63357,0.0849479,-0.239966,-0.633333,0
-0.0588235,0.00502513,0.213115,-0.191919,-0.491726,0.174367,-0.502135,-0.266667,0
-0.647059,0.286432,0.180328,-0.494949,-0.550827,-0.0342771,-0.59778,-0.8,0
0.176471,-0.0954774,0.393443,-0.353535,0,0.0402385,-0.362084,0.166667,0
-0.529412,-0.155779,0.47541,-0.535354,-0.867612,0.177347,-0.930828,-0.866667,1
-0.882353,-0.115578,0.278689,-0.414141,-0.820331,-0.0461997,-0.75491,-0.733333,1
-0.0588235,0.869347,0.47541,-0.292929,-0.468085,0.028316,-0.70538,-0.466667,0
-0.411765,0.879397,0.245902,-0.454545,-0.510638,0.299553,-0.183604,0.0666667,0
-0.529412,0.316583,0.114754,-0.575758,-0.607565,-0.0134128,-0.929974,-0.766667,1
-0.882353,0.648241,0.344262,-0.131313,-0.841608,-0.0223547,-0.775406,-0.0333333,1
-0.529412,0.899497,0.803279,-0.373737,0,-0.150522,-0.485909,-0.466667,1
-0.882353,0.165829,0.147541,-0.434343,0,-0.183308,-0.8924,0,1
-0.647059,-0.155779,0.114754,-0.393939,-0.749409,-0.0491803,-0.561913,-0.866667,1
-0.294118,0.145729,0.442623,0,0,-0.171386,-0.855679,0.5,1
-0.882353,-0.115578,0.0163934,-0.515152,-0.895981,-0.108793,-0.706234,-0.933333,1
-0.882353,-0.155779,0.0491803,-0.535354,-0.728132,0.0998511,-0.664389,-0.766667,1
-0.176471,0.246231,0.147541,-0.333333,-0.491726,-0.23994,-0.92912,-0.466667,1
-0.882353,-0.0251256,0.147541,-0.191919,0,0.135618,-0.880444,-0.7,1
-0.0588235,0.105528,0.245902,0,0,-0.171386,-0.864219,0.233333,1
0.294118,0.0351759,0.114754,-0.191919,0,0.377049,-0.959009,-0.3,1
0.294118,-0.145729,0.213115,0,0,-0.102832,-0.810418,-0.533333,1
-0.294118,0.256281,0.245902,0,0,0.00745157,-0.963279,0.1,0
0,0.98995,0.0819672,-0.353535,-0.352246,0.230999,-0.637916,-0.766667,0
-0.882353,-0.125628,0.114754,-0.313131,-0.817967,0.120715,-0.724167,-0.9,1
-0.294118,-0.00502513,-0.0163934,-0.616162,-0.87234,-0.198212,-0.642186,-0.633333,1
0,-0.0854271,0.311475,0,0,-0.0342771,-0.553373,-0.8,1
-0.764706,-0.0452261,-0.114754,-0.717172,-0.791962,-0.222057,-0.427839,-0.966667,1
-0.882353,-0.00502513,0.180328,-0.393939,-0.957447,0.150522,-0.714774,0,1
-0.294118,-0.0753769,0.0163934,-0.353535,-0.702128,-0.0461997,-0.994022,-0.166667,1
-0.529412,0.547739,0.180328,-0.414141,-0.702128,-0.0670641,-0.777968,-0.466667,1
0,0.21608,0.0819672,-0.393939,-0.609929,0.0223547,-0.893254,-0.6,0
-0.647059,-0.21608,0.147541,0,0,-0.0312965,-0.836038,-0.4,1
-0.764706,0.306533,0.57377,0,0,-0.326379,-0.837746,0,1
-0.647059,0.115578,-0.0491803,-0.373737,-0.895981,-0.120715,-0.699402,-0.966667,1
-0.764706,-0.0150754,-0.0163934,-0.656566,-0.716312,0.0342773,-0.897523,-0.966667,1
-0.882353,0.437186,0.409836,-0.393939,-0.219858,-0.102832,-0.304868,-0.933333,1
-0.882353,0.19598,-0.278689,-0.0505051,-0.851064,0.0581222,-0.827498,-0.866667,1
-0.294118,0.0854271,-0.278689,-0.59596,-0.692671,-0.28465,-0.372331,-0.533333,1
-0.764706,0.18593,0.311475,0,0,0.278689,-0.474808,0,0
0.176471,0.336683,0.114754,0,0,-0.195231,-0.857387,-0.5,1
-0.764706,0.979899,0.147541,1,0,0.0342773,-0.575576,0.366667,0
0,0.517588,0.47541,-0.0707071,0,0.254843,-0.749787,0,0
-0.294118,0.0954774,-0.0163934,-0.454545,0,-0.254843,-0.890692,-0.8,1
0.411765,0.21608,0.278689,-0.656566,0,-0.210134,-0.845431,0.366667,1
-0.0588235,0.00502513,0.245902,0,0,0.153502,-0.904355,-0.3,1
-0.0588235,0.246231,0.245902,-0.515152,0.41844,-0.14456,-0.479932,0.0333333,0
-0.882353,-0.0653266,-0.0819672,-0.777778,0,-0.329359,-0.710504,-0.966667,1
-0.0588235,0.437186,0.0819672,0,0,0.0402385,-0.956447,-0.333333,0
-0.294118,0.0351759,0.0819672,0,0,-0.275708,-0.853971,-0.733333,1
-0.647059,0.768844,0.409836,-0.454545,-0.631206,-0.00745157,-0.0811272,0.0333333,0
0,-0.266332,0,0,0,-0.371088,-0.774552,-0.866667,1
0.294118,0.115578,0.377049,-0.191919,0,0.394933,-0.276687,-0.2,0
-0.764706,0.125628,0.278689,0.010101,-0.669031,0.174367,-0.917165,-0.9,1
-0.647059,0.326633,0.311475,0,0,0.0253354,-0.723313,-0.233333,0
-0.764706,-0.175879,-0.147541,-0.555556,-0.728132,-0.150522,0.384287,-0.866667,1
-0.294118,0.236181,0.180328,-0.0909091,-0.456265,0.00149028,-0.440649,-0.566667,1
0,0.889447,0.344262,-0.717172,-0.562648,-0.0461997,-0.484202,-0.966667,0
0,-0.326633,0.245902,0,0,0.350224,-0.900939,-0.166667,1
-0.882353,-0.105528,-0.606557,-0.616162,-0.940898,-0.171386,-0.58924,0,1
-0.882353,0.738693,0.213115,0,0,0.0968703,-0.99146,-0.433333,0
-0.882353,0.0954774,-0.377049,-0.636364,-0.716312,-0.311475,-0.719044,-0.833333,1
-0.882353,0.0854271,0.442623,-0.616162,0,-0.19225,-0.725021,-0.9,1
-0.294118,-0.0351759,0,0,0,-0.293592,-0.904355,-0.766667,1
-0.882353,0.246231,0.213115,-0.272727,0,-0.171386,-0.981213,-0.7,1
-0.176471,0.507538,0.278689,-0.414141,-0.702128,0.0491804,-0.475662,0.1,0
-0.529412,0.839196,0,0,0,-0.153502,-0.885568,-0.5,0
-0.882353,0.246231,-0.0163934,-0.353535,0,0.0670641,-0.627669,0,1
-0.882353,0.819095,0.278689,-0.151515,-0.307329,0.19225,0.00768574,-0.966667,0
-0.882353,-0.0753769,0.0163934,-0.494949,-0.903073,-0.418778,-0.654996,-0.866667,1
0,0.527638,0.344262,-0.212121,-0.356974,0.23696,-0.836038,-0.8,1
-0.882353,0.115578,0.0163934,-0.737374,-0.56974,-0.28465,-0.948762,-0.933333,1
-0.647059,0.0653266,-0.114754,-0.575758,-0.626478,-0.0789866,-0.81725,-0.9,1
-0.647059,0.748744,-0.0491803,-0.555556,-0.541371,-0.019374,-0.560205,-0.5,0
-0.176471,0.688442,0.442623,-0.151515,-0.241135,0.138599,-0.394535,-0.366667,0
-0.294118,0.0552764,0.311475,-0.434343,0,-0.0312965,-0.316823,-0.833333,1
0.294118,0.386935,0.213115,-0.474747,-0.659574,0.0760059,-0.590948,-0.0333333,0
-0.647059,0.0653266,0.180328,0,0,-0.230999,-0.889838,-0.8,1
-0.294118,0.175879,0.57377,0,0,-0.14456,-0.932536,-0.7,1
-0.764706,-0.316583,0.0163934,-0.737374,-0.964539,-0.400894,-0.847139,-0.933333,1
0.0588235,0.125628,0.344262,-0.515152,0,-0.159463,0.028181,-0.0333333,0
0,0.19598,0,0,0,-0.0342771,-0.9462,-0.9,0
-0.764706,0.125628,0.409836,-0.151515,-0.621749,0.14456,-0.856533,-0.766667,1
-0.764706,-0.0753769,0.245902,-0.59596,0,-0.278688,0.383433,-0.766667,1
-0.294118,0.839196,0.540984,0,0,0.216095,0.181042,-0.2,1
0,-0.0552764,0.147541,-0.454545,-0.728132,0.296572,-0.770282,0,1
-0.764706,0.0854271,0.0491803,0,0,-0.0819672,-0.931682,0,1
-0.529412,-0.0954774,0.442623,-0.0505051,-0.87234,0.123696,-0.757472,-0.733333,1
0,0.256281,0.114754,0,0,-0.263785,-0.890692,0,1
0,0.326633,0.278689,0,0,-0.0342771,-0.730999,0,1
-0.411765,0.286432,0.311475,0,0,0.0312965,-0.943638,-0.2,1
-0.529412,-0.0552764,0.0655738,-0.555556,0,-0.263785,-0.940222,0,1
-0.176471,0.145729,0.0491803,0,0,-0.183308,-0.441503,-0.566667,0
0,0.0251256,0.278689,-0.191919,-0.787234,0.028316,-0.863365,-0.9,1
-0.764706,0.115578,-0.0163934,0,0,-0.219076,-0.773698,-0.933333,1
-0.882353,0.286432,0.344262,-0.656566,-0.567376,-0.180328,-0.968403,-0.966667,1
0.176471,-0.0753769,0.0163934,0,0,-0.228018,-0.923997,-0.666667,1
0.529412,0.0452261,0.180328,0,0,-0.0700447,-0.669513,-0.433333,0
-0.411765,0.0452261,0.213115,0,0,-0.14158,-0.935952,-0.1,1
-0.764706,-0.0552764,0.245902,-0.636364,-0.843972,-0.0581222,-0.512383,-0.933333,1
-0.176471,-0.0251256,0.245902,-0.353535,-0.78487,0.219076,-0.322801,-0.633333,0
-0.882353,0.00502513,0.213115,-0.757576,-0.891253,-0.418778,-0.939368,-0.766667,1
0,0.0251256,0.409836,-0.656566,-0.751773,-0.126677,-0.4731,-0.8,1
-0.529412,0.286432,0.147541,0,0,0.0223547,-0.807857,-0.9,1
-0.294118,0.477387,0.311475,0,0,-0.120715,-0.914603,-0.0333333,0
-0.529412,-0.0954774,0,0,0,-0.165425,-0.545687,-0.666667,1
-0.647059,0.0351759,0.180328,-0.393939,-0.640662,-0.177347,-0.443211,-0.8,1
-0.764706,0.577889,0.213115,-0.292929,0.0401891,0.174367,-0.952178,-0.7,1
-0.882353,0.678392,0.213115,-0.656566,-0.659574,-0.302534,-0.684885,-0.6,0
0,0.798995,-0.180328,-0.272727,-0.624113,0.126677,-0.678053,-0.966667,0
0.294118,0.366834,0.377049,-0.292929,-0.692671,-0.156483,-0.844577,-0.3,0
0,0.0753769,-0.0163934,-0.494949,0,-0.213115,-0.953032,-0.933333,1
-0.882353,-0.0854271,-0.114754,-0.494949,-0.763593,-0.248882,-0.866781,-0.933333,1
-0.882353,0.175879,-0.0163934,-0.535354,-0.749409,0.00745157,-0.668659,-0.8,1
-0.411765,0.236181,0.213115,-0.191919,-0.817967,0.0163934,-0.836892,-0.766667,1
-0.764706,0.20603,-0.114754,0,0,-0.201192,-0.678053,-0.8,1
-0.882353,0.0653266,0.147541,-0.434343,-0.680851,0.0193741,-0.945346,-0.966667,1
-0.764706,0.557789,-0.147541,-0.454545,0.276596,0.153502,-0.861657,-0.866667,0
-0.764706,0.0150754,-0.0491803,-0.292929,-0.787234,-0.350224,-0.934244,-0.966667,1
-0.882353,0.20603,0.311475,-0.030303,-0.527187,0.159464,-0.0742955,-0.333333,1
-0.647059,-0.19598,0.344262,-0.373737,-0.834515,0.0193741,0.0367208,-0.8,0
0.176471,0.628141,0.377049,0,0,-0.174367,-0.911187,0.1,1
-0.882353,1,0.245902,-0.131313,0,0.278689,0.123826,-0.966667,0
-0.0588235,0.678392,0.737705,-0.0707071,-0.453901,0.120715,-0.925705,-0.266667,0
0.0588235,0.457286,0.311475,-0.0707071,-0.692671,0.129657,-0.52263,-0.366667,0
-0.294118,0.155779,-0.0163934,-0.212121,0,0.004471,-0.857387,-0.366667,0
-0.882353,0.125628,0.311475,-0.0909091,-0.687943,0.0372578,-0.881298,-0.9,1
-0.529412,0.457286,0.344262,-0.636364,0,-0.0312965,-0.865927,0.633333,0
0.176471,0.115578,0.147541,-0.454545,0,-0.180328,-0.9462,-0.366667,0
-0.294118,-0.0150754,-0.0491803,-0.333333,-0.550827,0.0134128,-0.699402,-0.266667,1
0.0588235,0.547739,0.278689,-0.393939,-0.763593,-0.0789866,-0.926558,-0.2,1
-0.294118,0.658291,0.114754,-0.474747,-0.602837,0.00149028,-0.527754,-0.0666667,1
-0.882353,-0.00502513,-0.0491803,-0.79798,0,-0.242921,-0.596072,0,1
0.176471,-0.316583,0.737705,-0.535354,-0.884161,0.0581222,-0.823228,-0.133333,1
-0.647059,0.236181,0.639344,-0.292929,-0.432624,0.707899,-0.315115,-0.966667,1
-0.0588235,-0.0854271,0.344262,0,0,0.0611028,-0.565329,0.566667,1
-0.294118,0.959799,0.147541,0,0,-0.0789866,-0.786507,-0.666667,0
0.0588235,0.567839,0.409836,0,0,-0.260805,-0.870196,0.0666667,0
0,-0.0653266,-0.0163934,0,0,0.052161,-0.842015,-0.866667,1
-0.647059,0.21608,-0.147541,0,0,0.0730254,-0.958155,-0.866667,0
-0.764706,0.0150754,-0.0491803,-0.656566,-0.373522,-0.278688,-0.542272,-0.933333,1
-0.764706,-0.437186,-0.0819672,-0.434343,-0.893617,-0.278688,-0.783091,-0.966667,1
0,0.628141,0.245902,-0.272727,0,0.47839,-0.755764,-0.833333,0
0,-0.0452261,0.0491803,-0.212121,-0.751773,0.329359,-0.754056,-0.966667,1
-0.529412,0.256281,0.311475,0,0,-0.0372578,-0.608881,-0.8,0
-0.411765,0.366834,0.344262,0,0,0,-0.520068,0.6,1
-0.764706,0.296482,0.213115,-0.474747,-0.515366,-0.0104321,-0.561913,-0.866667,1
-0.647059,0.306533,0.0491803,0,0,-0.311475,-0.798463,-0.966667,1
-0.882353,0.0753769,-0.180328,-0.616162,0,-0.156483,-0.912041,-0.733333,1
-0.882353,0.407035,0.213115,-0.474747,-0.574468,-0.281669,-0.359522,-0.933333,1
-0.882353,0.447236,0.344262,-0.0707071,-0.574468,0.374069,-0.780529,-0.166667,0
-0.0588235,0.0753769,0.311475,0,0,-0.266766,-0.335611,-0.566667,1
0.529412,0.58794,0.868852,0,0,0.260805,-0.847139,-0.233333,0
-0.764706,0.21608,0.147541,-0.353535,-0.775414,0.165425,-0.309991,-0.933333,1
-0.176471,0.296482,0.114754,-0.010101,-0.704492,0.147541,-0.691716,-0.266667,0
-0.764706,-0.0954774,-0.0163934,0,0,-0.299553,-0.903501,-0.866667,1
-0.176471,0.427136,0.47541,-0.515152,0.134752,-0.0938897,-0.957301,-0.266667,0
-0.647059,0.698492,0.213115,-0.616162,-0.704492,-0.108793,-0.837746,-0.666667,0
0,-0.00502513,0,0,0,-0.254843,-0.850555,-0.966667,1
-0.529412,0.276382,0.442623,-0.777778,-0.63357,0.028316,-0.555935,-0.766667,1
-0.529412,0.18593,0.147541,0,0,0.326379,-0.29462,-0.833333,1
-0.764706,0.226131,0.245902,-0.454545,-0.527187,0.0700448,-0.654142,-0.833333,1
-0.294118,0.256281,0.278689,-0.373737,0,-0.177347,-0.584116,-0.0666667,0
-0.882353,0.688442,0.442623,-0.414141,0,0.0432191,-0.293766,0.0333333,0
-0.764706,0.296482,0,0,0,0.147541,-0.807003,-0.333333,1
-0.529412,0.105528,0.245902,-0.59596,-0.763593,-0.153502,-0.965841,-0.8,1
-0.294118,-0.19598,0.311475,-0.272727,0,0.186289,-0.915457,-0.766667,1
0.176471,0.155779,0,0,0,0,-0.843723,-0.7,0
-0.764706,0.276382,-0.245902,-0.575758,-0.208038,0.0253354,-0.916311,-0.966667,1
0.0588235,0.648241,0.278689,0,0,-0.0223547,-0.940222,-0.2,0
-0.764706,-0.0653266,0.0491803,-0.353535,-0.621749,0.132638,-0.491033,-0.933333,0
-0.647059,0.58794,0.0491803,-0.737374,-0.0851064,-0.0700447,-0.814688,-0.9,1
-0.411765,0.266332,0.278689,-0.454545,-0.947991,-0.117735,-0.691716,-0.366667,1
0.176471,0.296482,0.0163934,-0.272727,0,0.228018,-0.690009,-0.433333,0
0,0.346734,-0.0491803,-0.59596,-0.312057,-0.213115,-0.766012,0,1
-0.647059,0.0251256,0.213115,0,0,-0.120715,-0.963279,-0.633333,1
-0.176471,0.879397,-0.180328,-0.333333,-0.0732861,0.0104323,-0.36123,-0.566667,0
-0.647059,0.738693,0.278689,-0.212121,-0.562648,0.00745157,-0.238258,-0.666667,0
0.176471,-0.0552764,0.180328,-0.636364,0,-0.311475,-0.558497,0.166667,1
-0.882353,0.0854271,-0.0163934,-0.0707071,-0.579196,0.0581222,-0.712212,-0.9,1
-0.411765,-0.0251256,0.245902,-0.454545,0,0.0611028,-0.743809,0.0333333,0
-0.529412,-0.165829,0.409836,-0.616162,0,-0.126677,-0.795901,-0.566667,1
-0.882353,0.145729,0.0819672,-0.272727,-0.527187,0.135618,-0.819812,0,1
-0.882353,0.497487,0.114754,-0.414141,-0.699764,-0.126677,-0.768574,-0.3,0
-0.411765,0.175879,0.409836,-0.393939,-0.751773,0.165425,-0.852263,-0.3,1
-0.882353,0.115578,0.540984,0,0,-0.0223547,-0.840307,-0.2,1
-0.529412,0.125628,0.278689,-0.191919,0,0.174367,-0.865073,-0.433333,1
-0.882353,0.165829,0.278689,-0.414141,-0.574468,0.0760059,-0.64304,-0.866667,1
0,0.417085,0.377049,-0.474747,0,-0.0342771,-0.69684,-0.966667,1
-0.764706,0.758794,0.442623,0,0,-0.317437,-0.788215,-0.966667,1
-0.764706,-0.0753769,-0.147541,0,0,-0.102832,-0.9462,-0.966667,1
-0.647059,0.306533,0.278689,-0.535354,-0.813239,-0.153502,-0.790777,-0.566667,0
-0.0588235,0.20603,0.409836,0,0,-0.153502,-0.845431,-0.966667,0
-0.764706,0.748744,0.442623,-0.252525,-0.716312,0.326379,-0.514944,-0.9,0
-0.764706,0.0653266,-0.0819672,-0.454545,-0.609929,-0.135618,-0.702818,-0.966667,1
-0.764706,0.0552764,0.229508,0,0,-0.305514,-0.588386,0.0666667,1
-0.529412,-0.0452261,-0.0163934,-0.353535,0,0.0551417,-0.824082,-0.766667,1
0,0.266332,0.409836,-0.454545,-0.716312,-0.183308,-0.626815,0,1
-0.0588235,-0.346734,0.180328,-0.535354,0,-0.0461997,-0.554227,-0.3,1
-0.764706,-0.00502513,-0.0163934,-0.656566,-0.621749,0.0909091,-0.679761,0,1
-0.882353,0.0251256,0.213115,0,0,0.177347,-0.816396,-0.3,0
0.294118,0.20603,0.311475,-0.252525,-0.64539,0.260805,-0.396243,-0.1,0
-0.647059,0.0251256,-0.278689,-0.59596,-0.777778,-0.0819672,-0.725021,-0.833333,1
-0.882353,0.0954774,-0.0491803,-0.636364,-0.725768,-0.150522,-0.87959,-0.966667,1
0.0588235,0.407035,0.540984,0,0,-0.0253353,-0.439795,-0.2,0
0.529412,0.537688,0.442623,-0.252525,-0.669031,0.210134,-0.0640478,-0.4,1
0.411765,0.00502513,0.377049,-0.333333,-0.751773,-0.105812,-0.649872,-0.166667,1
-0.882353,0.477387,0.540984,-0.171717,0,0.469449,-0.760888,-0.8,0
-0.882353,-0.18593,0.213115,-0.171717,-0.865248,0.38003,-0.130658,-0.633333,1
-0.647059,0.879397,0.147541,-0.555556,-0.527187,0.0849479,-0.71819,-0.5,0
-0.294118,0.628141,0.0163934,0,0,-0.275708,-0.914603,-0.0333333,0
-0.529412,0.366834,0.147541,0,0,-0.0700447,-0.0572161,-0.966667,0
-0.882353,0.21608,0.278689,-0.212121,-0.825059,0.162444,-0.843723,-0.766667,1
-0.647059,0.0854271,0.0163934,-0.515152,0,-0.225037,-0.876174,-0.866667,1
0,0.819095,0.442623,-0.111111,0.205674,0.290611,-0.877028,-0.833333,0
-0.0588235,0.547739,0.278689,-0.353535,0,-0.0342771,-0.688301,-0.2,0
-0.882353,0.286432,0.442623,-0.212121,-0.739953,0.0879285,-0.163962,-0.466667,0
-0.176471,0.376884,0.47541,-0.171717,0,-0.0461997,-0.732707,-0.4,1
0,0.236181,0.180328,0,0,0.0819672,-0.846285,0.0333333,0
-0.882353,0.0653266,0.245902,0,0,0.117735,-0.898377,-0.833333,1
-0.294118,0.909548,0.508197,0,0,0.0581222,-0.829206,0.5,0
-0.764706,-0.115578,-0.0491803,-0.474747,-0.962175,-0.153502,-0.412468,-0.966667,1
0.0588235,0.708543,0.213115,-0.373737,0,0.311475,-0.722459,-0.266667,0
0.0588235,-0.105528,0.0163934,0,0,-0.329359,-0.945346,-0.6,1
0.176471,0.0150754,0.245902,-0.030303,-0.574468,-0.019374,-0.920581,0.4,1
-0.764706,0.226131,0.147541,-0.454545,0,0.0968703,-0.77626,-0.8,1
-0.411765,0.21608,0.180328,-0.535354,-0.735225,-0.219076,-0.857387,-0.7,1
-0.882353,0.266332,-0.0163934,0,0,-0.102832,-0.768574,-0.133333,0
-0.882353,-0.0653266,0.147541,-0.373737,0,-0.0938897,-0.797609,-0.933333,1
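The feature values above have clearly been rescaled to roughly [-1, 1]. The exact preprocessing used to produce this file is not shown in the article, so the following is only a minimal sketch of one plausible way to obtain a similar range from a raw Pima Indians CSV (here assumed to be named pima_raw.csv, with the outcome label in the last column) using scikit-learn's MinMaxScaler:

python 复制代码
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

# Hypothetical raw file; the scaled file used by this article was prepared separately.
raw = pd.read_csv('pima_raw.csv')
features = raw.iloc[:, :-1]   # 8 medical features
labels = raw.iloc[:, -1]      # 0/1 outcome

# Rescale every feature column into [-1, 1], matching the value range seen in the rows above
scaler = MinMaxScaler(feature_range=(-1, 1))
scaled = pd.DataFrame(scaler.fit_transform(features))
scaled['label'] = labels.values
scaled.to_csv('diabetes.csv', index=False)

In a real pipeline the scaler should be fit on the training folds only and then applied to the held-out fold, otherwise information leaks across the split.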

Complete code:

python 复制代码
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader, TensorDataset
from sklearn.model_selection import KFold

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")

data = pd.read_csv('diabetes.csv')
X = data.iloc[:, :-1].values
y = data.iloc[:, -1].values

X_tensor = torch.FloatTensor(X).unsqueeze(1)
y_tensor = torch.LongTensor(y)


class DiabetesCNN(nn.Module):
    def __init__(self, input_features):
        super(DiabetesCNN, self).__init__()
        self.conv1 = nn.Conv1d(in_channels=1, out_channels=16, kernel_size=3, stride=1, padding=1)
        self.conv2 = nn.Conv1d(in_channels=16, out_channels=32, kernel_size=3, stride=1, padding=1)
        self.pool = nn.MaxPool1d(kernel_size=2, stride=2)
        self.fc1 = nn.Linear(32 * (input_features // 4), 64)
        self.fc2 = nn.Linear(64, 2)
        self.dropout = nn.Dropout(0.2)
        self.relu = nn.ReLU()

    def forward(self, x):
        x = self.relu(self.conv1(x))
        x = self.pool(x)
        x = self.relu(self.conv2(x))
        x = self.pool(x)
        x = x.view(x.size(0), -1)
        x = self.dropout(x)
        x = self.relu(self.fc1(x))
        x = self.fc2(x)
        return x


def train_model(model, train_loader, val_loader, criterion, optimizer, epochs=50):
    best_val_acc = 0.0

    # Track training and validation metrics per epoch
    train_losses = []
    val_losses = []
    train_accs = []
    val_accs = []

    for epoch in range(epochs):
        model.train()
        running_loss = 0.0
        train_correct = 0
        train_total = 0

        for inputs, labels in train_loader:
            inputs, labels = inputs.to(device), labels.to(device)

            optimizer.zero_grad()
            outputs = model(inputs)
            loss = criterion(outputs, labels)
            loss.backward()
            optimizer.step()

            running_loss += loss.item()
            _, predicted = torch.max(outputs.data, 1)
            train_total += labels.size(0)
            train_correct += (predicted == labels).sum().item()

        train_loss = running_loss / len(train_loader)
        train_acc = 100 * train_correct / train_total
        train_losses.append(train_loss)
        train_accs.append(train_acc)

        model.eval()
        val_correct = 0
        val_total = 0
        val_loss = 0.0

        with torch.no_grad():
            for inputs, labels in val_loader:
                inputs, labels = inputs.to(device), labels.to(device)
                outputs = model(inputs)
                loss = criterion(outputs, labels)
                val_loss += loss.item()
                _, predicted = torch.max(outputs.data, 1)
                val_total += labels.size(0)
                val_correct += (predicted == labels).sum().item()

        val_loss = val_loss / len(val_loader)
        val_acc = 100 * val_correct / val_total
        val_losses.append(val_loss)
        val_accs.append(val_acc)

        if val_acc > best_val_acc:
            best_val_acc = val_acc

        print(
            f'Epoch {epoch + 1}/{epochs} - Train Loss: {train_loss:.4f}, Train Acc: {train_acc:.2f}%, Val Loss: {val_loss:.4f}, Val Acc: {val_acc:.2f}%')

    return best_val_acc, train_losses, val_losses, train_accs, val_accs


# Set up 5-fold cross-validation
kfold = KFold(n_splits=5, shuffle=True, random_state=42)
batch_size = 32
input_features = X_tensor.shape[2]
epochs = 50

# Store the results of each fold
fold_results = []
all_train_losses = []
all_val_losses = []
all_train_accs = []
all_val_accs = []

for fold, (train_ids, val_ids) in enumerate(kfold.split(X_tensor)):
    print(f'\nFold {fold + 1}')
    print('-' * 20)

    X_train, X_val = X_tensor[train_ids], X_tensor[val_ids]
    y_train, y_val = y_tensor[train_ids], y_tensor[val_ids]

    train_data = TensorDataset(X_train, y_train)
    val_data = TensorDataset(X_val, y_val)
    train_loader = DataLoader(train_data, batch_size=batch_size, shuffle=True)
    val_loader = DataLoader(val_data, batch_size=batch_size, shuffle=False)

    model = DiabetesCNN(input_features).to(device)
    criterion = nn.CrossEntropyLoss()
    optimizer = optim.Adam(model.parameters(), lr=0.001)

    best_val_acc, train_losses, val_losses, train_accs, val_accs = train_model(
        model, train_loader, val_loader, criterion, optimizer, epochs)

    fold_results.append(best_val_acc)
    all_train_losses.append(train_losses)
    all_val_losses.append(val_losses)
    all_train_accs.append(train_accs)
    all_val_accs.append(val_accs)

    print(f'Fold {fold + 1} Best Validation Accuracy: {best_val_acc:.2f}%')

# Plot training and validation curves across folds
plt.figure(figsize=(15, 10))

# Training and validation loss curves
plt.subplot(2, 2, 1)
for i in range(len(all_train_losses)):
    plt.plot(all_train_losses[i], label=f'Fold {i + 1} Train')
plt.title('Training Loss Across Folds')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.legend()

plt.subplot(2, 2, 2)
for i in range(len(all_val_losses)):
    plt.plot(all_val_losses[i], label=f'Fold {i + 1} Val')
plt.title('Validation Loss Across Folds')
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.legend()

# Training and validation accuracy curves
plt.subplot(2, 2, 3)
for i in range(len(all_train_accs)):
    plt.plot(all_train_accs[i], label=f'Fold {i + 1} Train')
plt.title('Training Accuracy Across Folds')
plt.xlabel('Epoch')
plt.ylabel('Accuracy (%)')
plt.legend()

plt.subplot(2, 2, 4)
for i in range(len(all_val_accs)):
    plt.plot(all_val_accs[i], label=f'Fold {i + 1} Val')
plt.title('Validation Accuracy Across Folds')
plt.xlabel('Epoch')
plt.ylabel('Accuracy (%)')
plt.legend()

plt.tight_layout()
plt.savefig('cross_validation_curves.png')
plt.show()

# Print cross-validation summary
print('\nCross-Validation Results:')
for i, acc in enumerate(fold_results):
    print(f'Fold {i + 1}: {acc:.2f}%')
print(f'Mean Accuracy: {np.mean(fold_results):.2f}% ± {np.std(fold_results):.2f}%')

# Train the final model on the full dataset
print('\nTraining final model on all data...')
full_train_loader = DataLoader(TensorDataset(X_tensor, y_tensor), batch_size=batch_size, shuffle=True)
final_model = DiabetesCNN(input_features).to(device)
final_criterion = nn.CrossEntropyLoss()  # define a fresh criterion instead of reusing the one left over from the last fold
final_optimizer = optim.Adam(final_model.parameters(), lr=0.001)
# Note: the full training set also serves as the "validation" loader here, so the reported Val metrics are in-sample
train_model(final_model, full_train_loader, full_train_loader, final_criterion, final_optimizer, epochs)

torch.save(final_model.state_dict(), 'diabetes_cnn_model_final.pth')
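
The script saves the final weights but never loads them back for prediction. Below is a minimal, hedged inference sketch, assuming the DiabetesCNN class defined above is available in the same session and that the 8 input values are already scaled the same way as the training data; the sample values are placeholders, not real patient data:

python 复制代码
import torch
import torch.nn.functional as F

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Rebuild the architecture (DiabetesCNN as defined above) and load the saved weights
model = DiabetesCNN(input_features=8).to(device)
model.load_state_dict(torch.load('diabetes_cnn_model_final.pth', map_location=device))
model.eval()

# One already-scaled record with 8 features, reshaped to [1, 1, 8] for the Conv1d input
sample = torch.tensor([[-0.29, 0.48, 0.18, -0.29, 0.0, 0.0, -0.53, -0.03]],
                      dtype=torch.float32).unsqueeze(1).to(device)

with torch.no_grad():
    logits = model(sample)
    probs = F.softmax(logits, dim=1)      # probability for each of the two label values
    pred = probs.argmax(dim=1).item()     # predicted class index (0/1, as in the label column)

print(f'Predicted class: {pred}, probabilities: {probs.squeeze().tolist()}')

For heavier deployment one could additionally export the network with torch.jit.trace or torch.onnx.export, but a plain state_dict load like the above is enough to serve single predictions.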

Training output:

python 复制代码
Using device: cuda

Fold 1
--------------------
Epoch 1/50 - Train Loss: 0.6533, Train Acc: 66.01%, Val Loss: 0.6610, Val Acc: 63.16%
Epoch 2/50 - Train Loss: 0.6341, Train Acc: 66.01%, Val Loss: 0.6441, Val Acc: 63.16%
Epoch 3/50 - Train Loss: 0.6177, Train Acc: 66.01%, Val Loss: 0.6188, Val Acc: 63.16%
Epoch 4/50 - Train Loss: 0.5901, Train Acc: 66.17%, Val Loss: 0.5904, Val Acc: 63.16%
Epoch 5/50 - Train Loss: 0.5680, Train Acc: 69.97%, Val Loss: 0.5498, Val Acc: 74.34%
Epoch 6/50 - Train Loss: 0.5460, Train Acc: 71.95%, Val Loss: 0.5323, Val Acc: 75.00%
Epoch 7/50 - Train Loss: 0.5334, Train Acc: 72.77%, Val Loss: 0.5177, Val Acc: 75.00%
Epoch 8/50 - Train Loss: 0.5261, Train Acc: 74.09%, Val Loss: 0.5050, Val Acc: 76.32%
Epoch 9/50 - Train Loss: 0.5198, Train Acc: 73.43%, Val Loss: 0.5086, Val Acc: 76.32%
Epoch 10/50 - Train Loss: 0.5082, Train Acc: 75.25%, Val Loss: 0.4906, Val Acc: 76.97%
Epoch 11/50 - Train Loss: 0.5016, Train Acc: 74.59%, Val Loss: 0.4775, Val Acc: 78.95%
Epoch 12/50 - Train Loss: 0.4964, Train Acc: 75.41%, Val Loss: 0.4701, Val Acc: 78.95%
Epoch 13/50 - Train Loss: 0.4874, Train Acc: 75.25%, Val Loss: 0.4933, Val Acc: 75.00%
Epoch 14/50 - Train Loss: 0.4999, Train Acc: 74.92%, Val Loss: 0.4645, Val Acc: 78.95%
Epoch 15/50 - Train Loss: 0.4832, Train Acc: 77.39%, Val Loss: 0.4611, Val Acc: 78.95%
Epoch 16/50 - Train Loss: 0.4851, Train Acc: 77.56%, Val Loss: 0.4607, Val Acc: 78.29%
Epoch 17/50 - Train Loss: 0.4845, Train Acc: 75.58%, Val Loss: 0.4560, Val Acc: 79.61%
Epoch 18/50 - Train Loss: 0.4760, Train Acc: 76.40%, Val Loss: 0.4783, Val Acc: 78.29%
Epoch 19/50 - Train Loss: 0.4852, Train Acc: 75.41%, Val Loss: 0.4561, Val Acc: 78.95%
Epoch 20/50 - Train Loss: 0.4848, Train Acc: 76.90%, Val Loss: 0.4688, Val Acc: 78.95%
Epoch 21/50 - Train Loss: 0.4812, Train Acc: 75.58%, Val Loss: 0.4629, Val Acc: 78.29%
Epoch 22/50 - Train Loss: 0.4777, Train Acc: 75.58%, Val Loss: 0.4560, Val Acc: 78.95%
Epoch 23/50 - Train Loss: 0.4705, Train Acc: 76.07%, Val Loss: 0.4537, Val Acc: 78.95%
Epoch 24/50 - Train Loss: 0.4657, Train Acc: 76.73%, Val Loss: 0.4534, Val Acc: 78.29%
Epoch 25/50 - Train Loss: 0.4727, Train Acc: 76.73%, Val Loss: 0.4526, Val Acc: 77.63%
Epoch 26/50 - Train Loss: 0.4620, Train Acc: 78.55%, Val Loss: 0.4530, Val Acc: 78.29%
Epoch 27/50 - Train Loss: 0.4614, Train Acc: 77.89%, Val Loss: 0.4533, Val Acc: 77.63%
Epoch 28/50 - Train Loss: 0.4719, Train Acc: 75.58%, Val Loss: 0.4545, Val Acc: 79.61%
Epoch 29/50 - Train Loss: 0.4619, Train Acc: 78.38%, Val Loss: 0.4526, Val Acc: 76.97%
Epoch 30/50 - Train Loss: 0.4621, Train Acc: 76.90%, Val Loss: 0.4580, Val Acc: 78.95%
Epoch 31/50 - Train Loss: 0.4729, Train Acc: 78.05%, Val Loss: 0.4515, Val Acc: 76.32%
Epoch 32/50 - Train Loss: 0.4667, Train Acc: 76.07%, Val Loss: 0.4650, Val Acc: 76.97%
Epoch 33/50 - Train Loss: 0.4734, Train Acc: 77.39%, Val Loss: 0.4517, Val Acc: 75.66%
Epoch 34/50 - Train Loss: 0.4675, Train Acc: 76.90%, Val Loss: 0.4502, Val Acc: 76.97%
Epoch 35/50 - Train Loss: 0.4666, Train Acc: 77.23%, Val Loss: 0.4518, Val Acc: 76.97%
Epoch 36/50 - Train Loss: 0.4593, Train Acc: 77.39%, Val Loss: 0.4494, Val Acc: 76.32%
Epoch 37/50 - Train Loss: 0.4668, Train Acc: 77.89%, Val Loss: 0.4643, Val Acc: 76.97%
Epoch 38/50 - Train Loss: 0.4535, Train Acc: 78.55%, Val Loss: 0.4543, Val Acc: 78.29%
Epoch 39/50 - Train Loss: 0.4668, Train Acc: 75.91%, Val Loss: 0.4497, Val Acc: 75.66%
Epoch 40/50 - Train Loss: 0.4543, Train Acc: 79.04%, Val Loss: 0.4661, Val Acc: 76.97%
Epoch 41/50 - Train Loss: 0.4861, Train Acc: 77.56%, Val Loss: 0.4613, Val Acc: 77.63%
Epoch 42/50 - Train Loss: 0.4618, Train Acc: 78.05%, Val Loss: 0.4803, Val Acc: 73.68%
Epoch 43/50 - Train Loss: 0.4620, Train Acc: 78.55%, Val Loss: 0.4524, Val Acc: 77.63%
Epoch 44/50 - Train Loss: 0.4521, Train Acc: 79.37%, Val Loss: 0.4543, Val Acc: 76.32%
Epoch 45/50 - Train Loss: 0.4604, Train Acc: 76.57%, Val Loss: 0.4503, Val Acc: 76.32%
Epoch 46/50 - Train Loss: 0.4644, Train Acc: 76.57%, Val Loss: 0.4499, Val Acc: 76.32%
Epoch 47/50 - Train Loss: 0.4656, Train Acc: 77.06%, Val Loss: 0.4528, Val Acc: 76.32%
Epoch 48/50 - Train Loss: 0.4666, Train Acc: 76.57%, Val Loss: 0.4548, Val Acc: 76.32%
Epoch 49/50 - Train Loss: 0.4543, Train Acc: 77.89%, Val Loss: 0.4594, Val Acc: 76.97%
Epoch 50/50 - Train Loss: 0.4581, Train Acc: 77.06%, Val Loss: 0.4497, Val Acc: 76.97%
Fold 1 Best Validation Accuracy: 79.61%

Fold 2
--------------------
Epoch 1/50 - Train Loss: 0.6640, Train Acc: 65.35%, Val Loss: 0.6394, Val Acc: 65.79%
Epoch 2/50 - Train Loss: 0.6409, Train Acc: 65.35%, Val Loss: 0.6258, Val Acc: 65.79%
Epoch 3/50 - Train Loss: 0.6235, Train Acc: 65.35%, Val Loss: 0.6071, Val Acc: 65.79%
Epoch 4/50 - Train Loss: 0.5903, Train Acc: 65.51%, Val Loss: 0.5726, Val Acc: 65.79%
Epoch 5/50 - Train Loss: 0.5389, Train Acc: 73.43%, Val Loss: 0.5337, Val Acc: 71.71%
Epoch 6/50 - Train Loss: 0.5072, Train Acc: 75.91%, Val Loss: 0.5123, Val Acc: 73.03%
Epoch 7/50 - Train Loss: 0.4944, Train Acc: 76.73%, Val Loss: 0.5127, Val Acc: 72.37%
Epoch 8/50 - Train Loss: 0.4844, Train Acc: 77.06%, Val Loss: 0.5140, Val Acc: 72.37%
Epoch 9/50 - Train Loss: 0.4836, Train Acc: 77.23%, Val Loss: 0.5103, Val Acc: 73.68%
Epoch 10/50 - Train Loss: 0.4751, Train Acc: 78.88%, Val Loss: 0.5065, Val Acc: 74.34%
Epoch 11/50 - Train Loss: 0.4788, Train Acc: 77.72%, Val Loss: 0.5186, Val Acc: 73.68%
Epoch 12/50 - Train Loss: 0.4762, Train Acc: 77.39%, Val Loss: 0.5045, Val Acc: 74.34%
Epoch 13/50 - Train Loss: 0.4756, Train Acc: 77.39%, Val Loss: 0.5042, Val Acc: 74.34%
Epoch 14/50 - Train Loss: 0.4722, Train Acc: 78.22%, Val Loss: 0.5114, Val Acc: 74.34%
Epoch 15/50 - Train Loss: 0.4661, Train Acc: 77.56%, Val Loss: 0.4998, Val Acc: 72.37%
Epoch 16/50 - Train Loss: 0.4609, Train Acc: 78.55%, Val Loss: 0.5048, Val Acc: 75.00%
Epoch 17/50 - Train Loss: 0.4646, Train Acc: 77.72%, Val Loss: 0.5072, Val Acc: 75.00%
Epoch 18/50 - Train Loss: 0.4552, Train Acc: 78.88%, Val Loss: 0.5006, Val Acc: 74.34%
Epoch 19/50 - Train Loss: 0.4650, Train Acc: 77.23%, Val Loss: 0.5076, Val Acc: 73.68%
Epoch 20/50 - Train Loss: 0.4602, Train Acc: 78.55%, Val Loss: 0.4982, Val Acc: 73.68%
Epoch 21/50 - Train Loss: 0.4453, Train Acc: 78.88%, Val Loss: 0.4963, Val Acc: 72.37%
Epoch 22/50 - Train Loss: 0.4563, Train Acc: 78.22%, Val Loss: 0.4963, Val Acc: 72.37%
Epoch 23/50 - Train Loss: 0.4509, Train Acc: 77.56%, Val Loss: 0.4944, Val Acc: 72.37%
Epoch 24/50 - Train Loss: 0.4514, Train Acc: 78.05%, Val Loss: 0.4942, Val Acc: 73.68%
Epoch 25/50 - Train Loss: 0.4573, Train Acc: 77.89%, Val Loss: 0.4957, Val Acc: 72.37%
Epoch 26/50 - Train Loss: 0.4485, Train Acc: 79.37%, Val Loss: 0.4927, Val Acc: 72.37%
Epoch 27/50 - Train Loss: 0.4527, Train Acc: 78.71%, Val Loss: 0.5123, Val Acc: 75.00%
Epoch 28/50 - Train Loss: 0.4556, Train Acc: 77.72%, Val Loss: 0.4931, Val Acc: 73.68%
Epoch 29/50 - Train Loss: 0.4553, Train Acc: 78.55%, Val Loss: 0.4974, Val Acc: 73.03%
Epoch 30/50 - Train Loss: 0.4515, Train Acc: 76.90%, Val Loss: 0.4936, Val Acc: 72.37%
Epoch 31/50 - Train Loss: 0.4510, Train Acc: 77.89%, Val Loss: 0.4923, Val Acc: 73.03%
Epoch 32/50 - Train Loss: 0.4494, Train Acc: 78.88%, Val Loss: 0.4925, Val Acc: 73.03%
Epoch 33/50 - Train Loss: 0.4491, Train Acc: 77.72%, Val Loss: 0.4975, Val Acc: 72.37%
Epoch 34/50 - Train Loss: 0.4485, Train Acc: 77.72%, Val Loss: 0.4951, Val Acc: 73.03%
Epoch 35/50 - Train Loss: 0.4502, Train Acc: 78.38%, Val Loss: 0.4985, Val Acc: 73.68%
Epoch 36/50 - Train Loss: 0.4482, Train Acc: 77.23%, Val Loss: 0.4959, Val Acc: 73.03%
Epoch 37/50 - Train Loss: 0.4556, Train Acc: 77.06%, Val Loss: 0.4963, Val Acc: 73.03%
Epoch 38/50 - Train Loss: 0.4464, Train Acc: 78.55%, Val Loss: 0.4935, Val Acc: 72.37%
Epoch 39/50 - Train Loss: 0.4383, Train Acc: 78.88%, Val Loss: 0.5050, Val Acc: 73.68%
Epoch 40/50 - Train Loss: 0.4405, Train Acc: 79.21%, Val Loss: 0.4977, Val Acc: 72.37%
Epoch 41/50 - Train Loss: 0.4412, Train Acc: 78.38%, Val Loss: 0.4950, Val Acc: 73.03%
Epoch 42/50 - Train Loss: 0.4395, Train Acc: 78.05%, Val Loss: 0.4951, Val Acc: 73.03%
Epoch 43/50 - Train Loss: 0.4321, Train Acc: 78.55%, Val Loss: 0.4933, Val Acc: 72.37%
Epoch 44/50 - Train Loss: 0.4350, Train Acc: 78.71%, Val Loss: 0.4980, Val Acc: 72.37%
Epoch 45/50 - Train Loss: 0.4406, Train Acc: 78.88%, Val Loss: 0.5048, Val Acc: 73.68%
Epoch 46/50 - Train Loss: 0.4460, Train Acc: 78.05%, Val Loss: 0.5150, Val Acc: 73.03%
Epoch 47/50 - Train Loss: 0.4367, Train Acc: 78.38%, Val Loss: 0.4979, Val Acc: 71.71%
Epoch 48/50 - Train Loss: 0.4368, Train Acc: 78.71%, Val Loss: 0.5003, Val Acc: 73.03%
Epoch 49/50 - Train Loss: 0.4306, Train Acc: 79.87%, Val Loss: 0.5001, Val Acc: 72.37%
Epoch 50/50 - Train Loss: 0.4299, Train Acc: 78.55%, Val Loss: 0.5014, Val Acc: 73.03%
Fold 2 Best Validation Accuracy: 75.00%

Fold 3
--------------------
Epoch 1/50 - Train Loss: 0.6494, Train Acc: 65.51%, Val Loss: 0.6422, Val Acc: 65.13%
Epoch 2/50 - Train Loss: 0.6302, Train Acc: 65.51%, Val Loss: 0.6254, Val Acc: 65.13%
Epoch 3/50 - Train Loss: 0.6113, Train Acc: 65.51%, Val Loss: 0.5903, Val Acc: 65.13%
Epoch 4/50 - Train Loss: 0.5717, Train Acc: 65.84%, Val Loss: 0.5344, Val Acc: 72.37%
Epoch 5/50 - Train Loss: 0.5309, Train Acc: 73.10%, Val Loss: 0.5028, Val Acc: 79.61%
Epoch 6/50 - Train Loss: 0.5104, Train Acc: 74.42%, Val Loss: 0.4957, Val Acc: 75.00%
Epoch 7/50 - Train Loss: 0.4976, Train Acc: 75.08%, Val Loss: 0.4904, Val Acc: 76.32%
Epoch 8/50 - Train Loss: 0.4892, Train Acc: 75.91%, Val Loss: 0.4909, Val Acc: 77.63%
Epoch 9/50 - Train Loss: 0.4842, Train Acc: 77.89%, Val Loss: 0.4917, Val Acc: 76.97%
Epoch 10/50 - Train Loss: 0.4772, Train Acc: 77.06%, Val Loss: 0.4918, Val Acc: 75.00%
Epoch 11/50 - Train Loss: 0.4692, Train Acc: 77.23%, Val Loss: 0.4856, Val Acc: 75.66%
Epoch 12/50 - Train Loss: 0.4725, Train Acc: 77.39%, Val Loss: 0.4877, Val Acc: 75.66%
Epoch 13/50 - Train Loss: 0.4696, Train Acc: 78.22%, Val Loss: 0.4906, Val Acc: 75.66%
Epoch 14/50 - Train Loss: 0.4563, Train Acc: 77.39%, Val Loss: 0.4978, Val Acc: 76.32%
Epoch 15/50 - Train Loss: 0.4770, Train Acc: 78.71%, Val Loss: 0.4913, Val Acc: 74.34%
Epoch 16/50 - Train Loss: 0.4558, Train Acc: 77.06%, Val Loss: 0.4873, Val Acc: 75.00%
Epoch 17/50 - Train Loss: 0.4651, Train Acc: 76.07%, Val Loss: 0.4836, Val Acc: 75.66%
Epoch 18/50 - Train Loss: 0.4612, Train Acc: 76.73%, Val Loss: 0.4977, Val Acc: 76.97%
Epoch 19/50 - Train Loss: 0.4715, Train Acc: 76.90%, Val Loss: 0.4852, Val Acc: 75.66%
Epoch 20/50 - Train Loss: 0.4554, Train Acc: 78.22%, Val Loss: 0.4901, Val Acc: 75.00%
Epoch 21/50 - Train Loss: 0.4498, Train Acc: 78.55%, Val Loss: 0.4899, Val Acc: 75.66%
Epoch 22/50 - Train Loss: 0.4596, Train Acc: 77.23%, Val Loss: 0.4937, Val Acc: 74.34%
Epoch 23/50 - Train Loss: 0.4470, Train Acc: 77.39%, Val Loss: 0.4918, Val Acc: 75.66%
Epoch 24/50 - Train Loss: 0.4565, Train Acc: 77.06%, Val Loss: 0.4913, Val Acc: 75.66%
Epoch 25/50 - Train Loss: 0.4567, Train Acc: 77.89%, Val Loss: 0.4889, Val Acc: 75.66%
Epoch 26/50 - Train Loss: 0.4606, Train Acc: 76.40%, Val Loss: 0.5028, Val Acc: 76.32%
Epoch 27/50 - Train Loss: 0.4487, Train Acc: 77.89%, Val Loss: 0.4901, Val Acc: 74.34%
Epoch 28/50 - Train Loss: 0.4459, Train Acc: 78.22%, Val Loss: 0.4908, Val Acc: 75.00%
Epoch 29/50 - Train Loss: 0.4413, Train Acc: 78.38%, Val Loss: 0.4937, Val Acc: 76.32%
Epoch 30/50 - Train Loss: 0.4524, Train Acc: 77.06%, Val Loss: 0.4896, Val Acc: 74.34%
Epoch 31/50 - Train Loss: 0.4493, Train Acc: 78.55%, Val Loss: 0.4888, Val Acc: 76.32%
Epoch 32/50 - Train Loss: 0.4405, Train Acc: 77.56%, Val Loss: 0.4917, Val Acc: 75.00%
Epoch 33/50 - Train Loss: 0.4388, Train Acc: 78.71%, Val Loss: 0.4969, Val Acc: 75.00%
Epoch 34/50 - Train Loss: 0.4374, Train Acc: 78.55%, Val Loss: 0.4886, Val Acc: 75.66%
Epoch 35/50 - Train Loss: 0.4364, Train Acc: 78.22%, Val Loss: 0.4927, Val Acc: 76.32%
Epoch 36/50 - Train Loss: 0.4422, Train Acc: 78.88%, Val Loss: 0.4885, Val Acc: 74.34%
Epoch 37/50 - Train Loss: 0.4411, Train Acc: 77.23%, Val Loss: 0.4929, Val Acc: 74.34%
Epoch 38/50 - Train Loss: 0.4419, Train Acc: 78.38%, Val Loss: 0.5028, Val Acc: 74.34%
Epoch 39/50 - Train Loss: 0.4322, Train Acc: 78.71%, Val Loss: 0.4894, Val Acc: 76.97%
Epoch 40/50 - Train Loss: 0.4336, Train Acc: 78.55%, Val Loss: 0.4869, Val Acc: 73.68%
Epoch 41/50 - Train Loss: 0.4388, Train Acc: 78.05%, Val Loss: 0.4958, Val Acc: 74.34%
Epoch 42/50 - Train Loss: 0.4402, Train Acc: 79.21%, Val Loss: 0.4871, Val Acc: 73.68%
Epoch 43/50 - Train Loss: 0.4409, Train Acc: 79.21%, Val Loss: 0.4876, Val Acc: 74.34%
Epoch 44/50 - Train Loss: 0.4325, Train Acc: 79.87%, Val Loss: 0.5009, Val Acc: 74.34%
Epoch 45/50 - Train Loss: 0.4363, Train Acc: 77.89%, Val Loss: 0.4843, Val Acc: 73.03%
Epoch 46/50 - Train Loss: 0.4277, Train Acc: 80.69%, Val Loss: 0.5052, Val Acc: 74.34%
Epoch 47/50 - Train Loss: 0.4287, Train Acc: 78.22%, Val Loss: 0.4915, Val Acc: 75.00%
Epoch 48/50 - Train Loss: 0.4254, Train Acc: 79.04%, Val Loss: 0.4881, Val Acc: 75.00%
Epoch 49/50 - Train Loss: 0.4176, Train Acc: 80.86%, Val Loss: 0.4984, Val Acc: 73.68%
Epoch 50/50 - Train Loss: 0.4282, Train Acc: 79.04%, Val Loss: 0.4924, Val Acc: 75.00%
Fold 3 Best Validation Accuracy: 79.61%

Fold 4
--------------------
Epoch 1/50 - Train Loss: 0.6630, Train Acc: 65.24%, Val Loss: 0.6393, Val Acc: 65.56%
Epoch 2/50 - Train Loss: 0.6328, Train Acc: 65.40%, Val Loss: 0.6170, Val Acc: 65.56%
Epoch 3/50 - Train Loss: 0.6088, Train Acc: 65.40%, Val Loss: 0.5834, Val Acc: 65.56%
Epoch 4/50 - Train Loss: 0.5712, Train Acc: 65.90%, Val Loss: 0.5335, Val Acc: 72.85%
Epoch 5/50 - Train Loss: 0.5322, Train Acc: 73.97%, Val Loss: 0.5047, Val Acc: 77.48%
Epoch 6/50 - Train Loss: 0.5108, Train Acc: 75.78%, Val Loss: 0.4954, Val Acc: 78.15%
Epoch 7/50 - Train Loss: 0.4963, Train Acc: 75.45%, Val Loss: 0.4927, Val Acc: 76.16%
Epoch 8/50 - Train Loss: 0.4993, Train Acc: 74.79%, Val Loss: 0.5034, Val Acc: 75.50%
Epoch 9/50 - Train Loss: 0.4875, Train Acc: 77.10%, Val Loss: 0.4910, Val Acc: 76.16%
Epoch 10/50 - Train Loss: 0.4803, Train Acc: 76.11%, Val Loss: 0.4888, Val Acc: 75.50%
Epoch 11/50 - Train Loss: 0.4776, Train Acc: 77.27%, Val Loss: 0.4890, Val Acc: 75.50%
Epoch 12/50 - Train Loss: 0.4692, Train Acc: 76.77%, Val Loss: 0.4994, Val Acc: 75.50%
Epoch 13/50 - Train Loss: 0.4684, Train Acc: 77.59%, Val Loss: 0.4964, Val Acc: 74.83%
Epoch 14/50 - Train Loss: 0.4727, Train Acc: 76.77%, Val Loss: 0.4945, Val Acc: 76.16%
Epoch 15/50 - Train Loss: 0.4748, Train Acc: 76.94%, Val Loss: 0.4969, Val Acc: 74.83%
Epoch 16/50 - Train Loss: 0.4744, Train Acc: 75.62%, Val Loss: 0.4988, Val Acc: 74.17%
Epoch 17/50 - Train Loss: 0.4664, Train Acc: 77.43%, Val Loss: 0.4955, Val Acc: 73.51%
Epoch 18/50 - Train Loss: 0.4537, Train Acc: 76.94%, Val Loss: 0.5028, Val Acc: 74.83%
Epoch 19/50 - Train Loss: 0.4553, Train Acc: 78.42%, Val Loss: 0.4979, Val Acc: 73.51%
Epoch 20/50 - Train Loss: 0.4581, Train Acc: 77.59%, Val Loss: 0.5114, Val Acc: 75.50%
Epoch 21/50 - Train Loss: 0.4518, Train Acc: 77.59%, Val Loss: 0.5026, Val Acc: 74.83%
Epoch 22/50 - Train Loss: 0.4467, Train Acc: 77.43%, Val Loss: 0.5110, Val Acc: 76.16%
Epoch 23/50 - Train Loss: 0.4523, Train Acc: 76.94%, Val Loss: 0.5065, Val Acc: 73.51%
Epoch 24/50 - Train Loss: 0.4566, Train Acc: 76.61%, Val Loss: 0.5082, Val Acc: 74.83%
Epoch 25/50 - Train Loss: 0.4521, Train Acc: 77.76%, Val Loss: 0.5120, Val Acc: 74.83%
Epoch 26/50 - Train Loss: 0.4452, Train Acc: 76.94%, Val Loss: 0.5121, Val Acc: 76.82%
Epoch 27/50 - Train Loss: 0.4458, Train Acc: 77.27%, Val Loss: 0.5089, Val Acc: 74.17%
Epoch 28/50 - Train Loss: 0.4402, Train Acc: 78.25%, Val Loss: 0.5059, Val Acc: 74.83%
Epoch 29/50 - Train Loss: 0.4457, Train Acc: 77.59%, Val Loss: 0.5144, Val Acc: 76.16%
Epoch 30/50 - Train Loss: 0.4384, Train Acc: 78.75%, Val Loss: 0.5100, Val Acc: 74.17%
Epoch 31/50 - Train Loss: 0.4437, Train Acc: 78.58%, Val Loss: 0.5081, Val Acc: 74.17%
Epoch 32/50 - Train Loss: 0.4391, Train Acc: 78.09%, Val Loss: 0.5129, Val Acc: 74.83%
Epoch 33/50 - Train Loss: 0.4287, Train Acc: 78.75%, Val Loss: 0.5157, Val Acc: 75.50%
Epoch 34/50 - Train Loss: 0.4325, Train Acc: 79.41%, Val Loss: 0.5123, Val Acc: 75.50%
Epoch 35/50 - Train Loss: 0.4422, Train Acc: 77.92%, Val Loss: 0.5269, Val Acc: 76.82%
Epoch 36/50 - Train Loss: 0.4359, Train Acc: 77.92%, Val Loss: 0.5165, Val Acc: 76.16%
Epoch 37/50 - Train Loss: 0.4300, Train Acc: 78.25%, Val Loss: 0.5255, Val Acc: 75.50%
Epoch 38/50 - Train Loss: 0.4376, Train Acc: 76.77%, Val Loss: 0.5212, Val Acc: 74.83%
Epoch 39/50 - Train Loss: 0.4353, Train Acc: 78.91%, Val Loss: 0.5305, Val Acc: 76.82%
Epoch 40/50 - Train Loss: 0.4313, Train Acc: 80.07%, Val Loss: 0.5250, Val Acc: 75.50%
Epoch 41/50 - Train Loss: 0.4280, Train Acc: 79.74%, Val Loss: 0.5210, Val Acc: 74.83%
Epoch 42/50 - Train Loss: 0.4172, Train Acc: 79.90%, Val Loss: 0.5238, Val Acc: 76.16%
Epoch 43/50 - Train Loss: 0.4194, Train Acc: 80.40%, Val Loss: 0.5227, Val Acc: 76.16%
Epoch 44/50 - Train Loss: 0.4211, Train Acc: 78.91%, Val Loss: 0.5325, Val Acc: 74.83%
Epoch 45/50 - Train Loss: 0.4264, Train Acc: 80.07%, Val Loss: 0.5316, Val Acc: 75.50%
Epoch 46/50 - Train Loss: 0.4271, Train Acc: 80.07%, Val Loss: 0.5367, Val Acc: 74.17%
Epoch 47/50 - Train Loss: 0.4203, Train Acc: 78.42%, Val Loss: 0.5378, Val Acc: 74.83%
Epoch 48/50 - Train Loss: 0.4254, Train Acc: 78.58%, Val Loss: 0.5564, Val Acc: 75.50%
Epoch 49/50 - Train Loss: 0.4312, Train Acc: 79.74%, Val Loss: 0.5338, Val Acc: 74.17%
Epoch 50/50 - Train Loss: 0.4215, Train Acc: 77.76%, Val Loss: 0.5374, Val Acc: 75.50%
Fold 4 Best Validation Accuracy: 78.15%

Fold 5
--------------------
Epoch 1/50 - Train Loss: 0.6627, Train Acc: 65.24%, Val Loss: 0.6267, Val Acc: 67.55%
Epoch 2/50 - Train Loss: 0.6166, Train Acc: 64.91%, Val Loss: 0.5772, Val Acc: 67.55%
Epoch 3/50 - Train Loss: 0.5680, Train Acc: 68.53%, Val Loss: 0.5351, Val Acc: 74.83%
Epoch 4/50 - Train Loss: 0.5262, Train Acc: 74.30%, Val Loss: 0.5070, Val Acc: 73.51%
Epoch 5/50 - Train Loss: 0.5130, Train Acc: 74.96%, Val Loss: 0.4970, Val Acc: 72.85%
Epoch 6/50 - Train Loss: 0.5019, Train Acc: 76.44%, Val Loss: 0.4944, Val Acc: 75.50%
Epoch 7/50 - Train Loss: 0.4911, Train Acc: 76.11%, Val Loss: 0.4868, Val Acc: 76.82%
Epoch 8/50 - Train Loss: 0.4987, Train Acc: 76.44%, Val Loss: 0.4904, Val Acc: 77.48%
Epoch 9/50 - Train Loss: 0.4776, Train Acc: 76.28%, Val Loss: 0.4836, Val Acc: 77.48%
Epoch 10/50 - Train Loss: 0.4686, Train Acc: 78.75%, Val Loss: 0.4831, Val Acc: 77.48%
Epoch 11/50 - Train Loss: 0.4710, Train Acc: 76.61%, Val Loss: 0.4853, Val Acc: 77.48%
Epoch 12/50 - Train Loss: 0.4687, Train Acc: 76.61%, Val Loss: 0.4788, Val Acc: 78.81%
Epoch 13/50 - Train Loss: 0.4642, Train Acc: 77.76%, Val Loss: 0.4840, Val Acc: 77.48%
Epoch 14/50 - Train Loss: 0.4745, Train Acc: 77.43%, Val Loss: 0.4764, Val Acc: 80.13%
Epoch 15/50 - Train Loss: 0.4625, Train Acc: 77.59%, Val Loss: 0.4807, Val Acc: 78.15%
Epoch 16/50 - Train Loss: 0.4713, Train Acc: 77.10%, Val Loss: 0.4799, Val Acc: 77.48%
Epoch 17/50 - Train Loss: 0.4714, Train Acc: 77.92%, Val Loss: 0.4779, Val Acc: 77.48%
Epoch 18/50 - Train Loss: 0.4769, Train Acc: 76.94%, Val Loss: 0.4726, Val Acc: 79.47%
Epoch 19/50 - Train Loss: 0.4655, Train Acc: 76.94%, Val Loss: 0.4719, Val Acc: 80.79%
Epoch 20/50 - Train Loss: 0.4779, Train Acc: 77.59%, Val Loss: 0.5113, Val Acc: 74.17%
Epoch 21/50 - Train Loss: 0.4785, Train Acc: 76.61%, Val Loss: 0.4762, Val Acc: 80.13%
Epoch 22/50 - Train Loss: 0.4626, Train Acc: 76.94%, Val Loss: 0.4739, Val Acc: 79.47%
Epoch 23/50 - Train Loss: 0.4605, Train Acc: 76.61%, Val Loss: 0.4751, Val Acc: 78.81%
Epoch 24/50 - Train Loss: 0.4572, Train Acc: 77.10%, Val Loss: 0.4788, Val Acc: 79.47%
Epoch 25/50 - Train Loss: 0.4560, Train Acc: 76.94%, Val Loss: 0.4760, Val Acc: 80.13%
Epoch 26/50 - Train Loss: 0.4508, Train Acc: 78.58%, Val Loss: 0.4761, Val Acc: 78.81%
Epoch 27/50 - Train Loss: 0.4465, Train Acc: 79.24%, Val Loss: 0.4785, Val Acc: 78.81%
Epoch 28/50 - Train Loss: 0.4634, Train Acc: 76.94%, Val Loss: 0.4888, Val Acc: 76.16%
Epoch 29/50 - Train Loss: 0.4708, Train Acc: 77.27%, Val Loss: 0.4783, Val Acc: 79.47%
Epoch 30/50 - Train Loss: 0.4641, Train Acc: 77.92%, Val Loss: 0.4778, Val Acc: 80.79%
Epoch 31/50 - Train Loss: 0.4507, Train Acc: 78.25%, Val Loss: 0.4802, Val Acc: 78.15%
Epoch 32/50 - Train Loss: 0.4472, Train Acc: 77.92%, Val Loss: 0.4800, Val Acc: 79.47%
Epoch 33/50 - Train Loss: 0.4566, Train Acc: 76.94%, Val Loss: 0.4782, Val Acc: 78.15%
Epoch 34/50 - Train Loss: 0.4451, Train Acc: 80.07%, Val Loss: 0.4798, Val Acc: 80.79%
Epoch 35/50 - Train Loss: 0.4543, Train Acc: 77.27%, Val Loss: 0.4804, Val Acc: 78.15%
Epoch 36/50 - Train Loss: 0.4409, Train Acc: 79.41%, Val Loss: 0.4821, Val Acc: 80.79%
Epoch 37/50 - Train Loss: 0.4566, Train Acc: 77.27%, Val Loss: 0.4916, Val Acc: 77.48%
Epoch 38/50 - Train Loss: 0.4543, Train Acc: 77.76%, Val Loss: 0.4823, Val Acc: 80.13%
Epoch 39/50 - Train Loss: 0.4439, Train Acc: 78.58%, Val Loss: 0.4920, Val Acc: 76.82%
Epoch 40/50 - Train Loss: 0.4469, Train Acc: 76.77%, Val Loss: 0.4888, Val Acc: 78.15%
Epoch 41/50 - Train Loss: 0.4495, Train Acc: 77.92%, Val Loss: 0.4895, Val Acc: 77.48%
Epoch 42/50 - Train Loss: 0.4534, Train Acc: 77.59%, Val Loss: 0.4834, Val Acc: 80.13%
Epoch 43/50 - Train Loss: 0.4397, Train Acc: 78.09%, Val Loss: 0.4973, Val Acc: 78.15%
Epoch 44/50 - Train Loss: 0.4358, Train Acc: 77.92%, Val Loss: 0.4930, Val Acc: 79.47%
Epoch 45/50 - Train Loss: 0.4422, Train Acc: 78.42%, Val Loss: 0.4865, Val Acc: 79.47%
Epoch 46/50 - Train Loss: 0.4337, Train Acc: 78.09%, Val Loss: 0.4867, Val Acc: 78.81%
Epoch 47/50 - Train Loss: 0.4344, Train Acc: 79.41%, Val Loss: 0.5084, Val Acc: 78.81%
Epoch 48/50 - Train Loss: 0.4415, Train Acc: 78.25%, Val Loss: 0.4896, Val Acc: 80.13%
Epoch 49/50 - Train Loss: 0.4452, Train Acc: 78.75%, Val Loss: 0.5051, Val Acc: 76.82%
Epoch 50/50 - Train Loss: 0.4461, Train Acc: 77.92%, Val Loss: 0.4913, Val Acc: 80.13%
Fold 5 Best Validation Accuracy: 80.79%

Cross-Validation Results:
Fold 1: 79.61%
Fold 2: 75.00%
Fold 3: 79.61%
Fold 4: 78.15%
Fold 5: 80.79%
Mean Accuracy: 78.63% ± 2.00%

Training final model on all data...
Epoch 1/50 - Train Loss: 0.6579, Train Acc: 65.44%, Val Loss: 0.6412, Val Acc: 65.44%
Epoch 2/50 - Train Loss: 0.6326, Train Acc: 65.44%, Val Loss: 0.6137, Val Acc: 65.30%
Epoch 3/50 - Train Loss: 0.5963, Train Acc: 67.68%, Val Loss: 0.5641, Val Acc: 68.21%
Epoch 4/50 - Train Loss: 0.5485, Train Acc: 72.82%, Val Loss: 0.5203, Val Acc: 74.67%
Epoch 5/50 - Train Loss: 0.5089, Train Acc: 75.07%, Val Loss: 0.4883, Val Acc: 76.52%
Epoch 6/50 - Train Loss: 0.5042, Train Acc: 75.20%, Val Loss: 0.4894, Val Acc: 74.67%
Epoch 7/50 - Train Loss: 0.4966, Train Acc: 75.20%, Val Loss: 0.4889, Val Acc: 75.99%
Epoch 8/50 - Train Loss: 0.4998, Train Acc: 75.59%, Val Loss: 0.4727, Val Acc: 77.44%
Epoch 9/50 - Train Loss: 0.4800, Train Acc: 77.18%, Val Loss: 0.4772, Val Acc: 76.91%
Epoch 10/50 - Train Loss: 0.4759, Train Acc: 76.91%, Val Loss: 0.4655, Val Acc: 77.57%
Epoch 11/50 - Train Loss: 0.4859, Train Acc: 75.59%, Val Loss: 0.4699, Val Acc: 76.65%
Epoch 12/50 - Train Loss: 0.4742, Train Acc: 77.04%, Val Loss: 0.4592, Val Acc: 77.44%
Epoch 13/50 - Train Loss: 0.4719, Train Acc: 77.70%, Val Loss: 0.4567, Val Acc: 77.44%
Epoch 14/50 - Train Loss: 0.4647, Train Acc: 76.91%, Val Loss: 0.4554, Val Acc: 76.78%
Epoch 15/50 - Train Loss: 0.4702, Train Acc: 77.57%, Val Loss: 0.4592, Val Acc: 77.04%
Epoch 16/50 - Train Loss: 0.4590, Train Acc: 77.57%, Val Loss: 0.4624, Val Acc: 77.97%
Epoch 17/50 - Train Loss: 0.4595, Train Acc: 76.91%, Val Loss: 0.4510, Val Acc: 77.04%
Epoch 18/50 - Train Loss: 0.4603, Train Acc: 77.97%, Val Loss: 0.4477, Val Acc: 77.57%
Epoch 19/50 - Train Loss: 0.4571, Train Acc: 77.44%, Val Loss: 0.4526, Val Acc: 77.44%
Epoch 20/50 - Train Loss: 0.4683, Train Acc: 76.78%, Val Loss: 0.4433, Val Acc: 78.50%
Epoch 21/50 - Train Loss: 0.4516, Train Acc: 78.23%, Val Loss: 0.4423, Val Acc: 77.31%
Epoch 22/50 - Train Loss: 0.4614, Train Acc: 77.31%, Val Loss: 0.4445, Val Acc: 77.84%
Epoch 23/50 - Train Loss: 0.4556, Train Acc: 77.84%, Val Loss: 0.4442, Val Acc: 77.84%
Epoch 24/50 - Train Loss: 0.4539, Train Acc: 78.50%, Val Loss: 0.4390, Val Acc: 78.63%
Epoch 25/50 - Train Loss: 0.4505, Train Acc: 77.44%, Val Loss: 0.4401, Val Acc: 79.02%
Epoch 26/50 - Train Loss: 0.4488, Train Acc: 77.57%, Val Loss: 0.4440, Val Acc: 78.63%
Epoch 27/50 - Train Loss: 0.4586, Train Acc: 77.31%, Val Loss: 0.4377, Val Acc: 78.76%
Epoch 28/50 - Train Loss: 0.4456, Train Acc: 78.63%, Val Loss: 0.4363, Val Acc: 77.97%
Epoch 29/50 - Train Loss: 0.4520, Train Acc: 77.97%, Val Loss: 0.4350, Val Acc: 78.63%
Epoch 30/50 - Train Loss: 0.4484, Train Acc: 77.57%, Val Loss: 0.4327, Val Acc: 78.76%
Epoch 31/50 - Train Loss: 0.4411, Train Acc: 79.16%, Val Loss: 0.4286, Val Acc: 78.89%
Epoch 32/50 - Train Loss: 0.4417, Train Acc: 79.68%, Val Loss: 0.4295, Val Acc: 78.89%
Epoch 33/50 - Train Loss: 0.4361, Train Acc: 78.50%, Val Loss: 0.4324, Val Acc: 78.89%
Epoch 34/50 - Train Loss: 0.4443, Train Acc: 77.44%, Val Loss: 0.4270, Val Acc: 79.68%
Epoch 35/50 - Train Loss: 0.4349, Train Acc: 77.97%, Val Loss: 0.4343, Val Acc: 78.76%
Epoch 36/50 - Train Loss: 0.4441, Train Acc: 77.84%, Val Loss: 0.4323, Val Acc: 78.76%
Epoch 37/50 - Train Loss: 0.4460, Train Acc: 78.76%, Val Loss: 0.4289, Val Acc: 79.55%
Epoch 38/50 - Train Loss: 0.4376, Train Acc: 78.50%, Val Loss: 0.4222, Val Acc: 79.68%
Epoch 39/50 - Train Loss: 0.4421, Train Acc: 78.36%, Val Loss: 0.4200, Val Acc: 79.29%
Epoch 40/50 - Train Loss: 0.4316, Train Acc: 79.16%, Val Loss: 0.4294, Val Acc: 79.82%
Epoch 41/50 - Train Loss: 0.4409, Train Acc: 78.76%, Val Loss: 0.4213, Val Acc: 79.95%
Epoch 42/50 - Train Loss: 0.4304, Train Acc: 78.89%, Val Loss: 0.4472, Val Acc: 77.57%
Epoch 43/50 - Train Loss: 0.4352, Train Acc: 80.21%, Val Loss: 0.4219, Val Acc: 79.82%
Epoch 44/50 - Train Loss: 0.4413, Train Acc: 79.29%, Val Loss: 0.4146, Val Acc: 79.29%
Epoch 45/50 - Train Loss: 0.4246, Train Acc: 79.16%, Val Loss: 0.4167, Val Acc: 80.21%
Epoch 46/50 - Train Loss: 0.4297, Train Acc: 78.23%, Val Loss: 0.4129, Val Acc: 79.55%
Epoch 47/50 - Train Loss: 0.4254, Train Acc: 79.16%, Val Loss: 0.4069, Val Acc: 80.08%
Epoch 48/50 - Train Loss: 0.4260, Train Acc: 79.55%, Val Loss: 0.4094, Val Acc: 80.47%
Epoch 49/50 - Train Loss: 0.4189, Train Acc: 80.21%, Val Loss: 0.4158, Val Acc: 79.82%
Epoch 50/50 - Train Loss: 0.4270, Train Acc: 79.82%, Val Loss: 0.4058, Val Acc: 80.08%

Curves: the training/validation loss and accuracy curves for all five folds are saved to cross_validation_curves.png.
