PCB Defect Detection Based on Deep Learning
Table of Contents
- Research Background
  - [1. The Importance of PCB Manufacturing](#1. The Importance of PCB Manufacturing)
  - [2. Limitations of Traditional Inspection Methods](#2. Limitations of Traditional Inspection Methods)
  - [3. The Rise of Deep Learning](#3. The Rise of Deep Learning)
  - [4. Applications of Deep Learning in Defect Detection](#4. Applications of Deep Learning in Defect Detection)
  - [5. Research Progress](#5. Research Progress)
  - [6. Challenges](#6. Challenges)
  - [7. Conclusion](#7. Conclusion)
- Code Download Link
- I. Demonstration
  - [1.1 Image Demo](#1.1 Image Demo)
  - [1.2 Video Demo](#1.2 Video Demo)
  - [1.3 Camera Demo](#1.3 Camera Demo)
- II. Technical Principles
  - [2.1 Overall Pipeline](#2.1 Overall Pipeline)
  - [2.2 PCB Defect Dataset](#2.2 PCB Defect Dataset)
  - [2.3 YOLOv5 PCB Defect Detection Principles](#2.3 YOLOv5 PCB Defect Detection Principles)
    - [2.3.1 Overview](#2.3.1 Overview)
    - [2.3.2 Input Layer](#2.3.2 Input Layer)
    - [2.3.3 Backbone Layer](#2.3.3 Backbone Layer)
    - [2.3.4 Neck Layer](#2.3.4 Neck Layer)
    - [2.3.5 Head Layer](#2.3.5 Head Layer)
  - [2.4 Model Training](#2.4 Model Training)
    - [2.4.1 Setting Up the Conda Environment](#2.4.1 Setting Up the Conda Environment)
    - [2.4.2 Setting Up the Base Environment](#2.4.2 Setting Up the Base Environment)
    - [2.4.3 Installing the YOLOv5 Environment](#2.4.3 Installing the YOLOv5 Environment)
    - [2.4.4 Building the PCB Defect Detection Model](#2.4.4 Building the PCB Defect Detection Model)
    - [2.4.5 PCB Defect Dataset Annotation and Verification](#2.4.5 PCB Defect Dataset Annotation and Verification)
    - [2.4.6 Training the PCB Defect Detection Model](#2.4.6 Training the PCB Defect Detection Model)
    - [2.4.7 PCB Defect Validation Testing](#2.4.7 PCB Defect Validation Testing)
- Code Download Link
Research Background
The application of deep learning to PCB defect detection is an important direction in industrial automation and quality control. As the electronics manufacturing industry develops rapidly, circuit boards are becoming increasingly complex and finely featured, and traditional manual inspection can no longer meet the demands for high throughput and high accuracy. Automated PCB defect detection based on deep learning has therefore become an active research topic. The background is outlined below:
1. The Importance of PCB Manufacturing
Printed circuit boards (PCBs) are indispensable components of modern electronic devices: they carry electronic components and provide the electrical connections between them. PCB quality directly affects the performance and reliability of the final product, so the manufacturing process requires strict quality control.
2. Limitations of Traditional Inspection Methods
Traditional PCB defect inspection relies mainly on manual visual inspection or simple automated equipment. These methods have several limitations:
- Low efficiency: manual inspection is slow and cannot keep pace with large-scale production.
- Strong subjectivity: results are affected by operator fatigue and mood, so they are inconsistent.
- Limited accuracy: tiny or hidden defects are easily missed by human inspectors.
3. The Rise of Deep Learning
Deep learning is a branch of machine learning that learns high-level features of data through multi-layer neural networks. In recent years it has achieved remarkable results in image recognition, speech processing, and other fields. In image recognition in particular, models such as convolutional neural networks (CNNs) have demonstrated powerful feature-extraction capabilities.
4. Applications of Deep Learning in Defect Detection
Applying deep learning to PCB defect detection can effectively overcome the limitations of traditional methods:
- High degree of automation: deep learning models learn defect features automatically from large volumes of image data, enabling fully automated inspection.
- High detection accuracy: deep learning models can recognize tiny or complex defects, improving detection accuracy.
- Strong adaptability: by continuing to learn from new data, models can adapt to different board types and defect patterns.
5. Research Progress
Current research on deep learning for PCB defect detection focuses on the following areas:
- Dataset construction: collecting and annotating large numbers of PCB images to provide a foundation for model training.
- Model optimization: designing deep learning models suited to PCB defect detection, such as improved convolutional neural networks.
- Detection algorithm improvements: developing efficient algorithms to increase detection speed and accuracy.
- System integration: integrating deep learning models into PCB production lines for real-time, in-line inspection.
6. Challenges
Although deep learning shows great promise for PCB defect detection, several challenges remain:
- Data acquisition: high-quality annotated data is costly to obtain and requires domain expertise.
- Generalization: a model that performs well on a specific dataset may not generalize to different environments or conditions.
- Computational cost: training and inference for deep learning models demand substantial computing resources and capable hardware.
7. Conclusion
Deep learning offers a new solution for PCB defect detection. Automated, high-accuracy inspection helps improve production efficiency and product quality in electronics manufacturing. As the technology continues to mature, its application prospects in PCB defect detection are broad.
If you found this useful, a like, follow, and bookmark are much appreciated! More hands-on content is on the way...
Code Download Link
Follow the blogger's WeChat official account 【小蜜蜂视觉】 and reply 【电路板缺陷检测】 to get the download link.
If you would like the complete set of files covered in this post (system UI design files, the PCB defect test dataset, .py files, model weight files, debugging notes, etc.), along with code access and technical guidance, please refer to the accompanying blog post and video. All related files are packaged together, installation and debugging are documented in 安装调试说明.txt, and our support engineers can assist remotely with setup. A screenshot of the complete file set is shown below:
I. Demonstration
The PCB defect detection system built in this post is based on PyQt5 and supports image, video, camera, and RTSP input sources.
1.1 Image Demo
1.2 Video Demo
1.3 Camera Demo
II. Technical Principles
2.1 Overall Pipeline
The goal of deep-learning-based PCB defect detection is to accurately locate defects in the input image, which is typically achieved with object detection techniques.
- Data preparation: First, prepare the PCB defect dataset.
- Network architecture: Choose a deep learning architecture suited to PCB defect localization. A common choice is a convolutional neural network (CNN) based detector such as Faster R-CNN, YOLO (You Only Look Once), or SSD (Single Shot MultiBox Detector). These networks predict bounding-box locations and classes simultaneously, making them well suited to object detection.
- Training: Train the chosen architecture on the prepared training set. Input images are passed through the network, and its weights are optimized by backpropagation so that it can accurately predict defect locations. Each training sample consists of an input image and the corresponding defect-location annotations.
- Prediction: Once training is complete, apply the trained network to new images. Feeding an image through the network yields predicted defect locations, typically as a bounding box or a set of keypoint coordinates.
- Post-processing: The raw predictions can be refined with post-processing. For example, non-maximum suppression (NMS) removes overlapping bounding boxes and keeps only the most likely defect locations.
- Evaluation and tuning: Evaluate the predictions with metrics such as IoU (Intersection over Union), which measures the overlap between predicted and ground-truth boxes (a minimal IoU sketch follows this list). Based on the evaluation, tune the network architecture and training parameters to improve localization accuracy and stability.
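To make the evaluation step concrete, here is a minimal sketch of the IoU computation for two axis-aligned boxes; the [x1, y1, x2, y2] box format and the sample coordinates are purely illustrative.

```python
# Minimal IoU sketch; boxes are [x1, y1, x2, y2] in pixels (illustrative format).
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A predicted box that partially overlaps a ground-truth box
print(iou([10, 10, 60, 60], [20, 20, 70, 70]))  # ~0.47
```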
2.2 PCB Defect Dataset
The PCB defect dataset contains roughly 690 images covering six defect types: missing_hole, mouse_bite, open_circuit, short, spur, and spurious_copper, as shown below.
2.3 YOLOv5 PCB Defect Detection Principles
2.3.1 Overview
YOLOv5 is a single-stage object detection algorithm. Its network consists of four parts: the input stage (Input), the backbone network (Backbone), the feature-fusion module (Neck), and the prediction layer (Head), as shown below.
To detect objects at different scales, the input image is preprocessed to 640×640 and fed into the backbone, which produces feature maps of 20×20, 40×40, and 80×80. These three scales are then fused so that the network learns both the high-level and low-level features of the targets.
2.3.2 Input Layer
To improve generalization, YOLOv5 adds Mosaic data augmentation: four images are randomly drawn from a batch, randomly scaled and cropped, and stitched into a single training sample of a fixed size before being fed to the network. This provides more feature variety for training without increasing the size of the dataset, which improves robustness, reduces GPU memory consumption to some extent, and speeds up training. The Mosaic augmentation principle is illustrated below.
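The following is a simplified sketch of the Mosaic idea (image tiling only; the real YOLOv5 implementation also rescales, clips, and merges the corresponding label boxes, which is omitted here). The function name and parameters are illustrative.

```python
import random
import cv2
import numpy as np

def simple_mosaic(image_paths, out_size=640):
    """Tile four randomly chosen images into one out_size x out_size training sample."""
    picks = random.sample(image_paths, 4)
    half = out_size // 2
    canvas = np.full((out_size, out_size, 3), 114, dtype=np.uint8)  # gray background
    corners = [(0, 0), (0, half), (half, 0), (half, half)]
    for (top, left), path in zip(corners, picks):
        tile = cv2.resize(cv2.imread(path), (half, half))  # random scale/crop simplified to a resize
        canvas[top:top + half, left:left + half] = tile
    return canvas
```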
2.3.3 Backbone Layer
The Backbone in YOLOv5 extracts target features from the input image. It uses a Focus structure at the stem of the backbone, with CSPDarknet53 as the overall network architecture, and obtains a 2x downsampled feature map through a slicing operation, which effectively strengthens the backbone's feature-extraction ability.
1) Focus structure
The input image first passes through the Focus module, which performs a slicing operation: values are sampled every other pixel, producing four complementary sub-images that are then fed to the backbone, which speeds up the network. The Focus structure is shown below.
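A sketch of the Focus slicing operation in PyTorch: every other pixel is sampled in both directions, the four complementary sub-images are concatenated along the channel axis (halving the spatial size and quadrupling the channels), and a convolution follows. Channel counts here are illustrative.

```python
import torch
import torch.nn as nn

class Focus(nn.Module):
    def __init__(self, in_ch=3, out_ch=64, k=3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch * 4, out_ch, k, 1, k // 2, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.LeakyReLU(0.1, inplace=True),
        )

    def forward(self, x):
        # (B, C, H, W) -> (B, 4C, H/2, W/2) via interleaved slicing, then convolve
        patches = [x[..., ::2, ::2], x[..., 1::2, ::2],
                   x[..., ::2, 1::2], x[..., 1::2, 1::2]]
        return self.conv(torch.cat(patches, dim=1))

print(Focus()(torch.randn(1, 3, 640, 640)).shape)  # torch.Size([1, 64, 320, 320])
```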
2) CSP structure
The CSP structures in YOLOv5 strengthen the backbone's ability to extract information from deep feature maps. Two variants are commonly used: the CSP1 module in the Backbone and the CSP2 module in the feature-fusion Neck. The CSP1 module reduces computation while preserving overall model accuracy. It has two branches: one passes through residual components, and the other is convolved and then joined to the first branch by a Concat operation, as shown below.
The CBL module consists of a convolution, batch normalization, and the Leaky ReLU activation function, as shown below.
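A minimal CBL block matching the description above (convolution, batch normalization, Leaky ReLU); the channel counts and kernel size are illustrative.

```python
import torch.nn as nn

def cbl(in_ch, out_ch, k=3, s=1):
    """Conv + BatchNorm + LeakyReLU, the basic building block described above."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, k, s, k // 2, bias=False),  # convolution
        nn.BatchNorm2d(out_ch),                               # batch normalization
        nn.LeakyReLU(0.1, inplace=True),                      # Leaky ReLU activation
    )
```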
The residual unit (Resunit) is mainly used to prevent performance degradation as the network gets deeper, as shown below.
The SPP module feeds the input feature map through pooling layers of different sizes, then concatenates the pooled features with the original features via a Concat operation. This markedly enriches the separability of the image features without slowing down training, as shown below.
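A sketch of the SPP idea: the same feature map is max-pooled at several kernel sizes and the results are concatenated with the original features. The kernel sizes (5, 9, 13) follow the common YOLO configuration and are an assumption here.

```python
import torch
import torch.nn as nn

class SPP(nn.Module):
    def __init__(self, kernels=(5, 9, 13)):
        super().__init__()
        self.pools = nn.ModuleList(
            nn.MaxPool2d(kernel_size=k, stride=1, padding=k // 2) for k in kernels
        )

    def forward(self, x):
        # Concatenate the input with its multi-scale poolings along the channel axis
        return torch.cat([x] + [p(x) for p in self.pools], dim=1)

print(SPP()(torch.randn(1, 512, 20, 20)).shape)  # torch.Size([1, 2048, 20, 20])
```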
2.3.4 Neck Layer
The Neck layer in YOLOv5 fuses the target features extracted by the Backbone before passing them to the Head. The Neck adopts an FPN+PAN structure together with CSP2 modules to strengthen feature fusion: the feature pyramid network (FPN) propagates high-level semantic information down to the lower layers, while the path aggregation network (PAN) works in the opposite direction, passing localization information from the lower layers up to the higher layers, which effectively improves recognition accuracy, as shown below.
2.3.5 Head Layer
The Head layer of YOLOv5 classifies and predicts the targets after feature fusion in the Neck. It consists of two main parts: the loss function and non-maximum suppression. The loss function measures the error between predictions and ground truth during training; YOLOv5 uses GIOU_Loss as the bounding-box loss, and a smaller value indicates better predictions. Non-maximum suppression is applied to the final detection boxes to keep only the best ones, improving recognition accuracy.
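To make the GIoU loss concrete, here is a hedged sketch of how a GIoU loss can be computed for boxes in [x1, y1, x2, y2] format (a simplified version, not the exact YOLOv5 implementation).

```python
import torch

def giou_loss(pred, target, eps=1e-7):
    """pred, target: (N, 4) tensors in xyxy format; returns per-box losses (smaller is better)."""
    inter_w = (torch.min(pred[:, 2], target[:, 2]) - torch.max(pred[:, 0], target[:, 0])).clamp(0)
    inter_h = (torch.min(pred[:, 3], target[:, 3]) - torch.max(pred[:, 1], target[:, 1])).clamp(0)
    inter = inter_w * inter_h
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    union = area_p + area_t - inter + eps
    iou = inter / union
    # Smallest box enclosing both prediction and target
    enc_w = torch.max(pred[:, 2], target[:, 2]) - torch.min(pred[:, 0], target[:, 0])
    enc_h = torch.max(pred[:, 3], target[:, 3]) - torch.min(pred[:, 1], target[:, 1])
    enclose = enc_w * enc_h + eps
    giou = iou - (enclose - union) / enclose
    return 1.0 - giou
```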
2.4 Model Training
Model training involves the following steps:
2.4.1 Setting Up the Conda Environment
If you are new to Anaconda, refer to the blogger's step-by-step guide to installing and configuring Anaconda3 and PyCharm.
2.4.2 Setting Up the Base Environment
If you are new to PyTorch, refer to the blogger's conda-based tutorial on installing the GPU version of the PyTorch deep learning framework.
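Once the base environment is set up, a quick check that PyTorch actually sees the GPU (run inside the conda environment) can save debugging time later:

```python
import torch

print(torch.__version__)                  # installed PyTorch version
print(torch.cuda.is_available())          # should print True on a working GPU setup
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # name of the first CUDA device
```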
2.4.3 Installing the YOLOv5 Environment

```bash
conda create -n yolov5 python=3.8
conda activate yolov5
git clone https://github.com/ultralytics/yolov5.git
cd yolov5
pip install -r requirements.txt
```
2.4.4 Building the PCB Defect Detection Model
The model definition follows the standard YOLOv5s configuration, with nc set to 6 for the six defect classes:
```yaml
# Parameters
nc: 6  # number of classes
depth_multiple: 0.33  # model depth multiple
width_multiple: 0.50  # layer channel multiple
anchors:
  - [10,13, 16,30, 33,23]  # P3/8
  - [30,61, 62,45, 59,119]  # P4/16
  - [116,90, 156,198, 373,326]  # P5/32

# YOLOv5 v6.0 backbone
backbone:
  # [from, number, module, args]
  [[-1, 1, Conv, [64, 6, 2, 2]],  # 0-P1/2
   [-1, 1, Conv, [128, 3, 2]],  # 1-P2/4
   [-1, 3, C3, [128]],
   [-1, 1, Conv, [256, 3, 2]],  # 3-P3/8
   [-1, 6, C3, [256]],
   [-1, 1, Conv, [512, 3, 2]],  # 5-P4/16
   [-1, 9, C3, [512]],
   [-1, 1, Conv, [1024, 3, 2]],  # 7-P5/32
   [-1, 3, C3, [1024]],
   [-1, 1, SPPF, [1024, 5]],  # 9
  ]

# YOLOv5 v6.0 head
head:
  [[-1, 1, Conv, [512, 1, 1]],
   [-1, 1, nn.Upsample, [None, 2, 'nearest']],
   [[-1, 6], 1, Concat, [1]],  # cat backbone P4
   [-1, 3, C3, [512, False]],  # 13

   [-1, 1, Conv, [256, 1, 1]],
   [-1, 1, nn.Upsample, [None, 2, 'nearest']],
   [[-1, 4], 1, Concat, [1]],  # cat backbone P3
   [-1, 3, C3, [256, False]],  # 17 (P3/8-small)

   [-1, 1, Conv, [256, 3, 2]],
   [[-1, 14], 1, Concat, [1]],  # cat head P4
   [-1, 3, C3, [512, False]],  # 20 (P4/16-medium)

   [-1, 1, Conv, [512, 3, 2]],
   [[-1, 10], 1, Concat, [1]],  # cat head P5
   [-1, 3, C3, [1024, False]],  # 23 (P5/32-large)

   [[17, 20, 23], 1, Detect, [nc, anchors]],  # Detect(P3, P4, P5)
  ]
```
2.4.5 PCB Defect Dataset Annotation and Verification
The dataset is split in a 7:2:1 ratio (70% training, 20% validation, 10% test), and the annotations are organized in YOLO format:
```text
pcbDefect
├── images
│   ├── train
│   │   ├── image1.jpg
│   │   ├── image2.jpg
│   │   └── ...
│   ├── val
│   │   ├── image11.jpg
│   │   ├── image22.jpg
│   │   └── ...
│   └── test
│       ├── image111.jpg
│       ├── image222.jpg
│       └── ...
└── labels
    ├── train
    │   ├── image1.txt
    │   ├── image2.txt
    │   └── ...
    ├── val
    │   ├── image11.txt
    │   ├── image22.txt
    │   └── ...
    └── test
        ├── image111.txt
        ├── image222.txt
        └── ...
```
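The training command in the next step points to data/pcbDefect.yaml. As a hedged example (this is not the packaged file, and the dataset root path is an assumption), a dataset config matching the layout above would look roughly like this:

```yaml
# data/pcbDefect.yaml -- illustrative dataset config for the layout above
path: ../pcbDefect        # dataset root directory (assumed location)
train: images/train
val: images/val
test: images/test

nc: 6                     # number of defect classes
names: ['missing_hole', 'mouse_bite', 'open_circuit', 'short', 'spur', 'spurious_copper']
```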
2.4.6 Training the PCB Defect Detection Model
Training is launched with the following command:

```bash
python train.py --data data/pcbDefect.yaml --weights weights/yolov5s.pt --epochs 200 --img 640 --batch 32
```
The per-epoch training log (results.csv format) is shown below:

```text
epoch, train/box_loss, train/obj_loss, train/cls_loss, metrics/precision, metrics/recall, metrics/mAP_0.5, metrics/mAP_0.5:0.95, val/box_loss, val/obj_loss, val/cls_loss, x/lr0, x/lr1, x/lr2
0, 0.12484, 0.02737, 0.05086, 0, 0, 0, 0, 0.099175, 0.022928, 0.041956, 0.0003, 0.0003, 0.0973
1, 0.1142, 0.023805, 0.047829, 1.8484e-05, 0.0019608, 9.3664e-06, 4.6832e-06, 0.094973, 0.022314, 0.040621, 0.00060698, 0.00060698, 0.094507
2, 0.12133, 0.023373, 0.048619, 2.3914e-05, 0.0039216, 1.2268e-05, 3.0787e-06, 0.09351, 0.02185, 0.039008, 0.00091089, 0.00091089, 0.091711
3, 0.10872, 0.022898, 0.045056, 0, 0, 0, 0, 0.090055, 0.023763, 0.03879, 0.0012117, 0.0012117, 0.088912
4, 0.10918, 0.0239, 0.043116, 8.742e-06, 0.0019608, 4.7785e-06, 9.5569e-07, 0.088986, 0.023305, 0.03792, 0.0015095, 0.0015095, 0.08611
5, 0.10875, 0.024434, 0.043995, 0.00019947, 0.010214, 0.00010997, 2.6007e-05, 0.088992, 0.022971, 0.037073, 0.0018042, 0.0018042, 0.083304
6, 0.11414, 0.025017, 0.043744, 0, 0, 0, 0, 0.085712, 0.025487, 0.036888, 0.0020958, 0.0020958, 0.080496
7, 0.10827, 0.026964, 0.042441, 0.16709, 0.0024155, 2.6152e-05, 2.6152e-06, 0.084743, 0.025547, 0.035558, 0.0023844, 0.0023844, 0.077684
8, 0.10007, 0.027432, 0.040806, 0.00018402, 0.024385, 0.0001051, 4.1652e-05, 0.083923, 0.025623, 0.034752, 0.0026699, 0.0026699, 0.07487
9, 0.1022, 0.027681, 0.038991, 0.83637, 0.0048309, 0.000838, 0.00016014, 0.08344, 0.025185, 0.033924, 0.0029523, 0.0029523, 0.072052
10, 0.10668, 0.028629, 0.041762, 0.0043043, 0.11947, 0.0028779, 0.00070002, 0.081718, 0.026008, 0.03363, 0.0032317, 0.0032317, 0.069232
11, 0.094817, 0.028187, 0.037223, 0.010421, 0.13669, 0.0064485, 0.0023622, 0.079839, 0.026461, 0.03251, 0.003508, 0.003508, 0.066408
12, 0.10223, 0.029149, 0.03979, 0.08995, 0.11447, 0.026334, 0.0071619, 0.077197, 0.026976, 0.031045, 0.0037812, 0.0037812, 0.063581
13, 0.096238, 0.030066, 0.033945, 0.29717, 0.12115, 0.046143, 0.011257, 0.073834, 0.02661, 0.03009, 0.0040514, 0.0040514, 0.060751
14, 0.093807, 0.029488, 0.033628, 0.29149, 0.20257, 0.09556, 0.02552, 0.064574, 0.026527, 0.028707, 0.0043184, 0.0043184, 0.057918
15, 0.086757, 0.029328, 0.036134, 0.34997, 0.20591, 0.12756, 0.032759, 0.058909, 0.025801, 0.02892, 0.0045825, 0.0045825, 0.055082
16, 0.076235, 0.028122, 0.033492, 0.26873, 0.16685, 0.082917, 0.023012, 0.06048, 0.023564, 0.028187, 0.0048434, 0.0048434, 0.052243
17, 0.081788, 0.027419, 0.036132, 0.1771, 0.27427, 0.13825, 0.038245, 0.054586, 0.023455, 0.026885, 0.0051013, 0.0051013, 0.049401
18, 0.072503, 0.02621, 0.035861, 0.19031, 0.26175, 0.13786, 0.039231, 0.055165, 0.02206, 0.026718, 0.0053561, 0.0053561, 0.046556
19, 0.075154, 0.025401, 0.035481, 0.13098, 0.29517, 0.15981, 0.051945, 0.053173, 0.02305, 0.026441, 0.0056078, 0.0056078, 0.043708
20, 0.07247, 0.027697, 0.034355, 0.16031, 0.387, 0.19065, 0.063287, 0.050783, 0.023427, 0.0259, 0.0058565, 0.0058565, 0.040856
21, 0.070079, 0.024609, 0.03242, 0.2172, 0.40022, 0.22486, 0.070513, 0.051542, 0.020548, 0.02562, 0.0061021, 0.0061021, 0.038002
22, 0.072638, 0.023778, 0.033797, 0.23871, 0.40299, 0.23364, 0.079628, 0.048709, 0.019411, 0.025403, 0.0063446, 0.0063446, 0.035145
23, 0.067763, 0.023429, 0.031075, 0.30583, 0.42377, 0.31444, 0.1101, 0.045775, 0.018809, 0.024514, 0.0065841, 0.0065841, 0.032284
24, 0.068432, 0.02275, 0.031787, 0.23464, 0.47114, 0.25414, 0.076288, 0.045868, 0.017216, 0.023834, 0.0068205, 0.0068205, 0.02942
25, 0.06421, 0.021766, 0.029952, 0.36794, 0.46555, 0.42251, 0.16531, 0.044276, 0.016465, 0.022828, 0.0070538, 0.0070538, 0.026554
26, 0.060759, 0.02159, 0.028128, 0.29827, 0.53119, 0.39159, 0.13732, 0.04389, 0.016635, 0.021993, 0.0072841, 0.0072841, 0.023684
27, 0.06263, 0.021824, 0.029637, 0.37995, 0.49254, 0.40589, 0.13539, 0.044346, 0.017079, 0.020667, 0.0075113, 0.0075113, 0.020811
28, 0.061617, 0.021408, 0.024676, 0.38632, 0.54125, 0.46086, 0.16602, 0.044787, 0.016174, 0.019635, 0.0077354, 0.0077354, 0.017935
29, 0.065169, 0.020749, 0.026807, 0.38223, 0.60098, 0.45897, 0.15665, 0.045328, 0.016849, 0.02, 0.0079564, 0.0079564, 0.015056
30, 0.063139, 0.02049, 0.024576, 0.4257, 0.46196, 0.40128, 0.14993, 0.044961, 0.019111, 0.018987, 0.0081744, 0.0081744, 0.012174
31, 0.067716, 0.020788, 0.02591, 0.39936, 0.57545, 0.44998, 0.16, 0.044199, 0.016384, 0.017622, 0.0083893, 0.0083893, 0.0092893
32, 0.056865, 0.019899, 0.023785, 0.4629, 0.68263, 0.54684, 0.17261, 0.044873, 0.015662, 0.015883, 0.008416, 0.008416, 0.008416
33, 0.058964, 0.01995, 0.021784, 0.55403, 0.58913, 0.5685, 0.22085, 0.040284, 0.015942, 0.01443, 0.008416, 0.008416, 0.008416
34, 0.058454, 0.019378, 0.020485, 0.42254, 0.70449, 0.54222, 0.21409, 0.046571, 0.014397, 0.013292, 0.0083665, 0.0083665, 0.0083665
35, 0.058928, 0.02042, 0.019211, 0.60793, 0.70669, 0.64054, 0.25293, 0.041433, 0.014017, 0.012946, 0.008317, 0.008317, 0.008317
36, 0.054344, 0.019364, 0.018559, 0.63038, 0.63844, 0.60525, 0.214, 0.040027, 0.014703, 0.01273, 0.0082675, 0.0082675, 0.0082675
37, 0.056027, 0.018725, 0.019346, 0.58863, 0.65128, 0.61601, 0.2186, 0.04146, 0.014097, 0.012551, 0.008218, 0.008218, 0.008218
38, 0.051582, 0.017833, 0.017648, 0.65324, 0.68514, 0.66436, 0.26664, 0.037995, 0.014601, 0.011902, 0.0081685, 0.0081685, 0.0081685
39, 0.056378, 0.019822, 0.019619, 0.64028, 0.69426, 0.64908, 0.26194, 0.038427, 0.015223, 0.012832, 0.008119, 0.008119, 0.008119
40, 0.055912, 0.019035, 0.019743, 0.56056, 0.66146, 0.61191, 0.24558, 0.0392, 0.01611, 0.012616, 0.0080695, 0.0080695, 0.0080695
41, 0.057096, 0.018863, 0.018639, 0.60676, 0.65111, 0.64875, 0.24476, 0.037914, 0.015474, 0.012516, 0.00802, 0.00802, 0.00802
42, 0.052602, 0.018758, 0.016409, 0.73267, 0.65367, 0.66801, 0.24529, 0.039337, 0.014857, 0.010857, 0.0079705, 0.0079705, 0.0079705
43, 0.052403, 0.01904, 0.016651, 0.66987, 0.65322, 0.65329, 0.23442, 0.038708, 0.014813, 0.010211, 0.007921, 0.007921, 0.007921
44, 0.054819, 0.018756, 0.016562, 0.726, 0.71208, 0.72272, 0.28136, 0.038059, 0.014965, 0.011355, 0.0078715, 0.0078715, 0.0078715
45, 0.051945, 0.018247, 0.01558, 0.65716, 0.74192, 0.71502, 0.29463, 0.036035, 0.014914, 0.010335, 0.007822, 0.007822, 0.007822
46, 0.048662, 0.018387, 0.01261, 0.75183, 0.74856, 0.78253, 0.32716, 0.035286, 0.013997, 0.00908, 0.0077725, 0.0077725, 0.0077725
47, 0.051778, 0.018055, 0.013956, 0.75034, 0.74415, 0.78651, 0.32863, 0.037475, 0.013481, 0.0087572, 0.007723, 0.007723, 0.007723
48, 0.04879, 0.017522, 0.014655, 0.84672, 0.77846, 0.83463, 0.36633, 0.03312, 0.013482, 0.008113, 0.0076735, 0.0076735, 0.0076735
49, 0.045367, 0.016369, 0.011432, 0.88172, 0.78395, 0.8345, 0.36982, 0.034699, 0.0136, 0.0077343, 0.007624, 0.007624, 0.007624
50, 0.049162, 0.017486, 0.013406, 0.87596, 0.78937, 0.85484, 0.3911, 0.033765, 0.013094, 0.0071882, 0.0075745, 0.0075745, 0.0075745
51, 0.048895, 0.01649, 0.010995, 0.818, 0.76106, 0.79677, 0.31772, 0.035831, 0.013744, 0.0072649, 0.007525, 0.007525, 0.007525
52, 0.048196, 0.01597, 0.012448, 0.91648, 0.79214, 0.87183, 0.39436, 0.033049, 0.012967, 0.0064913, 0.0074755, 0.0074755, 0.0074755
53, 0.049006, 0.016843, 0.012422, 0.89503, 0.82083, 0.85232, 0.35166, 0.034703, 0.013242, 0.0060425, 0.007426, 0.007426, 0.007426
54, 0.045039, 0.016665, 0.010964, 0.88004, 0.76269, 0.82095, 0.34482, 0.035443, 0.01379, 0.0058685, 0.0073765, 0.0073765, 0.0073765
55, 0.049921, 0.017242, 0.01021, 0.8134, 0.69093, 0.73544, 0.27637, 0.037711, 0.01636, 0.0078486, 0.007327, 0.007327, 0.007327
56, 0.045715, 0.016572, 0.010712, 0.89924, 0.7686, 0.8494, 0.34522, 0.033694, 0.014355, 0.0064229, 0.0072775, 0.0072775, 0.0072775
57, 0.045321, 0.01685, 0.0097301, 0.93359, 0.78541, 0.86524, 0.35157, 0.03404, 0.013498, 0.0055048, 0.007228, 0.007228, 0.007228
58, 0.045265, 0.01607, 0.0089117, 0.92405, 0.80675, 0.86445, 0.36384, 0.032952, 0.012936, 0.0052725, 0.0071785, 0.0071785, 0.0071785
59, 0.046811, 0.017249, 0.0079751, 0.88507, 0.77521, 0.85653, 0.35136, 0.034862, 0.013452, 0.005372, 0.007129, 0.007129, 0.007129
60, 0.044543, 0.016109, 0.010059, 0.92684, 0.82112, 0.89208, 0.39844, 0.031855, 0.013012, 0.0046403, 0.0070795, 0.0070795, 0.0070795
61, 0.04393, 0.016937, 0.010026, 0.96179, 0.79142, 0.88832, 0.3887, 0.032853, 0.013447, 0.004644, 0.00703, 0.00703, 0.00703
62, 0.04447, 0.017083, 0.00857, 0.91542, 0.81076, 0.85554, 0.37937, 0.033633, 0.012946, 0.0044348, 0.0069805, 0.0069805, 0.0069805
63, 0.048069, 0.015914, 0.010106, 0.88183, 0.83371, 0.88729, 0.38924, 0.035211, 0.012345, 0.0043419, 0.006931, 0.006931, 0.006931
64, 0.047465, 0.015861, 0.010033, 0.94491, 0.81236, 0.87716, 0.3876, 0.033621, 0.012739, 0.0042835, 0.0068815, 0.0068815, 0.0068815
65, 0.046503, 0.016447, 0.0096603, 0.93186, 0.82607, 0.86643, 0.39011, 0.031858, 0.013193, 0.005143, 0.006832, 0.006832, 0.006832
66, 0.043081, 0.016306, 0.0098997, 0.91899, 0.82768, 0.88598, 0.38536, 0.03219, 0.012865, 0.0050801, 0.0067825, 0.0067825, 0.0067825
67, 0.046527, 0.015942, 0.010459, 0.94605, 0.81018, 0.89336, 0.39331, 0.032565, 0.013048, 0.0044286, 0.006733, 0.006733, 0.006733
68, 0.044154, 0.015473, 0.008497, 0.92187, 0.8141, 0.87062, 0.39324, 0.032281, 0.013037, 0.0039939, 0.0066835, 0.0066835, 0.0066835
69, 0.044052, 0.016617, 0.010523, 0.92972, 0.82751, 0.86467, 0.3902, 0.033314, 0.013084, 0.0051996, 0.006634, 0.006634, 0.006634
70, 0.045459, 0.016656, 0.0086925, 0.88645, 0.80142, 0.84436, 0.33819, 0.036311, 0.013742, 0.0047912, 0.0065845, 0.0065845, 0.0065845
71, 0.044562, 0.016596, 0.010416, 0.90629, 0.83939, 0.88171, 0.3867, 0.033018, 0.013289, 0.0042312, 0.006535, 0.006535, 0.006535
72, 0.041348, 0.017017, 0.0083758, 0.95946, 0.81294, 0.90342, 0.40224, 0.031912, 0.012792, 0.0042021, 0.0064855, 0.0064855, 0.0064855
73, 0.044251, 0.016574, 0.0080459, 0.88369, 0.7731, 0.87218, 0.38941, 0.032398, 0.016928, 0.0039136, 0.006436, 0.006436, 0.006436
74, 0.041181, 0.016271, 0.0060709, 0.79849, 0.79458, 0.765, 0.31907, 0.03449, 0.036394, 0.0044259, 0.0063865, 0.0063865, 0.0063865
75, 0.042834, 0.015841, 0.0086537, 0.92066, 0.79164, 0.88469, 0.3953, 0.031921, 0.014896, 0.0036765, 0.006337, 0.006337, 0.006337
76, 0.04294, 0.015388, 0.0073545, 0.91545, 0.81423, 0.89903, 0.41682, 0.030503, 0.013578, 0.0033275, 0.0062875, 0.0062875, 0.0062875
77, 0.040866, 0.014745, 0.0058436, 0.93302, 0.86188, 0.90637, 0.40798, 0.031485, 0.012598, 0.0029572, 0.006238, 0.006238, 0.006238
78, 0.041912, 0.015394, 0.0052194, 0.93608, 0.85307, 0.90307, 0.42248, 0.031025, 0.012454, 0.0031475, 0.0061885, 0.0061885, 0.0061885
79, 0.039007, 0.015139, 0.0061208, 0.91964, 0.8273, 0.87532, 0.36615, 0.032998, 0.012248, 0.0031014, 0.006139, 0.006139, 0.006139
80, 0.043204, 0.01559, 0.0075981, 0.93912, 0.87994, 0.91136, 0.42586, 0.03035, 0.012175, 0.0032027, 0.0060895, 0.0060895, 0.0060895
81, 0.040371, 0.014788, 0.0064149, 0.97283, 0.85112, 0.92561, 0.43932, 0.029957, 0.011927, 0.0028763, 0.00604, 0.00604, 0.00604
82, 0.041994, 0.01537, 0.0066436, 0.95699, 0.86119, 0.92346, 0.41763, 0.030161, 0.01205, 0.0027768, 0.0059905, 0.0059905, 0.0059905
83, 0.042882, 0.015644, 0.0062851, 0.96661, 0.87436, 0.93015, 0.45028, 0.030057, 0.011764, 0.0027965, 0.005941, 0.005941, 0.005941
84, 0.03866, 0.014544, 0.0057479, 0.95962, 0.87919, 0.93729, 0.44303, 0.030041, 0.011658, 0.0026109, 0.0058915, 0.0058915, 0.0058915
85, 0.041031, 0.01497, 0.0064724, 0.95067, 0.85396, 0.91883, 0.43661, 0.031194, 0.011717, 0.0024433, 0.005842, 0.005842, 0.005842
86, 0.043856, 0.015274, 0.0071839, 0.94593, 0.88202, 0.92081, 0.4285, 0.030159, 0.011857, 0.0027411, 0.0057925, 0.0057925, 0.0057925
87, 0.039042, 0.014907, 0.0056313, 0.94987, 0.87799, 0.92806, 0.41992, 0.029423, 0.01182, 0.0029971, 0.005743, 0.005743, 0.005743
88, 0.039156, 0.014449, 0.0058202, 0.93032, 0.92113, 0.936, 0.41752, 0.030294, 0.011555, 0.002698, 0.0056935, 0.0056935, 0.0056935
89, 0.038369, 0.015305, 0.0057764, 0.94996, 0.88343, 0.9351, 0.4419, 0.029601, 0.011693, 0.0027395, 0.005644, 0.005644, 0.005644
90, 0.039153, 0.014749, 0.0062159, 0.94639, 0.89046, 0.92992, 0.43134, 0.029415, 0.011824, 0.0024304, 0.0055945, 0.0055945, 0.0055945
91, 0.039486, 0.014833, 0.0059336, 0.933, 0.90507, 0.93022, 0.43061, 0.030484, 0.011829, 0.0023976, 0.005545, 0.005545, 0.005545
92, 0.037397, 0.014199, 0.0060579, 0.94522, 0.90726, 0.93538, 0.4515, 0.030685, 0.01187, 0.0022765, 0.0054955, 0.0054955, 0.0054955
93, 0.037392, 0.014398, 0.005526, 0.9546, 0.87499, 0.9291, 0.44996, 0.028949, 0.01183, 0.0023412, 0.005446, 0.005446, 0.005446
94, 0.038571, 0.014143, 0.0051352, 0.94653, 0.86266, 0.91688, 0.44484, 0.029331, 0.011822, 0.0024087, 0.0053965, 0.0053965, 0.0053965
95, 0.039126, 0.014401, 0.004951, 0.95383, 0.89699, 0.9396, 0.44699, 0.029785, 0.01173, 0.0024243, 0.005347, 0.005347, 0.005347
96, 0.035678, 0.0143, 0.0047188, 0.94611, 0.89373, 0.93404, 0.4459, 0.029089, 0.011582, 0.0022794, 0.0052975, 0.0052975, 0.0052975
97, 0.035773, 0.014327, 0.0042397, 0.94307, 0.89542, 0.93623, 0.45345, 0.028663, 0.011711, 0.0023837, 0.005248, 0.005248, 0.005248
98, 0.035447, 0.014431, 0.0059631, 0.94095, 0.9063, 0.93334, 0.45018, 0.02879, 0.011681, 0.0022759, 0.0051985, 0.0051985, 0.0051985
99, 0.036644, 0.014601, 0.0055177, 0.96148, 0.90983, 0.95032, 0.45671, 0.028263, 0.011551, 0.0021521, 0.005149, 0.005149, 0.005149
100, 0.037563, 0.014183, 0.0079009, 0.96421, 0.88252, 0.93722, 0.44607, 0.029796, 0.011798, 0.0022199, 0.0050995, 0.0050995, 0.0050995
101, 0.038423, 0.014047, 0.0052177, 0.96269, 0.88753, 0.93611, 0.44788, 0.030238, 0.011613, 0.002229, 0.00505, 0.00505, 0.00505
102, 0.041102, 0.014574, 0.0062728, 0.95869, 0.88132, 0.93141, 0.45731, 0.0297, 0.011374, 0.0024294, 0.0050005, 0.0050005, 0.0050005
103, 0.040239, 0.015299, 0.0045802, 0.93908, 0.87286, 0.91821, 0.44729, 0.029021, 0.011544, 0.0026109, 0.004951, 0.004951, 0.004951
104, 0.037945, 0.015084, 0.0047527, 0.92849, 0.8798, 0.92594, 0.45464, 0.029488, 0.011624, 0.0025235, 0.0049015, 0.0049015, 0.0049015
105, 0.035728, 0.014625, 0.0041137, 0.93104, 0.89017, 0.93473, 0.45106, 0.028464, 0.011536, 0.0024739, 0.004852, 0.004852, 0.004852
106, 0.036095, 0.014377, 0.0035695, 0.95104, 0.89067, 0.93275, 0.45525, 0.028681, 0.011324, 0.0023388, 0.0048025, 0.0048025, 0.0048025
107, 0.033129, 0.013745, 0.0045044, 0.96466, 0.87034, 0.92253, 0.43602, 0.029847, 0.011526, 0.0026187, 0.004753, 0.004753, 0.004753
108, 0.03581, 0.014129, 0.004466, 0.94246, 0.87897, 0.93256, 0.44235, 0.029161, 0.011498, 0.002383, 0.0047035, 0.0047035, 0.0047035
109, 0.036119, 0.014118, 0.00385, 0.94807, 0.88811, 0.93238, 0.4495, 0.028547, 0.011253, 0.0021958, 0.004654, 0.004654, 0.004654
110, 0.036586, 0.013732, 0.0057441, 0.95179, 0.89555, 0.93243, 0.45773, 0.028935, 0.011324, 0.0022486, 0.0046045, 0.0046045, 0.0046045
111, 0.03723, 0.014682, 0.0049151, 0.90559, 0.87265, 0.90411, 0.43171, 0.030688, 0.011908, 0.003186, 0.004555, 0.004555, 0.004555
112, 0.03769, 0.014841, 0.0041948, 0.91731, 0.89011, 0.93053, 0.44163, 0.029824, 0.011698, 0.0021341, 0.0045055, 0.0045055, 0.0045055
113, 0.038761, 0.014188, 0.0038471, 0.95437, 0.86884, 0.92727, 0.45618, 0.028281, 0.011817, 0.0022187, 0.004456, 0.004456, 0.004456
114, 0.036452, 0.014331, 0.0057453, 0.96522, 0.87871, 0.93824, 0.46064, 0.028799, 0.011588, 0.0021392, 0.0044065, 0.0044065, 0.0044065
115, 0.039601, 0.014884, 0.0051718, 0.96994, 0.88182, 0.94156, 0.46756, 0.028833, 0.011381, 0.00205, 0.004357, 0.004357, 0.004357
116, 0.03565, 0.013554, 0.0051132, 0.96426, 0.87265, 0.9426, 0.48181, 0.028461, 0.011433, 0.0023655, 0.0043075, 0.0043075, 0.0043075
117, 0.034671, 0.01408, 0.0047611, 0.95432, 0.8779, 0.93406, 0.46503, 0.028968, 0.011435, 0.0022321, 0.004258, 0.004258, 0.004258
118, 0.035034, 0.013762, 0.005086, 0.96119, 0.89065, 0.94045, 0.47809, 0.02748, 0.011298, 0.002148, 0.0042085, 0.0042085, 0.0042085
119, 0.036641, 0.013784, 0.0046315, 0.96592, 0.88876, 0.93777, 0.46226, 0.02947, 0.011419, 0.0021617, 0.004159, 0.004159, 0.004159
120, 0.035707, 0.014564, 0.0046529, 0.96137, 0.87907, 0.93435, 0.45536, 0.028311, 0.011509, 0.0020214, 0.0041095, 0.0041095, 0.0041095
121, 0.036372, 0.013897, 0.0039082, 0.97488, 0.86659, 0.93632, 0.46514, 0.028935, 0.011269, 0.0020603, 0.00406, 0.00406, 0.00406
122, 0.035962, 0.01397, 0.004511, 0.95351, 0.88176, 0.94079, 0.47849, 0.028425, 0.011238, 0.0020694, 0.0040105, 0.0040105, 0.0040105
123, 0.03481, 0.013681, 0.0047074, 0.95818, 0.88796, 0.94004, 0.47831, 0.027879, 0.01152, 0.0019698, 0.003961, 0.003961, 0.003961
124, 0.032671, 0.013762, 0.0040278, 0.96445, 0.89987, 0.93647, 0.46653, 0.027991, 0.01143, 0.0016975, 0.0039115, 0.0039115, 0.0039115
125, 0.036008, 0.013361, 0.0047041, 0.96146, 0.9013, 0.93424, 0.45903, 0.028519, 0.011311, 0.0017047, 0.003862, 0.003862, 0.003862
126, 0.03401, 0.013447, 0.0040689, 0.95604, 0.90185, 0.93919, 0.47158, 0.027806, 0.011117, 0.0018529, 0.0038125, 0.0038125, 0.0038125
127, 0.031878, 0.013099, 0.0027967, 0.96041, 0.90457, 0.94384, 0.47963, 0.028383, 0.011013, 0.0018119, 0.003763, 0.003763, 0.003763
128, 0.034706, 0.01369, 0.0027384, 0.96853, 0.88868, 0.94017, 0.48097, 0.02741, 0.011032, 0.0018237, 0.0037135, 0.0037135, 0.0037135
129, 0.03394, 0.01386, 0.0032774, 0.95868, 0.89865, 0.93642, 0.46987, 0.028186, 0.011018, 0.0018775, 0.003664, 0.003664, 0.003664
130, 0.034989, 0.013471, 0.0039545, 0.93797, 0.91745, 0.94262, 0.46758, 0.027316, 0.011095, 0.0019048, 0.0036145, 0.0036145, 0.0036145
131, 0.033065, 0.01357, 0.0039853, 0.9269, 0.93499, 0.94237, 0.46206, 0.027146, 0.01123, 0.0019533, 0.003565, 0.003565, 0.003565
132, 0.032226, 0.013642, 0.0029681, 0.94436, 0.9174, 0.94702, 0.47563, 0.027191, 0.011339, 0.0017685, 0.0035155, 0.0035155, 0.0035155
133, 0.035862, 0.013621, 0.0048942, 0.94663, 0.92488, 0.94931, 0.47343, 0.027177, 0.010983, 0.0016468, 0.003466, 0.003466, 0.003466
134, 0.032855, 0.01324, 0.0045393, 0.94466, 0.91886, 0.94822, 0.48145, 0.026784, 0.0109, 0.0016719, 0.0034165, 0.0034165, 0.0034165
135, 0.03274, 0.013255, 0.0030268, 0.9484, 0.91461, 0.95161, 0.47642, 0.027377, 0.011027, 0.0015977, 0.003367, 0.003367, 0.003367
136, 0.033074, 0.013192, 0.003337, 0.95303, 0.9182, 0.94721, 0.48645, 0.026837, 0.011032, 0.0016092, 0.0033175, 0.0033175, 0.0033175
137, 0.033013, 0.013323, 0.003707, 0.93758, 0.91747, 0.94717, 0.47612, 0.027161, 0.010983, 0.0016134, 0.003268, 0.003268, 0.003268
138, 0.030322, 0.013092, 0.0025357, 0.94052, 0.91465, 0.94666, 0.48368, 0.026713, 0.010899, 0.0016353, 0.0032185, 0.0032185, 0.0032185
139, 0.033769, 0.013375, 0.0031571, 0.95437, 0.91911, 0.95307, 0.48591, 0.027083, 0.01088, 0.0016468, 0.003169, 0.003169, 0.003169
140, 0.034674, 0.013167, 0.0034438, 0.95476, 0.90341, 0.94133, 0.48328, 0.02734, 0.010869, 0.0016518, 0.0031195, 0.0031195, 0.0031195
141, 0.034613, 0.01286, 0.0034004, 0.95353, 0.91442, 0.94604, 0.48783, 0.027129, 0.010867, 0.0018675, 0.00307, 0.00307, 0.00307
142, 0.032892, 0.013876, 0.0035535, 0.9322, 0.91933, 0.9457, 0.49279, 0.026627, 0.010771, 0.0019343, 0.0030205, 0.0030205, 0.0030205
143, 0.031011, 0.013132, 0.003447, 0.92268, 0.92772, 0.94732, 0.49448, 0.026822, 0.010732, 0.0017231, 0.002971, 0.002971, 0.002971
144, 0.032919, 0.013046, 0.004605, 0.94068, 0.92479, 0.94934, 0.49588, 0.026757, 0.010778, 0.0015787, 0.0029215, 0.0029215, 0.0029215
145, 0.031458, 0.013003, 0.0034416, 0.93822, 0.9246, 0.94658, 0.49657, 0.027682, 0.010705, 0.0015234, 0.002872, 0.002872, 0.002872
146, 0.031623, 0.013087, 0.0035433, 0.93304, 0.93561, 0.94772, 0.49624, 0.028391, 0.010767, 0.0013857, 0.0028225, 0.0028225, 0.0028225
147, 0.032807, 0.01306, 0.0029549, 0.93738, 0.90933, 0.94098, 0.49238, 0.02858, 0.010758, 0.0013865, 0.002773, 0.002773, 0.002773
148, 0.036935, 0.013306, 0.0032721, 0.92351, 0.91808, 0.9423, 0.47735, 0.028816, 0.010793, 0.0015612, 0.0027235, 0.0027235, 0.0027235
149, 0.030831, 0.012657, 0.0039643, 0.91736, 0.91401, 0.9414, 0.49229, 0.027486, 0.010801, 0.0020861, 0.002674, 0.002674, 0.002674
150, 0.032044, 0.013002, 0.0041448, 0.95351, 0.90876, 0.94856, 0.4788, 0.027577, 0.010924, 0.0018295, 0.0026245, 0.0026245, 0.0026245
151, 0.031503, 0.013092, 0.0028095, 0.94881, 0.90539, 0.95116, 0.49358, 0.026849, 0.011107, 0.0017219, 0.002575, 0.002575, 0.002575
152, 0.031583, 0.013609, 0.0039922, 0.92632, 0.93211, 0.94618, 0.49088, 0.026933, 0.011233, 0.001522, 0.0025255, 0.0025255, 0.0025255
153, 0.031909, 0.013541, 0.003112, 0.9427, 0.92273, 0.94507, 0.4937, 0.026627, 0.011193, 0.0015387, 0.002476, 0.002476, 0.002476
154, 0.030239, 0.012999, 0.00289, 0.97151, 0.90291, 0.94399, 0.4949, 0.027178, 0.01108, 0.0014948, 0.0024265, 0.0024265, 0.0024265
155, 0.030373, 0.012644, 0.0023986, 0.97148, 0.89704, 0.94592, 0.49002, 0.027153, 0.011089, 0.0014694, 0.002377, 0.002377, 0.002377
156, 0.031883, 0.01289, 0.002711, 0.93966, 0.92571, 0.94411, 0.49418, 0.026661, 0.011046, 0.0016071, 0.0023275, 0.0023275, 0.0023275
157, 0.029923, 0.012475, 0.0029759, 0.94115, 0.92157, 0.94618, 0.49813, 0.026859, 0.011083, 0.0016252, 0.002278, 0.002278, 0.002278
158, 0.030265, 0.012474, 0.0024685, 0.94215, 0.91655, 0.94856, 0.48548, 0.02631, 0.01111, 0.0016479, 0.0022285, 0.0022285, 0.0022285
159, 0.031297, 0.013007, 0.0040787, 0.94493, 0.90204, 0.94156, 0.46938, 0.026793, 0.011069, 0.0017465, 0.002179, 0.002179, 0.002179
160, 0.030058, 0.012764, 0.0037174, 0.95447, 0.89408, 0.94238, 0.48629, 0.026789, 0.010908, 0.0016649, 0.0021295, 0.0021295, 0.0021295
161, 0.031778, 0.012688, 0.0029219, 0.93461, 0.92777, 0.9492, 0.49049, 0.026714, 0.010916, 0.0015429, 0.00208, 0.00208, 0.00208
162, 0.031261, 0.012908, 0.0029539, 0.94651, 0.91275, 0.94502, 0.49397, 0.026811, 0.010991, 0.0014428, 0.0020305, 0.0020305, 0.0020305
163, 0.030035, 0.012285, 0.0026411, 0.9596, 0.91855, 0.95084, 0.48836, 0.026452, 0.011055, 0.0013155, 0.001981, 0.001981, 0.001981
164, 0.02853, 0.01247, 0.0030658, 0.94318, 0.93036, 0.94797, 0.49946, 0.026565, 0.011087, 0.0013172, 0.0019315, 0.0019315, 0.0019315
165, 0.03144, 0.013278, 0.0022033, 0.93844, 0.92516, 0.94625, 0.48882, 0.026606, 0.011076, 0.0014597, 0.001882, 0.001882, 0.001882
166, 0.028238, 0.012273, 0.0022988, 0.93881, 0.91819, 0.94724, 0.49165, 0.027022, 0.011048, 0.0014657, 0.0018325, 0.0018325, 0.0018325
167, 0.029581, 0.012764, 0.0018928, 0.94996, 0.91838, 0.95192, 0.49397, 0.027021, 0.010946, 0.0014319, 0.001783, 0.001783, 0.001783
168, 0.030364, 0.012646, 0.002652, 0.95142, 0.92398, 0.94893, 0.49144, 0.026209, 0.010894, 0.0013961, 0.0017335, 0.0017335, 0.0017335
169, 0.029047, 0.013155, 0.0022754, 0.94613, 0.92001, 0.94914, 0.50106, 0.026235, 0.010813, 0.0013629, 0.001684, 0.001684, 0.001684
170, 0.030348, 0.012868, 0.0025903, 0.94442, 0.92638, 0.94826, 0.49845, 0.026182, 0.01084, 0.0013827, 0.0016345, 0.0016345, 0.0016345
171, 0.028844, 0.013418, 0.00221, 0.94845, 0.91907, 0.94593, 0.49442, 0.025945, 0.010825, 0.0013878, 0.001585, 0.001585, 0.001585
172, 0.028044, 0.012475, 0.0020475, 0.94977, 0.91562, 0.94693, 0.49909, 0.025607, 0.010884, 0.0013793, 0.0015355, 0.0015355, 0.0015355
173, 0.03037, 0.012416, 0.0020514, 0.95493, 0.91276, 0.94515, 0.49652, 0.025832, 0.010954, 0.0014017, 0.001486, 0.001486, 0.001486
174, 0.02901, 0.012575, 0.0022523, 0.93349, 0.93395, 0.94592, 0.49279, 0.025973, 0.010781, 0.00136, 0.0014365, 0.0014365, 0.0014365
175, 0.0299, 0.013539, 0.0023896, 0.9492, 0.92686, 0.9518, 0.50485, 0.026004, 0.010724, 0.0013465, 0.001387, 0.001387, 0.001387
176, 0.028433, 0.012411, 0.0016765, 0.95407, 0.92405, 0.9525, 0.50396, 0.025779, 0.010821, 0.0013625, 0.0013375, 0.0013375, 0.0013375
177, 0.0271, 0.012667, 0.0019585, 0.95087, 0.92476, 0.95126, 0.50414, 0.026096, 0.010865, 0.0014006, 0.001288, 0.001288, 0.001288
178, 0.026388, 0.012452, 0.0016576, 0.94651, 0.923, 0.94803, 0.50449, 0.025548, 0.010875, 0.0013988, 0.0012385, 0.0012385, 0.0012385
179, 0.028348, 0.012673, 0.0030774, 0.94689, 0.92148, 0.94484, 0.49438, 0.025692, 0.010761, 0.0013638, 0.001189, 0.001189, 0.001189
180, 0.029453, 0.012794, 0.0026043, 0.95834, 0.91733, 0.94887, 0.49784, 0.025882, 0.010789, 0.0013153, 0.0011395, 0.0011395, 0.0011395
181, 0.028473, 0.012477, 0.0022936, 0.95265, 0.92939, 0.95142, 0.49798, 0.025795, 0.010809, 0.0013028, 0.00109, 0.00109, 0.00109
182, 0.028324, 0.012459, 0.0019193, 0.9466, 0.91467, 0.94467, 0.49455, 0.026117, 0.010848, 0.0013132, 0.0010405, 0.0010405, 0.0010405
183, 0.029356, 0.012703, 0.003578, 0.95437, 0.91491, 0.95042, 0.50744, 0.025541, 0.010811, 0.0013132, 0.000991, 0.000991, 0.000991
184, 0.027983, 0.012892, 0.0025631, 0.95246, 0.91897, 0.94813, 0.50401, 0.025867, 0.010776, 0.0013071, 0.0009415, 0.0009415, 0.0009415
185, 0.027208, 0.012446, 0.0024887, 0.95587, 0.91898, 0.94605, 0.50406, 0.025502, 0.0108, 0.0013051, 0.000892, 0.000892, 0.000892
186, 0.028043, 0.012252, 0.003118, 0.95491, 0.91842, 0.94713, 0.50056, 0.025583, 0.010805, 0.0012869, 0.0008425, 0.0008425, 0.0008425
187, 0.026881, 0.012273, 0.0021586, 0.95228, 0.9148, 0.94728, 0.49569, 0.025704, 0.010731, 0.0012944, 0.000793, 0.000793, 0.000793
188, 0.028154, 0.012503, 0.0026297, 0.95994, 0.92402, 0.9504, 0.49985, 0.025299, 0.010703, 0.0012784, 0.0007435, 0.0007435, 0.0007435
189, 0.025187, 0.01273, 0.0016661, 0.95789, 0.92415, 0.95036, 0.50167, 0.025299, 0.010691, 0.0012481, 0.000694, 0.000694, 0.000694
190, 0.026276, 0.012488, 0.0024498, 0.96238, 0.92072, 0.95109, 0.50031, 0.025391, 0.010723, 0.0012344, 0.0006445, 0.0006445, 0.0006445
191, 0.026011, 0.012287, 0.0018118, 0.9519, 0.91835, 0.94936, 0.50506, 0.025053, 0.010712, 0.00121, 0.000595, 0.000595, 0.000595
192, 0.026917, 0.012097, 0.0022713, 0.95405, 0.9194, 0.95635, 0.50289, 0.025229, 0.010745, 0.0012023, 0.0005455, 0.0005455, 0.0005455
193, 0.027752, 0.012482, 0.0019695, 0.96902, 0.90241, 0.95131, 0.50304, 0.025084, 0.010724, 0.001186, 0.000496, 0.000496, 0.000496
194, 0.027573, 0.012552, 0.0020059, 0.96912, 0.91049, 0.95099, 0.50273, 0.025304, 0.010815, 0.001182, 0.0004465, 0.0004465, 0.0004465
195, 0.026493, 0.012007, 0.0028567, 0.96807, 0.9147, 0.95072, 0.5051, 0.025475, 0.010831, 0.0011794, 0.000397, 0.000397, 0.000397
196, 0.025988, 0.012443, 0.0015634, 0.95381, 0.91784, 0.9473, 0.50111, 0.025466, 0.010827, 0.0012101, 0.0003475, 0.0003475, 0.0003475
197, 0.027279, 0.012284, 0.0022347, 0.96237, 0.90546, 0.94477, 0.49965, 0.025437, 0.010829, 0.0012444, 0.000298, 0.000298, 0.000298
198, 0.027767, 0.012665, 0.0024972, 0.95433, 0.91548, 0.94497, 0.50231, 0.025252, 0.010814, 0.0012563, 0.0002485, 0.0002485, 0.0002485
199, 0.027364, 0.012457, 0.0030708, 0.9553, 0.91458, 0.94536, 0.50262, 0.025289, 0.010801, 0.001247, 0.000199, 0.000199, 0.000199
```

The training curves are shown below:
The training confusion matrix is shown below:
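The log above follows the results.csv format that train.py writes to its run directory. A hedged sketch of plotting the mAP curves from that file (the runs/train/exp path is an assumption about where the run was saved):

```python
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv('runs/train/exp/results.csv')
df.columns = [c.strip() for c in df.columns]  # column names in results.csv are space-padded

plt.plot(df['epoch'], df['metrics/mAP_0.5'], label='mAP@0.5')
plt.plot(df['epoch'], df['metrics/mAP_0.5:0.95'], label='mAP@0.5:0.95')
plt.xlabel('epoch')
plt.ylabel('mAP')
plt.legend()
plt.savefig('map_curve.png')
```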
2.4.7 PCB Defect Validation Testing
```python
import numpy as np
import torch

# YOLOv5 helpers (run from the yolov5 repository root)
from models.common import DetectMultiBackend
from utils.augmentations import letterbox
from utils.general import check_img_size, non_max_suppression, scale_coords
from utils.plots import Annotator, colors
from utils.torch_utils import select_device


class CPcbDefectCnnModel(object):
    def __init__(self, model_path):
        self.weights = model_path
        self.data = 'data/pcbDefect.yaml'
        self.imgsz = (640, 640)
        self.conf_thres = 0.5   # confidence threshold
        self.iou_thres = 0.45   # NMS IoU threshold
        self.classes = None     # no class filter: keep all six defect types
        # Load model
        self.device = select_device()
        print(self.device)
        self.model = DetectMultiBackend(self.weights, device=self.device,
                                        dnn=False, data=self.data, fp16=False)
        self.stride, self.names, self.pt = self.model.stride, self.model.names, self.model.pt
        self.imgsz = check_img_size(self.imgsz, s=self.stride)  # check image size

    def predict(self, image_numpy_data):
        # Padded resize to the network input size
        img = letterbox(image_numpy_data, self.imgsz, stride=self.stride, auto=True)[0]
        print(img.shape)
        # Convert HWC BGR -> CHW RGB and build the input tensor
        img = img.transpose((2, 0, 1))[::-1]
        img = np.ascontiguousarray(img)
        im = torch.from_numpy(img).to(self.device)
        im = im.half() if self.model.fp16 else im.float()
        im /= 255  # 0-255 -> 0.0-1.0
        if len(im.shape) == 3:
            im = im[None]  # add batch dimension
        # Inference and non-maximum suppression
        pred = self.model(im)
        pred = non_max_suppression(pred, self.conf_thres, self.iou_thres, self.classes)
        im0 = image_numpy_data.copy()
        annotator = Annotator(im0, line_width=2, example=str(self.names))
        detect_results = []
        # Process predictions
        for i, det in enumerate(pred):  # per image
            if len(det):
                # Rescale boxes from img_size back to the original image size
                det[:, :4] = scale_coords(im.shape[2:], det[:, :4], im0.shape).round()
                for v in det.cpu().numpy():
                    detect_results.append(v)
                # Draw the kept boxes with class name and confidence
                for *xyxy, conf, cls in reversed(det):
                    c = int(cls)  # integer class
                    label = f'{self.names[c]} {conf:.2f}'
                    annotator.box_label(xyxy, label, color=colors(c, True))
        # Collect detections as [x, y, w, h], score, classId lists
        bboxes, scores, classIds = [], [], []
        for detection in detect_results:  # each row: [x1, y1, x2, y2, confidence, class]
            x1, y1, x2, y2 = detection[:4]
            bboxes.append([int(x1), int(y1), int(x2 - x1), int(y2 - y1)])
            scores.append(float(detection[4]))
            classIds.append(int(detection[5]))
        print(detect_results)
        return im0  # annotated image at the original resolution
```
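A usage example of the wrapper class above (the weight-file and image paths are assumptions):

```python
import cv2

model = CPcbDefectCnnModel('runs/train/exp/weights/best.pt')  # assumed path to trained weights
image = cv2.imread('test_pcb.jpg')    # BGR image, as predict() expects
annotated = model.predict(image)
cv2.imwrite('result.jpg', annotated)  # save the image with drawn detections
```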
The model validation results are as follows:
If you found this useful, a like, follow, and bookmark are much appreciated! More hands-on content is on the way...
Code Download Link
Follow the blogger's WeChat official account 【小蜜蜂视觉】 and reply 【电路板缺陷检测】 to get the download link.