<p>Hand Keypoint Detection 3: PyTorch Implementation of Hand Keypoint Detection (Hand Pose Estimation), with Training Code and Datasets</p>
<p><strong>Table of Contents</strong></p>
1. Introduction
<p>This article, "PyTorch Implementation of Hand Keypoint Detection (Hand Pose Estimation)", is part of the project series <strong>Hand Keypoint Detection (Hand Pose Estimation)</strong>. The project is built on the PyTorch deep learning framework and implements a hand keypoint detection (hand pose estimation) model: hand detection uses a YOLOv5 model, and hand keypoint detection is adapted from the open-source HRNet. The project provides a complete training and testing pipeline for hand keypoint detection. To ease later model productization and deployment on Android, it supports training and testing of the high-accuracy HRNet model as well as the lightweight LiteHRNet and Mobilenet models, and ships Python/C++/Android versions.</p>
<p>The lightweight Mobilenet-v2 model achieves real-time detection on an ordinary Android phone, at about 50 ms on CPU (4 threads) and about 30 ms on GPU, which basically meets typical business performance requirements. The table below lists the computational cost, parameter count, and detection accuracy of HRNet and of the lightweight LiteHRNet and Mobilenet models:</p>
<table>
<tr><th>Model</th><th>input-size</th><th>params</th><th>FLOPs</th><th>AP</th></tr>
<tr><td>HRNet-w32</td><td>192×192</td><td>28.48M</td><td>5734.05M</td><td>0.8570</td></tr>
<tr><td>LiteHRNet18</td><td>192×192</td><td>1.10M</td><td>182.15M</td><td>0.8023</td></tr>
<tr><td>Mobilenet-v2</td><td>192×192</td><td>2.63M</td><td>529.25M</td><td>0.7574</td></tr>
</table>
<p>First, some results of <strong>hand detection and hand keypoint detection (hand pose estimation)</strong>:</p>
<p>Android <strong>hand keypoint detection (hand pose estimation)</strong> APP demo:</p>
<p>https://download.csdn.net/download/guyuealian/88418582</p>
<p>[Please respect the original work; cite the source when reposting] https://blog.csdn.net/guyuealian/article/details/133277726</p>
<p>For more articles in the <strong>Hand Keypoint Detection (Hand Pose Estimation)</strong> series, see:</p>
2. Hand Keypoint Detection (Hand Pose Estimation) Methods
<p>There are currently two mainstream approaches to <strong>hand keypoint detection (hand pose estimation)</strong>: <strong>Top-Down</strong> methods and <strong>Bottom-Up</strong> methods.</p>
<strong>(1) Top-Down methods</strong>
<p>Top-Down methods decouple hand detection from hand keypoint estimation: first run hand detection on the image to localize each hand, then crop each hand region and estimate the keypoints of each hand separately. These methods tend to be slower, but their pose estimation accuracy is higher. Mainstream models include CPN, Hourglass, CPM, AlphaPose, and HRNet.</p>
<strong>(2) Bottom-Up methods</strong>
<p>Bottom-Up methods first estimate all hand keypoints in the image, then group them into individual hand instances. They are therefore usually faster at inference time, but somewhat less accurate. A typical example is OpenPose, winner of the COCO 2016 human keypoint detection challenge.</p>
<p>Generally speaking, <strong>Top-Down methods are more accurate, while Bottom-Up methods are faster</strong>. As far as current research goes, Top-Down methods are studied more widely and reach higher accuracy than <strong>Bottom-Up</strong> methods.</p>
<p>This project takes the <strong>Top-Down</strong> approach, using a YOLOv5 model for hand detection and HRNet for hand keypoint detection. The project is adapted from the open-source HRNet; see the HRNet project on GitHub:</p>
<p>HRNet: https://github.com/leoxiaobin/deep-high-resolution-net.pytorch</p>
3. Hand Keypoint Detection Datasets
<p>The project collects three hand detection datasets and three hand keypoint datasets:</p>
<p><strong>Hand detection datasets</strong> (Hand Detection Dataset): Hand-voc1, Hand-voc2, and Hand-voc3, totaling 60000+ images. The annotations are converted to a unified VOC format with the label name <strong>hand</strong>, and can be used to develop deep learning hand detection models.</p>
<p><strong>Hand keypoint datasets</strong> (Hand Keypoints Dataset, Hand Pose Estimation): HandPose-v1, HandPose-v2, and HandPose-v3, totaling 80000+ images annotated with 21 hand keypoints, which can be used to develop deep learning hand pose estimation models.</p>
<p>For details on the hand keypoint datasets, see: <strong>Hand Keypoint (Hand Pose Estimation) Datasets (with download links)</strong> https://blog.csdn.net/guyuealian/article/details/133277630</p>
4. Hand Detection Model Training
<p>This project takes the <strong>Top-Down</strong> approach, with YOLOv5 for hand detection and HRNet for hand keypoint detection. For hand detection model training, see:</p>
<p>Hand Keypoint Detection 2: Hand Detection with YOLOv5 (with Training Code and Datasets) https://blog.csdn.net/guyuealian/article/details/133279222</p>
5. Hand Keypoint Detection Model Training
<p>The overall project structure is as follows:</p>
.
<pre>
├── configs           # training config files
├── data              # assorted data
├── libs              # utility libraries
├── pose              # pose estimation model files
├── work_space        # training output working directory
├── demo.py           # model inference demo
├── README.md         # project documentation
├── requirements.txt  # project dependencies
└── train.py          # training script
</pre>
(1) Project Installation
<p>For the Python dependencies, see requirements.txt; install them with pip. <strong>The code has been verified on both Ubuntu and Windows, so it should run as-is; if anything fails, the most likely cause is a mismatch in dependency versions.</strong></p>
<pre>
numpy==1.21.6
matplotlib==3.2.2
Pillow==8.4.0
bcolz==1.2.1
easydict==1.9
onnx==1.8.1
onnx-simplifier==0.2.28
onnxoptimizer==0.2.0
onnxruntime==1.6.0
opencv-contrib-python==4.5.2.52
opencv-python==4.5.1.48
pandas==1.1.5
PyYAML==5.3.1
scikit-image==0.17.2
scikit-learn==0.24.0
scipy==1.5.4
seaborn==0.11.2
sklearn==0.0
tensorboard==2.5.0
tensorboardX==2.1
torch==1.7.1+cu110
torchvision==0.8.2+cu110
tqdm==4.55.1
xmltodict==0.12.0
pycocotools==2.0.2
pybaseutils==0.9.4
basetrainer
</pre>
<p>For installation tutorials, see the guides below (<strong>beginners: to save yourself trouble, read them first and set up a Python 3.8 development environment</strong>):</p>
(2) Prepare Train and Test Data
<p>Download the hand keypoint datasets HandPose-v1, HandPose-v2, and HandPose-v3, then unzip them locally.</p>
(3) Config Files (configs)
<p>The project supports training HRNet as well as the lightweight LiteHRNet and Mobilenet models, and provides a config file for each; you need to edit the dataset paths in the corresponding config file. This article uses HRNet-w32 as the example; its config file is <span>configs/coco/hrnet/w32_adam_hand_192_192.yaml</span>. Change the training dataset paths in TRAIN_FILE (multiple datasets are supported) and the test dataset path in TEST_FILE to your local paths, and keep the other parameters at their defaults, as shown below:</p>
<pre>
WORKERS: 8
PRINT_FREQ: 10
DATASET:
  DATASET: 'custom_coco'
  TRAIN_FILE:
    - 'D:/dataset/HandPose-v1/train/train_anno.json'
    - 'D:/dataset/HandPose-v2/train/train_anno.json'
    - 'D:/dataset/HandPose-v3/train/train_anno.json'
  TEST_FILE: 'D:/dataset/HandPose-v1/test/test_anno.json'
  FLIP: true
  ROT_FACTOR: 45
  SCALE_FACTOR: 0.3
  SCALE_RATE: 1.25
  JOINT_IDS: [ ]
  FLIP_PAIRS: [ ]
  SKELETON: [ [ 0, 1 ], [ 1, 2 ], [ 2, 3 ], [ 3, 4 ], [ 0, 5 ], [ 5, 6 ], [ 6, 7 ], [ 7, 8 ], [ 5, 9 ], [ 9, 10 ], [ 10, 11 ], [ 11, 12 ], [ 9, 13 ], [ 13, 14 ], [ 14, 15 ], [ 15, 16 ], [ 13, 17 ], [ 17, 18 ], [ 18, 19 ], [ 19, 20 ], [ 0, 17 ] ]
</pre>
<p>The config file parameters are described below:</p>
<table>
<tr><th>Parameter</th><th>Type</th><th>Reference value</th><th>Description</th></tr>
<tr><td>WORKERS</td><td>int</td><td>8</td><td>Number of worker processes for data loading</td></tr>
<tr><td>PRINT_FREQ</td><td>int</td><td>10</td><td>Interval between log messages</td></tr>
<tr><td>DATASET</td><td>str</td><td>custom_coco</td><td>Dataset type; currently only the COCO data format is supported</td></tr>
<tr><td>TRAIN_FILE</td><td>List</td><td>-</td><td>List of training annotation files (COCO format); multiple datasets supported</td></tr>
<tr><td>TEST_FILE</td><td>string</td><td>-</td><td>Test annotation file (COCO format); only a single dataset is supported</td></tr>
<tr><td>FLIP</td><td>bool</td><td>True</td><td>Whether to flip images at test time; can improve test accuracy</td></tr>
<tr><td>ROT_FACTOR</td><td>float</td><td>45</td><td>Maximum random rotation angle for training data augmentation</td></tr>
<tr><td>SCALE_FACTOR</td><td>float</td><td>1.25</td><td>Image scaling factor</td></tr>
<tr><td>SCALE_RATE</td><td>float</td><td>0.25</td><td>Image scaling rate</td></tr>
<tr><td>JOINT_IDS</td><td>list</td><td>[ ]</td><td>[ ] means all keypoints; alternatively, list the IDs of the keypoints to train</td></tr>
<tr><td>FLIP_PAIRS</td><td>list</td><td>[ ]</td><td>IDs of keypoints that are unaffected when the image is flipped</td></tr>
<tr><td>SKELETON</td><td>list</td><td>[ ]</td><td>List of keypoint connections, used for visualization</td></tr>
</table>
(4) Start Training
<p>Once the config file is set up, training can begin.</p>
<p>Train a high-accuracy model, HRNet-w48 or HRNet-w32:</p>
<pre>
# High-accuracy model: HRNet-w48
python train.py -c "configs/coco/hrnet/w48_adam_hand_192_192.yaml" --workers=8 --batch_size=32 --gpu_id=0 --work_dir="work_space/hand"
# High-accuracy model: HRNet-w32
python train.py -c "configs/coco/hrnet/w32_adam_hand_192_192.yaml" --workers=8 --batch_size=32 --gpu_id=0 --work_dir="work_space/hand"
</pre>
<p>Train the lightweight LiteHRNet model:</p>
<pre>
# Lightweight model: LiteHRNet
python train.py -c "configs/coco/litehrnet/litehrnet18_hand_192_192.yaml" --workers=8 --batch_size=32 --gpu_id=0 --work_dir="work_space/hand"
</pre>
<p>Train the lightweight MobilenetV2 model:</p>
<pre>
# Lightweight model: Mobilenet
python train.py -c "configs/coco/mobilenet/mobilenetv2_hand_192_192.yaml" --workers=8 --batch_size=32 --gpu_id=0 --work_dir="work_space/hand"
</pre>
<p>The table below lists the computational cost, parameter count, and detection accuracy (AP) of HRNet and of the lightweight LiteHRNet and Mobilenet models. The high-accuracy HRNet-w32 reaches an AP of 0.8570, but its parameter count and computational cost are relatively large, making it unsuitable for mobile deployment. LiteHRNet18 and Mobilenet-v2 have small parameter counts and computational costs and are well suited to mobile deployment. <span>Although LiteHRNet18's theoretical computational cost and parameter count are lower than Mobilenet-v2's, Mobilenet-v2 runs faster in practice.</span> The lightweight Mobilenet-v2 model achieves real-time detection on an ordinary Android phone, at about 50 ms on CPU (4 threads) and about 30 ms on GPU, which basically meets typical business performance requirements.</p>
<table>
<tr><th>Model</th><th>input-size</th><th>params</th><th>FLOPs</th><th>AP</th></tr>
<tr><td>HRNet-w32</td><td>192×192</td><td>28.48M</td><td>5734.05M</td><td>0.8570</td></tr>
<tr><td>LiteHRNet18</td><td>192×192</td><td>1.10M</td><td>182.15M</td><td>0.8023</td></tr>
<tr><td>Mobilenet-v2</td><td>192×192</td><td>2.63M</td><td>529.25M</td><td>0.7574</td></tr>
</table>
(5) Visualizing Training with Tensorboard
<p>Training progress is visualized with Tensorboard. To use it, run in a terminal:</p>
<pre>
# basic usage
tensorboard --logdir=path/to/log/
# for example
tensorboard --logdir="work_space/hand/hrnet_w32_21_192_192_custom_coco_20231007_083128_2043/log"
</pre>
<p>Open the link that TensorBoard prints in the terminal to view the training logs and curves in a browser:</p>
6. Hand Keypoint Detection Results
<p>demo.py runs inference and tests the model: fill in the config file, the model file, and the test images, then run it. Its command-line arguments are described below:</p>
<table>
<tr><th>Parameter</th><th>Type</th><th>Reference value</th><th>Description</th></tr>
<tr><td>-c,--config_file</td><td>str</td><td>-</td><td>Config file</td></tr>
<tr><td>-m,--model_file</td><td>str</td><td>-</td><td>Model file</td></tr>
<tr><td>target</td><td>str</td><td>-</td><td>Skeleton type, e.g. hand, coco_person, mpii</td></tr>
<tr><td>image_dir</td><td>str</td><td>data/image</td><td>Path to test images</td></tr>
<tr><td>video_file</td><td>str,int</td><td>-</td><td>Test video file</td></tr>
<tr><td>out_dir</td><td>str</td><td>output</td><td>Where to save results; empty means do not save</td></tr>
<tr><td>threshold</td><td>float</td><td>0.3</td><td>Keypoint detection confidence threshold</td></tr>
<tr><td>device</td><td>str</td><td>cuda:0</td><td>GPU ID</td></tr>
</table>
<p>The examples below run HRNet-w32; for other models, just change --config_file or --model_file.</p>
<p>Test images:</p>
<pre>
python demo.py -c "work_space/hand/hrnet_w32_21_192_192_custom_coco_20231007_083128_2043/w32_adam_hand_192_192.yaml" -m "work_space/hand/hrnet_w32_21_192_192_custom_coco_20231007_083128_2043/model/best_model_189_0.8570.pth" --target "hand" --image_dir "data/hand" --out_dir "output"
</pre>
<p>Test a video file:</p>
<pre>
python demo.py -c "work_space/hand/hrnet_w32_21_192_192_custom_coco_20231007_083128_2043/w32_adam_hand_192_192.yaml" -m "work_space/hand/hrnet_w32_21_192_192_custom_coco_20231007_083128_2043/model/best_model_189_0.8570.pth" --target "hand" --video_file "data/hand/test-video.mp4" --out_dir "output"
</pre>
<p>Test a camera:</p>
<pre>
python demo.py -c "work_space/hand/hrnet_w32_21_192_192_custom_coco_20231007_083128_2043/w32_adam_hand_192_192.yaml" -m "work_space/hand/hrnet_w32_21_192_192_custom_coco_20231007_083128_2043/model/best_model_189_0.8570.pth" --target "hand" --video_file 0 --out_dir "output"
</pre>
<p>Results:</p>
7. Hand Keypoint Detection (Inference Code)
<p>To download the project source code, follow the WeChat account 【AI吃大瓜】 and reply 【手部关键点】.</p>
<p>The hand keypoint detection inference code includes:</p>
<p><strong>(1) Hand keypoint detection inference code (PyTorch)</strong></p>
<p>YOLOv5 hand detection inference code (training code not included)</p>
<p>Hand keypoint detection inference code, demo.py (training code not included)</p>
<p>High-accuracy HRNet hand keypoint detection (training code not included)</p>
<p>Lightweight LiteHRNet and Mobilenet-v2 hand keypoint detection (training code not included)</p>
<p>Pretrained models: HRNet-w32, LiteHRNet, and Mobilenet-v2; once the environment is set up, demo.py can be run directly</p>
<p>The inference code demo.py supports image, video, and camera testing</p>
<p>If you need the accompanying training datasets and training code, see the section below.</p>
8. Hand Keypoint Detection (Training Code)
<p>To download the project source code, follow the WeChat account 【AI吃大瓜】 and reply 【手部关键点】.</p>
<p>The hand keypoint detection training code includes: <strong>hand detection datasets and hand keypoint datasets + hand keypoint detection training and test code</strong></p>
<p><strong>(1) Hand detection datasets and hand keypoint datasets:</strong></p>
<p><strong>Hand detection datasets</strong>: Hand-voc1, Hand-voc2, and Hand-voc3, totaling 60000+ images; annotations converted to a unified VOC format with the label name hand; usable for developing deep learning hand detection models.</p>
<p><strong>Hand keypoint datasets</strong>: HandPose-v1, HandPose-v2, and HandPose-v3, totaling 80000+ images; hand bounding boxes are annotated with the label name hand, along with 21 hand keypoints; annotations converted to a unified COCO format, directly usable for training deep learning hand keypoint detection models.</p>
<p>For dataset details, see "Hand Keypoint (Hand Pose Estimation) Datasets (with download links)": https://blog.csdn.net/guyuealian/article/details/133277630</p>
<p><strong>(2) Hand keypoint detection training and test code (PyTorch)</strong></p>
<p>YOLOv5 hand detection inference code (training code not included)</p>
<p>The complete <strong>hand keypoint detection</strong> project, including the training code train.py and the inference/test code demo.py</p>
<p>Training and testing of the high-accuracy HRNet hand keypoint model</p>
<p>Training and testing of the lightweight LiteHRNet and Mobilenet-v2 hand keypoint models</p>
<p>Following this article, a little configuration is enough to start training: train.py</p>
<p>Pretrained models: HRNet-w32, LiteHRNet, and Mobilenet-v2; once the environment is set up, demo.py can be run directly</p>
<p>The test code demo.py supports image, video, and camera testing</p>
9. Hand Keypoint Detection C++/Android Versions
<p>Android <strong>hand keypoint detection (hand pose estimation)</strong> APP demo: https://download.csdn.net/download/guyuealian/88418582</p>
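<p>As a recap of the Top-Down design described in Section 2, the sketch below outlines the two-stage inference flow in Python. It is an illustrative sketch only, not this project's actual API: <span>detect_hands</span> and <span>estimate_keypoints</span> are hypothetical stand-ins for the YOLOv5 detector and the HRNet pose model, and the box expansion mirrors the SCALE_RATE-style padding applied before cropping a hand region.</p>

```python
import numpy as np


def expand_box(box, scale_rate=1.25, img_w=None, img_h=None):
    """Expand a detection box (x1, y1, x2, y2) around its center by
    `scale_rate`, optionally clipping to the image bounds. This mirrors
    the SCALE_RATE-style padding used when cropping hand regions."""
    x1, y1, x2, y2 = box
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    w, h = (x2 - x1) * scale_rate, (y2 - y1) * scale_rate
    nx1, ny1 = cx - w / 2.0, cy - h / 2.0
    nx2, ny2 = cx + w / 2.0, cy + h / 2.0
    if img_w is not None:
        nx1, nx2 = max(0.0, nx1), min(float(img_w), nx2)
    if img_h is not None:
        ny1, ny2 = max(0.0, ny1), min(float(img_h), ny2)
    return nx1, ny1, nx2, ny2


def top_down_pipeline(image, detect_hands, estimate_keypoints, scale_rate=1.25):
    """Two-stage Top-Down inference: detect hands, crop each expanded box,
    run the keypoint model on the crop, then map the points back to
    full-image coordinates.

    `detect_hands(image) -> list of (x1, y1, x2, y2)` and
    `estimate_keypoints(crop) -> (21, 2) array in crop coordinates` are
    hypothetical stand-ins for the detector and pose model."""
    h, w = image.shape[:2]
    results = []
    for box in detect_hands(image):
        x1, y1, x2, y2 = expand_box(box, scale_rate, img_w=w, img_h=h)
        crop = image[int(y1):int(y2), int(x1):int(x2)]
        kpts = np.asarray(estimate_keypoints(crop), dtype=np.float32)
        kpts[:, 0] += int(x1)  # shift crop coordinates back to image frame
        kpts[:, 1] += int(y1)
        results.append(((x1, y1, x2, y2), kpts))
    return results
```

<p>With real models plugged in, <span>top_down_pipeline(image, detector, pose_model)</span> yields one expanded box and one (21, 2) keypoint array per detected hand, all in full-image coordinates.</p>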

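<p>The SKELETON list in the config above wires the 21 hand keypoints together for visualization. The helper below is an illustrative sketch, not part of the project's code: it turns predicted keypoints into drawable line segments, skipping low-confidence points in the spirit of demo.py's --threshold option. The keypoint pairs are copied verbatim from the config.</p>

```python
import numpy as np

# The 21-keypoint hand skeleton from the config: wrist = 0, then four
# joints per finger; the pair [0, 17] closes the palm.
HAND_SKELETON = [
    [0, 1], [1, 2], [2, 3], [3, 4], [0, 5], [5, 6], [6, 7], [7, 8],
    [5, 9], [9, 10], [10, 11], [11, 12], [9, 13], [13, 14], [14, 15],
    [15, 16], [13, 17], [17, 18], [18, 19], [19, 20], [0, 17],
]


def skeleton_segments(keypoints, skeleton=HAND_SKELETON, threshold=0.0, scores=None):
    """Turn (21, 2) keypoints into a list of ((x1, y1), (x2, y2)) line
    segments, skipping any bone whose endpoint confidence falls below
    `threshold` when per-point `scores` are given."""
    keypoints = np.asarray(keypoints, dtype=np.float32)
    segments = []
    for a, b in skeleton:
        if scores is not None and (scores[a] < threshold or scores[b] < threshold):
            continue
        segments.append((tuple(keypoints[a]), tuple(keypoints[b])))
    return segments
```

<p>Each returned segment can then be drawn with, for example, OpenCV's cv2.line (opencv-python is already in requirements.txt).</p>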