# YoloV7

## Model Introduction

YOLOv7 is a YOLO-family object detection model released in 2022, proposed in the paper [YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors](https://arxiv.org/abs/2207.02696).

## Model Structure

The YoloV7 network consists of three parts: input, backbone, and head. Unlike yolov5, the neck and head layers are merged and referred to together as the head, although their functions are unchanged: as in yolov5, the backbone extracts features and the head makes predictions. yolov7 remains an anchor-based detector; it adds E-ELAN blocks to the architecture and incorporates REP (re-parameterization) layers to simplify later deployment. During training, an auxiliary detection head (Aux_detect) is added alongside the main head to assist detection.
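As a rough sketch of the anchor-based prediction just described (the anchor sizes, stride, and tensor shapes below are illustrative assumptions, not YOLOv7's exact configuration), one head output can be decoded into candidate boxes like this:

```python
import numpy as np

def decode_head(pred, anchors, stride, obj_thresh=0.5):
    """Decode one anchor-based YOLO head output into candidate boxes.

    pred:    (num_anchors, H, W, 5 + num_classes) raw head output
    anchors: (num_anchors, 2) anchor sizes in pixels (illustrative values)
    stride:  downsampling factor of this head relative to the input image
    """
    na, h, w, _ = pred.shape
    p = 1.0 / (1.0 + np.exp(-pred))  # sigmoid squashes logits into [0, 1]
    boxes = []
    for a in range(na):
        for y in range(h):
            for x in range(w):
                obj = p[a, y, x, 4]
                if obj < obj_thresh:  # objectness threshold
                    continue
                # YOLOv5/v7-style box decoding from grid cell + anchor
                cx = (p[a, y, x, 0] * 2 - 0.5 + x) * stride
                cy = (p[a, y, x, 1] * 2 - 0.5 + y) * stride
                bw = (p[a, y, x, 2] * 2) ** 2 * anchors[a, 0]
                bh = (p[a, y, x, 3] * 2) ** 2 * anchors[a, 1]
                cls = int(np.argmax(p[a, y, x, 5:]))
                boxes.append((cx, cy, bw, bh, obj, cls))
    return boxes
```

The full post-processing decodes every head at its own stride and then applies confidence filtering and NMS over the combined candidates.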

## Inference with Python
The following describes how to run the Python code example; a detailed description of the Python example can be found in Tutorial_Python.md under the Doc directory.

### Download the Image

The inference Docker image can be pulled from SourceFind (光源); the image recommended for the YoloV7 project is:

```
docker pull image.sourcefind.cn:5000/dcu/admin/base/custom:ort1.14.0_migraphx3.0.0-dtk22.10.1
```

### Set the Python Environment Variable

```
export PYTHONPATH=/opt/dtk/lib:$PYTHONPATH
```

### Install Dependencies

```
# Enter the Python example directory
cd <path_to_yolov7_migraphx>/Python

# Install dependencies
pip install -r requirements.txt
```

### Run the Example

The inference example program for the YoloV7 model is YoloV7_infer_migraphx.py. Run it from the Python directory with the following command:

```
python YoloV7_infer_migraphx.py \
	--imgpath <path to the test image> \
	--modelpath <path to the onnx model> \
	--objectThreshold <objectness threshold, default 0.5> \
	--confThreshold <confidence threshold, default 0.25> \
	--nmsThreshold <NMS threshold, default 0.5>
```
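The confidence and NMS thresholds passed to the script correspond to the two standard post-processing steps: confidence filtering and non-maximum suppression. A minimal NumPy sketch of that logic (independent of the example script) looks like this:

```python
import numpy as np

def nms(boxes, scores, conf_thresh=0.25, nms_thresh=0.5):
    """Confidence filtering followed by greedy non-maximum suppression.

    boxes:  (N, 4) array of [x1, y1, x2, y2]
    scores: (N,) confidence values
    Returns the kept boxes, highest score first.
    """
    keep_mask = scores >= conf_thresh            # drop low-confidence boxes
    boxes, scores = boxes[keep_mask], scores[keep_mask]
    order = scores.argsort()[::-1]               # indices, descending by score
    kept = []
    while order.size > 0:
        i = order[0]
        kept.append(boxes[i])
        rest = order[1:]
        if rest.size == 0:
            break
        # intersection of the current best box with all remaining boxes
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.maximum(0.0, x2 - x1) * np.maximum(0.0, y2 - y1)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)  # intersection over union
        order = rest[iou <= nms_thresh]          # suppress heavy overlaps
    return kept
```

Raising nmsThreshold keeps more overlapping boxes; raising confThreshold discards more low-confidence detections.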

When the program finishes, the YoloV7 detection result image is written to the current directory.

<img src="./Resource/Images/Result.jpg" alt="Result_2" style="zoom: 50%;" />

## Inference with C++

The following describes how to run the C++ code example; a detailed description of the C++ example can be found in Tutorial_Cpp.md under the Doc directory.

### Download the Image

Download the MIGraphX image from SourceFind (光源):

```
docker pull image.sourcefind.cn:5000/dcu/admin/base/custom:ort1.14.0_migraphx3.0.0-dtk22.10.1
```

### Install the OpenCV Dependencies

```
cd <path_to_migraphx_samples>
sh ./3rdParty/InstallOpenCVDependences.sh
```

### Modify CMakeLists.txt

- On Ubuntu, the dependency library path in CMakeLists.txt must be changed:
  replace "${CMAKE_CURRENT_SOURCE_DIR}/depend/lib64/" with "${CMAKE_CURRENT_SOURCE_DIR}/depend/lib/"

- **MIGraphX 2.3.0 and later require C++17**
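For orientation only, the two adjustments above might look roughly like this in CMake (a hedged sketch; the project's actual CMakeLists.txt may differ):

```
# Hypothetical excerpt, not the project's actual CMakeLists.txt
set(CMAKE_CXX_STANDARD 17)             # MIGraphX 2.3.0+ requires C++17
set(CMAKE_CXX_STANDARD_REQUIRED ON)

# CentOS keeps the dependencies under depend/lib64/, Ubuntu under depend/lib/
if(EXISTS "${CMAKE_CURRENT_SOURCE_DIR}/depend/lib64/")
    link_directories("${CMAKE_CURRENT_SOURCE_DIR}/depend/lib64/")
else()
    link_directories("${CMAKE_CURRENT_SOURCE_DIR}/depend/lib/")
endif()
```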


### Install OpenCV and Build the Project

```
rbuild build -d depend
```

### Set Environment Variables

Add the dependency libraries to the LD_LIBRARY_PATH environment variable by appending the following to ~/.bashrc:

**CentOS**:

```
export LD_LIBRARY_PATH=<path_to_yolov7_migraphx>/depend/lib64/:$LD_LIBRARY_PATH
```

**Ubuntu**:

```
export LD_LIBRARY_PATH=<path_to_yolov7_migraphx>/depend/lib/:$LD_LIBRARY_PATH
```

Then run:

```
source ~/.bashrc
```

### Run the Example

After the YoloV7 project has been built successfully, run the example with the following commands:

```
# Enter the yolov7 migraphx project root directory
cd <path_to_yolov7_migraphx>

# Enter the build directory
cd ./build/

# Run the example program
./YOLOV7
```

When the program finishes, the YoloV7 detection result image is written to the build directory.

<img src="./Resource/Images/Result.jpg" alt="Result" style="zoom:50%;" />

## Source Repository and Issue Reporting

https://developer.hpccube.com/codes/modelzoo/yolov7_migraphx

## References

https://github.com/WongKinYiu/yolov7