wangsen / paddle_dbnet / Commits

Commit 9874b6c0
Authored Nov 10, 2021 by tink2123
Parent: 08dcbba4

rename params txt

Showing 12 changed files with 126 additions and 6 deletions (+126 -6)
test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (+12 -0)
test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt (+18 -0)
test_tipc/configs/ppocr_det_server/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (+12 -0)
test_tipc/configs/ppocr_det_server/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt (+18 -0)
test_tipc/configs/ppocr_rec_mobile/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (+12 -0)
test_tipc/configs/ppocr_rec_mobile/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt (+18 -0)
test_tipc/configs/ppocr_rec_server/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (+12 -0)
test_tipc/configs/ppocr_rec_server/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt (+18 -0)
test_tipc/docs/test_paddle2onnx.md (+2 -2)
test_tipc/docs/test_serving.md (+2 -2)
test_tipc/test_paddle2onnx.sh (+1 -1)
test_tipc/test_serving.sh (+1 -1)
test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (new file, mode 100644)
===========================paddle2onnx_params===========================
2onnx: paddle2onnx
--model_dir:./inference/ch_ppocr_mobile_v2.0_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./inference/det_mobile_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_det.py
--use_gpu:True|False
--det_model_dir:
--image_dir:./inference/ch_det_data_50/all-sum-510/
\ No newline at end of file
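Each line above is a `key:value` pair that the test driver splits on the first colon and forwards as a command-line flag. As a rough sketch (paths and flags taken from the config above; the `python3` invocation is an assumption, and the looping over `--use_gpu:True|False` plus the filling of the intentionally blank `--det_model_dir:` are handled by `test_paddle2onnx.sh`), the conversion and inference steps amount to:

```shell
# Sketch only: commands approximated from the config above, not the literal script output.
# 1) Export the Paddle detection model to ONNX.
paddle2onnx --model_dir ./inference/ch_ppocr_mobile_v2.0_det_infer/ \
            --model_filename inference.pdmodel \
            --params_filename inference.pdiparams \
            --save_file ./inference/det_mobile_onnx/model.onnx \
            --opset_version 10 \
            --enable_onnx_checker True

# 2) Run text-detection inference; the test script substitutes the exported model path
#    into the blank --det_model_dir field (and may add further flags not listed here).
python3 tools/infer/predict_det.py \
        --use_gpu=False \
        --det_model_dir=./inference/det_mobile_onnx/model.onnx \
        --image_dir=./inference/ch_det_data_50/all-sum-510/
```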
test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt (new file)
===========================serving_params===========================
model_name:ocr_det_mobile
python:python3.7|cpp
trans_model:-m paddle_serving_client.convert
--dirname:./inference/ch_ppocr_mobile_v2.0_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--serving_server:./deploy/pdserving/ppocr_det_mobile_2.0_serving/
--serving_client:./deploy/pdserving/ppocr_det_mobile_2.0_client/
serving_dir:./deploy/pdserving
web_service:web_service_det.py --config=config.yml --opt op.det.concurrency=1
op.det.local_service_conf.devices:null|0
op.det.local_service_conf.use_mkldnn:True|False
op.det.local_service_conf.thread_num:1|6
op.det.local_service_conf.use_trt:False|True
op.det.local_service_conf.precision:fp32|fp16|int8
pipline:pipeline_rpc_client.py|pipeline_http_client.py
--image_dir:../../doc/imgs
\ No newline at end of file
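As with the paddle2onnx config, these fields are a parameter table rather than a script. A sketch of roughly what `test_serving.sh` expands them into, using the paths above (option lists such as `devices:null|0` or `precision:fp32|fp16|int8` are iterated by the script, one value per run):

```shell
# Sketch only: approximate commands assembled from the serving config above.
# 1) Convert the inference model into Paddle Serving server/client bundles.
python3.7 -m paddle_serving_client.convert \
          --dirname ./inference/ch_ppocr_mobile_v2.0_det_infer/ \
          --model_filename inference.pdmodel \
          --params_filename inference.pdiparams \
          --serving_server ./deploy/pdserving/ppocr_det_mobile_2.0_serving/ \
          --serving_client ./deploy/pdserving/ppocr_det_mobile_2.0_client/

# 2) Launch the detection web service and query it from the pipeline client.
cd ./deploy/pdserving
python3.7 web_service_det.py --config=config.yml --opt op.det.concurrency=1 &
python3.7 pipeline_http_client.py --image_dir=../../doc/imgs
```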
test_tipc/configs/ppocr_det_server/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (new file, mode 100644)
===========================paddle2onnx_params===========================
2onnx: paddle2onnx
--model_dir:./inference/ch_ppocr_server_v2.0_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./inference/det_server_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_det.py
--use_gpu:True|False
--det_model_dir:
--image_dir:./inference/det_inference
\ No newline at end of file
test_tipc/configs/ppocr_det_server/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt (new file, mode 100644)
===========================serving_params===========================
model_name:ocr_det_server
python:python3.7|cpp
trans_model:-m paddle_serving_client.convert
--dirname:./inference/ch_ppocr_server_v2.0_det_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--serving_server:./deploy/pdserving/ppocr_det_server_2.0_serving/
--serving_client:./deploy/pdserving/ppocr_det_server_2.0_client/
serving_dir:./deploy/pdserving
web_service:web_service_det.py --config=config.yml --opt op.det.concurrency=1
op.det.local_service_conf.devices:null|0
op.det.local_service_conf.use_mkldnn:True|False
op.det.local_service_conf.thread_num:1|6
op.det.local_service_conf.use_trt:False|True
op.det.local_service_conf.precision:fp32|fp16|int8
pipline:pipeline_rpc_client.py|pipeline_http_client.py
--image_dir:../../doc/imgs_words_en
\ No newline at end of file
test_tipc/configs/ppocr_rec_mobile/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (new file, mode 100644)
===========================paddle2onnx_params===========================
2onnx: paddle2onnx
--model_dir:./inference/ch_ppocr_mobile_v2.0_rec_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./inference/rec_mobile_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_rec.py
--use_gpu:True|False
--rec_model_dir:
--image_dir:./inference/rec_inference
\ No newline at end of file
test_tipc/configs/ppocr_rec_mobile/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt (new file, mode 100644)
===========================serving_params===========================
model_name:ocr_rec_mobile
python:python3.7|cpp
trans_model:-m paddle_serving_client.convert
--dirname:./inference/ch_ppocr_mobile_v2.0_rec_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--serving_server:./deploy/pdserving/ppocr_rec_mobile_2.0_serving/
--serving_client:./deploy/pdserving/ppocr_rec_mobile_2.0_client/
serving_dir:./deploy/pdserving
web_service:web_service_rec.py --config=config.yml --opt op.rec.concurrency=1
op.rec.local_service_conf.devices:null|0
op.rec.local_service_conf.use_mkldnn:True|False
op.rec.local_service_conf.thread_num:1|6
op.rec.local_service_conf.use_trt:False|True
op.rec.local_service_conf.precision:fp32|fp16|int8
pipline:pipeline_rpc_client.py|pipeline_http_client.py
--image_dir:../../doc/imgs_words_en
\ No newline at end of file
test_tipc/configs/ppocr_rec_server/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt (new file, mode 100644)
===========================paddle2onnx_params===========================
2onnx: paddle2onnx
--model_dir:./inference/ch_ppocr_server_v2.0_rec_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--save_file:./inference/rec_server_onnx/model.onnx
--opset_version:10
--enable_onnx_checker:True
inference:tools/infer/predict_rec.py
--use_gpu:True|False
--rec_model_dir:
--image_dir:./inference/rec_inference
\ No newline at end of file
test_tipc/configs/ppocr_rec_server/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt (new file, mode 100644)
===========================serving_params===========================
model_name:ocr_rec_server
python:python3.7
trans_model:-m paddle_serving_client.convert
--dirname:./inference/ch_ppocr_server_v2.0_rec_infer/
--model_filename:inference.pdmodel
--params_filename:inference.pdiparams
--serving_server:./deploy/pdserving/ppocr_rec_server_2.0_serving/
--serving_client:./deploy/pdserving/ppocr_rec_server_2.0_client/
serving_dir:./deploy/pdserving
web_service:web_service_rec.py --config=config.yml --opt op.rec.concurrency=1
op.rec.local_service_conf.devices:null|0
op.rec.local_service_conf.use_mkldnn:True|False
op.rec.local_service_conf.thread_num:1|6
op.rec.local_service_conf.use_trt:False|True
op.rec.local_service_conf.precision:fp32|fp16|int8
pipline:pipeline_rpc_client.py|pipeline_http_client.py
--image_dir:../../doc/imgs_words_en
\ No newline at end of file
test_tipc/docs/test_paddle2onnx.md

@@ -18,10 +18,10 @@ The main program of the PaddleServing prediction test is `test_paddle2onnx.sh`, which can test
 First run `prepare.sh` to prepare the data and models, then run `test_paddle2onnx.sh` to run the test; log files with the `paddle2onnx_infer_*.log` suffix are generated under the ```test_tipc/output``` directory.
 ```shell
-bash test_tipc/prepare.sh ./test_tipc/configs/ppocr_det_mobile_params.txt "paddle2onnx_infer"
+bash test_tipc/prepare.sh ./test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt "paddle2onnx_infer"
 # Usage:
-bash test_tipc/test_paddle2onnx.sh ./test_tipc/configs/ppocr_det_mobile_params.txt
+bash test_tipc/test_paddle2onnx.sh ./test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt
 ```
 #### Run results
test_tipc/docs/test_serving.md

@@ -20,10 +20,10 @@ The main program of the PaddleServing prediction test is `test_serving.sh`, which can test
 First run `prepare.sh` to prepare the data and models, then run `test_serving.sh` to run the test; log files with the `serving_infer_*.log` suffix are generated under the ```test_tipc/output``` directory.
 ```shell
-bash test_tipc/prepare.sh ./test_tipc/configs/ppocr_det_mobile_params.txt "serving_infer"
+bash test_tipc/prepare.sh ./test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt "serving_infer"
 # Usage:
-bash test_tipc/test_serving.sh ./test_tipc/configs/ppocr_det_mobile_params.txt
+bash test_tipc/test_serving.sh ./test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt
 ```
 #### Run results
test_tipc/test_paddle2onnx.sh

@@ -11,7 +11,7 @@ python=$(func_parser_value "${lines[2]}")
 # parser params
-dataline=$(awk 'NR==111, NR==123{print}' $FILENAME)
+dataline=$(awk 'NR==1, NR==12{print}' $FILENAME)
 IFS=$'\n'
 lines=(${dataline})
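The only functional change is the `awk` range: the paddle2onnx parameters now occupy lines 1-12 of their own config file instead of lines 111-123 of the old combined `ppocr_det_mobile_params.txt`. A minimal, self-contained sketch of this slice-and-parse pattern (the repository uses its `func_parser_value` helper, visible in the hunk header above; the `parse_value` function below is a hypothetical stand-in):

```shell
#!/bin/bash
# Sketch: read lines 1-12 of a key:value config and pull out individual values.
FILENAME=./test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_paddle2onnx_python_linux_cpu.txt

dataline=$(awk 'NR==1, NR==12{print}' "$FILENAME")
IFS=$'\n'
lines=(${dataline})

# Hypothetical helper: everything after the first ':' is the value.
parse_value() { echo "${1#*:}"; }

model_dir=$(parse_value "${lines[2]}")   # --model_dir:./inference/ch_ppocr_mobile_v2.0_det_infer/
save_file=$(parse_value "${lines[5]}")   # --save_file:./inference/det_mobile_onnx/model.onnx
echo "convert ${model_dir} -> ${save_file}"
```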
test_tipc/test_serving.sh

@@ -2,7 +2,7 @@
 source test_tipc/common_func.sh
 FILENAME=$1
-dataline=$(awk 'NR==67, NR==84{print}' $FILENAME)
+dataline=$(awk 'NR==1, NR==18{print}' $FILENAME)
 # parser params
 IFS=$'\n'
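Correspondingly, the serving parameters now sit on lines 1-18 of their own file rather than lines 67-84 of the old combined config. A small sanity-check sketch (not part of the repository) for keeping a config's line count and the script's `awk` range in step:

```shell
# Sketch: warn when the serving config no longer has the 18 lines test_serving.sh slices out.
cfg=./test_tipc/configs/ppocr_det_mobile/model_linux_gpu_normal_normal_serving_python_linux_gpu_cpu.txt
expected=18
actual=$(wc -l < "$cfg")
# The file in this commit has no trailing newline, so wc -l reports one line fewer.
if [ "$actual" -lt $((expected - 1)) ]; then
    echo "warning: $cfg has $actual lines; test_serving.sh expects about $expected"
fi
```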