Commit 0d97cc8c authored by Sugon_ldc

add new model

*.iml
.gradle
/local.properties
/.idea/caches
/.idea/libraries
/.idea/modules.xml
/.idea/workspace.xml
/.idea/navEditor.xml
/.idea/assetWizardSettings.xml
.DS_Store
/build
/captures
.externalNativeBuild
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
English | [简体中文](README_CN.md)
# Human Matting Android Demo
An Android demo of human matting based on the [MODNet](https://github.com/PaddlePaddle/PaddleSeg/tree/develop/contrib/Matting) algorithm from [PaddleSeg](https://github.com/paddlepaddle/paddleseg/tree/develop).
You can directly download and install the example project [apk](https://paddleseg.bj.bcebos.com/matting/models/deploy/app-debug.apk) to try it out.
## 1. Results
<div align="center">
<img src=https://user-images.githubusercontent.com/14087480/141890516-6aad4691-9ab3-4baf-99e5-f1afa1b21281.png width="50%">
</div>
## 2. Android Demo Instructions
### 2.1 Requirements
* Android Studio 3.4;
* An Android phone;
### 2.2 Installation
* Open Android Studio. On the "Welcome to Android Studio" window, click "Open an existing Android Studio project". In the path selection dialog, choose the folder containing this Android demo, then click the "Open" button in the lower right corner to import the project. The Paddle-Lite inference library required by the demo is downloaded automatically during the build;
* Connect an Android phone via USB;
* After the project loads, click Run -> Run 'App' on the menu bar, select the connected Android device in the "Select Deployment Target" window, and click the "OK" button;
*Note: this Android demo is based on [Paddle-Lite](https://paddlelite.paddlepaddle.org.cn/); the Paddle-Lite version is 2.8.0.*
### 2.3 Prediction
* In the human matting demo, a human image is loaded by default, and the CPU prediction result and prediction latency are shown below the image;
* You can also load test images from the album or camera by clicking the "Open local album" and "Open camera to take photos" buttons in the upper right corner, and then run prediction.
*Note: photos taken inside the demo are compressed automatically. To test the effect of an uncompressed photo, take it with the phone's camera app and open it from the album for prediction.*
## 3. Secondary Development
You can update the inference library or the model as needed for secondary development. Updating the model involves two steps: model export and model conversion.
### 3.1 Update Inference Library
The [Paddle-Lite website](https://paddlelite.paddlepaddle.org.cn/) provides a pre-compiled Android inference library; you can also compile it yourself by following the official website.
The Paddle-Lite inference library on Android mainly consists of three files:
* PaddlePredictor.jar;
* arm64-v8a/libpaddle_lite_jni.so;
* armeabi-v7a/libpaddle_lite_jni.so;
The following introduces two approaches:
* Use a pre-compiled inference library. For the latest pre-compiled files, see [release](https://github.com/PaddlePaddle/Paddle-Lite/releases/). This demo uses this [version](https://paddlelite-demo.bj.bcebos.com/libs/android/paddle_lite_libs_v2_8_0.tar.gz).
After uncompressing the archive, PaddlePredictor.jar is at java/PaddlePredictor.jar;
the arm64-v8a .so file is in java/libs/arm64-v8a;
the armeabi-v7a .so file is in java/libs/armeabi-v7a;
* Manually compile the Paddle-Lite inference library.
For development environment setup and build instructions, see [Paddle-Lite source compilation](https://paddle-lite.readthedocs.io/zh/release-v2.8/source_compile/compile_env.html).
With the files above in place, refer to the [java_api](https://paddle-lite.readthedocs.io/zh/release-v2.8/api_reference/java_api_doc.html) documentation to run inference on Android. For details on using the inference library, see the Update Inference Library section of [Paddle-Lite-Demo](https://github.com/PaddlePaddle/Paddle-Lite-Demo).
### 3.2 Model Export
This demo uses MODNet with an HRNet_W18 backbone for human matting. For the model [training tutorial](https://github.com/PaddlePaddle/PaddleSeg/tree/develop/contrib/Matting), please refer to the official website, which provides models with three different backbones: MobileNetV2, ResNet50_vd, and HRNet_W18. Balancing accuracy and speed, this Android demo uses HRNet_W18 as the backbone. You can download the trained dynamic graph model directly from the official website for algorithm verification.
To run inference on Android phones, the dynamic graph model must be exported as a static graph model, with a fixed image input size at export time.
First, pull the latest [PaddleSeg](https://github.com/paddlepaddle/paddleseg/tree/develop) repository and `cd` into the `PaddleSeg/contrib/Matting` directory. Place the downloaded modnet-hrnet_w18.pdparams file (training it yourself also works) in that directory (`PaddleSeg/contrib/Matting`). Then modify the config file `configs/modnet_mobilenetv2.yml` (note: although the HRNet_W18 model is used, its config file `modnet_hrnet_w18.yml` inherits from `modnet_mobilenetv2.yml`), changing the val_dataset field as follows:
``` yml
val_dataset:
  type: MattingDataset
  dataset_root: data/PPM-100
  val_file: val.txt
  transforms:
    - type: LoadImages
    - type: ResizeByShort
      short_size: 256
    - type: ResizeToIntMult
      mult_int: 32
    - type: Normalize
  mode: val
  get_trimap: False
```
In the modification above, pay special attention to the `short_size: 256` field, which directly determines the size of the final inference image. Setting it too small hurts prediction accuracy; setting it too large slows down inference on the phone (and may even crash it due to performance limits). In practical testing, 256 works well for HRNet_W18.
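As an illustrative sketch (a simplification, not the actual PaddleSeg transform code), assuming `ResizeByShort` scales the shorter side to `short_size` and `ResizeToIntMult` rounds each side down to a multiple of `mult_int`, the final inference size of a photo can be estimated like this:

``` python
def inference_size(height, width, short_size=256, mult_int=32):
    # ResizeByShort: scale so the shorter side becomes short_size.
    scale = short_size / min(height, width)
    h, w = round(height * scale), round(width * scale)
    # ResizeToIntMult: round each side down to a multiple of mult_int.
    return (h // mult_int) * mult_int, (w // mult_int) * mult_int

print(inference_size(1080, 1920))  # a typical 1920x1080 photo -> (256, 448)
```

Doubling `short_size` roughly quadruples the number of pixels the model has to process, which is why overly large values slow down or crash low-end phones.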
After modifying the configuration file, run the following command to export the static graph:
``` shell
python export.py \
--config configs/modnet/modnet_hrnet_w18.yml \
--model_path modnet-hrnet_w18.pdparams \
--save_dir output
```
After export, an `output` folder is generated in the current directory; the files inside it are the static graph files.
### 3.3 Model Conversion
#### 3.3.1 Model Conversion Tool
Once the static graph model and parameter files exported from PaddleSeg are ready, use the opt tool provided with Paddle-Lite to optimize the model and convert it to the file format Paddle-Lite supports.
Firstly, install PaddleLite:
``` shell
pip install paddlelite==2.8.0
```
Then use the following Python script to convert:
``` python
# Reference the Paddlelite inference library
from paddlelite.lite import *
# 1. Create opt instance
opt=Opt()
# 2. Specify the static model path
opt.set_model_file('./output/model.pdmodel')
opt.set_param_file('./output/model.pdiparams')
# 3. Specify conversion type: arm, x86, opencl or npu
opt.set_valid_places("arm")
# 4. Specifies the model transformation type: naive_buffer or protobuf
opt.set_model_type("naive_buffer")
# 5. Address of output model
opt.set_optimize_out("./output/hrnet_w18")
# 6. Perform model optimization
opt.run()
```
After conversion, the `hrnet_w18.nb` file will be generated in the `output` directory.
#### 3.3.2 Update Model
Use the optimized `.nb` file to replace the file under `app/src/main/assets/image_matting/models/modnet` in the Android project.
Then change the image input size in the project: open the string.xml file and modify it as in the following example:
``` xml
<string name="INPUT_SHAPE_DEFAULT">1,3,256,256</string>
```
1,3,256,256 are the batch size, channel, height, and width, respectively. Generally only height and width need changing, and they must match the size set during model export.
The entire Android demo is implemented in Java with no embedded C++ code, so it is relatively easy to build and run. In the future, this demo could also be ported to a Java web project for human matting on the web.
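Conceptually, the app parses this comma-separated string into NCHW dimensions; a minimal sketch (written in Python for brevity rather than the demo's Java, and independent of the actual demo code):

``` python
shape = "1,3,256,256"  # the INPUT_SHAPE_DEFAULT value from string.xml
n, c, h, w = (int(v) for v in shape.split(","))
print(n, c, h, w)  # -> 1 3 256 256
```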
Simplified Chinese | [English](README.md)
# Human Matting Android Demo
An Android demo of human matting based on the [MODNet](https://github.com/PaddlePaddle/PaddleSeg/tree/develop/contrib/Matting) algorithm from [PaddleSeg](https://github.com/paddlepaddle/paddleseg/tree/develop).
You can directly download and install the example project [apk](https://paddleseg.bj.bcebos.com/matting/models/deploy/app-debug.apk) to try it out.
## 1. Results
<div align="center">
<img src=https://user-images.githubusercontent.com/14087480/141890516-6aad4691-9ab3-4baf-99e5-f1afa1b21281.png width="50%">
</div>
## 2. Android Demo Instructions
### 2.1 Requirements
* Android Studio 3.4;
* An Android phone;
### 2.2 Installation
* Open Android Studio. On the "Welcome to Android Studio" window, click "Open an existing Android Studio project". In the path selection dialog, choose the folder containing this Android demo, then click the "Open" button in the lower right corner to import the project. The Paddle-Lite inference library required by the demo is downloaded automatically during the build;
* Connect an Android phone via USB;
* After the project loads, click Run -> Run 'App' on the menu bar, select the connected Android device in the "Select Deployment Target" window, and click the "OK" button;
*Note: this Android demo is based on [Paddle-Lite](https://paddlelite.paddlepaddle.org.cn/); the Paddle-Lite version is 2.8.0.*
### 2.3 Prediction
* In the human matting demo, a human image is loaded by default, and the CPU prediction result and prediction latency are shown below the image;
* You can also load test images from the album or camera by clicking the "Open local album" and "Open camera to take photos" buttons in the upper right corner, and then run prediction;
*Note: photos taken inside the demo are compressed automatically. To test the effect of an uncompressed photo, take it with the phone's camera app and open it from the album for prediction.*
## 3. Secondary Development
You can update the inference library or the model as needed for secondary development. Updating the model involves two steps: model export and model conversion.
### 3.1 Update Inference Library
The [Paddle-Lite website](https://paddlelite.paddlepaddle.org.cn/) provides a pre-compiled Android inference library; you can also compile it yourself by following the official website.
The Paddle-Lite inference library on Android mainly consists of three files:
* PaddlePredictor.jar;
* arm64-v8a/libpaddle_lite_jni.so;
* armeabi-v7a/libpaddle_lite_jni.so;
The following introduces two approaches:
* Use a pre-compiled inference library. For the latest pre-compiled files, see [release](https://github.com/PaddlePaddle/Paddle-Lite/releases/). This demo uses this [version](https://paddlelite-demo.bj.bcebos.com/libs/android/paddle_lite_libs_v2_8_0.tar.gz).
After uncompressing the archive, PaddlePredictor.jar is at java/PaddlePredictor.jar;
the arm64-v8a .so file is in java/libs/arm64-v8a;
the armeabi-v7a .so file is in java/libs/armeabi-v7a;
* Manually compile the Paddle-Lite inference library.
For development environment setup and build instructions, see [Paddle-Lite source compilation](https://paddle-lite.readthedocs.io/zh/release-v2.8/source_compile/compile_env.html).
With the files above in place, refer to the [java_api](https://paddle-lite.readthedocs.io/zh/release-v2.8/api_reference/java_api_doc.html) documentation to run inference on Android. For details on using the inference library, see the Update Inference Library section of [Paddle-Lite-Demo](https://github.com/PaddlePaddle/Paddle-Lite-Demo).
### 3.2 Model Export
This demo uses MODNet with an HRNet_W18 backbone for human matting. For the model [training tutorial](https://github.com/PaddlePaddle/PaddleSeg/tree/develop/contrib/Matting), please refer to the official website, which provides models with three different backbones: MobileNetV2, ResNet50_vd, and HRNet_W18. Balancing accuracy and speed, this Android demo uses HRNet_W18 as the backbone. You can download the trained dynamic graph model directly from the official website for algorithm verification.
To run inference on Android phones, the dynamic graph model must be exported as a static graph model, with a fixed image input size at export time.
First, pull the latest [PaddleSeg](https://github.com/paddlepaddle/paddleseg/tree/develop) repository and `cd` into the `PaddleSeg/contrib/Matting` directory. Place the downloaded modnet-hrnet_w18.pdparams file (training it yourself also works) in that directory (`PaddleSeg/contrib/Matting`). Then modify the config file `configs/modnet_mobilenetv2.yml` (note: although the HRNet_W18 model is used, its config file `modnet_hrnet_w18.yml` inherits from `modnet_mobilenetv2.yml`), changing the val_dataset field as follows:
``` yml
val_dataset:
  type: MattingDataset
  dataset_root: data/PPM-100
  val_file: val.txt
  transforms:
    - type: LoadImages
    - type: ResizeByShort
      short_size: 256
    - type: ResizeToIntMult
      mult_int: 32
    - type: Normalize
  mode: val
  get_trimap: False
```
In the modification above, pay special attention to the `short_size: 256` field, which directly determines the size of the final inference image. Setting it too small hurts prediction accuracy; setting it too large slows down inference on the phone (and may even crash it due to performance limits). In practical testing, 256 works well for HRNet_W18.
After modifying the configuration file, run the following command to export the static graph:
``` shell
python export.py \
--config configs/modnet/modnet_hrnet_w18.yml \
--model_path modnet-hrnet_w18.pdparams \
--save_dir output
```
After export, an `output` folder is generated in the current directory; the files inside it are the static graph files.
### 3.3 Model Conversion
#### 3.3.1 Model Conversion Tool
Once the static graph model and parameter files exported from PaddleSeg are ready, use the opt tool provided with Paddle-Lite to optimize the model and convert it to the file format Paddle-Lite supports.
First, install PaddleLite:
``` shell
pip install paddlelite==2.8.0
```
Then use the following Python script to convert the model:
``` python
# Import the Paddle-Lite inference library
from paddlelite.lite import *
# 1. Create an opt instance
opt = Opt()
# 2. Specify the static graph model paths
opt.set_model_file('./output/model.pdmodel')
opt.set_param_file('./output/model.pdiparams')
# 3. Specify the target: arm, x86, opencl or npu
opt.set_valid_places("arm")
# 4. Specify the output model type: naive_buffer or protobuf
opt.set_model_type("naive_buffer")
# 5. Output model path
opt.set_optimize_out("./output/hrnet_w18")
# 6. Run model optimization
opt.run()
```
After conversion, the `hrnet_w18.nb` file is generated in the `output` directory.
#### 3.3.2 Update Model
Use the optimized `.nb` file to replace the file under `app/src/main/assets/image_matting/models/modnet` in the Android project.
Then change the image input size in the project: open the string.xml file and modify it as in the following example:
``` xml
<string name="INPUT_SHAPE_DEFAULT">1,3,256,256</string>
```
1,3,256,256 are the batch size, channel, height, and width, respectively. Generally only height and width need changing, and they must match the size set during model export.
The entire Android demo is implemented in Java with no embedded C++ code, so it is relatively easy to build and run. In the future, this demo could also be ported to a Java web project for human matting on the web.
import java.security.MessageDigest
apply plugin: 'com.android.application'
android {
compileSdkVersion 28
defaultConfig {
applicationId "com.baidu.paddleseg.lite.demo.human_matting"
minSdkVersion 15
versionCode 1
versionName "1.0"
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
}
dependencies {
implementation fileTree(include: ['*.jar'], dir: 'libs')
implementation 'com.android.support:appcompat-v7:28.0.0'
implementation 'com.android.support.constraint:constraint-layout:1.1.3'
implementation 'com.android.support:design:28.0.0'
testImplementation 'junit:junit:4.12'
androidTestImplementation 'com.android.support.test:runner:1.0.2'
androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.2'
    implementation files('libs/PaddlePredictor.jar') // add the PaddlePredictor jar from the Paddle-Lite libs
}
def paddleLiteLibs = 'https://paddlelite-demo.bj.bcebos.com/libs/android/paddle_lite_libs_v2_8_0.tar.gz'
task downloadAndExtractPaddleLiteLibs(type: DefaultTask) {
doFirst {
println "Downloading and extracting Paddle Lite libs"
}
doLast {
// Prepare cache folder for libs
if (!file("cache").exists()) {
mkdir "cache"
}
// Generate cache name for libs
MessageDigest messageDigest = MessageDigest.getInstance('MD5')
messageDigest.update(paddleLiteLibs.bytes)
String cacheName = new BigInteger(1, messageDigest.digest()).toString(32)
// Download libs
if (!file("cache/${cacheName}.tar.gz").exists()) {
ant.get(src: paddleLiteLibs, dest: file("cache/${cacheName}.tar.gz"))
}
// Unpack libs
copy {
from tarTree("cache/${cacheName}.tar.gz")
into "cache/${cacheName}"
}
// Copy PaddlePredictor.jar
if (!file("libs/PaddlePredictor.jar").exists()) {
copy {
from "cache/${cacheName}/java/PaddlePredictor.jar"
into "libs"
}
}
// Copy libpaddle_lite_jni.so for armeabi-v7a and arm64-v8a
if (!file("src/main/jniLibs/armeabi-v7a/libpaddle_lite_jni.so").exists()) {
copy {
from "cache/${cacheName}/java/libs/armeabi-v7a/"
into "src/main/jniLibs/armeabi-v7a"
}
}
if (!file("src/main/jniLibs/arm64-v8a/libpaddle_lite_jni.so").exists()) {
copy {
from "cache/${cacheName}/java/libs/arm64-v8a/"
into "src/main/jniLibs/arm64-v8a"
}
}
}
}
preBuild.dependsOn downloadAndExtractPaddleLiteLibs
def paddleLiteModels = [
[
'src' : 'https://paddleseg.bj.bcebos.com/matting/models/deploy/hrnet_w18.tar.gz',
'dest' : 'src/main/assets/image_matting/models/modnet'
],
]
task downloadAndExtractPaddleLiteModels(type: DefaultTask) {
doFirst {
println "Downloading and extracting Paddle Lite models"
}
doLast {
// Prepare cache folder for models
if (!file("cache").exists()) {
mkdir "cache"
}
paddleLiteModels.eachWithIndex { model, index ->
MessageDigest messageDigest = MessageDigest.getInstance('MD5')
messageDigest.update(model.src.bytes)
String cacheName = new BigInteger(1, messageDigest.digest()).toString(32)
// Download model file
if (!file("cache/${cacheName}.tar.gz").exists()) {
ant.get(src: model.src, dest: file("cache/${cacheName}.tar.gz"))
}
// Unpack model file
copy {
from tarTree("cache/${cacheName}.tar.gz")
into "cache/${cacheName}"
}
// Copy model file
if (!file("${model.dest}/hrnet_w18.nb").exists()) {
copy {
from "cache/${cacheName}"
into "${model.dest}"
}
}
}
}
}
preBuild.dependsOn downloadAndExtractPaddleLiteModels
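For reference, the cache file names used by the Gradle download tasks above are `MD5(url)` rendered in base 32 via Java's `BigInteger.toString(32)`. If you need to locate a cached archive under `cache/` manually, this small Python helper (an assumption-free mirror of the Gradle logic, not part of the project) reproduces the same name:

``` python
import hashlib

DIGITS = "0123456789abcdefghijklmnopqrstuv"  # BigInteger.toString(32) digit set

def cache_name(url):
    # Mirror `new BigInteger(1, messageDigest.digest()).toString(32)`.
    n = int.from_bytes(hashlib.md5(url.encode()).digest(), "big")
    if n == 0:
        return "0"
    digits = []
    while n:
        n, r = divmod(n, 32)
        digits.append(DIGITS[r])
    return "".join(reversed(digits))

print(cache_name("https://paddlelite-demo.bj.bcebos.com/libs/android/paddle_lite_libs_v2_8_0.tar.gz"))
```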
#!/usr/bin/env sh
##############################################################################
##
## Gradle start up script for UN*X
##
##############################################################################
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
PRG="$0"
# Need this for relative symlinks.
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG=`dirname "$PRG"`"/$link"
fi
done
SAVED="`pwd`"
cd "`dirname \"$PRG\"`/" >/dev/null
APP_HOME="`pwd -P`"
cd "$SAVED" >/dev/null
APP_NAME="Gradle"
APP_BASE_NAME=`basename "$0"`
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS=""
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD="maximum"
warn () {
echo "$*"
}
die () {
echo
echo "$*"
echo
exit 1
}
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
nonstop=false
case "`uname`" in
CYGWIN* )
cygwin=true
;;
Darwin* )
darwin=true
;;
MINGW* )
msys=true
;;
NONSTOP* )
nonstop=true
;;
esac
CLASSPATH=$APP_HOME/gradle/wrapper/gradle-wrapper.jar
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD="java"
which java >/dev/null 2>&1 || die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
# Increase the maximum file descriptors if we can.
if [ "$cygwin" = "false" -a "$darwin" = "false" -a "$nonstop" = "false" ] ; then
MAX_FD_LIMIT=`ulimit -H -n`
if [ $? -eq 0 ] ; then
if [ "$MAX_FD" = "maximum" -o "$MAX_FD" = "max" ] ; then
MAX_FD="$MAX_FD_LIMIT"
fi
ulimit -n $MAX_FD
if [ $? -ne 0 ] ; then
warn "Could not set maximum file descriptor limit: $MAX_FD"
fi
else
warn "Could not query maximum file descriptor limit: $MAX_FD_LIMIT"
fi
fi
# For Darwin, add options to specify how the application appears in the dock
if $darwin; then
GRADLE_OPTS="$GRADLE_OPTS \"-Xdock:name=$APP_NAME\" \"-Xdock:icon=$APP_HOME/media/gradle.icns\""
fi
# For Cygwin, switch paths to Windows format before running java
if $cygwin ; then
APP_HOME=`cygpath --path --mixed "$APP_HOME"`
CLASSPATH=`cygpath --path --mixed "$CLASSPATH"`
JAVACMD=`cygpath --unix "$JAVACMD"`
# We build the pattern for arguments to be converted via cygpath
ROOTDIRSRAW=`find -L / -maxdepth 1 -mindepth 1 -type d 2>/dev/null`
SEP=""
for dir in $ROOTDIRSRAW ; do
ROOTDIRS="$ROOTDIRS$SEP$dir"
SEP="|"
done
OURCYGPATTERN="(^($ROOTDIRS))"
# Add a user-defined pattern to the cygpath arguments
if [ "$GRADLE_CYGPATTERN" != "" ] ; then
OURCYGPATTERN="$OURCYGPATTERN|($GRADLE_CYGPATTERN)"
fi
# Now convert the arguments - kludge to limit ourselves to /bin/sh
i=0
for arg in "$@" ; do
CHECK=`echo "$arg"|egrep -c "$OURCYGPATTERN" -`
CHECK2=`echo "$arg"|egrep -c "^-"` ### Determine if an option
if [ $CHECK -ne 0 ] && [ $CHECK2 -eq 0 ] ; then ### Added a condition
eval `echo args$i`=`cygpath --path --ignore --mixed "$arg"`
else
eval `echo args$i`="\"$arg\""
fi
i=$((i+1))
done
case $i in
(0) set -- ;;
(1) set -- "$args0" ;;
(2) set -- "$args0" "$args1" ;;
(3) set -- "$args0" "$args1" "$args2" ;;
(4) set -- "$args0" "$args1" "$args2" "$args3" ;;
(5) set -- "$args0" "$args1" "$args2" "$args3" "$args4" ;;
(6) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" ;;
(7) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" ;;
(8) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" ;;
(9) set -- "$args0" "$args1" "$args2" "$args3" "$args4" "$args5" "$args6" "$args7" "$args8" ;;
esac
fi
# Escape application args
save () {
for i do printf %s\\n "$i" | sed "s/'/'\\\\''/g;1s/^/'/;\$s/\$/' \\\\/" ; done
echo " "
}
APP_ARGS=$(save "$@")
# Collect all arguments for the java command, following the shell quoting and substitution rules
eval set -- $DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS "\"-Dorg.gradle.appname=$APP_BASE_NAME\"" -classpath "\"$CLASSPATH\"" org.gradle.wrapper.GradleWrapperMain "$APP_ARGS"
# by default we should be in the correct project dir, but when run from Finder on Mac, the cwd is wrong
if [ "$(uname)" = "Darwin" ] && [ "$HOME" = "$PWD" ]; then
cd "$(dirname "$0")"
fi
exec "$JAVACMD" "$@"
@if "%DEBUG%" == "" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
set DIRNAME=%~dp0
if "%DIRNAME%" == "" set DIRNAME=.
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS=
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if "%ERRORLEVEL%" == "0" goto init
echo.
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto init
echo.
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME%
echo.
echo Please set the JAVA_HOME variable in your environment to match the
echo location of your Java installation.
goto fail
:init
@rem Get command-line arguments, handling Windows variants
if not "%OS%" == "Windows_NT" goto win9xME_args
:win9xME_args
@rem Slurp the command line arguments.
set CMD_LINE_ARGS=
set _SKIP=2
:win9xME_args_slurp
if "x%~1" == "x" goto execute
set CMD_LINE_ARGS=%*
:execute
@rem Setup the command line
set CLASSPATH=%APP_HOME%\gradle\wrapper\gradle-wrapper.jar
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" org.gradle.wrapper.GradleWrapperMain %CMD_LINE_ARGS%
:end
@rem End local scope for the variables with windows NT shell
if "%ERRORLEVEL%"=="0" goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
if not "" == "%GRADLE_EXIT_CONSOLE%" exit 1
exit /b 1
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega
# This file must *NOT* be checked into Version Control Systems,
# as it contains information specific to your local configuration.
#
# Location of the SDK. This is only used by Gradle.
# For customization when using a Version Control System, please read the
# header note.
#Mon Nov 25 17:01:52 CST 2019
sdk.dir=/Users/chenlingchi/Library/Android/sdk
# Add project specific ProGuard rules here.
# You can control the set of applied configuration files using the
# proguardFiles setting in build.gradle.
#
# For more details, see
# http://developer.android.com/guide/developing/tools/proguard.html
# If your project uses WebView with JS, uncomment the following
# and specify the fully qualified class name to the JavaScript interface
# class:
#-keepclassmembers class fqcn.of.javascript.interface.for.webview {
# public *;
#}
# Uncomment this to preserve the line number information for
# debugging stack traces.
#-keepattributes SourceFile,LineNumberTable
# If you keep the line number information, uncomment this to
# hide the original source file name.
#-renamesourcefileattribute SourceFile
package com.baidu.paddle.lite.demo;
import android.content.Context;
import android.support.test.InstrumentationRegistry;
import android.support.test.runner.AndroidJUnit4;
import org.junit.Test;
import org.junit.runner.RunWith;
import static org.junit.Assert.*;
/**
* Instrumented test, which will execute on an Android device.
*
* @see <a href="http://d.android.com/tools/testing">Testing documentation</a>
*/
@RunWith(AndroidJUnit4.class)
public class ExampleInstrumentedTest {
@Test
public void useAppContext() {
// Context of the app under test.
Context appContext = InstrumentationRegistry.getTargetContext();
assertEquals("com.baidu.paddle.lite.demo", appContext.getPackageName());
}
}
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.paddle.demo.matting">
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.CAMERA"/>
<application
android:allowBackup="true"
android:largeHeap="true"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/AppTheme">
<activity android:name=".MainActivity">
<intent-filter>
<action android:name="android.intent.action.MAIN"/>
<category android:name="android.intent.category.LAUNCHER"/>
</intent-filter>
</activity>
<activity
android:name=".SettingsActivity"
android:label="Settings">
</activity>
</application>
</manifest>
/*
* Copyright (C) 2014 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.paddle.demo.matting;
import android.content.res.Configuration;
import android.os.Bundle;
import android.preference.PreferenceActivity;
import android.support.annotation.LayoutRes;
import android.support.annotation.Nullable;
import android.support.v7.app.ActionBar;
import android.support.v7.app.AppCompatDelegate;
import android.support.v7.widget.Toolbar;
import android.view.MenuInflater;
import android.view.View;
import android.view.ViewGroup;
/**
* A {@link android.preference.PreferenceActivity} which implements and proxies the necessary calls
* to be used with AppCompat.
* <p>
* This technique can be used with an {@link android.app.Activity} class, not just
* {@link android.preference.PreferenceActivity}.
*/
public abstract class AppCompatPreferenceActivity extends PreferenceActivity {
private AppCompatDelegate mDelegate;
@Override
protected void onCreate(Bundle savedInstanceState) {
getDelegate().installViewFactory();
getDelegate().onCreate(savedInstanceState);
super.onCreate(savedInstanceState);
}
@Override
protected void onPostCreate(Bundle savedInstanceState) {
super.onPostCreate(savedInstanceState);
getDelegate().onPostCreate(savedInstanceState);
}
public ActionBar getSupportActionBar() {
return getDelegate().getSupportActionBar();
}
public void setSupportActionBar(@Nullable Toolbar toolbar) {
getDelegate().setSupportActionBar(toolbar);
}
@Override
public MenuInflater getMenuInflater() {
return getDelegate().getMenuInflater();
}
@Override
public void setContentView(@LayoutRes int layoutResID) {
getDelegate().setContentView(layoutResID);
}
@Override
public void setContentView(View view) {
getDelegate().setContentView(view);
}
@Override
public void setContentView(View view, ViewGroup.LayoutParams params) {
getDelegate().setContentView(view, params);
}
@Override
public void addContentView(View view, ViewGroup.LayoutParams params) {
getDelegate().addContentView(view, params);
}
@Override
protected void onPostResume() {
super.onPostResume();
getDelegate().onPostResume();
}
@Override
protected void onTitleChanged(CharSequence title, int color) {
super.onTitleChanged(title, color);
getDelegate().setTitle(title);
}
@Override
public void onConfigurationChanged(Configuration newConfig) {
super.onConfigurationChanged(newConfig);
getDelegate().onConfigurationChanged(newConfig);
}
@Override
protected void onStop() {
super.onStop();
getDelegate().onStop();
}
@Override
protected void onDestroy() {
super.onDestroy();
getDelegate().onDestroy();
}
public void invalidateOptionsMenu() {
getDelegate().invalidateOptionsMenu();
}
private AppCompatDelegate getDelegate() {
if (mDelegate == null) {
mDelegate = AppCompatDelegate.create(this, null);
}
return mDelegate;
}
}
package com.paddle.demo.matting;
import android.Manifest;
import android.app.ProgressDialog;
import android.content.ContentResolver;
import android.content.ContentValues;
import android.content.Intent;
import android.content.SharedPreferences;
import android.content.pm.PackageManager;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.Bundle;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.os.Message;
import android.preference.PreferenceManager;
import android.provider.MediaStore;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v7.app.AppCompatActivity;
import android.text.method.ScrollingMovementMethod;
import android.util.Log;
import android.view.Display;
import android.view.Menu;
import android.view.MenuInflater;
import android.view.MenuItem;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;
import com.paddle.demo.matting.config.Config;
import com.paddle.demo.matting.preprocess.Preprocess;
import com.paddle.demo.matting.visual.Visualize;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
public class MainActivity extends AppCompatActivity {
private static final String TAG = MainActivity.class.getSimpleName();
// Image source request codes
public static final int OPEN_GALLERY_REQUEST_CODE = 0; // local gallery
public static final int TAKE_PHOTO_REQUEST_CODE = 1; // camera capture
// Model inference request/response codes
public static final int REQUEST_LOAD_MODEL = 0;
public static final int REQUEST_RUN_MODEL = 1;
public static final int RESPONSE_LOAD_MODEL_SUCCESSED = 0;
public static final int RESPONSE_LOAD_MODEL_FAILED = 1;
public static final int RESPONSE_RUN_MODEL_SUCCESSED = 2;
public static final int RESPONSE_RUN_MODEL_FAILED = 3;
protected ProgressDialog pbLoadModel = null;
protected ProgressDialog pbRunModel = null;
// Thread handles for the processing pipeline
protected HandlerThread worker = null; // worker thread (loads and runs the model)
protected Handler receiver = null; // receives messages from the worker thread
protected Handler sender = null; // sends messages to the worker thread
protected TextView tvInputSetting; // input settings panel
protected ImageView ivInputImage; // input image panel
protected TextView tvOutputResult; // output result panel
protected TextView tvInferenceTime; // inference time panel
// Model configuration
Config config = new Config();
protected Predictor predictor = new Predictor();
Preprocess preprocess = new Preprocess();
Visualize visualize = new Visualize();
// Storage URI for the captured photo
Uri photoUri;
private Button btnSaveImg;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
// Bind the save-image button
btnSaveImg = (Button) findViewById(R.id.save_img);
// Handler that receives messages from the worker thread
receiver = new Handler() {
@Override
public void handleMessage(Message msg) {
switch (msg.what) {
case RESPONSE_LOAD_MODEL_SUCCESSED:
pbLoadModel.dismiss();
onLoadModelSuccessed();
break;
case RESPONSE_LOAD_MODEL_FAILED:
pbLoadModel.dismiss();
Toast.makeText(MainActivity.this, "Load model failed!", Toast.LENGTH_SHORT).show();
onLoadModelFailed();
break;
case RESPONSE_RUN_MODEL_SUCCESSED:
pbRunModel.dismiss();
onRunModelSuccessed();
break;
case RESPONSE_RUN_MODEL_FAILED:
pbRunModel.dismiss();
Toast.makeText(MainActivity.this, "Run model failed!", Toast.LENGTH_SHORT).show();
onRunModelFailed();
break;
default:
break;
}
}
};
// Worker thread for loading and running the model
worker = new HandlerThread("Predictor Worker");
worker.start();
// Handler that sends requests to the worker thread
sender = new Handler(worker.getLooper()) {
public void handleMessage(Message msg) {
switch (msg.what) {
case REQUEST_LOAD_MODEL:
// load model and reload test image
if (onLoadModel()) {
receiver.sendEmptyMessage(RESPONSE_LOAD_MODEL_SUCCESSED);
} else {
receiver.sendEmptyMessage(RESPONSE_LOAD_MODEL_FAILED);
}
break;
case REQUEST_RUN_MODEL:
// run model if model is loaded
if (onRunModel()) {
receiver.sendEmptyMessage(RESPONSE_RUN_MODEL_SUCCESSED);
} else {
receiver.sendEmptyMessage(RESPONSE_RUN_MODEL_FAILED);
}
break;
default:
break;
}
}
};
tvInputSetting = findViewById(R.id.tv_input_setting);
ivInputImage = findViewById(R.id.iv_input_image);
tvInferenceTime = findViewById(R.id.tv_inference_time);
tvOutputResult = findViewById(R.id.tv_output_result);
tvInputSetting.setMovementMethod(ScrollingMovementMethod.getInstance());
tvOutputResult.setMovementMethod(ScrollingMovementMethod.getInstance());
}
public boolean onLoadModel() {
return predictor.init(MainActivity.this, config);
}
public boolean onRunModel() {
return predictor.isLoaded() && predictor.runModel(preprocess,visualize);
}
public void onLoadModelFailed() {
}
public void onRunModelFailed() {
}
public void loadModel() {
pbLoadModel = ProgressDialog.show(this, "", "Loading model...", false, false);
sender.sendEmptyMessage(REQUEST_LOAD_MODEL);
}
public void runModel() {
pbRunModel = ProgressDialog.show(this, "", "Running inference...", false, false);
sender.sendEmptyMessage(REQUEST_RUN_MODEL);
}
public void onLoadModelSuccessed() {
// load test image from file_paths and run model
try {
if (config.imagePath.isEmpty()||config.bgPath.isEmpty()) {
return;
}
Bitmap image = null;
Bitmap bg = null;
// Load the image to be matted: paths from the camera or local gallery start with "/"; otherwise the image is read from the default assets path
if (!config.imagePath.substring(0, 1).equals("/")) {
InputStream imageStream = getAssets().open(config.imagePath);
image = BitmapFactory.decodeStream(imageStream);
} else {
if (!new File(config.imagePath).exists()) {
return;
}
image = BitmapFactory.decodeFile(config.imagePath);
}
// Load the background image
if (!config.bgPath.substring(0, 1).equals("/")) {
InputStream imageStream = getAssets().open(config.bgPath);
bg = BitmapFactory.decodeStream(imageStream);
} else {
if (!new File(config.bgPath).exists()) {
return;
}
bg = BitmapFactory.decodeFile(config.bgPath);
}
if (image != null && bg != null && predictor.isLoaded()) {
predictor.setInputImage(image,bg);
runModel();
}
} catch (IOException e) {
Toast.makeText(MainActivity.this, "Load image failed!", Toast.LENGTH_SHORT).show();
e.printStackTrace();
}
}
public void onRunModelSuccessed() {
// Fetch the matting result and update the UI
tvInferenceTime.setText("Inference time: " + predictor.inferenceTime() + " ms");
Bitmap outputImage = predictor.outputImage();
if (outputImage != null) {
ivInputImage.setImageBitmap(outputImage);
}
tvOutputResult.setText(predictor.outputResult());
tvOutputResult.scrollTo(0, 0);
}
public void onImageChanged(Bitmap image) {
Bitmap bg = null;
try {
// Load the background image
if (!config.bgPath.substring(0, 1).equals("/")) {
InputStream imageStream = getAssets().open(config.bgPath);
bg = BitmapFactory.decodeStream(imageStream);
} else {
if (!new File(config.bgPath).exists()) {
return;
}
bg = BitmapFactory.decodeFile(config.bgPath);
}
} catch (IOException e) {
Toast.makeText(MainActivity.this, "Failed to load background image!", Toast.LENGTH_SHORT).show();
e.printStackTrace();
}
// Rerun the model when the user picks a new image from the gallery or camera
// Set the predictor's input image
if (image != null && predictor.isLoaded()) {
predictor.setInputImage(image,bg);
runModel();
}
}
public void onImageChanged(String path) {
Bitmap bg = null;
try {
// Load the background image
if (!config.bgPath.substring(0, 1).equals("/")) {
InputStream imageStream = getAssets().open(config.bgPath);
bg = BitmapFactory.decodeStream(imageStream);
} else {
if (!new File(config.bgPath).exists()) {
return;
}
bg = BitmapFactory.decodeFile(config.bgPath);
}
} catch (IOException e) {
Toast.makeText(MainActivity.this, "Failed to load background image!", Toast.LENGTH_SHORT).show();
e.printStackTrace();
}
// Set the predictor's input image
Bitmap image = BitmapFactory.decodeFile(path);
predictor.setInputImage(image,bg);
runModel();
}
// Open the settings page
public void onSettingsClicked() {
startActivity(new Intent(MainActivity.this, SettingsActivity.class));
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
MenuInflater inflater = getMenuInflater();
inflater.inflate(R.menu.menu_action_options, menu);
return true;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
switch (item.getItemId()) {
case android.R.id.home:
finish();
break;
case R.id.open_gallery:
if (requestAllPermissions()) {
openGallery();
}
break;
case R.id.take_photo:
if (requestAllPermissions()) {
takePhoto();
}
break;
case R.id.settings:
if (requestAllPermissions()) {
// make sure we have SDCard r&w permissions to load model from SDCard
onSettingsClicked();
}
break;
}
return super.onOptionsItemSelected(item);
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
@NonNull int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (grantResults.length < 2 || grantResults[0] != PackageManager.PERMISSION_GRANTED || grantResults[1] != PackageManager.PERMISSION_GRANTED) {
Toast.makeText(this, "Permission Denied", Toast.LENGTH_SHORT).show();
}
}
private Bitmap getBitMapFromPath(String imageFilePath) {
Display currentDisplay = getWindowManager().getDefaultDisplay();
int dw = currentDisplay.getWidth();
int dh = currentDisplay.getHeight();
// Load up the image's dimensions not the image itself
BitmapFactory.Options bmpFactoryOptions = new BitmapFactory.Options();
bmpFactoryOptions.inJustDecodeBounds = true;
Bitmap bmp = BitmapFactory.decodeFile(imageFilePath, bmpFactoryOptions);
int heightRatio = (int) Math.ceil(bmpFactoryOptions.outHeight
/ (float) dh);
int widthRatio = (int) Math.ceil(bmpFactoryOptions.outWidth
/ (float) dw);
// If both of the ratios are greater than 1,
// one of the sides of the image is greater than the screen
if (heightRatio > 1 && widthRatio > 1) {
if (heightRatio > widthRatio) {
// Height ratio is larger, scale according to it
bmpFactoryOptions.inSampleSize = heightRatio;
} else {
// Width ratio is larger, scale according to it
bmpFactoryOptions.inSampleSize = widthRatio;
}
}
// Decode it for real
bmpFactoryOptions.inJustDecodeBounds = false;
bmp = BitmapFactory.decodeFile(imageFilePath, bmpFactoryOptions);
return bmp;
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
//if (resultCode == RESULT_OK && data != null) {
if (resultCode == RESULT_OK) {
switch (requestCode) {
case OPEN_GALLERY_REQUEST_CODE:
try {
ContentResolver resolver = getContentResolver();
Uri uri = data.getData();
Bitmap image = MediaStore.Images.Media.getBitmap(resolver, uri);
// Check the image size (inference may crash if the image is too large)
int width = image.getWidth();
int height = image.getHeight();
Bitmap scaleImage;
if(width > 800) {
int new_width = 800;
int new_height = (int)(height*1.0/width*new_width);
scaleImage = Bitmap.createScaledBitmap(image, new_width, new_height,true);
}
else{
scaleImage = image.copy(Bitmap.Config.ARGB_8888, true);
}
String[] proj = {MediaStore.Images.Media.DATA};
Cursor cursor = managedQuery(uri, proj, null, null, null);
cursor.moveToFirst();
onImageChanged(scaleImage);
} catch (IOException e) {
Log.e(TAG, e.toString());
}
break;
case TAKE_PHOTO_REQUEST_CODE:
//Bitmap image = (Bitmap) data.getParcelableExtra("data"); // would fetch only the thumbnail
Bitmap image = null;
if (photoUri == null)
return;
// Resolve the real image file path from the content Uri
Cursor cursor = getContentResolver().query(photoUri, null, null, null, null);
if (cursor != null) {
if (cursor.moveToFirst()){
String path = cursor.getString(cursor.getColumnIndex(MediaStore.Images.Media.DATA));
// Load the bitmap from the resolved path
image = getBitMapFromPath(path);
}
cursor.close();
}
photoUri = null;
onImageChanged(image);
break;
default:
break;
}
}
}
private boolean requestAllPermissions() {
if (ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE)
!= PackageManager.PERMISSION_GRANTED || ContextCompat.checkSelfPermission(this,
Manifest.permission.CAMERA)
!= PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.CAMERA},
0);
return false;
}
return true;
}
private void openGallery() {
Intent intent = new Intent(Intent.ACTION_PICK, null);
intent.setDataAndType(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, "image/*");
startActivityForResult(intent, OPEN_GALLERY_REQUEST_CODE);
}
private void takePhoto() {
Intent takePhotoIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
ContentValues values = new ContentValues();
photoUri = getContentResolver().insert(
MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
takePhotoIntent.putExtra(android.provider.MediaStore.EXTRA_OUTPUT, photoUri);
if (takePhotoIntent.resolveActivity(getPackageManager()) != null) {
startActivityForResult(takePhotoIntent, TAKE_PHOTO_REQUEST_CODE);
}
}
@Override
public boolean onPrepareOptionsMenu(Menu menu) {
boolean isLoaded = predictor.isLoaded();
menu.findItem(R.id.open_gallery).setEnabled(isLoaded);
menu.findItem(R.id.take_photo).setEnabled(isLoaded);
return super.onPrepareOptionsMenu(menu);
}
@Override
protected void onResume() {
Log.i(TAG,"begin onResume");
super.onResume();
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
boolean settingsChanged = false;
String model_path = sharedPreferences.getString(getString(R.string.MODEL_PATH_KEY),
getString(R.string.MODEL_PATH_DEFAULT));
String label_path = sharedPreferences.getString(getString(R.string.LABEL_PATH_KEY),
getString(R.string.LABEL_PATH_DEFAULT));
String image_path = sharedPreferences.getString(getString(R.string.IMAGE_PATH_KEY),
getString(R.string.IMAGE_PATH_DEFAULT));
String bg_path = sharedPreferences.getString(getString(R.string.BG_PATH_KEY),
getString(R.string.BG_PATH_DEFAULT));
settingsChanged |= !model_path.equalsIgnoreCase(config.modelPath);
settingsChanged |= !label_path.equalsIgnoreCase(config.labelPath);
settingsChanged |= !image_path.equalsIgnoreCase(config.imagePath);
settingsChanged |= !bg_path.equalsIgnoreCase(config.bgPath);
int cpu_thread_num = Integer.parseInt(sharedPreferences.getString(getString(R.string.CPU_THREAD_NUM_KEY),
getString(R.string.CPU_THREAD_NUM_DEFAULT)));
settingsChanged |= cpu_thread_num != config.cpuThreadNum;
String cpu_power_mode =
sharedPreferences.getString(getString(R.string.CPU_POWER_MODE_KEY),
getString(R.string.CPU_POWER_MODE_DEFAULT));
settingsChanged |= !cpu_power_mode.equalsIgnoreCase(config.cpuPowerMode);
String input_color_format =
sharedPreferences.getString(getString(R.string.INPUT_COLOR_FORMAT_KEY),
getString(R.string.INPUT_COLOR_FORMAT_DEFAULT));
settingsChanged |= !input_color_format.equalsIgnoreCase(config.inputColorFormat);
long[] input_shape =
Utils.parseLongsFromString(sharedPreferences.getString(getString(R.string.INPUT_SHAPE_KEY),
getString(R.string.INPUT_SHAPE_DEFAULT)), ",");
settingsChanged |= input_shape.length != config.inputShape.length;
if (!settingsChanged) {
for (int i = 0; i < input_shape.length; i++) {
settingsChanged |= input_shape[i] != config.inputShape[i];
}
}
if (settingsChanged) {
config.init(model_path,label_path,image_path,bg_path,cpu_thread_num,cpu_power_mode,
input_color_format,input_shape);
preprocess.init(config);
// Update the UI
tvInputSetting.setText("Model: " + config.modelPath.substring(config.modelPath.lastIndexOf("/") + 1));
tvInputSetting.scrollTo(0, 0);
// Reload the model and rerun inference when the configuration changes
loadModel();
}
}
@Override
protected void onDestroy() {
if (predictor != null) {
predictor.releaseModel();
}
worker.quit();
super.onDestroy();
}
// Save the output image
public void clickSaveImg(View view){
// Fetch the output image
Bitmap outputImage = predictor.outputImage();
if(outputImage==null) {
Toast.makeText(getApplicationContext(),"No image to save",Toast.LENGTH_SHORT).show();
return;
}
// Write the image to external storage
File f = new File(Environment.getExternalStorageDirectory() + "/newphoto.jpg");
if (f.exists()) {
f.delete();
}
try {
FileOutputStream out = new FileOutputStream(f);
outputImage.compress(Bitmap.CompressFormat.JPEG, 80, out);
out.flush();
out.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
// Insert the file into the system gallery
try {
MediaStore.Images.Media.insertImage(this.getContentResolver(),
f.getAbsolutePath(), Environment.getExternalStorageDirectory() + "/newphoto.jpg", null);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
// Notify the media scanner so the gallery picks up the saved file
sendBroadcast(new Intent(Intent.ACTION_MEDIA_SCANNER_SCAN_FILE,
Uri.fromFile(f)));
Toast.makeText(getApplicationContext(),"Saved successfully",Toast.LENGTH_SHORT).show();
}
}
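In `onActivityResult` above, gallery images are capped at a width of 800 px before inference (larger inputs can crash the model), with the height scaled proportionally. The arithmetic, isolated as a plain-Java sketch (the method name `capWidth` is illustrative, not part of this project):

```java
public class ScaleMath {
    // Returns {newWidth, newHeight}: width capped at maxWidth with
    // height scaled to preserve aspect ratio; smaller images pass through.
    static int[] capWidth(int width, int height, int maxWidth) {
        if (width <= maxWidth) {
            return new int[]{width, height};
        }
        // Same formula as MainActivity: height * 1.0 / width * newWidth
        int newHeight = (int) (height * 1.0 / width * maxWidth);
        return new int[]{maxWidth, newHeight};
    }

    public static void main(String[] args) {
        int[] dims = capWidth(1600, 1200, 800);
        System.out.println(dims[0] + "x" + dims[1]); // 800x600
    }
}
```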
package com.paddle.demo.matting;
import android.content.Context;
import android.graphics.Bitmap;
import android.util.Log;
import com.baidu.paddle.lite.MobileConfig;
import com.baidu.paddle.lite.PaddlePredictor;
import com.baidu.paddle.lite.PowerMode;
import com.baidu.paddle.lite.Tensor;
import com.paddle.demo.matting.config.Config;
import com.paddle.demo.matting.preprocess.Preprocess;
import com.paddle.demo.matting.visual.Visualize;
import java.io.File;
import java.io.InputStream;
import java.util.Date;
import java.util.Vector;
public class Predictor {
private static final String TAG = Predictor.class.getSimpleName();
protected Vector<String> wordLabels = new Vector<String>();
Config config = new Config();
protected Bitmap inputImage = null; // input image
protected Bitmap scaledImage = null; // image rescaled to the model input size
protected Bitmap bgImage = null; // background image
protected Bitmap outputImage = null; // output image
protected String outputResult = ""; // output text result
protected float preprocessTime = 0; // preprocessing time (ms)
protected float postprocessTime = 0; // postprocessing time (ms)
public boolean isLoaded = false;
public int warmupIterNum = 0;
public int inferIterNum = 1;
protected Context appCtx = null;
public int cpuThreadNum = 1;
public String cpuPowerMode = "LITE_POWER_HIGH";
public String modelPath = "";
public String modelName = "";
protected PaddlePredictor paddlePredictor = null;
protected float inferenceTime = 0;
public Predictor() {
super();
}
public boolean init(Context appCtx, String modelPath, int cpuThreadNum, String cpuPowerMode) {
this.appCtx = appCtx;
isLoaded = loadModel(modelPath, cpuThreadNum, cpuPowerMode);
return isLoaded;
}
public boolean init(Context appCtx, Config config) {
if (config.inputShape.length != 4) {
Log.i(TAG, "size of input shape should be: 4");
return false;
}
if (config.inputShape[0] != 1) {
Log.i(TAG, "only one batch is supported in the matting demo, you can use any batch size in " +
"your Apps!");
return false;
}
if (config.inputShape[1] != 1 && config.inputShape[1] != 3) {
Log.i(TAG, "only one/three channels are supported in the image matting demo, you can use any " +
"channel size in your Apps!");
return false;
}
if (!config.inputColorFormat.equalsIgnoreCase("RGB") && !config.inputColorFormat.equalsIgnoreCase("BGR")) {
Log.i(TAG, "only RGB and BGR color formats are supported.");
return false;
}
init(appCtx, config.modelPath, config.cpuThreadNum, config.cpuPowerMode);
if (!isLoaded()) {
return false;
}
this.config = config;
return isLoaded;
}
public boolean isLoaded() {
return paddlePredictor != null && isLoaded;
}
// Load ground-truth labels. Not used in this matting demo (multi-class classification and segmentation tasks usually need a label file).
protected boolean loadLabel(String labelPath) {
wordLabels.clear();
// load word labels from file
try {
InputStream assetsInputStream = appCtx.getAssets().open(labelPath);
int available = assetsInputStream.available();
byte[] lines = new byte[available];
assetsInputStream.read(lines);
assetsInputStream.close();
String words = new String(lines);
String[] contents = words.split("\n");
for (String content : contents) {
wordLabels.add(content);
}
Log.i(TAG, "word label size: " + wordLabels.size());
} catch (Exception e) {
Log.e(TAG, e.getMessage());
return false;
}
return true;
}
public Tensor getInput(int idx) {
if (!isLoaded()) {
return null;
}
return paddlePredictor.getInput(idx);
}
public Tensor getOutput(int idx) {
if (!isLoaded()) {
return null;
}
return paddlePredictor.getOutput(idx);
}
protected boolean loadModel(String modelPath, int cpuThreadNum, String cpuPowerMode) {
// Release any previously loaded model first
releaseModel();
// Load the model
if (modelPath.isEmpty()) {
return false;
}
String realPath = modelPath;
if (!modelPath.substring(0, 1).equals("/")) {
// model paths starting with '/' are read directly from the filesystem;
// otherwise the model is copied from assets into the app cache
realPath = appCtx.getCacheDir() + "/" + modelPath;
Utils.copyDirectoryFromAssets(appCtx, modelPath, realPath);
}
if (realPath.isEmpty()) {
return false;
}
MobileConfig config = new MobileConfig();
// Model file name
config.setModelFromFile(realPath + File.separator + "hrnet_w18.nb");
config.setThreads(cpuThreadNum);
if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_HIGH")) {
config.setPowerMode(PowerMode.LITE_POWER_HIGH);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_LOW")) {
config.setPowerMode(PowerMode.LITE_POWER_LOW);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_FULL")) {
config.setPowerMode(PowerMode.LITE_POWER_FULL);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_NO_BIND")) {
config.setPowerMode(PowerMode.LITE_POWER_NO_BIND);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_RAND_HIGH")) {
config.setPowerMode(PowerMode.LITE_POWER_RAND_HIGH);
} else if (cpuPowerMode.equalsIgnoreCase("LITE_POWER_RAND_LOW")) {
config.setPowerMode(PowerMode.LITE_POWER_RAND_LOW);
} else {
Log.e(TAG, "unknown cpu power mode!");
return false;
}
paddlePredictor = PaddlePredictor.createPaddlePredictor(config);
this.cpuThreadNum = cpuThreadNum;
this.cpuPowerMode = cpuPowerMode;
this.modelPath = realPath;
this.modelName = realPath.substring(realPath.lastIndexOf("/") + 1);
return true;
}
public boolean runModel() {
if (!isLoaded()) {
return false;
}
// warm up
for (int i = 0; i < warmupIterNum; i++){
paddlePredictor.run();
}
// inference
Date start = new Date();
for (int i = 0; i < inferIterNum; i++) {
paddlePredictor.run();
}
Date end = new Date();
inferenceTime = (end.getTime() - start.getTime()) / (float) inferIterNum;
return true;
}
public boolean runModel(Bitmap image) {
setInputImage(image,bgImage);
return runModel();
}
// Main inference entry point
public boolean runModel(Preprocess preprocess, Visualize visualize) {
if (inputImage == null || bgImage == null) {
return false;
}
// set input shape
Tensor inputTensor = getInput(0);
inputTensor.resize(config.inputShape);
// pre-process image
Date start = new Date();
// setInputImage(scaledImage);
preprocess.init(config);
preprocess.to_array(scaledImage);
preprocess.normalize(preprocess.inputData);
// feed input tensor with pre-processed data
inputTensor.setData(preprocess.inputData);
Date end = new Date();
preprocessTime = (float) (end.getTime() - start.getTime());
// inference
runModel();
start = new Date();
Tensor outputTensor = getOutput(0);
// post-process
this.outputImage = visualize.draw(inputImage, outputTensor,bgImage);
end = new Date();
postprocessTime = (float) (end.getTime() - start.getTime());
outputResult = "";
return true;
}
public void releaseModel() {
paddlePredictor = null;
isLoaded = false;
cpuThreadNum = 1;
cpuPowerMode = "LITE_POWER_HIGH";
modelPath = "";
modelName = "";
}
public void setConfig(Config config){
this.config = config;
}
public Bitmap inputImage() {
return inputImage;
}
public Bitmap outputImage() {
return outputImage;
}
public String outputResult() {
return outputResult;
}
public float preprocessTime() {
return preprocessTime;
}
public float postprocessTime() {
return postprocessTime;
}
public String modelPath() {
return modelPath;
}
public String modelName() {
return modelName;
}
public int cpuThreadNum() {
return cpuThreadNum;
}
public String cpuPowerMode() {
return cpuPowerMode;
}
public float inferenceTime() {
return inferenceTime;
}
public void setInputImage(Bitmap image,Bitmap bg) {
if (image == null || bg == null) {
return;
}
// Preprocess the input image
Bitmap rgbaImage = image.copy(Bitmap.Config.ARGB_8888, true);
Bitmap scaleImage = Bitmap.createScaledBitmap(rgbaImage, (int) this.config.inputShape[3], (int) this.config.inputShape[2], true);
this.inputImage = rgbaImage;
this.scaledImage = scaleImage;
// Preprocess the background image
this.bgImage = bg.copy(Bitmap.Config.ARGB_8888, true);
}
}
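`Predictor.runModel()` above measures latency by running `warmupIterNum` untimed passes and then averaging wall-clock time over `inferIterNum` timed passes. A self-contained sketch of that measurement pattern, with a `Runnable` standing in for `paddlePredictor.run()` (the class and method names here are illustrative):

```java
public class InferenceTimer {
    // Average wall-clock milliseconds per run, after discarding
    // warmup iterations (which absorb JIT and cache effects).
    static float averageMillis(Runnable model, int warmup, int iters) {
        for (int i = 0; i < warmup; i++) {
            model.run(); // untimed warmup passes
        }
        long start = System.currentTimeMillis();
        for (int i = 0; i < iters; i++) {
            model.run(); // timed passes
        }
        long end = System.currentTimeMillis();
        return (end - start) / (float) iters;
    }

    public static void main(String[] args) {
        float ms = averageMillis(() -> {
            try { Thread.sleep(5); } catch (InterruptedException e) { }
        }, 1, 4);
        System.out.println("avg ms per run: " + ms);
    }
}
```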
package com.paddle.demo.matting;
import android.content.SharedPreferences;
import android.os.Bundle;
import android.preference.CheckBoxPreference;
import android.preference.EditTextPreference;
import android.preference.ListPreference;
import android.support.v7.app.ActionBar;
import java.util.ArrayList;
import java.util.List;
public class SettingsActivity extends AppCompatPreferenceActivity implements SharedPreferences.OnSharedPreferenceChangeListener {
ListPreference lpChoosePreInstalledModel = null;
CheckBoxPreference cbEnableCustomSettings = null;
EditTextPreference etModelPath = null;
EditTextPreference etLabelPath = null;
EditTextPreference etImagePath = null;
ListPreference lpCPUThreadNum = null;
ListPreference lpCPUPowerMode = null;
ListPreference lpInputColorFormat = null;
List<String> preInstalledModelPaths = null;
List<String> preInstalledLabelPaths = null;
List<String> preInstalledImagePaths = null;
List<String> preInstalledCPUThreadNums = null;
List<String> preInstalledCPUPowerModes = null;
List<String> preInstalledInputColorFormats = null;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
addPreferencesFromResource(R.xml.settings);
ActionBar supportActionBar = getSupportActionBar();
if (supportActionBar != null) {
supportActionBar.setDisplayHomeAsUpEnabled(true);
}
// initialize pre-installed models
preInstalledModelPaths = new ArrayList<String>();
preInstalledLabelPaths = new ArrayList<String>();
preInstalledImagePaths = new ArrayList<String>();
preInstalledCPUThreadNums = new ArrayList<String>();
preInstalledCPUPowerModes = new ArrayList<String>();
preInstalledInputColorFormats = new ArrayList<String>();
// add deeplab_mobilenet_for_cpu
preInstalledModelPaths.add(getString(R.string.MODEL_PATH_DEFAULT));
preInstalledLabelPaths.add(getString(R.string.LABEL_PATH_DEFAULT));
preInstalledImagePaths.add(getString(R.string.IMAGE_PATH_DEFAULT));
preInstalledCPUThreadNums.add(getString(R.string.CPU_THREAD_NUM_DEFAULT));
preInstalledCPUPowerModes.add(getString(R.string.CPU_POWER_MODE_DEFAULT));
preInstalledInputColorFormats.add(getString(R.string.INPUT_COLOR_FORMAT_DEFAULT));
// initialize UI components
lpChoosePreInstalledModel =
(ListPreference) findPreference(getString(R.string.CHOOSE_PRE_INSTALLED_MODEL_KEY));
String[] preInstalledModelNames = new String[preInstalledModelPaths.size()];
for (int i = 0; i < preInstalledModelPaths.size(); i++) {
preInstalledModelNames[i] =
preInstalledModelPaths.get(i).substring(preInstalledModelPaths.get(i).lastIndexOf("/") + 1);
}
lpChoosePreInstalledModel.setEntries(preInstalledModelNames);
lpChoosePreInstalledModel.setEntryValues(preInstalledModelPaths.toArray(new String[preInstalledModelPaths.size()]));
cbEnableCustomSettings =
(CheckBoxPreference) findPreference(getString(R.string.ENABLE_CUSTOM_SETTINGS_KEY));
etModelPath = (EditTextPreference) findPreference(getString(R.string.MODEL_PATH_KEY));
etModelPath.setTitle("Model Path (SDCard: " + Utils.getSDCardDirectory() + ")");
etLabelPath = (EditTextPreference) findPreference(getString(R.string.LABEL_PATH_KEY));
etImagePath = (EditTextPreference) findPreference(getString(R.string.IMAGE_PATH_KEY));
lpCPUThreadNum =
(ListPreference) findPreference(getString(R.string.CPU_THREAD_NUM_KEY));
lpCPUPowerMode =
(ListPreference) findPreference(getString(R.string.CPU_POWER_MODE_KEY));
lpInputColorFormat =
(ListPreference) findPreference(getString(R.string.INPUT_COLOR_FORMAT_KEY));
}
private void reloadPreferenceAndUpdateUI() {
SharedPreferences sharedPreferences = getPreferenceScreen().getSharedPreferences();
boolean enableCustomSettings =
sharedPreferences.getBoolean(getString(R.string.ENABLE_CUSTOM_SETTINGS_KEY), false);
String modelPath = sharedPreferences.getString(getString(R.string.CHOOSE_PRE_INSTALLED_MODEL_KEY),
getString(R.string.MODEL_PATH_DEFAULT));
int modelIdx = lpChoosePreInstalledModel.findIndexOfValue(modelPath);
if (modelIdx >= 0 && modelIdx < preInstalledModelPaths.size()) {
if (!enableCustomSettings) {
SharedPreferences.Editor editor = sharedPreferences.edit();
editor.putString(getString(R.string.MODEL_PATH_KEY), preInstalledModelPaths.get(modelIdx));
editor.putString(getString(R.string.LABEL_PATH_KEY), preInstalledLabelPaths.get(modelIdx));
editor.putString(getString(R.string.IMAGE_PATH_KEY), preInstalledImagePaths.get(modelIdx));
editor.putString(getString(R.string.CPU_THREAD_NUM_KEY), preInstalledCPUThreadNums.get(modelIdx));
editor.putString(getString(R.string.CPU_POWER_MODE_KEY), preInstalledCPUPowerModes.get(modelIdx));
editor.putString(getString(R.string.INPUT_COLOR_FORMAT_KEY),
preInstalledInputColorFormats.get(modelIdx));
editor.commit();
}
lpChoosePreInstalledModel.setSummary(modelPath);
}
cbEnableCustomSettings.setChecked(enableCustomSettings);
etModelPath.setEnabled(enableCustomSettings);
etLabelPath.setEnabled(enableCustomSettings);
etImagePath.setEnabled(enableCustomSettings);
lpCPUThreadNum.setEnabled(enableCustomSettings);
lpCPUPowerMode.setEnabled(enableCustomSettings);
lpInputColorFormat.setEnabled(enableCustomSettings);
modelPath = sharedPreferences.getString(getString(R.string.MODEL_PATH_KEY),
getString(R.string.MODEL_PATH_DEFAULT));
String labelPath = sharedPreferences.getString(getString(R.string.LABEL_PATH_KEY),
getString(R.string.LABEL_PATH_DEFAULT));
String imagePath = sharedPreferences.getString(getString(R.string.IMAGE_PATH_KEY),
getString(R.string.IMAGE_PATH_DEFAULT));
String cpuThreadNum = sharedPreferences.getString(getString(R.string.CPU_THREAD_NUM_KEY),
getString(R.string.CPU_THREAD_NUM_DEFAULT));
String cpuPowerMode = sharedPreferences.getString(getString(R.string.CPU_POWER_MODE_KEY),
getString(R.string.CPU_POWER_MODE_DEFAULT));
String inputColorFormat = sharedPreferences.getString(getString(R.string.INPUT_COLOR_FORMAT_KEY),
getString(R.string.INPUT_COLOR_FORMAT_DEFAULT));
etModelPath.setSummary(modelPath);
etModelPath.setText(modelPath);
etLabelPath.setSummary(labelPath);
etLabelPath.setText(labelPath);
etImagePath.setSummary(imagePath);
etImagePath.setText(imagePath);
lpCPUThreadNum.setValue(cpuThreadNum);
lpCPUThreadNum.setSummary(cpuThreadNum);
lpCPUPowerMode.setValue(cpuPowerMode);
lpCPUPowerMode.setSummary(cpuPowerMode);
lpInputColorFormat.setValue(inputColorFormat);
lpInputColorFormat.setSummary(inputColorFormat);
}
@Override
protected void onResume() {
super.onResume();
getPreferenceScreen().getSharedPreferences().registerOnSharedPreferenceChangeListener(this);
reloadPreferenceAndUpdateUI();
}
@Override
protected void onPause() {
super.onPause();
getPreferenceScreen().getSharedPreferences().unregisterOnSharedPreferenceChangeListener(this);
}
@Override
public void onSharedPreferenceChanged(SharedPreferences sharedPreferences, String key) {
if (key.equals(getString(R.string.CHOOSE_PRE_INSTALLED_MODEL_KEY))) {
SharedPreferences.Editor editor = sharedPreferences.edit();
editor.putBoolean(getString(R.string.ENABLE_CUSTOM_SETTINGS_KEY), false);
editor.commit();
}
reloadPreferenceAndUpdateUI();
}
}
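For reference, the display names that onCreate() feeds into the ListPreference are just the last path segment of each pre-installed model path. A minimal standalone sketch of that extraction (the class and method names here are illustrative, not part of the app):

```java
import java.util.Arrays;
import java.util.List;

// Illustrative sketch of the basename extraction used to build the
// ListPreference entries from pre-installed model paths.
public class ModelNameDemo {
    static String[] toDisplayNames(List<String> modelPaths) {
        String[] names = new String[modelPaths.size()];
        for (int i = 0; i < modelPaths.size(); i++) {
            String path = modelPaths.get(i);
            // Keep everything after the final '/'; a path without '/' is kept whole,
            // because lastIndexOf returns -1 and the substring starts at 0.
            names[i] = path.substring(path.lastIndexOf('/') + 1);
        }
        return names;
    }

    public static void main(String[] args) {
        String[] names = toDisplayNames(Arrays.asList("models/deeplab_mobilenet_for_cpu"));
        System.out.println(names[0]);
    }
}
```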
package com.paddle.demo.matting;
import android.content.Context;
import android.os.Environment;
import java.io.*;
public class Utils {
private static final String TAG = Utils.class.getSimpleName();
public static void copyFileFromAssets(Context appCtx, String srcPath, String dstPath) {
if (srcPath.isEmpty() || dstPath.isEmpty()) {
return;
}
InputStream is = null;
OutputStream os = null;
try {
is = new BufferedInputStream(appCtx.getAssets().open(srcPath));
os = new BufferedOutputStream(new FileOutputStream(new File(dstPath)));
byte[] buffer = new byte[1024];
int length = 0;
while ((length = is.read(buffer)) != -1) {
os.write(buffer, 0, length);
}
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
// guard against NPE when open() or the FileOutputStream constructor threw
if (os != null) {
os.close();
}
if (is != null) {
is.close();
}
} catch (IOException e) {
e.printStackTrace();
}
}
}
public static void copyDirectoryFromAssets(Context appCtx, String srcDir, String dstDir) {
if (srcDir.isEmpty() || dstDir.isEmpty()) {
return;
}
try {
if (!new File(dstDir).exists()) {
new File(dstDir).mkdirs();
}
for (String fileName : appCtx.getAssets().list(srcDir)) {
String srcSubPath = srcDir + File.separator + fileName;
String dstSubPath = dstDir + File.separator + fileName;
// Asset paths are not files on disk, so File.isDirectory() is always false here;
// an asset entry is a directory exactly when listing it yields children.
if (appCtx.getAssets().list(srcSubPath).length > 0) {
copyDirectoryFromAssets(appCtx, srcSubPath, dstSubPath);
} else {
copyFileFromAssets(appCtx, srcSubPath, dstSubPath);
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
public static float[] parseFloatsFromString(String string, String delimiter) {
String[] pieces = string.trim().toLowerCase().split(delimiter);
float[] floats = new float[pieces.length];
for (int i = 0; i < pieces.length; i++) {
floats[i] = Float.parseFloat(pieces[i].trim());
}
return floats;
}
public static long[] parseLongsFromString(String string, String delimiter) {
String[] pieces = string.trim().toLowerCase().split(delimiter);
long[] longs = new long[pieces.length];
for (int i = 0; i < pieces.length; i++) {
longs[i] = Long.parseLong(pieces[i].trim());
}
return longs;
}
public static String getSDCardDirectory() {
return Environment.getExternalStorageDirectory().getAbsolutePath();
}
public static boolean isSupportedNPU() {
String hardware = android.os.Build.HARDWARE;
return hardware.equalsIgnoreCase("kirin810") || hardware.equalsIgnoreCase("kirin990");
}
}
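The two string-parsing helpers in Utils split a delimited string and parse each trimmed piece; a self-contained sketch of the same logic (copied from Utils above so it can run outside Android):

```java
// Standalone copy of the Utils parsing helpers for illustration;
// the app calls Utils.parseFloatsFromString / parseLongsFromString directly.
public class ParseDemo {
    static float[] parseFloatsFromString(String string, String delimiter) {
        // trim and lower-case the whole string, split, then parse each piece
        String[] pieces = string.trim().toLowerCase().split(delimiter);
        float[] floats = new float[pieces.length];
        for (int i = 0; i < pieces.length; i++) {
            floats[i] = Float.parseFloat(pieces[i].trim());
        }
        return floats;
    }

    static long[] parseLongsFromString(String string, String delimiter) {
        String[] pieces = string.trim().toLowerCase().split(delimiter);
        long[] longs = new long[pieces.length];
        for (int i = 0; i < pieces.length; i++) {
            longs[i] = Long.parseLong(pieces[i].trim());
        }
        return longs;
    }

    public static void main(String[] args) {
        // e.g. an input-shape string like "1,3,513,513" parses to 4 longs
        long[] shape = parseLongsFromString("1,3,513,513", ",");
        System.out.println(shape.length);
    }
}
```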