[Doc] Update paddledetection/android app example (#663)

* [Bug Fix] fix android app detail page errors

* [Android] fix realtime camera mode and shutter

* [Bug Fix] fix AllocateSegmentationResultFromJava error

* [Bug Fix] fix camera preview size setting problem

* [Model] use uint8 buffer instead of fp32 in ppseg postprocess

* [Model] revert changes in ppseg

* [Model] revert postprocess changes in ppseg

* [Android] add fastdeploy android sdk download task

* [Bug Fix] fix yolov5face scale error in small image

* [Doc] Update Android SDK docs

* [Doc] Update Android SDK usage docs

* [Android] Update paddledetection/android app example
DefTruth
2022-11-22 19:41:03 +08:00
committed by GitHub
parent 08a2a289d0
commit cd5cc98ba3
49 changed files with 8840 additions and 2200 deletions

View File

@@ -11,3 +11,10 @@ app/build
app/src/main/assets/models/* app/src/main/assets/models/*
app/.gradle app/.gradle
app/.idea app/.idea
fastdeploy/cache
fastdeploy/libs/fastdeploy*
fastdeploy/.cxx
fastdeploy/build
fastdeploy/src/main/assets/models/*
fastdeploy/.gradle
fastdeploy/.idea

View File

@@ -7,8 +7,6 @@
1. Install Android Studio in your local environment. For detailed installation instructions, see the [Android Studio official site](https://developer.android.com/studio).
2. Prepare an Android phone and enable USB debugging. To enable it: `Settings -> Find developer options -> Enable developer options and USB debugging`
**Note**: If your Android Studio has not yet been configured with an NDK, set one up first by following [Install and configure the NDK and CMake](https://developer.android.com/studio/projects/install-ndk) in the Android Studio user guide. You can choose the latest NDK version, or use the same NDK version as the FastDeploy Android prediction library.
## Deployment Steps
1. The PicoDet object detection demo is located in the `fastdeploy/examples/vision/detection/paddledetection/android` directory
@@ -16,23 +14,20 @@
3. Connect the phone to your computer, enable USB debugging and file transfer mode, and connect your device in Android Studio (the phone must allow installing apps over USB)
<p align="center">
<img width="1440" alt="image" src="https://user-images.githubusercontent.com/31974251/203257262-71b908ab-bb2b-47d3-9efb-67631687b774.png">
</p>
> **Note:**
>> If you see NDK configuration errors while importing, building, or running the project, open `File > Project Structure > SDK Location` and change `Android SDK location` to the SDK path configured on your machine.
>> If you downloaded the NDK through Android Studio's SDK Tools (see "Environment Preparation" in this chapter), you can simply pick the default path from the drop-down list.
>> Alternatively, you can configure the NDK path manually in the `paddledetection/android/local.properties` file, as shown in the figure below
>> If the steps above still do not resolve the NDK configuration error, try updating the Android Gradle plugin version by following the [Update the Android Gradle plugin](https://developer.android.com/studio/releases/gradle-plugin?hl=zh-cn#updating-plugin) section of the official Android Studio documentation.
4. Click the Run button to build the APP automatically and install it on the phone. (This step automatically downloads the prebuilt FastDeploy Android library and the model files, so an internet connection is required.)
After it succeeds, the result looks as follows. Figure 1: the APP installed on the phone; Figure 2: the APP after launch, which automatically detects and marks the objects in the image; Figure 3: the APP settings page, opened by tapping the settings icon in the upper right corner, where different options can be tried.
| APP Icon | APP Result | APP Settings |
| --- | --- | --- |
| ![app_pic](https://user-images.githubusercontent.com/31974251/203268599-c94018d8-3683-490a-a5c7-a8136a4fa284.jpg) | ![app_res](https://user-images.githubusercontent.com/31974251/197169609-bb214af3-d6e7-4433-bb96-1225cddd441c.jpg) | ![app_setup](https://user-images.githubusercontent.com/31974251/197332983-afbfa6d5-4a3b-4c54-a528-4a3e58441be1.jpg) |
### PicoDet Java API Overview
- Model initialization API: the model can be initialized in two ways, either directly through the constructor, or by calling the init function at an appropriate point in your program. The PicoDet initialization parameters are described as follows:
  - modelFile: String, path to the Paddle-format model file, e.g. model.pdmodel
  - paramFile: String, path to the Paddle-format parameters file, e.g. model.pdiparams
@@ -56,13 +51,15 @@ public boolean init(String modelFile, String paramsFile, String configFile, Stri
// Predict directly: no image is saved and the result is not rendered onto the Bitmap
public DetectionResult predict(Bitmap ARGB8888Bitmap)
// Predict and visualize: save the visualized image to the specified path and render the visualization onto the Bitmap
public DetectionResult predict(Bitmap ARGB8888Bitmap, String savedImagePath, float scoreThreshold);
public DetectionResult predict(Bitmap ARGB8888Bitmap, boolean rendering, float scoreThreshold); // Only render, do not save the image
```
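To make the difference between the overloads concrete, here is a minimal, hedged sketch. It assumes `model` is an already-initialized PicoDet instance and that a valid ARGB8888 `Bitmap` is available; the class name, the save path, and the 0.45 threshold are illustrative choices, not values prescribed by this document.
```java
import android.graphics.Bitmap;

import com.baidu.paddle.fastdeploy.vision.DetectionResult;
import com.baidu.paddle.fastdeploy.vision.detection.PicoDet;

public class PredictOverloadsSketch {
    // 'model' is assumed to be initialized elsewhere before this method is called.
    public static void runAllOverloads(PicoDet model, Bitmap argb8888Bitmap) {
        // 1. Plain prediction: nothing is saved and nothing is drawn on the Bitmap.
        DetectionResult plain = model.predict(argb8888Bitmap);

        // 2. Predict, render the boxes onto the Bitmap, and also save the
        //    visualization to a file (the path below is a placeholder).
        DetectionResult saved = model.predict(argb8888Bitmap,
                "/sdcard/Download/picodet_result.jpg", 0.45f);

        // 3. Predict and only render onto the Bitmap, without saving any file.
        DetectionResult rendered = model.predict(argb8888Bitmap, true, 0.45f);
    }
}
```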
- Model release API: calling release() frees the model's native resources and returns true on success, false on failure; calling initialized() checks whether the model was initialized successfully, where true means success and false means failure.
```java
public boolean release(); // Release native resources
public boolean initialized(); // Check whether initialization succeeded
```
- RuntimeOption settings
```java
public void enableLiteFp16(); // Enable FP16 inference
@@ -70,16 +67,18 @@ public void disableLiteFP16(); // Disable FP16 inference
public void setCpuThreadNum(int threadNum); // Set the number of CPU threads
public void setLitePowerMode(LitePowerMode mode); // Set the power mode
public void setLitePowerMode(String modeStr); // Set the power mode via a string
public void enableRecordTimeOfRuntime(); // Whether to print the model's runtime cost
```
- DetectionResult description
```java
public class DetectionResult {
public float[][] mBoxes; // [n,4] detection boxes (x1,y1,x2,y2)
public float[] mScores; // [n] score of each detection box (confidence / probability)
public int[] mLabelIds; // [n] class IDs
public boolean initialized(); // Whether the detection result is valid
}
```
See also the corresponding C++/Python DetectionResult description: [api/vision_results/detection_result.md](https://github.com/PaddlePaddle/FastDeploy/blob/develop/docs/api/vision_results/detection_result.md)
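To show how these fields are typically consumed, here is a small hedged sketch that formats a `DetectionResult` into readable lines. The helper class name is invented for illustration, and the label list is assumed to be a plain `List<String>` loaded by the caller (for example from the label file shipped with the app); nothing here is mandated by the API itself.
```java
import java.util.ArrayList;
import java.util.List;

import com.baidu.paddle.fastdeploy.vision.DetectionResult;

public class DetectionResultFormatter {
    // Turns a DetectionResult into human-readable strings such as
    // "person 0.87 [x1, y1, x2, y2]". 'labels' maps mLabelIds to names.
    public static List<String> format(DetectionResult result, List<String> labels) {
        List<String> lines = new ArrayList<>();
        if (result == null || !result.initialized()) {
            return lines; // invalid or empty result
        }
        for (int i = 0; i < result.mBoxes.length; i++) {
            int labelId = result.mLabelIds[i];
            String name = (labels != null && labelId >= 0 && labelId < labels.size())
                    ? labels.get(labelId) : String.valueOf(labelId);
            float[] box = result.mBoxes[i]; // (x1, y1, x2, y2)
            lines.add(String.format("%s %.2f [%.1f, %.1f, %.1f, %.1f]",
                    name, result.mScores[i], box[0], box[1], box[2], box[3]));
        }
        return lines;
    }
}
```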
- Model usage example 1: initialize with the constructor and the default RuntimeOption
```java
@@ -125,32 +124,28 @@ String configFile = "picodet_s_320_coco_lcnet/infer_cfg.yml";
RuntimeOption option = new RuntimeOption();
option.setCpuThreadNum(2);
option.setLitePowerMode(LitePowerMode.LITE_POWER_HIGH);
option.enableRecordTimeOfRuntime();
option.enableLiteFp16();
// Initialize with the init function
model.init(modelFile, paramFile, configFile, option);
// Bitmap reading, model prediction, and resource release are the same as above ...
```
For more detailed usage, please refer to [DetectionMainActivity](./app/src/main/java/com/baidu/paddle/fastdeploy/app/examples/detection/DetectionMainActivity.java)
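Before moving to the full demo, the lifecycle described above can be tied together in one compact, hedged sketch: initialize with a RuntimeOption, run one prediction, then release the native resources. The class name is illustrative, and `modelDir` is assumed to be a directory on the device that already contains the unpacked model.pdmodel, model.pdiparams, and infer_cfg.yml files mentioned in this document.
```java
import android.graphics.Bitmap;

import com.baidu.paddle.fastdeploy.LitePowerMode;
import com.baidu.paddle.fastdeploy.RuntimeOption;
import com.baidu.paddle.fastdeploy.vision.DetectionResult;
import com.baidu.paddle.fastdeploy.vision.detection.PicoDet;

public class PicoDetLifecycleSketch {
    public static DetectionResult detectOnce(String modelDir, Bitmap argb8888Bitmap) {
        String modelFile = modelDir + "/model.pdmodel";
        String paramFile = modelDir + "/model.pdiparams";
        String configFile = modelDir + "/infer_cfg.yml";

        RuntimeOption option = new RuntimeOption();
        option.setCpuThreadNum(2);
        option.setLitePowerMode(LitePowerMode.LITE_POWER_HIGH);
        option.enableLiteFp16();

        PicoDet model = new PicoDet();
        DetectionResult result = null;
        if (model.init(modelFile, paramFile, configFile, option)) {
            result = model.predict(argb8888Bitmap); // plain prediction, no rendering
        }
        model.release(); // always free the native resources when done
        return result;
    }
}
```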
## Replace the FastDeploy SDK and Models
Replacing the FastDeploy prediction library and the models is straightforward. The prediction library is located at `app/libs/fastdeploy-android-sdk-xxx.aar`, where `xxx` is the version number of the library you are using. The models are located at `app/src/main/assets/models/picodet_s_320_coco_lcnet`
- To replace the FastDeploy Android SDK: download or build the latest FastDeploy Android SDK, extract it, and place it under the `app/libs` directory; for detailed configuration, see:
  - [Use the FastDeploy Java SDK on Android](../../../../../java/android/)
## Replace the FastDeploy Prediction Library and Models
Replacing the FastDeploy prediction library and the models is straightforward. The prediction library is located at `app/libs/fastdeploy-android-xxx-shared`, where `xxx` is the version number of the library you are using. The models are located at `app/src/main/assets/models/picodet_s_320_coco_lcnet`.
- Steps to replace the FastDeploy prediction library:
  - Download or build the latest FastDeploy Android prediction library, extract it, and place it under the `app/libs` directory;
  - Modify the prediction library path in `app/src/main/cpp/CMakeLists.txt` to point to the library you downloaded or built, e.g.:
```cmake
set(FastDeploy_DIR "${CMAKE_CURRENT_SOURCE_DIR}/../../../libs/fastdeploy-android-xxx-shared")
```
- Steps to replace the PicoDet model:
  - Put your PicoDet model under the `app/src/main/assets/models` directory;
  - Modify the default model path in `app/src/main/res/values/strings.xml`, e.g.:
```xml
<!-- Change this path to your model, e.g. models/picodet_l_320_coco_lcnet -->
<string name="DETECTION_MODEL_DIR_DEFAULT">models/picodet_s_320_coco_lcnet</string>
<string name="DETECTION_LABEL_PATH_DEFAULT">labels/coco_label_list.txt</string>
```
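As an illustration of how an app might pick up these defaults at runtime, here is a hedged sketch that reads the two string resources in an Activity. The Activity subclass is invented for illustration, and it assumes the model files are copied from `assets/` to local storage elsewhere (the demo app handles this with its own utility code, which is not reproduced here).
```java
import android.app.Activity;
import android.os.Bundle;
import android.util.Log;

import com.baidu.paddle.fastdeploy.app.examples.R;

public class ModelSettingsSketch extends Activity {
    private static final String TAG = "ModelSettingsSketch";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // R.string identifiers are generated from strings.xml, so the names
        // below match the <string name="..."> entries shown above.
        String modelDir = getString(R.string.DETECTION_MODEL_DIR_DEFAULT);
        String labelPath = getString(R.string.DETECTION_LABEL_PATH_DEFAULT);
        Log.i(TAG, "model dir (relative to assets/): " + modelDir);
        Log.i(TAG, "label file (relative to assets/): " + labelPath);
        // These relative paths point into assets/; copy them to, e.g.,
        // getCacheDir() before passing absolute paths to PicoDet.init(...).
    }
}
```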
## More Reference Documentation
If you want to learn more about the FastDeploy Java API, or are interested in how to access the FastDeploy C++ API through JNI, you can refer to the following:
- [Use the FastDeploy Java SDK on Android](../../../../../java/android/)
- [Use the FastDeploy C++ SDK on Android](../../../../../docs/cn/faq/use_cpp_sdk_on_android.md)

View File

@@ -6,21 +6,13 @@ android {
compileSdk 28 compileSdk 28
defaultConfig { defaultConfig {
applicationId "com.baidu.paddle.fastdeploy" applicationId 'com.baidu.paddle.fastdeploy.app.examples'
minSdkVersion 15 minSdkVersion 15
//noinspection ExpiredTargetSdkVersion //noinspection ExpiredTargetSdkVersion
targetSdkVersion 28 targetSdkVersion 28
versionCode 1 versionCode 1
versionName "1.0" versionName "1.0"
testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner" testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
externalNativeBuild {
cmake {
arguments '-DANDROID_PLATFORM=android-21', '-DANDROID_STL=c++_shared', "-DANDROID_TOOLCHAIN=clang"
abiFilters 'armeabi-v7a', 'arm64-v8a'
cppFlags "-std=c++11"
}
}
} }
buildTypes { buildTypes {
@@ -30,17 +22,10 @@ android {
} }
} }
externalNativeBuild {
cmake {
path file('src/main/cpp/CMakeLists.txt')
version '3.10.2'
}
}
ndkVersion '20.1.5948944'
} }
dependencies { dependencies {
implementation fileTree(include: ['*.jar'], dir: 'libs') implementation fileTree(include: ['*.aar'], dir: 'libs')
implementation 'com.android.support:appcompat-v7:28.0.0' implementation 'com.android.support:appcompat-v7:28.0.0'
//noinspection GradleDependency //noinspection GradleDependency
implementation 'com.android.support.constraint:constraint-layout:1.1.3' implementation 'com.android.support.constraint:constraint-layout:1.1.3'
@@ -52,49 +37,81 @@ dependencies {
androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.2' androidTestImplementation 'com.android.support.test.espresso:espresso-core:3.0.2'
} }
def archives = [ def FD_MODEL = [
[
'src' : 'https://bj.bcebos.com/fastdeploy/release/android/fastdeploy-android-0.4.0-shared.tgz',
'dest': 'libs'
],
[ [
'src' : 'https://bj.bcebos.com/paddlehub/fastdeploy/picodet_s_320_coco_lcnet.tgz', 'src' : 'https://bj.bcebos.com/paddlehub/fastdeploy/picodet_s_320_coco_lcnet.tgz',
'dest': 'src/main/assets/models' 'dest': 'src/main/assets/models'
],
[
'src': 'https://bj.bcebos.com/paddlehub/fastdeploy/picodet_l_320_coco_lcnet.tgz',
'dest' : 'src/main/assets/models'
] ]
] ]
task downloadAndExtractArchives(type: DefaultTask) { def FD_JAVA_SDK = [
[
'src' : 'https://bj.bcebos.com/fastdeploy/test/fastdeploy-android-sdk-latest-dev.aar',
'dest': 'libs'
]
]
task downloadAndExtractModels(type: DefaultTask) {
doFirst { doFirst {
println "Downloading and extracting archives including libs and models" println "Downloading and extracting fastdeploy models ..."
} }
doLast { doLast {
// Prepare cache folder for archives
String cachePath = "cache" String cachePath = "cache"
if (!file("${cachePath}").exists()) { if (!file("${cachePath}").exists()) {
mkdir "${cachePath}" mkdir "${cachePath}"
} }
archives.eachWithIndex { archive, index -> FD_MODEL.eachWithIndex { model, index ->
MessageDigest messageDigest = MessageDigest.getInstance('MD5') MessageDigest messageDigest = MessageDigest.getInstance('MD5')
messageDigest.update(archive.src.bytes) messageDigest.update(model.src.bytes)
String cacheName = new BigInteger(1, messageDigest.digest()).toString(32) String[] modelPaths = model.src.split("/")
// Download the target archive if not exists String modelName = modelPaths[modelPaths.length - 1]
boolean copyFiles = !file("${archive.dest}").exists() // Download the target model if not exists
if (!file("${cachePath}/${cacheName}.tgz").exists()) { boolean copyFiles = !file("${model.dest}").exists()
ant.get(src: archive.src, dest: file("${cachePath}/${cacheName}.tgz")) if (!file("${cachePath}/${modelName}").exists()) {
copyFiles = true // force to copy files from the latest archive files println "Downloading ${model.src} -> ${cachePath}/${modelName}"
ant.get(src: model.src, dest: file("${cachePath}/${modelName}"))
copyFiles = true
} }
// Extract the target archive if its dest path does not exists
if (copyFiles) { if (copyFiles) {
println "Coping ${cachePath}/${modelName} -> ${model.dest}"
copy { copy {
from tarTree("${cachePath}/${cacheName}.tgz") from tarTree("${cachePath}/${modelName}")
into "${archive.dest}" into "${model.dest}"
} }
} }
} }
} }
} }
preBuild.dependsOn downloadAndExtractArchives
task downloadAndExtractSDKs(type: DefaultTask) {
doFirst {
println "Downloading and extracting fastdeploy android java sdk ..."
}
doLast {
String cachePath = "cache"
if (!file("${cachePath}").exists()) {
mkdir "${cachePath}"
}
FD_JAVA_SDK.eachWithIndex { sdk, index ->
String[] sdkPaths = sdk.src.split("/")
String sdkName = sdkPaths[sdkPaths.length - 1]
// Download the target SDK if not exists
boolean copyFiles = !file("${sdk.dest}/${sdkName}").exists()
if (!file("${cachePath}/${sdkName}").exists()) {
println "Downloading ${sdk.src} -> ${cachePath}/${sdkName}"
ant.get(src: sdk.src, dest: file("${cachePath}/${sdkName}"))
copyFiles = true
}
if (copyFiles) {
println "Coping ${cachePath}/${sdkName} -> ${sdk.dest}/${sdkName}"
copy {
from "${cachePath}/${sdkName}"
into "${sdk.dest}"
}
}
}
}
}
preBuild.dependsOn downloadAndExtractSDKs
preBuild.dependsOn downloadAndExtractModels

View File

@@ -1 +0,0 @@
fastdeploy-*

View File

@@ -1,6 +1,6 @@
<?xml version="1.0" encoding="utf-8"?> <?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android" <manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.baidu.paddle.fastdeploy.examples"> package="com.baidu.paddle.fastdeploy.app.examples">
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/> <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/> <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
@@ -15,14 +15,14 @@
android:roundIcon="@mipmap/ic_launcher_round" android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true" android:supportsRtl="true"
android:theme="@style/AppTheme"> android:theme="@style/AppTheme">
<activity android:name="com.baidu.paddle.fastdeploy.examples.MainActivity"> <activity android:name=".detection.DetectionMainActivity">
<intent-filter> <intent-filter>
<action android:name="android.intent.action.MAIN"/> <action android:name="android.intent.action.MAIN"/>
<category android:name="android.intent.category.LAUNCHER"/> <category android:name="android.intent.category.LAUNCHER"/>
</intent-filter> </intent-filter>
</activity> </activity>
<activity <activity
android:name="com.baidu.paddle.fastdeploy.examples.SettingsActivity" android:name=".detection.DetectionSettingsActivity"
android:label="Settings"> android:label="Settings">
</activity> </activity>
</application> </application>

View File

@@ -0,0 +1,94 @@
0
1
2
3
4
5
6
7
8
9
:
;
<
=
>
?
@
A
B
C
D
E
F
G
H
I
J
K
L
M
N
O
P
Q
R
S
T
U
V
W
X
Y
Z
[
\
]
^
_
`
a
b
c
d
e
f
g
h
i
j
k
l
m
n
o
p
q
r
s
t
u
v
w
x
y
z
{
|
}
~
!
"
#
$
%
&
'
(
)
*
+
,
-
.
/

View File

@@ -1,60 +0,0 @@
# For more information about using CMake with Android Studio, read the
# documentation: https://d.android.com/studio/projects/add-native-code.html
# Sets the minimum version of CMake required to build the native library.
cmake_minimum_required(VERSION 3.10.2)
# Declares and names the project.
project("fastdeploy_jni")
# Creates and names a library, sets it as either STATIC
# or SHARED, and provides the relative paths to its source code.
# You can define multiple libraries, and CMake builds them for you.
# Gradle automatically packages shared libraries with your APK.
set(FastDeploy_DIR "${CMAKE_CURRENT_SOURCE_DIR}/../../../libs/fastdeploy-android-0.4.0-shared")
find_package(FastDeploy REQUIRED)
include_directories(${CMAKE_CURRENT_SOURCE_DIR})
include_directories(${FastDeploy_INCLUDE_DIRS})
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -ffast-math -Ofast -Os -DNDEBUG -fno-exceptions -fomit-frame-pointer -fno-asynchronous-unwind-tables -fno-unwind-tables")
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fvisibility=hidden -fvisibility-inlines-hidden -fdata-sections -ffunction-sections")
set(CMAKE_SHARED_LINKER_FLAGS "${CMAKE_SHARED_LINKER_FLAGS} -Wl,--gc-sections -Wl,-z,nocopyreloc")
add_library(
fastdeploy_jni
SHARED
utils_jni.cc
bitmap_jni.cc
vision/results_jni.cc
vision/visualize_jni.cc
vision/detection/picodet_jni.cc
vision/classification/paddleclas_model_jni.cc)
# Searches for a specified prebuilt library and stores the path as a
# variable. Because CMake includes system libraries in the search path by
# default, you only need to specify the name of the public NDK library
# you want to add. CMake verifies that the library exists before
# completing its build.
find_library( # Sets the name of the path variable.
log-lib
# Specifies the name of the NDK library that
# you want CMake to locate.
log)
# Specifies libraries CMake should link to your target library. You can link
# multiple libraries, such as libraries you define in this build script,
# prebuilt third-party libraries, or system libraries.
target_link_libraries(
# Specifies the target library.
fastdeploy_jni
jnigraphics
${FASTDEPLOY_LIBS}
GLESv2
EGL
${log-lib}
)

View File

@@ -1,100 +0,0 @@
// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include "bitmap_jni.h" // NOLINT
#include <android/bitmap.h> // NOLINT
#include "utils_jni.h" // NOLINT
namespace fastdeploy {
namespace jni {
jboolean ARGB888Bitmap2RGBA(JNIEnv *env, jobject j_argb8888_bitmap,
cv::Mat *c_rgba) {
// Convert the android bitmap(ARGB8888) to the OpenCV RGBA image. Actually,
// the data layout of ARGB8888 is R, G, B, A, it's the same as CV RGBA image,
// so it is unnecessary to do the conversion of color format, check
// https://developer.android.com/reference/android/graphics/Bitmap.Config#ARGB_8888
// to get the more details about Bitmap.Config.ARGB8888
AndroidBitmapInfo j_bitmap_info;
if (AndroidBitmap_getInfo(env, j_argb8888_bitmap, &j_bitmap_info) < 0) {
LOGE("Invoke AndroidBitmap_getInfo() failed!");
return JNI_FALSE;
}
if (j_bitmap_info.format != ANDROID_BITMAP_FORMAT_RGBA_8888) {
LOGE("Only Bitmap.Config.ARGB8888 color format is supported!");
return JNI_FALSE;
}
void *j_bitmap_pixels;
if (AndroidBitmap_lockPixels(env, j_argb8888_bitmap, &j_bitmap_pixels) < 0) {
LOGE("Invoke AndroidBitmap_lockPixels() failed!");
return JNI_FALSE;
}
cv::Mat j_bitmap_im(static_cast<int>(j_bitmap_info.height),
static_cast<int>(j_bitmap_info.width), CV_8UC4,
j_bitmap_pixels);
j_bitmap_im.copyTo(*(c_rgba));
if (AndroidBitmap_unlockPixels(env, j_argb8888_bitmap) < 0) {
LOGE("Invoke AndroidBitmap_unlockPixels() failed!");
return JNI_FALSE;
}
return JNI_TRUE;
}
jboolean ARGB888Bitmap2BGR(JNIEnv *env, jobject j_argb8888_bitmap,
cv::Mat *c_bgr) {
cv::Mat c_rgba;
if (!ARGB888Bitmap2RGBA(env, j_argb8888_bitmap, &c_rgba)) {
return JNI_FALSE;
}
cv::cvtColor(c_rgba, *(c_bgr), cv::COLOR_RGBA2BGR);
return JNI_TRUE;
}
jboolean RGBA2ARGB888Bitmap(JNIEnv *env, jobject j_argb8888_bitmap,
const cv::Mat &c_rgba) {
AndroidBitmapInfo j_bitmap_info;
if (AndroidBitmap_getInfo(env, j_argb8888_bitmap, &j_bitmap_info) < 0) {
LOGE("Invoke AndroidBitmap_getInfo() failed!");
return JNI_FALSE;
}
void *j_bitmap_pixels;
if (AndroidBitmap_lockPixels(env, j_argb8888_bitmap, &j_bitmap_pixels) < 0) {
LOGE("Invoke AndroidBitmap_lockPixels() failed!");
return JNI_FALSE;
}
cv::Mat j_bitmap_im(static_cast<int>(j_bitmap_info.height),
static_cast<int>(j_bitmap_info.width), CV_8UC4,
j_bitmap_pixels);
c_rgba.copyTo(j_bitmap_im);
if (AndroidBitmap_unlockPixels(env, j_argb8888_bitmap) < 0) {
LOGE("Invoke AndroidBitmap_unlockPixels() failed!");
return JNI_FALSE;
}
return JNI_TRUE;
}
jboolean BGR2ARGB888Bitmap(JNIEnv *env, jobject j_argb8888_bitmap,
const cv::Mat &c_bgr) {
if (c_bgr.empty()) {
return JNI_FALSE;
}
cv::Mat c_rgba;
cv::cvtColor(c_bgr, c_rgba, cv::COLOR_BGR2RGBA);
return RGBA2ARGB888Bitmap(env, j_argb8888_bitmap, c_rgba);
}
} // namespace jni
} // namespace fastdeploy

View File

@@ -1,39 +0,0 @@
// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#pragma once
#include <jni.h> // NOLINT
#include "fastdeploy/vision.h" // NOLINT
namespace fastdeploy {
namespace jni {
// Convert the android bitmap(ARGB8888) to the OpenCV RGBA image. Actually,
// the data layout of ARGB8888 is R, G, B, A, it's the same as CV RGBA image,
// so it is unnecessary to do the conversion of color format, check
// https://developer.android.com/reference/android/graphics/Bitmap.Config#ARGB_8888
// to get the more details about Bitmap.Config.ARGB8888
jboolean ARGB888Bitmap2RGBA(JNIEnv *env, jobject j_argb8888_bitmap,
cv::Mat *c_rgba);
jboolean RGBA2ARGB888Bitmap(JNIEnv *env, jobject j_argb8888_bitmap,
const cv::Mat &c_rgba);
jboolean ARGB888Bitmap2BGR(JNIEnv *env, jobject j_argb8888_bitmap,
cv::Mat *c_bgr);
jboolean BGR2ARGB888Bitmap(JNIEnv *env, jobject j_argb8888_bitmap,
const cv::Mat &c_bgr);
} // namespace jni
} // namespace fastdeploy

View File

@@ -1,140 +0,0 @@
// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#pragma once
#include <jni.h> // NOLINT
#include <string> // NOLINT
#include <vector> // NOLINT
namespace fastdeploy {
namespace jni {
template <typename OutputType, typename InputType>
OutputType ConvertTo(JNIEnv *env, InputType input);
template <typename OutputType, typename InputType>
OutputType ConvertTo(JNIEnv *env, const InputType *input, int64_t len);
/// jstring -> std::string
template <>
inline std::string ConvertTo(JNIEnv *env, jstring jstr) {
// In java, a unicode char will be encoded using 2 bytes (utf16).
// so jstring will contain characters utf16. std::string in c++ is
// essentially a string of bytes, not characters, so if we want to
// pass jstring from JNI to c++, we have convert utf16 to bytes.
if (!jstr) {
return "";
}
const jclass jstring_clazz = env->GetObjectClass(jstr);
const jmethodID getBytesID =
env->GetMethodID(jstring_clazz, "getBytes", "(Ljava/lang/String;)[B");
const jbyteArray jstring_bytes = (jbyteArray)env->CallObjectMethod(
jstr, getBytesID, env->NewStringUTF("UTF-8"));
size_t length = static_cast<size_t>(env->GetArrayLength(jstring_bytes));
jbyte *jstring_bytes_ptr = env->GetByteArrayElements(jstring_bytes, NULL);
std::string res =
std::string(reinterpret_cast<char *>(jstring_bytes_ptr), length);
env->ReleaseByteArrayElements(jstring_bytes, jstring_bytes_ptr, JNI_ABORT);
env->DeleteLocalRef(jstring_bytes);
env->DeleteLocalRef(jstring_clazz);
return res;
}
/// std::string -> jstring
template <>
inline jstring ConvertTo(JNIEnv *env, std::string str) {
auto *cstr_data_ptr = str.c_str();
jclass jstring_clazz = env->FindClass("java/lang/String");
jmethodID initID =
env->GetMethodID(jstring_clazz, "<init>", "([BLjava/lang/String;)V");
jbyteArray jstring_bytes = env->NewByteArray(strlen(cstr_data_ptr));
env->SetByteArrayRegion(jstring_bytes, 0, strlen(cstr_data_ptr),
reinterpret_cast<const jbyte *>(cstr_data_ptr));
jstring jstring_encoding = env->NewStringUTF("UTF-8");
jstring res = (jstring)(env->NewObject(jstring_clazz, initID, jstring_bytes,
jstring_encoding));
env->DeleteLocalRef(jstring_clazz);
env->DeleteLocalRef(jstring_bytes);
env->DeleteLocalRef(jstring_encoding);
return res;
}
/// jlongArray -> std::vector<int64_t>
template <>
inline std::vector<int64_t> ConvertTo(JNIEnv *env, jlongArray jdata) {
int jdata_size = env->GetArrayLength(jdata);
jlong *jdata_ptr = env->GetLongArrayElements(jdata, nullptr);
std::vector<int64_t> res(jdata_ptr, jdata_ptr + jdata_size);
env->ReleaseLongArrayElements(jdata, jdata_ptr, 0);
return res;
}
/// jfloatArray -> std::vector<float>
template <>
inline std::vector<float> ConvertTo(JNIEnv *env, jfloatArray jdata) {
int jdata_size = env->GetArrayLength(jdata);
jfloat *jdata_ptr = env->GetFloatArrayElements(jdata, nullptr);
std::vector<float> res(jdata_ptr, jdata_ptr + jdata_size);
env->ReleaseFloatArrayElements(jdata, jdata_ptr, 0);
return res;
}
/// std::vector<int64_t> -> jlongArray
template <>
inline jlongArray ConvertTo(JNIEnv *env, const std::vector<int64_t> &cvec) {
jlongArray res = env->NewLongArray(cvec.size());
jlong *jbuf = new jlong[cvec.size()];
for (size_t i = 0; i < cvec.size(); ++i) {
jbuf[i] = (jlong)cvec[i];
}
env->SetLongArrayRegion(res, 0, cvec.size(), jbuf);
delete[] jbuf;
return res;
}
/// cxx float buffer -> jfloatArray
template <>
inline jfloatArray ConvertTo(JNIEnv *env, const float *cbuf, int64_t len) {
jfloatArray res = env->NewFloatArray(len);
env->SetFloatArrayRegion(res, 0, len, cbuf);
return res;
}
/// cxx int buffer -> jintArray
template <>
inline jintArray ConvertTo(JNIEnv *env, const int *cbuf, int64_t len) {
jintArray res = env->NewIntArray(len);
env->SetIntArrayRegion(res, 0, len, cbuf);
return res;
}
/// cxx int8_t buffer -> jbyteArray
template <>
inline jbyteArray ConvertTo(JNIEnv *env, const int8_t *cbuf, int64_t len) {
jbyteArray res = env->NewByteArray(len);
env->SetByteArrayRegion(res, 0, len, cbuf);
return res;
}
} // namespace jni
} // namespace fastdeploy

View File

@@ -1,18 +0,0 @@
// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#pragma once
#include "bitmap_jni.h" // NOLINT
#include "convert_jni.h" // NOLINT
#include "utils_jni.h" // NOLINT

View File

@@ -1,82 +0,0 @@
//
// Created by qiuyanjun on 2022/10/19.
//
// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include "utils_jni.h"
namespace fastdeploy {
namespace jni {
// Assets Loader Utils.
bool AssetsLoaderUtils::detection_labels_loaded_ = false;
bool AssetsLoaderUtils::classification_labels_loaded_ = false;
std::vector<std::string> AssetsLoaderUtils::detection_labels_ = {};
std::vector<std::string> AssetsLoaderUtils::classification_labels_ = {};
bool AssetsLoaderUtils::IsDetectionLabelsLoaded() {
return detection_labels_loaded_;
}
bool AssetsLoaderUtils::IsClassificationLabelsLoaded() {
return classification_labels_loaded_;
}
const std::vector<std::string>& AssetsLoaderUtils::GetDetectionLabels() {
return detection_labels_;
}
const std::vector<std::string>& AssetsLoaderUtils::GetClassificationLabels() {
return classification_labels_;
}
void AssetsLoaderUtils::LoadClassificationLabels(const std::string& path,
bool force_reload) {
if (force_reload || (!classification_labels_loaded_)) {
classification_labels_loaded_ =
LoadLabelsFromTxt(path, &classification_labels_);
}
}
void AssetsLoaderUtils::LoadDetectionLabels(const std::string& path,
bool force_reload) {
if (force_reload || (!detection_labels_loaded_)) {
detection_labels_loaded_ = LoadLabelsFromTxt(path, &detection_labels_);
}
}
bool AssetsLoaderUtils::LoadLabelsFromTxt(const std::string& txt_path,
std::vector<std::string>* labels) {
labels->clear();
std::ifstream file;
file.open(txt_path);
if (!file.is_open()) {
return false;
}
while (file) {
std::string line;
std::getline(file, line);
if (!line.empty() && line != "\n") {
labels->push_back(line);
}
}
file.clear();
file.close();
return labels->size() > 0;
}
} // namespace jni
} // namespace fastdeploy

View File

@@ -1,80 +0,0 @@
// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#pragma once
#ifdef __ANDROID__
#include <android/log.h> // NOLINT
#endif
#include <fstream> // NOLINT
#include <string> // NOLINT
#include <vector> // NOLINT
#define TAG "[FastDeploy][JNI]"
#ifdef __ANDROID__
#define LOGD(...) __android_log_print(ANDROID_LOG_DEBUG, TAG, __VA_ARGS__)
#define LOGI(...) __android_log_print(ANDROID_LOG_INFO, TAG, __VA_ARGS__)
#define LOGW(...) __android_log_print(ANDROID_LOG_WARN, TAG, __VA_ARGS__)
#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR, TAG, __VA_ARGS__)
#define LOGF(...) __android_log_print(ANDROID_LOG_FATAL, TAG, __VA_ARGS__)
#else
#define LOGD(...) \
{}
#define LOGI(...) \
{}
#define LOGW(...) \
{}
#define LOGE(...) \
{}
#define LOGF(...) \
{}
#endif
namespace fastdeploy {
namespace jni {
inline int64_t GetCurrentTime() {
struct timeval time;
gettimeofday(&time, NULL);
return 1000000LL * (int64_t)time.tv_sec + (int64_t)time.tv_usec;
}
inline double GetElapsedTime(int64_t time) {
return (GetCurrentTime() - time) / 1000.0f;
}
class AssetsLoaderUtils {
public:
static bool detection_labels_loaded_;
static bool classification_labels_loaded_;
static std::vector<std::string> detection_labels_;
static std::vector<std::string> classification_labels_;
public:
static bool IsDetectionLabelsLoaded();
static bool IsClassificationLabelsLoaded();
static const std::vector<std::string>& GetDetectionLabels();
static const std::vector<std::string>& GetClassificationLabels();
static void LoadClassificationLabels(const std::string& path,
bool force_reload = false);
static void LoadDetectionLabels(const std::string& path,
bool force_reload = false);
private:
static bool LoadLabelsFromTxt(const std::string& txt_path,
std::vector<std::string>* labels);
};
} // namespace jni
} // namespace fastdeploy

View File

@@ -1,148 +0,0 @@
// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <jni.h> // NOLINT
#include "fastdeploy_jni.h" // NOLINT
#ifdef __cplusplus
extern "C" {
#endif
JNIEXPORT jlong JNICALL
Java_com_baidu_paddle_fastdeploy_vision_classification_PaddleClasModel_bindNative(
JNIEnv *env, jclass clazz, jstring model_file, jstring params_file,
jstring config_file, jint cpu_num_thread, jboolean enable_lite_fp16,
jint lite_power_mode, jstring lite_optimized_model_dir,
jboolean enable_record_time_of_runtime, jstring label_file) {
std::string c_model_file =
fastdeploy::jni::ConvertTo<std::string>(env, model_file);
std::string c_params_file =
fastdeploy::jni::ConvertTo<std::string>(env, params_file);
std::string c_config_file =
fastdeploy::jni::ConvertTo<std::string>(env, config_file);
std::string c_label_file =
fastdeploy::jni::ConvertTo<std::string>(env, label_file);
std::string c_lite_optimized_model_dir =
fastdeploy::jni::ConvertTo<std::string>(env, lite_optimized_model_dir);
auto c_cpu_num_thread = static_cast<int>(cpu_num_thread);
auto c_enable_lite_fp16 = static_cast<bool>(enable_lite_fp16);
auto c_lite_power_mode =
static_cast<fastdeploy::LitePowerMode>(lite_power_mode);
fastdeploy::RuntimeOption c_option;
c_option.UseCpu();
c_option.UseLiteBackend();
c_option.SetCpuThreadNum(c_cpu_num_thread);
c_option.SetLitePowerMode(c_lite_power_mode);
c_option.SetLiteOptimizedModelDir(c_lite_optimized_model_dir);
if (c_enable_lite_fp16) {
c_option.EnableLiteFP16();
}
auto c_model_ptr = new fastdeploy::vision::classification::PaddleClasModel(
c_model_file, c_params_file, c_config_file, c_option);
// Enable record Runtime time costs.
if (enable_record_time_of_runtime) {
c_model_ptr->EnableRecordTimeOfRuntime();
}
// Load classification labels if label path is not empty.
if ((!fastdeploy::jni::AssetsLoaderUtils::IsClassificationLabelsLoaded()) &&
(!c_label_file.empty())) {
fastdeploy::jni::AssetsLoaderUtils::LoadClassificationLabels(c_label_file);
}
// WARN: need to release manually in Java !
return reinterpret_cast<jlong>(c_model_ptr); // native model context
}
JNIEXPORT jlong JNICALL
Java_com_baidu_paddle_fastdeploy_vision_classification_PaddleClasModel_predictNative(
JNIEnv *env, jclass clazz, jlong native_model_context,
jobject argb8888_bitmap, jboolean saved, jstring saved_image_path,
jfloat score_threshold, jboolean rendering) {
if (native_model_context == 0) {
return 0;
}
cv::Mat c_bgr;
auto t = fastdeploy::jni::GetCurrentTime();
if (!fastdeploy::jni::ARGB888Bitmap2BGR(env, argb8888_bitmap, &c_bgr)) {
return 0;
}
LOGD("Read from bitmap costs %f ms", fastdeploy::jni::GetElapsedTime(t));
auto c_model_ptr =
reinterpret_cast<fastdeploy::vision::classification::PaddleClasModel *>(
native_model_context);
auto c_result_ptr = new fastdeploy::vision::ClassifyResult();
t = fastdeploy::jni::GetCurrentTime();
if (!c_model_ptr->Predict(&c_bgr, c_result_ptr, 100)) {
delete c_result_ptr;
return 0;
}
LOGD("Predict from native costs %f ms", fastdeploy::jni::GetElapsedTime(t));
if (c_model_ptr->EnabledRecordTimeOfRuntime()) {
auto info_of_runtime = c_model_ptr->PrintStatisInfoOfRuntime();
LOGD("Avg runtime costs %f ms", info_of_runtime["avg_time"] * 1000.0f);
}
if (!c_result_ptr->scores.empty() && rendering) {
t = fastdeploy::jni::GetCurrentTime();
cv::Mat c_vis_im;
if (fastdeploy::jni::AssetsLoaderUtils::IsClassificationLabelsLoaded()) {
c_vis_im = fastdeploy::vision::VisClassification(
c_bgr, *(c_result_ptr),
fastdeploy::jni::AssetsLoaderUtils::GetClassificationLabels(), 5,
score_threshold, 1.0f);
} else {
c_vis_im = fastdeploy::vision::VisClassification(
c_bgr, *(c_result_ptr), 5, score_threshold, 1.0f);
}
LOGD("Visualize from native costs %f ms",
fastdeploy::jni::GetElapsedTime(t));
// Rendering to bitmap
t = fastdeploy::jni::GetCurrentTime();
if (!fastdeploy::jni::BGR2ARGB888Bitmap(env, argb8888_bitmap, c_vis_im)) {
delete c_result_ptr;
return 0;
}
LOGD("Write to bitmap from native costs %f ms",
fastdeploy::jni::GetElapsedTime(t));
std::string c_saved_image_path =
fastdeploy::jni::ConvertTo<std::string>(env, saved_image_path);
if (!c_saved_image_path.empty() && saved) {
t = fastdeploy::jni::GetCurrentTime();
cv::imwrite(c_saved_image_path, c_bgr);
LOGD("Save image from native costs %f ms, path: %s",
fastdeploy::jni::GetElapsedTime(t), c_saved_image_path.c_str());
}
}
// WARN: need to release it manually in Java !
return reinterpret_cast<jlong>(c_result_ptr); // native result context
}
JNIEXPORT jboolean JNICALL
Java_com_baidu_paddle_fastdeploy_vision_classification_PaddleClasModel_releaseNative(
JNIEnv *env, jclass clazz, jlong native_model_context) {
auto c_model_ptr =
reinterpret_cast<fastdeploy::vision::classification::PaddleClasModel *>(
native_model_context);
if (c_model_ptr->EnabledRecordTimeOfRuntime()) {
auto info_of_runtime = c_model_ptr->PrintStatisInfoOfRuntime();
LOGD("[End] Avg runtime costs %f ms",
info_of_runtime["avg_time"] * 1000.0f);
}
delete c_model_ptr;
LOGD("[End] Release PaddleClasModel in native !");
return JNI_TRUE;
}
#ifdef __cplusplus
}
#endif

View File

@@ -1,149 +0,0 @@
// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <jni.h> // NOLINT
#include "fastdeploy_jni.h" // NOLINT
#ifdef __cplusplus
extern "C" {
#endif
JNIEXPORT jlong JNICALL
Java_com_baidu_paddle_fastdeploy_vision_detection_PicoDet_bindNative(
JNIEnv *env, jclass clazz, jstring model_file, jstring params_file,
jstring config_file, jint cpu_num_thread, jboolean enable_lite_fp16,
jint lite_power_mode, jstring lite_optimized_model_dir,
jboolean enable_record_time_of_runtime, jstring label_file) {
std::string c_model_file =
fastdeploy::jni::ConvertTo<std::string>(env, model_file);
std::string c_params_file =
fastdeploy::jni::ConvertTo<std::string>(env, params_file);
std::string c_config_file =
fastdeploy::jni::ConvertTo<std::string>(env, config_file);
std::string c_label_file =
fastdeploy::jni::ConvertTo<std::string>(env, label_file);
std::string c_lite_optimized_model_dir =
fastdeploy::jni::ConvertTo<std::string>(env, lite_optimized_model_dir);
auto c_cpu_num_thread = static_cast<int>(cpu_num_thread);
auto c_enable_lite_fp16 = static_cast<bool>(enable_lite_fp16);
auto c_lite_power_mode =
static_cast<fastdeploy::LitePowerMode>(lite_power_mode);
fastdeploy::RuntimeOption c_option;
c_option.UseCpu();
c_option.UseLiteBackend();
c_option.SetCpuThreadNum(c_cpu_num_thread);
c_option.SetLitePowerMode(c_lite_power_mode);
c_option.SetLiteOptimizedModelDir(c_lite_optimized_model_dir);
if (c_enable_lite_fp16) {
c_option.EnableLiteFP16();
}
auto c_model_ptr = new fastdeploy::vision::detection::PicoDet(
c_model_file, c_params_file, c_config_file, c_option);
// Enable record Runtime time costs.
if (enable_record_time_of_runtime) {
c_model_ptr->EnableRecordTimeOfRuntime();
}
// Load detection labels if label path is not empty.
if ((!fastdeploy::jni::AssetsLoaderUtils::IsDetectionLabelsLoaded()) &&
(!c_label_file.empty())) {
fastdeploy::jni::AssetsLoaderUtils::LoadDetectionLabels(c_label_file);
}
// WARN: need to release manually in Java !
return reinterpret_cast<jlong>(c_model_ptr); // native model context
}
JNIEXPORT jlong JNICALL
Java_com_baidu_paddle_fastdeploy_vision_detection_PicoDet_predictNative(
JNIEnv *env, jclass clazz, jlong native_model_context,
jobject argb8888_bitmap, jboolean saved, jstring saved_image_path,
jfloat score_threshold, jboolean rendering) {
if (native_model_context == 0) {
return 0;
}
cv::Mat c_bgr;
auto t = fastdeploy::jni::GetCurrentTime();
if (!fastdeploy::jni::ARGB888Bitmap2BGR(env, argb8888_bitmap, &c_bgr)) {
return 0;
}
LOGD("Read from bitmap costs %f ms", fastdeploy::jni::GetElapsedTime(t));
auto c_model_ptr = reinterpret_cast<fastdeploy::vision::detection::PicoDet *>(
native_model_context);
auto c_result_ptr = new fastdeploy::vision::DetectionResult();
t = fastdeploy::jni::GetCurrentTime();
if (!c_model_ptr->Predict(&c_bgr, c_result_ptr)) {
delete c_result_ptr;
return 0;
}
LOGD("Predict from native costs %f ms", fastdeploy::jni::GetElapsedTime(t));
if (c_model_ptr->EnabledRecordTimeOfRuntime()) {
auto info_of_runtime = c_model_ptr->PrintStatisInfoOfRuntime();
LOGD("Avg runtime costs %f ms", info_of_runtime["avg_time"] * 1000.0f);
}
if (!c_result_ptr->boxes.empty() && rendering) {
t = fastdeploy::jni::GetCurrentTime();
cv::Mat c_vis_im;
if (fastdeploy::jni::AssetsLoaderUtils::IsDetectionLabelsLoaded()) {
c_vis_im = fastdeploy::vision::VisDetection(
c_bgr, *(c_result_ptr),
fastdeploy::jni::AssetsLoaderUtils::GetDetectionLabels(),
score_threshold, 2, 1.0f);
} else {
c_vis_im = fastdeploy::vision::VisDetection(c_bgr, *(c_result_ptr),
score_threshold, 2, 1.0f);
}
LOGD("Visualize from native costs %f ms",
fastdeploy::jni::GetElapsedTime(t));
// Rendering to bitmap
t = fastdeploy::jni::GetCurrentTime();
if (!fastdeploy::jni::BGR2ARGB888Bitmap(env, argb8888_bitmap, c_vis_im)) {
delete c_result_ptr;
return 0;
}
LOGD("Write to bitmap from native costs %f ms",
fastdeploy::jni::GetElapsedTime(t));
std::string c_saved_image_path =
fastdeploy::jni::ConvertTo<std::string>(env, saved_image_path);
if (!c_saved_image_path.empty() && saved) {
t = fastdeploy::jni::GetCurrentTime();
cv::imwrite(c_saved_image_path, c_vis_im);
LOGD("Save image from native costs %f ms, path: %s",
fastdeploy::jni::GetElapsedTime(t), c_saved_image_path.c_str());
}
}
// WARN: need to release it manually in Java !
return reinterpret_cast<jlong>(c_result_ptr); // native result context
}
JNIEXPORT jboolean JNICALL
Java_com_baidu_paddle_fastdeploy_vision_detection_PicoDet_releaseNative(
JNIEnv *env, jclass clazz, jlong native_model_context) {
if (native_model_context == 0) {
return JNI_FALSE;
}
auto c_model_ptr = reinterpret_cast<fastdeploy::vision::detection::PicoDet *>(
native_model_context);
if (c_model_ptr->EnabledRecordTimeOfRuntime()) {
auto info_of_runtime = c_model_ptr->PrintStatisInfoOfRuntime();
LOGD("[End] Avg runtime costs %f ms",
info_of_runtime["avg_time"] * 1000.0f);
}
delete c_model_ptr;
LOGD("[End] Release PicoDet in native !");
return JNI_TRUE;
}
#ifdef __cplusplus
}
#endif

View File

@@ -1,132 +0,0 @@
// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <android/bitmap.h> // NOLINT
#include <jni.h> // NOLINT
#include "fastdeploy/vision.h" // NOLINT
#include "fastdeploy_jni.h" // NOLINT
#ifdef __cplusplus
extern "C" {
#endif
/// Native DetectionResult for vision::DetectionResult.
JNIEXPORT jint JNICALL
Java_com_baidu_paddle_fastdeploy_vision_DetectionResult_copyBoxesNumFromNative(
JNIEnv *env, jclass clazz, jlong native_result_context) {
auto c_result_ptr = reinterpret_cast<fastdeploy::vision::DetectionResult *>(
native_result_context);
return static_cast<jint>(c_result_ptr->boxes.size());
}
JNIEXPORT jfloatArray JNICALL
Java_com_baidu_paddle_fastdeploy_vision_DetectionResult_copyBoxesFromNative(
JNIEnv *env, jclass clazz, jlong native_result_context) {
auto c_result_ptr = reinterpret_cast<fastdeploy::vision::DetectionResult *>(
native_result_context);
if (c_result_ptr->boxes.empty()) {
return {};
}
const auto len = static_cast<int64_t>(c_result_ptr->boxes.size());
float buffer[len * 4];
const auto &boxes = c_result_ptr->boxes;
for (int64_t i = 0; i < len; ++i) {
std::memcpy((buffer + i * 4), (boxes.at(i).data()), 4 * sizeof(float));
}
return fastdeploy::jni::ConvertTo<jfloatArray>(env, buffer, len * 4);
}
JNIEXPORT jfloatArray JNICALL
Java_com_baidu_paddle_fastdeploy_vision_DetectionResult_copyScoresFromNative(
JNIEnv *env, jclass clazz, jlong native_result_context) {
auto c_result_ptr = reinterpret_cast<fastdeploy::vision::DetectionResult *>(
native_result_context);
if (c_result_ptr->scores.empty()) {
return {};
}
const auto len = static_cast<int64_t>(c_result_ptr->scores.size());
const float *buffer = static_cast<float *>(c_result_ptr->scores.data());
return fastdeploy::jni::ConvertTo<jfloatArray>(env, buffer, len);
}
JNIEXPORT jintArray JNICALL
Java_com_baidu_paddle_fastdeploy_vision_DetectionResult_copyLabelIdsFromNative(
JNIEnv *env, jclass clazz, jlong native_result_context) {
auto c_result_ptr = reinterpret_cast<fastdeploy::vision::DetectionResult *>(
native_result_context);
if (c_result_ptr->label_ids.empty()) {
return {};
}
const auto len = static_cast<int64_t>(c_result_ptr->label_ids.size());
const int *buffer = static_cast<int *>(c_result_ptr->label_ids.data());
return fastdeploy::jni::ConvertTo<jintArray>(env, buffer, len);
}
JNIEXPORT jboolean JNICALL
Java_com_baidu_paddle_fastdeploy_vision_DetectionResult_releaseNative(
JNIEnv *env, jclass clazz, jlong native_result_context) {
if (native_result_context == 0) {
return JNI_FALSE;
}
auto c_result_ptr = reinterpret_cast<fastdeploy::vision::DetectionResult *>(
native_result_context);
delete c_result_ptr;
LOGD("Release DetectionResult in native !");
return JNI_TRUE;
}
/// Native ClassifyResult for vision::ClassifyResult.
JNIEXPORT jfloatArray JNICALL
Java_com_baidu_paddle_fastdeploy_vision_ClassifyResult_copyScoresFromNative(
JNIEnv *env, jclass clazz, jlong native_result_context) {
auto c_result_ptr = reinterpret_cast<fastdeploy::vision::ClassifyResult *>(
native_result_context);
if (c_result_ptr->scores.empty()) {
return {};
}
const auto len = static_cast<int64_t>(c_result_ptr->scores.size());
const float *buffer = static_cast<float *>(c_result_ptr->scores.data());
return fastdeploy::jni::ConvertTo<jfloatArray>(env, buffer, len);
}
JNIEXPORT jintArray JNICALL
Java_com_baidu_paddle_fastdeploy_vision_ClassifyResult_copyLabelIdsFromNative(
JNIEnv *env, jclass clazz, jlong native_result_context) {
auto c_result_ptr = reinterpret_cast<fastdeploy::vision::ClassifyResult *>(
native_result_context);
if (c_result_ptr->label_ids.empty()) {
return {};
}
const auto len = static_cast<int64_t>(c_result_ptr->label_ids.size());
const int *buffer = static_cast<int *>(c_result_ptr->label_ids.data());
return fastdeploy::jni::ConvertTo<jintArray>(env, buffer, len);
}
JNIEXPORT jboolean JNICALL
Java_com_baidu_paddle_fastdeploy_vision_ClassifyResult_releaseNative(
JNIEnv *env, jclass clazz, jlong native_result_context) {
if (native_result_context == 0) {
return JNI_FALSE;
}
auto c_result_ptr = reinterpret_cast<fastdeploy::vision::ClassifyResult *>(
native_result_context);
delete c_result_ptr;
LOGD("Release ClassifyResult in native !");
return JNI_TRUE;
}
#ifdef __cplusplus
}
#endif

View File

@@ -1,92 +0,0 @@
// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <jni.h>
#include "fastdeploy_jni.h"
#ifdef __cplusplus
extern "C" {
#endif
JNIEXPORT jboolean JNICALL
Java_com_baidu_paddle_fastdeploy_vision_Visualize_visDetectionNative(
JNIEnv *env, jclass clazz, jobject argb8888_bitmap, jobjectArray boxes,
jfloatArray scores, jintArray label_ids, jfloat score_threshold,
jint line_size, jfloat font_size, jobjectArray labels) {
// Draw DetectionResult to ARGB8888 Bitmap
int len = env->GetArrayLength(boxes);
if ((len == 0) || (len != env->GetArrayLength(scores)) ||
(len != env->GetArrayLength(label_ids))) {
return JNI_FALSE;
}
fastdeploy::vision::DetectionResult c_result;
c_result.Resize(len);
bool check_validation = true;
for (int i = 0; i < len; ++i) {
auto j_box =
reinterpret_cast<jfloatArray>(env->GetObjectArrayElement(boxes, i));
if (env->GetArrayLength(j_box) == 4) {
jfloat *j_box_ptr = env->GetFloatArrayElements(j_box, nullptr);
std::memcpy(c_result.boxes[i].data(), j_box_ptr, 4 * sizeof(float));
env->ReleaseFloatArrayElements(j_box, j_box_ptr, 0);
} else {
check_validation = false;
break;
}
}
if (!check_validation) {
return JNI_FALSE;
}
jfloat *j_scores_ptr = env->GetFloatArrayElements(scores, nullptr);
std::memcpy(c_result.scores.data(), j_scores_ptr, len * sizeof(float));
env->ReleaseFloatArrayElements(scores, j_scores_ptr, 0);
jint *j_label_ids_ptr = env->GetIntArrayElements(label_ids, nullptr);
std::memcpy(c_result.label_ids.data(), j_label_ids_ptr, len * sizeof(int));
env->ReleaseIntArrayElements(label_ids, j_label_ids_ptr, 0);
// Get labels from Java
std::vector<std::string> c_labels;
int label_len = env->GetArrayLength(labels);
if (label_len > 0) {
c_labels.reserve(label_len);
for (int i = 0; i < label_len; ++i) {
auto j_str =
reinterpret_cast<jstring>(env->GetObjectArrayElement(labels, i));
c_labels.push_back(fastdeploy::jni::ConvertTo<std::string>(env, j_str));
}
}
cv::Mat c_bgr;
// From ARGB Bitmap to BGR
if (!fastdeploy::jni::ARGB888Bitmap2BGR(env, argb8888_bitmap, &c_bgr)) {
return JNI_FALSE;
}
cv::Mat c_vis_im;
if (!c_labels.empty()) {
c_vis_im = fastdeploy::vision::VisDetection(
c_bgr, c_result, c_labels, score_threshold, line_size, font_size);
} else {
c_vis_im = fastdeploy::vision::VisDetection(
c_bgr, c_result, score_threshold, line_size, font_size);
}
// Rendering to bitmap
if (!fastdeploy::jni::BGR2ARGB888Bitmap(env, argb8888_bitmap, c_vis_im)) {
return JNI_FALSE;
}
return JNI_TRUE;
}
#ifdef __cplusplus
}
#endif

View File

@@ -1,8 +0,0 @@
package com.baidu.paddle.fastdeploy;
public enum FDModelTag {
UNKNOWN,
VISION_DETECTION_PICODET,
VISION_DETECTION_PPYOLOE,
VISION_CLASSIFICATION_PPCLS
}

View File

@@ -1,22 +0,0 @@
package com.baidu.paddle.fastdeploy;
/**
* Initializer for FastDeploy. The initialization methods are called by package
* classes only. Public users don't have to call them. Public users can get
* FastDeploy information constants such as JNI lib name in this class.
*/
public class FastDeployInitializer {
/** name of C++ JNI lib */
public final static String JNI_LIB_NAME = "fastdeploy_jni";
/**
* loads the C++ JNI lib. We only call it in our package, so it shouldn't be
* visible to public users.
*
* @return true if initialize successfully.
*/
public static boolean init() {
System.loadLibrary(JNI_LIB_NAME);
return true;
}
}

View File

@@ -1,10 +0,0 @@
package com.baidu.paddle.fastdeploy;
public enum LitePowerMode {
LITE_POWER_HIGH,
LITE_POWER_LOW,
LITE_POWER_FULL,
LITE_POWER_NO_BIND,
LITE_POWER_RAND_HIGH,
LITE_POWER_RAND_LOW
}

View File

@@ -1,64 +0,0 @@
package com.baidu.paddle.fastdeploy;
public class RuntimeOption {
public int mCpuThreadNum = 1;
public boolean mEnableLiteFp16 = false;
public boolean mEnableRecordTimeOfRuntime = false;
public LitePowerMode mLitePowerMode = LitePowerMode.LITE_POWER_NO_BIND;
public String mLiteOptimizedModelDir = "";
public RuntimeOption() {
mCpuThreadNum = 1;
mEnableLiteFp16 = false;
mEnableRecordTimeOfRuntime = false;
mLitePowerMode = LitePowerMode.LITE_POWER_NO_BIND;
mLiteOptimizedModelDir = "";
}
public void enableLiteFp16() {
mEnableLiteFp16 = true;
}
public void disableLiteFP16() {
mEnableLiteFp16 = false;
}
public void setCpuThreadNum(int threadNum) {
mCpuThreadNum = threadNum;
}
public void setLitePowerMode(LitePowerMode mode) {
mLitePowerMode = mode;
}
public void setLitePowerMode(String modeStr) {
mLitePowerMode = parseLitePowerModeFromString(modeStr);
}
public void setLiteOptimizedModelDir(String modelDir) {
mLiteOptimizedModelDir = modelDir;
}
public void enableRecordTimeOfRuntime() {
mEnableRecordTimeOfRuntime = true;
}
// Helpers: parse lite power mode from string
public static LitePowerMode parseLitePowerModeFromString(String modeStr) {
if (modeStr.equalsIgnoreCase("LITE_POWER_HIGH")) {
return LitePowerMode.LITE_POWER_HIGH;
} else if (modeStr.equalsIgnoreCase("LITE_POWER_LOW")) {
return LitePowerMode.LITE_POWER_LOW;
} else if (modeStr.equalsIgnoreCase("LITE_POWER_FULL")) {
return LitePowerMode.LITE_POWER_FULL;
} else if (modeStr.equalsIgnoreCase("LITE_POWER_NO_BIND")) {
return LitePowerMode.LITE_POWER_NO_BIND;
} else if (modeStr.equalsIgnoreCase("LITE_POWER_RAND_HIGH")) {
return LitePowerMode.LITE_POWER_RAND_HIGH;
} else if (modeStr.equalsIgnoreCase("LITE_POWER_RAND_LOW")) {
return LitePowerMode.LITE_POWER_RAND_LOW;
} else {
return LitePowerMode.LITE_POWER_NO_BIND;
}
}
}

View File

@@ -0,0 +1,474 @@
package com.baidu.paddle.fastdeploy.app.examples.detection;
import android.Manifest;
import android.annotation.SuppressLint;
import android.app.Activity;
import android.app.AlertDialog;
import android.content.DialogInterface;
import android.content.Intent;
import android.content.SharedPreferences;
import android.content.pm.PackageManager;
import android.graphics.Bitmap;
import android.net.Uri;
import android.os.Bundle;
import android.os.SystemClock;
import android.preference.PreferenceManager;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.util.Log;
import android.view.View;
import android.view.ViewGroup;
import android.view.Window;
import android.view.WindowManager;
import android.widget.ImageButton;
import android.widget.ImageView;
import android.widget.SeekBar;
import android.widget.TextView;
import com.baidu.paddle.fastdeploy.RuntimeOption;
import com.baidu.paddle.fastdeploy.app.examples.R;
import com.baidu.paddle.fastdeploy.app.ui.view.CameraSurfaceView;
import com.baidu.paddle.fastdeploy.app.ui.view.ResultListView;
import com.baidu.paddle.fastdeploy.app.ui.Utils;
import com.baidu.paddle.fastdeploy.app.ui.view.adapter.BaseResultAdapter;
import com.baidu.paddle.fastdeploy.app.ui.view.model.BaseResultModel;
import com.baidu.paddle.fastdeploy.vision.DetectionResult;
import com.baidu.paddle.fastdeploy.vision.Visualize;
import com.baidu.paddle.fastdeploy.vision.detection.PicoDet;
import static com.baidu.paddle.fastdeploy.app.ui.Utils.decodeBitmap;
import static com.baidu.paddle.fastdeploy.app.ui.Utils.getRealPathFromURI;
import static com.baidu.paddle.fastdeploy.app.ui.Utils.readTxt;
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;
public class DetectionMainActivity extends Activity implements View.OnClickListener, CameraSurfaceView.OnTextureChangedListener {
private static final String TAG = DetectionMainActivity.class.getSimpleName();
CameraSurfaceView svPreview;
TextView tvStatus;
ImageButton btnSwitch;
ImageButton btnShutter;
ImageButton btnSettings;
ImageView realtimeToggleButton;
boolean isRealtimeStatusRunning = false;
ImageView backInPreview;
private ImageView albumSelectButton;
private View cameraPageView;
private ViewGroup resultPageView;
private ImageView resultImage;
private ImageView backInResult;
private SeekBar confidenceSeekbar;
private TextView seekbarText;
private float resultNum = 1.0f;
private ResultListView resultView;
private Bitmap picBitmap;
private Bitmap shutterBitmap;
private Bitmap originPicBitmap;
private Bitmap originShutterBitmap;
private boolean isShutterBitmapCopied = false;
public static final int TYPE_UNKNOWN = -1;
public static final int BTN_SHUTTER = 0;
public static final int ALBUM_SELECT = 1;
public static final int REALTIME_DETECT = 2;
private static int TYPE = REALTIME_DETECT;
private static final int REQUEST_PERMISSION_CODE_STORAGE = 101;
private static final int INTENT_CODE_PICK_IMAGE = 100;
private static final int TIME_SLEEP_INTERVAL = 50; // ms
long timeElapsed = 0;
long frameCounter = 0;
// Call 'init' and 'release' manually later
PicoDet predictor = new PicoDet();
private float[] scores;
private int[] labelId;
private boolean initialized;
private List<String> labelText;
private List<BaseResultModel> results = new ArrayList<>();
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// Fullscreen
requestWindowFeature(Window.FEATURE_NO_TITLE);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.detection_activity_main);
// Clear all setting items to avoid the app crashing due to incorrect settings
initSettings();
// Check and request CAMERA and WRITE_EXTERNAL_STORAGE permissions
if (!checkAllPermissions()) {
requestAllPermissions();
}
// Init the camera preview and UI components
initView();
}
@SuppressLint("NonConstantResourceId")
@Override
public void onClick(View v) {
switch (v.getId()) {
case R.id.btn_switch:
svPreview.switchCamera();
break;
case R.id.btn_shutter:
TYPE = BTN_SHUTTER;
shutterAndPauseCamera();
resultView.setAdapter(null);
break;
case R.id.btn_settings:
startActivity(new Intent(DetectionMainActivity.this, DetectionSettingsActivity.class));
break;
case R.id.realtime_toggle_btn:
toggleRealtimeStyle();
break;
case R.id.back_in_preview:
finish();
break;
case R.id.iv_select:
TYPE = ALBUM_SELECT;
// Check whether the storage permission has been granted.
if (ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
// The permission has not been granted yet, so request it from the user.
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE}, REQUEST_PERMISSION_CODE_STORAGE);
} else {
Intent intent = new Intent(Intent.ACTION_PICK);
intent.setType("image/*");
startActivityForResult(intent, INTENT_CODE_PICK_IMAGE);
}
resultView.setAdapter(null);
break;
case R.id.back_in_result:
back();
break;
}
}
@Override
public void onBackPressed() {
super.onBackPressed();
back();
}
private void back() {
resultPageView.setVisibility(View.GONE);
cameraPageView.setVisibility(View.VISIBLE);
TYPE = REALTIME_DETECT;
isShutterBitmapCopied = false;
svPreview.onResume();
results.clear();
if (scores != null) {
scores = null;
}
if (labelId != null) {
labelId = null;
}
}
private void shutterAndPauseCamera() {
new Thread(new Runnable() {
@Override
public void run() {
try {
// Sleep for a while to make sure the frame has been captured correctly.
Thread.sleep(TIME_SLEEP_INTERVAL * 10); // 500ms
} catch (InterruptedException e) {
e.printStackTrace();
}
runOnUiThread(new Runnable() {
@SuppressLint("SetTextI18n")
public void run() {
// This code runs on the main thread.
svPreview.onPause();
cameraPageView.setVisibility(View.GONE);
resultPageView.setVisibility(View.VISIBLE);
seekbarText.setText(resultNum + "");
confidenceSeekbar.setProgress((int) (resultNum * 100));
if (shutterBitmap != null && !shutterBitmap.isRecycled()) {
resultImage.setImageBitmap(shutterBitmap);
} else {
new AlertDialog.Builder(DetectionMainActivity.this)
.setTitle("Empty Result!")
.setMessage("The captured picture is empty, please take the shot again!")
.setCancelable(true)
.show();
}
}
});
}
}).start();
}
private void copyBitmapFromCamera(Bitmap ARGB8888ImageBitmap) {
if (isShutterBitmapCopied || ARGB8888ImageBitmap == null) {
return;
}
if (!ARGB8888ImageBitmap.isRecycled()) {
synchronized (this) {
shutterBitmap = ARGB8888ImageBitmap.copy(Bitmap.Config.ARGB_8888, true);
originShutterBitmap = ARGB8888ImageBitmap.copy(Bitmap.Config.ARGB_8888, true);
}
SystemClock.sleep(TIME_SLEEP_INTERVAL);
isShutterBitmapCopied = true;
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == INTENT_CODE_PICK_IMAGE) {
if (resultCode == Activity.RESULT_OK) {
cameraPageView.setVisibility(View.GONE);
resultPageView.setVisibility(View.VISIBLE);
seekbarText.setText(resultNum + "");
confidenceSeekbar.setProgress((int) (resultNum * 100));
Uri uri = data.getData();
String path = getRealPathFromURI(this, uri);
picBitmap = decodeBitmap(path, 720, 1280);
originPicBitmap = picBitmap.copy(Bitmap.Config.ARGB_8888, true);
resultImage.setImageBitmap(picBitmap);
}
}
}
private void toggleRealtimeStyle() {
if (isRealtimeStatusRunning) {
isRealtimeStatusRunning = false;
realtimeToggleButton.setImageResource(R.drawable.realtime_stop_btn);
svPreview.setOnTextureChangedListener(this);
tvStatus.setVisibility(View.VISIBLE);
} else {
isRealtimeStatusRunning = true;
realtimeToggleButton.setImageResource(R.drawable.realtime_start_btn);
tvStatus.setVisibility(View.GONE);
isShutterBitmapCopied = false;
svPreview.setOnTextureChangedListener(new CameraSurfaceView.OnTextureChangedListener() {
@Override
public boolean onTextureChanged(Bitmap ARGB8888ImageBitmap) {
if (TYPE == BTN_SHUTTER) {
copyBitmapFromCamera(ARGB8888ImageBitmap);
}
return false;
}
});
}
}
@Override
public boolean onTextureChanged(Bitmap ARGB8888ImageBitmap) {
if (TYPE == BTN_SHUTTER) {
copyBitmapFromCamera(ARGB8888ImageBitmap);
return false;
}
boolean modified = false;
long tc = System.currentTimeMillis();
DetectionResult result = predictor.predict(ARGB8888ImageBitmap);
timeElapsed += (System.currentTimeMillis() - tc);
Visualize.visDetection(ARGB8888ImageBitmap, result, DetectionSettingsActivity.scoreThreshold);
modified = result.initialized();
frameCounter++;
if (frameCounter >= 30) {
final int fps = (int) (1000 / (timeElapsed / 30));
runOnUiThread(new Runnable() {
@SuppressLint("SetTextI18n")
public void run() {
tvStatus.setText(Integer.toString(fps) + "fps");
}
});
frameCounter = 0;
timeElapsed = 0;
}
return modified;
}
@Override
protected void onResume() {
super.onResume();
// Reload settings and re-initialize the predictor
checkAndUpdateSettings();
// Do not open the camera until all permissions have been granted
if (!checkAllPermissions()) {
svPreview.disableCamera();
}
svPreview.onResume();
}
@Override
protected void onPause() {
super.onPause();
svPreview.onPause();
}
@Override
protected void onDestroy() {
if (predictor != null) {
predictor.release();
}
super.onDestroy();
}
public void initView() {
TYPE = REALTIME_DETECT;
svPreview = (CameraSurfaceView) findViewById(R.id.sv_preview);
svPreview.setOnTextureChangedListener(this);
tvStatus = (TextView) findViewById(R.id.tv_status);
btnSwitch = (ImageButton) findViewById(R.id.btn_switch);
btnSwitch.setOnClickListener(this);
btnShutter = (ImageButton) findViewById(R.id.btn_shutter);
btnShutter.setOnClickListener(this);
btnSettings = (ImageButton) findViewById(R.id.btn_settings);
btnSettings.setOnClickListener(this);
realtimeToggleButton = findViewById(R.id.realtime_toggle_btn);
realtimeToggleButton.setOnClickListener(this);
backInPreview = findViewById(R.id.back_in_preview);
backInPreview.setOnClickListener(this);
albumSelectButton = findViewById(R.id.iv_select);
albumSelectButton.setOnClickListener(this);
cameraPageView = findViewById(R.id.camera_page);
resultPageView = findViewById(R.id.result_page);
resultImage = findViewById(R.id.result_image);
backInResult = findViewById(R.id.back_in_result);
backInResult.setOnClickListener(this);
confidenceSeekbar = findViewById(R.id.confidence_seekbar);
seekbarText = findViewById(R.id.seekbar_text);
resultView = findViewById(R.id.result_list_view);
confidenceSeekbar.setMax(100);
confidenceSeekbar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
@Override
public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
float resultConfidence = seekBar.getProgress() / 100f;
BigDecimal bd = new BigDecimal(resultConfidence);
resultNum = bd.setScale(1, BigDecimal.ROUND_HALF_UP).floatValue();
seekbarText.setText(resultNum + "");
confidenceSeekbar.setProgress((int) (resultNum * 100));
results.clear();
}
@Override
public void onStartTrackingTouch(SeekBar seekBar) {
}
@Override
public void onStopTrackingTouch(SeekBar seekBar) {
runOnUiThread(new Runnable() {
@Override
public void run() {
if (TYPE == ALBUM_SELECT) {
SystemClock.sleep(TIME_SLEEP_INTERVAL * 10);
detail(picBitmap);
picBitmap = originPicBitmap.copy(Bitmap.Config.ARGB_8888, true);
} else {
SystemClock.sleep(TIME_SLEEP_INTERVAL * 10);
// svPreview.onPause();
detail(shutterBitmap);
shutterBitmap = originShutterBitmap.copy(Bitmap.Config.ARGB_8888, true);
}
}
});
}
});
}
private void detail(Bitmap bitmap) {
DetectionResult result = predictor.predict(bitmap, true, resultNum);
scores = result.mScores;
labelId = result.mLabelIds;
initialized = result.initialized();
if (initialized) {
for (int i = 0; i < labelId.length; i++) {
if (scores[i] > resultNum) {
int idx = labelId[i];
String text = labelText.get(idx);
results.add(new BaseResultModel(idx, text, scores[i]));
}
}
}
BaseResultAdapter adapter = new BaseResultAdapter(getBaseContext(), R.layout.detection_result_page_item, results);
resultView.setAdapter(adapter);
resultView.invalidate();
resultImage.setImageBitmap(bitmap);
resultNum = 1.0f;
}
@SuppressLint("ApplySharedPref")
public void initSettings() {
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
SharedPreferences.Editor editor = sharedPreferences.edit();
editor.clear();
editor.commit();
DetectionSettingsActivity.resetSettings();
}
public void checkAndUpdateSettings() {
if (DetectionSettingsActivity.checkAndUpdateSettings(this)) {
String realModelDir = getCacheDir() + "/" + DetectionSettingsActivity.modelDir;
Utils.copyDirectoryFromAssets(this, DetectionSettingsActivity.modelDir, realModelDir);
String realLabelPath = getCacheDir() + "/" + DetectionSettingsActivity.labelPath;
Utils.copyFileFromAssets(this, DetectionSettingsActivity.labelPath, realLabelPath);
String modelFile = realModelDir + "/" + "model.pdmodel";
String paramsFile = realModelDir + "/" + "model.pdiparams";
String configFile = realModelDir + "/" + "infer_cfg.yml";
String labelFile = realLabelPath;
labelText = readTxt(labelFile);
RuntimeOption option = new RuntimeOption();
option.setCpuThreadNum(DetectionSettingsActivity.cpuThreadNum);
option.setLitePowerMode(DetectionSettingsActivity.cpuPowerMode);
if (Boolean.parseBoolean(DetectionSettingsActivity.enableLiteFp16)) {
option.enableLiteFp16();
}
predictor.init(modelFile, paramsFile, configFile, labelFile, option);
}
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
@NonNull int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (grantResults[0] != PackageManager.PERMISSION_GRANTED || grantResults[1] != PackageManager.PERMISSION_GRANTED) {
new AlertDialog.Builder(DetectionMainActivity.this)
.setTitle("Permission denied")
.setMessage("Click to force quit the app, then open Settings->Apps & notifications->Target " +
"App->Permissions to grant all of the permissions.")
.setCancelable(false)
.setPositiveButton("Exit", new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
DetectionMainActivity.this.finish();
}
}).show();
}
}
private void requestAllPermissions() {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.CAMERA}, 0);
}
private boolean checkAllPermissions() {
return ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) == PackageManager.PERMISSION_GRANTED
&& ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED;
}
}

View File

@@ -1,4 +1,4 @@
package com.baidu.paddle.fastdeploy.examples; package com.baidu.paddle.fastdeploy.app.examples.detection;
import android.annotation.SuppressLint; import android.annotation.SuppressLint;
import android.content.Context; import android.content.Context;
@@ -8,18 +8,17 @@ import android.preference.EditTextPreference;
import android.preference.ListPreference; import android.preference.ListPreference;
import android.preference.PreferenceManager; import android.preference.PreferenceManager;
import android.support.v7.app.ActionBar; import android.support.v7.app.ActionBar;
import android.util.Log;
import com.baidu.paddle.fastdeploy.common.AppCompatPreferenceActivity; import com.baidu.paddle.fastdeploy.app.examples.R;
import com.baidu.paddle.fastdeploy.common.Utils; import com.baidu.paddle.fastdeploy.app.ui.view.AppCompatPreferenceActivity;
import com.baidu.paddle.fastdeploy.examples.R; import com.baidu.paddle.fastdeploy.app.ui.Utils;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.List; import java.util.List;
public class SettingsActivity extends AppCompatPreferenceActivity implements public class DetectionSettingsActivity extends AppCompatPreferenceActivity implements
SharedPreferences.OnSharedPreferenceChangeListener { SharedPreferences.OnSharedPreferenceChangeListener {
private static final String TAG = SettingsActivity.class.getSimpleName(); private static final String TAG = DetectionSettingsActivity.class.getSimpleName();
static public int selectedModelIdx = -1; static public int selectedModelIdx = -1;
static public String modelDir = ""; static public String modelDir = "";
@@ -47,7 +46,7 @@ public class SettingsActivity extends AppCompatPreferenceActivity implements
@Override @Override
public void onCreate(Bundle savedInstanceState) { public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState); super.onCreate(savedInstanceState);
addPreferencesFromResource(R.xml.settings); addPreferencesFromResource(R.xml.detection_settings);
ActionBar supportActionBar = getSupportActionBar(); ActionBar supportActionBar = getSupportActionBar();
if (supportActionBar != null) { if (supportActionBar != null) {
supportActionBar.setDisplayHomeAsUpEnabled(true); supportActionBar.setDisplayHomeAsUpEnabled(true);
@@ -60,8 +59,8 @@ public class SettingsActivity extends AppCompatPreferenceActivity implements
preInstalledCPUPowerModes = new ArrayList<String>(); preInstalledCPUPowerModes = new ArrayList<String>();
preInstalledScoreThresholds = new ArrayList<String>(); preInstalledScoreThresholds = new ArrayList<String>();
preInstalledEnableLiteFp16s = new ArrayList<String>(); preInstalledEnableLiteFp16s = new ArrayList<String>();
preInstalledModelDirs.add(getString(R.string.MODEL_DIR_DEFAULT)); preInstalledModelDirs.add(getString(R.string.DETECTION_MODEL_DIR_DEFAULT));
preInstalledLabelPaths.add(getString(R.string.LABEL_PATH_DEFAULT)); preInstalledLabelPaths.add(getString(R.string.DETECTION_LABEL_PATH_DEFAULT));
preInstalledCPUThreadNums.add(getString(R.string.CPU_THREAD_NUM_DEFAULT)); preInstalledCPUThreadNums.add(getString(R.string.CPU_THREAD_NUM_DEFAULT));
preInstalledCPUPowerModes.add(getString(R.string.CPU_POWER_MODE_DEFAULT)); preInstalledCPUPowerModes.add(getString(R.string.CPU_POWER_MODE_DEFAULT));
preInstalledScoreThresholds.add(getString(R.string.SCORE_THRESHOLD_DEFAULT)); preInstalledScoreThresholds.add(getString(R.string.SCORE_THRESHOLD_DEFAULT));
@@ -91,7 +90,7 @@ public class SettingsActivity extends AppCompatPreferenceActivity implements
SharedPreferences sharedPreferences = getPreferenceScreen().getSharedPreferences(); SharedPreferences sharedPreferences = getPreferenceScreen().getSharedPreferences();
String selected_model_dir = sharedPreferences.getString(getString(R.string.CHOOSE_PRE_INSTALLED_MODEL_KEY), String selected_model_dir = sharedPreferences.getString(getString(R.string.CHOOSE_PRE_INSTALLED_MODEL_KEY),
getString(R.string.MODEL_DIR_DEFAULT)); getString(R.string.DETECTION_MODEL_DIR_DEFAULT));
int selected_model_idx = lpChoosePreInstalledModel.findIndexOfValue(selected_model_dir); int selected_model_idx = lpChoosePreInstalledModel.findIndexOfValue(selected_model_dir);
if (selected_model_idx >= 0 && selected_model_idx < preInstalledModelDirs.size() && selected_model_idx != selectedModelIdx) { if (selected_model_idx >= 0 && selected_model_idx < preInstalledModelDirs.size() && selected_model_idx != selectedModelIdx) {
SharedPreferences.Editor editor = sharedPreferences.edit(); SharedPreferences.Editor editor = sharedPreferences.edit();
@@ -107,9 +106,9 @@ public class SettingsActivity extends AppCompatPreferenceActivity implements
} }
String model_dir = sharedPreferences.getString(getString(R.string.MODEL_DIR_KEY), String model_dir = sharedPreferences.getString(getString(R.string.MODEL_DIR_KEY),
getString(R.string.MODEL_DIR_DEFAULT)); getString(R.string.DETECTION_MODEL_DIR_DEFAULT));
String label_path = sharedPreferences.getString(getString(R.string.LABEL_PATH_KEY), String label_path = sharedPreferences.getString(getString(R.string.LABEL_PATH_KEY),
getString(R.string.LABEL_PATH_DEFAULT)); getString(R.string.DETECTION_LABEL_PATH_DEFAULT));
String cpu_thread_num = sharedPreferences.getString(getString(R.string.CPU_THREAD_NUM_KEY), String cpu_thread_num = sharedPreferences.getString(getString(R.string.CPU_THREAD_NUM_KEY),
getString(R.string.CPU_THREAD_NUM_DEFAULT)); getString(R.string.CPU_THREAD_NUM_DEFAULT));
String cpu_power_mode = sharedPreferences.getString(getString(R.string.CPU_POWER_MODE_KEY), String cpu_power_mode = sharedPreferences.getString(getString(R.string.CPU_POWER_MODE_KEY),
@@ -137,12 +136,12 @@ public class SettingsActivity extends AppCompatPreferenceActivity implements
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(ctx); SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(ctx);
String model_dir = sharedPreferences.getString(ctx.getString(R.string.MODEL_DIR_KEY), String model_dir = sharedPreferences.getString(ctx.getString(R.string.MODEL_DIR_KEY),
ctx.getString(R.string.MODEL_DIR_DEFAULT)); ctx.getString(R.string.DETECTION_MODEL_DIR_DEFAULT));
settingsChanged |= !modelDir.equalsIgnoreCase(model_dir); settingsChanged |= !modelDir.equalsIgnoreCase(model_dir);
modelDir = model_dir; modelDir = model_dir;
String label_path = sharedPreferences.getString(ctx.getString(R.string.LABEL_PATH_KEY), String label_path = sharedPreferences.getString(ctx.getString(R.string.LABEL_PATH_KEY),
ctx.getString(R.string.LABEL_PATH_DEFAULT)); ctx.getString(R.string.DETECTION_LABEL_PATH_DEFAULT));
settingsChanged |= !labelPath.equalsIgnoreCase(label_path); settingsChanged |= !labelPath.equalsIgnoreCase(label_path);
labelPath = label_path; labelPath = label_path;

View File

@@ -1,15 +1,31 @@
package com.baidu.paddle.fastdeploy.common; package com.baidu.paddle.fastdeploy.app.ui;
import android.content.Context; import android.content.Context;
import android.content.res.Resources; import android.content.res.Resources;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.hardware.Camera; import android.hardware.Camera;
import android.net.Uri;
import android.opengl.GLES20; import android.opengl.GLES20;
import android.os.Environment; import android.os.Environment;
import android.provider.MediaStore;
import android.util.Log; import android.util.Log;
import android.view.Surface; import android.view.Surface;
import android.view.WindowManager; import android.view.WindowManager;
import java.io.*; import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.util.ArrayList;
import java.util.List; import java.util.List;
public class Utils { public class Utils {
@@ -110,7 +126,7 @@ public class Utils {
} }
public static Camera.Size getOptimalPreviewSize(List<Camera.Size> sizes, int w, int h) { public static Camera.Size getOptimalPreviewSize(List<Camera.Size> sizes, int w, int h) {
final double ASPECT_TOLERANCE = 0.1; final double ASPECT_TOLERANCE = 0.3;
double targetRatio = (double) w / h; double targetRatio = (double) w / h;
if (sizes == null) return null; if (sizes == null) return null;
@@ -151,8 +167,8 @@ public class Utils {
} }
public static int getCameraDisplayOrientation(Context context, int cameraId) { public static int getCameraDisplayOrientation(Context context, int cameraId) {
android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo(); Camera.CameraInfo info = new Camera.CameraInfo();
android.hardware.Camera.getCameraInfo(cameraId, info); Camera.getCameraInfo(cameraId, info);
WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE); WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
int rotation = wm.getDefaultDisplay().getRotation(); int rotation = wm.getDefaultDisplay().getRotation();
int degrees = 0; int degrees = 0;
@@ -234,4 +250,64 @@ public class Utils {
String hardware = android.os.Build.HARDWARE; String hardware = android.os.Build.HARDWARE;
return hardware.equalsIgnoreCase("kirin810") || hardware.equalsIgnoreCase("kirin990"); return hardware.equalsIgnoreCase("kirin810") || hardware.equalsIgnoreCase("kirin990");
} }
public static Bitmap decodeBitmap(String path, int displayWidth, int displayHeight) {
BitmapFactory.Options op = new BitmapFactory.Options();
op.inJustDecodeBounds = true; // Only read the Bitmap's width and height, not its pixels.
Bitmap bmp = BitmapFactory.decodeFile(path, op); // Get size information.
int wRatio = (int) Math.ceil(op.outWidth / (float) displayWidth); // Width scale ratio.
int hRatio = (int) Math.ceil(op.outHeight / (float) displayHeight); // Height scale ratio.
// If the image exceeds the requested size, downsample by the larger ratio.
if (wRatio > 1 && hRatio > 1) {
if (wRatio > hRatio) {
// If it is too wide, we will reduce the width to the required size. Note that the height will become smaller.
op.inSampleSize = wRatio;
} else {
op.inSampleSize = hRatio;
}
}
op.inJustDecodeBounds = false;
bmp = BitmapFactory.decodeFile(path, op);
// Create a Bitmap with a given width and height from the original Bitmap.
return Bitmap.createScaledBitmap(bmp, displayWidth, displayHeight, true);
}
public static String getRealPathFromURI(Context context, Uri contentURI) {
String result;
Cursor cursor = null;
try {
cursor = context.getContentResolver().query(contentURI, null, null, null, null);
} catch (Throwable e) {
e.printStackTrace();
}
if (cursor == null) {
result = contentURI.getPath();
} else {
cursor.moveToFirst();
int idx = cursor.getColumnIndex(MediaStore.Images.ImageColumns.DATA);
result = cursor.getString(idx);
cursor.close();
}
return result;
}
public static List<String> readTxt(String txtPath) {
File file = new File(txtPath);
if (file.isFile() && file.exists()) {
try {
FileInputStream fileInputStream = new FileInputStream(file);
InputStreamReader inputStreamReader = new InputStreamReader(fileInputStream);
BufferedReader bufferedReader = new BufferedReader(inputStreamReader);
String text;
List<String> labels = new ArrayList<>();
while ((text = bufferedReader.readLine()) != null) {
labels.add(text);
}
return labels;
} catch (Exception e) {
e.printStackTrace();
}
}
return null;
}
} }
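The helpers added in this hunk are used by DetectionMainActivity when an image is picked from the album and when the label file is loaded. A small usage sketch, assuming the same imports as the activity; the 720x1280 target size matches the value used there:

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.net.Uri;
import com.baidu.paddle.fastdeploy.app.ui.Utils;
import java.util.List;

class UtilsUsageSketch {
    // Convert a picked content Uri to a file path, then decode it at a bounded size.
    static Bitmap loadPickedImage(Context context, Uri uri) {
        String path = Utils.getRealPathFromURI(context, uri);
        return Utils.decodeBitmap(path, 720, 1280); // downsample, then scale to 720x1280
    }

    // Read one label per line; returns null if the file does not exist.
    static List<String> loadLabels(String labelFilePath) {
        return Utils.readTxt(labelFilePath);
    }
}
```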

View File

@@ -1,4 +1,4 @@
package com.baidu.paddle.fastdeploy.common; package com.baidu.paddle.fastdeploy.app.ui.layout;
import android.content.Context; import android.content.Context;
import android.graphics.Color; import android.graphics.Color;

View File

@@ -1,4 +1,4 @@
package com.baidu.paddle.fastdeploy.common; package com.baidu.paddle.fastdeploy.app.ui.view;
import android.content.res.Configuration; import android.content.res.Configuration;
import android.os.Bundle; import android.os.Bundle;

View File

@@ -1,4 +1,4 @@
package com.baidu.paddle.fastdeploy.common; package com.baidu.paddle.fastdeploy.app.ui.view;
import android.content.Context; import android.content.Context;
import android.graphics.Bitmap; import android.graphics.Bitmap;
@@ -15,8 +15,7 @@ import android.opengl.Matrix;
import android.util.AttributeSet; import android.util.AttributeSet;
import android.util.Log; import android.util.Log;
import javax.microedition.khronos.egl.EGLConfig; import com.baidu.paddle.fastdeploy.app.ui.Utils;
import javax.microedition.khronos.opengles.GL10;
import java.io.IOException; import java.io.IOException;
import java.nio.ByteBuffer; import java.nio.ByteBuffer;
@@ -24,13 +23,15 @@ import java.nio.ByteOrder;
import java.nio.FloatBuffer; import java.nio.FloatBuffer;
import java.util.List; import java.util.List;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
public class CameraSurfaceView extends GLSurfaceView implements Renderer, public class CameraSurfaceView extends GLSurfaceView implements Renderer,
SurfaceTexture.OnFrameAvailableListener { SurfaceTexture.OnFrameAvailableListener {
private static final String TAG = CameraSurfaceView.class.getSimpleName(); private static final String TAG = CameraSurfaceView.class.getSimpleName();
public static final int EXPECTED_PREVIEW_WIDTH = 1280; public static int EXPECTED_PREVIEW_WIDTH = 1280; // 1920
public static final int EXPECTED_PREVIEW_HEIGHT = 720; public static int EXPECTED_PREVIEW_HEIGHT = 720; // 960
protected int numberOfCameras; protected int numberOfCameras;
protected int selectedCameraId; protected int selectedCameraId;
@@ -44,6 +45,9 @@ public class CameraSurfaceView extends GLSurfaceView implements Renderer,
protected int textureWidth = 0; protected int textureWidth = 0;
protected int textureHeight = 0; protected int textureHeight = 0;
protected Bitmap ARGB8888ImageBitmap;
protected boolean bitmapReleaseMode = true;
// In order to manipulate the camera preview data and render the modified one // In order to manipulate the camera preview data and render the modified one
// to the screen, three textures are created and the data flow is shown as following: // to the screen, three textures are created and the data flow is shown as following:
// previewdata->camTextureId->fboTexureId->drawTexureId->framebuffer // previewdata->camTextureId->fboTexureId->drawTexureId->framebuffer
@@ -95,6 +99,16 @@ public class CameraSurfaceView extends GLSurfaceView implements Renderer,
private int vcTex2Screen; private int vcTex2Screen;
private int tcTex2Screen; private int tcTex2Screen;
public void setBitmapReleaseMode(boolean mode) {
synchronized (this) {
bitmapReleaseMode = mode;
}
}
public Bitmap getBitmap() {
return ARGB8888ImageBitmap; // may be null or recycled.
}
public interface OnTextureChangedListener { public interface OnTextureChangedListener {
boolean onTextureChanged(Bitmap ARGB8888ImageBitmap); boolean onTextureChanged(Bitmap ARGB8888ImageBitmap);
} }
@@ -196,9 +210,12 @@ public class CameraSurfaceView extends GLSurfaceView implements Renderer,
// Read pixels of FBO to a bitmap // Read pixels of FBO to a bitmap
ByteBuffer pixelBuffer = ByteBuffer.allocate(textureWidth * textureHeight * 4); ByteBuffer pixelBuffer = ByteBuffer.allocate(textureWidth * textureHeight * 4);
GLES20.glReadPixels(0, 0, textureWidth, textureHeight, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixelBuffer); GLES20.glReadPixels(0, 0, textureWidth, textureHeight, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixelBuffer);
Bitmap ARGB8888ImageBitmap = Bitmap.createBitmap(textureWidth, textureHeight, Bitmap.Config.ARGB_8888);
ARGB8888ImageBitmap = Bitmap.createBitmap(textureWidth, textureHeight, Bitmap.Config.ARGB_8888);
ARGB8888ImageBitmap.copyPixelsFromBuffer(pixelBuffer); ARGB8888ImageBitmap.copyPixelsFromBuffer(pixelBuffer);
boolean modified = onTextureChangedListener.onTextureChanged(ARGB8888ImageBitmap); boolean modified = onTextureChangedListener.onTextureChanged(ARGB8888ImageBitmap);
if (modified) { if (modified) {
targetTexureId = drawTexureId[0]; targetTexureId = drawTexureId[0];
// Update a bitmap to the GL texture if modified // Update a bitmap to the GL texture if modified
@@ -207,8 +224,10 @@ public class CameraSurfaceView extends GLSurfaceView implements Renderer,
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, targetTexureId); GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, targetTexureId);
GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, ARGB8888ImageBitmap, 0); GLUtils.texImage2D(GL10.GL_TEXTURE_2D, 0, ARGB8888ImageBitmap, 0);
} }
if (bitmapReleaseMode) {
ARGB8888ImageBitmap.recycle(); ARGB8888ImageBitmap.recycle();
} }
}
// fboTexureId/drawTexureId->Screen // fboTexureId/drawTexureId->Screen
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0); GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
@@ -268,20 +287,28 @@ public class CameraSurfaceView extends GLSurfaceView implements Renderer,
public void openCamera() { public void openCamera() {
if (disableCamera) return; if (disableCamera) return;
camera = Camera.open(selectedCameraId); camera = Camera.open(selectedCameraId);
List<Size> supportedPreviewSizes = camera.getParameters().getSupportedPreviewSizes();
Size previewSize = Utils.getOptimalPreviewSize(supportedPreviewSizes, EXPECTED_PREVIEW_WIDTH,
EXPECTED_PREVIEW_HEIGHT);
Camera.Parameters parameters = camera.getParameters(); Camera.Parameters parameters = camera.getParameters();
parameters.setPreviewSize(previewSize.width, previewSize.height);
if (parameters.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
}
camera.setParameters(parameters);
int degree = Utils.getCameraDisplayOrientation(context, selectedCameraId); int degree = Utils.getCameraDisplayOrientation(context, selectedCameraId);
camera.setDisplayOrientation(degree); camera.setDisplayOrientation(degree);
boolean rotate = degree == 90 || degree == 270; boolean rotate = degree == 90 || degree == 270;
textureWidth = rotate ? previewSize.height : previewSize.width; int adjusted_width = rotate ? EXPECTED_PREVIEW_HEIGHT : EXPECTED_PREVIEW_WIDTH;
textureHeight = rotate ? previewSize.width : previewSize.height; int adjusted_height = rotate ? EXPECTED_PREVIEW_WIDTH : EXPECTED_PREVIEW_HEIGHT;
List<Size> supportedPreviewSizes = camera.getParameters().getSupportedPreviewSizes();
Size previewSize = Utils.getOptimalPreviewSize(
supportedPreviewSizes, adjusted_width, adjusted_height);
textureWidth = previewSize.width;
textureHeight = previewSize.height;
parameters.setPreviewSize(previewSize.width, previewSize.height);
camera.setParameters(parameters);
if (parameters.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
}
// Destroy FBO and draw textures // Destroy FBO and draw textures
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0); GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
GLES20.glDeleteFramebuffers(1, fbo, 0); GLES20.glDeleteFramebuffers(1, fbo, 0);
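Besides the preview-size handling, this hunk adds a bitmapReleaseMode flag and a getBitmap() accessor so that callers can keep the last preview frame instead of having it recycled after every onDrawFrame. A hedged caller-side sketch, not taken from this commit:

```java
// Illustrative only: keep the last preview frame alive for later use.
void captureLastFrame(CameraSurfaceView preview) {
    preview.setBitmapReleaseMode(false);    // stop recycling the bitmap after each frame
    Bitmap lastFrame = preview.getBitmap(); // may be null or recycled; always check first
    if (lastFrame != null && !lastFrame.isRecycled()) {
        // Copy it off the GL thread's buffer before doing any long-running work.
        Bitmap snapshot = lastFrame.copy(Bitmap.Config.ARGB_8888, true);
        // ... use 'snapshot' here
    }
}
```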

View File

@@ -0,0 +1,43 @@
package com.baidu.paddle.fastdeploy.app.ui.view;
import android.content.Context;
import android.os.Handler;
import android.util.AttributeSet;
import android.widget.ListView;
public class ResultListView extends ListView {
public ResultListView(Context context) {
super(context);
}
public ResultListView(Context context, AttributeSet attrs) {
super(context, attrs);
}
public ResultListView(Context context, AttributeSet attrs, int defStyleAttr) {
super(context, attrs, defStyleAttr);
}
private Handler handler;
public void setHandler(Handler mHandler) {
handler = mHandler;
}
public void clear() {
handler.post(new Runnable() {
@Override
public void run() {
removeAllViewsInLayout();
invalidate();
}
});
}
@Override
protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
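// Measure with a very large AT_MOST height so the ListView lays out all of its
// items; this keeps the result list fully expanded inside a scrolling parent.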
int expandSpec = MeasureSpec.makeMeasureSpec(Integer.MAX_VALUE >> 2,
MeasureSpec.AT_MOST);
super.onMeasure(widthMeasureSpec, expandSpec);
}
}

View File

@@ -0,0 +1,48 @@
package com.baidu.paddle.fastdeploy.app.ui.view.adapter;
import android.content.Context;
import android.support.annotation.NonNull;
import android.support.annotation.Nullable;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ArrayAdapter;
import android.widget.TextView;
import com.baidu.paddle.fastdeploy.app.examples.R;
import com.baidu.paddle.fastdeploy.app.ui.view.model.BaseResultModel;
import java.text.DecimalFormat;
import java.util.List;
public class BaseResultAdapter extends ArrayAdapter<BaseResultModel> {
private int resourceId;
public BaseResultAdapter(@NonNull Context context, int resource) {
super(context, resource);
}
public BaseResultAdapter(@NonNull Context context, int resource, @NonNull List<BaseResultModel> objects) {
super(context, resource, objects);
resourceId = resource;
}
@NonNull
@Override
public View getView(int position, @Nullable View convertView, @NonNull ViewGroup parent) {
BaseResultModel model = getItem(position);
View view = LayoutInflater.from(getContext()).inflate(resourceId, null);
TextView indexText = (TextView) view.findViewById(R.id.index);
TextView nameText = (TextView) view.findViewById(R.id.name);
TextView confidenceText = (TextView) view.findViewById(R.id.confidence);
indexText.setText(String.valueOf(model.getIndex()));
nameText.setText(String.valueOf(model.getName()));
confidenceText.setText(formatFloatString(model.getConfidence()));
return view;
}
public static String formatFloatString(float number) {
DecimalFormat df = new DecimalFormat("0.00");
return df.format(number);
}
}

View File

@@ -0,0 +1,41 @@
package com.baidu.paddle.fastdeploy.app.ui.view.model;
public class BaseResultModel {
private int index;
private String name;
private float confidence;
public BaseResultModel() {
}
public BaseResultModel(int index, String name, float confidence) {
this.index = index;
this.name = name;
this.confidence = confidence;
}
public float getConfidence() {
return confidence;
}
public void setConfidence(float confidence) {
this.confidence = confidence;
}
public int getIndex() {
return index;
}
public void setIndex(int index) {
this.index = index;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}

View File

@@ -1,251 +0,0 @@
package com.baidu.paddle.fastdeploy.examples;
import android.Manifest;
import android.annotation.SuppressLint;
import android.app.Activity;
import android.app.AlertDialog;
import android.content.DialogInterface;
import android.content.Intent;
import android.content.SharedPreferences;
import android.content.pm.PackageManager;
import android.graphics.*;
import android.os.Bundle;
import android.preference.PreferenceManager;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.util.Log;
import android.view.*;
import android.widget.*;
import com.baidu.paddle.fastdeploy.RuntimeOption;
import com.baidu.paddle.fastdeploy.common.CameraSurfaceView;
import com.baidu.paddle.fastdeploy.common.Utils;
import com.baidu.paddle.fastdeploy.examples.R;
import com.baidu.paddle.fastdeploy.vision.DetectionResult;
import com.baidu.paddle.fastdeploy.vision.detection.PicoDet;
import java.io.File;
import java.text.SimpleDateFormat;
import java.util.Date;
public class MainActivity extends Activity implements View.OnClickListener, CameraSurfaceView.OnTextureChangedListener {
private static final String TAG = MainActivity.class.getSimpleName();
CameraSurfaceView svPreview;
TextView tvStatus;
ImageButton btnSwitch;
ImageButton btnShutter;
ImageButton btnSettings;
ImageView realtimeToggleButton;
boolean isRealtimeStatusRunning = false;
ImageView backInPreview;
String savedImagePath = "result.jpg";
int lastFrameIndex = 0;
long lastFrameTime;
// Call 'init' and 'release' manually later
PicoDet predictor = new PicoDet();
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
// Fullscreen
requestWindowFeature(Window.FEATURE_NO_TITLE);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.activity_main);
// Clear all setting items to avoid the app crashing due to incorrect settings
initSettings();
// Init the camera preview and UI components
initView();
// Check and request CAMERA and WRITE_EXTERNAL_STORAGE permissions
if (!checkAllPermissions()) {
requestAllPermissions();
}
}
@SuppressLint("NonConstantResourceId")
@Override
public void onClick(View v) {
switch (v.getId()) {
case R.id.btn_switch:
svPreview.switchCamera();
break;
case R.id.btn_shutter:
@SuppressLint("SimpleDateFormat")
SimpleDateFormat date = new SimpleDateFormat("yyyy_MM_dd_HH_mm_ss");
synchronized (this) {
savedImagePath = Utils.getDCIMDirectory() + File.separator + date.format(new Date()).toString() + ".png";
}
Toast.makeText(MainActivity.this, "Save snapshot to " + savedImagePath, Toast.LENGTH_SHORT).show();
break;
case R.id.btn_settings:
startActivity(new Intent(MainActivity.this, SettingsActivity.class));
break;
case R.id.realtime_toggle_btn:
toggleRealtimeStyle();
break;
case R.id.back_in_preview:
finish();
break;
}
}
private void toggleRealtimeStyle() {
if (isRealtimeStatusRunning) {
isRealtimeStatusRunning = false;
realtimeToggleButton.setImageResource(R.drawable.realtime_stop_btn);
svPreview.setOnTextureChangedListener(this);
tvStatus.setVisibility(View.VISIBLE);
} else {
isRealtimeStatusRunning = true;
realtimeToggleButton.setImageResource(R.drawable.realtime_start_btn);
tvStatus.setVisibility(View.GONE);
svPreview.setOnTextureChangedListener(new CameraSurfaceView.OnTextureChangedListener() {
@Override
public boolean onTextureChanged(Bitmap ARGB8888ImageBitmap) {
return false;
}
});
}
}
@Override
public boolean onTextureChanged(Bitmap ARGB8888ImageBitmap) {
String savedImagePath = "";
synchronized (this) {
savedImagePath = MainActivity.this.savedImagePath;
}
boolean modified = false;
DetectionResult result = predictor.predict(
ARGB8888ImageBitmap, savedImagePath, SettingsActivity.scoreThreshold);
modified = result.initialized();
if (!savedImagePath.isEmpty()) {
synchronized (this) {
MainActivity.this.savedImagePath = "result.jpg";
}
}
lastFrameIndex++;
if (lastFrameIndex >= 30) {
final int fps = (int) (lastFrameIndex * 1e9 / (System.nanoTime() - lastFrameTime));
runOnUiThread(new Runnable() {
@SuppressLint("SetTextI18n")
public void run() {
tvStatus.setText(Integer.toString(fps) + "fps");
}
});
lastFrameIndex = 0;
lastFrameTime = System.nanoTime();
}
return modified;
}
@Override
protected void onResume() {
super.onResume();
// Reload settings and re-initialize the predictor
checkAndUpdateSettings();
// Do not open the camera until all permissions have been granted
if (!checkAllPermissions()) {
svPreview.disableCamera();
}
svPreview.onResume();
}
@Override
protected void onPause() {
super.onPause();
svPreview.onPause();
}
@Override
protected void onDestroy() {
if (predictor != null) {
predictor.release();
}
super.onDestroy();
}
public void initView() {
svPreview = (CameraSurfaceView) findViewById(R.id.sv_preview);
svPreview.setOnTextureChangedListener(this);
tvStatus = (TextView) findViewById(R.id.tv_status);
btnSwitch = (ImageButton) findViewById(R.id.btn_switch);
btnSwitch.setOnClickListener(this);
btnShutter = (ImageButton) findViewById(R.id.btn_shutter);
btnShutter.setOnClickListener(this);
btnSettings = (ImageButton) findViewById(R.id.btn_settings);
btnSettings.setOnClickListener(this);
realtimeToggleButton = findViewById(R.id.realtime_toggle_btn);
realtimeToggleButton.setOnClickListener(this);
backInPreview = findViewById(R.id.back_in_preview);
backInPreview.setOnClickListener(this);
}
@SuppressLint("ApplySharedPref")
public void initSettings() {
SharedPreferences sharedPreferences = PreferenceManager.getDefaultSharedPreferences(this);
SharedPreferences.Editor editor = sharedPreferences.edit();
editor.clear();
editor.commit();
SettingsActivity.resetSettings();
}
public void checkAndUpdateSettings() {
if (SettingsActivity.checkAndUpdateSettings(this)) {
String realModelDir = getCacheDir() + "/" + SettingsActivity.modelDir;
Utils.copyDirectoryFromAssets(this, SettingsActivity.modelDir, realModelDir);
String realLabelPath = getCacheDir() + "/" + SettingsActivity.labelPath;
Utils.copyFileFromAssets(this, SettingsActivity.labelPath, realLabelPath);
String modelFile = realModelDir + "/" + "model.pdmodel";
String paramsFile = realModelDir + "/" + "model.pdiparams";
String configFile = realModelDir + "/" + "infer_cfg.yml";
String labelFile = realLabelPath;
RuntimeOption option = new RuntimeOption();
option.setCpuThreadNum(SettingsActivity.cpuThreadNum);
option.setLitePowerMode(SettingsActivity.cpuPowerMode);
option.enableRecordTimeOfRuntime();
if (Boolean.parseBoolean(SettingsActivity.enableLiteFp16)) {
option.enableLiteFp16();
}
predictor.init(modelFile, paramsFile, configFile, labelFile, option);
}
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
@NonNull int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (grantResults[0] != PackageManager.PERMISSION_GRANTED || grantResults[1] != PackageManager.PERMISSION_GRANTED) {
new AlertDialog.Builder(MainActivity.this)
.setTitle("Permission denied")
.setMessage("Click to force quit the app, then open Settings->Apps & notifications->Target " +
"App->Permissions to grant all of the permissions.")
.setCancelable(false)
.setPositiveButton("Exit", new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
MainActivity.this.finish();
}
}).show();
}
}
private void requestAllPermissions() {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.CAMERA}, 0);
}
private boolean checkAllPermissions() {
return ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) == PackageManager.PERMISSION_GRANTED
&& ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED;
}
}

View File

@@ -1,51 +0,0 @@
package com.baidu.paddle.fastdeploy.vision;
import android.support.annotation.NonNull;
public class ClassifyResult {
public float[] mScores; // [n]
public int[] mLabelIds; // [n]
public boolean mInitialized = false;
public ClassifyResult() {
mInitialized = false;
}
public ClassifyResult(long nativeResultContext) {
mInitialized = copyAllFromNativeContext(nativeResultContext);
}
public boolean initialized() {
return mInitialized;
}
private void setScores(@NonNull float[] scoresBuffer) {
if (scoresBuffer.length > 0) {
mScores = scoresBuffer.clone();
}
}
private void setLabelIds(@NonNull int[] labelIdsBuffer) {
if (labelIdsBuffer.length > 0) {
mLabelIds = labelIdsBuffer.clone();
}
}
private boolean copyAllFromNativeContext(long nativeResultContext) {
if (nativeResultContext == 0) {
return false;
}
setScores(copyScoresFromNative(nativeResultContext));
setLabelIds(copyLabelIdsFromNative(nativeResultContext));
// WARN: must release ctx.
return releaseNative(nativeResultContext);
}
// Fetch native buffers from native context.
private static native float[] copyScoresFromNative(long nativeResultContext);
private static native int[] copyLabelIdsFromNative(long nativeResultContext);
private static native boolean releaseNative(long nativeResultContext);
}

View File

@@ -1,80 +0,0 @@
package com.baidu.paddle.fastdeploy.vision;
import android.support.annotation.NonNull;
import java.util.Arrays;
import com.baidu.paddle.fastdeploy.FastDeployInitializer;
public class DetectionResult {
// MaskRCNN is not supported yet.
public float[][] mBoxes; // [n,4]
public float[] mScores; // [n]
public int[] mLabelIds; // [n]
public boolean mInitialized = false;
public DetectionResult() {
mInitialized = false;
}
public DetectionResult(long nativeResultContext) {
mInitialized = copyAllFromNativeContext(nativeResultContext);
}
public boolean initialized() {
return mInitialized;
}
// Setup results from native buffers.
private boolean copyAllFromNativeContext(long nativeResultContext) {
if (nativeResultContext == 0) {
return false;
}
if (copyBoxesNumFromNative(nativeResultContext) > 0) {
setBoxes(copyBoxesFromNative(nativeResultContext));
setScores(copyScoresFromNative(nativeResultContext));
setLabelIds(copyLabelIdsFromNative(nativeResultContext));
}
// WARN: must release ctx.
return releaseNative(nativeResultContext);
}
private void setBoxes(@NonNull float[] boxesBuffer) {
int boxesNum = boxesBuffer.length / 4;
if (boxesNum > 0) {
mBoxes = new float[boxesNum][4];
for (int i = 0; i < boxesNum; ++i) {
mBoxes[i] = Arrays.copyOfRange(
boxesBuffer, i * 4, (i + 1) * 4);
}
}
}
private void setScores(@NonNull float[] scoresBuffer) {
if (scoresBuffer.length > 0) {
mScores = scoresBuffer.clone();
}
}
private void setLabelIds(@NonNull int[] labelIdsBuffer) {
if (labelIdsBuffer.length > 0) {
mLabelIds = labelIdsBuffer.clone();
}
}
// Fetch native buffers from native context.
private static native int copyBoxesNumFromNative(long nativeResultContext);
private static native float[] copyBoxesFromNative(long nativeResultContext);
private static native float[] copyScoresFromNative(long nativeResultContext);
private static native int[] copyLabelIdsFromNative(long nativeResultContext);
private static native boolean releaseNative(long nativeResultContext);
// Initializes at the beginning.
static {
FastDeployInitializer.init();
}
}
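DetectionResult also moves out of the app module and into the SDK. For reference, a hedged sketch of consuming its fields the way DetectionMainActivity.detail() does, assuming the activity's imports; the 0.4f threshold and the label list are illustrative:

```java
// Illustrative consumer of DetectionResult; mirrors the filtering done in detail().
static void printDetections(PicoDet predictor, Bitmap bitmap, List<String> labels) {
    DetectionResult result = predictor.predict(bitmap);
    if (!result.initialized() || result.mLabelIds == null) {
        return; // prediction failed or nothing was detected
    }
    for (int i = 0; i < result.mLabelIds.length; i++) {
        if (result.mScores[i] > 0.4f) {     // keep confident detections only
            String name = labels.get(result.mLabelIds[i]);
            float[] box = result.mBoxes[i]; // 4 coordinates per detected box
            System.out.println(name + " score=" + result.mScores[i] + " box[0]=" + box[0]);
        }
    }
}
```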

View File

@@ -1,85 +0,0 @@
package com.baidu.paddle.fastdeploy.vision;
import android.graphics.Bitmap;
import com.baidu.paddle.fastdeploy.FastDeployInitializer;
public class Visualize {
// TODO(qiuyanjun):
// VisClassification, VisSegmentation, VisMatting, VisOcr, ...
// Visualize DetectionResult without labels
public static boolean visDetection(Bitmap ARGB8888Bitmap,
DetectionResult result) {
return visDetectionNative(
ARGB8888Bitmap,
result.mBoxes,
result.mScores,
result.mLabelIds,
0.f, 1, 0.5f,
new String[]{});
}
public static boolean visDetection(Bitmap ARGB8888Bitmap,
DetectionResult result,
float score_threshold,
int line_size,
float font_size) {
return visDetectionNative(
ARGB8888Bitmap,
result.mBoxes,
result.mScores,
result.mLabelIds,
score_threshold,
line_size,
font_size,
new String[]{});
}
// Visualize DetectionResult with labels
public static boolean visDetection(Bitmap ARGB8888Bitmap,
DetectionResult result,
String[] labels) {
return visDetectionNative(
ARGB8888Bitmap,
result.mBoxes,
result.mScores,
result.mLabelIds,
0.f, 1, 0.5f,
labels);
}
public static boolean visDetection(Bitmap ARGB8888Bitmap,
DetectionResult result,
float score_threshold,
int line_size,
float font_size,
String[] labels) {
return visDetectionNative(
ARGB8888Bitmap,
result.mBoxes,
result.mScores,
result.mLabelIds,
score_threshold,
line_size,
font_size,
labels);
}
// VisDetection in native
public static native boolean visDetectionNative(Bitmap ARGB8888Bitmap,
float[][] boxes,
float[] scores,
int[] labelIds,
float score_threshold,
int line_size,
float font_size,
String[] labels);
/* Initializes at the beginning */
static {
FastDeployInitializer.init();
}
}
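Visualize is likewise removed in favor of the SDK copy. A sketch of the labelled overload shown above, assuming a result and label list are already available; the threshold, line size and font size are illustrative:

```java
// Draws boxes, scores and label names onto the ARGB_8888 bitmap in place.
static void drawResult(Bitmap frame, DetectionResult result, List<String> labelText) {
    String[] labels = labelText.toArray(new String[0]);
    // Uses the (bitmap, result, score_threshold, line_size, font_size, labels) overload above.
    Visualize.visDetection(frame, result, 0.5f, 2, 0.6f, labels);
}
```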

View File

@@ -1,172 +0,0 @@
package com.baidu.paddle.fastdeploy.vision.classification;
import android.graphics.Bitmap;
import com.baidu.paddle.fastdeploy.FastDeployInitializer;
import com.baidu.paddle.fastdeploy.RuntimeOption;
import com.baidu.paddle.fastdeploy.vision.ClassifyResult;
public class PaddleClasModel {
protected long mNativeModelContext = 0; // Context from native.
protected boolean mInitialized = false;
public PaddleClasModel() {
mInitialized = false;
}
// Constructor with default runtime option
public PaddleClasModel(String modelFile,
String paramsFile,
String configFile) {
init_(modelFile, paramsFile, configFile, "", new RuntimeOption());
}
public PaddleClasModel(String modelFile,
String paramsFile,
String configFile,
String labelFile) {
init_(modelFile, paramsFile, configFile, labelFile, new RuntimeOption());
}
// Constructor without label file
public PaddleClasModel(String modelFile,
String paramsFile,
String configFile,
RuntimeOption option) {
init_(modelFile, paramsFile, configFile, "", option);
}
// Constructor with label file
public PaddleClasModel(String modelFile,
String paramsFile,
String configFile,
String labelFile,
RuntimeOption option) {
init_(modelFile, paramsFile, configFile, labelFile, option);
}
// Call init manually without label file
public boolean init(String modelFile,
String paramsFile,
String configFile,
RuntimeOption option) {
return init_(modelFile, paramsFile, configFile, "", option);
}
// Call init manually with label file
public boolean init(String modelFile,
String paramsFile,
String configFile,
String labelFile,
RuntimeOption option) {
return init_(modelFile, paramsFile, configFile, labelFile, option);
}
public boolean release() {
mInitialized = false;
if (mNativeModelContext == 0) {
return false;
}
return releaseNative(mNativeModelContext);
}
public boolean initialized() {
return mInitialized;
}
// Predict without image saving and bitmap rendering.
public ClassifyResult predict(Bitmap ARGB8888Bitmap) {
if (mNativeModelContext == 0) {
return new ClassifyResult();
}
// Only support ARGB8888 bitmap in native now.
return new ClassifyResult(predictNative(
mNativeModelContext, ARGB8888Bitmap, false,
"", 0.f, false));
}
// Predict with image saving and bitmap rendering (costs more time)
public ClassifyResult predict(Bitmap ARGB8888Bitmap,
String savedImagePath,
float scoreThreshold) {
// scoreThreshold is for visualizing only.
if (mNativeModelContext == 0) {
return new ClassifyResult();
}
// Only support ARGB8888 bitmap in native now.
return new ClassifyResult(predictNative(
mNativeModelContext, ARGB8888Bitmap, true,
savedImagePath, scoreThreshold, true));
}
// Internal init_ method
private boolean init_(String modelFile,
String paramsFile,
String configFile,
String labelFile,
RuntimeOption option) {
if (!mInitialized) {
mNativeModelContext = bindNative(
modelFile,
paramsFile,
configFile,
option.mCpuThreadNum,
option.mEnableLiteFp16,
option.mLitePowerMode.ordinal(),
option.mLiteOptimizedModelDir,
option.mEnableRecordTimeOfRuntime, labelFile);
if (mNativeModelContext != 0) {
mInitialized = true;
}
return mInitialized;
} else {
// release current native context and bind a new one.
if (release()) {
mNativeModelContext = bindNative(
modelFile,
paramsFile,
configFile,
option.mCpuThreadNum,
option.mEnableLiteFp16,
option.mLitePowerMode.ordinal(),
option.mLiteOptimizedModelDir,
option.mEnableRecordTimeOfRuntime, labelFile);
if (mNativeModelContext != 0) {
mInitialized = true;
}
return mInitialized;
}
return false;
}
}
// Bind predictor from native context.
private static native long bindNative(String modelFile,
String paramsFile,
String configFile,
int cpuNumThread,
boolean enableLiteFp16,
int litePowerMode,
String liteOptimizedModelDir,
boolean enableRecordTimeOfRuntime,
String labelFile);
// Call prediction from native context.
private static native long predictNative(long nativeModelContext,
Bitmap ARGB8888Bitmap,
boolean saved,
String savedImagePath,
float scoreThreshold,
boolean rendering);
// Release buffers allocated in native context.
private static native boolean releaseNative(long nativeModelContext);
// Initializes at the beginning.
static {
FastDeployInitializer.init();
}
}

View File

@@ -1,170 +0,0 @@
package com.baidu.paddle.fastdeploy.vision.detection;
import android.graphics.Bitmap;
import com.baidu.paddle.fastdeploy.FastDeployInitializer;
import com.baidu.paddle.fastdeploy.RuntimeOption;
import com.baidu.paddle.fastdeploy.vision.DetectionResult;
public class PicoDet {
protected long mNativeModelContext = 0; // Context from native.
protected boolean mInitialized = false;
public PicoDet() {
mInitialized = false;
}
// Constructor with default runtime option
public PicoDet(String modelFile,
String paramsFile,
String configFile) {
init_(modelFile, paramsFile, configFile, "", new RuntimeOption());
}
public PicoDet(String modelFile,
String paramsFile,
String configFile,
String labelFile) {
init_(modelFile, paramsFile, configFile, labelFile, new RuntimeOption());
}
// Constructor without label file
public PicoDet(String modelFile,
String paramsFile,
String configFile,
RuntimeOption option) {
init_(modelFile, paramsFile, configFile, "", option);
}
// Constructor with label file
public PicoDet(String modelFile,
String paramsFile,
String configFile,
String labelFile,
RuntimeOption option) {
init_(modelFile, paramsFile, configFile, labelFile, option);
}
// Call init manually without label file
public boolean init(String modelFile,
String paramsFile,
String configFile,
RuntimeOption option) {
return init_(modelFile, paramsFile, configFile, "", option);
}
// Call init manually with label file
public boolean init(String modelFile,
String paramsFile,
String configFile,
String labelFile,
RuntimeOption option) {
return init_(modelFile, paramsFile, configFile, labelFile, option);
}
public boolean release() {
mInitialized = false;
if (mNativeModelContext == 0) {
return false;
}
return releaseNative(mNativeModelContext);
}
public boolean initialized() {
return mInitialized;
}
// Predict without image saving and bitmap rendering.
public DetectionResult predict(Bitmap ARGB8888Bitmap) {
if (mNativeModelContext == 0) {
return new DetectionResult();
}
// Only support ARGB8888 bitmap in native now.
return new DetectionResult(predictNative(
mNativeModelContext, ARGB8888Bitmap, false,
"", 0.f, false));
}
// Predict with image saving and bitmap rendering (costs more time)
public DetectionResult predict(Bitmap ARGB8888Bitmap,
String savedImagePath,
float scoreThreshold) {
// scoreThreshold is for visualizing only.
if (mNativeModelContext == 0) {
return new DetectionResult();
}
// Only support ARGB8888 bitmap in native now.
return new DetectionResult(predictNative(
mNativeModelContext, ARGB8888Bitmap, true,
savedImagePath, scoreThreshold, true));
}
private boolean init_(String modelFile,
String paramsFile,
String configFile,
String labelFile,
RuntimeOption option) {
if (!mInitialized) {
mNativeModelContext = bindNative(
modelFile,
paramsFile,
configFile,
option.mCpuThreadNum,
option.mEnableLiteFp16,
option.mLitePowerMode.ordinal(),
option.mLiteOptimizedModelDir,
option.mEnableRecordTimeOfRuntime, labelFile);
if (mNativeModelContext != 0) {
mInitialized = true;
}
return mInitialized;
} else {
// release current native context and bind a new one.
if (release()) {
mNativeModelContext = bindNative(
modelFile,
paramsFile,
configFile,
option.mCpuThreadNum,
option.mEnableLiteFp16,
option.mLitePowerMode.ordinal(),
option.mLiteOptimizedModelDir,
option.mEnableRecordTimeOfRuntime, labelFile);
if (mNativeModelContext != 0) {
mInitialized = true;
}
return mInitialized;
}
return false;
}
}
// Bind predictor from native context.
private static native long bindNative(String modelFile,
String paramsFile,
String configFile,
int cpuNumThread,
boolean enableLiteFp16,
int litePowerMode,
String liteOptimizedModelDir,
boolean enableRecordTimeOfRuntime,
String labelFile);
// Call prediction from native context.
private static native long predictNative(long nativeModelContext,
Bitmap ARGB8888Bitmap,
boolean saved,
String savedImagePath,
float scoreThreshold,
boolean rendering);
// Release buffers allocated in native context.
private static native boolean releaseNative(long nativeModelContext);
// One-time initialization when this class is first loaded.
static {
FastDeployInitializer.init();
}
}
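
For orientation, a minimal usage sketch of the wrapper above (the helper class name and the model/params file names are assumptions; the RuntimeOption calls follow the usage documented later in this commit):

```java
import android.graphics.Bitmap;

import com.baidu.paddle.fastdeploy.RuntimeOption;
import com.baidu.paddle.fastdeploy.vision.DetectionResult;

// Assumes this helper lives in the same package as PicoDet above, and that the
// model files have already been unpacked to a readable path on the device.
public class PicoDetQuickStart {
    public static DetectionResult detectOnce(Bitmap argb8888Bitmap, String modelDir) {
        // Runtime settings mirror the documented defaults (2 CPU threads, Lite FP16).
        RuntimeOption option = new RuntimeOption();
        option.setCpuThreadNum(2);
        option.enableLiteFp16();

        PicoDet model = new PicoDet();
        model.init(modelDir + "/model.pdmodel",      // model file name is an assumption
                   modelDir + "/model.pdiparams",    // params file name is an assumption
                   modelDir + "/infer_cfg.yml",
                   option);

        DetectionResult result = new DetectionResult();
        if (model.initialized()) {
            // Only ARGB8888 bitmaps are handled by the native layer.
            result = model.predict(argb8888Bitmap);
        }
        model.release();
        return result;
    }
}
```

The labelFile overloads and the predict(bitmap, savedImagePath, scoreThreshold) variant work the same way, at the cost of extra rendering and image saving.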

View File

@@ -1,99 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:keepScreenOn="true"
tools:context="com.baidu.paddle.fastdeploy.examples.MainActivity">
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="@color/colorWindow">
<com.baidu.paddle.fastdeploy.common.CameraSurfaceView
android:id="@+id/sv_preview"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_centerInParent="true" />
<RelativeLayout
android:layout_width="@dimen/top_bar_height"
android:layout_height="match_parent"
android:layout_alignParentLeft="true"
android:background="@color/colorTopBar">
<TextView
android:id="@+id/tv_status"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
android:layout_marginLeft="@dimen/top_bar_left_right_margin"
android:layout_marginBottom="@dimen/top_bar_left_right_margin"
android:textColor="@color/colorText"
android:gravity="center"
android:textSize="@dimen/small_font_size" />
</RelativeLayout>
<LinearLayout
android:layout_width="wrap_content"
android:layout_height="match_parent"
android:layout_alignParentRight="true"
android:background="@color/colorBottomBar"
android:orientation="horizontal">
<LinearLayout
android:layout_width="@dimen/bottom_bar_top_margin"
android:layout_height="match_parent"
android:orientation="horizontal"></LinearLayout>
<RelativeLayout
android:layout_width="@dimen/large_button_height"
android:layout_height="match_parent">
<ImageButton
android:id="@+id/btn_switch"
android:layout_width="@dimen/small_button_width"
android:layout_height="@dimen/small_button_height"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
android:layout_marginTop="@dimen/bottom_bar_left_right_margin"
android:layout_marginBottom="@dimen/bottom_bar_left_right_margin"
android:background="#00000000"
android:scaleType="fitXY"
android:src="@drawable/btn_switch" />
<ImageButton
android:id="@+id/btn_shutter"
android:layout_width="@dimen/large_button_width"
android:layout_height="@dimen/large_button_height"
android:layout_centerInParent="true"
android:background="@null"
android:focusable="true"
android:focusableInTouchMode="true"
android:scaleType="fitXY"
android:src="@drawable/btn_shutter" />
<ImageButton
android:id="@+id/btn_settings"
android:layout_width="@dimen/small_button_width"
android:layout_height="@dimen/small_button_width"
android:layout_alignParentTop="true"
android:layout_centerHorizontal="true"
android:layout_marginTop="@dimen/bottom_bar_left_right_margin"
android:background="@null"
android:scaleType="fitXY"
android:src="@drawable/btn_settings" />
</RelativeLayout>
<LinearLayout
android:layout_width="@dimen/bottom_bar_bottom_margin"
android:layout_height="match_parent"
android:orientation="horizontal"></LinearLayout>
</LinearLayout>
</RelativeLayout>
</android.support.constraint.ConstraintLayout>

View File

@@ -0,0 +1,14 @@
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent">
<include
layout="@layout/detection_camera_page"
android:id="@+id/camera_page"></include>
<include
layout="@layout/detection_result_page"
android:id="@+id/result_page"
android:visibility="gone"></include>
</FrameLayout>

View File

@@ -0,0 +1,14 @@
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent">
<include
layout="@layout/detection_camera_page"
android:id="@+id/camera_page"></include>
<include
layout="@layout/detection_result_page"
android:id="@+id/result_page"
android:visibility="gone"></include>
</FrameLayout>

View File

@@ -5,17 +5,17 @@
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:keepScreenOn="true"
-   tools:context=".MainActivity">
+   tools:context=".detection.DetectionMainActivity">
    <RelativeLayout
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:background="@color/colorWindow">
-       <com.baidu.paddle.fastdeploy.common.ActionBarLayout
-           android:id="@+id/action_bar"
+       <com.baidu.paddle.fastdeploy.app.ui.layout.ActionBarLayout
+           android:id="@+id/action_bar_main"
            android:layout_width="match_parent"
-           android:layout_height="wrap_content">
+           android:layout_height="wrap_content"/>
        <ImageView
            android:id="@+id/back_in_preview"
@@ -47,25 +47,22 @@
            <TextView
                android:id="@+id/action_realtime_btn"
                style="@style/action_btn"
-               android:layout_width="300px"
+               android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:text="@string/action_bar_realtime"
                android:textAlignment="center" />
        </LinearLayout>
-       </com.baidu.paddle.fastdeploy.common.ActionBarLayout>
-       <!-- 时实-->
-       <com.baidu.paddle.fastdeploy.common.CameraSurfaceView
+       <com.baidu.paddle.fastdeploy.app.ui.view.CameraSurfaceView
            android:id="@+id/sv_preview"
            android:layout_width="match_parent"
            android:layout_height="match_parent"
            android:layout_above="@+id/contral"
-           android:layout_below="@+id/action_bar"
+           android:layout_below="@+id/action_bar_main"
            android:layout_centerInParent="true" />
        <ImageView
-           android:id="@+id/albumSelect"
+           android:id="@+id/iv_select"
            android:layout_width="40dp"
            android:layout_height="40dp"
            android:layout_alignParentRight="true"
@@ -73,8 +70,7 @@
            android:layout_marginRight="20dp"
            android:layout_marginBottom="145dp"
            android:background="@drawable/album_btn"
-           android:scaleType="fitXY"
-           android:visibility="gone"/>
+           android:scaleType="fitXY" />
        <TextView
            android:id="@+id/tv_status"

View File

@@ -0,0 +1,160 @@
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent">
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#FFFFFF"
android:orientation="vertical">
<com.baidu.paddle.fastdeploy.app.ui.layout.ActionBarLayout
android:id="@+id/action_bar_result"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal">
<ImageView
android:id="@+id/back_in_result"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:cropToPadding="true"
android:paddingLeft="40px"
android:paddingTop="60px"
android:paddingRight="60px"
android:paddingBottom="40px"
android:src="@drawable/back_btn" />
<TextView
android:id="@+id/model_name"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerHorizontal="true"
android:layout_marginTop="50px"
android:textColor="@color/textColor"
android:textSize="@dimen/action_btn_text_size" />
</com.baidu.paddle.fastdeploy.app.ui.layout.ActionBarLayout>
<FrameLayout
android:layout_width="match_parent"
android:layout_height="700px">
<ImageView
android:id="@+id/result_image"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="@color/bk_result_image_padding" />
</FrameLayout>
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginLeft="40px"
android:layout_marginTop="26px"
android:layout_marginBottom="20px"
android:text="@string/result_label"
android:textColor="@color/bk_black"
android:textSize="56px"
android:visibility="visible" />
<LinearLayout
android:id="@+id/result_seekbar_section"
android:layout_width="match_parent"
android:layout_height="130px"
android:layout_marginLeft="@dimen/result_list_padding_lr"
android:layout_marginRight="@dimen/result_list_padding_lr"
android:layout_marginBottom="@dimen/result_list_gap_width"
android:background="@drawable/result_page_border_section_bk"
android:visibility="visible">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_vertical"
android:layout_weight="2"
android:paddingLeft="30px"
android:text="@string/result_table_header_confidence"
android:textColor="@color/table_result_tableheader_text_color"
android:textSize="@dimen/result_list_view_text_size" />
<SeekBar
android:id="@+id/confidence_seekbar"
android:layout_width="220dp"
android:layout_height="wrap_content"
android:layout_gravity="center_vertical"
android:layout_weight="6"
android:focusable="false"
android:maxHeight="8px"
android:progressDrawable="@drawable/seekbar_progress_result"
android:splitTrack="false"
android:thumb="@drawable/seekbar_handle" />
<TextView
android:id="@+id/seekbar_text"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_vertical"
android:layout_weight="1"
android:paddingRight="30px"
android:textSize="@dimen/result_list_view_text_size"
/>
</LinearLayout>
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginLeft="@dimen/result_list_padding_lr"
android:layout_marginRight="@dimen/result_list_padding_lr"
android:layout_marginBottom="@dimen/result_list_gap_width"
android:background="@drawable/result_page_border_section_bk"
android:visibility="visible">
<TextView
style="@style/list_result_view_tablehead_style"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/result_table_header_index"
android:textColor="@color/table_result_tableheader_text_color" />
<TextView
style="@style/list_result_view_tablehead_style"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/result_table_header_name"
android:textColor="@color/table_result_tableheader_text_color" />
<TextView
style="@style/list_result_view_tablehead_style"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_weight="0.4"
android:gravity="right"
android:text="@string/result_table_header_confidence"
android:textColor="@color/table_result_tableheader_text_color" />
</LinearLayout>
<FrameLayout
android:layout_width="match_parent"
android:layout_height="wrap_content">
<ScrollView
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginBottom="15px"
android:paddingLeft="@dimen/result_list_padding_lr"
android:paddingRight="@dimen/result_list_padding_lr">
<com.baidu.paddle.fastdeploy.app.ui.view.ResultListView
android:id="@+id/result_list_view"
android:layout_width="match_parent"
android:layout_height="700px"
android:divider="#FFFFFF"
android:dividerHeight="@dimen/result_list_gap_width"></com.baidu.paddle.fastdeploy.app.ui.view.ResultListView>
</ScrollView>
</FrameLayout>
</LinearLayout>
</FrameLayout>

View File

@@ -0,0 +1,26 @@
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="horizontal"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:background="@drawable/result_page_border_section_bk">
<TextView
android:id="@+id/index"
style="@style/list_result_view_item_style"
android:layout_width="wrap_content"
android:layout_weight="0.2" />
<TextView
android:id="@+id/name"
style="@style/list_result_view_item_style"
android:layout_width="wrap_content"
android:layout_weight="0.6"
android:maxWidth="300px" />
<TextView
android:id="@+id/confidence"
style="@style/list_result_view_item_style"
android:layout_weight="0.2"
android:layout_width="wrap_content" />
</LinearLayout>

View File

@@ -1,5 +1,13 @@
 <resources>
-    <string name="app_name">FastDeploy PicoDet</string>
+    <!-- Default App name -->
+    <string name="app_name">EasyEdge</string>
+    <!-- Other App name -->
+    <string name="detection_app_name">EasyEdge</string>
+    <string name="ocr_app_name">EasyEdge</string>
+    <string name="classification_app_name">EasyEdge</string>
+    <string name="facedet_app_name">EasyEdge</string>
+    <string name="segmentation_app_name">EasyEdge</string>
+    <!-- Keys for PreferenceScreen -->
     <string name="CHOOSE_PRE_INSTALLED_MODEL_KEY">CHOOSE_INSTALLED_MODEL_KEY</string>
     <string name="MODEL_DIR_KEY">MODEL_DIR_KEY</string>
     <string name="LABEL_PATH_KEY">LABEL_PATH_KEY</string>
@@ -7,15 +15,30 @@
     <string name="CPU_POWER_MODE_KEY">CPU_POWER_MODE_KEY</string>
     <string name="SCORE_THRESHOLD_KEY">SCORE_THRESHOLD_KEY</string>
     <string name="ENABLE_LITE_FP16_MODE_KEY">ENABLE_LITE_FP16_MODE_KEY</string>
-    <string name="MODEL_DIR_DEFAULT">models/picodet_s_320_coco_lcnet</string>
-    <string name="LABEL_PATH_DEFAULT">labels/coco_label_list.txt</string>
+    <!-- Common default values ... -->
     <string name="CPU_THREAD_NUM_DEFAULT">2</string>
     <string name="CPU_POWER_MODE_DEFAULT">LITE_POWER_HIGH</string>
     <string name="SCORE_THRESHOLD_DEFAULT">0.4</string>
+    <string name="SCORE_THRESHOLD_CLASSIFICATION">0.1</string>
+    <string name="SCORE_THRESHOLD_FACEDET">0.25</string>
     <string name="ENABLE_LITE_FP16_MODE_DEFAULT">true</string>
     <!--Other values-->
+    <!-- Detection model & Label paths & other values ... -->
+    <string name="DETECTION_MODEL_DIR_DEFAULT">models/picodet_s_320_coco_lcnet</string>
+    <string name="DETECTION_LABEL_PATH_DEFAULT">labels/coco_label_list.txt</string>
+    <!-- PP-OCRv2 & PP-OCRv3 values ... -->
+    <string name="OCR_MODEL_DIR_DEFAULT">models</string>
+    <string name="OCR_REC_LABEL_DEFAULT">labels/ppocr_keys_v1.txt</string>
+    <!-- classification values ... -->
+    <string name="CLASSIFICATION_MODEL_DIR_DEFAULT">models/MobileNetV1_x0_25_infer</string>
+    <string name="CLASSIFICATION_LABEL_PATH_DEFAULT">labels/imagenet1k_label_list.txt</string>
+    <!-- facedet values ... -->
+    <string name="FACEDET_MODEL_DIR_DEFAULT">models/scrfd_500m_bnkps_shape320x320_pd</string>
+    <!-- segmentation values ... -->
+    <string name="SEGMENTATION_MODEL_DIR_DEFAULT">models/portrait_pp_humansegv2_lite_256x144_inference_model</string>
+    <!-- Other resources values-->
     <string name="action_bar_take_photo">拍照识别</string>
-    <string name="action_bar_realtime">FD 实时识别</string>
+    <string name="action_bar_realtime">实时识别</string>
     <string name="action_bar_back">&lt;</string>
     <string name="action_bar_model_name">模型名称</string>
     <string name="result_label">识别结果</string>

View File

@@ -0,0 +1,17 @@
<?xml version="1.0" encoding="utf-8"?>
<resources>
<dimen name="action_btn_size">120dp</dimen>
<dimen name="action_btn_text_size">46px</dimen>
<dimen name="operation_btn_margin_top_take_picture">126px</dimen>
<dimen name="operation_btn_margin_top">136px</dimen>
<dimen name="result_list_view_text_size">46px</dimen>
<dimen name="result_list_popview_text_size">36px</dimen>
<dimen name="result_list_padding_lr">15dp</dimen>
<dimen name="result_list_gap_width">15dp</dimen>
</resources>

View File

@@ -1,17 +1,17 @@
 <?xml version="1.0" encoding="utf-8"?>
 <PreferenceScreen xmlns:android="http://schemas.android.com/apk/res/android">
     <ListPreference
-        android:defaultValue="@string/MODEL_DIR_DEFAULT"
+        android:defaultValue="@string/DETECTION_MODEL_DIR_DEFAULT"
         android:key="@string/CHOOSE_PRE_INSTALLED_MODEL_KEY"
         android:negativeButtonText="@null"
         android:positiveButtonText="@null"
         android:title="Choose Pre-Installed Models" />
     <EditTextPreference
-        android:defaultValue="@string/MODEL_DIR_DEFAULT"
+        android:defaultValue="@string/DETECTION_MODEL_DIR_DEFAULT"
         android:key="@string/MODEL_DIR_KEY"
         android:title="Model Dir" />
     <EditTextPreference
-        android:defaultValue="@string/LABEL_PATH_DEFAULT"
+        android:defaultValue="@string/DETECTION_LABEL_PATH_DEFAULT"
         android:key="@string/LABEL_PATH_KEY"
         android:title="Label Path" />
     <ListPreference
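
For context, a hedged sketch of how the preference keys above (and the DETECTION_* defaults added to strings.xml) are typically read back at runtime; the DetectionSettings helper and its field names are illustrative assumptions, not code from this commit, and R refers to the app's generated resources:

```java
import android.content.Context;
import android.content.SharedPreferences;
import android.preference.PreferenceManager;

// Hypothetical helper: collects the values edited through the PreferenceScreen,
// falling back to the DETECTION_* defaults defined in res/values/strings.xml.
public class DetectionSettings {
    public String modelDir;
    public String labelPath;
    public float scoreThreshold;

    public static DetectionSettings load(Context ctx) {
        SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(ctx);
        DetectionSettings s = new DetectionSettings();
        s.modelDir = prefs.getString(
                ctx.getString(R.string.MODEL_DIR_KEY),
                ctx.getString(R.string.DETECTION_MODEL_DIR_DEFAULT));
        s.labelPath = prefs.getString(
                ctx.getString(R.string.LABEL_PATH_KEY),
                ctx.getString(R.string.DETECTION_LABEL_PATH_DEFAULT));
        s.scoreThreshold = Float.parseFloat(prefs.getString(
                ctx.getString(R.string.SCORE_THRESHOLD_KEY),
                ctx.getString(R.string.SCORE_THRESHOLD_DEFAULT)));
        return s;
    }
}
```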

View File

@@ -420,7 +420,6 @@ String configFile = "picodet_s_320_coco_lcnet/infer_cfg.yml";
 RuntimeOption option = new RuntimeOption();
 option.setCpuThreadNum(2);
 option.setLitePowerMode(LitePowerMode.LITE_POWER_HIGH);
-option.enableRecordTimeOfRuntime();
 option.enableLiteFp16();
 // 使用init函数初始化
 model.init(modelFile, paramFile, configFile, option);
@@ -489,7 +488,7 @@ App示例工程只需要在AndroidManifest.xml中切换不同的Activity即可
 </application>
 </manifest>
 ```
-- 目标检测
+- 目标检测场景
 ```xml
 <manifest xmlns:android="http://schemas.android.com/apk/res/android"
     package="com.baidu.paddle.fastdeploy.app.examples">
@@ -503,7 +502,7 @@ App示例工程只需要在AndroidManifest.xml中切换不同的Activity即可
 </application>
 </manifest>
 ```
-- OCR文字识别
+- OCR文字识别场景
 ```xml
 <manifest xmlns:android="http://schemas.android.com/apk/res/android"
     package="com.baidu.paddle.fastdeploy.app.examples">
@@ -517,7 +516,7 @@ App示例工程只需要在AndroidManifest.xml中切换不同的Activity即可
 </application>
 </manifest>
 ```
-- 人像分割
+- 人像分割场景
 ```xml
 <manifest xmlns:android="http://schemas.android.com/apk/res/android"
     package="com.baidu.paddle.fastdeploy.app.examples">
@@ -531,7 +530,7 @@ App示例工程只需要在AndroidManifest.xml中切换不同的Activity即可
 </application>
 </manifest>
 ```
-- 人脸检测
+- 人脸检测场景
 ```xml
 <manifest xmlns:android="http://schemas.android.com/apk/res/android"
     package="com.baidu.paddle.fastdeploy.app.examples">