Mirror of https://github.com/PaddlePaddle/FastDeploy.git, synced 2025-10-05 16:48:03 +08:00

[Doc] Add English version of documents in docs/cn and api/vision_results (#931)

* First commit
* Add one missed translation
* deleted: docs/en/quantize.md
* Update one translation
* Update en version
* Update one translation in code
* Standardize one writing
* Standardize one writing
* Update some en version
* Fix a grammar problem
* Update en version for api/vision_results
* Merge branch 'develop' of https://github.com/charl-u/FastDeploy into develop
* Change the links in README in vision_results/ to point to the en documents
* Modify a title
* Add link to serving/docs/
* Finish translation of demo.md
@@ -1,13 +1,18 @@

English | [中文](../../cn/faq/build_on_win_with_gui.md)

# Use CMakeGUI + VS 2019 IDE to Compile FastDeploy

Note: This method only supports the FastDeploy C++ SDK.

## Contents

- [How to Use CMake GUI for Basic Compilation](#CMakeGuiAndVS2019Basic)
- [How to Set Up the CPU Version C++ SDK Compilation](#CMakeGuiAndVS2019CPU)
- [How to Set Up the GPU Version C++ SDK Compilation](#CMakeGuiAndVS2019GPU)
- [How to Use the Visual Studio 2019 IDE for Compilation](#CMakeGuiAndVS2019Build)
- [Compile All Examples (Optional)](#compile-all-examplesoptional)
- [Note](#note)

### How to Use CMake GUI for Basic Compilation

<div id="CMakeGuiAndVS2019Basic"></div>
@@ -1,3 +1,4 @@

English | [中文](../../cn/faq/develop_a_new_model.md)

# How to Integrate a New Model in FastDeploy

How do you add a new model to FastDeploy, including its C++/Python deployment? This document takes the ResNet50 model in torchvision v0.12.0 as an example to introduce external [Model Integration](#modelsupport) in FastDeploy. The whole process takes only 3 steps.
@@ -1,3 +1,4 @@

English | [中文](../../cn/faq/how_to_change_backend.md)

# How to Change the Model Inference Backend

FastDeploy supports various backends, including
docs/en/faq/rknpu2/export.md (new file, 50 lines)
@@ -0,0 +1,50 @@

English | [中文](../../../cn/faq/rknpu2/export.md)

# Export Model

## Introduction

FastDeploy provides a simple integration of the ONNX-to-RKNN conversion process. In this document, we first write a YAML configuration file, then export the model with `tools/export.py`.
Before starting the conversion, please check that the environment is set up correctly by referring to [RKNN-Toolkit2 Installation](./install_rknn_toolkit2.md).

## Configuration Parameters of export.py

| Parameter   | Optional            | Description                                             |
|-------------|---------------------|---------------------------------------------------------|
| verbose     | Yes (default: True) | Whether to print detailed information during conversion |
| config_path | No                  | Path to the configuration file                          |

## Config File Introduction

### Structure of the config YAML file

```yaml
model_path: ./portrait_pp_humansegv2_lite_256x144_pretrained.onnx
output_folder: ./
target_platform: RK3588
normalize:
  mean: [[0.5,0.5,0.5]]
  std: [[0.5,0.5,0.5]]
outputs: None
```

### Config parameters

* model_path: Path to the ONNX model to convert.
* output_folder: Folder where the exported model is saved.
* target_platform: The device the model runs on; only RK3588 or RK3568 can be chosen.
* normalize: Configures the normalize operation on the NPU, with two sub-parameters, mean and std.
  * std: If you perform normalization externally, set this to [1/255, 1/255, 1/255].
  * mean: If you perform normalization externally, set this to [0, 0, 0].
* outputs: List of output nodes; set to None to use the default output nodes.
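For illustration, the parameters above can be assembled into `config.yaml` text programmatically. The helper below is hypothetical (not part of FastDeploy); it simply renders the documented fields and rejects an unsupported target_platform:

```python
# Hypothetical helper (not part of FastDeploy): render the documented
# config fields into YAML text and validate target_platform.
VALID_PLATFORMS = {"RK3588", "RK3568"}

def render_config(model_path, output_folder, target_platform,
                  mean=(0.5, 0.5, 0.5), std=(0.5, 0.5, 0.5), outputs=None):
    if target_platform not in VALID_PLATFORMS:
        raise ValueError("target_platform must be RK3588 or RK3568")
    fmt = lambda v: "[[" + ",".join(str(x) for x in v) + "]]"
    return "\n".join([
        "model_path: %s" % model_path,
        "output_folder: %s" % output_folder,
        "target_platform: %s" % target_platform,
        "normalize:",
        "  mean: %s" % fmt(mean),
        "  std: %s" % fmt(std),
        "outputs: %s" % ("None" if outputs is None else outputs),
    ])

print(render_config("./portrait_pp_humansegv2_lite_256x144_pretrained.onnx",
                    "./", "RK3588"))
```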

## How to Convert the Model

Run the following command from the repository root:

```bash
python tools/export.py --config_path=./config.yaml
```
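For reference, a stripped-down sketch of the command-line interface described by the parameter table above (an illustrative stand-in, not the actual FastDeploy `tools/export.py` source):

```python
import argparse

# Illustrative stand-in for the export.py CLI described above,
# not the actual FastDeploy source.
def build_parser():
    parser = argparse.ArgumentParser(description="Export an ONNX model to RKNN")
    parser.add_argument("--verbose", default=True,
                        help="print detailed information during conversion")
    parser.add_argument("--config_path", required=True,
                        help="path to the YAML configuration file")
    return parser

args = build_parser().parse_args(["--config_path=./config.yaml"])
print(args.config_path)  # -> ./config.yaml
```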

## Things to Note in Model Export

* Please don't export models with softmax or argmax layers; compute these operations externally instead.
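Since softmax and argmax stay out of the exported model, they can be applied to the raw output logits afterwards on the CPU; a minimal dependency-free sketch:

```python
import math

def softmax(logits):
    # Subtract the max before exponentiating for numerical stability.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def argmax(logits):
    return max(range(len(logits)), key=lambda i: logits[i])

logits = [2.0, 1.0, 0.1]
print(argmax(logits))  # -> 0
print(sum(softmax(logits)))  # sums to 1.0 up to floating-point error
```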
docs/en/faq/rknpu2/install_rknn_toolkit2.md (new file, 49 lines)
@@ -0,0 +1,49 @@

English | [中文](../../../cn/faq/rknpu2/install_rknn_toolkit2.md)

# RKNN-Toolkit2 Installation

## Download

There are two ways to download RKNN-Toolkit2:

* Download from the GitHub repository

A stable version of RKNN-Toolkit2 is available on GitHub.
```bash
git clone https://github.com/rockchip-linux/rknn-toolkit2.git
```

* Download from Baidu Netdisk

If the stable version has bugs that prevent your model from being deployed, you can also download the beta version from Baidu Netdisk. It is installed the same way as the stable version.
```text
link:https://eyun.baidu.com/s/3eTDMk6Y password:rknn
```

## Installation

Dependency issues can arise during installation; since some specific packages are required, we recommend creating a new conda environment first.
Instructions for installing conda are easy to find online, so we skip them here and go straight to installing RKNN-Toolkit2.

### Download and Install the Required Packages

```bash
sudo apt-get install libxslt1-dev zlib1g zlib1g-dev libglib2.0-0 \
libsm6 libgl1-mesa-glx libprotobuf-dev gcc g++
```

### Environment for Installing RKNN-Toolkit2

```bash
# Create a new environment
conda create -n rknn2 python=3.6
conda activate rknn2

# RKNN-Toolkit2 has a specific dependency on numpy
pip install numpy==1.16.6

# Install the RKNN-Toolkit2 wheel. Note: the wheel must match your Python
# version -- the cp38 wheel below requires a Python 3.8 interpreter, so pick
# the wheel in packages/ that matches your environment.
cd ~/download/rknn-toolkit2-master/packages
pip install rknn_toolkit2-1.3.0_11912b58-cp38-cp38-linux_x86_64.whl
```
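Note that the wheel filename encodes the CPython version it requires (`cp38` means CPython 3.8), so the wheel must match the interpreter of the active environment. A purely illustrative sanity check:

```python
import re
import sys

def wheel_python_tag(wheel_name):
    # Extract the CPython tag from a wheel filename, e.g. 'cp38' -> (3, 8).
    m = re.search(r"-cp(\d)(\d+)-", wheel_name)
    if m is None:
        raise ValueError("no CPython tag found in %s" % wheel_name)
    return (int(m.group(1)), int(m.group(2)))

wheel = "rknn_toolkit2-1.3.0_11912b58-cp38-cp38-linux_x86_64.whl"
required = wheel_python_tag(wheel)
print(required)  # -> (3, 8)
if sys.version_info[:2] != required:
    print("warning: interpreter %d.%d does not match the wheel" % sys.version_info[:2])
```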

## Other Documents
- [How to convert ONNX to RKNN](./export.md)
docs/en/faq/rknpu2/rknpu2.md (new file, 74 lines)
@@ -0,0 +1,74 @@

English | [中文](../../../cn/faq/rknpu2/rknpu2.md)

# RKNPU2 Model Deployment

## Installation Environment

RKNPU2 model export is only supported on the x86 Linux platform; please refer to [RKNPU2 Model Export Environment Configuration](./install_rknn_toolkit2.md).

## Convert ONNX to RKNN

Since ONNX models cannot run directly on the NPU, they must first be converted to RKNN models. For details, please refer to the [RKNPU2 Conversion Document](./export.md).

## Models Supported by RKNPU2

The following numbers are end-to-end speeds, measured in this test environment:
* Device model: RK3588
* The ARM CPU numbers are measured with the ONNX model
* The NPU numbers use a single-core NPU

| Mission Scenario | Model             | Model Version (tested version) | ARM CPU/RKNN speed (ms) |
|------------------|-------------------|--------------------------------|-------------------------|
| Detection        | Picodet           | Picodet-s                      | 162/112                 |
| Detection        | RKYOLOV5          | YOLOV5-S-Relu(int8)            | -/57                    |
| Detection        | RKYOLOX           | -                              | -/-                     |
| Detection        | RKYOLOV7          | -                              | -/-                     |
| Segmentation     | Unet              | Unet-cityscapes                | -/-                     |
| Segmentation     | PP-HumanSegV2Lite | portrait                       | 133/43                  |
| Segmentation     | PP-HumanSegV2Lite | human                          | 133/43                  |
| Face Detection   | SCRFD             | SCRFD-2.5G-kps-640             | 108/42                  |
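From the rows above that report both numbers, the NPU speedup over the ARM CPU works out as follows (a quick derived comparison, not additional measurements):

```python
# (ARM CPU ms, RKNN ms) pairs taken from the table rows that report both.
timings = {
    "Picodet-s": (162, 112),
    "PP-HumanSegV2Lite portrait": (133, 43),
    "PP-HumanSegV2Lite human": (133, 43),
    "SCRFD-2.5G-kps-640": (108, 42),
}

speedups = {name: cpu_ms / npu_ms for name, (cpu_ms, npu_ms) in timings.items()}
for name, s in sorted(speedups.items(), key=lambda kv: -kv[1]):
    print("%s: %.2fx faster on the NPU" % (name, s))
```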

## How to Use the RKNPU2 Backend for Model Inference

We use the SCRFD model as an example to show how to run inference with the RKNPU2 backend. The modifications mentioned in the comments below are relative to the ONNX CPU workflow.

```c++
int infer_scrfd_npu() {
  char model_path[] = "./model/scrfd_2.5g_bnkps_shape640x640.rknn";
  char image_file[] = "./image/test_lite_face_detector_3.jpg";
  auto option = fastdeploy::RuntimeOption();
  // Modification 1: the option.UseRKNPU2 function should be called
  option.UseRKNPU2();

  // Modification 2: pass 'fastdeploy::ModelFormat::RKNN' when loading the model
  auto *model = new fastdeploy::vision::facedet::SCRFD(
      model_path, "", option, fastdeploy::ModelFormat::RKNN);
  if (!model->Initialized()) {
    std::cerr << "Failed to initialize." << std::endl;
    return 0;
  }

  // Modification 3 (optional): RKNPU2 can normalize on the NPU, and its input is in NHWC format.
  // DisableNormalizeAndPermute skips the normalize and HWC-to-CHW steps during preprocessing.
  // If you use a model from the supported list, call this method before Predict.
  model->DisableNormalizeAndPermute();
  auto im = cv::imread(image_file);
  auto im_bak = im.clone();
  fastdeploy::vision::FaceDetectionResult res;
  clock_t start = clock();
  if (!model->Predict(&im, &res, 0.8, 0.8)) {
    std::cerr << "Failed to predict." << std::endl;
    return 0;
  }
  clock_t end = clock();
  double dur = (double)(end - start);
  printf("infer_scrfd_npu use time:%f\n", (dur / CLOCKS_PER_SEC));
  auto vis_im = fastdeploy::vision::Visualize::VisFaceDetection(im_bak, res);
  cv::imwrite("scrfd_rknn_vis_result.jpg", vis_im);
  std::cout << "Visualized result saved in ./scrfd_rknn_vis_result.jpg" << std::endl;
  return 0;
}
```

## Other Related Documents
- [How to Build the RKNPU2 Deployment Environment](../../build_and_install/rknpu2.md)
- [RKNN-Toolkit2 Installation Document](./install_rknn_toolkit2.md)
- [How to convert ONNX to RKNN](./export.md)
@@ -1,3 +1,250 @@

English | [中文](../../cn/faq/use_cpp_sdk_on_android.md)

# FastDeploy to deploy on Android Platform

This document takes PicoDet as an example and explains how to wrap a FastDeploy model for Android through JNI. You need at least basic knowledge of C++, Java, JNI, and Android. If you mainly care about how to call the FastDeploy API from the Java layer, you can skip this document.

## Content
- [FastDeploy to deploy on Android Platform](#fastdeploy-to-deploy-on-android-platform)
  - [Content](#content)
  - [Create a new Java class and Define the native API](#create-a-new-java-class-and-define-the-native-api)
  - [Generate JNI function definition with Android Studio](#generate-jni-function-definition-with-android-studio)
  - [Implement JNI function in the C++ layer](#implement-jni-function-in-the-c-layer)
  - [Write CMakeLists.txt and configure build.gradle](#write-cmakeliststxt-and-configure-buildgradle)
  - [More examples of FastDeploy Android](#more-examples-of-fastdeploy-android)

## Create a new Java class and Define the native API
<div id="Java"></div>

```java
public class PicoDet {
  protected long mNativeModelContext = 0; // Context from native.
  protected boolean mInitialized = false;
  // ...
  // Bind predictor from native context.
  private static native long bindNative(String modelFile,
                                        String paramsFile,
                                        String configFile,
                                        int cpuNumThread,
                                        boolean enableLiteFp16,
                                        int litePowerMode,
                                        String liteOptimizedModelDir,
                                        boolean enableRecordTimeOfRuntime,
                                        String labelFile);

  // Call prediction from native context.
  private static native long predictNative(long nativeModelContext,
                                           Bitmap ARGB8888Bitmap,
                                           boolean saved,
                                           String savedImagePath,
                                           float scoreThreshold,
                                           boolean rendering);

  // Release buffers allocated in native context.
  private static native boolean releaseNative(long nativeModelContext);

  // Initializes at the beginning.
  static {
    FastDeployInitializer.init();
  }
}
```

These interfaces, marked as native, must be implemented via JNI and are available to the PicoDet class in the Java layer. For the complete PicoDet Java code, please refer to [PicoDet.java](../../../java/android/fastdeploy/src/main/java/com/baidu/paddle/fastdeploy/vision/detection/PicoDet.java). The functions are described separately:
- `bindNative`: Initializes the model resources in the C++ layer. It returns a pointer (of type long) to the model on success; otherwise it returns 0.
- `predictNative`: Runs prediction in the C++ layer using the initialized model pointer. On success it returns a pointer to the result; otherwise it returns 0. Note that the result pointer needs to be released after each prediction; please refer to the definition of the `predict` function in [PicoDet.java](../../../java/android/fastdeploy/src/main/java/com/baidu/paddle/fastdeploy/vision/detection/PicoDet.java) for details.
- `releaseNative`: Releases the model resources in the C++ layer for the given model pointer.
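The bind/predict/release trio follows the common opaque-handle pattern: the Java layer only ever holds a long, 0 signals failure, and release must be called explicitly. A language-agnostic sketch of that lifecycle in Python (purely illustrative, unrelated to the real JNI code):

```python
# Illustrative sketch of the opaque-handle lifecycle used by
# bindNative/predictNative/releaseNative; not FastDeploy code.
_registry = {}
_next_handle = 1

def bind(model_file):
    """Create a native-side object; return its handle, or 0 on failure."""
    global _next_handle
    if not model_file:
        return 0
    handle = _next_handle
    _next_handle += 1
    _registry[handle] = {"model": model_file}
    return handle

def predict(handle, image):
    """Run prediction for a valid handle; return None on a bad handle."""
    if handle not in _registry:
        return None
    return {"model": _registry[handle]["model"], "input": image}

def release(handle):
    """Free the native-side object; return False if already released."""
    return _registry.pop(handle, None) is not None

h = bind("picodet_model")
result = predict(h, "test.jpg")
release(h)
```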

## Generate JNI function definition with Android Studio
<div id="JNI"></div>

Hover over a native function defined in Java, and Android Studio will ask whether you want to create the corresponding JNI function definition. Here, we create the definitions in a pre-created C++ file, `picodet_jni.cc`.

- Create a JNI function definition with Android Studio:
![](https://user-images.githubusercontent.com/31974251/197341065-cdf8f626-4bb1-4a57-8d7a-80b382fe994e.png)

- Create the definition in picodet_jni.cc:
![](https://user-images.githubusercontent.com/31974251/197341190-b7186108-cd3b-445d-a640-eb57cddf494c.png)

- The JNI function definition created:
![](https://user-images.githubusercontent.com/31974251/197341274-e995608e-0e60-4402-b0f8-c26474a11b63.png)

You can create the JNI function definitions for the other native functions by following the same process.

## Implement JNI function in the C++ layer
<div id="CPP"></div>

Here is an example of the PicoDet JNI layer implementation. For the complete C++ code, please refer to [android/app/src/main/cpp](../../../examples/vision/detection/paddledetection/android/app/src/main/cpp/).
```C++
// Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#include <jni.h>  // NOLINT
#include "fastdeploy_jni/convert_jni.h"  // NOLINT
#include "fastdeploy_jni/assets_loader_jni.h"  // NOLINT
#include "fastdeploy_jni/runtime_option_jni.h"  // NOLINT
#include "fastdeploy_jni/vision/results_jni.h"  // NOLINT
#include "fastdeploy_jni/vision/detection/detection_utils_jni.h"  // NOLINT

namespace fni = fastdeploy::jni;
namespace vision = fastdeploy::vision;
namespace detection = fastdeploy::vision::detection;

#ifdef __cplusplus
extern "C" {
#endif

JNIEXPORT jlong JNICALL
Java_com_baidu_paddle_fastdeploy_vision_detection_PicoDet_bindNative(
    JNIEnv *env, jobject thiz, jstring model_file, jstring params_file,
    jstring config_file, jobject runtime_option, jstring label_file) {
  auto c_model_file = fni::ConvertTo<std::string>(env, model_file);
  auto c_params_file = fni::ConvertTo<std::string>(env, params_file);
  auto c_config_file = fni::ConvertTo<std::string>(env, config_file);
  auto c_label_file = fni::ConvertTo<std::string>(env, label_file);
  auto c_runtime_option = fni::NewCxxRuntimeOption(env, runtime_option);
  auto c_model_ptr = new detection::PicoDet(
      c_model_file, c_params_file, c_config_file, c_runtime_option);
  INITIALIZED_OR_RETURN(c_model_ptr)

#ifdef ENABLE_RUNTIME_PERF
  c_model_ptr->EnableRecordTimeOfRuntime();
#endif
  if (!c_label_file.empty()) {
    fni::AssetsLoader::LoadDetectionLabels(c_label_file);
  }
  vision::EnableFlyCV();
  return reinterpret_cast<jlong>(c_model_ptr);
}

JNIEXPORT jobject JNICALL
Java_com_baidu_paddle_fastdeploy_vision_detection_PicoDet_predictNative(
    JNIEnv *env, jobject thiz, jlong cxx_context, jobject argb8888_bitmap,
    jboolean save_image, jstring save_path, jboolean rendering,
    jfloat score_threshold) {
  if (cxx_context == 0) {
    return NULL;
  }
  cv::Mat c_bgr;
  if (!fni::ARGB888Bitmap2BGR(env, argb8888_bitmap, &c_bgr)) {
    return NULL;
  }
  auto c_model_ptr = reinterpret_cast<detection::PicoDet *>(cxx_context);
  vision::DetectionResult c_result;
  auto t = fni::GetCurrentTime();
  c_model_ptr->Predict(&c_bgr, &c_result);
  PERF_TIME_OF_RUNTIME(c_model_ptr, t)

  if (rendering) {
    fni::RenderingDetection(env, c_bgr, c_result, argb8888_bitmap, save_image,
                            score_threshold, save_path);
  }

  return fni::NewJavaResultFromCxx(env, reinterpret_cast<void *>(&c_result),
                                   vision::ResultType::DETECTION);
}

JNIEXPORT jboolean JNICALL
Java_com_baidu_paddle_fastdeploy_vision_detection_PicoDet_releaseNative(
    JNIEnv *env, jobject thiz, jlong cxx_context) {
  if (cxx_context == 0) {
    return JNI_FALSE;
  }
  auto c_model_ptr = reinterpret_cast<detection::PicoDet *>(cxx_context);
  PERF_TIME_OF_RUNTIME(c_model_ptr, -1)

  delete c_model_ptr;
  LOGD("[End] Release PicoDet in native !");
  return JNI_TRUE;
}

#ifdef __cplusplus
}
#endif
```

## Write CMakeLists.txt and configure build.gradle
<div id="CMakeAndGradle"></div>

The implemented JNI code needs to be compiled into a .so library so that it can be called from Java. To achieve this, add JNI project support in build.gradle and write the corresponding CMakeLists.txt.
- Configure NDK, CMake, and the Android ABI in build.gradle
```groovy
android {
    defaultConfig {
        // Other configurations are omitted ...
        externalNativeBuild {
            cmake {
                arguments '-DANDROID_PLATFORM=android-21', '-DANDROID_STL=c++_shared', "-DANDROID_TOOLCHAIN=clang"
                abiFilters 'armeabi-v7a', 'arm64-v8a'
                cppFlags "-std=c++11"
            }
        }
    }
    // Other configurations are omitted ...
    externalNativeBuild {
        cmake {
            path file('src/main/cpp/CMakeLists.txt')
            version '3.10.2'
        }
    }
    sourceSets {
        main {
            jniLibs.srcDirs = ['libs']
        }
    }
    ndkVersion '20.1.5948944'
}
```

- An example of CMakeLists.txt
```cmake
cmake_minimum_required(VERSION 3.10.2)
project("fastdeploy_jni")

# Where xxx indicates the version number of the C++ SDK
set(FastDeploy_DIR "${CMAKE_CURRENT_SOURCE_DIR}/../../../libs/fastdeploy-android-xxx-shared")

find_package(FastDeploy REQUIRED)

include_directories(${CMAKE_CURRENT_SOURCE_DIR})
include_directories(${FastDeploy_INCLUDE_DIRS})

add_library(
  fastdeploy_jni
  SHARED
  utils_jni.cc
  bitmap_jni.cc
  vision/results_jni.cc
  vision/visualize_jni.cc
  vision/detection/picodet_jni.cc
  vision/classification/paddleclas_model_jni.cc)

find_library(log-lib log)

target_link_libraries(
  # Specifies the target library.
  fastdeploy_jni
  jnigraphics
  ${FASTDEPLOY_LIBS}
  GLESv2
  EGL
  ${log-lib}
)
```

For the complete project, please refer to [CMakeLists.txt](../../../java/android/fastdeploy/src/main/cpp/CMakeLists.txt) and [build.gradle](../../../java/android/fastdeploy/build.gradle).

## More examples of FastDeploy Android
<div id="Examples"></div>

For more examples of using FastDeploy on Android, you can refer to:
- [Image classification on Android](../../../examples/vision/classification/paddleclas/android/README.md)
- [Object detection on Android](../../../examples/vision/detection/paddledetection/android/README.md)
@@ -3,33 +3,38 @@ English | [中文](../../cn/faq/use_sdk_on_windows.md)

# Using the FastDeploy C++ SDK on Windows Platform

## Contents

- [1. Environment Dependent](#1-environment-dependent)
- [2. Download FastDeploy Windows 10 C++ SDK](#2-download-fastdeploy-windows-10-c-sdk)
  - [2.1 Download the Pre-built Library or Build the Latest SDK from Source](#21-download-the-pre-built-library-or-build-the-latest-sdk-from-source)
  - [2.2 Prepare Model Files and Test Images](#22-prepare-model-files-and-test-images)
- [3. Various ways to use C++ SDK on Windows Platform](#3-various-ways-to-use-c-sdk-on-windows-platform)
  - [3.1 SDK usage method 1: Using the C++ SDK from the Command Line](#31-sdk-usage-method-1using-the-c-sdk-from-the-command-line)
    - [3.1.1 Build PPYOLOE on Windows Platform](#311-build-ppyoloe-on-windows-platform)
    - [3.1.2 Run Demo](#312-run-demo)
  - [3.2 SDK usage method 2: Visual Studio 2019 creates sln project using C++ SDK](#32-sdk-usage-method-2-visual-studio-2019-creates-sln-project-using-c-sdk)
    - [3.2.1 Step 1: Visual Studio 2019 creates sln project](#321-step-1visual-studio-2019-creates-sln-project-project)
    - [3.2.2 Step 2: Copy the code of infer\_ppyoloe.cc from examples to the project](#322-step-2copy-the-code-of-infer_ppyoloecc-from-examples-to-the-project)
    - [3.2.3 Step 3: Set the project configuration to "Release x64" configuration](#323-step-3set-the-project-configuration-to-release-x64-configuration)
    - [3.2.4 Step 4: Configure Include Header File Path](#324-step-4configure-include-header-file-path)
    - [3.2.5 Step 5: Configure Lib Path and Add Library Files](#325-step-5configure-lib-path-and-add-library-files)
    - [3.2.6 Step 6: Build the Project and Run to Get the Result](#326-step-6build-the-project-and-run-to-get-the-result)
  - [3.3 Visual Studio 2019 Create CMake project using C++ SDK](#33-visual-studio-2019-create-cmake-project-using-c-sdk)
    - [3.3.1 Step 1: Visual Studio 2019 Creates a CMake Project](#331-step-1-visual-studio-2019-creates-a-cmake-project)
    - [3.3.2 Step 2: Configure FastDeploy C++ SDK in CMakeLists](#332-step-2configure-fastdeploy-c-sdk-in-cmakelists)
    - [3.3.3 Step 3: Generate project cache and Modify CMakeSetting.json Configuration](#333-step-3generate-project-cache-and-modify-cmakesettingjson-configuration)
    - [3.3.4 Step 4: Generate executable file, Run to Get the Result](#334-step-4generate-executable-file-run-to-get-the-result)
- [4. Multiple methods to Configure the Required Dependencies for the Exe Runtime](#4-multiple-methods-to-configure-the-required-dependencies-for-the-exe-runtime)
  - [4.1 Use method 1: Use fastdeploy\_init.bat for Configuration (Recommended)](#41--use-method-1use-fastdeploy_initbat-for-configuration-recommended)
    - [4.1.1 fastdeploy\_init.bat User's Manual](#411-fastdeploy_initbat-users-manual)
    - [4.1.2 fastdeploy\_init.bat: View all dll, lib and include paths in the SDK](#412-fastdeploy_initbat-view-all-dll-lib-and-include-paths-in-the-sdk)
    - [4.1.3 fastdeploy\_init.bat: Install all the dlls in the SDK to a specified directory](#413-fastdeploy_initbat-installs-all-the-dlls-in-the-sdk-to-the-specified-directory)
    - [4.1.4 fastdeploy\_init.bat: Configure SDK Environment Variables](#414-fastdeploy_initbat-configures-sdk-environment-variables)
  - [4.2 Use method 2: Modify CMakeLists.txt, One-Line Configuration (Recommended)](#42--use-method-2modify-cmakeliststxt-one-line-of-command-configuration-recommended)
  - [4.3 Use method 3: Set Environment Variables on the Command Line](#43--use-method-3command-line-setting-environment-variables)
  - [4.4 Use method 4: Manually Copy the Dependency Libraries to the Exe Directory](#44-use-method-4manually-copy-the-dependency-library-to-the-exe-directory)

## 1. Environment Dependent

@@ -52,7 +57,7 @@ Please refer to source code compilation: [build_and_install](../build_and_instal

### 2.2 Prepare Model Files and Test Images

Model files and test images can be downloaded from the links below and unzipped:
```text
https://bj.bcebos.com/paddlehub/fastdeploy/ppyoloe_crn_l_300e_coco.tgz # (please unzip it after downloading)
https://gitee.com/paddlepaddle/PaddleDetection/raw/release/2.4/demo/000000014439.jpg
```