[Doc]Add English version of documents in docs/cn and api/vision_results (#931)

* First commit

* Add one missed translation

* deleted:    docs/en/quantize.md

* Update one translation

* Update en version

* Update one translation in code

* Standardize one phrasing

* Standardize one phrasing

* Update some en versions

* Fix a grammar problem

* Update en version for api/vision_results

* Merge branch 'develop' of https://github.com/charl-u/FastDeploy into develop

* Point the links in the README in vision_results/ to the en documents

* Modify a title

* Add link to serving/docs/

* Finish translation of demo.md
This commit is contained in:
charl-u
2022-12-22 18:15:01 +08:00
committed by GitHub
parent ac255b8ab8
commit 02eab973ce
80 changed files with 1430 additions and 53 deletions

View File

@@ -1,3 +1,5 @@
English | [中文](../../cn/build_and_install/a311d.md)
# How to Build A311D Deployment Environment
FastDeploy supports AI deployment on the Amlogic A311D SoC based on the Paddle Lite backend. For more detailed information, please refer to: [Paddle Lite Deployment Example](https://www.paddlepaddle.org.cn/lite/develop/demo_guides/verisilicon_timvx.html).

View File

@@ -1,3 +1,5 @@
English | [中文](../../cn/build_and_install/android.md)
# How to Build FastDeploy Android C++ SDK
FastDeploy supports the Paddle Lite backend on Android. It supports both the armeabi-v7a and arm64-v8a CPU architectures, and supports FP16 precision inference on the armv8.2 architecture. The relevant compilation options are described as follows:

View File

@@ -1,4 +1,4 @@
English | [中文](../../cn/build_and_install/cpu.md)
# How to Build CPU Deployment Environment

View File

@@ -1,4 +1,5 @@
English | [中文](../../cn/build_and_install/download_prebuilt_libraries.md)
# How to Install Prebuilt Library
FastDeploy provides pre-built libraries that developers can download and install directly. FastDeploy also offers simple compilation options so that developers can build FastDeploy according to their own needs.
@@ -92,7 +93,7 @@ Install the released version (latest is 1.0.1 for now; Android is 1.0.1)
| Mac OSX x64 | [fastdeploy-osx-x86_64-1.0.1.tgz](https://bj.bcebos.com/fastdeploy/release/cpp/fastdeploy-osx-x86_64-1.0.1.tgz) | clang++ 10.0.0|
| Mac OSX arm64 | [fastdeploy-osx-arm64-1.0.1.tgz](https://bj.bcebos.com/fastdeploy/release/cpp/fastdeploy-osx-arm64-1.0.1.tgz) | clang++ 13.0.0 |
| Linux aarch64 | [fastdeploy-linux-aarch64-1.0.1.tgz](https://bj.bcebos.com/fastdeploy/release/cpp/fastdeploy-linux-aarch64-1.0.1.tgz) | gcc 6.3 |
| Android armv7&v8 | [fastdeploy-android-1.0.0-shared.tgz](https://bj.bcebos.com/fastdeploy/release/android/fastdeploy-android-1.0.0-shared.tgz)| NDK 25, clang++, support arm64-v8aarmeabi-v7a |
| Android armv7&v8 | [fastdeploy-android-1.0.0-shared.tgz](https://bj.bcebos.com/fastdeploy/release/android/fastdeploy-android-1.0.0-shared.tgz)| NDK 25, clang++, support arm64-v8a and armeabi-v7a |
## Java SDK
@@ -109,6 +110,6 @@ Install the Develop version (Nightly build)
| Linux x64 | [fastdeploy-linux-x64-0.0.0.tgz](https://fastdeploy.bj.bcebos.com/dev/cpp/fastdeploy-linux-x64-0.0.0.tgz) | g++ 8.2 |
| Windows x64 | [fastdeploy-win-x64-0.0.0.zip](https://fastdeploy.bj.bcebos.com/dev/cpp/fastdeploy-win-x64-0.0.0.zip) | Visual Studio 16 2019 |
| Mac OSX x64 | [fastdeploy-osx-arm64-0.0.0.tgz](https://bj.bcebos.com/fastdeploy/dev/cpp/fastdeploy-osx-arm64-0.0.0.tgz) | - |
| Mac OSX arm64 | [fastdeploy-osx-arm64-0.0.0.tgz](https://fastdeploy.bj.bcebos.com/dev/cpp/fastdeploy-osx-arm64-0.0.0.tgz) | clang++ 13.0.0编译产出 |
| Mac OSX arm64 | [fastdeploy-osx-arm64-0.0.0.tgz](https://fastdeploy.bj.bcebos.com/dev/cpp/fastdeploy-osx-arm64-0.0.0.tgz) | built with clang++ 13.0.0 |
| Linux aarch64 | - | - |
| Android armv7&v8 | - | - |
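As a rough usage sketch for one of the prebuilt C++ packages listed above (the choice of package and the `FASTDEPLOY_INSTALL_DIR` variable follow the convention used in FastDeploy's example projects; treat the exact paths as assumptions):
```bash
# Download and unpack one of the prebuilt C++ SDKs from the tables above
# (the Linux aarch64 1.0.1 package is used here as an example).
wget https://bj.bcebos.com/fastdeploy/release/cpp/fastdeploy-linux-aarch64-1.0.1.tgz
tar -xzf fastdeploy-linux-aarch64-1.0.1.tgz
# Point your own CMake project at the unpacked SDK (assuming the archive
# extracts to a directory of the same name) and build it.
cmake -S . -B build -DFASTDEPLOY_INSTALL_DIR=${PWD}/fastdeploy-linux-aarch64-1.0.1
cmake --build build -j4
```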

View File

@@ -1,3 +1,4 @@
English | [中文](../../cn/build_and_install/gpu.md)
# How to Build GPU Deployment Environment

View File

@@ -1,3 +1,4 @@
English | [中文](../../cn/build_and_install/ipu.md)
# How to Build IPU Deployment Environment

View File

@@ -1,3 +1,4 @@
English | [中文](../../cn/build_and_install/jetson.md)
# How to Build FastDeploy Library on Nvidia Jetson Platform

View File

@@ -0,0 +1,106 @@
English | [中文](../../cn/build_and_install/rknpu2.md)
# How to Build RKNPU2 Deployment Environment
## Notes
FastDeploy currently has initial support for RKNPU2 deployment. If you run into any bugs while using it, please report an issue to give us feedback.
## Introduction
Currently, the following backend engines on the RK platform are supported:
| Backend | Platform | Model format supported | Description |
|:------------------|:---------------------|:-------|:-------------------------------------------|
| ONNX&nbsp;Runtime | RK356X <br> RK3588 | ONNX | The compile switch is controlled by setting `ENABLE_ORT_BACKEND` to ON or OFF (default) |
| RKNPU2 | RK356X <br> RK3588 | RKNN | The compile switch is controlled by setting `ENABLE_RKNPU2_BACKEND` to ON or OFF (default) |
## How to Build and Install C++ SDK
The RKNPU2 SDK can only be compiled on Linux; all of the following steps are performed on Linux.
### Update the driver and install the compilation environment
Before running the program, we need to install the latest RKNPU driver, which has currently been updated to 1.4.0. To simplify the installation, here is a quick install script.
**Method 1: Install via script**
```bash
# Download and unzip rknpu2_device_install_1.4.0
wget https://bj.bcebos.com/fastdeploy/third_libs/rknpu2_device_install_1.4.0.zip
unzip rknpu2_device_install_1.4.0.zip
cd rknpu2_device_install_1.4.0
# For RK3588
sudo bash rknn_install_rk3588.sh
# For RK356X
sudo bash rknn_install_rk356X.sh
```
**Method 2: Install via gitee**
```bash
# Install necessary packages
sudo apt update -y
sudo apt install -y python3
sudo apt install -y python3-dev
sudo apt install -y python3-pip
sudo apt install -y gcc
sudo apt install -y python3-opencv
sudo apt install -y python3-numpy
sudo apt install -y cmake
# Download rknpu2: run only the block below that matches your board
# For RK3588
git clone https://gitee.com/mirrors_rockchip-linux/rknpu2.git
sudo cp ./rknpu2/runtime/RK3588/Linux/librknn_api/aarch64/* /usr/lib
sudo cp ./rknpu2/runtime/RK3588/Linux/rknn_server/aarch64/usr/bin/* /usr/bin/
# For RK356X
git clone https://gitee.com/mirrors_rockchip-linux/rknpu2.git
sudo cp ./rknpu2/runtime/RK356X/Linux/librknn_api/aarch64/* /usr/lib
sudo cp ./rknpu2/runtime/RK356X/Linux/rknn_server/aarch64/usr/bin/* /usr/bin/
```
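Whichever method is used, here is a quick sanity check (a sketch; exact file names can vary between runtime versions) that the runtime libraries and the rknn_server binary ended up where they are expected:
```bash
# The RKNPU2 runtime libraries copied above should now be visible under /usr/lib,
# and the rknn_server binary under /usr/bin.
ls /usr/lib | grep -i rknn
ls /usr/bin | grep -i rknn
```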
### Compile C++ SDK
```bash
git clone https://github.com/PaddlePaddle/FastDeploy.git
cd FastDeploy
mkdir build && cd build
# Only a few key configurations are introduced here; see README.md for details.
# -DENABLE_ORT_BACKEND: Whether to enable the ONNX Runtime backend, default OFF
# -DENABLE_RKNPU2_BACKEND: Whether to enable the RKNPU2 backend, default OFF
# -DRKNN2_TARGET_SOC: The target SoC of the SDK; enter RK356X or RK3588 (case sensitive)
cmake .. -DENABLE_ORT_BACKEND=ON \
-DENABLE_RKNPU2_BACKEND=ON \
-DENABLE_VISION=ON \
-DRKNN2_TARGET_SOC=RK3588 \
-DCMAKE_INSTALL_PREFIX=${PWD}/fastdeploy-0.0.3
make -j8
make install
```
### Compile Python SDK
Building the Python package depends on `wheel`, so please run `pip install wheel` before compiling.
```bash
git clone https://github.com/PaddlePaddle/FastDeploy.git
cd FastDeploy
cd python
export ENABLE_ORT_BACKEND=ON
export ENABLE_RKNPU2_BACKEND=ON
export ENABLE_VISION=ON
export RKNN2_TARGET_SOC=RK3588
python3 setup.py build
python3 setup.py bdist_wheel
cd dist
pip3 install fastdeploy_python-0.0.0-cp39-cp39-linux_aarch64.whl
```
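A minimal sanity check that the freshly built wheel is importable (assuming the install above succeeded; the exact wheel filename depends on your Python version):
```bash
# If the import finishes without an error, the Python SDK is installed correctly.
python3 -c "import fastdeploy"
```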
## Model Deployment
Please refer to [RKNPU2 Model Deployment](../faq/rknpu2/rknpu2.md).

View File

@@ -1,3 +1,5 @@
English | [中文](../../cn/build_and_install/rv1126.md)
# How to Build RV1126 Deployment Environment
FastDeploy supports AI deployment on the Rockchip RV1126 SoC based on the Paddle Lite backend. For more detailed information, please refer to: [Paddle Lite Deployment Example](https://www.paddlepaddle.org.cn/lite/develop/demo_guides/verisilicon_timvx.html).

View File

@@ -0,0 +1,16 @@
English | [中文](../../cn/build_and_install/third_libraries.md)
# Third-Party Library Dependencies
FastDeploy relies on the following third-party libraries, according to the compile options chosen.
- OpenCV: The pre-compiled OpenCV 3.4.16 library will be downloaded automatically when ENABLE_VISION=ON.
- ONNX Runtime: The ONNX Runtime library will be downloaded automatically when ENABLE_ORT_BACKEND=ON.
- OpenVINO: The OpenVINO library will be downloaded automatically when ENABLE_OPENVINO_BACKEND=ON.
You can also use third-party libraries already installed in your environment by setting the following switches, as sketched after this list.
- OPENCV_DIRECTORY: Specify the OpenCV path in your environment, e.g. `-DOPENCV_DIRECTORY=/usr/lib/aarch64-linux-gnu/cmake/opencv4/`
- ORT_DIRECTORY: Specify the ONNX Runtime path in your environment, e.g. `-DORT_DIRECTORY=/download/onnxruntime-linux-x64-1.0.0`
- OPENVINO_DIRECTORY: Specify the OpenVINO path in your environment, e.g. `-DOPENVINO_DIRECTORY=/download/openvino`
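As a sketch of how these switches combine in practice (the paths below are the illustrative ones from the list above, not real installations):
```bash
# Build FastDeploy against an OpenCV and ONNX Runtime that already exist in the
# environment, instead of the automatically downloaded copies.
cmake .. -DENABLE_VISION=ON \
         -DENABLE_ORT_BACKEND=ON \
         -DOPENCV_DIRECTORY=/usr/lib/aarch64-linux-gnu/cmake/opencv4/ \
         -DORT_DIRECTORY=/download/onnxruntime-linux-x64-1.0.0
```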

View File

@@ -1,3 +1,5 @@
English | [中文](../../cn/build_and_install/xpu.md)
# How to Build KunlunXin XPU Deployment Environment
FastDeploy supports AI deployment on KunlunXin XPU based on the Paddle Lite backend. For more detailed information, please refer to: [Paddle Lite Deployment Example](https://www.paddlepaddle.org.cn/lite/develop/demo_guides/kunlunxin_xpu.html#xpu).