====== NPU Prebuilt Demo Usage ======
Prebuilt example demos for interacting with the Amlogic NPU using OpenCV4.

===== Install OpenCV4 =====

Update your system and install the OpenCV4 packages:

```shell
$ sudo apt update
$ sudo apt install libopencv-dev python3-opencv
```

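To confirm both packages landed, a quick sanity check with ''dpkg'' (a sketch; the package names are the ones installed above):

```shell
# Report the install status of the two OpenCV packages
for pkg in libopencv-dev python3-opencv; do
  if dpkg -s "$pkg" >/dev/null 2>&1; then
    echo "$pkg: installed"
  else
    echo "$pkg: not installed"
  fi
done
```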
===== Get NPU Demo =====

The NPU demo is not installed on the board by default; you need to download it from GitHub first.

Get the demo source: [[gh>khadas/aml_npu_demo_binaries]]

```shell
$ git clone --recursive https://github.com/khadas/aml_npu_demo_binaries
```
The NPU demo contains three examples:

  - ''detect_demo'': dynamic recognition from a camera
  - ''detect_demo_picture'': recognition on a still picture
  - ''inceptionv3'': image classification with the Inception v3 model

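After cloning, the presence of the three example directories can be sanity-checked (a small sketch; the directory names are the ones used throughout this page):

```shell
# Confirm the three example directories exist after cloning
for d in detect_demo detect_demo_picture inceptionv3; do
  [ -d "aml_npu_demo_binaries/$d" ] && echo "$d: ok" || echo "$d: missing"
done
```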
===== Inception Model =====

The inception model does not have any library dependencies and can be used as is.

Enter the ''inceptionv3'' directory and check its contents:

```shell
$ cd aml_npu_demo_binaries/inceptionv3
$ ls
dog_299x299.jpg  imagenet_slim_labels.txt  VIM3  VIM3L
```
''imagenet_slim_labels.txt'' is the label file; after a result is identified, the label corresponding to each class index can be looked up in this file.

<WRAP tip >
Depending on your board, enter the ''VIM3'' directory for VIM3 or the ''VIM3L'' directory for VIM3L. VIM3 is used as the example here.
</WRAP>
```shell
$ ls aml_npu_demo_binaries/inceptionv3/VIM3
inceptionv3  run.sh  ...
$ cd aml_npu_demo_binaries/inceptionv3/VIM3
$ ./run.sh
Create Neural Network: 59ms or 59022us
Verify...
Verify Graph: 0ms or 739us
Start run graph [1] times...
Run the 1 time: 20.00ms or 20497.00us
vxProcessGraph execution time:
Total   20.54ms or 20540.00us
Average 20.54ms or 20540.00us
 --- Top5 ---
  2: 0.833984
795: 0.009102
974: 0.003592
408: 0.002207
393: 0.002111
```
<WRAP info >
The ''--- Top5 ---'' section lists the five most likely class indices with their probabilities.
By querying ''imagenet_slim_labels.txt'' you can find the label that each index corresponds to.
</WRAP>
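Looking indices up by hand gets tedious; a small helper can do it (a sketch — ''label_of'' is a hypothetical name, and it assumes the first line of the label file is a background entry, so index ''n'' maps to line ''n + 1''):

```shell
# Hypothetical helper: map a Top5 class index to its label.
# Assumes line 1 of the label file is a background entry (index n -> line n+1).
label_of() {  # usage: label_of <index> <label-file>
  awk -v n="$1" 'NR == n + 1 { print; exit }' "$2"
}
# e.g. label_of 2 imagenet_slim_labels.txt
```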
===== Yolo Series Model =====

==== Install and Uninstall Libraries ====

The yolo series models need their libraries installed into the system; both the ''detect_demo'' and ''detect_demo_picture'' examples depend on them.
Follow the steps below to install or uninstall the libraries.

Install libraries:
```shell
$ cd aml_npu_demo_binaries/detect_demo_picture
$ sudo ./INSTALL
```
Uninstall libraries:

```shell
$ cd aml_npu_demo_binaries/detect_demo_picture
$ sudo ./UNINSTALL
```
==== Type Parameter Description ====

<WRAP important >
The ''type'' parameter selects which yolo model to run:
</WRAP>

```shell
0 : yoloface model
1 : yolov2 model
2 : yolov3 model
3 : yolov3_tiny model
4 : yolov4 model
```
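When scripting the demos, the table above can be wrapped in a small helper so models are referred to by name (''type_of'' is a hypothetical name, not part of the demo):

```shell
# Map a yolo model name to its numeric type parameter (table above)
type_of() {  # usage: type_of <model-name>
  case "$1" in
    yoloface)    echo 0 ;;
    yolov2)      echo 1 ;;
    yolov3)      echo 2 ;;
    yolov3_tiny) echo 3 ;;
    yolov4)      echo 4 ;;
    *) echo "unknown model: $1" >&2; return 1 ;;
  esac
}
```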
==== Operating Environment for NPU Demo ====

The NPU demo can run in X11 desktop mode or framebuffer mode; just select the corresponding binary:

  - Demos with ''fb'' in the name run in framebuffer mode.
  - Demos with ''x11'' in the name run in X11 mode.
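The choice between the two can be automated by checking whether an X display is available (a sketch; ''detect_demo_x11''/''detect_demo_fb'' stand in for whichever demo pair you are running):

```shell
# Pick the x11 build when an X display is available, else the framebuffer build
pick_demo() {  # usage: pick_demo <value-of-DISPLAY>
  if [ -n "$1" ]; then
    echo detect_demo_x11
  else
    echo detect_demo_fb
  fi
}
pick_demo "$DISPLAY"
```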
==== Demo Examples ====

=== detect_demo_picture ===

```shell
$ cd aml_npu_demo_binaries/detect_demo_picture
$ ls
1080p.bmp  detect_demo_x11  ...
```

== Run ==

Command format for picture recognition:
```shell
$ cd aml_npu_demo_binaries/detect_demo_picture
$ ./detect_demo_x11 <type> <picture>
```
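To compare all supported models on the sample picture, the invocations can be generated in a loop (a dry-run sketch that only prints the commands; drop the ''echo'' to execute them):

```shell
# Dry run: print the detect_demo_x11 invocation for every model type
for t in 0 1 2 3 4; do
  echo "./detect_demo_x11 $t 1080p.bmp"
done
```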
Here is an example of using OpenCV4 to call the ''detect_demo_x11'' demo with the yolov3 model (type ''2'') on the sample picture:

```shell
$ cd aml_npu_demo_binaries/detect_demo_picture
$ ./detect_demo_x11 2 1080p.bmp
```

The results of the operation are as follows.
{{:
=== detect_demo ===

<WRAP tip >
Use the ''usb'' demo variant with a USB camera, and the ''mipi'' variant with a MIPI camera.
</WRAP>
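Before running the camera demo, you can check which video capture nodes the system exposes (device names vary by board and camera):

```shell
# List V4L2 capture nodes to find your camera device
ls /dev/video* 2>/dev/null || echo "no video devices found"
```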
== Run ==

Command format for dynamic recognition from the camera:
```shell
$ cd aml_npu_demo_binaries/detect_demo
$ ./detect_demo_x11_usb <type>
```

Here is an example of using OpenCV4 to call the ''detect_demo'' USB variant with the yolov3 model (type ''2''). Choose the binary matching your camera and display mode; ''detect_demo_x11_usb'' is used here as an example:

```shell
$ cd aml_npu_demo_binaries/detect_demo
$ ./detect_demo_x11_usb 2
```
<WRAP info >
After the camera turns on, the recognition results are displayed on the screen.
</WRAP>
{{: