products:sbc:edge2:npu:demos:yolov8n [2025/06/24 06:02] (current) louis

~~tag> YOLO NPU Edge2 RK3588~~
====== YOLOv8n OpenCV Edge2 Demo - 2 ======
{{indexmenu_n>
| + | |||
| + | ===== Introduction ===== | ||
| + | |||
YOLOv8n is an object detection model. It localizes each object in an image with a precise bounding box.
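
Bounding-box overlap is the standard way such detections are compared and filtered; the intersection-over-union (IoU) helper below is a hypothetical illustration, not code taken from this demo:

```python
# Hypothetical helper, not part of the Edge2 demo: intersection-over-union
# for two axis-aligned boxes given as (x1, y1, x2, y2) corner coordinates.
def iou(a, b):
    # Corners of the overlap rectangle
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

print(round(iou((0, 0, 10, 10), (5, 5, 15, 15)), 3))  # 0.143
```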
| + | |||
| + | Inference results on Edge2. | ||
| + | |||
| + | {{: | ||
| + | |||
| + | **Inference speed test**: USB camera about **52ms** per frame. MIPI camera about **40ms** per frame. | ||
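
Those per-frame latencies convert directly to throughput; a quick sketch using the figures above:

```python
# Convert the quoted per-frame latencies into frames per second.
def fps(latency_ms: float) -> float:
    return 1000.0 / latency_ms

print(f"USB camera:  {fps(52):.1f} FPS")   # ~19.2 FPS
print(f"MIPI camera: {fps(40):.1f} FPS")   # 25.0 FPS
```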
| + | |||
| + | ===== Train Model ===== | ||
| + | |||
| + | Download YOLOv8 official code [[gh> | ||
```shell
$ git clone https://
```
| + | |||
| + | Refer '' | ||
===== Convert Model =====

==== Get convert tool ====

Download the tool from [[gh>rockchip-linux/
```shell
```
==== Convert ====

After training the model, modify ''head.py'' under ''ultralytics/'' as follows.
```diff head.py
diff --git a/
index 0b02eb3..0a6e43a 100644
```
<WRAP important>
If you installed the ultralytics package with pip, make this modification inside the installed package.
</WRAP>
Create a Python file as follows to export the **onnx** model.

```python export.py
from ultralytics import YOLO
model = YOLO("
```
<WRAP important>
Use [[https://netron.app/ | Netron]] to check that your model output looks like this. If it does not, re-check your ''head.py'' modification.
{{:
</WRAP>
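
As a rough sanity check on the exported graph: YOLOv8 detects at three strides (8, 16 and 32), so a 640×640 input yields 80×80, 40×40 and 20×20 grids. These are general YOLOv8 architecture facts, not values read from this particular model:

```python
# Grid sizes for a 640x640 input at YOLOv8's three detection strides.
size = 640
strides = (8, 16, 32)
grids = [(size // s, size // s) for s in strides]
total = sum(w * h for w, h in grids)
print(grids)   # [(80, 80), (40, 40), (20, 20)]
print(total)   # 8400 prediction cells in total
```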
| + | |||
| + | Enter '' | ||
| + | |||
| + | ```python test.py | ||
# Create RKNN object
rknn = RKNN(verbose=True)
# Load ONNX model
print('
ret = rknn.load_onnx(model='./
if ret != 0:
    print('
# Build model
print('
ret = rknn.build(do_quantization=True,
if ret != 0:
    print('
# Export RKNN model
print('
ret = rknn.export_rknn('./
if ret != 0:
    print('
```
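
The ''do_quantization=True'' build step in ''test.py'' needs a calibration dataset file. In the rknn-toolkit2 examples this is a plain text file listing image paths, one per line; the file name and image below follow the toolkit example's convention and are placeholders for your own calibration images:

```text dataset.txt
./bus.jpg
```

Use images representative of your deployment scenes, since they drive the quantization statistics.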
Run ''test.py'' to generate the rknn model.

```shell
```
==== Get source code ====
Clone the source code from our [[gh>khadas/

```shell
```
=== Picture input demo ===
Put ''yolov8n.rknn'' in edge2-npu/

```shell
# Compile
$ bash build.sh

# Run
$ cd install/
$ ./yolov8n data/
```

=== Camera input demo ===
Put ''yolov8n.rknn'' in edge2-npu/

```shell
# Compile
$ bash build.sh

# Run USB camera
$ cd install/yolov8n_cap
$ ./yolov8n_cap data/

# Run MIPI camera
$ cd install/
$ ./
```
| + | |||
| + | <WRAP info > | ||
| + | '' | ||
| + | </ | ||
| + | |||
| + | === Camera input multithreading demo === | ||
| + | |||
| + | Put '' | ||
| + | |||
```shell
# Compile
$ bash build.sh

# Run USB camera
$ cd install/
$ ./

# Run MIPI camera
$ cd install/
$ ./
```

<WRAP info>
The last number, ''
</WRAP>

<WRAP tip>
If your **YOLOv8n**
</WRAP>