YOLOv7-tiny KSNN Demo - 1

Train the model

Download the official YOLOv7 code from WongKinYiu/yolov7.

$ git clone https://github.com/WongKinYiu/yolov7

Refer to the README.md to create and train a YOLOv7-tiny model.

Convert the model

Get the conversion tool

$ git clone --recursive https://github.com/khadas/aml_npu_sdk.git

The KSNN conversion tool is located under acuity-toolkit/python:

$ cd aml_npu_sdk/acuity-toolkit/python && ls
convert  data  outputs

Convert

After training the model, modify yolov7/models/yolo.py as follows, commenting out the reshape/permute at the end of the IDetect head.

diff --git a/models/yolo.py b/models/yolo.py
index 95a019c..a2e611d 100644
--- a/models/yolo.py
+++ b/models/yolo.py
@@ -144,7 +144,7 @@ class IDetect(nn.Module):
             x[i] = self.m[i](x[i])  # conv
             bs, _, ny, nx = x[i].shape  # x(bs,255,20,20) to x(bs,3,20,20,85)
-            x[i] = x[i].view(bs, self.na, self.no, ny, nx).permute(0, 1, 3, 4, 2).contiguous()
+            # x[i] = x[i].view(bs, self.na, self.no, ny, nx).permute(0, 1, 3, 4, 2).contiguous()
 
             if not self.training:  # inference
                 if self.grid[i].shape[2:4] != x[i].shape[2:4]:
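
Commenting out this line leaves each detection-head output in its raw (bs, na*no, ny, nx) layout, so the equivalent reshape has to be done on the host after inference. Below is a minimal NumPy sketch of that step, assuming the standard YOLOv7 head with 3 anchors per scale and no = num_classes + 5; the function name is only illustrative.

import numpy as np

def restore_head_layout(out, num_anchors=3, num_classes=80):
    # Reshape a raw head output (bs, na*no, ny, nx) into (bs, na, ny, nx, no),
    # mirroring the view/permute commented out in IDetect.forward above.
    no = num_classes + 5                 # x, y, w, h, objectness + class scores
    bs, _, ny, nx = out.shape
    out = out.reshape(bs, num_anchors, no, ny, nx)
    return out.transpose(0, 1, 3, 4, 2)  # -> (bs, na, ny, nx, no)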

Then, run export.py to convert the model to ONNX.

$ python export.py
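
Before running the conversion, you can optionally sanity-check the exported file with the onnx Python package. This step is not required; the file name below matches the one used in the convert command and may differ from what export.py actually wrote.

import onnx

# Load the exported model and run ONNX's structural checker.
model = onnx.load("yolov7_tiny.onnx")
onnx.checker.check_model(model)

# List the graph outputs; with the IDetect change above these should be the
# raw convolution feature maps rather than decoded boxes.
for out in model.graph.output:
    print(out.name, [d.dim_value for d in out.type.tensor_type.shape.dim])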

Enter aml_npu_sdk/acuity-toolkit/python and run the convert command as follows.

$ ./convert --model-name yolov7_tiny \
            --platform onnx \
            --model yolov7_tiny.onnx \
            --mean-values '0 0 0 0.00392156' \
            --quantized-dtype asymmetric_affine \
            --source-files ./data/dataset/dataset0.txt \
            --kboard VIM3 --print-level 0 

If you use a VIM3L, replace VIM3 with VIM3L.
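
For reference, --mean-values '0 0 0 0.00392156' corresponds to zero channel means and a 1/255 scale (inputs normalized to [0, 1]), and --source-files points to a plain text list of calibration images used for quantization. If you need to regenerate that list from your own pictures, a small helper sketch is shown below; the images/ directory is only an example.

import glob

# Write a calibration image list for the converter.
# Assumes the calibration pictures live in ./images; adjust the pattern as needed.
paths = sorted(glob.glob("images/*.jpg"))
with open("data/dataset/dataset0.txt", "w") as f:
    f.write("\n".join(paths) + "\n")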
