

Demo1 - Yolov7 Tiny

Train Model

Download the official yolov7 code and follow its README.md to train a yolov7_tiny model.

$ git clone https://github.com/WongKinYiu/yolov7.git
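
As a rough sketch, training yolov7_tiny on a custom dataset usually looks like the command below. The dataset YAML, run name and flags here are assumptions, so check the yolov7 README for the options that match your revision.

# sketch only: train yolov7_tiny (paths and flags are examples, see the yolov7 README)
$ cd yolov7
$ pip3 install -r requirements.txt
$ python3 train.py --img-size 640 640 --batch-size 16 --epochs 100 \
      --data data/custom.yaml --cfg cfg/training/yolov7-tiny.yaml \
      --weights '' --name yolov7_tiny_custom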

Convert Model

Build virtual environment

The SDK only supports Python 3.6 or Python 3.8. Here is an example of creating a virtual environment for Python 3.8.

Install python packages.

$ sudo apt update
$ sudo apt install python3-dev python3-numpy

Follow the official conda installation documentation to install conda.
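
For example, a minimal Miniconda setup on the x86_64 Linux PC used for conversion could look like this (the installer URL assumes the latest Miniconda3 release):

# sketch: install Miniconda, then reload the shell so the conda command is available
$ wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
$ bash Miniconda3-latest-Linux-x86_64.sh
$ source ~/.bashrc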

Then create a virtual environment.

$ conda create -n npu-env python=3.8
$ conda activate npu-env     #activate
$ conda deactivate           #deactivate
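
To confirm the environment uses the expected interpreter, check the Python version while the environment is active:

$ conda activate npu-env
$ python --version      # should report Python 3.8.x
$ which python          # should point inside the npu-env environment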

Get the conversion tool

Download the tool from the Rockchip GitHub repository.

$ git clone https://github.com/rockchip-linux/rknn-toolkit2.git
$ cd rknn-toolkit2
$ git checkout 9ad79343fae625f4910242e370035fcbc40cc31a

Install the dependencies and the RKNN Toolkit2 package.

$ sudo apt-get install python3 python3-dev python3-pip
$ sudo apt-get install libxslt1-dev zlib1g-dev libglib2.0 libsm6 libgl1-mesa-glx libprotobuf-dev gcc cmake
$ pip3 install -r doc/requirements_cp38-*.txt
$ pip3 install packages/rknn_toolkit2-*-cp38-cp38-linux_x86_64.whl
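
A quick way to confirm the toolkit installed correctly is to import it inside the npu-env environment (this only checks the import, not a full conversion):

$ python3 -c "from rknn.api import RKNN; print('rknn-toolkit2 OK')"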

Convert

After training the model, run export.py to convert the model from PyTorch (.pt) to ONNX format.
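
As a sketch, the export step typically looks like the command below; the exact flags depend on your yolov7 revision, so confirm them against export.py in the yolov7 repository.

# sketch only: export the trained weights to ONNX (flags may differ by yolov7 version)
$ cd yolov7
$ python3 export.py --weights yolov7_tiny.pt --img-size 640 640 --simplify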

Enter the rknn-toolkit2/examples/onnx/yolov5 directory, place the exported yolov7_tiny.onnx there, and modify test.py as follows.

test.py
# Create RKNN object
rknn = RKNN(verbose=True)
 
# pre-process config
print('--> Config model')
rknn.config(mean_values=[[0, 0, 0]], std_values=[[255, 255, 255]], target_platform='rk3588')
print('done')
 
# Load ONNX model
print('--> Loading model')
ret = rknn.load_onnx(model='./yolov7_tiny.onnx')
if ret != 0:
    print('Load model failed!')
    exit(ret)
print('done')
 
# Build model
print('--> Building model')
ret = rknn.build(do_quantization=True, dataset='./dataset.txt')
if ret != 0:
    print('Build model failed!')
    exit(ret)
print('done')
 
# Export RKNN model
print('--> Export rknn model')
ret = rknn.export_rknn('./yolov7_tiny.rknn')
if ret != 0:
    print('Export rknn model failed!')
    exit(ret)
print('done')
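
Note that dataset.txt is the quantization dataset list used by rknn.build(): a plain text file with one calibration image path per line. For example, using the sample image that ships with the yolov5 example directory:

# dataset.txt lists the calibration images used for quantization, one path per line
$ cat dataset.txt
bus.jpg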

Run test.py to generate the RKNN model.

$ python3 test.py

Run NPU

Get source code

Clone the source code from our edge2-npu repository.

$ git clone https://github.com/khadas/edge2-npu

Install dependencies

$ sudo apt update
$ sudo apt install cmake libopencv-dev

Compile and run

Picture input demo

Put yolov7_tiny.rknn in edge2-npu/C++/yolov7_tiny/data/model, then compile and run from the edge2-npu/C++/yolov7_tiny directory.

# compile
$ bash build.sh
 
# run
$ cd install/yolov7_tiny
$ ./yolov7_tiny data/model/yolov7_tiny.rknn data/img/bus.jpg

Camera input demo

Put yolov7_tiny.rknn in edge2-npu/C++/yolov7_tiny_cap/data/model, then compile and run from the edge2-npu/C++/yolov7_tiny_cap directory.

# compile
$ bash build.sh
 
# run
$ cd install/yolov7_tiny_cap
$ ./yolov7_tiny_cap data/model/yolov7_tiny.rknn 33

Here, 33 is the camera device node index (for example, /dev/video33).
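
If you are not sure which index your camera uses, list the video device nodes on Edge2 first (v4l2-ctl comes from the v4l-utils package):

# find the camera's device node index
$ ls /dev/video*
$ sudo apt install v4l-utils
$ v4l2-ctl --list-devices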

If the classes of your yolov7_tiny model are not the same as COCO's, change ''data/coco_80_labels_list.txt'' and ''OBJ_CLASS_NUM'' in ''include/postprocess.h''.
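
For example, for a hypothetical 2-class model the changes could look like this (class names and count below are placeholders, and it is assumed that OBJ_CLASS_NUM is a macro defined in include/postprocess.h):

# sketch: labels file with one class name per line, in training order (names are examples)
$ printf "person\ncar\n" > data/coco_80_labels_list.txt
# then edit include/postprocess.h so the class count matches, e.g.:
#   #define OBJ_CLASS_NUM 2
# and rebuild
$ bash build.sh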
