
KSNN Usage

This article shows how to use the VIM3 NPU from Python through the KSNN API, with worked examples.

Install KSNN

Get the library and example code: khadas/ksnn

$ git clone --recursive https://github.com/khadas/ksnn.git

Install the KSNN library:

$ cd ksnn/ksnn
$ pip3 install ksnn-1.4-py3-none-any.whl
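
After installation you can quickly check that the package is importable from Python 3 (a sanity check, not part of the official install steps; the KSNN class lives in ksnn.api):

$ python3 -c "from ksnn.api import KSNN; print('KSNN import OK')"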

Usage Example

The demos in the examples directory are grouped into folders by framework.

$ cd ksnn/examples/ && ls
caffe  darknet  keras  onnx  pytorch  tensorflow  tflite

We will use keras/xception.py as the example; the other demos follow the same pattern.

$ cd keras && ls -1
data
libs
models
README.md
xception.py

The run commands and the model-conversion parameters are listed in the README.md file in the same directory.

$ cat README.md
 
# Run
 
$ python3 xception.py --model ./models/VIM3/xception_uint8.nb --library ./libs/libnn_xception_uint8.so --picture data/goldfish_299x299.jpg --level 0
 
 
# Convert
 
# uint8
$ ./convert \
--model-name xception \
--platform keras \
--model /home/yan/yan/Yan/models-zoo/keras/xception/xception.h5 \
--mean-values '127.5 127.5 127.5 0.007843137' \
--quantized-dtype asymmetric_affine \
--source-files ./data/dataset/dataset0.txt \
--kboard VIM3 --print-level 1
 
If you are using a VIM3L, replace `VIM3` with `VIM3L`.
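
The --mean-values string gives the per-channel mean followed by a scale factor. Assuming the common (pixel - mean) * scale preprocessing convention for this option, '127.5 127.5 127.5 0.007843137' maps 8-bit pixel values from [0, 255] to roughly [-1, 1], since 0.007843137 is approximately 1/127.5. A small sketch of that arithmetic, under this assumption:

# Normalization implied by --mean-values '127.5 127.5 127.5 0.007843137',
# assuming the (pixel - mean) * scale convention.
mean, scale = 127.5, 0.007843137      # scale is approximately 1 / 127.5
for pixel in (0, 127.5, 255):
    print(pixel, '->', (pixel - mean) * scale)   # prints ~ -1.0, 0.0, ~1.0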

Run xception.py:

$ python3 xception.py --model ./models/VIM3/xception_uint8.nb --library ./libs/libnn_xception_uint8.so --picture data/goldfish_299x299.jpg --level 0
 |---+ KSNN Version: v1.4 +---| 
Start init neural network ...
Done.
Get input data ...
Done
Start inference ...
Done. inference time:  0.07830595970153809
----Xception----
-----TOP 5-----
[1]: 0.99609375
[0]: 0.0009250640869140625
[391]: 0.00019299983978271484
[29]: 0.00017976760864257812
[124]: 0.00016736984252929688
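
The indices printed under TOP 5 are class indices into the model's 1000-way ImageNet output and the values are the corresponding scores; index 1 is goldfish, which matches the input picture. A minimal sketch of how such a top-5 listing can be produced from a 1-D score vector (pure NumPy, with a made-up array standing in for the real network output returned by the inference call in xception.py):

import numpy as np

# Made-up 1000-way score vector; in xception.py the real values come
# from the KSNN inference call.
scores = np.zeros(1000, dtype=np.float32)
scores[1] = 0.996        # goldfish
scores[0] = 0.0009       # tench

top5 = np.argsort(scores)[::-1][:5]   # indices of the five largest scores
print('-----TOP 5-----')
for idx in top5:
    print('[%d]: %s' % (idx, scores[idx]))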

The --level parameter controls how much information is printed. The following command sets the print level to the highest value, 2.

$ python3 xception.py --model ./models/VIM3/xception_uint8.nb --library ./libs/libnn_xception_uint8.so --picture data/goldfish_299x299.jpg --level 2
 |---+ KSNN Version: v1.4 +---| 
Start init neural network ...
#productname=VIPNano-QI, pid=0x88
Create Neural Network: 47ms or 47458us
Done.
Get input data ...
Done
Start inference ...
Start run graph [1] times...
generate command buffer, total device count=1, core count per-device: 1, 
current device id=0, AXI SRAM base address=0xff000000
---------------------------Begin VerifyTiling -------------------------
AXI-SRAM = 1048320 Bytes VIP-SRAM = 522240 Bytes SWTILING_PHASE_FEATURES[1, 1, 0]
  0 NBG [(   0    0    0 0,        0, 0x(nil)(0x(nil), 0x(nil)) ->    0    0    0 0,        0, 0x(nil)(0x(nil), 0x(nil))) k(0 0    0,        0) pad(0 0) pool(0 0, 0 0)]
 
 id IN [ x  y  w   h ]   OUT  [ x  y  w  h ] (tx, ty, kpc) (ic, kc, kc/ks, ks/eks, kernel_type) NNT(in, out)
 
 id | opid IN [ x  y  w   h ]   OUT  [ x  y  w  h ] (tx, ty, kpc) (ic, kc, kc/ks, ks/eks, kernel_type) NNT(in, out)
  0 |   0 NBG DD 0x00000000 [   0    0        0        0] -> DD 0x00000000 [   0    0        0        0] (  0,   0,   0) (       0,        0, 0.000000%, 0.000000%, NONE) (       0,        0)
 
PreLoadWeightBiases = 1048320  100.000000%
---------------------------End VerifyTiling -------------------------
layer_id: 0 layer name:network_binary_graph operation[0]:unkown operation type target:unkown operation target.
uid: 0
abs_op_id: 0
execution time:             97432 us
[     1] TOTAL_READ_BANDWIDTH  (MByte): 135.867976
[     2] TOTAL_WRITE_BANDWIDTH (MByte): 74.167511
[     3] AXI_READ_BANDWIDTH  (MByte): 61.521023
[     4] AXI_WRITE_BANDWIDTH (MByte): 50.982863
[     5] DDR_READ_BANDWIDTH (MByte): 74.346954
[     6] DDR_WRITE_BANDWIDTH (MByte): 23.184648
[     7] GPUTOTALCYCLES: 77303691
[     8] GPUIDLECYCLES: 22877687
VPC_ELAPSETIME: 97775
*********
Run the 1 time: 100.00ms or 100247.00us
vxProcessGraph execution time:
Total   100.00ms or 100303.00us
Average 100.30ms or 100303.00us
Done. inference time:  0.11749053001403809
----Xception----
-----TOP 5-----
[1]: 0.99609375
[0]: 0.0009250640869140625
[391]: 0.00019299983978271484
[29]: 0.00017976760864257812
[124]: 0.00016736984252929688

At this level, all of the detailed graph, timing, and bandwidth information is printed.

Camera Demo

The demos that currently support a camera input include the YOLO series and OpenPose. Taking YOLOv3 as an example:

$ cd ksnn/examples/darknet
$ python3 yolov3-cap.py --model ./models/VIM3/yolov3_uint8.nb --library ./libs/libnn_yolov3_uint8.so --device X

Replace X with the index of your camera device node (for example, 1 if your camera is /dev/video1).
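
The -cap demos follow the usual capture-and-infer loop: grab a frame from the camera, run it through the network, draw the results, and repeat. A minimal sketch of that loop (OpenCV only; run_inference_and_draw is a hypothetical placeholder for the KSNN inference and post-processing that yolov3-cap.py performs on each frame):

import cv2

def run_inference_and_draw(frame):
    # Hypothetical placeholder: in yolov3-cap.py this is where the frame
    # is fed to the KSNN model and the detected boxes are drawn on it.
    return frame

cap = cv2.VideoCapture(0)       # 0 -> /dev/video0; use your camera index
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow('yolov3', run_inference_and_draw(frame))
    if cv2.waitKey(1) & 0xFF == ord('q'):   # press q to quit
        break
cap.release()
cv2.destroyAllWindows()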

