====== KSNN Usage ======
  
This article shows VIM3 NPU usage examples through the **KSNN** Python API.
  
<WRAP help>
**KSNN** is [[https://github.com/khadas/ksnn#readme|Khadas Software Neural Network]]
</WRAP>
  
===== Install KSNN =====
  
Get the library and example code: [[gh>khadas/ksnn]]
  
```shell
$ git clone --recursive https://github.com/khadas/ksnn.git
```
  
```shell
$ cd ksnn/ksnn
$ pip3 install ksnn-1.4-py3-none-any.whl
```
  
  
Take ''keras'' and ''xception.py'' as an example; the other demos are similar.
  
```shell
$ cd keras && ls -1
data
libs
models
README.md
xception.py
```
  
The running commands and conversion parameters are in the ''README.md'' file in the same directory.
  
```shell
~/ksnn/examples/keras$ cat README.md

Run

$ python3 xception.py --model ./models/VIM3/xception_uint8.nb --library ./libs/libnn_xception_uint8.so --picture data/goldfish_299x299.jpg --level 0

# Convert

# uint8
$ ./convert \
--model-name xception \
--platform keras \
--model /home/yan/yan/Yan/models-zoo/keras/xception/xception.h5 \
--mean-values '127.5 127.5 127.5 0.007843137' \
--quantized-dtype asymmetric_affine \
--source-files ./data/dataset/dataset0.txt \
--kboard VIM3 --print-level 1
```
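The ''--mean-values'' string above packs three per-channel means plus a scale factor. Assuming the usual convention that the converter normalizes each input pixel as ''(pixel - mean) * scale'' (check the KSNN conversion docs for your version), the values ''127.5 127.5 127.5 0.007843137'' map 8-bit pixels from [0, 255] to roughly [-1, 1]. A minimal sketch of that mapping:

```python
# Sketch of the input normalization implied by the convert parameters above.
# Assumption: with --mean-values 'm0 m1 m2 scale', the toolchain computes
# (pixel - mean) * scale per channel.

MEANS = (127.5, 127.5, 127.5)   # per-channel means (R, G, B)
SCALE = 0.007843137             # ~= 1/127.5, maps [0, 255] to roughly [-1, 1]

def normalize_pixel(rgb):
    """Normalize one (R, G, B) uint8 pixel the way the converted model expects."""
    return tuple((v - m) * SCALE for v, m in zip(rgb, MEANS))

print(normalize_pixel((255, 127, 0)))  # channel extremes land near +1, ~0, -1
```

Getting these values wrong at conversion time silently degrades accuracy, so it is worth double-checking them against the preprocessing used when the model was trained.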
  
Run ''xception.py'':
  
```shell
$ python3 xception.py --model ./models/VIM3/xception_uint8.nb --library ./libs/libnn_xception_uint8.so --picture data/goldfish_299x299.jpg --level 0
 |---KSNN Version: v1.+---|
Start init neural network ...
Done.
Get input data ...
Done
Start inference ...
Done. inference time:  0.07830595970153809
----Xception----
-----TOP 5-----
[1]: 0.99609375
[0]: 0.0009250640869140625
[391]: 0.00019299983978271484
[29]: 0.00017976760864257812
[124]: 0.00016736984252929688
```
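The ''TOP 5'' block lists class indices with their scores, best first. For reference, a plain-Python sketch of that ranking step (this is not the KSNN API; the score vector below is made up for illustration):

```python
def top_k(scores, k=5):
    """Return (class_index, score) pairs for the k highest scores, best first."""
    ranked = sorted(enumerate(scores), key=lambda pair: pair[1], reverse=True)
    return ranked[:k]

# Made-up score vector standing in for a model's output probabilities.
scores = [0.0009, 0.996, 0.0002, 0.0004, 0.0001, 0.0003]
for idx, score in top_k(scores):
    print(f"[{idx}]: {score}")
```

The index on the left is the class id; look it up in the labels file that matches the dataset the model was trained on (ImageNet for Xception) to get a human-readable name.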
  
  
```shell
$ python3 xception.py --model ./models/VIM3/xception_uint8.nb --library ./libs/libnn_xception_uint8.so --picture data/goldfish_299x299.jpg --level 2
 |---KSNN Version: v1.+---|
Start init neural network ...
#productname=VIPNano-QI, pid=0x88
Create Neural Network: 47ms or 47458us
Done.
Get input data ...
Done
Start inference ...
Start run graph [1] times...
current device id=0, AXI SRAM base address=0xff000000
---------------------------Begin VerifyTiling -------------------------
AXI-SRAM = 1048320 Bytes VIP-SRAM = 522240 Bytes SWTILING_PHASE_FEATURES[1, 1, 0]
  0 NBG [(      0    0 0,        0, 0x(nil)(0x(nil), 0x(nil)) ->    0    0    0 0,        0, 0x(nil)(0x(nil), 0x(nil))) k(0 0    0,        0) pad(0 0) pool(0 0, 0 0)]

 id | opid IN [ x  y  w   h ]   OUT  [ x  y  w  h ] (tx, ty, kpc) (ic, kc, kc/ks, ks/eks, kernel_type) NNT(in, out)
  0 |   0 NBG DD 0x00000000 [      0        0        0] -> DD 0x00000000 [      0        0        0] (  0,   0,   0) (       0,        0, 0.000000%, 0.000000%, NONE) (       0,        0)

PreLoadWeightBiases = 1048320  100.000000%
---------------------------End VerifyTiling -------------------------
layer_id: 0 layer name:network_binary_graph operation[0]:unkown operation type target:unkown operation target.
uid: 0
abs_op_id: 0
execution time:             97432 us
[     1] TOTAL_READ_BANDWIDTH  (MByte): 135.867976
[     2] TOTAL_WRITE_BANDWIDTH (MByte): 74.167511
[     3] AXI_READ_BANDWIDTH  (MByte): 61.521023
[     4] AXI_WRITE_BANDWIDTH (MByte): 50.982863
[     5] DDR_READ_BANDWIDTH (MByte): 74.346954
[     6] DDR_WRITE_BANDWIDTH (MByte): 23.184648
[     7] GPUTOTALCYCLES: 77303691
[     8] GPUIDLECYCLES: 22877687
VPC_ELAPSETIME: 97775
*********
Run the 1 time: 100.00ms or 100247.00us
vxProcessGraph execution time:
Total   100.00ms or 100303.00us
Average 100.30ms or 100303.00us
Done. inference time:  0.11749053001403809
----Xception----
-----TOP 5-----
[1]: 0.99609375
[0]: 0.0009250640869140625
[391]: 0.00019299983978271484
[29]: 0.00017976760864257812
[124]: 0.00016736984252929688
```
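When comparing ''--level 2'' runs, it can help to pull the numbered counters out of the log programmatically. A small sketch (the regex assumes the exact ''[ n] NAME ...: value'' layout shown above, which may vary across NPU driver versions):

```python
import re

# Sketch: extract the bandwidth/cycle counters from a --level 2 log so that
# different runs can be compared. Sample lines copied from the output above.
LOG = """\
[     1] TOTAL_READ_BANDWIDTH  (MByte): 135.867976
[     2] TOTAL_WRITE_BANDWIDTH (MByte): 74.167511
[     7] GPUTOTALCYCLES: 77303691
"""

def parse_counters(text):
    """Map counter name -> numeric value from '[ n] NAME ...: value' lines."""
    counters = {}
    for name, value in re.findall(r"\[\s*\d+\]\s+(\w+)\s*(?:\(MByte\))?:\s*([\d.]+)", text):
        counters[name] = float(value)
    return counters

print(parse_counters(LOG))
```

Capture the log with e.g. ''python3 xception.py ... --level 2 2>&1 | tee run.log'' and feed the file contents to the parser.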
  
1. The demos that currently support cameras include the YOLO series and OpenPose. Take YOLOv3 as an example:
  
```shell
$ cd ksnn/examples/darknet
$ python3 yolov3-cap.py --model ./models/VIM3/yolov3_uint8.nb --library ./libs/libnn_yolov3_uint8.so --device X
```
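''--device X'' selects a V4L2 camera node (''/dev/videoX''). If you are unsure which index to pass, a quick sketch to list the candidates (a hypothetical helper, not part of the KSNN examples):

```python
import glob
import re

def list_video_devices():
    """Return the numeric IDs of /dev/video* nodes, i.e. candidates for --device X."""
    ids = []
    for path in glob.glob("/dev/video*"):
        match = re.search(r"video(\d+)$", path)
        if match:
            ids.append(int(match.group(1)))
    return sorted(ids)

print(list_video_devices())  # e.g. [0, 1] when two capture devices are attached
```

Note that a single camera can expose several ''/dev/video*'' nodes; if one index fails to open, try the next.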
  
Last modified: 2024/04/15 03:21 by louis