ArmNN TFLite Delegate

Khadas Edge and VIM boards are powered by Arm Cortex-A CPUs and Mali GPUs.

We can use this available compute power to run TFLite models with the Arm NN libraries, by loading them through the TFLite interpreter's delegate mechanism.

The delegate is provided by Arm as ARM-software/armnn, and the examples below make use of it.
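In practice, the examples attach the delegate to the TFLite interpreter as an external delegate. Here is a minimal sketch of what that looks like in Python (the model and library paths are placeholders, not the exact code used by the examples):

import tflite_runtime.interpreter as tflite

# Load the Arm NN delegate and tell it which backends it may use.
armnn_delegate = tflite.load_delegate(
    "libarmnnDelegate.so",
    options={"backends": "GpuAcc,CpuAcc", "logging-severity": "info"},
)

# Create the interpreter with the delegate attached, then run inference as usual.
interpreter = tflite.Interpreter(
    model_path="model.tflite",
    experimental_delegates=[armnn_delegate],
)
interpreter.allocate_tensors()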

This library depends on OpenCL. Make sure you are using one of the following platforms, so that the OpenCL driver is present.

Board    Linux Kernel (BSP)    OS
VIM3     4.9, 5.15             Ubuntu 22.04, Ubuntu 24.04
VIM3L    4.9, 5.15             Ubuntu 22.04, Ubuntu 24.04
VIM4     5.4, 5.15             Ubuntu 22.04, Ubuntu 24.04
Edge2    5.10, 6.1             Ubuntu 22.04, Ubuntu 24.04

You can refer to the OpenCL doc for more info.

Set up the environment

Install pip

$ sudo apt-get install python3-pip

Ubuntu 24.04 ships with Python 3.12, which TensorFlow has not yet been updated to support, so we must use an older, supported Python version.

Furthermore, Ubuntu 24.04 restricts installing non-system Python packages, so they must be installed inside a Python virtual environment to avoid conflicts.

You must activate this environment whenever you want to run the examples.

Add the alternate Python repository

$ sudo add-apt-repository ppa:deadsnakes/ppa
$ sudo apt install python3.10 python3.10-venv

Create a new virtual environment

$ python3.10 -m venv work

Activate the virtual environment

$ source work/bin/activate

Deactivate the virtual environment

$ deactivate

Get source code

Clone the examples repository sravansenthiln1/armnn_tflite

$ git clone https://github.com/sravansenthiln1/armnn_tflite
$ cd armnn_tflite

Install the necessary Python packages

$ pip3 install numpy==1.26.2 tflite_runtime pillow opencv-python librosa sounddevice

Download ArmNN libraries

$ wget -O ArmNN-aarch64.tgz https://github.com/ARM-software/armnn/releases/download/v24.11/ArmNN-linux-aarch64.tar.gz
$ mkdir libs
$ tar -xvf ArmNN-aarch64.tgz -C libs

Run Examples

Let's take MobileNet v1 as an example.

Enter the example directory and link the ArmNN libraries it expects

$ cd mobilenet_v1
$ sudo ln ../libs/delegate/libarmnnDelegate.so.29.1 libarmnnDelegate.so.29
$ sudo ln ../libs/libarmnn.so.34.0 libarmnn.so.34

Run the example

$ python3 run_inference.py

You can choose whether the CPU or the GPU accelerates the inference: change the BACKEND variable in the code to either GpuAcc (GPU) or CpuAcc (CPU).
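
For reference, the backend selection typically ends up as a delegate option along these lines (a sketch; the exact variable names in run_inference.py may differ, and tflite here is tflite_runtime.interpreter as in the earlier sketch):

BACKEND = "GpuAcc"  # GPU backend; set to "CpuAcc" for the CPU backend

armnn_delegate = tflite.load_delegate(
    "./libarmnnDelegate.so.29",
    options={"backends": BACKEND, "logging-severity": "info"},
)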
