Getting Started with Amlogic NPU on Khadas VIM3/VIM3L

Shenzhen Wesion released the NPU toolkit for Khadas VIM3/VIM3L last November, so I decided to try it together with the latest Ubuntu 18.04 image on Khadas VIM3L, before switching to VIM3 for reasons I’ll explain below.

I followed two tutorials from the forum and wiki: first running pre-built samples, then building a firmware image and the samples from source.

Khadas VIM3L and VIM3 Have Different & Optional NPUs

It will be obvious to anyone who has read the specs for Khadas VIM3 and VIM3L that the former comes with a 5 TOPS NPU, while the one in the latter only delivers up to 1.2 TOPS. But somehow I forgot about this and assumed both had the same NPU, which made VIM3L look more attractive for this type of task. Obviously, I was wrong.

But the real reason I stopped using Khadas VIM3L can be seen in the photo below.

S905D3 vs S905D3-NON: My VIM3L sample vs shipping VIM3L boards

My board is an early sample that comes with the Amlogic S905D3 processor, but people who purchased Khadas VIM3L got a board with the Amlogic S905D3-NON processor instead, and the NON suffix means it integrates an NPU. All Khadas VIM3L boards ship with the NON version, so you don’t have to worry, but if you ever wanted to use the NPU on other S905D3/S905X3 devices, you’d have to make sure you have a “NON” processor first.

Running Pre-built Yolo & Facenet Demos

First, download the latest Ubuntu Desktop firmware for VIM3, and flash it using the same method we used for Android:
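
In my case that means grabbing the image from the Khadas download page and burning it over USB with the Amlogic burning tool from the khadas/utils repository. Something along these lines should do it (the tool options are quoted from the wiki from memory, and the image file name is a placeholder for whatever release you downloaded):

# Put the board into upgrade mode as described in the Khadas wiki, then burn the image:
aml-burn-tool -b VIM3 -i /path/to/vim3-ubuntu-desktop.img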


You may also consider using the server image to run those demos. If you choose the desktop image, you are told to switch to the framebuffer console with Ctrl+Alt+F1 before running them. I tried in XFCE and did not notice any issues, but when running in an SSH session, I found out it would corrupt the video output.

We can now download the NPU demo binaries as follows:
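
The pre-built demos are in a repository on the Khadas Gitlab account, so it should just be a matter of cloning it on the board (double-check the exact repository name against the wiki):

git clone https://gitlab.com/khadas/aml_npu_demo_binaries
cd aml_npu_demo_binaries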


You’ll get four directories:

  • detect_demo – Yoloface/YoloV2/YoloV3 demo with video (camera)
  • detect_demo_picture – Yoloface/YoloV2/YoloV3 demo with picture
  • detect_demo_khadas – YoloV2/YoloV3 for MIPI or UVC camera
  • inceptionv3 – Inceptionv3

I don’t have any USB webcam working in Linux right now, so I’ll play with detect_demo_picture in this section, and check inceptionv3 when we build the image from source.
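
Each demo folder comes with an install script that copies the NPU runtime libraries into place. Assuming the layout matches the repository above, installation looks roughly like this (adjust the script name if the repository uses a different one):

cd detect_demo_picture
sudo ./INSTALL    # copies the NPU libraries and demo files to the system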


After installation, we can check out the samples described on Gitlab.

  • Yoloface demo
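
The demo binary takes a detection type and an input picture, with types 0, 1, and 2 selecting Yoloface, YoloV2, and YoloV3 respectively, so the three runs below look roughly as follows (the binary name and sample picture are placeholders; check the README in the repository for the exact usage):

./detect_demo_x11 0 1080p.bmp    # type 0 = Yoloface
./detect_demo_x11 1 1080p.bmp    # type 1 = YoloV2
./detect_demo_x11 2 1080p.bmp    # type 2 = YoloV3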


Each demo generates an output.bmp file, which I’ll show below for each detection type.

output type 0 yoloface

As its name implies, Yoloface specifically tries to detect faces.

  • YoloV2 demo


output type 1 yolov2

  • YoloV3 demo


output type 2 yolov3

YoloV3 looks to be more accurate than YoloV2, but the samples do not output any performance data for further comparison.

The last demo, Facenet, works in two passes: pass 0 writes data to emb.db, and pass 1 performs the inference:
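
The invocation below is only a sketch with placeholder names, but it shows the two-pass idea: pass 0 computes the face embedding and stores it in emb.db, and pass 1 runs the recognition against it:

./facenet_demo 0 person.bmp    # pass 0: write the embedding to emb.db (binary and picture names are placeholders)
./facenet_demo 1 person.bmp    # pass 1: run inference and match against emb.db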


Facenet detects a face, provides a name (Deng Liu), and saves the results into two image files:

  • face_160.bmp (160×160 pixels)


  • face.bmp with slightly higher resolution and keeping the aspect ratio of the original photo


As mentioned above, I don’t have a USB camera that works properly in Linux. But it’s still detected, and for reference, here is some of the info I got when I connected mine and tried the samples:
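
The details come from the usual USB and V4L2 tools, something like this (assuming the v4l-utils package is installed):

lsusb                                         # shows the camera on the USB bus
v4l2-ctl --list-devices                       # which /dev/video node it was assigned
v4l2-ctl -d /dev/video0 --list-formats-ext    # pixel formats and resolutions it advertises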


The camera is a Venus USB 2.0 camera, but running the sample fails:


We can check the kernel log for more details:
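
A quick filter on the recent kernel messages is enough:

dmesg | grep -i -e uvc -e video | tail -n 20    # UVC/video-related kernel messages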


It looks like the format used by the camera is not supported, and I had a similar issue when I tried inference samples on the Jetson Nano SBC.

The source code for the above demo programs is available on Gitlab, but you’ll need to set up your host PC as described below in order to build them from source.

Building the NPU demos from source

First, I had to download the NPU toolkit, and as noted in the announcement, you’ll need to provide your contact and project details before downloading. However, at the time I did so, there was no manual review of my request, and I received an email almost immediately after applying.

The file is named aml_npu_sdk.tgz, and we need to extract it in a directory. I did so in ~/edev/amlogic/, and we’ll find four directories in aml_npu_sdk: docs, acuity-toolkit (model conversion tools), and the Android and Linux SDKs.
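
Extracting it is just a matter of untarring the archive into the working directory:

mkdir -p ~/edev/amlogic
tar xzf aml_npu_sdk.tgz -C ~/edev/amlogic
ls ~/edev/amlogic/aml_npu_sdk    # docs, acuity-toolkit, and the Android/Linux SDK directories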

Installing TensorFlow on a computer running Ubuntu 18.04

The first step is to install TensorFlow on your Ubuntu 16.04/18.04 computer. Mine is a laptop running Ubuntu 18.04:
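
I installed it with pip3. Note that the conversion tools of that era relied on the TensorFlow 1.x API, so you may need to pin a 1.x release rather than take whatever pip serves today:

sudo apt install python3-pip
pip3 install tensorflow    # pin a 1.x version (e.g. tensorflow==1.10.0) if the conversion tools require it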


Once it’s done, we can check whether TensorFlow is properly installed by launching Python3 and running the following four commands in the Python shell:
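
These should be the classic TensorFlow 1.x “hello world” statements, shown here wrapped in a heredoc so they can be pasted in one go from the Linux shell:

python3 << 'EOF'
import tensorflow as tf                    # TensorFlow 1.x API; this check will not work as-is on TensorFlow 2.x
hello = tf.constant('Hello, TensorFlow!')
sess = tf.Session()
print(sess.run(hello))
EOF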


This is the output on my machine:


The last line means everything is OK.

Model Conversion

The Khadas VIM3(L) NPU SDK supports Caffe, TensorFlow, TFLite, Darknet, and ONNX models, and some sample scripts to convert models can be found in the acuity-toolkit/conversion_scripts folder of the SDK. We can execute the scripts as follows:
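
The scripts are numbered and meant to be run in order: import the model, quantize it, then export the case code used by the applications. The sequence should look roughly like this (script names may differ slightly between SDK releases):

cd ~/edev/amlogic/aml_npu_sdk/acuity-toolkit/conversion_scripts
./0_import_model.sh        # import the Caffe/TensorFlow/TFLite/Darknet/ONNX model
./1_quantize_model.sh      # quantize the model for the NPU
./2_export_case_code.sh    # generate the case code for the target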


It will take a few minutes to complete (less than 10 minutes on my system).

Fenix installation

Fenix is a script to build Ubuntu/Debian images for Khadas VIM boards. Let’s make sure the system is up-to-date and install dependencies first:
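
On a fresh Ubuntu 18.04 host that means something along these lines:

sudo apt update && sudo apt full-upgrade
sudo apt install git make lsb-release qemu-user-static    # host packages Fenix relies on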


Now we can create a working directory and retrieve the script from Github, before entering the directory, configuring the script, and building the image:
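
Following the Fenix README, we clone the repository, source the environment script (which interactively asks for the board, u-boot/Linux version, and image type), and start the build:

mkdir -p ~/edev/fenix && cd ~/edev/fenix
git clone https://github.com/khadas/fenix .
source env/setenv.sh    # interactive build configuration
make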


The line before make configures the build. Here’s what I selected:


Running make will take a while, and you’ll be asked to input your password at one point to get sudo access:

This is what the output should look like at the end of a successful build:


We can now flash the image to the eMMC flash using the usual method:
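
That is the same aml-burn-tool procedure as before, just pointed at the freshly built image (the exact path and file name are reported at the end of the build, so treat the one below as a placeholder):

aml-burn-tool -b VIM3 -i build/images/<image-built-by-fenix>.img    # use -b VIM3 or VIM3L to match the board you selected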


The system boots, and everything seems to work as it should.

Khadas VIM3L Lubuntu

Now we can go back to our host computer, and build (cross-compile) Inceptionv3 from source:
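
I’ll only sketch this step, since the exact paths depend on the SDK layout: the idea is to enter the demo folder of the Linux SDK and run its build script, which cross-compiles the sample with the toolchain bundled in the SDK:

cd ~/edev/amlogic/aml_npu_sdk/linux_sdk/demo    # hypothetical path, adjust to your SDK layout
./build_vx.sh                                   # hypothetical script name, see the SDK documentation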


Once the build is done, we can see the inceptionv3 binary in the bin_r folder:


But we’ll find all the files needed for our demo in the bin_demo directory:


After copying all the files to the board, I could run the sample:
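
The sample takes the compiled network file and a test picture as arguments; the file names below are placeholders for whatever ships in bin_demo:

./inceptionv3 inception_v3.nb goldfish_299x299.jpg    # placeholder file names, use the ones from bin_demo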


For some reason, the sample does not bother doing a look-up in the “imagenet_slim_labels.txt” file, but if we open the file…
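
A quick peek at the first few lines is enough:

head -n 5 imagenet_slim_labels.txt    # index 2, counting from 0, should read "goldfish"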


We can see “2” is goldfish (the index starts at 0), so it works fine.

For reference, if you try the sample on hardware without an NPU, or with firmware that does not support the NPU, you’ll get the following error, as I did on my early sample of Khadas VIM3L without NPU:

