NPU Demo and source code

Hello @CodeLogist

Yes, it is all about the IR-CUT filter. You can use the camera test command to get a preview image with the IR-CUT enabled or disabled.

  • IR-CUT enabled:
$ v4l2_test  -c 1 -p 0 -F 0 -f 0 -D 0 -R 1 -r 2 -d 2 -N 1000 -n 800 -w 0 -e 1 -I 1 -b /dev/fb0 -v /dev/video0
  • IR-CUT disabled:
$ v4l2_test  -c 1 -p 0 -F 0 -f 0 -D 0 -R 1 -r 2 -d 2 -N 1000 -n 800 -w 0 -e 1 -I 0 -b /dev/fb0 -v /dev/video0

And which image did you use for testing? I checked the latest Ubuntu Focal image, and the image display is correct when running the YOLO demo.


Thanks @numbqq, disabling the IR-CUT works. I used the Ubuntu kernel 4.9 image, as the latest kernel 5.7 doesn't support the touchscreen. What did you change in the demo code to disable the IR-CUT in the latest version? What should I edit in the code just to disable it?

Hello @CodeLogist

I have updated the demo to add IR-CUT control support. Please update your repo.

  • IR-CUT enabled:
$ ./detect_demo_mipi /dev/video0 1 on
  • IR-CUT disabled:
$ ./detect_demo_mipi /dev/video0 1 off
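For background on what an `on`/`off` switch like this typically does: on many camera modules the IR-CUT filter is a mechanical shutter driven by a GPIO line, which user space can toggle through the legacy sysfs GPIO interface. The sketch below shows that general pattern only; it is not the actual implementation in the updated demo, and the GPIO sysfs path is a placeholder you would replace with the pin your board really uses (check the schematic and the demo source).

```c
/* Hypothetical IR-CUT toggle via the legacy sysfs GPIO interface.
 * The GPIO wiring is an assumption, not the demo's actual code path. */
#include <stdio.h>

/* Write a short string to a sysfs attribute file; 0 on success. */
static int write_attr(const char *path, const char *value)
{
    FILE *f = fopen(path, "w");
    if (!f)
        return -1;
    int ok = fputs(value, f) >= 0;
    return (fclose(f) == 0 && ok) ? 0 : -1;
}

/* Drive the (assumed) IR-CUT GPIO under gpio_base,
 * e.g. "/sys/class/gpio/gpioNNN": 1 = filter engaged, 0 = removed. */
static int ircut_set(const char *gpio_base, int enable)
{
    char path[256];
    snprintf(path, sizeof(path), "%s/direction", gpio_base);
    if (write_attr(path, "out") != 0)
        return -1;
    snprintf(path, sizeof(path), "%s/value", gpio_base);
    return write_attr(path, enable ? "1" : "0");
}
```

Newer kernels prefer the character-device GPIO API (`/dev/gpiochipN` via libgpiod) over sysfs, but the sysfs form is the easiest to read in a quick demo patch.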

Okay, thank you so much @numbqq, this feature was much needed. I don't see any documentation about this feature on the forum or in the docs. Should I make another post about it and write up the solution there for other new members?



Hello @CodeLogist

We will add some information to docs, thank you!


Hi everyone,

Can someone help me with this topic: Modifying the NPU demo app to run on the GUI desktop or support touch response in framebuffer mode

It's relevant to the current topic, but I thought it would be better for new members to have it as a new topic.



@numbqq please help me! How should I edit the demo code for a new camera, the Sony IMX586 (48 MP, 8000 x 6000)? Its ISO range is 1 to 102400 and its exposure time range is 1/25000 s to 256 s. Thank you!


Is there now a demo that can accept RTSP input or a video file?


Hi @RichardG, maybe @Archangel1235 knows something; he was talking about implementing RTSP with some of the demos…

How do I run this? It gives me the following message; any help is appreciated.
I am trying to compile my own model and build it with the Khadas aml_npu_app so as to generate binaries, but I am having difficulty running this script. Please help!


Hello, ask that user about this question; he seems to know the solution. Good luck!


thanks sir, take good care


Refer also to Khadas docs.

It's not clear which paths to give, and it throws an error. The steps for compiling are not very clear in the documentation. Please help me out.

I have compiled my model successfully with the SDK acuity toolkit, but I am not able to compile it in aml_npu-app-master so that I can move the generated .so files to the board. Please guide me.

Hello, the Khadas team will be back soon after the holidays, and I think they will be able to guide you properly too!

Thanks, I'll be waiting.

@Dhruv_Gaba Hello, please download our latest release code; it supports local compilation, so you don't need to build on a host PC.


Thanks @Frank for the suggestion. I am able to generate the .so files now, but when I try to run the model on the VIM3 board, it throws an error saying that the model is incorrect.
While checking the model, I found that the website I took it from differs from the one recommended on the Khadas website: the example uses a 416x416 model, but that website uses 608x608.
Any recommendations on how to get the correct model so that I can start deploying my own models on the VIM3 board?
Looking forward to your reply.

Also, if someone could please mention the correct paths, and the device on which each step of the conversion has to be done, that would be a great clarification too. Thanks in advance.

Also, how can we run the same example on a video sequence? Should we just follow a similar procedure, adapting detect_demo_picture into detect_demo? If you could provide some instructions on that, it would really help us move forward with the development of our product. Thanks in advance.
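Conceptually, extending the single-picture demo to a video is just wrapping the per-image inference call in a frame loop. A minimal sketch under the assumption of a raw NV12 file of known resolution (`run_on_raw_video` and its callback are illustrative names, not part of the demo); for containers like MP4 or RTSP streams you would need a decoder such as FFmpeg or GStreamer in front of this loop:

```c
/* Sketch: iterate over a raw NV12 video file frame by frame, the way
 * the picture demo handles one image at a time. Frame size for NV12
 * is width*height*3/2 bytes. */
#include <stdio.h>
#include <stdlib.h>

/* Returns the number of complete frames processed, or -1 on error.
 * 'process' (may be NULL) stands in for the per-frame detector call. */
static long run_on_raw_video(const char *path, int width, int height,
                             void (*process)(const unsigned char *, int, int))
{
    size_t frame_bytes = (size_t)width * height * 3 / 2; /* NV12 */
    unsigned char *buf = malloc(frame_bytes);
    FILE *f = fopen(path, "rb");
    long frames = 0;

    if (!buf || !f) {
        free(buf);
        if (f)
            fclose(f);
        return -1;
    }
    /* Read whole frames until the file runs out; a trailing partial
     * frame is ignored. */
    while (fread(buf, 1, frame_bytes, f) == frame_bytes) {
        if (process)
            process(buf, width, height); /* hand the frame to inference */
        frames++;
    }
    fclose(f);
    free(buf);
    return frames;
}
```

The design point is that the detector itself does not change; only the frame source does, which is why the picture and MIPI demos can share the same inference code.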