Can't run NPU Demo for YOLOv3 on VIM3 using MIPI CSI camera from Khadas

Hi Khadas Community,

I have been trying hard to run the YOLOv3 demo on the NPU of the VIM3 using the OS08A10 8MP HDR camera over the MIPI-CSI interface. I tried every ./detect_demo_mipi /dev/video* 2 variant, but the error was always the same and I couldn't see any frame/video feed. The screenshot is attached below:

Instance of cv::Exception from resize.cpp

I ran v4l2-ctl --list-devices and the output is attached below:

I also tried the camera in Cheese and ffplay, but it didn't work; the screen was all black. Cheese showed me these devices:

And ffplay /dev/video* showed me this error:

Does some driver for the MIPI-CSI camera have to be installed? Or does a camera interface have to be enabled, the way we need to edit the raspi-config file on a Raspberry Pi?

I hope all these screenshots help and someone can resolve my issue. My setup: a monitor, the OS08A10 8MP HDR camera, a USB mouse and keyboard, and power from both a wall charger and a high-power power bank.

Also, could someone tell me which software I should use to remotely access/control the Khadas VIM3 under Ubuntu? I have already tried TeamViewer, vnc4server, and NoMachine: TeamViewer won't install, and vnc4server and NoMachine won't connect. I want to control the SBC remotely from a different location on a different network/Wi-Fi.

Thanks in advance for any reply. Please help me out here.

@CodeLogist Hello, /dev/video0 is the MIPI camera driver node.

1 Like


You can check the images here:

For the desktop image, you need to switch to the framebuffer console to run the MIPI camera demo:

v4l2_test -c 1 -p 0 -F 0 -f 0 -D 0 -R 1 -r 2 -d 2 -N 1000 -n 800 -w 0 -e 1 -I 0 -b /dev/fb0 -v /dev/video0

Hi, I'm already using the Xfce Ubuntu eMMC 20191231 image. Can I use the MIPI camera to capture an image and save it to storage? @numbqq can you please explain what this command means? And how can I use this with the NPU YOLOv3 demo command ./detect_demo_mipi?

Just use this command first to check whether the MIPI camera works.

1 Like

Thanks @numbqq and @Frank. The command works on the framebuffer console and shows the camera feed on screen (with a bluish tint on the frames, though), and after running it the NPU demo also worked smoothly. I don't know why it didn't before.

Can you please suggest a tool or script to capture pictures from this MIPI-CSI camera and store them for YOLOv3 model training? The code can be in Python. Also, it would be good to train on images from the same camera that will always be used during real testing, right?

Thanks for your support

@CodeLogist You can use cat /dev/fb0 > frame.raw to get a raw file, then convert it with ffmpeg -vcodec rawvideo -f rawvideo -pix_fmt rgb32 -s 1920x1080 -i frame.raw -f image2 -vcodec png frame-%d.png. But you need to install ffmpeg first; I suggest you run this command on your PC. (PS: I forgot whether it's rgb32 or bgr32. If rgb32 doesn't work, try bgr32.)
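If ffmpeg isn't handy on the device, the raw-to-image step above can also be sketched in pure Python with no dependencies, writing a binary PPM instead of PNG. This is only a sketch under assumptions: it assumes the framebuffer dump is 4 bytes per pixel (as the rgb32/bgr32 pixel formats imply), and, mirroring Frank's note, the bgr flag lets you flip the byte order if the colors look swapped.

```python
def rgb32_frame_to_ppm(raw: bytes, width: int, height: int, bgr: bool = False) -> bytes:
    """Convert a raw 32-bit-per-pixel framebuffer dump into a binary PPM (P6) image.

    Each pixel is 4 bytes; the 4th (alpha/padding) byte is dropped.
    Set bgr=True if the framebuffer stores pixels as BGRA rather than RGBA.
    """
    expected = width * height * 4
    if len(raw) < expected:
        raise ValueError(f"need {expected} bytes, got {len(raw)}")
    out = bytearray(b"P6\n%d %d\n255\n" % (width, height))
    for i in range(0, expected, 4):
        b0, b1, b2 = raw[i], raw[i + 1], raw[i + 2]
        out += bytes((b2, b1, b0)) if bgr else bytes((b0, b1, b2))
    return bytes(out)

# On the VIM3 itself (while v4l2_test is painting frames), roughly:
#   with open("/dev/fb0", "rb") as fb:
#       raw = fb.read(1920 * 1080 * 4)
#   with open("frame.ppm", "wb") as f:
#       f.write(rgb32_frame_to_ppm(raw, 1920, 1080))
```

The 1920x1080 size matches the -s flag in the ffmpeg command; adjust it if your framebuffer resolution differs.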


Thanks @Frank. The images I want to capture have to be taken outside my house, and I can't type this command every time to capture one. Is there any way to add a capture button, or to wrap the above commands in Python so I can customize the capture trigger and the image save paths? Also, I'd have to run these commands in framebuffer mode, I guess, right? Or would it be fine to run them in a normal terminal?

Also, can I use something like this repo: for our MIPI camera? Is there any way to use the MIPI camera via a GStreamer command, so that I could use the cv2.VideoCapture(gstreamer_pipeline()) function in Python? Or any other way to use cv2.VideoCapture()? I really need that working.
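For what it's worth, a gstreamer_pipeline() helper of the kind used in such repos is usually just a string builder. A minimal sketch is below; note that the v4l2src caps (width/height/framerate and the BGR conversion) are assumptions that may need adjusting for what the Amlogic ISP actually exposes, and OpenCV must be built with GStreamer support for cv2.CAP_GSTREAMER to work at all.

```python
def gstreamer_pipeline(device="/dev/video0", width=1280, height=720, fps=30):
    """Build a GStreamer pipeline string for cv2.VideoCapture.

    The caps below are an assumption for this sensor; tweak them if
    gst-launch-1.0 shows the camera negotiating a different format.
    """
    return (
        f"v4l2src device={device} ! "
        f"video/x-raw,width={width},height={height},framerate={fps}/1 ! "
        "videoconvert ! video/x-raw,format=BGR ! appsink drop=true"
    )

# Usage (requires OpenCV built with GStreamer support):
#   import cv2
#   cap = cv2.VideoCapture(gstreamer_pipeline(), cv2.CAP_GSTREAMER)
#   ok, frame = cap.read()
```

You can check whether your OpenCV build has GStreamer enabled with print(cv2.getBuildInformation()).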


Hi, I tried many v4l2 and gst-launch-1.0 commands, but had no luck viewing the MIPI-CSI camera feed in GUI mode rather than only in framebuffer mode. I wanted to use GStreamer so I could integrate OpenCV and Python with it, but no luck there either. Then I started guvcview and the camera feed popped up, but with a greenish image and low FPS (image attached below).

After that, I exited and restarted my VIM3. When I opened guvcview again it showed "Invalid Resolution Index", and the terminal showed "resource temporarily unavailable", so I searched for the PIDs using /dev/video0 and killed them all. Even then the error persists, and I still can't see the camera feed. When guvcview was working, it was taking input from the MIPI camera, whose name was shown as "JunoR2". I have attached the error screenshot too. Kindly help me out here; I really want this to work.

Hi @Frank,
Nope, this didn't work for me in either framebuffer mode or normal Xfce mode.
I have attached the output images and the terminal output images below. Please see if that helps. I tried both rgb32 and bgr32.

Thanks for your support :slight_smile:

@CodeLogist Hello. Can you try my steps?

  1. Switch to FB mode (Ctrl + Alt + F1)
  2. Run v4l2_test -c 1 -p 0 -F 0 -f 0 -D 0 -R 1 -r 2 -d 2 -N 1000 -n 800 -w 0 -e 1 -I 0 -b /dev/fb0 -v /dev/video0
  3. cat /dev/fb0 > frame.raw
  4. ffmpeg -vcodec rawvideo -f rawvideo -pix_fmt rgb24 -s 1920x1080 -i frame.raw -f image2 -vcodec png frame-%d.png
1 Like

Thanks @Frank, but I can't type these commands every time, as the training dataset consists of outdoor scenes. Is there any way using Python, GStreamer, or a GUI tool like guvcview? I can't get to step 3, because the frames cover the full screen: I can't see anywhere to enter a command, and I can't exit the v4l2_test screen with Ctrl+C/Z/X or the Esc key. Also, is there any solution for the greenish and bluish frames?

@CodeLogist Maybe you can write a shell script to do it. I'm sorry that I don't have a particularly good way.
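Picking up Frank's suggestion to script it, here is a rough Python sketch of a capture loop: press Enter to snapshot the framebuffer into a numbered raw file, type q to quit. It would have to run from a second console or over SSH while v4l2_test keeps painting /dev/fb0 in the other tty; the 1920x1080, 4-bytes-per-pixel frame size is an assumption matching the earlier ffmpeg command.

```python
import os


def next_frame_path(directory, prefix="frame", ext="raw"):
    """Return the next unused numbered file path in `directory`."""
    os.makedirs(directory, exist_ok=True)
    n = 0
    while os.path.exists(os.path.join(directory, f"{prefix}-{n}.{ext}")):
        n += 1
    return os.path.join(directory, f"{prefix}-{n}.{ext}")


def capture_loop(fb="/dev/fb0", out_dir="captures", frame_bytes=1920 * 1080 * 4):
    """Save one framebuffer snapshot per Enter press; 'q' quits."""
    while input("Enter = capture, q = quit: ").strip().lower() != "q":
        with open(fb, "rb") as src:
            raw = src.read(frame_bytes)
        path = next_frame_path(out_dir)
        with open(path, "wb") as dst:
            dst.write(raw)
        print("saved", path)
```

The saved .raw files can then be converted to images on a PC with the ffmpeg command Frank gave above.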

Okay, thanks @Frank for your help. I'll figure something out or write the script. I'll post it here too so that others can use it.

What about the greenish or bluish camera frames? What can I do to make them clearer?


@CodeLogist I think it should be related to v4l2.
I found that the RGB format in our MIPI camera code in aml_npu_app is PIX_FMT_RGB888, which is rgb24. So I'm not sure about the RGB format setting in v4l2_test. Can you run our aml_npu_demo_binaries in the background and try steps 3 and 4 again?

Yes, sure, I'll do that. How can I run aml_npu_demo_binaries in the background? It runs in framebuffer mode; do you mean run the NPU demo in one framebuffer tty and run steps 3 and 4 in another tty, right?

Also, since it is rgb24, do I have to change some settings in the conversion scripts while converting YOLOv3, like channel_mean_value or reorder_channel?
Please confirm this.


@CodeLogist Append & after your command. It means run in the background.

1 Like

Okay, thanks @Frank. I'll surely test it out and report back here.

I am getting the same greenish image when I use the camera at 1920x1080 from OpenCV 4.
However, if I use 1280x720 everything is fine. I am using Ubuntu Bionic, kernel 4.9.206.
Will this work better with Ubuntu Focal, kernel 4.9.224?