VIM3 H.264/H.265 video encoding

Is there any documentation on VIM3’s HW-accelerated video encoding? I’m on Ubuntu built with Fenix and get the following error when trying to run a simple pipeline:

khadas@Khadas:~$ uname -a
Linux Khadas 4.9.206 #4 SMP PREEMPT Tue Feb 25 16:30:33 GMT 2020 aarch64 aarch64 aarch64 GNU/Linux
khadas@Khadas:~$ gst-launch-1.0 videotestsrc ! amlvenc  ! fakesink -ve
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...

(gst-launch-1.0:6848): GStreamer-CRITICAL **: 12:46:21.939: gst_mini_object_unref: assertion 'mini_object != NULL' failed

(gst-launch-1.0:6848): GStreamer-CRITICAL **: 12:46:21.939: gst_mini_object_ref: assertion 'mini_object != NULL' failed
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
**
ERROR:gstamlvenc.c:631:gst_amlvenc_init_encoder: assertion failed: (encoder->vtable != NULL)
Aborted

Can you help?

Additionally, where can I find the source code for the HW encoding driver and GStreamer plugins?

Anyone? I can’t find the source; Fenix builds from a prebuilt blob in @numbqq’s GitHub repo (https://github.com/numbqq/gstreamer_aml). Is the VIM3 encoder closed source?

Can someone answer this please? I’m considering the board for a large commercial application and this is a blocker.

Hello @coudy

I will check this issue this week.

Thanks, @numbqq. Have you had any luck?

Not yet, but I think this topic may help you.

The kernel-side API for the encoder is different from what the driver expects. That’s why the same encoder code does not work on the A311D. You may not need to update the kernel module for the encoder, but I found that it sometimes works incorrectly. The DMA buffer seems to be incoherent…

We are also using GStreamer for our application; unfortunately, I don’t have the gstreamer_aml code. In our case, we read data from the camera, pipe it to the encoder, and then send the encoded frames to GStreamer for further use (streaming/saving to file). In the end, we are happier than using GStreamer alone because we can open multiple channels on the camera, which gives us much greater control (we get the main stream, additional information, and a downsampled stream, all from V4L2).
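Roughly, the GStreamer side of that can look like the sketch below (simplified; the encoder application name and its flag are just placeholders for illustration, assuming it writes an Annex-B H.264 elementary stream to stdout, and the receiver address is an example):

# Placeholder encoder app writes raw Annex-B H.264 to stdout;
# GStreamer only parses, packetizes and streams it over RTP/UDP.
your_encoder_app --output - | gst-launch-1.0 fdsrc fd=0 \
    ! h264parse \
    ! rtph264pay config-interval=1 pt=96 \
    ! udpsink host=192.168.1.100 port=5000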

Hello @pitchaya

I will work more on this next week, and I can share the source code with you if you want.

If you share the code with me, I can try to fix it.

Hello @pitchaya

Here is the buildroot source code; we build GStreamer from it.


Thanks. Is it feasible to transcode a live 1080p MJPEG stream to H.264/H.265 at 30 fps?
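Something along these lines is what I have in mind, if amlvenc can keep up (untested; /dev/video0 and the MJPEG caps are assumptions, jpegdec/videoconvert run in software, and whether amlvenc accepts the decoded format without further conversion is another question):

gst-launch-1.0 v4l2src device=/dev/video0 \
    ! image/jpeg,width=1920,height=1080,framerate=30/1 \
    ! jpegdec ! videoconvert \
    ! amlvenc ! h264parse \
    ! filesink location=out.h264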


Hey @coudy, I see a GStreamer command in your post. Can you please tell me whether you were using a MIPI-CSI camera from Khadas or a USB cam? Also, where did you get this command from:
gst-launch-1.0 videotestsrc ! amlvenc ! fakesink -ve
Is there any documentation for it? I want to use GStreamer with my VIM3 + MIPI-CSI Khadas camera. Can you help me? I’ll use this GStreamer command with cv2.VideoCapture in Python.

Thanks.

Hi! The buildroot link doesn’t seem to be working any more; is there any chance you could re-upload it somewhere?

The great daveshah! Hi! :slight_smile:
I want HW-accelerated H.265 encoding too!
ffmpeg encoding to 960x540 runs at about 2.0 fps on the VIM3.

Requesting a re-upload of the sources for the Amlogic HW encoder.
[edit] Ah well, I have encoder.c and it looks H.264-only:
“gxl_h264_enc”,
“txl_h264_enc_cavlc”,
“ga_h264_enc_cabac”,
Sorry, ignore me.

Hi @coudy!

Can you please tell me whether you got any results with H.264/H.265 encoding on the Khadas VIM3?
And could you provide some advice on making it real-time?

Best regards,
Maxim

Does anybody have a solution for real-time H.264/H.265 encoding from the VIM camera? This thread indicates it’s possible, but there is nothing documented on how to actually do it. I recently downloaded the Ubuntu image (Linux 4.9) and couldn’t get it to work.

Thanks,
Patrick

I have tested the VIM3 OS08A10 camera with hardware-encoded RTSP streaming (GStreamer) successfully.

The main problem is the poor implementation of the Amlogic GStreamer encoder plugin.

The OS08A10 can deliver data in RGB/YUY2 format, but the amlvenc encoder only implements NV12 input, despite its caps advertising RGB/NV21/BGR. However, amlvenc uses the external library /usr/lib/libvpcodecs.so to do the actual work, so we can patch libvpcodecs.so to force a single input format.

Steps:

  1. Use the official Ubuntu Focal Linux 4.9 v1.0.10-220108 image.
  2. For RTSP, use gst-rtsp-launch; download and build it from the source: GitHub - sfalexrog/gst-rtsp-launch: Simple gst-launch-like RTSP server
  3. Download the encoder libs from GitHub - numbqq/encoder_libs_aml, edit the file encoder_libs_aml/libencoder/h264/bjunion_enc/libvpcodec.cpp, and add the line format = 3; before line 155:
        format = 3; // force using rgb888 format
        if (format == 0) { //NV12
            videoInput.fmt = AMVENC_NV12;
            videoInput.YCbCr[2] = 0;
        } else if(format == 1) { //NV21
            videoInput.fmt = AMVENC_NV21;
            videoInput.YCbCr[2] = 0;
        } else if (format == 2) { //YV12
            videoInput.fmt = AMVENC_YUV420;
            videoInput.YCbCr[2] = (unsigned long)(videoInput.YCbCr[1] + videoInput.height * videoInput.pitch / 4);
        } else if (format == 3) { //rgb888
            videoInput.fmt = AMVENC_RGB888;
            videoInput.YCbCr[1] = 0;
            videoInput.YCbCr[2] = 0;
        } else if (format == 4) { //bgr888
            videoInput.fmt = AMVENC_BGR888;
        }
  4. Build encoder_libs_aml/libencoder/h264/bjunion_enc on the VIM3 and copy libvpcodecs.so to /usr/lib/.
  5. Run the following command under gst-rtsp-launch/build/src:
./gst-rtsp-launch "( v4l2src ! video/x-raw,width=1920,height=1080,framerate=30/1,format=RGB ! amlvenc ! video/x-h264,profile=baseline ! rtph264pay name=pay0  pt=96 )"
  6. Now you can use mpv or ffplay to play the RTSP stream at rtsp://ip:8554/video (see the client commands below).
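For example, on a client machine (replace ip with the board’s address):

ffplay rtsp://ip:8554/video

or, as one possible GStreamer client pipeline (avdec_h264 from gst-libav decodes on the client side):

gst-launch-1.0 rtspsrc location=rtsp://ip:8554/video latency=100 \
    ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink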

I was able to use this to successfully encode a V4L2 MJPEG document camera into an H.264 stream using GStreamer.

Unfortunately, the quality is bad and the latency is extremely high.

Which interface does your camera use? If it is USB, the latency may be very high, and since amlvenc does not support the MJPEG codec, the JPEG decode has to run in software, which consumes much more CPU.
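If the camera can also deliver an uncompressed format (e.g. YUYV), requesting that directly avoids the software JPEG decode entirely. You can check what the device offers with v4l2-ctl (the device path here is an assumption):

v4l2-ctl -d /dev/video0 --list-formats-ext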