VIM3 H.264/H.265 video encoding

Is there any documentation on VIM3’s HW-accelerated video encoding? I’m on Ubuntu built with Fenix and get the following error when trying to run a simple pipeline:

khadas@Khadas:~$ uname -a
Linux Khadas 4.9.206 #4 SMP PREEMPT Tue Feb 25 16:30:33 GMT 2020 aarch64 aarch64 aarch64 GNU/Linux
khadas@Khadas:~$ gst-launch-1.0 videotestsrc ! amlvenc  ! fakesink -ve
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...

(gst-launch-1.0:6848): GStreamer-CRITICAL **: 12:46:21.939: gst_mini_object_unref: assertion 'mini_object != NULL' failed

(gst-launch-1.0:6848): GStreamer-CRITICAL **: 12:46:21.939: gst_mini_object_ref: assertion 'mini_object != NULL' failed
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=(fraction)30/1, multiview-mode=(string)mono, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
**
ERROR:gstamlvenc.c:631:gst_amlvenc_init_encoder: assertion failed: (encoder->vtable != NULL)
Aborted

Can you help?

Additionally, where can I find the source code for the HW encoding driver and GStreamer plugins?

Anyone? I can’t find the source, and Fenix builds from a prebuilt blob in @numbqq’s GitHub repo (https://github.com/numbqq/gstreamer_aml). Is this part of the VIM3 stack closed source?

Can someone answer this please? I’m considering the board for a large commercial application and this is a blocker.

Hello @coudy

I will check this issue this week.

Thanks, @numbqq. Have you had any luck?

Not yet, but I think this topic may help you.

The kernel-side API for the encoder is different from the one the driver expects; that’s why the same encoder does not work on the A311D. You may not need to update the kernel module for the encoder, but I found that it can behave incorrectly at times. The DMA buffer seems to be incoherent…

We are also using GStreamer for our application; unfortunately, I don’t have the gstreamer_aml code. In our setup we read data from the camera, pipe it to the encoder, and then send the encoded frames to GStreamer for further use (streaming or saving to a file). In the end we are happier than with a pure GStreamer pipeline, because we can open multiple channels on the camera and it gives us much greater control (we get the main stream, additional information, and a downsampled stream, all from V4L2).
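Roughly, the GStreamer end of that split looks like this. It’s only a sketch: “our-encoder-app” is a placeholder for our own capture/encode process, which I’m assuming writes an H.264 elementary stream to stdout; the elements after the parser are standard GStreamer ones, not the Amlogic plugin.

our-encoder-app | gst-launch-1.0 -e fdsrc fd=0 ! h264parse ! matroskamux ! filesink location=out.mkv
our-encoder-app | gst-launch-1.0 fdsrc fd=0 ! h264parse ! rtph264pay ! udpsink host=192.168.1.10 port=5000

In our application the hand-off happens in code rather than through a shell pipe, but the downstream element chain is the same idea.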

Hello @pitchaya

I will work more on this next week, and I can share the source code with you if you want.

If you share the code with me, I can try to fix it.

Hello @pitchaya

Here is the buildroot source code. We build GStreamer from it.

Thanks. Is it feasible to transcode a live 1080p MJPEG stream to H.264/H.265 at 30 fps?
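Something like this is what I have in mind, purely as a sketch. I’m assuming the amlvenc element from gstreamer_aml accepts raw I420 input and outputs H.264 that h264parse can consume, and /dev/video0 stands in for the camera. jpegdec here is the software JPEG decoder, so whether the whole chain keeps up at 1080p30 is exactly my question:

gst-launch-1.0 -e v4l2src device=/dev/video0 ! image/jpeg,width=1920,height=1080,framerate=30/1 ! jpegdec ! videoconvert ! amlvenc ! h264parse ! mp4mux ! filesink location=out.mp4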