H264 hardware encoding examples

Hi @numbqq, @Frank!

I’ve got some questions about hardware video encoding.

So I need to encode input video from the MIPI Khadas camera. Using only GStreamer it works (really works!) like this:

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,framerate=60/1,format=RGB ! videoconvert  ! x264enc ! filesink location=/tmp/test.avi
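As a side note, `x264enc` emits a raw H.264 bytestream, so writing straight into a `.avi`-named file skips the container step; inserting a muxer (`avimux` here, assumed present from gst-plugins-good) should give a playable file. A small sketch that assembles the adjusted pipeline as a string, so each stage is easy to see:

```python
def build_pipeline(device="/dev/video0", width=1920, height=1080,
                   fps=60, out="/tmp/test.avi"):
    """Assemble the gst-launch-1.0 argument string.

    avimux is assumed to be available so the raw H.264 bytestream
    from x264enc ends up inside a real AVI container.
    """
    caps = (f"video/x-raw,width={width},height={height},"
            f"framerate={fps}/1,format=RGB")
    elements = [
        f"v4l2src device={device}",
        caps,                      # fix the capture format up front
        "videoconvert",            # RGB -> whatever x264enc accepts
        "x264enc",                 # software H.264 encoder
        "avimux",                  # mux the bytestream into AVI
        f"filesink location={out}",
    ]
    return " ! ".join(elements)

print(build_pipeline())
```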

But now the question: how can I use hardware-accelerated encoding, like the things described there? Two years ago you wrote a pipeline like this:

gst-launch-1.0 v4l2src device=/dev/video0  ! amlvdec ! amlvsink sync=false

Its output (with errors) looks like this:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.001708664
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

If I use it as I wrote above (the standard GStreamer pipeline):

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,framerate=60/1,format=RGB ! videoconvert  ! amlvdec ! amlvsink sync=false ! filesink location=/tmp/test.avi

I get warnings and nothing useful, like:

WARNING: erroneous pipeline: could not link videoconvert0 to amlvdec0

or

WARNING: erroneous pipeline: could not link v4l2src0 to amlvdec0, amlvdec0 can't handle caps video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)60/1, format=(string)RGB

And can you please tell me which hardware GStreamer (above) uses to encode?

Best regards,
Maxim

Hardware encoding with GStreamer may not work well; please use the hardware encoder library.

@numbqq

I tried to use that library on a raw *.rgb video file and it works well. But it's not exactly what I am looking for.

Could you please provide some examples or thoughts on how I can use that library to encode the MIPI camera stream?

Best regards,
Maxim

You can grab raw RGB frames from the MIPI camera and feed them to the encoder library.
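Whatever delivers the frames (V4L2, OpenCV, a pipe), the feed step itself is just concatenating tightly packed RGB bytes, width × height × 3 per frame. A minimal sketch of that packing, assuming the encoder demo accepts one raw file of back-to-back frames:

```python
import os
import tempfile

BYTES_PER_PIXEL = 3  # packed RGB, 8 bits per channel

def write_raw_rgb(frames, path, width, height):
    """Concatenate frames (bytes objects) into one raw .rgb file.

    Each frame must be exactly width*height*3 bytes — the layout a
    raw-RGB encoder source file is assumed to use here.
    """
    expected = width * height * BYTES_PER_PIXEL
    with open(path, "wb") as f:
        for i, frame in enumerate(frames):
            if len(frame) != expected:
                raise ValueError(f"frame {i}: got {len(frame)} bytes, "
                                 f"expected {expected}")
            f.write(frame)
    return os.path.getsize(path)

# Tiny synthetic example: two all-black 4x2 frames.
path = os.path.join(tempfile.gettempdir(), "two_frames.rgb")
size = write_raw_rgb([bytes(4 * 2 * 3)] * 2, path, 4, 2)
print(size)  # 48
```

The size check matters in practice: a stride-padded or BGR-ordered buffer is the usual reason a raw file encodes into garbage.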

@numbqq @Frank
So I should save the frames to the device first, and then encode them?
I think that will take too much time.

Or how can I grab frames directly?

h264EncoderDemo 'v4l2src device=/dev/video0' /tmp/encoded.mp4 1920 1080 10 30 8000000 200 3

raises an error:

v4l2src device=/dev/video0
/tmp/encoded.mp4
src_url is: v4l2src device=/dev/video0 ;
out_url is: /tmp/encoded.mp4 ;
width   is: 1920 ;
height  is: 1080 ;
gop     is: 10 ;
frmrate is: 30 ;
bitrate is: 8000000 ;
frm_num is: 200 ;
open src file error!
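The "open src file error!" suggests the demo simply opens its first argument as a file path, so a GStreamer pipeline description cannot work there; the frames have to land in a raw file (or equivalent) first. A sketch of what the invocation would look like once they do — the argument order mirrors the command above, the file path is illustrative, and the meaning of the trailing `3` is taken on faith from that command:

```python
import shlex

def encoder_cmd(src, out, width=1920, height=1080, gop=10,
                framerate=30, bitrate=8_000_000, frames=200, last_arg=3):
    """Build the h264EncoderDemo argv list.

    src must be an existing raw-RGB file, not a capture-pipeline
    description; the demo's own log shows it parses positionally.
    """
    args = ["h264EncoderDemo", src, out, width, height, gop,
            framerate, bitrate, frames, last_arg]
    return [str(a) for a in args]

cmd = encoder_cmd("/tmp/frames.rgb", "/tmp/encoded.mp4")
print(shlex.join(cmd))
```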

Best regards,
Maxim

The MIPI camera uses the standard V4L2 driver, so you can use the V4L2 API to grab the frames; here is an example.

https://docs.khadas.com/linux/vim3/MIPICamera.html#Use-MIPI-Camera-via-opencv
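Following that page's OpenCV route, the capture side could look roughly like the sketch below. The device path and OpenCV availability are assumptions, and `capture_settings` only builds the property set, so the intent is visible even without the camera attached:

```python
def capture_settings(width=1920, height=1080, fps=60):
    """Properties to apply to a cv2.VideoCapture before reading.

    Keys mirror cv2.CAP_PROP_* constant names so the set can be
    inspected without importing cv2.
    """
    return {"CAP_PROP_FRAME_WIDTH": width,
            "CAP_PROP_FRAME_HEIGHT": height,
            "CAP_PROP_FPS": fps}

def grab_frames(device="/dev/video0", count=200, **kw):
    """Read `count` frames; requires OpenCV and the camera present."""
    import cv2  # deferred so this module loads without OpenCV installed
    cap = cv2.VideoCapture(device, cv2.CAP_V4L2)
    for name, value in capture_settings(**kw).items():
        cap.set(getattr(cv2, name), value)
    frames = []
    while len(frames) < count:
        ok, frame = cap.read()
        if not ok:
            break  # device gone or format not negotiated
        frames.append(frame)
    cap.release()
    return frames
```

Note that OpenCV hands back BGR buffers, so a `cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)` pass is needed before feeding an encoder that expects RGB.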


Hello,

Are there no plans to support hardware encoding with a mainline kernel?
Regards,

@numbqq @Frank

Could you please provide an exact example?
I can't grab frames using the V4L2 driver - it raises errors.

Best regards,
Maxim