On 09/03/2026 11:28, Andrew Randrianasulu wrote:
You need to set the remapping manually (for a single-user build) in

bin/ffmpeg/decode.opts

like

remap_video_decoder mpeg2video=mpeg2_qsv

and add similar lines if you want to test other formats (hevc, h264).

Thank you for the quick response.
After editing:

~/Applications/squashfs-root/usr/bin/ffmpeg # cat decode.opts
# apply at init decode
loglevel=fatal
formatprobesize=5000000
scan_all_pmts=1
remap_video_decoder libaom-av1=libdav1d
remap_video_decoder mpeg2video=mpeg2_qsv

A compilation of my test results follows:

mpeg2 transcoded to h264_qsv.mp4 

mpeg2 transcoded to hevc_qsv.mp4 

mpeg2 transcoded to av1_qsv.mp4 

As seen, the CinGG fps results are much slower than my system ffmpeg results.
Is there any way to verify that decoding is done on the GPU, as with ffmpeg?
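One way I know of to check this from outside the application (this is a general suggestion, not something CinGG reports itself) is to watch the GPU engine load with intel_gpu_top from the intel-gpu-tools package while the transcode runs:

```shell
# Watch the Intel GPU engines while CinGG (or ffmpeg) is transcoding.
# If the "Video" engine shows significant busy time, decoding/encoding
# is running on the GPU; a busy CPU with an idle Video engine means
# software decoding.  Requires intel-gpu-tools and usually root.
sudo intel_gpu_top
```

With plain ffmpeg you can also raise the log level (e.g. "-v verbose") and look for qsv-related messages during decoder initialization.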



On Sun, Mar 8, 2026 at 11:46 PM Terje J. Hanssen via Cin
<cin@lists.cinelerra-gg.org> wrote:
I have tested transcoding mpeg2video to h264_qsv, hevc_qsv, vp9_qsv and
av1_qsv with HW GPU-accelerated decoding+encoding in ffmpeg.
Just by adding "-hwaccel qsv" before the input file, the correct qsv
codec for GPU decoding is apparently auto-detected.

ffmpeg -hide_banner -hwaccel qsv -i hdv09_04.m2t -c:v hevc_qsv -y
hdv09_04_ffmpeg_hevc_qsv.mp4

With a moderately powerful (balanced) Intel Alder Lake (i7-12700KF) / DG2
(A750), the qsv.mp4 video output and playback are clean and faster with
GPU decoding.
For the legacy, limited Skylake (h264_qsv) and Coffee Lake (hevc_qsv)
iGPUs, it is best to keep decoding on the CPU, due to iGPU buffer errors
and some distortions in playback.


So I wonder if it is manageable to test-implement an additional environment
variable "CIN_HW_DEV=qsv" for GPU HW decoding that adds "-hwaccel qsv"
on the input side?
If it works, it may be added as a "Use HW Device: qsv" next.
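Since CinGG decodes through the FFmpeg libraries rather than the CLI, the equivalent of "-hwaccel qsv" would be attaching a QSV hardware device context to the decoder. A minimal sketch of the idea, assuming the proposed CIN_HW_DEV variable name and assuming the decoder setup code has access to the AVCodecContext before avcodec_open2() (the function name and call site are hypothetical; only the libavutil/libavcodec calls are real API):

```c
/* Hypothetical sketch: honor a CIN_HW_DEV=qsv environment variable by
 * attaching a QSV hardware device context to the decoder.  Where this
 * would hook into CinGG's decoder setup is an assumption. */
#include <stdlib.h>
#include <string.h>
#include <libavcodec/avcodec.h>
#include <libavutil/hwcontext.h>

static int maybe_attach_hw_device(AVCodecContext *ctx)
{
    const char *dev = getenv("CIN_HW_DEV");
    if (!dev || strcmp(dev, "qsv") != 0)
        return 0;                         /* no HW decoding requested */

    AVBufferRef *hw_ctx = NULL;
    int ret = av_hwdevice_ctx_create(&hw_ctx, AV_HWDEVICE_TYPE_QSV,
                                     NULL, NULL, 0);
    if (ret < 0)
        return ret;                       /* fall back to CPU decoding */

    /* The decoder takes its own reference to the device context. */
    ctx->hw_device_ctx = av_buffer_ref(hw_ctx);
    av_buffer_unref(&hw_ctx);
    return 0;
}
```

If av_hwdevice_ctx_create() fails (no QSV-capable device), the decoder would simply continue with software decoding, which matches the fallback behavior one would want for legacy iGPUs.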

Terje J. H
_______________________________________________
Cin mailing list -- cin@lists.cinelerra-gg.org
To unsubscribe send an email to cin-leave@lists.cinelerra-gg.org