You need to set the remapping manually (for a single-user build) in bin/ffmpeg/decode.opts, with a line like remap_video_decoder mpeg2video=mpeg2_qsv, and add other similar lines if you want to test other formats (hevc, h264).

On Sun, Mar 8, 2026 at 11:46 PM Terje J. Hanssen via Cin <[email protected]> wrote:
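For reference, the extra decode.opts lines for the other formats would presumably follow the same pattern (I am inferring the hevc/h264 lines from the mpeg2video example above; only that one is confirmed here):

```
remap_video_decoder mpeg2video=mpeg2_qsv
remap_video_decoder h264=h264_qsv
remap_video_decoder hevc=hevc_qsv
```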
I have tested transcoding mpeg2video to h264_qsv, hevc_qsv, vp9_qsv and av1_qsv with HW GPU-accelerated decoding+encoding in ffmpeg. Just by adding "-hwaccel qsv" before the input file, the correct qsv decoder for the GPU is apparently auto-detected.
ffmpeg -hide_banner -hwaccel qsv -i hdv09_04.m2t -c:v hevc_qsv -y hdv09_04_ffmpeg_hevc_qsv.mp4
With a moderately powerful (balanced) Intel Alder Lake (i7-12700KF)/DG2 (A750), the qsv.mp4 video output and playback are clean and faster with GPU decoding. For the legacy, limited Skylake (h264_qsv) and Coffee Lake (hevc_qsv) iGPUs, it is best to keep decoding on the CPU, due to iGPU buffer errors and some distortion during playback.
So I wonder if it is manageable to test-implement an additional environment variable "CIN_HW_DEV=qsv" for GPU HW decoding, which would add "-hwaccel qsv" on the input side? If it works, it could be added as a "Use HW Device: qsv" option next.
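To make the idea concrete, here is a minimal sketch (not actual CinGG code; the function name and argument layout are mine) of how the decode-side command line could gain "-hwaccel qsv" only when the proposed variable is set:

```shell
# Hypothetical sketch: compose an ffmpeg command line, inserting
# "-hwaccel qsv" before the input only when CIN_HW_DEV=qsv is set.
build_ffmpeg_cmd() {
    # $1 = input file, $2 = video encoder, $3 = output file
    hwaccel=""
    [ "$CIN_HW_DEV" = "qsv" ] && hwaccel="-hwaccel qsv"
    echo "ffmpeg -hide_banner $hwaccel -i $1 -c:v $2 -y $3"
}
```

With CIN_HW_DEV=qsv set, the generated command matches the working invocation above; unset, it falls back to plain CPU decoding.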
Terje J. H
_______________________________________________
Cin mailing list -- [email protected]
To unsubscribe send an email to [email protected]