
> Switching to hardware decoding is likely to slow down your pipeline.

If that's your experience, then something is off with your setup.

Hardware decoding on a modern computer might seem just as fast as software decoding until you look at power usage, where hardware decode is a clear win. Even so, the CPU resources needed to software-decode modern high-end codecs like H.265 or AV1 are substantial and hard to miss.

But anyway, the topic here is both encoding and decoding.

And video encoding goes from something like 3 fps in software to hundreds of fps in hardware, which is what makes it feasible at all at any resolution and quality worth speaking of.

You need encoding to stream your video, like in a video meeting.



Maybe I am doing it wrong, but I tried the following commands to test it on h264/h265 and 1080p/4k sources:

  Soft: ffmpeg -i {src}  -map 0:v -f null - 
  Hard: ffmpeg -hwaccel cuda -i {src}  -map 0:v -f null - 
The results I get are:

  Source       Hard  Soft
  1080p-h264  25.1x 37.1x
  1080p-h265  26.4x 36.5x
  4k-h264      5.5x  7.1x
  4k-h265      5.5x  6.5x
This is on an old CPU (i7-6700K) and a somewhat recent GPU (RTX 2060), with ffmpeg 6.


Why are you using cuda? CUDA is for general-purpose GPU computing, which is not the same as using the dedicated hardware encode/decode engines.

See the ffmpeg docs for the hardware decoder name; IIRC it's nvdec for NVIDIA cards.
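For reference, the benchmark above could be rerun with the dedicated decoder selected explicitly (flag names per the ffmpeg HWAccelIntro docs; {src} is the same placeholder as in the parent comment):

  ffmpeg -hwaccel nvdec -i {src} -map 0:v -f null -

This still writes to the null muxer, so only decode throughput is measured.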


I am pretty sure that does not activate the hardware encoder. https://trac.ffmpeg.org/wiki/HWAccelIntro


We are talking about decoding here. You meant decoder? I can certainly see the hardware decoder being busy in task manager.
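On Linux you can check the same thing outside Task Manager; assuming an NVIDIA card with recent drivers, nvidia-smi can report per-engine utilization:

  nvidia-smi dmon -s u

The dec column should spike during hardware decoding and stay near zero when decoding in software.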


I think you should use nvdec instead of cuda. The GPU itself is not a hardware decoder; the dedicated decoder unit inside it is.


I barely know what I'm talking about, but in my experience, to use software decoding I had to explicitly enable it with "-allow_sw". That may only have been necessary because of videotoolbox, though.


CUDA is not a normal hardware decoding API. Look at VA-API and VDPAU instead.
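For completeness, a VA-API version of the same decode test might look like this on Linux (the render node path /dev/dri/renderD128 is a common default but varies by system):

  ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -i {src} -map 0:v -f null -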


What type of "setup" are you talking about? Servers or home use?

I've been working with video transcoding/broadcasts a lot, and software decoding was still worth it in a large number of cases, mostly because CPUs these days (Threadrippers & co) can handle significantly more concurrent encodes than the HW decoders can.

HW decoders are built to play video on your PC so you can watch a movie; they usually don't support all that many concurrent streams and aren't all that fast (they "just" need to be realtime, after all). That's amazing on playback devices (pretty much mandatory for H.265/AV1), but for "2U racks at Amazon" it's not very useful, and high-core-count CPUs are still king. Especially since software encoders are still winning massively on visual quality per second per MB of video.

(Why am I talking about servers? Because this thread has started with AWS 2U video racks, not Apple TV boxes.)



