So yes, for 720p low-bitrate stuff it doesn't make much of a difference, but as others said, watch some demanding content in 1080p or, better yet, 4K, and your CPU will definitely break a sweat and the fans will spin up.
Now that I think of it, given your claim that it doesn't even register as a blip in Task Manager, I'd actually argue you're probably using hardware decoding without noticing, because I can't think of a player (especially on Windows) that wouldn't make use of it.
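One quick way to check, assuming ffmpeg is installed (the file name is just a placeholder): decode the same file with and without hardware acceleration and compare the times that `-benchmark` reports.

```shell
# Pure software decode; -f null - discards the output, -benchmark prints timings.
ffmpeg -benchmark -i input.mkv -f null -
# -hwaccel auto picks whatever the system offers (e.g. d3d11va on Windows).
ffmpeg -benchmark -hwaccel auto -i input.mkv -f null -
```

On Windows you can also just watch the "Video Decode" engine graph on Task Manager's GPU tab while playing; if it's busy, you're hardware decoding.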
The decoding horsepower required increases sharply with resolution and frame rate. 720p30 doesn't even cause my fan to spin up, but 1080p60 does, and 4K30/60 drops frames.
Just to be clear: 4K is 9 times the pixels of 720p, so roughly 9 times the load at the same frame rate, and 4K60 versus 720p30 is 18 times.
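Those ratios come straight from pixel rate (a back-of-envelope sketch; real decode cost also depends on codec, bitrate, and bit depth):

```shell
# Pixels per frame: 4K (3840x2160) vs 720p (1280x720)
echo $(( 3840*2160 / (1280*720) ))        # → 9
# Pixels per second: 4K60 vs 720p30
echo $(( 3840*2160*60 / (1280*720*30) ))  # → 18
```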
My setup is Media Player Classic Home Cinema, splitting and decoding via LAVFilters, then fed through ffdshow for some filters, before finally rendering on Enhanced Video Renderer.
Nowhere in the pipeline is the GPU involved as far as decoding is concerned, it's all deliberately software on the CPU.
Generally 8-bit or 10-bit h.264, occasionally h.265, at either 1280x720 or 1920x1080 progressive, with a frame rate of usually 23.976. Split and decoded in software via LAVFilters, then run through ffdshow before rendering on Enhanced Video Renderer.
CPU: anything from an i7-14700K down to an i3-2100 (yes, Sandy Bridge). Seriously, decoding video hasn't been a significant workload for over a decade.
>4K h.265 / AV1 video on a couple of years old laptop dual-core CPU.
Kindly, why the hell would I even watch 4K video on a laptop? Y'all keep throwing out contrived situations like that; meanwhile I'll be a sane man living in reality and re-encode it in my spare time (in software, for finer tuning of parameters) down to 1080p or 720p, saving myself precious disk space and CPU usage while travelling.
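For anyone curious what that workflow looks like, something along these lines would do it (file names and the CRF/preset values are illustrative, not a recommendation; tune to taste):

```shell
# Software re-encode of a 4K source down to 1080p h.264 with x264.
# scale=-2:1080 keeps the aspect ratio and forces an even width;
# -crf 20 / -preset slow are example quality/speed trade-offs.
ffmpeg -i input_4k.mkv -vf scale=-2:1080 \
       -c:v libx264 -crf 20 -preset slow \
       -c:a copy output_1080p.mkv
```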
> Kindly, why the hell would I even watch 4K video on a laptop?
Because that's the file I have on hand. Why on earth would I re-encode it if I want to watch it once?
I also connect my laptop to my 32" 4K screen. There the battery life is not a consideration, but the spinning fan is.
> in my spare time down to 1080p or 720p so I save myself precious disk space and CPU usage while travelling.
Your use case might work for you, and that's fine, but you're claiming that software decoding is universally not a problem. It is, unless you limit yourself to 720p h.264 pre-encoded at home. Most people are not fine with having to do that and restricting themselves to low resolutions / bad image quality.
> Kindly, why the hell would I even watch 4K video on a laptop
How about if you have a 4K monitor plugged in? Or if your notebook display is itself 4K (which is a completely valid configuration nowadays)?
I have a pretty beefy laptop with an RTX 3080. I regularly watch BDRips that exceed 50 Mbps, and software decoding even on my 8-core Intel Xeon causes some stutters. Hardware decoding is just so much faster.
mpv (definitely the best player for advanced users) does not use hardware decoding by default. I'll simply quote `man mpv`:
> Hardware decoding is not enabled by default, to keep the out-of-the-box configuration as reliable as possible. However, when using modern hardware, hardware video decoding should work correctly, offering reduced CPU usage, and possibly lower power consumption. On older systems, it may be necessary to use hardware decoding due to insufficient CPU resources; and even on modern systems, sufficiently complex content (eg: 4K60 AV1) may require it.
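So enabling it is opt-in. A minimal sketch (the flag is real; the file name is hypothetical):

```shell
# Ask mpv to use hardware decoding, but only via methods it considers
# safe; it silently falls back to software if none of them work.
mpv --hwdec=auto-safe video.mkv
```

The same can be made permanent with `hwdec=auto-safe` in `mpv.conf`.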