Wait, didn't we figure out HD video playback like 15 years ago?

in hardware •  4 years ago  (edited)


When you bought your computer many years ago, it was able to play HD video just fine.
All those (I'm sure perfectly legal) copies of 2008's The Dark Knight and 2009's James Cameron's Avatar looked perfectly crisp and played perfectly smoothly.
On YouTube, you could always select the highest quality settings, first in the Flash player and later in the HTML5 one.

But now you go on YouTube, select the lowest HD resolution (by YouTube's definition), and it buffers every few seconds (or throws an error, or even crashes the browser altogether).

Okay, maybe that was just an issue with YouTube. So you download that TikTok video someone sent you (which isn't even Full HD). You try to play it, the fans ramp up, the heat rises, the computer behaves as if you were trying to run Cyberpunk 2077 in 8K, and yet the video still feels laggy.
Surely something must be wrong! Maybe the computer is broken?

Because sure, games demanded more graphical sacrifices as the years went by. 1920x1080 at ultra settings quickly became unachievable in the latest titles. Eventually, even the lowest resolutions and settings became problematic.
But those are games - they need to be rendered by the computer, so it needs to be powerful enough to keep up.
This is just a video. Intel has shipped integrated graphics since well before the Core 2 Duo era (branded GMA back then; the "Intel HD Graphics" name only arrived in 2010). And since around 2008, hardware-accelerated video decoding has been built into Intel's integrated graphics chips.
Is there any logical explanation behind this?

Yes, there is - and it's not (or perhaps not only 😜) planned obsolescence.
The videos you watch, whether online or offline, are always compressed to save storage space and/or internet bandwidth.
Advancements in codecs (the software that compresses and decompresses video) enabled the same or sometimes even better video quality at smaller file sizes. Unfortunately, there's no such thing as a free lunch:


Just like with most methods of data compression, the better the compression (i.e. the smaller the file compared to the original), the more work it takes to decompress the data. Put another way: more powerful hardware is required to decompress it just as fast.
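You can see this size-versus-CPU tradeoff with any general-purpose compressor. Here's a small sketch using Python's built-in zlib (a text compressor, not a video codec - and note that with zlib the extra effort mostly costs the *compressor*, whereas modern video codecs also make the *decoder* work harder, which is exactly the problem for old machines):

```python
import time
import zlib

# Highly repetitive sample data compresses well, much like many video frames.
data = b"the quick brown fox jumps over the lazy dog " * 50_000

for level in (1, 6, 9):  # zlib effort levels: 1 = fastest, 9 = smallest output
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(compressed):>7} bytes, {elapsed * 1000:.1f} ms")

# Whatever the level, decompression must reproduce the original exactly.
assert zlib.decompress(zlib.compress(data, 9)) == data
```

Higher levels spend noticeably more CPU time to squeeze out a smaller file - the same deal video codec designers make, just on a far bigger scale.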

Just a side note - video file formats (containers) are not the same as video codecs. The MP4 container has been around since the early 2000s, well before the first iPhone in 2007.
But an MP4 file can hold video compressed with many different codecs.
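To make the container-vs-codec split concrete: an MP4 file is just a sequence of typed "boxes" (4-byte big-endian size, then a 4-byte type tag), and the codec identifier sits deep inside one of them. Here's a minimal sketch in Python that walks top-level boxes; the `ftyp` bytes below are hand-built for illustration, not a playable file:

```python
import struct

def parse_boxes(data):
    """List the top-level MP4 boxes in data as (type, offset, size) tuples."""
    boxes = []
    offset = 0
    while offset + 8 <= len(data):
        # Each box starts with a 4-byte big-endian size and a 4-byte type tag.
        size, box_type = struct.unpack_from(">I4s", data, offset)
        boxes.append((box_type.decode("ascii"), offset, size))
        if size < 8:  # malformed box; stop rather than loop forever
            break
        offset += size
    return boxes

# A hand-built 20-byte 'ftyp' box declaring an MP4-family file.
ftyp = struct.pack(">I4s4sI4s", 20, b"ftyp", b"isom", 0, b"mp42")
print(parse_boxes(ftyp))  # [('ftyp', 0, 20)]
```

A real player would keep descending into the `moov` box until it reaches the sample description (`stsd`), which is where it finally learns whether the video inside is H.264, VP9, AV1, or something else - and therefore how hard decoding will be.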

So does it mean that the old computers are doomed and no one will enjoy HD videos on them ever again? Not necessarily.
Check out this article to learn how to improve video playback performance on older machines:
https://steemit.com/hardware/@hwtrendsetter/problems-with-hd-video-playback-don-t-worry-there-is-an-easy-way-to-make-your-movie-watching-experience-great-again
