The Unwatchability of Yesterday's Videos Today and the Promise of Tomorrow
A topic I end up discussing with friends and thinking about regularly is how videos and games we used to like a decade or two ago looked completely fine back then and are unwatchable now. I don’t mean this in the more common sense of shifting tastes and values, but in the sense of almost literally barely seeing what’s going on. Videos in 360p and even lower (how is 144p even still an option on YouTube?) seemed completely fine to me, but now they are not and I can’t even make out the details. Games that looked like huge graphical improvements at the time, giving us a taste of what realism might soon look like, now look absolutely awful, to the point where I can’t tell what’s going on with certain objects in some scenes (if I can even recognize them). One might say “Well, you are spoiled by much better graphics” and sure, it does seem like it, but what does that entail? Why did I use to think I could see things I now can’t?
My TV was definitely worse quality than what I have now (if we pretend I even have one), but I am not sure it was quite as low as 240p. Reading various links suggests that 720p and above were already available in the 90s/early 2000s, that at least ~480p seems to have been somewhat more common, and that, at minimum, movies at the cinema had a decent effective resolution by then, even accounting for viewing distance. Yet YouTube only offered 320x240 between 2005 and 2008, which felt mostly fine then and is nearly unwatchable now?
There is a popular Evolution of Dance video from 2006 which I remember, and in my memory it looked pretty okay. I just re-watched it and sure, I can still tell what’s going on, but I can, for example, barely see the dancer’s face. Indeed, it isn’t that bad if you watch a single clip; however, when I tried to watch an older sketch show for more than a couple of episodes, the quality made it actively unpleasant. The effect also definitely exists in games that seemed to look pretty good to me - e.g. Command and Conquer, which we will get back to later, or the early Tomb Raider games for a 3D example.
One argument is that old videos look worse because of our massive screens today. Checking on my phone, there might be some truth to that; however, I am far from convinced that it explains the whole effect.
Refresh Rate Comparison
People getting used to better quality and finding it hard to go back is not uncommon, of course - I hear it quite often today in regard to higher refresh rate displays. You get comments like this claiming that “You don’t realize how ‘laggy’ 60hz is until you go to 144. Now I can never go back.” or “now I HATE my monitors at work”, which I find interesting, as the ‘never go back’ sentiment is very common. In fact, if you try to look into 144 Hz monitors, these comments are nearly ever-present. This seems more prominent among people who mainly use higher refresh rates, and less so among those who, for example, use a lower refresh rate on one screen and a higher one on another. That suggests to me that near-constant exposure to better quality ‘ruins’ low quality much more easily than being exposed to both.
“Hallucination” Theory
An argument which I like is that we didn’t know much better but were used to those low resolutions and found it easy to fill in the details. The brain already does a lot of that kind of thing all the time, so it doesn’t seem so unreasonable. Additionally, most of us now rarely watch anything in low-res, and our regular experience with HD makes us much more aware when a given thing is in bad quality, so we wouldn’t normally benefit from that particular extrapolation. I wouldn’t be surprised if this is partially what’s happening, in addition to a more general “we just didn’t know better” explanation. People with Anton syndrome can fill in everything and believe they are seeing while being completely blind - what is a bit of upscaling and ‘hallucinating’ extra details after you have spent your whole life looking at low-quality videos and games on your computer or TV?
Regardless of whether the above explanation is correct, where does this leave us? It seems clear that we could enjoy these lower-visual-quality works before, but a lot of us can’t do so quite as easily today. Should we consider re-training ourselves to enjoy them? That seems a bit tedious. Is this just a (small) price we have to pay for progress and call it a day? Maybe, but it seems like progress might still have us covered. There have been a lot of advancements in AI Super-Resolution, and we are getting to the point where it can be used well in some cases; things like games and old videos that aren’t in too bad a quality can benefit a lot from it.
AI Super Resolution
Of course, it would be irresponsible not to mention that upscaling and enhancement have their issues. It is impossible to accurately increase the quality of an image without guessing at what should be in the extra pixels. This is typically done by showing the model a lot of examples and having it learn the most common ways to upscale accurately - but if you are upscaling something too different from what’s in the training dataset, there can be problems. This is part of the reason why fully automated upscaling without a human in the loop can be unwise for serious tasks beyond entertainment.
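To make that “showing a lot of examples” recipe a bit more concrete, here is a minimal sketch in PyTorch - a toy network and training loop I am making up purely for illustration, not any particular production architecture. The idea is the standard one: synthesize low-resolution inputs by downscaling real high-resolution images, then train the network to reverse that degradation.

```python
# A toy sketch of how super-resolution models are typically trained:
# create (low-res, high-res) pairs by downscaling real images and teach
# a network to undo the downscaling.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySRNet(nn.Module):
    """A toy 2x upscaler: a few conv layers on top of bicubic interpolation."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, low_res):
        # Start from a plain bicubic upscale, then let the network add the
        # "guessed" detail that interpolation alone cannot recover.
        upscaled = F.interpolate(low_res, scale_factor=2, mode="bicubic",
                                 align_corners=False)
        return upscaled + self.body(upscaled)

model = TinySRNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def training_step(high_res_batch):
    """high_res_batch: (N, 3, H, W) tensor of ground-truth images."""
    # Synthesize the low-res input by downscaling the ground truth.
    low_res = F.interpolate(high_res_batch, scale_factor=0.5, mode="bicubic",
                            align_corners=False)
    prediction = model(low_res)
    loss = F.l1_loss(prediction, high_res_batch)  # how far off the guess is
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# e.g. loss = training_step(torch.rand(4, 3, 128, 128))
```

Everything the network adds on top of the plain bicubic upscale is a learned guess, which is exactly why inputs that look nothing like the training data can go wrong in surprising ways.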
Games, in particular, have benefited a lot from AI-based upscaling - for example, Nvidia’s DLSS, which is used more and more, or Facebook announcing their own (near) real-time Neural Supersampling that uses additional data available in game engines. Even more relevant, the 1995 Command and Conquer, which looked pretty good to me as a kid and awful to me as an adult, has recently been remastered, largely using AI, to great success! This includes not just the playable portion of the game but also the videos and sounds in it. As far as I can tell, it has been met with overwhelming approval, and the game is currently on Steam’s top-seller list. Of course, this is far from the first remastered game - in fact, the practice is getting more popular - but it uses AI for its upscaling to a larger extent than those before it. That, and the general trend of reworking old games, paints an optimistic picture where in the near future we might get most old games brought back to life with significant visual improvements and less and less work for the developers.
What about videos in general? This video, originally from 1896, recently made the rounds when it was upscaled to 4K at 60 fps, and I have to say - it looks a lot better to me.
There are many other examples, although currently, more often than not, you still need a human guiding the process to make sure you end up with a version that doesn’t have too many unpleasant artefacts. Additionally, the process is still very compute-heavy and far from real-time. Either way, as algorithms get better, datasets grow larger, and compute gets cheaper, it is likely that in a few years we’ll be able to upscale any old video on demand, or at minimum that most older content which is at least somewhat popular will eventually get the super-resolution treatment. Things like browser extensions and apps that do it for you automatically and in real time might soon become common for those who want them.
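As a rough illustration of what do-it-yourself upscaling already looks like in code, here is a minimal sketch using OpenCV’s dnn_superres module. The module and its calls are real, but the file names are placeholders; you need opencv-contrib-python installed plus a separately downloaded pretrained model file (EDSR_x4.pb is one of the models OpenCV’s own examples use).

```python
# A minimal sketch of do-it-yourself upscaling with a pretrained model.
# Assumes opencv-contrib-python is installed and a pretrained model file
# (e.g. EDSR_x4.pb) has been downloaded; image file names are placeholders.
import cv2

sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("EDSR_x4.pb")  # path to the downloaded pretrained model
sr.setModel("edsr", 4)      # algorithm name and scale must match the model file

low_res = cv2.imread("old_frame.png")  # e.g. a frame extracted from an old video
ai_upscaled = sr.upsample(low_res)     # learned 4x upscale

# A plain bicubic upscale of the same frame, handy for a side-by-side comparison.
bicubic = cv2.resize(low_res, None, fx=4, fy=4, interpolation=cv2.INTER_CUBIC)

cv2.imwrite("old_frame_ai_x4.png", ai_upscaled)
cv2.imwrite("old_frame_bicubic_x4.png", bicubic)
```

Doing this frame by frame for a whole video is exactly where the compute-heavy, far-from-real-time caveat bites, but for a handful of frames it is an easy way to see what current models can and cannot recover.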
For the time being, you can pay someone to upscale a video for you (e.g. neural.love, who made that 1896 train video), you can register for various services that deoldify photos, or you can play with models directly in Colab or on your own machine. The future looks bright, and with the right model, so will the past.