I’m blaming television and monitor marketers for the current obsession with screen sharpness. Partial blame goes to the people marketing ever-advancing home video formats. Sharper image! Better colour! Higher resolution! HDMI connectivity! It’s understandable that consumers would end up wanting the best picture and sound from their home media, whatever it may be. This makes sense with film and music, as the original recordings usually were in a better format than what you could have at home. 35mm film is, by any measure, superior to VHS or DVD, and if we’re completely honest, to any digital format we currently have. We can’t really apply digital-age measurements to what is an analogue format, much like how we can’t apply digital screens’ resolution to CRT screens. The technologies and their measuring systems are not compatible with each other.
Which brings us to the current era of digital technology, and how easily we disregard the technological divide. The way we see old media nowadays is probably completely wrong. The striving for ever-better visuals and sound has effectively trampled the intended way of seeing something in favour of whatever has become possible, and in many ways this has served as a marketing slogan at times.
Star Wars, much like most other movies, was intended to be seen on the big screen. If you haven’t seen the movie in a theatre, “you haven’t seen it all”. Then the inverse should be true as well: if something was meant to be seen on the small screen, in our case a 4:3 television screen, then we really haven’t truly seen it as intended. For example, nowadays we enjoy Star Trek at least at what we could call DVD quality, and that probably is not the way it was ever intended to be viewed, digitally remastered or not. The show may have been recorded on film, but everything from set designs to costumes, and their colours, was designed and made to be shown on 1960s television. Most often the television set was black and white, with the picture quality further degraded by the received signal. The farther away you were from the city, the worse the signal would get; if you had a rotator antenna, you had the best quality. Interference from planes and trucks would be a factor. The screen quality would vary widely depending on what sort of TV set people had, and also on how well people fine-tuned the channel. That’s how Star Trek was expected to be seen, and that’s how people watched it.
With advancing technology, we would end up seeing more of what was on the film, which in many places led to the unintended result of seeing the (literal) seams of the sets and costumes. It becomes easy to ridicule these as cheap sets and costumes, but in the case of shows like Star Trek, that’s part and parcel of low-budget television. With home releases on VHS, Laserdisc and later digital media, we saw the show at a resolution and in a manner like never before. What decades-older technology used to hide was now in plain sight, and people would laugh at it. However, put the same media back into its proper timeframe and technology, and things look a whole lot different.
An issue with the DVDs and digital remasters is that they still showcase the “original” in much higher fidelity than it originally aired in
We should not forget the change in culture as well. Television was new at the time, and image quality didn’t mean nearly as much as it does now. There was no prior generation of people who had grown up with worse picture quality or the like. When television was new, the picture didn’t really matter; it was what it was, and you worked with it. What mattered was the content and the novelty of it. Shows like Star Trek were something new and exciting, and seeing this more cerebral television show about humanity out among the stars, told in a hopeful manner, captivated people in the long run. Nowadays, with the proliferation of science fiction shows and dozens upon dozens of derivatives, it’s very easy to put the original series down both in terms of its content and its delivery.
Old television has the benefit of having a pure analogue source format in film. The images and sounds are recorded on pieces of film and tape; they are not set in stone and can be remastered relatively easily according to modern digital standards. It’s work-intensive for sure, and probably requires tons of extra work if you wish to clean up every single thing, and sometimes you have to use sections of film from multiple different prints of the same movie, but it can be done.
I recommend watching, or at least listening to, the whole three-hour video on this particular fan’s own restoration. It covers pretty much everything from how certain elements were layered in the original movie to how he uses multiple sources to restore parts of an individual frame to get the best possible version of a shot
This is not possible for video games or any other purely digital media format. The moment a game developer, or any other creator of digital content, defines the way their work is seen or heard, it will be stuck in that moment. They can future-proof their work by saving everything in much higher fidelity than is currently possible to output, e.g. a digital movie recorded in 4K in an era when 1080p was the standard, but at some point the technology will catch up with them. 35mm films are nowadays progressively being ruined by noise-removal algorithms and smoothing; in a manner, the same has been done to video games. The difference is that video games and their consumers have a completely different paradigm, one that has, in effect, skewed the idea of how raster graphics should be seen.
Composite – RGB – Emulator screenshot. The emulator screencap has also cut away the overscan area, which would not be seen on a real CRT screen but would be visible on a flatscreen. See more in this video, where the first two were nabbed from.
The above three screenshots, while usable for comparing the different signal qualities coming from the machine itself and how things look in emulation, aren’t how Sonic the Hedgehog was intended to look. As we are now, sitting in front of our computers or using some palm device to read and see these shots, we are not seeing that middle step of output. The end result of a console, or any other device for that matter, that was using a CRT screen is lost to us. The image we get from emulators, digital re-releases of games and whatnot on our modern screens is inaccurate to how the game was developed and meant to be seen.
However, we can surmise some things from the above three screenshots. For example, Sonic is much bluer in the composite shot, with the shading and the greens melding into each other in a natural manner. The further we go to the right, the sharper the image gets, but at the same time we lose the smooth surfaces and this melding of colours. We can also see a slight shift in the aspect ratio. It wasn’t uncommon for games to have oval circles that got stretched into proper circles due to how the console output the signal or how a monitor might naturally stretch it, but props to the emulator shot for correcting the aspect ratio.
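To give a rough idea of what that correction amounts to, here’s a minimal sketch in Python. It assumes the whole 320×224 Mega Drive frame fills a 4:3 screen, which ignores overscan and the exact NTSC timing, and the filenames are just placeholders.

```python
# Minimal sketch: resample a raw 320x224 Mega Drive framebuffer grab so it has
# the 4:3 proportions a CRT would have given it. The filename is a placeholder.
from PIL import Image

DISPLAY_ASPECT = 4 / 3                      # assume the frame fills a 4:3 screen

img = Image.open("sonic_raw_320x224.png")   # 320x224, H40 mode
corrected_w = round(img.height * DISPLAY_ASPECT)   # 224 * 4/3, about 299 "square" pixels
corrected = img.resize((corrected_w, img.height), Image.LANCZOS)
corrected.save("sonic_4to3.png")
```

In practice you’d scale everything up by an integer factor first so no detail gets thrown away, but the point stands: the raw framebuffer and the displayed picture simply don’t share the same shape.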
Dithering is an often-discussed topic when it comes to Mega Drive visuals, as many Mega Drive games use dithering to smooth out colours. You would alternate two colours in a dither pattern, which would blend together on a CRT and produce a third colour, melding them all into a nice gradient. However, this blending doesn’t happen with higher-end cables, which show the dithering in a much more distinct and crisp way, destroying the carefully laid graphics. Retro-Sanctuary has a short write-up on dithering I would warmly recommend giving a look.
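If you want to see the colour arithmetic behind that melding, here’s a little toy simulation. The three-pixel horizontal blur is only a crude stand-in for the composite signal’s limited bandwidth, not a measurement of it, and the two blues are made up for the example.

```python
# Toy illustration of why a checkerboard dither of two colours reads as a third
# colour over composite: neighbouring pixels bleed into each other horizontally.
import numpy as np

dark = np.array([0, 0, 128], dtype=float)    # made-up "dark blue"
light = np.array([0, 0, 255], dtype=float)   # made-up "light blue"

# An 8x8 checkerboard of the two colours, i.e. one dithered surface.
checker = np.indices((8, 8)).sum(axis=0) % 2
surface = np.where(checker[..., None] == 0, dark, light)

# "RGB cable": every pixel is still exactly one of the two source colours.
print(np.unique(surface.reshape(-1, 3), axis=0))

# "Composite": a three-tap horizontal blur stands in for the signal's bandwidth.
blurred = (np.roll(surface, 1, axis=1) + surface + np.roll(surface, -1, axis=1)) / 3
print(blurred[4, 4])    # lands between the two blues, i.e. the "third" colour
```

A real composite signal does far more than a box blur, of course, but even this crude version shows where the in-between shade comes from, and why a clean RGB signal never produces it.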
Yuji Naka uploaded a short clip from 1990 showcasing the room where games were being developed, where we see a young Naka working on Sonic the Hedgehog‘s collision. You also get a shot of Michael Jackson’s Moonwalker being developed, particularly Michael’s walking cycle. These games were developed on and for CRT screens. It wasn’t until the seventh generation of consoles that games began to be fully developed for digital screens. Most, if not all, sixth-generation games that used sprite graphics were developed with CRT monitors and non-digital cables in mind. Now, what if we took a photo of that same Sonic title screen on an actual high-end CRT monitor and compared it to an emulated screen?
Sonic the Hedgehog (1991, Sega) – Genesis
Sharp Pixels vs. Genesis Composite via Sony PVM-20L2MD
I can’t believe I forgot my man’s 30th birthday! I’ve even had this post ready for months! Here’s to 30 more years! pic.twitter.com/lH8HVwOU72
CRT Pixels is an account that posts these comparison shots between emulators and CRT screens. There are tons of image comparisons that showcase how dot graphics, sprites or pixel graphics, whatever you want to call them, were designed and drawn with CRT monitors in mind. When an already existing artwork was digitised, the person in charge of the digitisation had to take into account how the image would be represented on screen. It could never have been a 1:1 transfer of data from a painting to pixels due to the sheer nature of the technology of the era. Because a machine could output an image that was intended to be stretched naturally on a CRT, sometimes the graphics had to be squished in one direction so that they’d look proper when output. This happens a lot with Super Nintendo games, which has led to some heated discussions about whether or not its games have to be stretched to the proper aspect ratio, or whether the console’s internal aspect ratio and resolution is the real one. The real answer, however, is that it varies game by game, as some titles relied on the SNES’ internal resolution while other developers created their graphics with the output devices in mind.
Of course, arcade game developers and manufacturers had the freedom to decide on these things on their own. Capcom’s CP System uses a 4:3 aspect ratio across the board, but you probably see loads of emulator screenshots in a 12:7 aspect ratio. This is because, before digital screens, we had non-square pixels. This is also one of the reasons we can’t apply modern screen resolution standards, which count pixels across width and height, to an era that had no pixels per se, and even when it did, they were non-square. Displaced Gamer has a good video on the topic in a much better package than I could manage. Though I might add that it didn’t help that we had some widescreen-format CRTs as well, and people always wanting to fill the screen never helped the matter. Something that persists to this day, as so many emulation enthusiasts force their old games’ ROMs into a widescreen format.
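The arithmetic itself is simple enough to sketch. Assuming the CP System’s full 384×224 frame maps onto the monitor’s 4:3 picture, the pixel aspect ratio falls straight out of the two ratios:

```python
# Rough pixel-aspect-ratio arithmetic for a 384x224 CPS frame on a 4:3 monitor.
from fractions import Fraction

storage = Fraction(384, 224)     # raw frame ratio, the 12:7 you see in screenshots
display = Fraction(4, 3)         # what the arcade monitor actually shows
par = display / storage          # pixel aspect ratio

print(storage, display, par)     # 12/7 4/3 7/9 -> each pixel is drawn narrower than it is tall
```

Showing that raw 12:7 frame on a square-pixel screen is, in other words, roughly a 29% horizontal stretch over what the cabinet’s monitor displayed.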
We are fast losing the way games, and many other forms of media, were intended to be consumed. Emulation and game preservation have made immense strides in preserving video and computer games’ data, and have begun to replicate consoles’ and computers’ internal workings 1:1, something that will probably be impossible to do fully with the PlayStation 2. Yet the scene has largely ignored the intended way these games were meant to be seen. No, that’s not exactly correct. For years we’ve had dozens of different ways to mess with emulators’ output. We’ve had tons of different filters that add fake scanlines or smooth the emulated pixels for effect, often trying to mimic how a game would’ve looked on a CRT screen. Different renderers try to replicate the originally intended form, some to better effect, some mangling the image to a horrible degree. However, consoles like the Game Boy Advance don’t really need these sorts of post-processing effects, since the display itself already had square pixels. Hell, sometimes looking at sharp pixels can mangle a sprite to the point where you don’t know what the hell you’re supposed to be seeing, while the softer rendition via post-processing filters or a proper CRT screen reveals shapes and shades in the sprite you can’t see otherwise.
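As a rough illustration of what those filters are doing, here’s a toy scanline pass, with the scale factor and darkening amount picked out of thin air rather than taken from any real emulator shader. The input filename is a placeholder.

```python
# Toy scanline filter: nearest-neighbour upscale, then dim every third output
# row to fake the dark gaps between a CRT's scanlines.
import numpy as np
from PIL import Image

SCALE = 3            # illustration value
SCANLINE_DIM = 0.6   # illustration value

frame = np.asarray(Image.open("frame_240p.png").convert("RGB"), dtype=float)

big = frame.repeat(SCALE, axis=0).repeat(SCALE, axis=1)   # blocky upscale
big[SCALE - 1::SCALE, :, :] *= SCANLINE_DIM               # darken one row per source line

Image.fromarray(big.clip(0, 255).astype(np.uint8)).save("frame_scanlines.png")
```

Proper CRT shaders go much further, curving the image, blooming bright areas and simulating the phosphor mask, but they’re all variations on the same idea of putting back what a flat digital output leaves out.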
A paper describing a method to depixelise pixel art is probably slightly off the intended path. This post-processing method doesn’t take into account how the graphics were meant to be seen, but rather ends up re-creating an interpretation of pixel graphics in a smoother form. The end result is less than desirable, but in a manner you could also consider this kind of approach an attempt to recreate the original underlying artwork that was then used to make the sprites. It is not, however, how the games’ graphics were meant to be seen.
Post-processing will probably end up being the way the issue of how old games are represented gets solved in the future. Perhaps we simply need screens of high enough resolution to properly portray non-square pixels and the colours a CRT can show. In essence, rather than emulating just the hardware, emulators would have to take into account the cable quality and how CRTs output the picture. Granted, tons of emulators already do this, but not by default. Most often you still get a modern interpretation of square pixels at the internal resolution when you open an emulator, forcing people to dig into the settings menu, where they face tons of options they might not know what to do with. While we are getting copy systems that emulate hardware to a tee, they are also machines made to have HDMI output only. Clone consoles like the RetroN and all the Analogue consoles, like the NT Mini, only output in modern HD via HDMI. Sure, you have in-system post-processing to make the games look like they’re played on a CRT. That’s where things break down, really.
A Hi-DEF NES modification kit
Console modifications have been around for as long as consoles have been a thing, with RGB output and mods to circumvent region-locking being the most popular. Nowadays, we have custom-made boards that you solder into your older console to have it output via an HDMI cable. They’re often connected directly to the CPU and video unit, so the board interprets whatever the console wants to output and tweaks it so the image is compatible with modern screens. Much like their copy-console brethren, they have built-in filters. Nevertheless, both of them utterly destroy the intended manner of viewing games on these older systems. The image might be crisper and sharper, with perfect colours from the palette. That may be preferable to some people, and it certainly makes these old consoles compatible with modern screens, but the intended way these games were meant to be seen is still lost.
The issue may end up being about authenticity. Modders and a certain segment of electronics consumers don’t really want to let go of these old machines and will do everything to bring them up to modern standards. That is a losing battle in many ways, and perhaps the approach is wrong too. While we can change some of the inner components, like leaking capacitors and the like, we can’t really restore old technology per se. Perhaps rather than trying to find a way to emulate the CRT screen, we should find a way to replicate that particular screen technology. However, considering how dead CRT technology is, I doubt anyone will go out of their way to revive it. I’m sure that if CRT tech had kept advancing, the bulk and weight would have dropped, but the flatscreen tech we have now is superior in most aspects. It may still struggle to replicate the same range of colours and the true blacks that even a cheap CRT could manage, but its utility beats CRTs in every other respect.
I guess we can’t return to the way games were intended and assumed to be played and seen. Much like how we didn’t have any other options for playing the games “back in the day,” the same kind of applies to what we have now. The difference is that, for all the options we have nowadays, from line doublers to upscalers and such, the crude reality is that your older consoles were not meant to be played on modern monitors, let alone emulated in crisp, in-hardware pixel-perfect output. These older games were played on a piece of shit telly, and that’s what they were built for.
Of course, some Australian cunts would probably tell you there’s only one way to properly play the game, e.g. using the SNES’ internal resolution and not giving one flying fuck about intentions. Consumers have created options for themselves, and only relatively recently have game companies woken up to what emulator filters have been doing for far longer. Filters themselves need to be completely re-evaluated, as there used to be rather heated discussions between the people who wanted those raw pixels and the people who used all sorts of filters. Of course, neither party was absolutely correct, though if you managed to attach your PC to a CRT screen via an S-Video cable or something, then there was no need to use filters at all.
In the future, as the world proceeds with digitalisation, we will lose the intended method of viewing games, and the rest of the media that was created by analogue means and meant to be seen that way. With time, we’ll either lose them altogether, or, most probably, they will be replaced with the closest possible approximation. No amount of remaking, remastering or modding can save old media. All we can really do is preserve and repair them in order to keep things in their original form as much as possible. At least in gaming, emulation will always be the second-best option to the original thing, and to some, emulation is already superior to the original hardware. That of course is not playing or seeing games as intended, but that has never been a factor to many. What matters to many is a sharper image at a higher resolution, even if that effectively destroys the carefully balanced image the developers put all their effort into creating.