Technology, consoles and computers

One way to tell a console gamer from a computer gamer is how they approach their respective platforms. A console gamer approaches the games first and foremost; the platform is simply what he plays them on, and it's the games themselves he concentrates on. A computer gamer, however, concentrates on the specs, citing everything from the RAM of his gaming rig to the maximum resolution his monitor can output. Essentially, the software versus the hardware.

Because computer and console games have mixed so much, the approaches between the two have become just as muddled. Computer games used to drive their users stupidly insane more often than not, as they had to keep updating their rigs to play the latest Ultima or Privateer. Not only did this demand money, but dedication most of all. Life was easier for console gamers, as all they had to worry about were release dates and whether or not they had the money to purchase a game, or, in the case of games like Zelda II, whether they managed to nab a copy for themselves.

Console games are tailor-made to run on a console, using the best capabilities it offers, from the controllers to whatever the hardware can do. It is far easier to produce a computer game, as the platform places very few limitations on what you can do with it outside of the whole keyboard-and-mouse control scheme. Whether or not you prefer those over a controller is a matter of opinion, even though one can make a proper argument for the tactility of controllers and their form-fitting design.

Recently SONY admitted that the PlayStation 4 was not the technological higher end it was meant to be, but a 'high-end console' is something of an oxymoron. Consoles have never been at the technological high end. Because of the pace of technological development and the fixed nature of consoles, the moment any game console is released it holds outdated technology that can't be upgraded. However, SONY is absolutely right that the PlayStation 4 not being top of the line has little to do with game quality. Over and over again we have seen consoles with lesser specs beating their more powerful competitors. No, the Super Nintendo is not an exception to this: the Mega Drive ended up with the 32X and the Sega CD, putting it on a higher end than Nintendo's reverb-filled console.

Lately, we've been hearing a lot of roaring from computer gamers about 4k displays. Your normal user doesn't care about them at this point, nor has 4k displaced the other HD displays at this point in time. Display technology advances at such a high rate that buying a high-end television once every twenty or thirty years should serve you just fine, unless you're a huge tech fanatic.

On the computer side, 4k is another piece of hardware they love to discuss, debating how to make the best possible use of it. For a console gamer the issue is not relevant, at least not yet, because 4k sets are not common in households. Companies have released numerous versions of their sets offering 4k support, but as with other things, there's very little reason to put money into a 4k television now when very little content supports it. Much like with almost every console launch, the first years of any new technology see little adoption before a gradual shift either makes it a household standard or something new comes along and beats it.

The change from VHS to DVD is an example of a rather rapid change in both industry and household standards. SONY wished to replicate that success with the Blu-Ray format, but the jump from DVD to Blu-Ray has been far slower. Even with DVD, the early players were rather low in quality, and some of them simply can't play the latest discs because of technology differences; on top of that, most early DVDs were low-quality VHS transfers. There are instances where the Laserdisc edition was superior until a proper digital remaster came along, or in a few rare cases, a Blu-Ray release.

There is also the issue of worldwide markets. Cultural values regarding technology vary massively, and not all regions simply accept new technology as the best. That said, the opposite applies as well. All we can really do is individually wager whether or not it is worth purchasing potential rather than proven practicality. Early adopters always purchase for potential, and there are times when they simply bet on the wrong horse, much like with Betamax or HD-DVD.

4k displays may be the latest in tech, and we all know that in a year or two we may be hearing about something that makes 4k a moot point. Actually, there already exist numerous resolution standards higher than consumer 4k, like the DCI standard and 8k FUHD, and new ones will come along. Skipping a technological step isn't anything new, and the majority of consumers simply skip the things they don't consider worthwhile purchases. I can assure you that while 4k displays, objectively speaking, seem to offer a better visual experience, there are those who simply don't care and are fully content using their current sets for a variety of reasons.
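To put those standards in perspective, here's a minimal sketch comparing raw pixel counts (the dimensions are the published standard resolutions; the comparison itself is just illustrative arithmetic):

    # Pixel-count comparison of the display standards mentioned above.
    standards = {
        "Full HD (1080p)": (1920, 1080),
        "Ultra HD 4k": (3840, 2160),
        "DCI 4k": (4096, 2160),
        "FUHD 8k": (7680, 4320),
    }
    base_w, base_h = standards["Full HD (1080p)"]
    base_pixels = base_w * base_h
    for name, (w, h) in standards.items():
        pixels = w * h
        print(f"{name}: {w}x{h} = {pixels:,} pixels "
              f"({pixels / base_pixels:.1f}x Full HD)")

A 4k set pushes roughly four times the pixels of Full HD, and 8k sixteen times, which is exactly why the hardware race keeps moving.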

All that said, when will we see actual use for 4k display sets outside video games? NHK has announced Super Hi-Vision broadcasts for Japan in 2016, and Eutelsat already operates a dedicated Ultra HD channel. 2013 and 2014 have been the years when Ultra HD made its impact mainly within the industry; the end consumer hasn't really seen anything worth purchasing beyond the potential. Then again, we had people announcing that 3D would be the future, and that boom ended up being a simple whimper.

Early adopters purchase for the possibilities a product offers, whereas most other consumers base their purchases on what is already offered. The computer gamer would most likely put his money into a 4k display in order to keep himself atop the hardware race, whereas a console gamer won't need a 4k set before something worthwhile comes into play. Even then, when something does have 4k support, there's always the question of whether or not the content itself is worth the investment.