Updating the built-in obsolescence

Sometimes I come across news that just feels stupid. Logitech announced that they will shut down all services for their Harmony Link, essentially bricking the device with an update. Why? Well, a certificate for technology inside the lil’ smart remote is expiring. This of course caused rather serious backlash on the usual Internet forums, to which Logitech responded that they’ll replace the obsoleted devices with new ones.

This is, sadly, par for the course in the modern era. Licenses and certificates from every which way are being implemented in devices that are not meant to last. Devices are not designed to last at all, with some companies expecting you to replace your phone yearly. Apple, for example, optimises all their latest updates for their newest models, the old ones be damned, meaning the old hardware gets a sub-optimal OS update that slows things down and requires more numbers to be crunched. Apple even pulled back one of their iOS updates after release, as it made older devices inoperable through an inability to make phone calls or unresponsive fingerprint sensors.

Back in the day, obsolescence was designed into the product from the get-go. Some film companies even wanted VCRs to wipe tapes slightly each time they were played. This meant that after a certain number of viewings, the tape would be blank and the consumer would be forced to buy a new copy of the movie. Imagine if DVD or Blu-ray discs and their players had been built so that after a certain number of viewings, the player’s laser would burn a mark preventing any further playback. Apple’s products are full of planned obsolescence from hardware to software, with the customer being completely dependent on the company’s services when it comes to maintenance and repair.

While bricking updates are nothing new, they’ve become more and more common at a steady pace. It is no longer profitable to design and manufacture products that last. We have the technology to make phones and whatnot last a solid decade, but this would mean the companies wouldn’t get that steady stream of high yearly revenue. This may sound overly dramatic or even anti-corporate, but it’s more or less personal experience with numerous companies. The professionals I’ve discussed this with, people who have worked in different fields of production, from the cases to the software, have all said the same thing: it’s cheap. The outer shells cost barely anything to tool, the electronics are manufactured and fabricated at a very low price in countries that don’t care about certain legislation issues, and assembly is done in areas where pay is extremely low and nets are put up to prevent workers from committing suicide. Shipping per unit costs absolutely jack shit, and software development, done to push the latest features, is probably the second most costly bit after advertising. It is the name that drives the price up. Hell, the lack of an earphone jack and other physical features in modern phones is there to drive the production price down while the sales price is jacked up.

The only thing that ultimately costs is the brand. The iPhone X costs a thousand bucks to buy, and it has nothing to justify that price outside the Apple logo and branding. The profit margin is extraordinarily high. I won’t even try to calculate the production price, but a good guess would be that the production costs are a fraction of the final sales price. But hey, if people will pay for it, then that’s the rule of the market.

That veered a bit off topic, but it’s relevant. The core problem with update-driven obsolescence is that it will be everywhere. Smart homes are not all that common nowadays, but we will have more and more such devices in our homes, from freezers and microwaves to simple light switches. If any of these devices use similarly certified technology that has essentially been licensed from outside, they will face a kill-update. All these smart devices will contain programs and services, which the companies see as the main product. From a company’s point of view, they’re not really selling you an item, but the service the item enables. In this sense, the consumer is purchasing a long-lasting license to their service via this device. From the customer’s point of view, they’re paying for a device that enables a function, like smart device control with Logitech’s Harmony Link.

This disparity is clear in gaming as well, where companies and some consumers argue that nobody is purchasing anything anymore. Rather, you are subscribing to a service with a one-time payment. However, nobody can come to your home and disable your games. Unless you’re using Steam.

If we’re to believe this tight device cycle will stay for the foreseeable future, it will also cause another issue to build up. Apple alone is responsible for a huge pileup of e-waste, and if we count all the other electronics companies with a similar pace of new product introduction, we’re getting large quantities of products that will not last long. Africa probably feels the brunt of this, with tons of e-waste being dumped in Ghana’s landfills.

The first step to fight this cycle would be sustainable development and design. However, the core principle of sustainable design runs against most corporate interests, as it dictates that a product should be designed to last as long as possible. A phone that lasts a decade would simply not be as profitable as a phone that gets the shaft after two years.

Logitech’s response to the outcry over their kill-update isn’t any kind of solution. The Harmony Link will become obsolete not because the devices have broken, but because the company chooses to terminate their function. The action is not a solution, but a pathetic way to weasel out of the problem. This is not sustainable design.

I’m not an Earth-hugging hippie by any stretch of the imagination you may get from this post, but sustainable development and design are two key factors that need to become more relevant as time goes by. We’ve only got one Earth, and seeing as we’re not getting off this world any time soon, we should take better care of it.


Technology, consoles and computers

One way to tell the difference between a console gamer and a computer gamer is how they approach their respective platforms. A console gamer approaches the games first and foremost; the platform merely enables them. A computer gamer, however, concentrates on the specs, citing everything from the RAM of their gaming rig to the maximum resolution their monitor can output. Essentially, software versus hardware.

Because of the mixing of computer and console games, the approaches have become just as muddied. Computer games used to drive users stupidly insane, as they had to keep updating their rigs to play the latest Ultima or Privateer. Not only did this demand money, but dedication most of all. For console gamers life was easier, as all they had to worry about were release dates and whether or not they had the money to purchase the game, or in the case of games like Zelda II, whether they managed to nab a copy for themselves.

Console games are tailor-made to run on a console, using the best capabilities it offers, from the controllers to whatever the hardware can do. It is far easier to produce a computer game, as there are very few limitations on what you can do, outside the whole keyboard-and-mouse control scheme. Whether or not you prefer those over a controller is a matter of opinion, even though one can make a proper argument about the tactility of controllers and their form-fitting design.

Recently SONY admitted that the PlayStation 4 was not at the technological high end it was meant to be, but this is hardly a revelation. Consoles have never been at the technological high end. Because of the pace of technological development and the fixed nature of consoles, the moment any game console is released it holds outdated technology that can’t be upgraded. However, SONY is absolutely right that the PlayStation 4 not being top of the line has little to do with game quality. Over and over again we have seen consoles with lesser specs beating their more powerful competitors. No, the Super Nintendo is not an exception in this. The Mega Drive had the 32X and Sega CD in the end, putting it on the higher end than Nintendo’s reverb-filled console.

Lately, we’ve been hearing a lot of roaring from computer gamers about 4k displays. Your normal user doesn’t care about that at this point, nor has 4k displaced the other HD displays at this point in time. Display technology advances at such a high rate that buying a high-end television once every twenty or thirty years should serve you just fine, unless you’re a huge tech fanatic.

On the computer side, 4k is another hardware topic they love to discuss, debating how to make the best possible use of it. For a console gamer, the issue is not relevant, at least not yet, because 4k sets are not common in households. Companies have released numerous versions of their sets offering 4k support, but as with other things, there’s very little reason to put money into a 4k television now when there’s very little content that supports it. Much like with almost every console launch, the first years of any new technology see little adoption before a gradual shift either makes it a household standard or something new comes along and beats it. The change from VHS to DVD is an example of a rather rapid change in both industry and household standards. SONY wished to replicate that success with the Blu-ray format, but the jump from DVD to Blu-ray has been far slower. Even with DVD, the early players were rather low in quality, some of them simply can’t play the latest discs because of technology differences, and then you had the fact that most early DVDs were low-quality VHS-sourced transfers. There are instances where the Laserdisc edition was superior until a proper digital remaster came along, or in a few rare cases, a Blu-ray release.

There is also the issue of worldwide markets. Cultural values regarding technology vary massively, and not all regions simply accept new technology as the best. That said, the opposite applies as well. All we can really do is individually wager whether purchasing potential is worth more than proven practicality. Early adopters always purchase for potential, and there are times when they simply bet on the wrong horse, much like with Betamax or HD DVD.

4k displays may be the latest tech, and we all know that in a year or two we may be hearing about something that makes 4k a moot point. Actually, there already exist numerous standards with higher resolutions than consumer 4k, like the DCI standard and 8k FUHD, and new ones will come along. Skipping a technological step isn’t anything new, and the majority of consumers simply skip the things they don’t consider worthwhile purchases. I can assure you that while 4k displays, objectively speaking, seem to offer a better visual experience, there are those who simply don’t care and are fully content using their current sets for a variety of reasons.
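To put these standards in perspective, here’s a quick Python sketch comparing pixel counts. The resolutions are the commonly cited figures for each standard; exact specifications vary by source, so treat this as a rough comparison.

```python
# Commonly cited resolutions for each display standard.
resolutions = {
    "Full HD": (1920, 1080),
    "4k UHD": (3840, 2160),
    "DCI 4k": (4096, 2160),
    "8k FUHD": (7680, 4320),
}

full_hd_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    # Show how many Full HD screens' worth of pixels each standard pushes.
    print(f"{name}: {w}x{h} = {pixels:,} pixels ({pixels / full_hd_pixels:.1f}x Full HD)")
```

The jump from 4k to 8k is another fourfold increase in pixels, which is part of why each new standard demands so much more from both the hardware and the content pipeline.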

All that said, when will we see actual use for 4k display sets outside video games? NHK has announced Super Hi-Vision broadcasts for Japan in 2016. Eutelsat has a dedicated Ultra HD channel that is already operating. 2013 and 2014 have been the years when Ultra HD has made its impact, mainly within the industry. The end consumer hasn’t really seen anything worth purchasing outside the potential. Then again, we had people announcing that 3D would be the future, and that boom ended up as a simple whimper.

Early adopters purchase for the possibilities products offer, whereas most other consumers base their purchase on what is already offered. The computer gamer would most likely put his money into a 4k display in order to stay atop the hardware race, whereas a console gamer wouldn’t need a 4k set before something worthwhile came into play. Even then, when something does have 4k support, there’s always the question of whether the content itself is worth the investment.

Different take on customers: for the love of God, learn how to use it

Why does this program ask me this? What is this message that Windows is showing me? Why can’t my phone do this? Why can’t I tweak my Mac for better performance? Why is there a virus in my computer? Why won’t this computer work? These are questions that I’ve heard too many times, especially the last one.

The Self-Repair Manifesto is something I expect everybody to follow to a limited extent. The idea of “can’t fix it myself, can’t own it” is a bit extreme for the common folks out there, but its core is correct. While I agree that some things are beyond the repairs of a mortal man and better left to the fixing gods at your local shop, I truly expect people to know how their devices and household items work. A surprisingly small number of people know how their vacuum cleaner or microwave oven works, and that’s a bit alarming. In cooking, if you know how things work and what they do, cooking becomes both easier and more entertaining in its own right. Then again, cooking for one isn’t the most riveting thing to do. Trust me on this.

I recommend everybody open some of their devices, take a look at what they have inside, and familiarise themselves with it. See where the power switch is, what kind of chip is attached to it, what things are in the way and how they’re all connected. Using a reference guide for identifying certain parts helps a lot. For example, knowing what a capacitor is and what it does helps in the long run. If one blows up, you might want to learn how to solder in order to replace it and fix the device yourself. Soldering isn’t hard to learn, but just like everything else, it takes some training to get the idea and become good at it.

On the software side of computers, I can only blame the people who never wanted to know how their system works and just want to use it without anything getting in their way. Windows Vista’s infamous security prompt, which asked if you really wanted to do something, was a direct result of people not understanding what they were doing. If something is made foolproof, it seems that its utility is almost completely lost. In most cases this also prevents the user from making tweaks and adjustments to the device as they see fit. It’s pretty stupid that the more simplified systems get, the more text and hand-holding we get, which just pisses other users off.

Windows 8 is actually a good example of this. Where Microsoft wanted to go with Windows 8 was to make it more approachable for the common folk using tablets, but what they designed was one of the worst interfaces I’ve seen in a long time. It’s a horrible GUI (look it up), even for tablets. But no, certain groups within Microsoft thought it was the best idea to make everything simpler and easier to understand, which ended up with the version we have. Honestly, Windows 8 is horribly designed, especially for home PC use. It’s just awful to use, switching between two views with neither completely supported. Microsoft really dropped the ball here.

And you know why Microsoft thought Windows 8 was a good idea? Because there are a bunch of people who just don’t want to learn how to use the goddamn operating system. In other words, the customers are stupid enough NOT to want to get into what they’re using.

I’ve said that I’ve got nothing against Apple products, I just don’t like how closed they are. But for the love of the Quantum conductor, they are not any better than the competing products. You’re just too damn inept to learn what to do with them. Most Apple products, like the Macintosh computers, are a good example of a decent balance between open and closed systems; you really can’t change or tweak anything, but on the other hand everything works just fine most of the time. If something goes wrong, then you’re screwed and need to contact Apple’s services for help. “Oh, but with PCs everything just crashes all the time.” First, I hope you realise that Macs are PCs as well. Second, no they don’t, if you know what the hell you’re doing. No buts.

Things just get more closed and stupider the more customers refuse to learn what they need to in order to use different products. It’s insanely grating to think that we used to pop in a VHS cassette and press play. That worked. Now we pop in a DVD and I hear people asking how they can get to the movie. DVD menus clearly tell you what to do, and I still get a call every single week from certain people asking me how to get the proper subtitles. [Edit: What in the name of fucking god. Even I never did that.] It was so much easier with older media. Modern media, for better or worse, asks the user to get into what the hell they’re doing.

Then again, people still don’t know the universal markings for PLAY and PAUSE. For the love that 00-Unit has for us, please learn those at least. Standardised markings exist for a reason, and that reason is to make your daily life easier.

But no, when customers are dense motherfuckers who refuse to acknowledge that there’s something wrong with them, the shit hits the fan harder than a G-Bomb. And we’re supposed to design these people a product that would be easy to use. Existing products WOULD be easy to use if you would just read that one damn comprehensive manual and apply that knowledge to other similar products with a little effort, trial, and research. I will continue to develop and design better products for your use, but you need to meet me halfway and put some effort in as well. Otherwise don’t blame me when I design you a house that works like a 1984 police state and dictates everything you do and how you do it in order to ensure that things work as intended.

I hate that analogy. A product should be something we can all use as we want. Misusing a product or using it wrong is the customer’s fault and nobody else’s.

Then again, we have shitloads of free information on the Internet and in the libraries for people to use, and it feels like nobody is doing any goddamn research.

Wii U’s not looking good, SONY’s junk and Microsoft is…

… in pretty damn deep trouble, it seems. For some time now I’ve said that Microsoft should concentrate on their strengths on the PC and let the XBOX be a tertiary objective, or abandon it altogether. Because Microsoft has done the same thing SONY did, i.e. concentrating on the damn video games rather than on what they know how to do well, their nightmare might come true. Or rather, it might already be here in some form.

Let’s go point-by-point.
Are the pads eating the PC market? Yes, but only if we assume that the iPad and other smart devices are something other than PCs. Pads and smartphones are not some magical thing of their own; they’re as much PCs as laptops are. The proper way to put this would’ve been that non-Windows-based PCs are eating away at Microsoft’s share.

Are employees really converting away from Windows-based PCs? I’ve discussed this with people who work in government facilities, and they admit that there are iPads for certain purposes, but the majority of the work is still done on machines that have an actual keyboard. As such, while the iPad is certainly taking its place in the work environment, it’s way too early to say whether or not it will completely replace Windows. Perhaps a sort of paradigm shift is happening, where we are moving towards something like the LCARS devices from Star Trek.

I wouldn’t really oppose those, but there will be a lot of people who will never get used to the lack of tactile contact. There’s also the typing position, which most of the time is awkward if you’re using two hands.

Now, the third point is completely true, and Microsoft can only blame themselves. The GUI of Windows 8 has gotten a lot of hate for a reason. For one, it abandons a lot of functions that users expect from Windows. It doesn’t help that Microsoft has emphasised the touch-screen functionality a lot, and if you’re a home PC user this functionality becomes more or less completely useless. When I gave the retail version of Win8 a thorough testing, it was clunky, rather horrible, and it felt like everything was buried beneath something else. Microsoft should have looked back at what worked instead of removing whatever they liked. Will Windows 8 replace Windows 7 like all the other versions have done previously? It’s hard to say, but I’ve heard rumours that this won’t be the case in every institution.

Windows 8 is also the reason why loyal developers are moving away from Microsoft. Technically this point, and all the others up to point seven, can be crunched into one sentence that sums it all up: Microsoft dedicated so much of their attention to the XBOX and 360 that they forgot where their true business and strengths were. Because of this, Windows’ development has taken a hit, and the company has clearly lost sight of what the hell they’re doing. You can see that a lot of Windows’ properties have changed for the worse since XP, though there are truckloads of improvements as well. Still, we can all agree that something was never quite right with Vista or 7, and that as light and efficient as 8’s core might be, its functionality is pretty damn awful (unless you train religiously on it).

It’s no surprise that as Microsoft is losing on their main front, the XBOX series is also suffering. The XBOX doesn’t bring in any money and only spends whatever Microsoft made in the ’90s and early ’00s. There have been reports of Microsoft making a loss on every 360 sold, so if Microsoft were to lose their main pillar, i.e. their computer OS monopoly, the company would be in very deep trouble.

If Microsoft doesn’t get their business together, the worst possible scenario is that Windows loses its place in the work environment and the next XBOX bombs worse than the PS Vita. It’s their third time trying, and I really hope they get it right. However, the blowing winds tell me that this won’t be the case.

However, while we certainly can judge MS and SONY at this point, the earliest point we can say anything about the Wii U’s success is after the holiday season. I’d put that somewhere around February or March.

A matter of ease of use

You open the case, pop the cassette into the player, and press Play. It’s that easy to use a VCR. With disc formats it has always been a little more convoluted: you open the case, pop in the disc, navigate the menus, and then possibly you are able to press Play.

Convoluted is the keyword when it comes to design for use. You want to avoid it as much as possible. However, making things far too simple in certain fields also produces results that nobody really likes. VCRs are, in this regard, a shining example in concept, as they require minimal input from the customer to access the purchased content and get the most out of the player’s basic function. However, different players’ designs add loads of features, from multiple recording speeds to numerous other functions I’ve already forgotten about. Nevertheless, this kind of simplicity is always a good goal to aim for in any design.

DVD and BD are a slightly different matter. Personally I do not find them any worse to use than any other playback device, but I can’t help but think that something went wrong when their interfaces were designed, either on-disc or in the players.

While I have nothing against menus in disc formats, I do have a lot against their interface design in most cases. How many times have you found a disc with loads of stupid design choices, either visual or functional? Simplicity and ease of use are lost on some discs, and I have to applaud a lot of cheap releases for having the right idea of putting the budget into something other than damn menu screens. But no, almost all of the bigger releases have animated menus, menus with music, menus with selections that make no sense. I recall having a disc somewhere where the menu selections were allocated to the four corners of the screen, and it made no sense how the indicator moved between them. You’d expect it to move from the top-left corner to the right and then down, but I can assure you that it didn’t. It shouldn’t be that hard to make a simple intro screen, perhaps with a theme tune playing in the background, and a simple-to-use menu.
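For illustration, here’s a minimal Python sketch of the kind of directional navigation map a sensible four-corner menu would use. The menu items and layout here are hypothetical, not from any actual disc; the point is that focus should move in the direction the user presses, predictably.

```python
# Hypothetical four-corner disc menu: play (top-left), scenes (top-right),
# subtitles (bottom-left), extras (bottom-right). Each direction maps to
# the spatially nearest item, so the cursor never jumps unpredictably.
layout = {
    "play":      {"right": "scenes", "down": "subtitles"},
    "scenes":    {"left": "play",    "down": "extras"},
    "subtitles": {"up": "play",      "right": "extras"},
    "extras":    {"up": "scenes",    "left": "subtitles"},
}

def navigate(current, direction):
    # If there is no neighbour in that direction, the focus stays put.
    return layout[current].get(direction, current)

print(navigate("play", "right"))     # scenes
print(navigate("subtitles", "up"))   # play
print(navigate("play", "up"))        # play (no neighbour above, stays put)
```

The whole table is four dictionary entries; that a big-budget release can get this wrong while a bare-bones menu gets it right for free is exactly the point above.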

But simplicity also can cause problems. With disc and cassette players it’s easy to make them function with minimal amount of effort from the user, but something like a computer operating system is a different kind of beast.

Windows is arguably the best operating system out there; otherwise it wouldn’t be as widespread and abused as it is. However, with every new iteration new problems arise, as Windows is made more and more into something that everybody can use.

There used to be a time when using a computer required actual knowledge of how the hell you use a computer. I’m ashamed to admit that I’ve forgotten loads of the DOS commands that old computers needed, and I used to be able to code in Pascal quite well, but with these new machines such skills are unneeded. As Windows has moved further into the realm of ease of use, all the most complex functions have been moved behind the curtains of the operating system, and thus the user is mostly unable to access them. In older systems you were able to access these for your own pleasure, and thus change various settings to your liking.

I do see and value Apple’s aim with their computers, but making their machines completely closed isn’t really the best option. If anyone bothered to make a proper virus for Mac OS, pretty much every Apple machine out there would be under serious threat. History has shown that pirates and hackers put their effort into the systems and machines with the greatest value, which is one reason to take note of why neither the PS Vita nor the 3DS has been hacked yet; it’s not that it’s impossible, it’s just that there isn’t enough reward for doing so.

As a designer, my aim in most cases is to lessen the stress customers face in learning something new in order to properly use a device. However, sometimes it would be better for customers to take matters into their own hands and learn to use something more advanced than just the play button. I would really recommend everybody learn their computer’s OS better, what it does, what the different terms mean and all that, because then you are able to use your computer better and closer to its full potential. The same goes for every other device and system out there.

However, that doesn’t allow the designer to make convoluted shit.