Home’s Future

by Estim20, HSM Editor

July 15th, 1983. Two video game consoles were released onto the world in Japan: the Family Computer (or Famicom) and the SG-1000, by Nintendo and SEGA respectively. Prior to their arrival, Atari (and American consoles as a whole) dominated the market until it all came to an abrupt halt in 1983. The Japanese consoles helped bring the video game crash of 1983 to an end and saw a new wave of hardware born; the world entered the market’s third generation – and with it some classic examples of gaming that remain popular to the modern day. Japan held a vise grip upon video game sales for the next few generations, a grip that wouldn’t weaken until the sixth generation with the release of the original Xbox – laying the foundation for the modern console market.

It was the scene into which I was, almost literally, born.

There’s a sense of nostalgic warmth in reflecting upon one’s introduction to gaming, one tempered by a more realistic, pragmatic approach to the history of the industry. We all have earliest memories of first consoles, first games, and first triumphs over the likes of villains, bad controls and worse implementations of gaming ideas. If we were children when we started throwing controllers at the wall in unmitigated frustration, such experiences carved a niche in our brains where we feel warm and fuzzy with soft focus, reminiscing over what it was like to be a gamer ‘once upon a time.’

For me, it was the mid-to-late ‘80s when I broke into video gaming. July 15th, 1983 – the magic date for the third generation – found me hovering around a year old. In America, the NES debuted alongside SEGA’s Master System and the Atari 7800. American consoles fell by the wayside as Japan picked up the pieces, turning a generation of gamers towards a love affair with Japanese consoles and tropes – an affair that lasts even today, with a diverse fanbase extolling the virtues of games manufactured on the other side of the Pacific.

It isn’t difficult to see why, either, no matter where you stand on the matter. Atari became a blundering, anachronistic company in the eyes of many consumers; a perception cemented over the previous year or so thanks to its dubious business practices. It pioneered the concept of underselling consoles in the hopes of making up the difference in video game sales, a practice that worked leagues better when it dominated the market. It didn’t hold up as well when competitors entered the fray, especially when you considered you could get palpably similar experiences across all of their products. Q*bert, Pac-Man, Asteroids? Yeah, they all released on everyone’s systems – with barely enough differences upon which to base a judgment.

This left consumers with more choices than perhaps seemed reasonable. We’re used to a limited console market today, with a Big Three (Microsoft, Nintendo, Sony), each with its own console. When you’re required only to debate the quality of three consoles, you don’t have to juggle the specifications of an inordinate supply of technology – nor a gaming library rivaling the Library of Congress. The second generation of consoles, on the other hand, slapped us in the face with a console list three times as long as today’s, thanks to a ‘follow the leader’ mentality – the leader being Atari. A good chunk of them originated in the States, so when the market turned sour, the idea of an American-made console turned sour with it.

It didn’t help matters that Atari fell into churning out quick cash-in games, alongside releasing salaciously pandering content (which I will not go into). If you’re familiar with Sturgeon’s Law and willfully combine it with the concept of ‘licensed games,’ this is where it started.

We’re used to the concept of quick, cheaply-produced movie tie-in games where timeliness is an issue, though that’s just the tip of the iceberg. Quaker Oats produced games for the Atari, never mind the slew of other games from other companies, including Purina and Kool-Aid. Many of us also remember the notorious debacle known as the E.T. video game, millions of unsold and returned copies of which met their fate in a landfill. Quite often these were shoddy products, as we’ve now come to expect from cash-ins, but we only know that now thanks to history and a little invention called the Internet. Back then we had hopes and dreams combined with a faintly naïve attitude that quickly crumbled under the weight of the market’s increasingly poor quality control.

This wave of doom and gloom adversely affected the American market, to the point that Japanese competitors filled the void when the bottom eventually fell out. The European and Japanese markets met with different environments, for different reasons, but nonetheless history dealt a painful hand to American gamers for a brief period of time. 1983 became a turning point for the industry as a whole – one whose ramifications are still felt.

As a kid, I didn’t care about any of this, though. I also didn’t have the capacity to understand the situation at the time, being less than a year old when the Famicom debuted in Japan, but that’s neither here nor there. What survived the crash shaped what I understood about video games from then on.

I remember standard definition being the only definition available for the home market. You could have wrestled with the (comparatively nascent) personal computer market of the time if you wanted better, with its minimum requirements and peripherals such as keyboards and mice demanding dexterity and competence beyond an infant’s abilities. I did find interest in that as the ‘90s dawned, but in the meantime, I was captivated by the siren’s call of the NES and, later, the SNES.

These consoles epitomized ‘set up and play quickly’ when all one desired was the opportunity to send Mario to mushroom-related demises. They used the mono-sound, SD-compatible audio-video cables that we identified by the yellow and white prongs on the ends (later joined by red when stereo became a thing for televisions). ‘8-bit’ entered our vocabulary at this time, thanks to a shrewd marketing attempt by Nintendo. It knew the words ‘video game console’ filled our throats with bile, so it instead marketed the NES as a toy with R.O.B., a ‘robotic’ ‘buddy’ that worked with only two games. It operated with slightly more sophistication than Rock ‘Em Sock ‘Em Robots and it met with some scorn, but the Trojan Robot succeeded anyway – the NES was here in the States, bringing with it Nintendo’s increasing stable of family-friendly entertainment.

NES, in its grey glory.

(Looking back at it, Japanese influences became a huge factor in the ‘80s – at least more so than before. The various anime titles being released, the American animated shows being produced overseas, consoles produced in Japan – it’s no wonder Japanese media became huge in the 21st century. We’re seeing the fruits of that ‘80s influx of eastern media today.)

Anyway, it’s where my generation started getting addicted to electronic gaming – arcades and consoles became a popular pastime for the kids. My first memories center on a few important situations.

Socializing revolved significantly around either existing friends sitting on the same comfy couch, playing the same copy of Contra, or heading to the arcade and face-punching some kid’s choice of fighter in any of a myriad of fighting games. The concept of playing online did exist at the time (the NES like totally had a modem, ohmygod!), but this was back when connecting to the Internet meant tying up the phone line to hear a screeching machine struggle to stay online. PCs pierced our eardrums with this very thing, and it wasn’t always possible to play online for extended periods of time, depending on where you lived (and who was trying to call your house).

Online play wasn’t quite the ‘thing’ then that it is today. Many American households didn’t utilize an online feature, even into the fourth generation, when Sega Channel earned roughly 250,000 subscribers. However, those efforts were important for setting up the interest in online gaming as we know it today. We can likely owe its existence partially to the successes, however minor, of previous generations getting it done.

Which brings me to where this all ties together: gaming on a global scale. And towards that, quite a lot has happened between the 3rd and 8th generations.

3D, polygon-rendered gaming became a reality (or at least the norm) as we passed into the fifth generation, creating a demand for finer controls that let us look in any direction. RPGs secured a demand for story and immersion, beyond simply moving your character through an ‘excuse plot’ and a side-scrolling environment where you attacked opponents for the sake of attacking. Cinematic experiences became increasingly important, producing demand for a switch from cartridges to optical discs that offered more memory and (in the case of some lengthy games) adventures spanning multiple discs.

SD screens remained the standard until the sixth generation, when the PS2 and original Xbox offered limited HD support. HD became a massive industry move as widescreen HDTVs became cheaper and replaced smaller, SD-only screens. Online gaming joined the party, after PCs and early consoles pioneered the possibilities

Color-coded for your convenience.

– by the 8th generation, every current console has online support built in and, perhaps more importantly, expected of it.

Casual and mobile gaming have simply exploded since the days of the Game Boy, with numerous options for gamers on the go. We’re seeing more indie games break into the market, with varying degrees of success. We’re thus seeing an influx of new ideas and new faces cultivating a new wave of gaming.

So . . . here we are. What does this mean for Home?

Indie Gaming Goes First

Let’s discuss indie gaming for a second here.

Many of us can name a few indie games, such as Cave Story, Terraria, or Minecraft. Independent game developers have been on the rise since the 21st century dawned, and for good reason. Big publishers controlled the market prior to the mid-‘90s, forcing independent game developers to get creative about distributing their games. Think about it for a second.

I remember it being rather difficult to get anything independently made on the NES and SNES, at least in the sense that I don’t recall very many games that qualified. Wisdom Tree comes to mind – those of you familiar with it may remember its line of Christian-themed games. Nintendo’s comparatively draconian attitude about who could produce games for its systems came after Atari’s infamous issues with adult games, which gave the impression it was far too easy to produce games for a console.

On the PC side, you found the likes of shareware and demoware. The likes of Commander Keen and Duke Nukem (before his fall from lack-of-grace with Duke Nukem Forever) offered their first episodes for free and required payment for the remaining episodes (usually three in all). Hobbyists dominated the era otherwise, paralleling the early days of computer history. “Bedroom programmers” became a popular term to describe these independent developers.

Basically, up until about the release of Cave Story, indie games were either not a thing (at least as far as the modern perception is concerned) or limited to hobbyists, thanks in part to the litany of game development engines that didn’t support one another’s games. Fragmented communities cropped up, fostering an unintended isolation. People have been producing their own games for as long as the industry has existed, but ‘indie development’ didn’t seem commercially viable; only the largest companies could take a risk and take matters global. When you rode on your own wallet’s reliability, you depended on major corporations to stand a chance – and that wasn’t guaranteed when they wrote the rules.

To think this was 20 years ago.

Ever since 2005 at minimum, with the launch of the seventh generation of consoles, we’ve seen a vast increase in spotlighting independent efforts, with the likes of XBLA, Steam, and our own PSN Store offering a place for indie developers. There may not be a sense of uniformity quite yet, but there are a few facets that tie modern indie developers together.

Mainly you have two competing schools of thought: should indie games compete with mainstream gaming by embracing its conventions, or should they focus their energies on pushing the envelope and experimenting? That’s assuming we don’t take a third option here and recognize the value of both.

To that end: quite a lot of my generation’s childhood (Nintendo’s early days on up to today) was dominated by the largest companies of the day basically running the show. Indie games, while available, were not the tour de force we’re seeing them become today. Companies (Nintendo being a prime example) clamped down on how easily one could produce for their systems. At least that was the perception – indie games either didn’t exist for most practical purposes, or if they did (see Wisdom Tree), quality wasn’t assured, which led to at least Nintendo being a bigger stick in the mud the following generation. That ease of production was partially what led to Atari’s efforts flatlining in 1983.

So historically we had two options, at least until recent generations broke some new ground (for consoles, at least). The Atari Method killed off Atari in its original form, with attempts to revive the company meeting limited success. It paved the way for Nintendo’s rise to dominance, which eventually helped Sony create the PlayStation line – including the best-selling console to date, the PlayStation 2. So while the method has its benefits, it has its drawbacks. Everything as we know it will probably die off if we follow Atari’s example.

Option 2: the ‘console wars’ option. If you grew up in the ‘80s and ‘90s, you know how this ensued. Nintendo and Sega concocted a console war against each other, back when Sega actually produced consoles, with fans of both arguing their choice’s superiority. While a good chunk of it was marketing buzz, and nostalgia does exaggerate matters in our minds, it nonetheless stayed for a good while. Nintendo soon faced competition from Sony, which raised its fair share of ‘my side is better’ arguments – some of which may still be held today.

Cave Story – the progenitor of the modern indie game.

Option 2 is good from a marketing perspective – if you can convince an entire generation of gamers to stir sales by validating their purchases through brand loyalty, who knows how much demand you can manufacture. On the other hand, it isn’t quite the perfect method, of course. Gamers do grow up, and while some psychological idiosyncrasies continue, we eventually learn to recognize attempts to create drama and console warfare. We also eventually (ideally) learn to live with the choices we make and not care nearly as much about how others feel.

With that in mind, we may have a third option opening up for consoles, something that originated on the PC: the Indie Gaming Route.

Home’s Future

Let’s take a look at what this looks like for Home.

Indie development works best when outside parties have an opportunity to produce something without a hassle. It also helps when they can maintain a sense of the ‘indie’ part of ‘indie games.’ Given that many indie developers are smaller teams, a publisher is one option, as are crowd-sourcing efforts and digital distribution. And given that they differ from hobbyists largely in being commercial-centric, sales are going to be crucial.

We already saw indie developers on Home – think of LOOT and Lockwood, especially with their new IPs outside of Home. Forsaken Planet and Avakin show how developers on Home can use their experience on the platform as leverage for something new, something of their own. With Home’s future in a haze, as far as many users are concerned, it may even be necessary for Home developers to take a risk outside this little walled garden.

So here is the minimum of what I think needs to happen with Home to make it viable for the 8th generation and beyond: make Home 2.0. Do not port Home Beta over to the PS4 as is.

Why? Well, here are a few things to consider about the option:

  1. Home Beta’s technological spine is comparatively ancient. Even on the PS3, it was a concept mired by an engine stuck in the past generation. Remember how consoles are; the older the console, the greater the need for emulation. If the console is old enough, it may not even work on modern televisions. We already saw Sony excise PlayStation 2 disc support from the most recent PlayStation 3 models, requiring companies to take roundabout measures through digital downloads. You may lose some functionality, plus face the bugs and errors that may arise on top of the bugs and errors that already exist for a digital-only service. People have complained about disconnections and F-12s since Home began; we need something better powering it (and built specifically for the newest generation) to make it last in the long run and provide better stability.
  2. Home, Sony, and the PlayStation brand flourished on third-party support. The PlayStation blossomed due to a massive third-party library, whether we’re discussing its earliest days or the November release of the PlayStation 4. In fact, let’s remember something – one of the chief complaints about the PS4’s release library is that there isn’t much of one. Early adopters are going to hope you get what they want out the door so that they can justify their purchase. A list of companies willing to support your product inspires more confidence and provides a means of getting it off the ground. After all, more hands willing to release content means less stress on your part to fill a void.
  3. Home has yet to be ‘mainstream.’ This is a double-edged sword, and for good reason. On the one hand, we want it to be accepted by society as a viable outlet, much like how society eventually came around to a Lord of the Rings movie or the fact that Krull can exist. Until that happens, it may feel like a Sisyphean ordeal getting it there. On the other hand, indie developers may thrive in an environment where they don’t need to cope with what’s mainstream. The stress of adhering to ‘what’s popular’ simply doesn’t exist, at least not in the exact same way that it exists in the outside world. It allows indie developers to experiment and gain proper experience for if/when they decide to branch out – plus it gives them exposure from people looking for a back alley away from Call of Duty clones.
  4. It allows Sony to future-proof itself for the rising tide of indie games. We’ve already seen them breach the PSN Store, with the likes of Terraria and Minecraft, popular games from the PC side. With the increasing interest in independent development and the increasing ease of releasing games in such a manner, Sony would be remiss not to provide a platform for social networking. We already have the power of socializing in Home – updating it can provide indie developers a means of communicating with fans as well as getting started with Home as a foundation.
  5. As we’ve seen with Lockwood, LOOT, Juggernaut and others in the past generation, Home is a good outlet to build one’s technical chops as well as to network, there and back again. It can serve as a foundation for future endeavors, especially with the benefit of lacking the stress of conforming to outside expectations. And that brings me to . . .
  6. If there’s one thing that mires MMORPGs but doesn’t hurt Home (at least not in quite the same way), it’s the very gaming aspect that comes with them. Here’s what I mean: at least in my experience, when people have gaming skills to fall back upon, socialization is considered secondary, at least by a sizable group. Socialization skills are not as widely expected, even among otherwise decent individuals – just look at Final Fantasy XIV: A Realm Reborn or any of a litany of other MMOs where the game segment is the biggest draw. When you can’t be sure whether that random stranger is going to be cooperative or an S-Rank nozzle of a sort that’s too family-unfriendly to print here, you restrict yourself to either your friends or working solo. For an MMORPG, that can be incredibly dangerous to leave unchecked – and it’s baked into the fabric of MMO players across the board, so changing attitudes won’t happen overnight. Home, on the other hand, is showing us the value of stripping away the very gaming element common to everything outside Home – take the gaming from the gamer in favor of socialization and you force people to take better care of themselves socially. If handled well, it can foster greater cooperation and diminish the toxic bile that spews up and scares off everyone, user and developer alike. Plus it may carry over to other games – and it should be our collective motive to ensure it does.

As such, Home may already be the grandest experiment in Sony’s arsenal. We’re already used to PC counterparts pioneering the way, so not only do we have their experience to work with, we have an opportunity to fill a void for the burgeoning indie scene.

March 6th, 2014 | 2 comments


2 Responses to “Home’s Future”

  1. Gary160974 says:

    Some of that brought back memories lol. Most of the developers have worked on projects under other company names before Home or even the PS3 was released. LOOT’s parent company is Sony and, along with Juggernaut, it’s part of an exclusive few that have been formed to create items for Home. Heavy Water have done work on Xbox games and been around about 14 years. Lockwood changed their name in 2012 from Outso; Outso had done some work with Facebook apps. nDreams made Lewis Hamilton: Secret Life. VEEMEE make branded apps for the iPhone. Actually most of Home’s developers are digital marketing experts that make apps for big brands. So the developers are not running away from Home; they have never been exclusive to Home in the first place. Home is too slow, glitchy, broken, hacked to be anything useful to indie developers when there are much more stable platforms out there. Ultimately the need to make money is there, and none of the grand games that were released on Home ever made any money, and to be honest weren’t that good either. So they now lay unloved except by the few that still play them. You’re right, Home could have been so much more. But it seems what makes these social MMOs successful on PCs doesn’t translate to a PS3 very well.

  2. Danger_Dad says:

    ;^) I agree that a New Home would have to be re-written from the ground up. This would allow support to be built in from scratch for features that would be too problematic to retrofit into Home as it exists today. These new features could include increased security, making New Home more hack-resistant.

    Five years have shown Sony, et al., what the customers like and what they want. This hindsight would go a long way towards guiding Sony’s development of Home’s newer iteration.
