Gaming as an adult is quite different from when I was a teen. There was a post I saw on Reddit that contrasted three quantities: time, money, and energy. As a teen I had tons of time and energy, but not as much money as I would have liked. Now I have more money but far less time (apparently there's a stage when I'm older where I'll have time and money, but no energy). I have a lot more responsibilities and obligations that eat into the time I'd otherwise spend gaming. I have to maintain the cars and house, give attention to the wife, help the kids with homework, do household chores like cooking and dishes, spend more hours at work than I ever spent in school as a student, etc. I also have some new hobbies like Photoshop, new time wasters like YouTube/Reddit, and there are a lot of movies/shows that I still want to watch. On the other hand, there are other things that I don't do as much anymore - like scouts, church, drawing, and comics.
But I tend to be the kind of player who finds games I really like and plays them multiple times. So when I find something like the Mega Man series, I want to go back and replay it occasionally. But in the meantime a bunch of new stuff has come out. I feel this way about movies as well; I think I could be stranded on a desert island with electricity and my current game/movie collections and happily replay/rewatch them for years without anything new and not get bored. While I like finding new stuff to play, I feel like I have to abandon or neglect a lot of the older stuff I liked in order to do so.
Games themselves have evolved quite a bit, and not just in the graphics/sound departments. The original business model was arcade machines that ate quarters, not too different from how pinball worked. Each machine could be custom built with its own control scheme and hardware, so arcade developers were free to experiment with new ideas like using a trackball instead of a joystick, using two screens, or changing the number of buttons. But the software had to stay within a narrow boundary: the concept had to be easy for a passer-by to grasp, enticing enough to spend a quarter on, and challenging enough that people were willing to spend multiple quarters. The prevailing wisdom from Nintendo's Shigeru Miyamoto and Atari's Nolan Bushnell was to take a simple "story" like rescuing a kidnapped girlfriend or fighting off alien invaders and design a game that was easy to learn but difficult to master. Many games accomplished the "difficult to master" part by ramping up enemy AI and/or speed as levels progressed. Most arcade games had no "end level," but just kept recycling the same levels with added difficulty/speed (Pac-Man had a famous kill screen on level 256). Players who stayed alive longer were rewarded with a higher score and the ability to enter their initials on a leaderboard for bragging rights (at least until the power got shut off).
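As a technical aside, that kill screen is usually explained as a plain 8-bit integer overflow: the level counter lives in a single byte, so on the 256th board the "level + 1" value used by the fruit-drawing routine wraps around to 0 and the draw loop runs away. Here's a toy Python sketch of that commonly cited mechanism (the variable names are mine, and the real game was Z80 assembly, so treat this as an illustration rather than the actual code):

```python
# Toy sketch of the commonly cited Pac-Man kill-screen mechanism.
level = 255                       # internal counter for the 256th board (0-indexed)
fruit_count = (level + 1) & 0xFF  # 8-bit addition: 256 wraps around to 0

# The fruit-drawing loop decrements before re-testing for zero, so a
# starting count of 0 underflows to 255 and the routine ends up drawing
# 256 fruits, spraying garbage tiles over the right half of the maze.
drawn = 0
while True:
    drawn += 1
    fruit_count = (fruit_count - 1) & 0xFF  # 8-bit decrement
    if fruit_count == 0:
        break

print(drawn)  # 256
```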
It is interesting to note that part of what drove games to be the way they were was purely economic, not just technological. Games had to be profitable, so a game that could be played for 45 minutes on a single quarter would not be popular with arcade owners, and a game that ended after 15 seconds would not be popular with players (a machine earning a quarter every three minutes grosses $5 an hour; one earning a quarter every 45 minutes grosses about 33 cents). Thus things like lives, continues (i.e., +25 cents), and timers were baked into game design.
Early home game offerings from Atari and other companies sought to emulate the arcade experience with a small number of repeated stages, points/high scores, and ever-escalating difficulty. Remnants of that mentality can be seen in early games and even through the mid-'90s. But once the majority of gaming shifted to decent home consoles in the 8-bit era, entirely new genres were developed because developers' imaginations weren't as constricted. Older systems could only handle a few genres (like space shooters or maze games), but now there were things like role-playing games, cooking simulators, first-person shooters, and visual novels that couldn't have been made back in the arcade days. Along with this paradigm shift came a change in some gaming conventions: the "lives & continues" setup, points as the end goal of the game (or at least one of them), time limits, and a small number of difficult levels meant to be played over and over (e.g., Donkey Kong) gave way to a large number of levels played once each (e.g., Donkey Kong Country). Now games could be saved and played over multiple days, and concepts like player progression and leveling up took root (borrowing from older tabletop RPGs, obviously). The goal shifted from staving off the "Game Over" screen and earning points to finishing the game or completing the quest, score be damned.

A good analogy is that the first movies were basically just stage plays recorded on film for easier viewing. It wasn't until movies had been around for a bit that filmmakers realized they weren't hampered by the same constraints Shakespeare had faced. Camera tricks, zooming in, playing with different lenses, and other techniques arose as cinema evolved. Actors didn't have to project their voices or overact, and makeup now had to look good up close, not just passable from 30 feet away.
Certain other conventions arose because of unique hardware (many Nintendo multiplayer games don't allow Player 2 to pause because the original Famicom's second controller had a microphone in place of the Start and Select buttons) or because of hardware limitations (Castlevania maps sub-weapons like the dagger to Up + B because there wasn't a third button to assign them to). A non-gaming example: the "save" icon in computer programs is still a floppy disk, even though we haven't really used those in about 24 years. Kids these days usually just learn what the icon means without understanding why. Other conventions developed over the years, such as auto-saving and health regeneration, but to a certain extent these features also make players less "responsible," in the same way that headlights that shut off automatically make drivers less responsible over time because they never suffer the consequences of a dead battery.
As alluded to in my Evolution of Geek Culture posts, a lot of the things once considered "nerdy" and/or "geeky" have come of age and aren't ridiculed like they once were, video games included. But one thing that I never would have predicted was the rise of e-sports and competitive gaming as a source of income, nor would I have guessed how many people could make money by streaming themselves playing games on Twitch or YouTube. Then again, if we're being honest, is watching eight people play Smash Bros. in a professional competition online really all that different from watching ten guys try to throw a ball through a hoop? I contend that they aren't so dissimilar.
But now that it's much less of a niche hobby/industry, there is more to play and more news to keep up on, which is both good and bad. There are times when I feel like I'm kinda out of the loop, stuck over here in Nintendo-land (the hypothetical concept, not the Wii U game) and 15 years behind the times. I hear of popular AAA games that either don't interest me or that I don't have the time/money to spend on without knowing ahead of time whether I'll like them. Perhaps I've just learned what I like and shouldn't feel bad for not trying all of these games, or it could be that I'm just getting more deeply set in my ways.
What I have been doing more recently, though, is turning 10 PM to midnight into my own personal two hours of gaming (usually). Ever since the N64, I've tended to find 5-8 games with good replayability and not branch out much beyond them. But with the Switch I've played a lot more indie, retro, and smaller games because digital availability makes them easy to get, and I keep up on when things go on sale. That said, I now have a library of games that I'll probably never play again, but I suppose at some point every game gets played for the last time.