Today at 6:00 p.m., Graeme Devine, designer and programmer for Id Software and chairman of the International Game Developers Association, will give a free lecture in the Engineering and Mines Classroom Building, room 105. When I first heard this news, I asked myself, as many of you are right now, “So what? Who’s Graeme Devine?” I’ve since found answers to those questions.
At the age of 16, Devine was programming for Atari, working on games like Pole Position for the Commodore 64 and Apple IIe. He went on to form his own video game company, Trilobyte, responsible for the PC hits The 7th Guest and The 11th Hour, and he now works for Id Software, maker of the revolutionary Doom and Quake video games (for more information, visit http://sigda.asuu.utah.edu). With games like these under his belt, Devine is practically a celebrity.
Devine’s quasi-celebrity status is especially noteworthy because it indicates the great divide between our generation and our parents’ generation: video games. This is the true mark of difference between the college students of 2002 and their forebears in the 1960s and 70s.
You might try to claim heavy metal or “grunge” as a generational divide, but it was our parents who pioneered rock and roll. You might try to claim the internet as our generation’s legacy, but the web is revolutionizing life for people of every demographic. Video games, however, are a truly distinctive mark of our generation.
In our basement, my roommates and I have assembled a veritable video game museum tracing our video game experience through the years. On the table next to the Atari and stacks of games like Frogger, Joust and Breakout is an original Commodore 64 with Epyx joysticks and a box full of 5.25-inch floppy disks. The Nintendo products, from the original Nintendo to the Super Nintendo and N64, are hooked up to a small television in the corner. And connected to the entertainment system are a PlayStation, a PS2 and the newest addition, an Xbox. As a generation, we’ve been there from the beginning of the video game revolution, and now as we enter adulthood we are creating a new entertainment market and artistic medium.
It all started in 1972 with Pong, one of the first video games, and has since exploded into a multi-billion-dollar industry. As twenty-somethings, we have ridden this cultural wave our whole lives. Who doesn’t remember sleepovers spent playing Frogger and Pitfall on the Atari? As a group, we were collectively awed by the advent of the Nintendo controller and the graphics of Super Mario Brothers. We made Mario and Luigi household names. We were caught up in the controversial Super Nintendo vs. Sega Genesis debate. And now at dinner, my friends and I discuss the advantages of the seemingly bulky Xbox controller over the outdated PlayStation pad.
Before you jump to any conclusions and accuse my friends and me of being “nerds” or “losers with nothing better to do than play stupid video games,” listen to this. According to the Interactive Digital Software Association, a consortium of video game manufacturers whose products account for 90 percent of video game sales in the United States, three out of five Americans, or 145 million people, routinely play video games. The average game player is 28 years old, and more than 90 percent of those purchasing video games are over 18. This means that most of you reading this column can relate to what I’m talking about. It also means that video games are no longer banished to the old 13-inch television in the unfinished basement. Instead, college graduates are buying video game consoles with digital component outputs and surround sound capabilities to hook up to their new Dolby 5.1 receivers and 36-inch televisions.
Like television, aptly dubbed “the idiot box,” video games suffer from a stigma of being ultra-violent time wasters that rot your brain. But again, our generation knows otherwise and remains faithful to gaming as a worthwhile form of entertainment and social interaction.
Statistics from the IDSA also suggest that the stigma is misapplied. The majority of gamers play with friends and family: 60 percent of frequent gamers play with their friends, 33 percent play with siblings and 25 percent report playing with their parents or spouses. One of the main objections to video games is their portrayal of violence. But in 2001, 17 of the top 20 selling games were rated “E” for “Everyone” or “T” for “Teen,” meaning that the vast majority of best-selling games contain little or no graphic violence.
As our generation matures, so will the video game market. Granted, many games will always be marketed to children. But more complex games, targeted at adult audiences, are already emerging. Online games construct alternate realities based on real life where mere existence and daily adventures are the goal, rather than the old-fashioned Super Mario Brothers-type game with only one way to go and one way to win. In 20 years there could be a sharp divide in the video game industry with games targeted at middle-aged people and retirees as well as children and teens.
For our generation, video games are a lasting legacy. It will be a long time before we put down the controller and say “game over.”
Jeremy welcomes feedback at [email protected]. Send letters to the editor to [email protected].