In my mind, gaming has historically been social in nature. Chess, soccer, Chinese checkers, poker, basketball, hide-and-seek, horseshoes and Dungeons and Dragons all exist as activities that enable and facilitate social interaction. As games, they nearly cease to exist without the presence of others.
In their infancy, video games were similarly social. Computers and developers hadn't advanced far enough in artificial intelligence to do much more than let a few players duke it out. Single-player experiences were incredibly limited, so video games were inherently social out of necessity. Sure, there were plenty of single-player experiences to be had, but even those succeeded because they became social. With only a limited ability to present complex character interactions to the player, games like Space Invaders, Pac-Man and Galaga became more about skill, showmanship and competition. Arcades became incredibly popular gathering places for gamers, who lined up quarters for who had "next" in Street Fighter II and set up high-score tournaments for 1942. Electronic gaming was "classic" gaming in that it was still an inherently social activity.
Video games evolved in many ways over the following few decades, and as processors got more powerful and developers gained experience, AI fundamentally altered how we experienced electronic entertainment. While this evolution brought us Baldur's Gate, Grand Theft Auto III, Final Fantasy XII and countless other unforgettable single-player experiences, it also slowly, but very successfully, removed social interaction from a large segment of mainstream gaming culture. Social games did continue to exist and flourish (see Counter-Strike, EverQuest, etc.), but we quickly had a significant number of completely non-social experiences. Nothing about the hundreds of hours (literally) I spent leveling up my characters in Baldur's Gate was communicated to my friends unless I went out of my way to tell them. Unless I brought them over to my house, my friends had to take my word for it that I'd defeated a particular rare or epic monster in Final Fantasy XII. Instead of facilitating or demanding interaction with other gamers, many titles from the past decade could be played and fully enjoyed in complete isolation. I'm not sure that was the direction anyone really wanted gaming to go.
Things seem to be changing, though, as the market transitions hardware generations...
First, broadband has truly become ubiquitous enough that console manufacturers can assume everyone has internet access. As a result, services like Xbox Live Arcade and Steam Community have recreated the physical arcade experience in a virtual space. With both services, players can quickly and easily chat with friends, set up games and invite each other to them, check out the leaderboards of the games they play and more. Both services have even brought single-player games into the social experience with achievements: points or "badges of honor" earned in individual games.
Second, party games have truly come into the mainstream. Wii Sports and Rock Band are the runaway successes in this realm. Both games are enjoyable solo, but become unique, indescribable experiences as more players are added to the mix. The sense of accomplishment when four players cooperate to beat an incredibly difficult song in Rock Band, and the sheer, child-like joy of four-player tennis in Wii Sports, have yet to be matched.
Finally, World of Warcraft, while debatably "next-gen" and certainly not the first of its kind, has brought massively social gaming into the mainstream. It did so in such a nearly perfect way that gamers now have high expectations and demands for future games of its kind.
As the "next-gen" gaming platforms become the "current-gen," I can't help but recognize that gaming culture is rapidly returning, in many ways, to its roots. I couldn't be happier.