In his video game design seminar at Parsons the New School for Design in Greenwich Village, Nick Fortugno was recently explaining the basic taxonomy of players in online role-playing games like World of Warcraft or Lineage, games that millions of people around the world play every day.

“You might think that killers are just bad for the game, right?” he said. “Well, they actually provide a really valuable social function: they provide something for other players to talk about. ‘Oh, my God, did you hear that Dorag407 got killed last night at the dungeon?’ See, all of these things exist in a social network, which is what really provides the game experience.”

Most of the students kept pecking at their laptops. A few took notes the old-fashioned way.

Three decades after bursting into pool halls and living rooms, video games are taking a place in academia. A handful of relatively obscure vocational schools have long taught basic game programming. But in the last few years a small but growing cadre of well-known universities, from the University of Southern California to the University of Central Florida, has started formal programs in game design and the academic study of video games as a slice of contemporary culture.

Traditionalists in both education and the video game industry pooh-pooh the trend, calling it a bald bid by colleges to cash in on a fad. But others believe that video games – which already rival movie tickets in sales – are poised to become one of the dominant media of the new century.

Certainly, the burgeoning game industry is famished for new talent. And now, universities are stocked with both students and young faculty members who grew up with joystick in hand. And some educators say that studying games will soon seem no more fanciful than going to film school or examining the cultural impact of television.
