Thus almost everything about the Wintel PC, from the graphics board and Microsoft's DirectX software interface to the physical separation of keyboard, processor, and monitor, has been determined by the gaming industry's needs - with business applications following along behind. Vista's Aero GUI, for example, is implemented using DirectX 9, requires hardware support for Pixel Shader 2.0 and 32-bit color, and needs one of last year's high-end gaming GPUs just to start up.
That drive carries forward too: Microsoft's current "hole card" strategy appears to be based on pushing the PowerPC-based Xbox 360 into the home as both a gaming machine and an entertainment center, then offering a network OS linking Xbox 360s for next-generation collaborative gaming - thus putting itself in a position to choose whether or not to release business versions of the hardware and software involved as an x86 (and Dell/HP) replacement technology.
Today's chess programs are predominantly simulators, capable of comparing a position to millions of stored cases and then simulating the consequences of each possible move through Deep Blue-style levels of recursive trial and error. That's ultimately how things like Microsoft's Flight Simulator - and its big brothers, the simulators used in both civilian and military training - work too: a known starting point is combined with well-defined behavioural rules to produce predictable behaviour.
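That recursive trial and error is essentially game-tree search: from a known position, try every legal move and score what follows. Here's a minimal minimax sketch in Python - a toy illustration of the idea, nothing like Deep Blue's actual code - using the take-away game Nim as the rule set:

```python
# Toy game-tree search: players alternately take 1-3 stones from a pile;
# whoever takes the last stone wins. All names here are illustrative.

def minimax(stones, maximizing):
    """Return +1 if the side to move can force a win, else -1."""
    if stones == 0:
        # The previous player took the last stone, so the side to move lost.
        return -1 if maximizing else +1
    scores = []
    for take in (1, 2, 3):
        if take <= stones:
            # Recursively simulate the consequences of this move.
            scores.append(minimax(stones - take, not maximizing))
    return max(scores) if maximizing else min(scores)

# In this game, multiples of 4 are losing positions for the side to move.
print(minimax(4, True))   # -1: facing 4 stones, you lose against best play
print(minimax(5, True))   # +1: take one stone, leave the opponent with 4
```

Real chess engines add enormous stored case libraries and pruning on top, but the recursive core is the same.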
Work done during the seventies and early eighties on Dungeons & Dragons-style games, however, didn't always follow the same directions. Games like chess and Flight Simulator apply the rules to choose from among a very large number of possible next states at each step, but all of those states are knowable because they follow directly from what came before. In contrast, games like Rogue initially applied randomisation only to the values or weights of events - choosing randomly, for example, whether a found potion would be poisonous or strengthening - but later applied the same ideas to whether or not, or to what extent, previous rules applied. Thus a +2 +2 two-handed sword would cut through any enemy - except when it didn't.
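The distinction between those two kinds of randomisation can be sketched in a few lines of Python (the function names and probabilities are illustrative assumptions, not taken from Rogue's actual source):

```python
import random

def drink_potion(rng):
    # Early approach: the rule always applies; only the VALUE is random.
    # The potion is randomly poisonous (-10 hp) or strengthening (+10 hp).
    return rng.choice([-10, +10])

def swing_sword(rng, enemy_hp):
    # Later approach: whether the rule applies AT ALL is random.
    if rng.random() < 0.95:   # the magic sword usually cuts through...
        return 0              # enemy destroyed
    return enemy_hp           # ...except when it doesn't
```

In the first function the rule "drinking a potion changes your hit points" holds unconditionally; in the second, even the rule itself is only probabilistically in force.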
In 1970 the mathematician John Conway published the Game of Life, in which a cellular automaton follows harshly enforced rules (like a simulation) but in which not all outcomes can be foreseen from the starting conditions - like a randomisation game.
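Conway's rules fit in a few lines - a live cell with two or three live neighbours survives, and a dead cell with exactly three live neighbours is born - yet the long-run behaviour of most patterns can only be discovered by running them. A compact Python version, storing live cells as a set of coordinates:

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life.

    `live` is a set of (row, col) tuples marking live cells.
    """
    # Count, for every cell, how many live neighbours it has.
    counts = Counter(
        (r + dr, c + dc)
        for (r, c) in live
        for dr in (-1, 0, 1) for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    # Survival: live cell with 2 or 3 neighbours. Birth: exactly 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" - three cells in a row - oscillates with period 2.
blinker = {(1, 0), (1, 1), (1, 2)}
assert step(step(blinker)) == blinker
```

The rules are rigid and deterministic, but whether an arbitrary pattern eventually dies out, stabilises, or grows forever cannot, in general, be answered without simulating it - which is exactly the mix of simulation and unpredictability the column is pointing at.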
Modern games are very much focused on limiting the rules to enable the production of more realistic displays - thus things like Street Fighter for the Xbox 360 or Dreamfall (Longest Journey II?) have incredible graphics and sound that hide the limited variability of the action. Look inside most of today's massively multiplayer role-playing games and you'll see the same set of forces at work, substituting visualisation for diversity - but you'll also see the beginnings of a move in Conway's direction: harshly enforced rules, but within an unpredictable "universe" set by the game scenario.
What those games are doing is building shared communities - complete with real-life legal issues over the ownership of virtual property and potential terrorist or criminal uses of the game environments for planning and communications - that offer interesting models for collaboration in the real world of business, academic, and government computing.
In fact, two things are clear: the obvious one, that the games business will continue to drive PC development; and the unobvious one, that we can expect to see the emergence of "Conway communities" as real-world markets and testbeds for real-world ideas.