PC game developers only hurt themselves when they release games with top-of-the-line (or near top-of-the-line) minimum system requirements.
Economically it makes sense to release a game with the lowest possible system specs, so it can be played satisfactorily on the largest number of computers.
If they figure the percentage of the market below a Pentium 450 is really low (say only 10%), they are likely to go with higher system requirements in order to make the game they want to make.
But all sorts of things limit them: budget, time constraints, etc.
After all, a lot of games start development on top-of-the-line, "way cool" engines and end up taking years to complete, by which time the "wow bang" factor has often been dramatically reduced.
Oh sure, you say, people always buy new computers or upgrade to make their games look and play better, right?
Well, only to a point, and even then it's still just a niche "hardcore elite gamer" market.
How many people can be bothered to buy a new computer every 9 months?
That is why consoles are so popular (far outstripping any other gaming medium): the system specs are always the same. No compatibility issues ever need happen, and developers are forced to work with what they have, rather than letting high system requirements act as an escape clause for shoddy programming or driver support.

Consoles also have the built-in advantages of being extremely low maintenance (no need for reformats or software installs) and much more affordable hardware (hundreds rather than thousands of dollars for a complete system; and since they aren't constantly being upgraded and made obsolete, they come down in price rather quickly). Consoles do pick the cost back up with more expensive games (say $50-60, staying at that price longer than a $20-40 PC game that may drop in price within a few months), but still.
We should give PC game developers credit; after all, they work in a cutthroat market that has been losing ground to the behemoth of console gaming for quite some time. But to say that people just need to constantly upgrade is silly.
And there have always been games that, even on top-of-the-line hardware, STILL have problems.
Hell, just a month or so ago I remember people with top-of-the-line systems complaining endlessly about the choppy framerates they were getting in JA (which uses the antiquated Q3 engine, no less). There's a lot to be said for proper product support, optimization, etc.
And anyway, we all know that graphics alone don't make a game. Look at all the pretty games that won awards for flashy visuals and were then promptly forgotten.
It's sort of like movies: they win awards for special effects, but if they lack a good story and/or compelling characters, what is there to bring us back? Special effects get outdated pretty quickly these days...