Originally Posted by Jae Onasi
That and our critical flicker frequency (where we see something as a continual light instead of as a series of flashes) is maxed out for most people below 50 Hz, unless you have an extremely bright big spot of light. Some people can go up to 65 Hz (and a few with super-eyes may perceive higher frequencies), but most of us looking at a monitor won't perceive anything different past 60 Hz anyway. Now, if you have a game that won't keep the fps up around 60 unless you have a humongous graphic card, then it might be worth it.
All you never wanted to know about critical flicker frequency
Thanks Jae. I was going to go digging for a similar article. 60 is indeed the magic number, it seems. Some people suggest aiming for around 80fps at the top end so that your minimum or mean fps can sit around 60... but this varies a lot depending on the game, the settings and the hardware.
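To put rough numbers on that, here's a quick sketch of the per-frame time budget at each fps target (just illustrative arithmetic, nothing game-specific):

```python
# Frame budget: how many milliseconds the GPU has to render
# one frame at a given frames-per-second target.

def frame_time_ms(fps: float) -> float:
    """Milliseconds available per frame at the given fps."""
    return 1000.0 / fps

for fps in (30, 60, 80, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms per frame")
```

So the jump from 60 to 80fps only buys you about 4 ms of headroom per frame, which is why the payoff shrinks fast at the high end.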
I wonder about those 120Hz home theater TVs. Not even the highest definition video formats clock in at 120fps! Heck, Blu-ray doesn't even go anywhere near it...
I smell a sales gimmick aimed at the cashed-up home theater crowd.