Had this reflection that 144 Hz screens were the only type of screen I knew of that was not a multiple of 60: 60 Hz, 120 Hz, 240 Hz, 360 Hz.

And in the middle, 144 Hz. Is there a reason why they all follow this 60 rule, and if so, why does 144 Hz exist?

  • @[email protected]
    1 year ago

    The reason 60 Hz was so prominent has to do with the power line frequency. Screens originated as cathode ray tube (CRT) TVs that could only use a single refresh rate, which was the one chosen by TV broadcasters. They chose the power line frequency because it minimizes flicker when you record under lights powered at the same frequency you record with, and you want to play back at that same frequency for normal content.

    This however isn’t as important for modern monitors. You have image sources other than video content produced for TV, and those benefit from higher rates but don’t need to match a multiple of 60. So nowadays manufacturers go as high as their panels allow; my guess is 144 exists because that’s 6 × 24 Hz (the latter being the “cinematic” frequency). My monitor for example is 75 Hz, which is 1.5 × 50 Hz (50 Hz being the European power line frequency), but the refresh rate is variable anyway, so it can match full multiples of the content frequency dynamically if desired.
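    The divisibility argument above is easy to check with a quick script. This is just an illustrative sketch; the content frame rates and refresh rates listed are common examples I picked, not anything from the thread:

    ```python
    # For each common monitor refresh rate, list which typical content
    # frame rates (24 fps film, 25/50 Hz PAL, 30/60 fps video) divide
    # it evenly, i.e. which content it can show without judder.
    content_rates = [24, 25, 30, 50, 60]
    refresh_rates = [60, 75, 120, 144, 240, 360]

    for hz in refresh_rates:
        multiples = [fps for fps in content_rates if hz % fps == 0]
        print(f"{hz} Hz evenly displays: {multiples}")
    ```

    Running it shows that 144 is an even multiple of 24 but of none of the 60-family rates, while 120 covers 24, 30, and 60 at once, which is why 120 Hz is often pitched as the “plays everything” rate.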