In the last article, we talked about various monitors and their features. Now let's dive a bit into a technology you might have heard of: Adaptive-Sync. Before we get to the specific implementations, let me explain why we need them. When your monitor's refresh rate and your GPU's frame rate are not synchronized, screen tearing appears on your monitor, and it looks terrible in the middle of a game. The classic fix for this is V-Sync.
V-Sync:
V-Sync, short for vertical synchronization, is a display technique for preventing screen tearing. It makes the GPU wait for the monitor's vertical blank before presenting a frame, which effectively imposes a hard cap on your fps: on a 60 Hz monitor, the frame rate is held at 60 fps and drops to 30 fps (or lower) whenever the GPU cannot keep up. This works perfectly well on a 60 Hz monitor as long as your system holds a stable 60 fps. Unfortunately, the situation isn't that simple with higher frame rates and refresh rates, and V-Sync often brings stuttering and input lag with it. So why bother using V-Sync at all? No worries: Nvidia and AMD both have their own adaptive sync technologies, G-Sync and FreeSync. As a rule, each kind of monitor only works with its own brand of graphics card: FreeSync monitors are meant for AMD GPUs, and G-Sync is only usable when paired with an Nvidia graphics card.
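To make that cap concrete, here is a minimal Python sketch of the simplest double-buffered case (the function name and the numbers are purely illustrative, not any vendor's API): because a frame can only be presented at a vertical blank, the delivered frame rate is always the refresh rate divided by a whole number, which is why 60 fps falls straight to 30 fps the moment a frame takes too long.

```python
# Hypothetical sketch: why classic V-Sync quantizes frame rate to
# refresh_rate / n (60 -> 30 -> 20 ...) on a fixed-refresh display.
import math

def vsync_fps(refresh_hz: float, frame_time_ms: float) -> float:
    """Delivered frame rate when every frame must wait for the next vertical blank."""
    refresh_interval_ms = 1000.0 / refresh_hz          # e.g. 16.67 ms at 60 Hz
    # A frame that takes longer than one interval misses the blank and
    # waits for the next one, so it occupies a whole number of intervals.
    intervals_per_frame = math.ceil(frame_time_ms / refresh_interval_ms)
    return refresh_hz / intervals_per_frame

print(vsync_fps(60, 15))   # GPU could do ~66 fps -> capped at 60.0
print(vsync_fps(60, 17))   # GPU just misses 60 fps -> drops to 30.0
print(vsync_fps(60, 34))   # slower still -> 20.0
```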
G-Sync:
G-Sync is a proprietary adaptive sync technology developed by Nvidia to eliminate screen tearing, offered as an alternative to V-Sync. G-Sync has gained popularity because a monitor's fixed refresh rate rarely matches the GPU's ability to output frames. Instead of holding a fixed rate, a G-Sync display follows the GPU: if the graphics card is pushing 50 frames per second (fps), the display switches its refresh rate to 50 Hz; if the fps count decreases to 40, the display adjusts to 40 Hz. The typical effective range of G-Sync is 30 Hz up to the maximum refresh rate of the display. However, G-Sync monitors generally cost more than comparable FreeSync ones because the OEM has to buy a proprietary G-Sync module from Nvidia to make it work.
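To picture the behavior, here is a rough Python sketch of a variable-refresh display, assuming a 30 Hz to 144 Hz panel (the function name and range are illustrative assumptions, not Nvidia's specification): inside the supported range, the panel simply refreshes at whatever rate the GPU delivers frames.

```python
# Hypothetical sketch of variable refresh: the display times its refresh
# to each finished frame as long as the frame rate stays inside the
# panel's supported range (assumed here to be 30 Hz up to 144 Hz).

def vrr_refresh_hz(gpu_fps: float, min_hz: float = 30.0, max_hz: float = 144.0) -> float:
    """Refresh rate the display runs at for a given GPU frame rate."""
    # Inside the range, refresh simply tracks the frame rate (50 fps -> 50 Hz).
    # Below min_hz, real hardware repeats frames instead of clamping
    # (see the LFC sketch after the comparison table); we clamp here for simplicity.
    return max(min_hz, min(gpu_fps, max_hz))

print(vrr_refresh_hz(50))    # 50.0  -> the panel refreshes at 50 Hz
print(vrr_refresh_hz(40))    # 40.0  -> refresh drops with the frame rate, no tearing
print(vrr_refresh_hz(200))   # 144.0 -> capped at the panel's maximum
```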
FreeSync:
FreeSync is a standard developed by AMD that, like G-Sync, is an adaptive synchronization technology for liquid-crystal displays. It's intended to reduce the screen tearing and stuttering triggered by the monitor being out of sync with the content's frame rate. Because this technology uses the Adaptive-Sync standard built into DisplayPort 1.2a, any monitor equipped with that input can be compatible with FreeSync. With that in mind, FreeSync is not compatible with legacy connections such as VGA and DVI. AMD also takes a more open approach: OEMs are free to use whatever scaler modules they want and don't even need to pay AMD a license fee to implement FreeSync in their monitors. This is why FreeSync monitors are so much more affordable.
| FreeSync | FreeSync Premium | G-Sync | G-Sync Ultimate |
|---|---|---|---|
| No price premium | No price premium | HDR and extended color support | Refresh rates of 144 Hz and higher |
| Refresh rates of 60 Hz and higher | Refresh rates of 120 Hz and higher | Frame-doubling below 30 Hz to ensure Adaptive-Sync at all frame rates | Factory-calibrated accurate SDR (sRGB) and HDR color (P3) gamut support |
| Many FreeSync monitors can also run G-Sync | Low Framerate Compensation (LFC) | Ultra-low motion blur | "Lifelike" HDR support |
| May have HDR support | May have HDR support (Many FreeSync Premium monitors can also run G-Sync with HDR) | Variable LCD overdrive | Optimized latency |
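The "Low Framerate Compensation (LFC)" and "frame-doubling below 30 Hz" entries in the table describe the same idea: when the game drops below the panel's minimum variable refresh rate, the display shows each frame two or more times so the panel itself stays inside its range and sync is never lost. Here is a rough, hypothetical Python sketch; the 48 Hz minimum and 144 Hz maximum are made-up example values.

```python
# Hypothetical sketch of Low Framerate Compensation / frame-doubling:
# below the panel's minimum refresh, each frame is repeated so the
# panel keeps running inside its variable-refresh range.

def lfc_refresh_hz(gpu_fps: float, min_hz: float = 48.0, max_hz: float = 144.0):
    """Return (panel refresh rate, number of times each frame is shown)."""
    assert gpu_fps > 0, "frame rate must be positive"
    multiplier = 1
    refresh = gpu_fps
    # Repeat each frame until the resulting refresh rate is back in range.
    while refresh < min_hz:
        multiplier += 1
        refresh = gpu_fps * multiplier
    return min(refresh, max_hz), multiplier

print(lfc_refresh_hz(25))   # (50.0, 2): each frame shown twice -> panel runs at 50 Hz
print(lfc_refresh_hz(40))   # (80.0, 2): classic frame-doubling
print(lfc_refresh_hz(60))   # (60.0, 1): already in range, no repetition needed
```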
Conclusion:
So, G-Sync or FreeSync? I struggled with this choice myself, since a G-Sync monitor typically costs around 20% more than a comparable FreeSync one. However, G-Sync brings extra advantages, with additional features such as HDR support. For better gaming performance there is always a case for G-Sync, and I would call it the more future-proof monitor. For example, you can't expect to hold high frame rates all the time in a game like Battlefield 5. That's where G-Sync earns its value: it can adaptive-sync not only at 120 fps and above, but at all frame rates. What's the point of a FreeSync monitor if your frame rates are low? That said, FreeSync is still a decent option if your GPU usually delivers high frame rates.
Source: Tom's Hardware