In brief, Nvidia G-Sync is a display technology used in certain PC monitors, laptops, and TVs to combat screen tearing, stuttering, and juddering, particularly during fast-paced games and video. G-Sync works only when the display is connected to a system with a compatible Nvidia graphics card (including third-party branded ones). If you don’t already have a compatible card, be sure to read our graphics card buying guide, as well as our in-depth comparison of AMD and Nvidia GPUs.
G-Sync was announced by Nvidia in 2013, and its main competitor is AMD FreeSync. However, the answer to the question “what is G-Sync?” is becoming more complicated. G-Sync now comes in three flavors: G-Sync, G-Sync Ultimate, and G-Sync Compatible.
What is Screen Tearing?
Screen tearing is a distracting visual artifact (see the photo above). It is caused by the game’s framerate (the rate at which image frames are rendered) not matching the monitor’s refresh rate (the frequency at which the display redraws its image). G-Sync displays have a variable refresh rate (also known as VRR or a dynamic refresh rate) and can sync their refresh rate, within a supported minimum and maximum, to the framerate of the system’s Nvidia graphics card; that range tops out at the display’s maximum refresh rate. This way, frames appear on screen as soon as they are rendered, without introducing input lag, the delay between a mouse movement and its effect on screen.
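To see why the mismatch matters, here is a toy Python model of the idea, not how G-Sync is actually implemented: if a buffer swap lands in the middle of a scanout, that refresh shows parts of two frames (a tear); when the refresh interval tracks the frame interval, as with VRR, no swap ever lands mid-scanout. The function name and the 48 fps / 60 Hz figures are illustrative assumptions.

```python
from fractions import Fraction as F

def count_tears(frame_interval, refresh_interval, duration=F(1)):
    """Count refreshes during which the GPU swaps buffers mid-scanout,
    the condition that produces a visible tear when vsync is off.
    All times are exact rationals (seconds) to avoid float boundary issues."""
    # times at which the GPU finishes (and swaps in) each frame
    frames = [frame_interval * i
              for i in range(1, int(duration / frame_interval) + 1)]
    tears = 0
    t = F(0)
    while t < duration:
        # a swap strictly inside the scanout window splits the image
        if any(t < f < t + refresh_interval for f in frames):
            tears += 1
        t += refresh_interval
    return tears

# A 48 fps game on a fixed 60 Hz monitor: most refreshes show a torn image.
fixed = count_tears(F(1, 48), F(1, 60))

# With VRR, the display refreshes when the frame arrives, so the effective
# refresh interval equals the frame interval and no swap lands mid-scanout.
vrr = count_tears(F(1, 48), F(1, 48))
```

In this simplified model, `fixed` comes out to 36 torn refreshes per second, while `vrr` is 0, which is the whole point of syncing the refresh to the framerate.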
G-Sync vs. FreeSync
FreeSync is AMD’s answer to Nvidia’s G-Sync, and both build on the idea behind VESA’s Adaptive-Sync protocol. To use FreeSync, you must have an AMD graphics card, just as you need an Nvidia graphics card to use G-Sync.
There are several significant differences. One notable one is that FreeSync works over both HDMI and DisplayPort (including DisplayPort over USB Type-C), while G-Sync only works over DisplayPort, unless you have a G-Sync Compatible TV (more on that below), though Nvidia has said it is working to change this. See our DisplayPort vs. HDMI comparison for more information on the two ports and which is better for gaming.
While both G-Sync and FreeSync are forms of adaptive sync, G-Sync and G-Sync Ultimate require a proprietary Nvidia chip: if a monitor maker wants a display certified for G-Sync or G-Sync Ultimate, it must buy this module from Nvidia instead of using a standard scaler. FreeSync, in contrast, is an open standard, and FreeSync monitors are typically less expensive than G-Sync or G-Sync Ultimate monitors. G-Sync Compatible monitors do not need the chip, and many FreeSync monitors are now also G-Sync Compatible.
In 2019, Nvidia began testing and certifying specific displays, including ones built around other Adaptive-Sync technologies such as FreeSync, to run G-Sync. These monitors are referred to as G-Sync Compatible. As confirmed by our own tests, G-Sync Compatible displays will effectively run G-Sync with the proper driver and a few caveats, despite lacking the proprietary processor found in G-Sync and G-Sync Ultimate displays.
Nvidia says that G-Sync Compatible displays lack certain features of ordinary G-Sync displays, such as ultra-low motion blur, overclocking, and variable overdrive.