Nvidia G-Sync.
You have heard that it can make for a smoother gaming experience on your PC. But you're not sure whether it's good enough to warrant spending at least an extra $200 on it?
Or maybe you're not even sure how much it will improve your gaming experience?
Is G-Sync actually worth it for you?
Let’s find out.
1. What is NVIDIA G-Sync?

Screenshot: Microsoft Forums
NVIDIA G-Sync is a proprietary technology used primarily in gaming monitors. Its main purpose is to sync the monitor's refresh rate to the frame rate being produced by the computer's GPU. This helps when a mismatch between the GPU's output and the monitor's refresh rate causes issues like screen tearing, ghosting and more on the monitor.
2. How does NVIDIA G-Sync work?
As you can see in the video above, G-Sync syncs the refresh rate of your monitor with the frames per second that are being produced by your computer’s GPU.
Let’s say your GPU is outputting 110 frames per second while your monitor’s refresh rate is 60Hz, meaning it can only display 60 frames per second. Your monitor cannot keep up with the extra 50 frames the GPU is producing, so frames overlap mid-refresh: one part of the screen shows one frame while another part shows the next. That overlap is screen tearing.
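To see why the mismatch tears the image, here is a rough sketch (the function and its model are illustrative, not how any driver actually works) that treats an unsynced GPU as flipping the frame buffer the instant each frame finishes, and counts how many of the monitor's refresh cycles get interrupted mid-scan-out:

```python
def count_torn_refreshes(fps, hz, duration=1.0):
    """Count refresh cycles interrupted by a buffer flip (rough model)."""
    # With no sync, the GPU flips the frame buffer the moment a frame
    # finishes, at times 0, 1/fps, 2/fps, ...
    flips = [i / fps for i in range(int(fps * duration))]
    torn = 0
    for r in range(int(hz * duration)):
        start, end = r / hz, (r + 1) / hz  # one scan-out window
        # A flip strictly inside the window splits the displayed image:
        # the top shows the old frame, the bottom shows the new one.
        if any(start < t < end for t in flips):
            torn += 1
    return torn

print(count_torn_refreshes(110, 60))  # 110fps on a 60Hz panel: every refresh tears
print(count_torn_refreshes(60, 60))   # frames aligned with refreshes: no tearing
```

Because the 110fps flips arrive more often than the 60Hz windows, every window contains a flip, which is why tearing at mismatched rates is constant rather than occasional.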
Games and applications used to rely on V-Sync to prevent this. V-Sync caps a game's or application's frame rate at the refresh rate of the monitor it is shown on, but when the GPU can't sustain that rate, this results in stuttering and artefacts on the display.
With NVIDIA G-Sync, an extra controller on the monitor makes sure that your monitor shows synced frames. G-Sync will communicate with your NVIDIA GPU and show the exact frame being processed.
So, if you have a 144Hz monitor and your GPU is producing 110fps, G-Sync will drop your monitor's refresh rate to 110Hz to match.
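In other words, within the panel's supported range the monitor's refresh rate simply follows the GPU's frame rate. A minimal sketch of that behaviour (the function name and the 30-144Hz range are illustrative; real G-Sync ranges vary by monitor):

```python
def gsync_refresh_hz(gpu_fps, panel_min=30, panel_max=144):
    # With G-Sync, the monitor refreshes when a new frame arrives, so its
    # effective refresh rate tracks the GPU's frame rate, clamped to the
    # panel's supported variable-refresh range.
    return max(panel_min, min(gpu_fps, panel_max))

print(gsync_refresh_hz(110))  # 110: the panel slows down to match the GPU
print(gsync_refresh_hz(200))  # 144: the panel can't go above its maximum
```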
The benefit of having an NVIDIA G-Sync monitor is that you will hardly see any screen tearing, ghosting or artefacts on your monitor while gaming.
This matters in competitive esports environments, where factors like these can produce a very different experience and put players without G-Sync monitors at a disadvantage.
A monitor with NVIDIA G-Sync comes with an extra controller chip to help the monitor sync directly with the GPU.
3. Do I need NVIDIA G-Sync? Does it Make a Difference?
The short answer: It depends.
Here are some scenarios when a G-sync monitor actually makes a difference, and buying one is recommended.
- If you own a GeForce GTX 650 Ti Boost or later Nvidia GPU, and you encounter stutters with V-Sync enabled below 60fps.
- If your high-end GPU outputs far more frames than your monitor can display, causing noticeable input lag.
- If you have a GPU with G-Sync support, it makes sense to upgrade to a G-Sync monitor.
- Even with a high-end GPU, V-Sync can add lag/latency. If it's noticeably higher, you should consider upgrading.
Expert Tip
If the increased lag/latency is only a few milliseconds, it might not be much of a problem. Ultimately, this lag depends on your monitor. Here's a good place to find out the latency of your monitor.
Before buying a G-Sync-enabled monitor, make sure that your GPU supports G-Sync. As of writing, here's a list of G-Sync-enabled GPUs.

But it's not for everyone.
Here are some scenarios where it doesn't make much sense to upgrade to a G-Sync monitor.
- You aren't into hardcore gaming.
- You have a high-end GPU with V-Sync, and aren't experiencing any screen tearing.
- You have an AMD GPU. Since it is incompatible with G-Sync, you'd have to buy an Nvidia GPU and a G-Sync monitor, which would burn a huge hole in your wallet.
4. NVIDIA G-Sync vs. AMD FreeSync
A lot of cheaper 144Hz monitors come with AMD FreeSync by default, since adding NVIDIA G-Sync makes a product costlier. The thing is, NVIDIA G-Sync has been designed explicitly for NVIDIA GPUs, while the AMD FreeSync technology is available for AMD GPUs.
Depending on which GPU you have, you can buy a monitor with either of these syncing technologies. The main difference between them is that NVIDIA G-Sync is a hardware-based technology, while AMD FreeSync doesn't need specific extra hardware on the monitor to sync the frames.
FreeSync uses adaptive sync, which has now become a standard part of DisplayPort, the connector used between monitor and GPU for higher refresh rates. HDMI in its basic form doesn't have the bandwidth to transmit that many frames at high resolutions, so DisplayPort is the way to go.
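A back-of-the-envelope calculation shows why. Uncompressed video bandwidth is roughly width × height × bits per pixel × refresh rate (ignoring blanking overhead, so real signals need somewhat more); the link-rate figures below are the published effective data rates of HDMI 1.4 and DisplayPort 1.2:

```python
# Rough bandwidth needed for 1440p at 144Hz with 24-bit colour,
# ignoring blanking intervals (a real signal needs somewhat more).
width, height, bpp, hz = 2560, 1440, 24, 144
needed_gbps = width * height * bpp * hz / 1e9

hdmi_1_4_gbps = 8.16  # effective video data rate of HDMI 1.4 (after 8b/10b)
dp_1_2_gbps = 17.28   # effective data rate of DisplayPort 1.2 (HBR2)

print(f"needed: {needed_gbps:.2f} Gbps")                 # ~12.74 Gbps
print(f"fits HDMI 1.4: {needed_gbps <= hdmi_1_4_gbps}")  # False
print(f"fits DP 1.2: {needed_gbps <= dp_1_2_gbps}")      # True
```

So a 1440p/144Hz stream simply doesn't fit through an HDMI 1.4 link, while DisplayPort 1.2 carries it with room to spare.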
The main reason why AMD calls their technology FreeSync is the fact that it doesn’t need a hardware-based scaler or controller to sync the frames.
While NVIDIA could have gone the same route, its G-Sync tech means more money for the company, and NVIDIA restricts its GPUs from using FreeSync, which is why FreeSync doesn't work well with NVIDIA GPUs.
However, that doesn't mean G-Sync is just out there to milk money. The hardware-based controller and scaler help in cases where there is a higher chance of screen tearing even with adaptive sync.
5. Does AMD FreeSync work with NVIDIA GPUs?
The simple answer here is no, FreeSync doesn't work with NVIDIA graphics cards, but you can still use adaptive sync with your NVIDIA card, which will manage the screen tearing and ghosting issues to some extent.
The main issue is that NVIDIA has restricted its graphics cards from working with FreeSync monitors. It might sound like a decision made to earn money from G-Sync-licensed monitors, and that might be right after all.
The video from Joker Productions sheds more light on the question: the tester compares an NVIDIA GTX 1060, which supports G-Sync, with an AMD RX 570, which works well with AMD FreeSync. On a FreeSync monitor, the AMD card performs well and shows no signs of tearing or ghosting even at high refresh rates.
There is no official confirmation from NVIDIA about support for FreeSync, but many people use FreeSync monitors with their NVIDIA cards because of the lower price, and many of them find that this setup works well even with an NVIDIA GPU.
6. What About NVIDIA Adaptive V-Sync?
NVIDIA also has an Adaptive V-Sync technology, which it launched even before G-Sync became a thing for monitors. Adaptive V-Sync is software-based: it needs no extra controller or scaler hardware on the monitor and can be controlled from the NVIDIA Control Panel software bundled with NVIDIA's drivers.
The video above from Linus Tech Tips showcases the difference between normal V-Sync and Adaptive V-Sync tech from NVIDIA.
As you can see in the video, NVIDIA Adaptive V-Sync works differently from normal V-Sync. It locks the game's frame rate to your monitor's refresh rate and turns itself off when the frames per second drop below that refresh rate.
To help you understand, let’s consider an example.
Say you have a 120Hz monitor running a game that outputs 150fps. With the extra frames and no sync technology, the monitor will show screen tearing and ghosting. But with NVIDIA Adaptive V-Sync turned on, the frame rate will be locked at 120fps, the refresh rate of the monitor.
Adaptive V-Sync will turn itself off if the game is producing a lower frame rate, like 90fps, under your monitor's refresh rate. This results in no screen tearing and ghosting while you are playing a game.
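The decision Adaptive V-Sync makes each moment boils down to a simple rule. This hypothetical helper (the function name is made up; it is only a sketch of the behaviour described above) captures it:

```python
def adaptive_vsync_output_fps(gpu_fps, refresh_hz):
    # Above the refresh rate, V-Sync engages and caps output at the
    # monitor's refresh rate, preventing tearing.
    if gpu_fps >= refresh_hz:
        return refresh_hz
    # Below it, V-Sync switches itself off so frames are shown as soon
    # as they are ready, avoiding V-Sync's stutter penalty.
    return gpu_fps

print(adaptive_vsync_output_fps(150, 120))  # 120: capped at the refresh rate
print(adaptive_vsync_output_fps(90, 120))   # 90: uncapped, V-Sync off
```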
To turn on Adaptive V-Sync, open the NVIDIA Control Panel and go to Manage 3D Settings. In this window, look for the Vertical sync option and choose Adaptive from the drop-down. That's it; your monitor will now work with Adaptive V-Sync from NVIDIA.
7. The Bottom Line: Is buying a G-Sync monitor worth it in 2018?
NVIDIA G-Sync technology is now being included in monitors with refresh rates higher than 144Hz, and there are 4K monitors from companies like Asus that can reach 144Hz. The catch is that no graphics card from NVIDIA or AMD on the market can produce such high frame rates at such a high resolution. At 1080p, however, which most modern GPUs handle easily, things are different.
Buying a G-Sync monitor today is a personal choice rather than a requirement. Many people like high refresh rates, but NVIDIA G-Sync monitors aren't mainstream even now.
The reason is two-fold: the steep price of G-Sync monitors, and the fact that cheaper FreeSync monitors do the job for most gamers.
In the end, if you can afford a G-Sync monitor in 2018, buy it and enjoy the NVIDIA ecosystem that PC gaming has largely become.
But if you want to save a few bucks and invest the savings in a better upgrade for your PC, you can go ahead and buy one of the FreeSync monitors.