Kids are always talking about their “frames” these days. But did you know your monitor may be refreshing more often than your graphics card can deliver frames? It’s a common problem when running new games on dated hardware. The result is a nasty tear — half of one frame sewn together with half of another. This “screen tearing” is not a new problem, but there are some new solutions to patch the whole mess up.
One of the industry-leading solutions is Nvidia G-Sync, which seeks to bring your refreshes in line with your frames on the fly, thus preventing tearing. But does it work? And more importantly, is it worth the potential added costs? Let’s find out!
Torn Asunder: Screen Tearing Overview
Screen tearing occurs when the monitor draws a new frame while the graphics card is halfway between two. Perhaps the card’s frame rate and the monitor’s refresh rate simply don’t match up, or maybe you generated more explosions in-game than your GPU can handle. Either way, without V-Sync enabled, you get a lovely patchwork effect as the monitor displays parts of two frames at once. V-Sync can sort of resolve the issue from the software and GPU side of things, but software solutions for hardware problems are hardly ideal. In worst-case scenarios, V-Sync introduces unacceptable input lag and even some visual artifacts of its own.
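For the curious, here’s roughly what flipping that V-Sync switch means on the software side. This is a minimal sketch using GLFW (just one windowing library among many; SDL, DXGI, and friends expose the same swap-interval knob):

```cpp
// Minimal sketch: enabling V-Sync via GLFW's swap-interval control.
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;

    GLFWwindow* window =
        glfwCreateWindow(1280, 720, "V-Sync sketch", nullptr, nullptr);
    if (!window) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(window);

    // Swap interval 1 = wait for one vertical blank before swapping buffers.
    // The GPU holds each finished frame until the monitor is between
    // refreshes, so no refresh ever shows halves of two frames. The catch:
    // a frame that misses the blank waits for the next one, and that wait
    // is exactly the input lag gamers complain about.
    glfwSwapInterval(1);

    while (!glfwWindowShouldClose(window)) {
        // ... draw the frame here ...
        glfwSwapBuffers(window);  // blocks until the vertical blank
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```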
What’s an intrepid graphics card manufacturer to do? If you’re Nvidia, the answer is clear: develop your own anti-tearing ecosystem and call it G-Sync!
The key lies in flipping the flow of control for frame rendering and posing an interesting question: what if the monitor could set its refresh rate based on the pace of frames coming out of the graphics card? Historically, the flow of control forced the graphics card to try and match the monitor’s fixed refresh rate — and therein lies all the trouble and tearing.
So Nvidia flipped the script. But how do the monitor and graphics card communicate this stateful information between one another? The answer: specialized drivers for the graphics card and a proprietary hardware module which must be installed in the monitor.
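If you think better in code, here’s a back-of-the-napkin model of the two flows of control. Everything here is invented for illustration (the frame timings, the crude tear test); no real driver or scaler works this way, but it shows who waits for whom:

```cpp
// Toy model: a fixed-refresh panel vs. an adaptive ("G-Sync-style") panel.
// All numbers are made up; the point is only the flow of control.
#include <cstdio>

int main() {
    // Moments (in ms) when the GPU finishes a frame -- deliberately uneven.
    const double frame_done[] = {14.0, 36.0, 54.0, 85.0, 101.0};
    const int n = sizeof(frame_done) / sizeof(frame_done[0]);
    const double refresh = 16.7;  // a classic 60 Hz panel

    // Old flow: the monitor refreshes on its own fixed schedule. Without
    // V-Sync the GPU flips buffers the instant a frame is done, so a flip
    // that lands while the panel is mid-draw splits the screen: a tear.
    printf("Fixed 60 Hz panel:\n");
    int next = 0;
    for (double t = refresh; next < n; t += refresh) {
        bool tore = false;
        while (next < n && frame_done[next] < t) {
            tore = true;  // this buffer flip landed mid-scanout
            ++next;
        }
        printf("  refresh at %6.1f ms%s\n", t, tore ? "  <-- TEAR" : "");
    }

    // New flow: the panel holds its vertical blank until the GPU says
    // "frame's ready," so every scanout shows exactly one whole frame.
    printf("Adaptive panel:\n");
    for (int i = 0; i < n; ++i)
        printf("  refresh at %6.1f ms  (whole frame %d)\n",
               frame_done[i], i + 1);
    return 0;
}
```

In the real thing, the handshake happens over the display link and the panel has minimum and maximum refresh bounds it can’t exceed; the toy above just captures the inversion.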
Problem solved, right?
Lockstep: Nvidia’s Solution for Screen Tearing
G-Sync does what it sets out to do. It resolves screen tearing without V-Sync by letting the monitor throttle its refresh rate based on the frames coming from the GPU. This is great news, and a significant step forward for the industry.
Except it isn’t, because Nvidia took their ball and went home.
Now, there’s a time and a place for proprietary solutions, and there’s nothing inherently wrong with Nvidia developing their own screen synchronization solution and keeping it in-house. But we live in a post-Steam world now: a world where GOG and the Epic Games Store fight the House of Gabe Newell for market share, and a world where consumers recognize that competition is a good thing.
Nvidia charges royalties for the use of their proprietary modules in monitors. This naturally drives up the price of the monitors needed to make the most of their already pricey graphics cards. If they were the only big player on the block, folks might grumble about a monopoly but fall in line anyway. But the fact is that Nvidia isn’t the only player in the game with a screen synchronization solution.
And the other solutions are either free and open-source or built upon such tech.
As of the release of DisplayPort 1.2a, a G-Sync-style adaptive refresh mechanism (VESA calls it Adaptive-Sync) is available to everyone with a DisplayPort 1.2a compatible monitor and graphics card. As accessible to consumers as a video port standard — it doesn’t get better than that. Not to mention that AMD has their own competing technology called “FreeSync.” Now the million-dollar question: are AMD’s royalties for FreeSync higher or lower than Nvidia’s royalties for G-Sync?
Trick question! AMD built FreeSync on top of DisplayPort 1.2a’s Adaptive-Sync, royalty-free. Therefore, if you have a DisplayPort 1.2a monitor and a recent AMD graphics card, you’ve got everything you need. It’s that easy.
Apple is proof that proprietary ecosystems still have a place and can be very profitable. But not every company is Apple, and complaints about Apple’s proprietary ports like Lightning and the old 30-pin connector have reached a fever pitch. Consumers crave choice in these trying times. They hate feeling as if they don’t truly “own” their tech, and they hate paying a premium for that dubious privilege. Especially gamers. Which makes Nvidia’s choices seem, frankly, foolish.
Bottom Line: Is Nvidia G-Sync Worth It?
So does G-Sync achieve what it set out to do? Does it solve the issue of screen tearing in a hardware-friendly way? Absolutely.
Should you invest in this exciting new technology? You can’t see me right now, but I’m shrugging at the open air beside my monitor like a lead character on The Office.
If you’re already all-in on Nvidia with a compatible monitor, now you know what all those G-Sync stickers were about and can cash in on the tech. Go ahead. There’s no reason not to!
But if you’re shopping for an upgrade, unless you’re already planning to dive feet-first into RTX, I recommend taking a look around first. I can already see the comments rolling in with cherry-picked G-Sync compatible monitors at door-busting prices. But the fact remains that, as a whole, G-Sync monitors are more expensive than comparable monitors without the module. Not to mention that you’re already in the deep end of the price pool with a high-end Nvidia graphics card anyway.
Nvidia has even started supporting the open adaptive refresh standard with their “G-SYNC Compatible” certification, just to try to sweeten the deal. But it’s too late. There are plenty of nifty new features worth dropping a chunk of change on, but preventing screen tearing via a proprietary hardware module isn’t one of them. At least, not at a premium.
Is Nvidia G-Sync worth it? No. But if you already have it, enjoy it! Don’t let a review like this make you feel worse about what you’ve got under the hood. At the end of the day, all that matters is that you’re fragging faces and having the peak of fun while doing it.
Oh, and playing Half-Life: Alyx. That matters a lot too.