NVIDIA GPUs to make use of adaptive sync on FreeSync monitors
NVIDIA shocked the PC gaming world Sunday night with news that certain FreeSync monitors would receive G-SYNC certification with an upcoming driver.
Desktop graphics giant NVIDIA made some big headlines Sunday night during its CES 2019 keynote presentation. Among announcements of new graphics cards, displays, and expanded software support for its RTX series of graphics cards, the company dropped a bombshell on the PC gaming world: adaptive sync will be supported on monitors that adhere to the VESA spec for variable refresh rate operation. A few select gaming monitors that have previously been marketed as compatible with AMD's adaptive sync technology, known as FreeSync, will be receiving NVIDIA G-SYNC certification. The new functionality will be enabled in an upcoming driver update.
NVIDIA first launched its G-SYNC-capable displays back in 2013. These monitors overcame the screen tearing and stuttering issues that had plagued PC gamers for years by synchronizing the monitor's refresh rate with the delivery of frames rendered by the GPU. This solution offered benefits over traditional v-sync because it didn't introduce input lag or stutter. At its most effective, the technology allowed games to be displayed smoothly even when the PC hardware could not deliver a consistent framerate.
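To make the difference concrete, here is a minimal, illustrative sketch of the timing argument (hand-rolled pseudologic, not NVIDIA's actual implementation; the 60Hz panel and frame times are assumed numbers). With v-sync, a finished frame must wait for the next fixed scanout tick, while adaptive sync lets the monitor refresh as soon as the frame is ready:

```python
# Illustrative only: when frames reach the screen under fixed-refresh
# v-sync vs. variable refresh (G-SYNC / adaptive sync). All timings here
# are made-up example numbers.

FIXED_INTERVAL_MS = 1000 / 60  # a 60Hz panel scans out every ~16.7 ms

def vsync_display_times(ready_times_ms):
    """Under v-sync, each finished frame waits for the next fixed refresh
    tick, so uneven render times become visible stutter."""
    return [
        (int(ready / FIXED_INTERVAL_MS) + 1) * FIXED_INTERVAL_MS
        for ready in ready_times_ms
    ]

def adaptive_display_times(ready_times_ms, min_interval_ms=1000 / 144):
    """Under adaptive sync, the monitor refreshes as soon as the frame is
    ready, limited only by the panel's maximum refresh rate."""
    shown, last = [], 0.0
    for ready in ready_times_ms:
        last = max(ready, last + min_interval_ms)
        shown.append(last)
    return shown

# A GPU delivering uneven frame times (roughly 14-22 ms per frame):
ready = [14, 36, 52, 72, 86]
print("v-sync:       ", [round(t, 1) for t in vsync_display_times(ready)])
print("adaptive sync:", [round(t, 1) for t in adaptive_display_times(ready)])
```

With these made-up inputs, the 22 ms gap between the first two frames becomes a 33.3 ms two-refresh hitch under v-sync, while the adaptive timeline simply tracks the GPU's own pacing.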
AMD later offered its own solution, dubbed FreeSync, which enabled similar functionality on its graphics cards. AMD's solution was effectively a re-branding of the industry-standard VESA Adaptive-Sync spec. Unlike NVIDIA's approach, which required dedicated hardware built into the monitor as well as strict performance guidelines, AMD's FreeSync certification could be freely slapped onto any monitor capable of variable refresh rates. FreeSync-capable monitors were generally much cheaper than similar G-SYNC monitors, but consumers were locked into one company or the other if they wanted adaptive sync for their video games.
NVIDIA’s announcement at CES 2019 marks a drastic change from the approach the company has taken over the last five years. The company says it tested over 400 variable refresh rate monitors and has certified 12 of them as G-SYNC Compatible. All of these models were previously marketed and sold with AMD FreeSync compatibility. With a driver update arriving on January 15, the certified models will be automatically configured for variable refresh rate operation when used with Turing (RTX 2xxx) or Pascal (GTX 1xxx) GPUs.
If this news wasn’t wild enough, NVIDIA will also allow users of the new driver to manually enable adaptive sync on any compatible display, though the feature is not officially supported outside of the 12 models mentioned during the keynote. AMD’s FreeSync certification is rather lenient compared to NVIDIA’s G-SYNC certification, so there are loads of FreeSync displays in the wild that only offer narrow variable refresh ranges. Most G-SYNC-certified displays have active ranges of 30-144Hz (or more), while some FreeSync displays only offer ranges of 48-60Hz. The 12 models that NVIDIA granted certification are certainly the cream of the crop of the FreeSync family.
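For a rough sense of why those ranges matter, here is a small sketch (assumed example numbers, not NVIDIA's certification criteria) that checks whether a framerate falls inside a panel's variable refresh window, including the frame-multiplying trick that AMD's low framerate compensation (LFC) uses on wide-range monitors:

```python
# Illustrative sketch: can a panel's variable refresh range track a given
# framerate? Wide ranges can also multiply low framerates back into range,
# the way low framerate compensation (LFC) does. Numbers are examples.

def effective_refresh(fps, range_min_hz, range_max_hz):
    """Return a refresh rate the panel could run for this framerate,
    repeating each frame 2x/3x if the range is wide enough, else None."""
    multiplier = 1
    while fps * multiplier <= range_max_hz:
        if fps * multiplier >= range_min_hz:
            return fps * multiplier  # adaptive sync stays engaged
        multiplier += 1
    return None  # out of range: back to tearing or v-sync behavior

for fps in (40, 55, 90):
    narrow = effective_refresh(fps, 48, 60)   # a narrow budget FreeSync range
    wide = effective_refresh(fps, 30, 144)    # a typical G-SYNC-class range
    print(f"{fps} fps -> 48-60Hz panel: {narrow}, 30-144Hz panel: {wide}")
```

On the narrow 48-60Hz panel, a dip to 40fps (or anything past 60fps) drops out of the window entirely, while the 30-144Hz panel keeps every one of these framerates synced.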
As for why NVIDIA decided to change course and open up adaptive sync on its GPUs after years of keeping it locked to G-SYNC? No one knows for certain, but the company may want to put extra pressure on AMD. It’s also possible that the impending release of HDMI 2.1 (which includes variable refresh rate support in the spec) forced its hand. Rumors continue to swirl that the next generation of consoles will support variable refresh rates, and many new 4K TVs already support the technology. Either way, this is nothing but good news for PC gamers with NVIDIA GPUs. The market is loaded with low-priced FreeSync gaming monitors that can now use adaptive sync with NVIDIA GPUs.
If you have been thinking about buying a new gaming monitor for your PC or upgrading from an older model, check out our gaming monitor guide. It’s loaded with great G-SYNC and FreeSync options.
-
Chris Jarrard posted a new article, NVIDIA GPUs to make use of adaptive sync on FreeSync monitors
-
Has anyone done real (first hand!) comparisons between VA and IPS panels? Is it dramatically different?
I have a 10 year old VA panel I've been happy with. But I do a fair bit of photo editing (as a hobby) so I figure an upgrade to IPS might be beneficial.
Of course, I REALLY want to play Overwatch at 144hz as well (on my 1080 GTX).
-
Yes!
I've had many IPS panels, a bunch for work (which I still have) and a 34-inch UltraWide which I gamed on. I also have a 4K HDR IPS TV.
For VA, I have a 4K Vizio TV (non-HDR) and a Z35 computer monitor (35-inch UltraWide, 200hz).
I've also had many TN panels, including a VG248QE: great for online competitive MP where motion clarity at high hz/fps is key, but absolute dog shit for everything else imo.
IPS is certainly the better all-purpose productivity and reading (web browsing, etc.) panel, but its relatively low contrast means that, even with HDR, colors don't tend to pop as much and blacks are not very deep - and you ALWAYS have the dreaded IPS glow in dark scenes, which drove me nuts in games. But of course, accurate colors and great viewing angles.
VA has its own issues, like gamma shift and 'ghosting' or 'smearing' in scenes with a lot of g2g (grey to grey) transitions. Newer panels are better at this, but it's still an issue inherent to the tech. But the panels have super great colors, deep blacks, and are good at high refresh rates (mine goes to 200hz). They are also generally cheaper than IPS panels (the Z35 has been as low as $500 USD).
For gaming and movies, I'll take VA any day over IPS because of the high contrast and deep blacks. IPS glow is something I cannot personally get over, and washed-out colors in darker scenes drive me insane. IPS, however, wins hands down for multi-purpose and productivity.
-
VA for "gaming and movies"..... would you say the same thing if you were primarily playing twitch first person shooters (Q3a, overwatch... I dunno what else. I'm old.), as well as having zero interest in playing a story driven (rpg style for example) game?
I'm wondering how the IPS "glow" compares to the VA glow on my OLD ass Dell 2407 I've had forever. It's weird, but blacks are definitely not BLACK on this monitor.
-
Yea, I think you'd notice a big improvement on newer VA panels vs the Dell you're talking about.
If my primary focus was online competitive MP FPS shooters, I would get one of the newer 144hz+ TN panels (like the Swift), or if you're on a budget, you can get a 24-inch 144hz TN for as low as $150 USD.
Still, if you're doing professional work you need an IPS. Is a two-monitor solution OK? If not, then get an X34 or something, since the quality of your work > gaming.
-
I've been googling monitor reviews between acting like I'm hard at work - and I'm still waffling. I know IPS would be best for my photos, but man... I need to go hang out in some computer stores and check out the IPS screens to see if the glow bothers me.
Finding a 27" regular or 34" UW IPS with 144hz is also pretty rare. Blegh. I guess I'll be waiting a month or three anyway... I'll see what's out.
-
VA still isn't there for 144 Hz yet, though it's pretty darn close. You should expect to see some overdrive artifacting at times, such as discolouration and shimmery edges.
The colours and contrast on VA are fantastic, though. It'll be the technology to buy once the refresh rates are there (unless OLED dislodges it before then).
-
I have had both next to each other. Both excel and suck at different things. VA has great contrast and blacks but ghosts like a motherfucker and is a bit slower than IPS, while IPS has accurate color but tends to make any dark game borderline unplayable due to IPS glow (I couldn’t see shit in Metro 2033).
Also backlight bleed and panel lottery. https://duckduckgo.com/?q=backlight+bleed&t=ffsb&iax=images&ia=images
For a long time I kept a TN panel around for gaming, but color and contrast on those is not great in comparison. If I were to buy a new monitor today, it would be a 27" IPS a la http://www.tftcentral.co.uk/reviews/asus_mg279q.htm
-
Was bound to happen sooner or later. You can only ask your customers to pay a premium for a certain feature while your competition gives it "for free" for so long. After a certain point having been there first doesn't matter. Having to buy a much more expensive monitor to get the best out of your video card isn't exactly a competitive advantage.
-
https://www.videocardbenchmark.net/30dayshare.html
K. NVIDIA has 3x the market share of AMD. G-SYNC will be around for many more years.
-
My fact-free guess is that HDMI 2.1 is going to include similar functionality as part of the spec, and so NVIDIA took steps to ensure it would be part of the bidding process for next-gen consoles. By keeping G-SYNC proprietary for 5 years, they probably funded the R&D.
I'm happy to see any news that the graphics swamp is draining, not filling, though.