FreeSync Features

In many ways FreeSync and G-SYNC are comparable. Both refresh the display as soon as a new frame is available, at least within their normal range of refresh rates. There are differences in how this is accomplished, however.

G-SYNC uses a proprietary module that replaces the normal scaler hardware in a display. Beyond the cost factor, this means that any company looking to make a G-SYNC display has to buy that module from NVIDIA. Of course, the reason NVIDIA went with a proprietary module was that adaptive sync didn’t exist when they started working on G-SYNC, so they had to create their own protocol. Basically, the G-SYNC module controls all the regular core features of the display like the OSD, but it’s not as full featured as a “normal” scaler.

In contrast, as part of the DisplayPort 1.2a standard, Adaptive Sync (which is what AMD uses to enable FreeSync) will likely become part of many future displays. The major scaler companies (Realtek, Novatek, and MStar) have all announced support for Adaptive Sync, and it appears most of the changes required to support the standard could be accomplished via firmware updates. That means even if a display vendor doesn’t have a vested interest in making a FreeSync branded display, we could see future displays that still work with FreeSync.

Having FreeSync integrated into most scalers has other benefits as well. All the normal OSD controls are available, and the displays can support multiple inputs – though FreeSync of course requires the use of DisplayPort, as Adaptive Sync doesn’t work with DVI, HDMI, or VGA (DSUB). AMD mentions in one of their slides that G-SYNC also lacks support for audio over DisplayPort, and there’s mention of color processing as well, though this is somewhat misleading. NVIDIA's G-SYNC module supports color LUTs (Look Up Tables), but it doesn't support multiple color options like the "Warm, Cool, Movie, User, etc." modes that many displays have; NVIDIA states that the focus is on properly producing sRGB content, and so far the G-SYNC displays we've looked at have done quite well in this regard. We’ll look at the “Performance Penalty” aspect as well on the next page.
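For readers unfamiliar with the term, a 1D color LUT is just a per-channel remapping table. The following Python sketch is purely illustrative – the 2.2 gamma curve and 8-bit depth are assumptions for the example, not details of NVIDIA's module or any particular scaler:

```python
# Illustrative sketch of a 1-D per-channel color LUT, as a display
# scaler or G-SYNC module might apply one. A simple 2.2 gamma curve
# is used here as a stand-in for whatever correction the display needs.

GAMMA = 2.2
# Precompute the table once: each of the 256 input values maps to a
# corrected 8-bit output value.
LUT = [round(255 * (i / 255) ** GAMMA) for i in range(256)]

def apply_lut(pixel):
    """Remap an (r, g, b) tuple of 8-bit values through the LUT."""
    return tuple(LUT[c] for c in pixel)

print(apply_lut((0, 128, 255)))  # → (0, 56, 255)
```

The point is that a LUT like this can correct the display's response toward a target such as sRGB, but it is a single fixed mapping – offering multiple "Warm/Cool/Movie" presets would simply mean shipping several alternative tables, which the G-SYNC module opts not to do.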

One other feature that differentiates FreeSync from G-SYNC is how things are handled when the frame rate is outside of the dynamic refresh range. With G-SYNC enabled, the system will behave as though VSYNC is enabled when frame rates are either above or below the dynamic range; NVIDIA's goal was to have no tearing, ever. That means if you drop below 30FPS, you can get the stutter associated with VSYNC, while going above 60Hz/144Hz (depending on the display) is not possible – the frame rate is capped. Admittedly, neither situation is a huge problem, but AMD provides an alternative with FreeSync.

Instead of always behaving as though VSYNC is on, FreeSync can revert to either VSYNC off or VSYNC on behavior if your frame rates are too high/low. With VSYNC off, you could still get image tearing but at higher frame rates there would be a reduction in input latency. Again, this isn't necessarily a big flaw with G-SYNC – and I’d assume NVIDIA could probably rework the drivers to change the behavior if needed – but having choice is never a bad thing.
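The two fallback policies can be sketched as a simple model. This is purely illustrative Python, not actual driver logic; the 40-144Hz range and the function name are assumptions for the example:

```python
# Illustrative model (NOT actual driver code) of how a variable-refresh
# display might handle frame times that fall outside its dynamic range.

def present(frame_interval_ms, min_hz=40, max_hz=144, vsync_fallback=True):
    """Return the effective display interval (ms) for one frame.

    Inside the dynamic range, the display refreshes when the frame
    arrives. Outside it, the fallback policy decides: VSYNC-on clamps
    to the panel's limits (possible stutter below min_hz, a frame rate
    cap above max_hz), while VSYNC-off scans out immediately (possible
    tearing, but lower input latency at high frame rates).
    """
    min_interval = 1000.0 / max_hz   # fastest the panel can refresh
    max_interval = 1000.0 / min_hz   # slowest before it must refresh anyway

    if min_interval <= frame_interval_ms <= max_interval:
        return frame_interval_ms     # adaptive: refresh on frame arrival
    if vsync_fallback:
        # VSYNC-on behavior: clamp to the panel's refresh limits.
        return max(min_interval, min(frame_interval_ms, max_interval))
    # VSYNC-off behavior: present immediately, tearing possible.
    return frame_interval_ms

# A 200 FPS frame (5 ms) on a 144 Hz panel gets capped with VSYNC on:
print(round(present(5.0), 2))  # → 6.94 (i.e. 144 Hz)
```

Under this model, G-SYNC always behaves as though `vsync_fallback=True`, whereas FreeSync lets the user choose either policy when the frame rate leaves the supported range.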

There’s another aspect to consider with FreeSync that might be interesting: as an open standard, it could potentially find its way into notebooks sooner than G-SYNC. We have yet to see any shipping G-SYNC enabled laptops, and it’s unlikely most notebook manufacturers would be willing to pay $200 or even $100 extra to get a G-SYNC module into a notebook; there's also the question of power requirements. Then again, earlier this year there was an inadvertent leak of some alpha drivers that allowed G-SYNC to function on the ASUS G751j notebook without a G-SYNC module, so it’s clear NVIDIA is investigating other options.

While NVIDIA may do G-SYNC without a module for notebooks, there are still other questions. With many notebooks using a form of dynamic switchable graphics (Optimus and Enduro), support for Adaptive Sync by the Intel processor graphics could certainly help. NVIDIA might work with Intel to make G-SYNC work (though it’s worth pointing out that the ASUS G751 doesn’t support Optimus so it’s not a problem with that notebook), and AMD might be able to convince Intel to adopt DP Adaptive Sync, but to date neither has happened. There’s no clear direction yet but there’s definitely a market for adaptive refresh in laptops, as many are unable to reach 60+ FPS at high quality settings.

350 Comments

  • Oxford Guy - Friday, March 20, 2015 - link

    "Now you want existing displays that are already assembled to be pulled apart and upgraded. That would likely cost more money than just selling the displays at a discount, as they weren't designed to be easily disassembled and upgraded."

    If that's the case... I wonder why that is? Could it be the blithe acceptance of ridiculous cases of planned obsolescence like this?

    Manufacturers piddle out increments of tech constantly to try to keep a carrot on a stick in front of consumers. Just like with games and their DLC nonsense, the new mindset is replace, replace, replace... design the product so it can't be upgraded. Fill up the landfills.

    Sorry, but my $800 panel isn't going to just wear out or be obsolete in short order. People who spent even more are likely to say the same thing. And, again, many of these products are still available for purchase right now. The industry is doing consumers a disservice enough by not having standards (incompatible competing G-Sync and FreeSync) but it's far worse to tell people they need to replace otherwise perfectly satisfactory equipment for a minor feature improvement.

    You say it's not feasible to make monitors that can be upgraded in a relatively minor way like this. I say it is. It's not like we're talking about installing DisplayPort into a panel that didn't have it or something along those lines. It's time for the monitor industry to stop spewing out tiny incremental changes and expecting wholesale replacement.

    This sort of product and the mindset that accompanies it is optional, not mandatory. Once upon a time things were designed to be upgradable. I suppose the next thing you'll fully endorse are motherboards with the CPUs, RAM, and everything else soldered on (which Apple likes to do) to replace DIY computing... Why not? Think of how much less trouble it will be for everyone.
  • Oxford Guy - Friday, March 20, 2015 - link

    "it's probable that G1 *couldn't* be properly upgraded to support TRIM" "since you were working at Intel's Client SSD department...oh, wait, you weren't." So, I assume I should use the same retort on you with your "probable", eh?
  • Oxford Guy - Friday, March 20, 2015 - link

    The other thing you're missing is that Intel never told consumers that TRIM could not be added with a firmware patch. It never provided anyone with an actual concrete justification. It just did what is typical for these companies and for publications like yours: told people to buy the latest shiny to "upgrade".
  • Gunbuster - Thursday, March 19, 2015 - link

    So G-Sync has been available to purchase for what a year now? And AMD comes to the table with something exactly the same. How impressive.

    Oh, and the Crossfire driver gets the traditional "trust us, Coming Soon™" treatment.
  • chizow - Thursday, March 19, 2015 - link

    18 months later, and not exactly the same, still worse. But yes, we must give it to AMD, at least they brought something to the table this time.
  • Gigaplex - Friday, March 20, 2015 - link

    The troll is strong in this one. You keep repeating how this is technically worse than G-SYNC and have absolutely nothing to back it up. You claim forced V-SYNC is an issue with FreeSync, but it's the other way around - you can't turn V-SYNC off with G-SYNC but you can with FreeSync. You don't address the fact that G-SYNC monitors need the proprietary scaler that doesn't have all the features of FreeSync capable scalers (e.g. more input ports, OSD functionality). You accuse everyone who refutes your argument of AMD fanboy sentimentality, when you yourself are the obvious NVIDIA fanboy. No doubt you'll accuse me of being an AMD fanboy too. How wrong you are.
  • JarredWalton - Friday, March 20, 2015 - link

    Technically the G-SYNC scaler supports an OSD... the options are just more limited as there aren't multiple inputs to support, and I believe NVIDIA doesn't bother with supporting multiple *inaccurate* color modes -- just sRGB and hopefully close to the correct values.
  • chizow - Friday, March 20, 2015 - link

    Actually you're wrong again: Vsync is always off. There is a frame cap turned on via the driver, but that is not Vsync, as the GPU is still controlling the frame rate.

    Meanwhile, FreeSync is still clearly tied to Vsync, which is somewhat surprising in its own right since AMD has historically had issues with driver-level Vsync.

    I've never once glossed over the fact G-Sync requires proprietary module, because I've clearly stated the price and tech is justified if it is a better solution and as we saw yesterday, it clearly is.

    I've also acknowledged that multiple inputs and an OSD are amenities that are a bonus, but they're certainly secondary to these panels excelling at what they are purchased for. I have 2xU2410 companion panels with TONS of inputs for anything I need beyond gaming.
  • darkfalz - Thursday, March 19, 2015 - link

    I have to give it to AMD here - I was skeptical this could be accomplished without dedicated hardware to buffer the video frames on the display, but they've done it. I still wouldn't buy one of their power hungry video cards but it's good for AMD fans. This is good news for G-Sync owners too as it should drive down the artificially inflated price (partly due to lack of competition, partly due to early adoption premium). After fiddling around with triple buffering and triple buffering overrides for years (granted, less of a problem on DX10/11 as it seems many modern engines have some form of "free" triple buffering) it's good to go to perfect refresh rates. As a big emulation fan (many arcade games use refresh rates anywhere from 50 to 65 Hz), I find these displays great for that as well. Was input lag tested? AMD don't claim to have Vsync-off like input lag reduction. This would be superb in a laptop where displaying every last frame is important (Optimus provides a sort of "free" triple buffering of its own, but it's not the smoothest and often requires you to set a 60 FPS frame cap).
  • darkfalz - Thursday, March 19, 2015 - link

    By G-Sync owners, I guess I mean NVIDIA fans / prospective G-Sync buyers. G-Sync owners (like me) have already paid the premium.
