Future Intel iGPUs will support Adaptive Sync technology

Intel supports adaptive sync

Good news for everyone! Nvidia and AMD have been competing for a while in the variable refresh rate game with their G-SYNC and FreeSync technologies, respectively, and now Intel is joining the game too.
Intel’s Chief Graphics Software Architect, David Blythe, has revealed that they will support the VESA Adaptive Sync standard in their integrated graphics processing units (iGPUs) in the future. This means you will be able to enjoy the benefits of variable refresh rate without a discrete graphics card; all you will need is a compatible Intel processor.

Blythe has not given a timeframe for this exciting feature, but we expect it will take quite some time yet, possibly several Intel CPU generations away.

When the time finally comes, Intel users will be able to use variable refresh rate (VRR) technology on any monitor that supports FreeSync, since FreeSync is simply AMD’s marketing term for the VESA Adaptive Sync standard.

Intel will gain a lot by supporting the VESA Adaptive Sync technology. Their iGPUs are good enough for less demanding games, and the feature will make gaming a much better experience at lower frame rates, eliminating the screen tearing and stuttering that you would otherwise get.

It will be exciting to see how this turns out for Nvidia and their G-SYNC technology, which has nothing to do with the VESA Adaptive Sync standard and is basically its own thing. G-SYNC requires proprietary hardware (a chip in the monitor), and a G-SYNC monitor is usually about $150 more expensive than the FreeSync version of the same model. So if Adaptive Sync monitors become more common and more widespread than G-SYNC monitors, that certainly will not be good news for Nvidia.

  • Tyrann

    Glad this happened. Hopefully Nvidia drops Gsync and jumps on the Freesync bandwagon 🙂 Free is better and the $150 mark up on monitors plus locking them to only Nvidia cards ain’t cool.

  • anonymous72663

    Good riddance… Nvidia !