http://www.tomshardware.com/news/nvidia-amd-adaptive-sync-freesync,27736.html
A recent news story says "no": Nvidia reportedly won't support adaptive sync. That, I think, would be suicidal for Nvidia, so it's likely that something was misinterpreted or lost in translation or some such.
For those not familiar with it, adaptive sync is the big new thing in monitors: it lets a monitor refresh when there's a new frame to display, rather than at fixed time intervals. Refreshing only when a new frame is ready leads to much smoother animation and considerably lower display latency at any frame rate below the monitor's maximum refresh rate. It's the biggest deal in new monitor technologies in many years, arguably going back to the transition from CRT to LCD monitors.
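To see where the latency saving comes from, consider a game that finishes a frame every 23 ms (about 43 fps) on a 60 Hz fixed-refresh monitor: each finished frame sits in a buffer until the next 16.7 ms refresh tick. Here's a toy C simulation of that (made-up frame times, no real display API involved):

/* Toy latency comparison: a frame that finishes rendering at time t is
 * shown at the next 60 Hz tick under fixed vsync, but immediately under
 * adaptive sync. Compile with -lm for ceil(). */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double refresh = 1000.0 / 60.0; /* fixed refresh period, ms          */
    const double frame   = 23.0;          /* hypothetical render time, ~43 fps */

    for (int i = 1; i <= 5; i++) {
        double ready = i * frame;                       /* frame finishes here */
        double fixed = ceil(ready / refresh) * refresh; /* wait for next tick  */
        printf("frame %d: ready %6.1f ms, fixed vsync shows it at %6.1f ms "
               "(+%4.1f ms), adaptive sync at %6.1f ms\n",
               i, ready, fixed, fixed - ready, ready);
    }
    return 0;
}

Note how the extra wait under fixed vsync varies from frame to frame (+10.3 ms, +4.0 ms, +14.3 ms, ...); that variation is exactly the judder adaptive sync removes.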
Nvidia already has G-sync, which is a proprietary version of the same thing. Adaptive sync is a VESA-blessed industry standard that basically everyone else in the industry--notably including the monitor and laptop vendors--supports. Importantly, implementing adaptive sync doesn't require new hardware, so next year's generation of monitors can be whatever they would have been without adaptive sync, and then add adaptive sync for free if they support DisplayPort. G-sync costs $100 or so in proprietary hardware built into the monitor.
Also importantly, AMD's GCN cards--basically their entire modern lineup--already partially support adaptive sync, and several of their GPU chips fully support it. It's just waiting on monitors that support it to come to market. If Nvidia doesn't support adaptive sync, then to get the same benefits from G-sync, you face a very restricted selection of monitors that cost $100+ more than comparable competition without G-sync.
That would, by itself, be a pretty good reason to dismiss Nvidia out of hand unless you know that you're keeping an old monitor and not willing to upgrade anytime soon. That's why I say it would be suicidal for Nvidia not to support adaptive sync. And ultimately, it's why I think they will support it, contrary to today's story. Nvidia's pushing of stupid proprietary junk hasn't stopped them from also supporting the industry standards in the past. For example, Nvidia still pushes CUDA, but also supports OpenCL.
Comments
I think this will be resolved by engineering over time. If the software processing gets fast enough to make it reasonable, then it will happen; if not, it won't...
the unfortunate side of this is for us consumers...
it may (like a couple of other computer standards fights) vacillate between two or more alternatives, which means people who pick wrong get stuck...
Not that it matters in 2014, but I still liked the picture quality on my Betamax better than the VHS machines of the day...
in the end it will settle out...
First, FreeSync has to come to market. Until then, nVidia's support doesn't have much to do with it.
Sure, it may be an open standard, but right now there are zero products available at retail that support it.
AMD expects the first "FreeSync-Compatible" monitors to reach market in Q1 2015.
G-Sync monitors you can buy today. Whether you should buy one, that's a different question.
Being "free" and open doesn't necessarily mean FreeSync will be successful. Linux is free and open. OpenGL is free and open. Neither are very dominating when it comes to gaming technologies though. And to be fair, being first to market doesn't mean G-Sync will succeed either - gamers seem more intrigued by 4K and VR than by Adaptive Sync.
Personally, I think Adaptive Sync could be a big deal - It sounds really good on paper. But I've never seen a side-by-side comparison with my own eyes. I'm not buying, or holding off on buying, anything based on FreeSync or G-Sync right now.
I suppose that it's reasonable to ask if the products will show up on time and work properly. But I'm not buying your comparisons. Linux is an operating system, which makes it incredibly complicated. An enormous number of things could go wrong, and some of them do. OpenGL is an API for controlling chips that are among the most complicated ASICs in existence--and it has to compete with DirectX, which is also free.
All that adaptive sync has to do, in contrast, is let the video card and monitor decide when it makes sense for the monitor to refresh. The logical rules of how to do this sensibly are very simple. Has there ever been a monitor of any sort that couldn't refresh when it promised to, apart from a defective or outright dead unit? Sure, it's possible that adaptive sync monitors could get delayed again and again for strange reasons, but there isn't much logic to go wrong.
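To make "very simple" concrete, here's a toy sketch in C of the core scheduling rule as I understand it--my own illustration with a hypothetical 40-144 Hz panel, not the actual VESA spec or any driver's code. Clamp the next scan-out to the panel's supported refresh-interval range, and repeat the old frame if no new one shows up in time:

/* Toy adaptive-sync scheduling rule (illustration only, not the spec):
 * the monitor advertises a supported refresh-interval range, and the
 * GPU schedules the next scan-out when a frame is ready, clamped to
 * that range. */
#include <stdio.h>

/* Hypothetical 40-144 Hz panel: interval limits in milliseconds. */
static const double MIN_INTERVAL_MS = 1000.0 / 144.0;
static const double MAX_INTERVAL_MS = 1000.0 / 40.0;

/* Given how long after the last refresh a new frame became ready
 * (or -1 if none is ready yet), return how long after the last
 * refresh the next scan-out should start. */
double next_refresh_ms(double frame_ready_ms) {
    if (frame_ready_ms < 0 || frame_ready_ms > MAX_INTERVAL_MS)
        return MAX_INTERVAL_MS;  /* no frame in time: repeat the old one */
    if (frame_ready_ms < MIN_INTERVAL_MS)
        return MIN_INTERVAL_MS;  /* the panel can't refresh any faster   */
    return frame_ready_ms;       /* otherwise, refresh immediately       */
}

int main(void) {
    double cases[] = {5.0, 12.0, 22.0, -1.0};
    for (int i = 0; i < 4; i++)
        printf("frame ready at %+6.1f ms -> refresh at %6.2f ms\n",
               cases[i], next_refresh_ms(cases[i]));
    return 0;
}

Everything else--advertising the supported range over DisplayPort, repeating frames below the minimum rate--is bookkeeping around that one rule.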
-----
The original question of the thread is not whether Nvidia will abandon G-sync. The question is whether it will support adaptive sync in addition to G-sync.