AMD FreeSync is a name you are likely to hear a lot between now and the end of 2014. When NVIDIA introduced variable refresh rate monitor technology to the world in October of last year, one of the immediate topics of conversation was the response AMD would have. NVIDIA's G-Sync technology is limited to NVIDIA graphics cards, and only a handful of monitors (actually just one as I write this) have the specialized hardware required to support it. In practice though, variable refresh rate monitors fundamentally change the gaming experience for the better.
At CES, AMD went on the offensive and started showing press a hacked-up demo of what it called "FreeSync", a similar take on variable refresh technology running on a laptop. At the time, the notebook was a requirement of the demo because of the way AMD's implementation worked. Mobile displays have previously included variable refresh technologies in order to save power and battery life. AMD found that it could repurpose that technology to emulate the effects that NVIDIA G-Sync creates – a significantly smoother gaming experience without the side effects of Vsync.
Our video preview of NVIDIA G-Sync Technology
Since that January preview, things have progressed for the "FreeSync" technology. AMD took the idea to the VESA group responsible for the DisplayPort standard, and in April we found out that VESA had officially adopted the technology under the name Adaptive Sync.
So now what? AMD is at Computex and is of course taking the opportunity to demonstrate a "FreeSync" monitor with the DisplayPort 1.2a Adaptive Sync feature at work. Though AMD isn't saying which monitor it is or who manufactures it, the demo is up and running with frame rates wavering between 40 FPS and 60 FPS – the range of frame rates most likely to adversely affect the gaming experience. AMD has a windmill demo running on the system, perfectly suited to showing the stuttering of Vsync enabled and the tearing of Vsync disabled with a constantly rotating object. It is very similar to the clock demo NVIDIA uses to show off G-Sync.
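To make those two failure modes concrete, here is a minimal, purely illustrative Python sketch (not anything from AMD's demo) of why a steady 45 FPS looks juddery on a fixed 60 Hz monitor with Vsync enabled but perfectly even on a variable refresh display:

```python
# Illustrative only: compare when frames reach the screen at a steady
# 45 FPS render rate on a fixed 60 Hz monitor (Vsync on) versus a
# variable refresh monitor that refreshes whenever a frame is ready.
import math

RENDER_FPS = 45.0        # hypothetical steady render rate
FIXED_REFRESH_HZ = 60.0  # conventional monitor refresh rate

frame_ready = [i / RENDER_FPS for i in range(10)]  # completion time of each frame (s)

# Vsync on: a finished frame waits for the next fixed refresh tick.
vsync_shown = [(math.floor(t * FIXED_REFRESH_HZ) + 1) / FIXED_REFRESH_HZ for t in frame_ready]

# Variable refresh: the panel refreshes as soon as the frame is ready.
vrr_shown = frame_ready

def intervals_ms(times):
    return [round((b - a) * 1000, 1) for a, b in zip(times, times[1:])]

print("Vsync on, 60 Hz:", intervals_ms(vsync_shown))  # mix of 16.7 and 33.3 ms -> visible judder
print("Variable refresh:", intervals_ms(vrr_shown))   # steady 22.2 ms between frames
```

With Vsync disabled the finished frame would instead be scanned out mid-refresh, which trades the uneven pacing for tearing.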
The demo system is powered by an AMD FX-8350 processor and Radeon R9 290X graphics card. The monitor is running at 2560×1440 and is the very first working prototype of the new standard. Even more interesting, this is a pre-existing display that has had its firmware updated to support Adaptive Sync. That's potentially exciting news! Monitors COULD BE UPGRADED to support this feature, but AMD warns us: "…this does not guarantee that firmware alone can enable the feature, it does reveal that some scalar/LCD combinations are already sufficiently advanced that they can support some degree of DRR (dynamic refresh rate) and the full DPAS (DisplayPort Adaptive Sync) specification through software changes."
The time frame for retail availability of monitors using DP 1.2a is up in the air, but AMD has told us that the end of 2014 is entirely reasonable. Based on the painfully slow release of G-Sync monitors into the market, AMD has less of a time hole to dig out of than we originally thought, which is good. What is not good news is that this feature isn't going to be supported on the full range of AMD Radeon graphics cards. Only the Radeon R9 290/290X and R7 260/260X (and the R9 295X2 of course) will actually be able to support the "FreeSync" technology. Compare that to NVIDIA's G-Sync, which is supported by the entire GeForce GTX 600 and GTX 700 series of cards.
All that aside, seeing the first official prototype of "FreeSync" is awesome and has me pretty damn excited about variable refresh rate technology once again! Hopefully we'll get some more hands-on time (eyes-on, whatever) with a panel in the near future to really see how it compares to the experience that NVIDIA G-Sync provides. There is still a chance the technologies are not directly comparable, and some in-depth testing will be required to find out.
“Monitors COULD BE UPGRADED to support this feature, but AMD warns us: “…this does not guarantee that firmware alone can enable the feature, it does reveal that some scalar/LCD combinations are already sufficiently advanced that they can support some degree of DRR (dynamic refresh rate) and the full DPAS (DisplayPort Adaptive Sync) specification through software changes.”
So just like G-Sync, except for the lucky few who just happen to have the right hardware, everyone has to go out and buy new displays that can take advantage of this. Good to see AMD support this, but in the end all this means is I have to buy a new display with either (or preferably both) technologies implemented.
I bring this up because when G-Sync was announced, a fanboy war instantly started for no good reason, trying to vilify one company or the other for something that requires a new purchase for 99% of the population anyway.
“Only the Radeon R9 290/290X and R7 260/260X (and the R9 295X2 of course) will actually be able to support the “FreeSync” technology.”
You forgot about that little tidbit too: only 5 AMD cards support it, which means you might not only have to buy a new monitor but a specific GPU model as well. Yeah, the fanboys (and AMD) were in on vilifying NVIDIA over the need for a new monitor, but AMD's idea is in the same boat – really worse, since it lacks the GPU support NVIDIA already has on their side.
Your reactionary approach here is a little wrong-headed. How are AMD supposed to add DP 1.2a support to existing products? 🙂
Well, NVIDIA's G-Sync wasn't released until well after the 700 series, yet 600 series cards support it, so?
Yes, because they add proprietary logic to the monitor. AMD is limited with freesync because they cannot add logic to the monitor.
You’ve gleefully missed several points there:
1) Some people may have monitors already capable of it. Not true with G-SYNC.
2) It doesn’t require any additional hardware above and beyond what would already be in the monitor. G-SYNC requires some fairly hefty additional hardware to be added to the display.
3) Any DisplayPort 1.2a device going forward will support this. AMD, Intel, nVidia – it’s there for all.
The fanboy war started because it’s a great idea and, like PhysX, nVidia decided to try to turn it into a proprietary competitive advantage rather than moving the whole industry forwards. Whether or not you think that makes nVidia evil or whatever shit people come out with is irrelevant, that sort of thing does tend to aggravate people.
You obviously ignore the fact that AMD's solution is based on the DisplayPort standard. It's easier for monitor vendors to implement because the Adaptive Sync specification has existed for a long time. It's also less cost-intensive and license-free. Nvidia just do their typical proprietary crap to milk their fangirls.
And btw, FreeSync works more smoothly. G-Sync has some drawbacks; for instance, it costs some performance when running games in 3D.
He's not ignoring that. He's correctly quoting AMD that firmware isn't enough; they need specific hardware in the display. He could be overestimating when he says 99% of monitor owners need to buy a new monitor, but neither you nor I know what percentage of people already have the right hardware.
Did you mean stereoscopic 3D? Anyway, any links proving the FreeSync implementation is better than G-Sync?
Yeah, I don't think anyone uses that 3D stuff except a few people.
The bottom line will be the price of the hardware, from both AMD and NVIDIA.
“Mobile displays have previously included variable refresh technologies in order to save power and battery life. AMD found that it could repurpose that technology to emulate the effects that NVIDIA G-Sync creates”
That’s backward.
nVidia found that they could use existing VESA stuff to make fanboy milking modules with a new name.
So in ~6 months AMD has shown something similar to what nVidia took 2 years to get at? Awesome! Any video of this running?
Here’s a link to the AMD Free-Sync demo. Maybe you should have included it instead of an nVidia vid in an AMD article?
https://www.youtube.com/watch?v=cK-aV4ryKdE
When I posted this, no video was available. But I'll add it now.
Can you verify this 40-60Hz range? ComputerBase says otherwise – they're talking 47-48Hz.
Thanks. Sorry for the tone in my post. It wasn’t necessary.
40-60fps is a weak range. Why can’t DPAS or G-Sync dynamically cover the entire range of playable frame rates? 30fps is generally considered playable for most FPS titles even though a minimum of 60 is best, but 20fps is often considered the minimum for RTS/MMO games. Given that even beast rigs with Quadfire/QuadSLI still dip into the 20s when gaming at 4K, it seems like a missed window.
Why not support syncing for ALL frame rates? Why not a complete solution? If a monitor supports 144Hz natively, then do 0-144Hz. There is noticeable screen tearing and stuttering at all frame rates.
Glad to see DPAS can be a firmware update to existing DP1.2 monitors. But it’s a shame to see AMD support it in only 5 GPUs. Variable VBLANK has been supported by every GPU since the 90s. LAME!
Relax, it is just an early hacked together demo. It is not a finished product.
The supported refresh rates have been named already: 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.
In addition to the listed GPUs, FreeSync is supported in Kabini/Kaveri APUs too.
I'm completely calm, just disappointed. Like you said, they claimed to support ranges of 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz, but that's not what they are showing in their demos.
I wonder if those ranges are resolution dependent? For example (rough bandwidth check below):
36-240Hz (720p/768p)
21-144Hz (1080p)
17-120Hz (1440p/1600p)
9-60Hz (4K)
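One rough way to sanity-check that resolution guess is to compare the raw pixel bandwidth each pairing would need against what DisplayPort 1.2 can carry (about 17.28 Gbps of payload over four HBR2 lanes after 8b/10b encoding). The pairings below simply mirror the speculation above and ignore blanking overhead, so treat the numbers as ballpark figures only:

```python
# Ballpark bandwidth check for the speculated resolution/refresh pairings.
# Blanking overhead is ignored, so real requirements are somewhat higher.

DP12_PAYLOAD_GBPS = 17.28  # 4 lanes of HBR2 after 8b/10b encoding
BITS_PER_PIXEL = 24        # standard 8-bit-per-channel RGB

pairings = [
    ("1280x720  @ 240 Hz", 1280, 720, 240),
    ("1920x1080 @ 144 Hz", 1920, 1080, 144),
    ("2560x1440 @ 120 Hz", 2560, 1440, 120),
    ("3840x2160 @  60 Hz", 3840, 2160, 60),
]

for name, w, h, hz in pairings:
    needed_gbps = w * h * hz * BITS_PER_PIXEL / 1e9
    print(f"{name}: needs ~{needed_gbps:5.2f} Gbps of {DP12_PAYLOAD_GBPS} Gbps available")
```

All four combinations fit within the link with room to spare, so if those ranges really are resolution-tied, it is more likely a panel/scaler limitation than a DisplayPort bandwidth one.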
The GPU reads the monitor to find out what range it can support. You can only slow the refresh rate down so much and it’s dependent on the monitor’s capabilities. This monitor was hacked to support Adaptive-Sync. There will be monitors that will be better and/or worse. That’s on the monitor though. The refresh rates AMD shows are what is/will be supported by their graphics cards.
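AMD hasn't detailed exactly how its driver queries the panel, but monitors already advertise a supported refresh window in the EDID "display range limits" descriptor, so a discovery step along these lines is plausible. A minimal sketch with invented bytes (the 40-60 Hz values simply mirror the demo's range):

```python
# Hedged sketch: read min/max vertical refresh from an 18-byte EDID
# "display range limits" descriptor (tag 0xFD). The bytes are invented
# for illustration; a real driver would read them from the monitor.

EXAMPLE_DESCRIPTOR = bytes([
    0x00, 0x00, 0x00, 0xFD, 0x00,  # display descriptor header, 0xFD = range limits
    40,                            # minimum vertical refresh, Hz
    60,                            # maximum vertical refresh, Hz
    30,                            # minimum horizontal rate, kHz
    90,                            # maximum horizontal rate, kHz
    30,                            # max pixel clock in units of 10 MHz (300 MHz)
    0x00, 0x0A, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20,  # timing info + padding
])

def vertical_refresh_range(descriptor: bytes):
    """Return (min_hz, max_hz) from a display range limits descriptor."""
    if len(descriptor) != 18 or descriptor[3] != 0xFD:
        raise ValueError("not a display range limits descriptor")
    return descriptor[5], descriptor[6]

low, high = vertical_refresh_range(EXAMPLE_DESCRIPTOR)
print(f"Panel advertises {low}-{high} Hz")  # -> "Panel advertises 40-60 Hz"
```

Whatever the exact mechanism turns out to be, the principle is what the comment above describes: the display reports the window it can handle, and the GPU keeps its refresh interval inside it.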
The very point of G-Sync (or FreeSync) was supposed to be improving smoothness at low frame rates – for example, making 40 FPS feel the same as 60 FPS or even 120 FPS (though NVIDIA G-Sync only works at 30 FPS and above). When NVIDIA first showed G-Sync, game devs were talking about no longer having to worry about targeting 60 or 30 FPS. If low FPS can feel as smooth as high FPS, devs can take the extra performance that was always reserved for hitting high frame rates and spend it on much better graphics quality.
Maybe on a console 30fps is playable, but on a desktop computer you start to notice stuttering easily at only 50fps – even at 60fps you can see it when you turn.
From what I heard about how FreeSync and G-Sync work, I suspect NVIDIA's solution will perform better under highly variable frame rates. Most games have a tendency to change their frame rates drastically over time, and given the way VESA Adaptive Sync adjusts the screen's refresh rate, I think FreeSync might have trouble keeping up with those changes.
If I'm right about FreeSync's problems with rapidly varying frame rates, I suspect G-Sync might survive on the market. It will become a sort of overpriced premium solution for enthusiasts, while FreeSync will be the value-oriented product for the average consumer. Overall I expect FreeSync will be better than current Vsync, but not as good as G-Sync.
Frankly, neither of those technologies makes me want to buy a new monitor just for them. From what I have seen of the G-Sync and FreeSync demos, while there is a visible improvement in smoothness, the effect is not immediately obvious or overwhelming enough to justify the cost of buying a new screen.
FWIU the time it takes to read the timing signal is measured in nanoseconds. Maybe someone with more technical understanding can verify that?
1. Competition is good! It makes everything better and cheaper!
2. I don't champion "brands"; I'm not stupid.
3. The first 21:9 monitor supporting either FreeSync or G-Sync is MINE! (I cannot possibly describe how much I want this.)
4. There are APUs that support it with their integrated graphics?!
Allah Fcuking Christ!! Budget gaming will be as smooth and artifact-free as the most expensive gaming, just with lower eye-candy? OMG!
That monitor looks awfully familiar,……
Nixeus VUE27D Monitor maybe,…..
http://www.anandtech.com/show/7585/nixeus-vue27d-monitor-review
~$430 MSRP,…..
2560×1440 IPS monitor
Take my money NOW!!!
I'm guessing that the Nixeus Vue 27″ is about the same, although the case is a little different and so is the port placement. Probably very similar to the Auria, Overlord, and other such clones. Chances are a lot of people already have monitors with similar internal hardware…
Unless AMD did something different to their demo monitor…
As a G-Sync user, I have to say: once you play with G-Sync, you never want to go back.
I've been using the modified ASUS VG248QE with NVIDIA G-SYNC for about 5 months now. I purchased the monitor with G-Sync already installed by Digital Storm.
I cannot play games without it now. Totally spoiled. Even down in the low 40 FPS range it is shockingly smooth.
The best part of G-Sync, though, is the ZERO tearing and minimal input lag.
VR is about gaming. If you're not a gamer and don't care about gaming, then don't bother posting about VR or G-Sync. It's not your business.
Yeah, all the people who buy $150-$3000 GPUs don't care about paying that much money to have washed-out colors on a TN panel.
Gamers don't care if their expensive GPUs aren't pushing proper colors through a low-end panel during gaming.
Silly non-gamers who think good color on a panel matters when you spend so much on a graphics setup.
And thus posts the first douchebag non-gamer =)
Yeah, what a fool. Doesn't he know us gamers love spending money on 8-bit video cards only to play on 6-bit panels? DUH!!!
Enjoy playing your games on a low-Hz IPS panel that is specifically made for graphics artists, with HIGH latency etc. Enjoy your HIGH input lag and blur when moving around in games LOL. Non-gamer douchebag!
Yeah, doesn't he know that gaming panels advertising a 1ms response time are actually 5-9ms, with an average of 7ms at 144Hz? At 120Hz the average goes up to 9ms, which is similar to an IPS direct-drive panel at 60Hz with an 8ms response time.
We gamerz like paying extra for 75% color gamut along with that Hz and ms marketing. Marketing costs extra and we gamerz are willing to pay for the placebo effect it gives us.
You non-gamers don't appreciate the monitor features that distort color accuracy and do little to improve response time like we true gamerz do.
…in all fairness, half the time a 'gamer' complaining about tearing or blur during fast action sequences is being psychosomatic.
There have been numerous instances where I spent hours (more like days actually, especially after building a new system) listening to various 'gurus', only to find that their sativa-induced rantings about this or that amounted to absolutely no real-world outcome.
VR is also useful for precise frame rate playback across the multitude of media formats.
That looks like a laptop screen. Is it the same screen and demo they did in Montreal?
Nixeus Vue 27″ IPS LED 2560×1440 DisplayPort Monitor
2560×1440 IPS screen with 100% sRGB
http://www.nixeus.com/?product=nixeus-vue-27d
No more washed-out colors from TN panels.
Wonderful!
Now give me a quality 3440×1440 34″ 10-bit IPS with FreeSync and you can have all my moneys.
Hopefully Dell delays their 34″ until the end of the year to implement this, but I won't hold my breath.
I'm more interested in actual products and their release dates than in prototypes.
well… the asus 1440p 144hz monitor has been “RELEASED” for 6 months now 🙂
Free the SYNC! Free Willy !
What is wrong with you?
DO NOT CHECK YOUR CELL PHONE IN THE MIDDLE OF AN INTERVIEW!
You are not that important. You are not that busy. If you are, then perhaps you should conduct your interview at another time. I really do not understand this behavior. Isn't it self-evident that such behavior is not only rude and annoying, but completely unprofessional?
…and NOBODY gives a **** about 3D Vision!
"3D" is dead, finally! Put a cork in it until next decade, when they dig this trash back up for the next round of suckers.