At SIGGRAPH, AMD's Richard Huddy gave The Tech Report the release windows for FreeSync, the company's adaptive refresh rate technology. Compatible monitors will begin sampling "as early as" September, and actual products are expected to ship to consumers in early 2015. Apparently, more than one display vendor is working on support, although names and vendor-specific release windows remain unannounced.
As for the cost of implementation, Richard Huddy believes that the added cost to the manufacturer should be no more than $10-20 USD. Of course, the final price to end users cannot be derived from this; it depends on how quickly the display vendor expects to sell the product, its profit margins, its willingness to push new technology, competition, and so forth.
If you want to take full advantage of FreeSync, you will need a compatible GPU (look for "gaming" support in AMD's official FreeSync compatibility list). All future AMD GPUs are expected to support the technology.
I still think "FreeSync" is in the sink, so to speak. Huddy said it was merely a BIOS change; now he is backing off to say it is monitors that 'can' support it. Wouldn't that hint at the need for extra hardware, as with G-Sync, and is that charge for hardware or a license? GJ Scott anyway. It seems the duel between Green and Red will continue; the last to speak will be the winner with the biggest lie, LOL!
If manufacturers are smart, they'll just make monitors that do both. For the enthusiast crowd this is targeting, monitors usually outlive the GPUs they connect to by several generations.
If I were a monitor manufacturer, I would have an aversion to implementing technology in my monitors that was restricted to one particular IHV's products. It might yet prove a masterstroke by AMD to go through a standards body with their tech.
Yeah, that would be true if AMD's implementation were a standard, but it's still proprietary code.
I feel dumb for asking, but if it’s called “Free”Sync does this mean that Nvidia GPUs can use it too?
Nope. NVIDIA can use adaptive refresh, found in DisplayPort 1.2a, but not "FreeSync". It surprised us a bit that they were not the same technology (and maybe AMD originally intended them to be). Unfortunately, we don't know for sure.
The FreeSync part is the AMD proprietary driver support needed to use Adaptive-Sync in a way that reduces tearing and stuttering, so talking about FreeSync being used on an NVIDIA card is as stupid as talking about running an NVIDIA card with Catalyst drivers.
What NVIDIA can do is write their own drivers for Adaptive-Sync that would do the same thing as G-Sync, only without all the additional hardware that comes with G-Sync. Of course, they are unlikely to do that, as G-Sync is more profitable: monitor manufacturers have to pay them a license for every monitor they make with G-Sync. This is why G-Sync adds some $200 to the cost of a monitor: hardware plus the NVIDIA license.
AMD's solution is to have monitor manufacturers integrate the DisplayPort 1.2a Adaptive-Sync standard free of license fees, while AMD provides FreeSync support in their Catalyst driver suite at no additional cost to either the end user or the monitor manufacturers.
I expect Intel will eventually follow AMD and produce drivers that let their integrated graphics use the DisplayPort 1.2a Adaptive-Sync standard in the same way as FreeSync, though Intel will name it something else.
So in the long run, I expect Adaptive-Sync-based reduction of stuttering and tearing will work on both Intel and AMD cards, but not on NVIDIA cards, as they will be pushing their more expensive, licensed solution.
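To make the mechanism concrete, here is a toy sketch of the difference between fixed-refresh v-sync and an adaptive refresh rate. It is purely illustrative: the refresh range, frame times, and function names are all made up, and this is not AMD's or NVIDIA's actual driver code.

```c
/*
 * Toy simulation of frame presentation timing. Hypothetical numbers only;
 * not any real driver or API.
 */
#include <stdio.h>

#define FIXED_REFRESH_MS 16.7   /* fixed 60 Hz panel */
#define ADAPTIVE_MIN_MS   6.9   /* ~144 Hz top of a hypothetical Adaptive-Sync range */
#define ADAPTIVE_MAX_MS  33.3   /* ~30 Hz bottom of that range */

/* With v-sync on a fixed-refresh panel, a finished frame waits for the next
 * refresh boundary, so its display time rounds up to a multiple of 16.7 ms. */
static double fixed_vsync_display_time(double render_ms)
{
    double t = FIXED_REFRESH_MS;
    while (t < render_ms)
        t += FIXED_REFRESH_MS;
    return t;
}

/* With adaptive sync, the panel refreshes when the frame is ready, clamped to
 * its supported range (a real panel re-sends the last frame when the source
 * drops below its minimum rate; that detail is omitted here). */
static double adaptive_display_time(double render_ms)
{
    if (render_ms < ADAPTIVE_MIN_MS) return ADAPTIVE_MIN_MS;
    if (render_ms > ADAPTIVE_MAX_MS) return ADAPTIVE_MAX_MS;
    return render_ms;
}

int main(void)
{
    /* Hypothetical per-frame render times in milliseconds. */
    double frames[] = { 10.0, 14.0, 22.0, 18.0, 30.0, 12.0 };
    int n = (int)(sizeof frames / sizeof frames[0]);

    printf("%-12s %-18s %-18s\n", "render (ms)", "fixed 60Hz vsync", "adaptive sync");
    for (int i = 0; i < n; i++)
        printf("%-12.1f %-18.1f %-18.1f\n",
               frames[i],
               fixed_vsync_display_time(frames[i]),
               adaptive_display_time(frames[i]));
    return 0;
}
```

The rounding up to the next fixed refresh boundary is what shows up as stutter (or, with v-sync off, as tearing), and removing it is the point of both FreeSync and G-Sync.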
“This is why G-Sync adds some $200 to the cost of a monitor: hardware plus the NVIDIA license.”
No, that's because the current implementation uses an FPGA and a big bank of RAM (for bandwidth rather than capacity) instead of an ASIC with an optimised interface. Fast FPGAs are expensive; dedicated ASICs are cheap.
No surprises here: in adopting AMD's proposed additions to the DisplayPort spec, they renamed FreeSync to Adaptive-Sync. Naturally, anyone who wants to use it will need drivers/software and hardware compatible with Adaptive-Sync to offer FreeSync-like variable refresh.
https://www.youtube.com/watch?v=ANPX0DQxalA
FAIL AMD
As with most things AMD, the hype around their version of variable refresh will fall short of G-Sync's superior gaming experience.
I guess your brain just FAILED!
Bring up some real arguments or shut up.
Freakin Fanboys…
PC Perspective:
“Oh, and of course, you have this as the first true G-Sync capable monitor on the market, implementing NVIDIA’s custom variable refresh rate technology to offer up the best gaming experience you are going to find anywhere; it’s really not even close. Once you have seen and used G-Sync for some gaming, it’s going to be basically impossible to go back.”
Overclockers Club:
“By spending several hours in front of the ASUS ROG Swift PG278Q, what I did discover was that I found myself playing through the games much longer than I had before just because of how smooth the gameplay was. In addition, I found that, while playable with v-sync off, G-Sync fixed the screen tearing and lag issues you get when v-sync is enabled. After moving back to my standard gear, which is a 30-inch 60Hz IPS monitor, I was instantly yearning to go back to the G-Sync enabled ASUS ROG Swift PG278Q. The difference is truly mind bending and made the gaming experience all that more enjoyable.”
ASUS:
“It can’t undo the fundamental effect of very low frame rates and it doesn’t do anything below 30FPS, but it does make it far more tolerable and the transition from high to low smooth. IMO it’s not so suitable for very fast action games where you’re better off using the ULMB option with normal or extreme pixel response setting instead.”
Why would you want to game under 30fps? Change your settings.
I actually played through Halo PC at about 14-19 FPS with everything on low before I updated my GPU (GeForce 2 MX -> GeForce FX 5700 Ultra). It was surprisingly… okay.
HotHardware:
“Disabling V-Sync may eliminate lag, but tearing is evident. And enabling V-Sync may eliminate the tearing, but the lag can be annoying. With G-Sync, though, the on-screen images don’t suffer from visual artifacts and the tearing is gone too.
We wish there was an easy way to visually convey how G-Sync affects on-screen animation, but there isn’t. We don’t have a means to capture DisplayPort feeds and shooting video of the screen and hosting it on-line doesn’t capture the full effect either. In lieu of an easy visual method to show how effective G-SYNC is, you’ll just have to take our word for it. G-Sync is great.”
Good. Good. WONDERFUL.
Competition is king!
Work out all the bullsh1t, make this a standard, and work on improvements while we still sit on LCDs.
So when OLEDs come, we can have 0-250 Hz smooth transitions, with the monitor holding an image if the FPS drops below 1.
I don't know how this shakes out, but I do know who's going to be making profits from new monitor technology, and it's not AMD. NVIDIA is making the profits from G-Sync, and monitor manufacturers are making the profits from FreeSync. My guess is that both flavors of monitor will be similarly priced. If you think monitor manufacturers are going to give away new functionality for free, you're crazy.
Good work, AMD: introducing new tech and not making any money from it. Hell of a way to run a business.
It is a bit unfortunate that AMD will not see a penny from this; they have to make money. On the other hand, what worries me is this: all AMD Radeon™ graphics cards in the HD 7000, HD 8000, R7, or R9 series will support Project FreeSync for video playback and power-saving purposes, but only the R9 295X2, R9 290X, R9 290, R7 260X, and R7 260 GPUs feature the updated display controllers that support dynamic refresh rates during gaming.
Basically, what really counts is the 260(X) and 290(X) if you want dynamic refresh rates during gaming; the rest of the cards are out of the game. APUs are fine with both features. It sounds like a bit of a lack of support.
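As a purely hypothetical illustration of that split, here is a small sketch: the card list comes from AMD's statement quoted above, while the function, its name, and the surrounding program are invented for illustration and are not any real AMD API.

```c
#include <stdio.h>
#include <string.h>

/* GPUs AMD has said get dynamic refresh during gaming (per the statement above);
 * the rest of the HD 7000/8000, R7, and R9 lines are limited to video playback
 * and power saving. This lookup is illustrative only. */
static const char *gaming_dynamic_refresh[] = {
    "R9 295X2", "R9 290X", "R9 290", "R7 260X", "R7 260"
};

static int supports_gaming_freesync(const char *gpu)
{
    for (size_t i = 0; i < sizeof gaming_dynamic_refresh / sizeof gaming_dynamic_refresh[0]; i++)
        if (strcmp(gpu, gaming_dynamic_refresh[i]) == 0)
            return 1;
    return 0;
}

int main(void)
{
    const char *cards[] = { "R9 290", "R9 280X", "HD 7970" };
    for (size_t i = 0; i < sizeof cards / sizeof cards[0]; i++)
        printf("%-8s -> %s\n", cards[i],
               supports_gaming_freesync(cards[i])
                   ? "dynamic refresh in games"
                   : "video playback / power saving only");
    return 0;
}
```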
Gimme a 1440p 21:9 FreeSync IPS screen.
And I'll power it with JUST a Kaveri APU, which I'll VESA-mount to the monitor's back…
And I'll SMOOTHLY play AND ACTUALLY ENJOY all the Source games and pretty much everything except some of the top-10 most demanding games.
That's some pretty cheap and HIGHLY enjoyable gaming on the horizon… if you don't NEED eye candy in the latest AAA single-player games.
The point of most AAA games is how good they look; playing on a GPU that can probably only push 30-40 FPS at low settings at 1440p is only acceptable to a point. I know a lot of people agree with me on that.
If the point of most AAA games is how they look, why play them on a TN panel in the first place?
'Cause IPS is piss slow?
I have a TN panel with a 2 ms response time. A good IPS panel is 5 ms, which is average for a TN, so not slow at all. Maybe you've reviewed the wrong IPS panels?
A whatever-SYNC 21:9 IPS is going to beat the piss out of your face and all your TN eye-burning trash.
Hmm, is anyone else having problems with the images on this blog loading? I'm trying to determine if it's a problem on my end or if it's the blog. Any feedback would be greatly appreciated.
Yeah, people who have, say, a 280(X) or 7970 kinda got screwed in that deal. As they talked about on the podcast, Tonga, which would be a 285(X), kinda puts people in the spot of having to buy that card, which should support it in gaming, just to get a new GPU with about the same performance as they had before. It does sound like AMD did this in reaction to NVIDIA's idea. NVIDIA looks like they have been working on it for a while, since most of their cards support it, while AMD just kinda ported it over from a laptop power-saving idea to gaming.