Tomorrow at 12pm EDT / 9am PDT, NVIDIA's Tom Petersen will be stopping by the PC Perspective office to discuss some topics of interest. There has been no lack of topics floating around the world of graphics cards, displays, refresh rates, and tablets recently, and I expect the show tomorrow to be incredibly interesting and educational.
We'll be doing live demonstrations of G-Sync Surround (3 panels!) with the ASUS ROG Swift PG278Q display (our review here) and also showing off the SHIELD Tablet (we have a review of that too) with some multiplayer action. If you thought the experience with a single G-Sync monitor was impressive, you will want to hear what a set of three of them can be like.
NVIDIA Live Stream with Tom Petersen
9am PT / 12pm ET – August 22nd
PC Perspective Live! Page
The topic list will include (but is not limited to):
- ASUS PG278Q G-Sync monitor
- G-Sync availability and pricing
- G-Sync Surround setup, use and requirements
- Technical issues surrounding G-Sync: latency, buffers, etc.
- Comparisons of G-Sync to Adaptive Sync
- SHIELD Tablet game play
- Altoids?
But we want your questions! Do you have burning issues that you think need to be addressed by Tom and the NVIDIA team about G-Sync, FreeSync, GameWorks, Tegra, tablets, GPUs and more? Nothing is off limits here, though obviously Tom may be cagey on future announcements. Please use the comments section on this news post below (registration not required) to ask your questions, and we can organize them before the event tomorrow. We MIGHT even be able to come up with a couple of prizes to give away for live viewers as well…
See you tomorrow!!
How is NVidia working with companies like Blizzard, whose games, like Diablo III, have issues with G-Sync?
With G-Sync, could a static desktop (like the Windows desktop) forgo refreshes until a new call to the GPU is made, hopefully reducing eyestrain and also allowing the GPU to use less power?
Why was the decision made to avoid high-quality panels and use TN instead? Certainly 5 to 6 ms on a 120 Hz panel is arguably better for a first showing.
What is the compatibility issue that prevents Ultra Low Motion Blur from working with G-Sync? It seems odd that one will not work with the other.
Has NVidia taken the necessary steps to ensure our thousand-dollar cards will work with FreeSync?
Tom Petersen, hearing from and seeing you again will be a pleasure; thank you.
Thank you, PC Perspective crew; Ryan Shrout, your work is excellent. Do not let an answer from Mr. Petersen that only creates another question go unchallenged.
Blessings to you all and to all those who love.
Sincerely, Joseph C. Carbone III; 21 August 2014
As someone who plays Diablo III, that game already has its own issues with chugging to a crawl in fights as it is.
Avoid HQ panels? Do you mean IPS? If so, how many 120 Hz IPS panels are there even at this time? IPS, even though it has better color than TN, is slower on the refresh-rate side, and 60 Hz/fps in most games is easy to hit, so there's no real reason to use one.
Nvidia cards, I think they said, work with the VESA Adaptive-Sync standard, BUT FreeSync is proprietary AMD code, so you can't expect them to support it out of the box, if they ever do.
You really need to get out more.
The ASUS ROG Swift is a 12 ms panel at 60 Hz. IPS panels are natively faster than that at 4-8 ms. At 144 Hz the ASUS ROG Swift is a 7 ms panel. Not until it's overdriven, and color is degraded, can it get below 4 ms.
Take any of the current Korean-brand IPS panels that are being overclocked to 120 Hz. They will get you a faster response time than the ASUS ROG Swift at 144 Hz @ 7 ms, without the need for any color degradation.
You can buy two of these and overclock them for the price of one ASUS ROG Swift.
http://www.amazon.com/dp/B00BUI44US/?tag=ufoghost-20
A $300 premium on a non-IPS panel just seems silly. The advantage goes either to those on lower-end systems who don't spend much (let alone $750 USD) on a monitor, or to high-end systems with multiple monitors, and even then G-Sync isn't functioning properly for them, needing one panel per GPU.
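The refresh-rate vs. response-time trade-off being argued here is simple arithmetic. A quick sketch, using the response-time figures claimed in this thread (not official manufacturer specs):

```python
# Compare frame interval against quoted panel response time.
# Response figures below are the ones claimed in this thread, not specs.

def frame_interval_ms(refresh_hz: float) -> float:
    """Time between refreshes at a given refresh rate, in milliseconds."""
    return 1000.0 / refresh_hz

panels = [
    ("ROG Swift @ 60 Hz (12 ms claimed)", 60, 12.0),
    ("ROG Swift @ 144 Hz (7 ms claimed)", 144, 7.0),
    ("Overclocked IPS @ 120 Hz (6 ms assumed)", 120, 6.0),
]

for name, hz, response_ms in panels:
    interval = frame_interval_ms(hz)
    settled = response_ms <= interval
    print(f"{name}: {interval:.2f} ms/frame, "
          f"{'settles' if settled else 'still transitioning'} before next refresh")
```

At 144 Hz the frame interval is about 6.94 ms, so a panel with a 7 ms transition is still settling when the next refresh arrives — which is the ghosting argument the commenter is making.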
Just as CPUs and GPUs can be cherry-picked, there are IPS/PLS panels that would do the job at 120 Hz, and cleanly. Several examples have been on the market for at least eight months.
As I understand AMD’s direction, the company intends FreeSync to work with a standard already implemented by LCD controllers and connectors: the Embedded DisplayPort standard and a compatible controller used by notebooks, other portable devices, and some all-in-one PCs, with the connector becoming available on the PC via DisplayPort 1.2a.
What is needed from companies like AMD and Nvidia is drivers to work with the existing technology, and some monitors are believed to already have controllers that recognize VESA’s Adaptive-Sync, in the same way the Toshiba notebook AMD used to demonstrate FreeSync had the right controller. The effort needed from Nvidia to make an LCD recognize the technology AMD demonstrated is minor compared to G-Sync: drivers! Which they have likely already written for their mobile devices.
I do not trust Nvidia’s professed reasonableness to deliver anything that does not first require you and me to put cash in their big pockets. If they avoided what has been available at little extra cost, then what they are selling us had better be superior. And if it proves otherwise (Nvidia has a reputation for less-than-admirable marketing strategies), we will want our cards to work without Nvidia’s premium monitors and will need Adaptive-Sync drivers from the company.
—
Thank you, PC Perspective crew; Ryan Shrout, your work is excellent. Do not let an answer from Mr. Petersen that only creates another question go unnoticed.
Sincerely, Joseph C. Carbone III; 21 August 2014
AMD’s FreeSync isn’t what VESA ratified as the DP 1.2a standard. AMD said FreeSync was going to be a standard, which was nothing more than a half-truth; it is based on the standard but is proprietary code in the end.
About Ultra Low Motion Blur: since that technique inserts a black frame (switches off the LED backlight) between the visible frames the GPU produces, it would lead to a screen that visibly flickers when the framerate varies. Have you seen, for example, how Sony LCD TVs work in Impulse mode? That mode also inserts a black frame between its native refresh intervals, but because the normal refresh rate is only 60 Hz, the screen flickers quite a lot and also gets somewhat darker when the mode is activated. I suspect this is one of the reasons for it not working together with G-Sync, plus the fact that it would most likely add extra latency to synchronize the panel's LED backlight to G-Sync's variable refresh rate.
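A back-of-envelope way to see the clash this commenter describes: with backlight strobing, the perceived flicker frequency tracks the instantaneous frame rate. The ~60 Hz visibility threshold below is a rough illustrative assumption, not an NVIDIA figure:

```python
# With ULMB-style strobing, the backlight pulses once per displayed frame,
# so perceived flicker frequency equals the frame rate. As a rough rule of
# thumb (an assumption for illustration, not an official threshold),
# flicker below ~60 Hz is visible to most people.

FLICKER_VISIBLE_BELOW_HZ = 60

def strobe_report(fps: float) -> str:
    """Describe the strobe interval and flicker visibility at a frame rate."""
    period_ms = 1000.0 / fps
    visible = fps < FLICKER_VISIBLE_BELOW_HZ
    return (f"{fps:5.1f} fps -> strobe every {period_ms:6.2f} ms "
            f"({'visible flicker' if visible else 'flicker fused'})")

# A variable-refresh game might swing through all of these in one session:
for fps in (144, 100, 75, 60, 45, 30):
    print(strobe_report(fps))
```

With a fixed 144 Hz refresh the strobe is steady and far above the threshold; under a variable refresh rate, the strobe frequency (and with it mean brightness) would swing through the visible range as the framerate dips.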
Hola,
I just built my PC. I used as a reference a PC you recommended for playing Titanfall at 1080p, but I changed the graphics card to an ASUS GTX 780. I really want a G-Sync monitor, but I want a 1080p G-Sync monitor. So my question is: when can I buy it?
And does Nvidia have an official online retailer in Mexico?
The ASUS monitor is too expensive; I think a 24-inch 1080p G-Sync monitor would be nice.
Gracias
You can buy the ASUS VG248QE 144 Hz monitor and the G-Sync DIY kit; that is a 1080p monitor, or there are a few sites that sell it with the kit pre-installed.
Okay, one of the most important questions (did I mention importance?): what the hell is it with the PG278Q being another 16:9 at 2560 x 1440? Did Nvidia have any say? Please do not say that was ASUS.
For Children: For Adults:
1920 x 1080 | 1920 x 1200
2560 x 1440 | 2560 x 1600
3840 x 2160 | 3840 x 2400
The SHIELD Tablet got it right with its 1920 x 1200 display.
That's kind of a matter of taste. At work I prefer 16:10, but when gaming, 16:9 or wider.
Why doesn't Nvidia implement a built-in auto-overclocker in their drivers, like some other graphics card brands do?
While manual overclocking is simple enough, some users are unable to get a stable overclock.
With VR supposedly about to have a big impact on gaming, are we going to see GPU designs optimized for driving two 1920×2160 displays?
Or perhaps efforts to improve drivers to make 2-way SLI setups better at rendering pairs of slightly spatially offset frames?
I have a few questions for Tom:
1) What is the technical reason for ASUS to have Lightboost/ULMB disabled while running in GSync mode on the PG278Q? Is there an electrical issue with doing this, or is the GSync scaler not capable of coordinating both at the same time? If electrical, what's required to make ULMB work at the full 144 Hz refresh rate?
2) ULMB can be supported on Radeon GPUs, but it’s intentionally blocked for PG278Q owners. Why go through the trouble of blocking this out? Why not leave ULMB open for Radeon users, which would allow them to consider buying this monitor and later upgrading to a Geforce GPU that would support GSync? Most people change GPUs to keep up with the times, but they don’t swap out their monitors as much. By blocking ULMB, you’re locking out a fairly big part of the market that’s currently on the fence on whether they want variable refresh rates now or in 2015. Giving them ULMB now would probably make them your customers later when upgrade time comes around.
3) What tricks can the GSync module do that aren’t related to performance in games? Can a faux panel self-refresh ability be patched in?
4) Once Displayport 1.3 monitors come to market, will Nvidia continue to invest in GSync and custom scaler hardware, or will it eventually be deprecated with the rise of DP1.3?
5) Monitors with GSync don’t have variable refresh rates below 30Hz for now. What is Nvidia doing to make the fabled 23.976Hz mode work for watching videos?
5) Monitors with GSync don’t have variable refresh rates below 30Hz for now. What is Nvidia doing to make the fabled 23.976Hz mode work for watching videos?
I want to know this as well. It's kind of a biggie for those of us who use the computer to watch Blu-rays. Having to manually change the refresh rate every time, or deactivate G-Sync to watch a movie, is not that nice.
What is easier for a GTX Titan to render:
100 duck sized horses with their hair modelled with tools from Nvidia GameWorks or
1 horse sized duck with its feathers modelled with TressFX?
Just what kind of game are you making there?
I have a few questions regarding G-Sync that I haven’t seen answered anywhere yet.
1. G-Sync so far has been marketed at gaming, but another thing I think we all know is that it could fix the video judder from TV and movies. What I mean is that TV and movies don't run at 60 fps; they run at different rates (movies 24 fps, PAL 25 fps, 48 fps, etc.). Is Nvidia working on a solution for this, or could some sort of developer tap into G-Sync to make a solution?
2. So far all G-Sync monitors come with ULMB; is that going to be standard going forward for all G-Sync monitors?
3. PhysX 2 vs PhysX 3: is there any way to benchmark between the two versions? Also, why do some developers still use PhysX 2 (Gearbox, Rocksteady)?
4. G-Sync IPS when? I checked the Overlord Computer forums. They said they were on the list, but that was six months ago.
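On question 1, the judder comes from fitting 24 fps (really 23.976 fps) film into a fixed 60 Hz refresh. A quick sketch of the classic 3:2 pulldown arithmetic (illustration only, not an NVIDIA implementation):

```python
from fractions import Fraction

# 24 fps into 60 Hz: each film frame must cover 60/24 = 2.5 refreshes.
# That isn't an integer, so frames alternate between being held for
# 3 refreshes and 2 refreshes ("3:2 pulldown"), giving uneven motion.
refresh_hz = 60
film_fps = 24
ratio = Fraction(refresh_hz, film_fps)  # 5/2 refreshes per film frame
print(f"{ratio} refreshes per film frame -> hold pattern 3,2,3,2,...")

# The alternating hold times are what the eye perceives as judder:
for hold in (3, 2, 3, 2):
    print(f"frame held {hold} refreshes = {hold * 1000 / refresh_hz:.1f} ms")

# A variable-refresh display could instead hold every frame for exactly
# one film-frame interval (NTSC film rate is 24000/1001 ≈ 23.976 fps):
print(f"variable refresh: {1000 / (24000 / 1001):.2f} ms per frame, every frame")
```

With variable refresh, every frame gets the same ~41.71 ms hold and the 50 ms / 33 ms alternation disappears, which is exactly why commenters keep asking for a sub-30 Hz G-Sync mode for video.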
In order to get 3 g-sync monitors in surround, does it require 3 video cards? Or are there any video cards coming out with multiple DisplayPort ports?
Can I have a G-sync monitor in the middle, and then have 2 non-G-Sync monitors on the side? I assume this will create issues with tearing on the side monitors.
In the Geforce forums, NVidia reps have previously said it takes 3 cards (because of display port needs), but this would be good to confirm, same with your ‘cards with multiple display port’ question.
The way that I understand it, running Surround the way you describe isn’t possible with GSync *active*. Surround treats all the monitors as one single monitor, so Surround is supported to whatever degree the monitors are exactly the same. So best case scenario – if all three don’t support GSync, then that feature simply isn’t available. This is similar to my setup, where I run two 60hz and one 144hz monitor. Surround is only available to me at 60hz (which is fine!).
But my earlier question still stands, given the fickle nature of Surround. Is surround in this manner possible AT ALL? Meaning, if I have a GSync monitor in the middle of two 60hz monitors, would 60hz normal Surround be available? While I think it should be, no one I’ve talked to has any idea – and I don’t want to spend hundreds without knowing for sure.
Question.
Is there ever a chance that G-Sync will work with AMD cards? Could this be implemented with a driver, or is there some proprietary hardware on Nvidia cards that would prevent this?
AMD could maybe add it if they licensed the tech from Nvidia, but it's AMD, and they are too cheap to ever do that.
Not really a question, more of a message to pass along.
Hey Tom, please ask the Shield team to consider building a 64GB SKU for future products.
If it were possible to get a Shield Tablet that was wifi only and 64GB with a cost around $350-400, I’d have 2 on pre-order already.
When will Nvidia come up with a fix for the SHIELD Tablet's Wi-Fi problems and the case cracking at the edges?
Will they replace my tablet?
1) With the advancements you are making with G-Sync and Project Denver, have you been laying the groundwork to enter the laptop/tablet market to offer a competitive (hopefully lower-cost) alternative to AMD/Intel?
2) GPUs can't share memory because of latency over PCIe, but with the improvements you made with Kepler, such as GPUDirect, couldn't you in the future offer a dual-GPU system where all the memory on a single card is available?
Just to expand on question 2: couldn't you in the future use the technology in GK110 so a dual-GPU system can be seen as a single GPU through a virtualized instance? That way you would increase the number of physical GPUs that can be SLIed from 4 to 8.
At what point will DX12 be an issue for cards further back in the lineup, cards like the 560, 660, and such? With some elements not working so well under DX 11.2, what does that mean for those cards with DX12?
How hard was the decision to market the SHIELD Tablet as a gaming device? The Note 7 is my favorite tablet that I have ever owned, and I don't use it for gaming at all. It seems that the SHIELD Tablet would be a great all-day tablet for everything except gaming, but even as a gamer I don't find any of the gaming features compelling enough as a selling point in that form factor.
I would actually be much more interested in buying a Tegra K1-powered SHIELD in the form factor of the original SHIELD, rather than a tablet that just has a separate controller. Or at least some sort of tablet that can dock to a controller; having two separate pieces of hardware that don't connect just doesn't make any sense to me. If I have to set up at a desk/table to play a game, what is the point?
When is the Nvidia SHIELD Tablet LTE coming? I've ordered it.
When will Nvidia EVER start offering true 10-bit color on the HDMI and DisplayPort outputs of GTX and Titan cards for occasional content creation and gaming?
I do not want to buy a Quadro; I'm not a design company…
Some quick questions, thanks again to PCPer and NVIDIA for this;
1) When can we expect NVIDIA SHIELD PC Streaming to support multiple controllers, say when using a SHIELD Tablet in “console mode” playing PC Games?
2) Any plans to put the Denver version of Tegra K1 into a NVIDIA SHIELD Tablet?
3) Does NVIDIA have more OEMs planning to produce G-Sync monitors than are currently announced? Maybe Dell?
I for one would be very excited for a 24Inch+ Dell IPS Ultrasharp with G-Sync built in.
4) Any plans for NVIDIA to license its Tegra/GeForce line to other ARM SoC developers? E.g., an Apple ARM SoC with a GeForce GPU.
5) What is NVIDIA's current opinion on the state of multi-GPU systems with regard to the problem of micro-stutter?
Why are we so behind on eGPU technology??!!!!??
I’d like to run powerful software out of portable laptops that have great battery life.
Hello Tom,
Thanks for all of your excellent work on behalf of gamers.
1. Are you aware of any OLED based monitors or TVs that are in development which will also include Gsync functionality?
2. Are you aware of any 4K IPS monitors that are in development that will also include Gsync Functionality?
3. Will the next gen cards (880) have DP 1.3 or HDMI 2.0 connectivity?
4. Is Nvidia working directly with Oculus to enable Gsync support in the Consumer version of the Rift?
5. Rumors state that the 20nm process at TSMC has run into issues. When can we expect 20nm Maxwell parts?
6. What is the approximate performance delta between a 28nm and a 20nm Maxwell part?