UPDATE: If you missed the live stream yesterday, I have good news: the interview and all the information/demos provided are available to you on demand right here. Enjoy!
Last week NVIDIA launched GM204, otherwise known as Maxwell and now branded as the GeForce GTX 980 and GTX 970 graphics cards. You should, of course, have already read the PC Perspective review of these two GPUs, but undoubtedly there are going to be questions and thoughts circulating through the industry.
To help the community get a better grasp and to offer them an opportunity to ask some questions, NVIDIA's Tom Petersen is stopping by our offices on Thursday afternoon, where he will run through some demonstrations and take questions from the live streaming audience.
Be sure to stop back at PC Perspective on Thursday, September 25th at 4pm ET / 1pm PT to discuss the new Maxwell GPU, the GTX 980 and GTX 970, new features like Dynamic Super Resolution, MFAA, VXGI and more! You'll find it all on our PC Perspective Live! page on Monday but you can sign up for our "live stream mailing list" as well to get notified in advance!
NVIDIA Maxwell Live Stream
1pm PT / 4pm ET – September 25th
PC Perspective Live! Page
We also want your questions!! The easiest way to get them answered is to leave them for us here in the comments of this post. That will give us time to filter through the questions and get the answers you need from Tom. We'll take questions via the live chat and via Twitter (follow me @ryanshrout) during the event, but often there is a lot of noise to deal with.
So be sure to join us on Thursday afternoon!
UPDATE: We have confirmed at least a handful of prizes for those of you that tune into the live stream today. We'll give away an NVIDIA SHIELD as well as several of the brand new SLI LED bridges that were announced for sale this week!
Seems like the 25th will be an interesting day. Hopefully AMD doesn’t do something unexpected instead of its usual PR BS; otherwise it would easily overshadow most of this interview.
I would really like a proper 28nm AMD CPU that I could use instead of the usual Intel i5 for gaming without losing FPS… and that could fit inside a mini-ITX case.
THAT BE REALLY SWELL!
… but stories like this are for children, I know that.
I am waiting for the next AMD flagship late this year/early next year, which will demolish the green team. I don’t care if it’s noisier than 2 vacuum cleaners at full load and requires more power than all of Ryan’s vibes put together; I am going to buy team red and put team green to shame.
Can you please ask him what DisplayPort version is on the 980 and 970? And can you ask him when higher-spec cards containing the GK210 are going to be released? And lastly, can you ask him about FreeSync working on Nvidia cards in terms of required hardware and software? And remind him that us poor enthusiasts are the reason Nvidia is making money.
It’s already been confirmed that the 980 and 970 have DP 1.2 and HDMI 2.0, NOT DP 1.3. You’ll have to wait for Big Maxwell (980 Ti/Titan 2) or probably AMD’s 3xx series.
There is a report that it could support DP 1.2a via a software update, but that info was leaked from a closed-door meeting, so I can’t say how true it is or whether it was taken out of context.
Hi guys, I’ve got a few questions for you:
1.) I’ve heard that Nvidia’s development of the Tegra K1 revealed information that was ultimately used to make the Maxwell architecture what it is today. Could you please go into some detail about this?
2.) I’d like to know how much of an improvement to the efficiency of the GPU there was when moving to a power-of-2 system for the SMMs.
3.) Is Nvidia looking into other forms of rendering as an alternative to AFR in SLI, or is it still the best method at this time for a multi-GPU configuration?
4.) Are there any particular improvements made to SLI with Maxwell?
5.) How difficult is it to make a 392 mm² chip?
6.) Are there other cost effective improvements Nvidia could make to the architecture if a smaller process is delayed again?
7.) Are there any specific improvements that this architecture has over previous ones for your other technologies (PhysX, OptiX, the Works family under VisualFX)?
I know you probably cannot answer them all (for various reasons) but thanks for trying.
Lastly, I’d like to say thank you to Tom, Ryan with PCper, and Nvidia for taking the time to organize this event. Happy gaming guys 😀
I’ll just add on to that.
8.) When will Nvidia join the modern age and use the PCIe bus for SLI instead of the expensive bridge connector?
*comment deleted by user*
I realized I shouldn’t try to answer people’s questions when I’m not the expert. =P
I think the bridge is better; otherwise you risk saturating the PCIe bus and crippling the speed of the cards.
Generally the bus is fast enough to run the cards at full speed, but as cards get faster, that might end up not being the case with this extra data.
Hey Ryan can you ask Tom if he thinks G-Sync + ULMB will work together in the near future.
Second question: what does he think of the rumor of Nvidia supporting Adaptive-Sync/FreeSync?
Last thing: can Nvidia bring back nView for GeForce graphics cards? It used to be available on WinXP, but now it is Quadro-only on Win7.
Sorry, one last question. Can you ask Tom what he thinks of AMD’s marketing tactics? Things like Richard Huddy, AMD_ROY, the AMD Twitter teams, etc., and stuff like "FREE is better than G" or invading Game24.
The rumor didn’t say FreeSync; it said DP 1.2a. The 1.2a spec and FreeSync are not the same. FreeSync uses the 1.2a protocol to work, but the FreeSync code itself is proprietary AMD software, the same kind of proprietary stuff people love to give Nvidia crap for having a ton of.
Could you please ask Tom if they could work with Vizio to get 1080p/120 and 4K60 (4:4:4?) up to snuff on the P series over HDMI 2.0 (970/980)? 4:2:0 4K60 over 1.4 (older cards) would be nice as well (like they did with Sony etc.).
It seems people (everyone I’ve seen) are having some problems, which is unfortunate given it’s a major selling point of both companies’ (particularly Nvidia’s) products.
I totally understand this is a new card series and also a new TV series, but if they could work together to get 1080p/120 (perhaps 2560×1440/120?) and 4K60 working in tip-top shape, I think a boatload of people would be very happy.
Also, ask him about fixing HDMI RGB (full/limited) 0-255 support within the control panel… thanks.
Also, where are the G-Sync-enabled projectors we were promised?
Don’t think any projectors were ever promised.
Full range RGB is supported in the control panel. All you have to do is use a resolution that’s under “PC” and not one that’s under “Ultra HD, HD, SD”…
As always, looking forward to the Nvidia and EVGA live streams.
My first question got answered by Ryan as to the 980 being the Full GM204 chip.
My 2nd is: why is the HDMI port where it is now? What’s the logic?
3rd. DSR is a great feature, but will it remain exclusive to GM204/GM200? Will it be available on Kepler? I ask because the 780/Ti/Titans are perfectly capable of this feature, and "downsampling" has been accomplished for a while now with third-party software, although not always stably.
Regarding the 3rd question, Tom Petersen already said DSR will be available for Kepler later. Not sure about Fermi though.
BTW, you can downsample by adding custom resolutions to the Nvidia control panel; no need for third-party tools. The downside is the cheap bilinear-type scaling, but it works in all games that support whatever resolution you add. DSR should provide a better scaling algorithm and ease the process of adding the resolution, which brings me to my own question:
Will I be able to add more than a single DSR setup at a time, or at least one per application profile? Because I hear DSR is only available in the global profile, which means we must constantly go to the control panel and edit the DSR multiplier/smoothness depending on the game.
This isn’t a problem with custom resolutions, as I can add a whole bunch of them and select the desired resolution in game.
Fermi is what, 3+ years old now? So adding features to it is pretty unlikely to happen.
“btw, you can downsample by adding custom resolutions to the Nvidia control panel, no need for 3rd party tools. the downside is the cheap bilinear type scaling, but it works in all games that support whatever resolution you add. DSR should provide better scaling algorithm and ease the process of adding the resolution.”
@applejack the problem that I’ve seen is that, for example, a 1080p 144Hz monitor would revert back to 60Hz during downsampling, thus losing the benefits of a faster refresh rate.
That depends. I’m using an ASUS VG248QE w/ G-Sync and I can achieve the following:
2400×1350 @ 120Hz
2560×1440 @ 100Hz
2880×1620 @ 85Hz
3840×2160 @ 60Hz
But yeah, it could be another advantage for DSR if it keeps the 144Hz in all cases… not that I’ll really need the higher refresh rate at demanding resolutions anyway, especially with G-Sync enabled 🙂
Just an update: I read we can tick all multipliers, so all added resolutions are available in games. Thank you Nvidia 🙂
When will VR features (like VR SLI) be available and will they be limited to Maxwell GPUs?
What about the future of 3D Vision?
Support for newer games has basically stopped, which makes me a sad panda, because I just bought a PG278Q and 3D at 1440p is awesome when it actually works.
I would also like to know when Nvidia will add the ability to select color range (Limited/Full Range RGB) in games and 3D applications, like AMD did 6 years ago (!!).
Currently all NV GPUs send out ONLY limited color range at 1080p when using HDMI and DisplayPort. This is ridiculous; all modern TVs and monitors support Full Range RGB! You can’t select Full Range RGB at all!
Please note there is a setting in the Nvidia control panel that allows you to select color range, but it works only for video files!
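To put actual numbers on the limited vs. full range distinction being discussed here, a quick sketch of the standard 16-235 to 0-255 expansion (the scaling constants come from the video level definitions themselves; this is just the arithmetic, not any NVIDIA driver API):

```python
def limited_to_full(y):
    """Expand a limited-range (16-235) 8-bit value to full range (0-255)."""
    # Values outside the nominal 16-235 window are clipped before scaling.
    y = min(max(y, 16), 235)
    return round((y - 16) * 255 / 219)

# Limited-range black (16) and white (235) map to the full-range extremes:
print(limited_to_full(16))   # 0
print(limited_to_full(235))  # 255
```

When a driver sends limited range to a display expecting full range, blacks sit at 16 instead of 0, which is the washed-out look complained about below.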
Less color to output means more available bandwidth and more performance. They recently dumbed down HDMI color to 4K panels to be able to say they can do 4K@60Hz.
If you have a 6-bit panel or worse, limited range is less likely to bother you, because you’re more likely to blame the panel than the GPU. Recent G-Sync users have experienced this issue with the out-of-range bug in the ASUS ROG Swift monitor, and Nvidia has yet to confirm it exists at all despite various complaints in forums from buyers.
Nvidia default
Limited Range RGB 16-235
Gamma = 2.0
White point = 6318k
White luminance = 132
Black luminance = 0.32
Contrast ratio = 413
AMD default
YCbCr 4:4:4
Gamma = 2.3
White point = 6468k
White luminance = 154
Black luminance = 0.13
Contrast ratio = 1184
AMD’s default is much better, and from what I understand they give you the option to improve it with a drop-down menu. So if you’re running a game or an app, it more than likely reverts back to default, and thus Nvidia gives you a washed-out color look on a decent monitor panel.
If you use an HDMI/DP monitor that identifies itself as an HDTV for the sake of compatibility, then the driver will apply HDTV settings to the signal because it thinks it’s connected to a TV.
If you use an HDMI/DP monitor that properly identifies itself as a PC monitor, then you’ll get full-range RGB.
Make sure the resolution profile in the driver is listed under "PC" and not "HDTV" or something else (like "Ultra HD, HD, SD"). If it’s under anything other than "PC", try to select one under "PC", and if those don’t work, create a custom resolution.
Simply adding a "full range RGB" setting to the driver will not work, because you’ll also need to set up the monitor/HDTV: otherwise the monitor or HDTV will simply ignore the full-range RGB and compress the signal in the most terrible way. Most monitors and some HDTVs have an option to turn this compression off, and then you’ll have to see what resolution the driver chooses (maybe by unplugging it and plugging it back in again). And this is true for AMD also.
I already inquired about DSR and whether the downsample filter is gamma-aware, but I need to clarify my question based on something I remembered. RGB is a linear format, so any resampling done before color correction (driver or game gamma/brightness settings) shouldn’t need to take gamma into account AFAIK. If the downsampling happens after color correction – which it very well might – then it does become a pertinent question.
I guess I’m also asking does the downsampling for DSR happen before or after color correction?
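For anyone wondering why gamma-awareness matters for a downsample filter at all, here is a toy sketch of averaging two pixels both ways (the 2.2 exponent is an assumption standing in for the display's actual transfer curve; real DSR internals are not public):

```python
def average_naive(a, b):
    """Average gamma-encoded values directly, as a gamma-unaware filter does."""
    return (a + b) / 2

def average_gamma_aware(a, b, gamma=2.2):
    """Decode to linear light, average there, then re-encode."""
    linear = ((a ** gamma) + (b ** gamma)) / 2
    return linear ** (1 / gamma)

# Averaging pure black (0.0) and pure white (1.0):
print(average_naive(0.0, 1.0))        # 0.5, which displays too dark
print(average_gamma_aware(0.0, 1.0))  # ~0.73, a perceptually correct mid-gray
```

The naive average darkens high-contrast detail, which is why the question of where in the pipeline the resampling happens (before or after color correction) is a fair one.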
Is there anything that would make it impossible or very expensive for a display to support both G-Sync and FreeSync?
For me the most interesting thing about Maxwell is its use in laptops and mini PCs. I realize asking about future releases is futile, but maybe you could phrase questions in a way that allows him to reply with an ‘it’s definitely something we’re looking into *wink wink*’ type of answer…
Has there been much interest from makers of steamboxes in Maxwell? The Alienware one is the only one I’ve seen any coverage of.
If the energy efficiency of Maxwell is the result of lessons learned from the development of Tegra K1, why is the graphics portion of it classified as Kepler?
Does that mean we can expect a jump in energy efficiency from the next Tegra similar to the jump from desktop Kepler to Maxwell?
Seeing as the 880M is very close to the desktop 770, is it reasonable to expect GM204 derived mobile GPUs?
Or with the GM204 being an overkill for FullHD, would laptop makers be more interested in smaller, cheaper chips?
Wow, that’s gonna be a lot of camera time for Ryan.
Question for Tom: Is Nvidia at all concerned about the upcoming monitor market segmentation (G-Sync/FreeSync), or is the expectation that one will win and the other company will adopt the winner’s tech? Would Nvidia be willing to license out G-Sync to other GPU companies (Intel and Matrox still count)?
I would like to know what features EVGA is pushing on these motherboards to warrant their price tag, with similar boards from other companies with seemingly more features selling for a smaller premium. Is it the fabulous CMOS battery placement?
Aw crap, I looked at the prizes for the other post and thought EVGA was gonna be there too. Let me go find my dunce hat.
Oh wait, they will be talking about that stuff. Then my question still stands 🙂
Wrong place to ask that; try asking under the EVGA live stream.
Who can afford them?
Please ask Mr. Petersen:
Does the color compression happen in the driver as a CPU task?
If yes, has anyone measured the CPU usage in games with GM204?
Does color compression also reduce video RAM usage? If yes, why did VRAM increase while bandwidth decreased this generation?
Well, since the bus went down to 256-bit from 384-bit, Nvidia had a choice: cut the cards to 2GB or go to 4GB. Given the state of games, which direction to go was kind of a no-brainer.
I have another question, this one regarding G-Sync.
There has been interest in G-Sync for matching the frame rates of video content, such as film (23.976) and NTSC (29.97), as well as for video editing purposes. Since these rates are below the minimum refresh specs of current G-Sync monitors (and are rather low for doing any sort of work), are there any plans to implement a ‘double-rate’ G-Sync, which for example in the case of 23.976 content would sync at a rate of ~47.95fps?
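The 'double-rate' idea above is just integer frame repetition, and the arithmetic is easy to sketch (the 30-144Hz window here is an assumed VRR range for illustration, not a published G-Sync spec):

```python
def sync_multiple(content_fps, vrr_min=30.0, vrr_max=144.0):
    """Smallest integer multiple of content_fps inside the VRR window, or None."""
    m = 1
    while m * content_fps <= vrr_max:
        if m * content_fps >= vrr_min:
            return m, m * content_fps
        m += 1
    return None

# Film needs each frame shown twice to land inside the window:
print(sync_multiple(23.976))  # (2, 47.952)
print(sync_multiple(29.97))   # (2, 59.94)
```

Each film frame would be scanned out twice, so motion cadence stays perfect without 3:2 pulldown judder.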
I watch a ton of video on my 60Hz-locked monitor and never see anything like tearing, so I’m not sure there would be a point to it. What G-Sync does is mostly needed for games.
Are there any plans to extend G-sync to IPS or other LCD panel technologies other than TN panels?
Also, desktop Maxwell GTX 980 and GTX 970 were announced; when can we expect laptop versions of these products to be released?
There is nothing limiting G-Sync to TN; it’s the monitor makers that have done so. The reason is that IPS has yet to get fast enough to really benefit from G-Sync. There is only one IPS monitor I’ve seen being sold that can do 120Hz from the factory, and that one is listed as overclocked.
Is the GTX 980 a fully unlocked GM204, or is it like the 780, where it gets replaced with a fully unlocked GPU?
NVIDIA has already confirmed that the GTX 980 is the fully-unlocked GM204. You can read it on page 1 of Ryan’s review.
You can also count the green squares (or just the clusters thereof) in the die shot – they match the number of those elements in the GTX 980 specs.