Although their keynote presentation tonight at CES is all about automotive technology, that hasn't stopped NVIDIA from providing us with a few gaming-related announcements this week. The most interesting of these is what NVIDIA is calling "Big Format Gaming Displays," or BFGDs (get it?!).
Along with partners ASUS, Acer, and HP, NVIDIA has developed what seems to be the ultimate living room display solution for gamers.
Based on an HDR-enabled 65" 4K 120Hz panel, these displays integrate both NVIDIA G-SYNC variable refresh rate technology for smooth gameplay, as well as a built-in NVIDIA SHIELD TV set-top box.
In addition to G-SYNC technology, these displays will also feature a full direct-array backlight capable of a peak luminance of 1000 nits, as well as coverage of the DCI-P3 color gamut, both necessary features for a quality HDR experience. These specifications put the BFGDs in line with the current 4K HDR TVs on the market.
Unlike traditional televisions, these BFGDs are expected to have very low input latencies, a significant advantage for both PC and console gamers.
Integration of the SHIELD TV means that these displays will be more than just an extremely large PC monitor, but rather capable of replacing the TV in your living room. The Android TV operating system means you will get access to a lot of the most popular streaming video applications, as well as features like Google Assistant and NVIDIA GameStream.
Personally, I am excited at the idea of what is essentially a 65" TV, but optimized for things like low input latency. The current crop of high-end TVs on the market caters very little to gamers, with game modes that fail to turn off all of the image processing effects and still exhibit significant latency.
It's also interesting to see companies like ASUS, Acer, and HP, which are well known in the PC display market, essentially entering the TV market with these BFGD products.
Stay tuned for eyes-on impressions of the BFGD displays as part of our CES 2018 coverage!
Update: ASUS has officially announced their BFGD offering, the aptly named PG65 (pictured below). We have a meeting with ASUS this week, and we hope to get a look at this upcoming product!
Should ask them if they will take advantage of HDMI 2.1 for these displays. It would make sense for 120 Hz 4K HDR content, but Nvidia will probably try to avoid cross-platform VRR in favor of G-Sync.
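For rough context, here's a back-of-the-envelope estimate (assuming 10-bit RGB and ignoring blanking overhead, so the real link requirement is somewhat higher) of why 4K 120 Hz HDR exceeds HDMI 2.0's 18 Gbps but fits within HDMI 2.1's 48 Gbps:

    # Rough uncompressed bandwidth for 4K 120 Hz HDR video.
    # Assumptions: 3840x2160, 10 bits per channel RGB, no blanking overhead.
    width, height, refresh_hz = 3840, 2160, 120
    bits_per_pixel = 3 * 10  # RGB at 10 bits per channel for HDR

    gbps = width * height * refresh_hz * bits_per_pixel / 1e9
    print(f"~{gbps:.1f} Gbps raw")  # ~29.9 Gbps: over HDMI 2.0's 18 Gbps cap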
G-Sync needs to just go away. It should have been part of the standards from the start, not some proprietary vendor lock-in BS.
It's understandable why you might think that, but given that Nvidia was R&Ding G-Sync years before the standard, and FreeSync only came to exist off the back of G-Sync, piggybacking on the research and consumer demand that G-Sync had already brought about, Nvidia is not going to retire it and concede to its copycat competition, at least not until it has made an ROI.
G-Sync is superior anyways; I hope it doesn't go away.
I'm sure it's DisplayPort; these are oversized monitors, after all.
… also HDR10+ and HDCP2.2.
Ask ASUS for price and availability in CANADA, please!
It will most likely G-Sink your bank account!
I wouldn't expect much info; ASUS loves showing off vaporware at CES.
I believe that AMD is also coming out with some standard for FreeSync on TVs that could be included into the DisplayPort standard at no additional cost to the manufacturer or end user.
Anyway, this doesn't make any sense unless you are connecting your PC to this TV. All consoles at this point in time are using AMD technology and wouldn't be able to support G-Sync.
If anybody wanted to use these TVs as monitors, they'd probably need to get a 1080 Ti.
I'm guessing that Nvidia is playing a preemptive game by announcing this news before AMD does. Nvidia seems like a jealous girlfriend who's always trying to take away any boy AMD likes. 🙂
HDMI 2.1 specifications include Variable Refresh Rate, which is "adaptive sync." Nvidia will most likely ignore the standard and push for the G-Sync crap.
VESA DisplayPort Adaptive-Sync(TM) is FreeSync, and it has been included since the VESA DisplayPort 1.2a standard. Or do most gamers appear unable to tell the difference between marketing branding and standards naming?
Gamers appear unable to know that Zen is the name of a microarchitecture and Ryzen is a consumer branding for a line of consumer CPU/APU products based on the Zen CPU microarchitecture.
Gamers appear unable to tell that Intel's HyperThreading(TM) marketing brand is just Intel's version of SMT (Simultaneous MultiThreading), and AMD does not have a brand name for its version of SMT; AMD just uses the generic SMT term. Hell, Intel did not even invent SMT; that work was done elsewhere:
“The first major commercial microprocessor developed with SMT was the Alpha 21464 (EV8). This microprocessor was developed by DEC in coordination with Dean Tullsen of the University of California, San Diego, and Susan Eggers and Henry Levy of the University of Washington. The microprocessor was never released, since the Alpha line of microprocessors was discontinued shortly before HP acquired Compaq which had in turn acquired DEC. Dean Tullsen’s work was also used to develop the Hyper-threading (Hyper-threading technology or HTT) versions of the Intel Pentium 4 microprocessors, such as the “Northwood” and “Prescott”. ” (1)
So whatever tweaks AMD does with its "FreeSync" will be adopted by VESA at some point into the VESA DisplayPort Adaptive-Sync(TM) standard, just like it was done for the VESA DP 1.2a standard.
(1) "Simultaneous multithreading", https://en.wikipedia.org/wiki/Simultaneous_multithreading
I’m warming up to the idea of large format gaming monitors. Though, it’s still hard to imagine sitting 3 feet away from the display on a desk. Thinking about the ergonomics of looking at the top of the screen makes my neck hurt already.
The press image doesn't seem to be at an accurate scale. I have a 70″ TV, and it's far larger than this 65″ monitor appears.
On a related note, I bet Nvidia has a new video card in the next 3 months that can actually drive the display at 120fps.
On a related note, Nvidia will not be bringing Volta variants to market while it still has plenty of GP102 ROPs available to spin up new Pascal GPU variants. GP102 tops out at 96 ROPs, so there are plenty of Pascal ROPs for gaming for Nvidia to milk until AMD can produce an ROP-heavy Vega base die variant. It's ROPs that fling the frames, because it's ROPs that produce the final pixel fill rates that fill the final frame buffer before it's flung out to the monitor!
The FPS race is all because of ROPs, and shaders do not matter as much compared to the almighty ROPs that make up the render back end.
Now Bubba Gamer likes them FPS racing metrics because it's just like NASCAR. So Bubba Gamer has his bragging rights from those ROPs from Nvidia that separate Bubba from his dishwashing dollars. Nvidia should continue to increase its GPU prices, because we cannot have Vega selling to those miners for more than the price of a GTX 1080 Ti.
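To put the ROP claim in rough numbers: theoretical peak pixel fill rate is just ROP count times clock speed. The clocks below are approximate boost clocks assumed for illustration, and real throughput is lower once memory bandwidth and overdraw get involved:

    # Theoretical peak pixel fill rate = ROPs x clock (Gpixels/s).
    # Clock values are approximate boost clocks, assumed for illustration.
    def fill_rate_gpix(rops, clock_ghz):
        return rops * clock_ghz

    print(fill_rate_gpix(88, 1.58))  # GTX 1080 Ti: ~139 Gpixels/s
    print(fill_rate_gpix(64, 1.73))  # GTX 1080:    ~111 Gpixels/s
    print(fill_rate_gpix(64, 1.55))  # RX Vega 64:  ~99 Gpixels/s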
So, by your logic, Apple should stop releasing new iPhones because they have plenty of existing product to sell.
What you don’t realize is that Nvidia makes its money on people who upgrade their video cards. If everyone has already bought their desired upgrade, and enough people have committed to waiting for the next product cycle, then Nvidia misses out on revenue they could be getting right now on new Volta cards.
Nvidia has already sunk the R&D costs into the newest architecture, so waiting around is pointless. Plus, Nvidia can charge a larger amount for early adopters anyway.
It’s not like the majority of video card purchases are made in the last 3 months of a product cycle.
AMD vs Nvidia doesn’t play into this at all.
This is not like Intel vs AMD, since the performance gains with successive gpu architectures are so great that there actually is a strong incentive to upgrade every cycle or every other cycle, unlike the CPU side with Intel.
The performance gains for Nvidia are not based on new GPU microarchitectures; they're based on cramming in more ROPs than AMD does in its GPUs so Nvidia can get higher FPS metrics. Bubba is not going to notice any frame quality differences at 60+ FPS, and Nvidia can design its drivers to cut back on frame quality based on in-game movement by the player, so any frame quality decrease is not noticed by any gamer.
The human mind is geared toward noticing edges and edge detection, so as long as there are no jagged edges and no periods of above-normal frame rate variance, Bubba Gamer's mind (the parts of the brain not dedicated to logical/rational thought processes, i.e. the visual cortex) is not going to notice. The human mind evolved with nearly instantaneous frame rates, as the real world can map out millions+ of FPS using trillions and trillions of photons, with actual rays themselves hitting actual atoms. So the human mind will notice frame variance if it happens, owing to the fact that in the natural world FPS is damn near perfect at millions+ of FPS, and the photons and atoms are doing the heavy quantum calculations that come naturally with the universe and the laws of physics.
Any GPU microarchitectural tweaking done by Nvidia on the GPU "generations" since Maxwell has mostly improved only slight areas, such as cache subsystems and small shader core improvements for GPUs up to Pascal, for more power efficiency. Most of Nvidia's gaming advantage comes from its many base die designs, from GP100, GP102, and GP104/GP106/GP108, which give Nvidia various levels of maximum ROP counts, like GP102 having 96 ROPs available compared to GP104's lesser total.
ROPs are what have kept Nvidia in the lead in gaming, and GP102 was dusted off to make the GTX 1080 Ti with its 88 ROPs to keep Nvidia ahead of RX Vega 64 with its maximum complement of 64 ROPs. GP104 has the GTX 1080 limited to 64 ROPs, so Nvidia used its GP102 to fix that problem in the Ti.
AMD's GPUs are shader-heavy, but that's popular with miners, and the mining demand for Vega's compute has AMD's Vega selling at a higher demand-driven market price than even the GTX 1080 Ti. AMD lacks the ROPs to soothe Bubba Gamer's fragile ego with ego/ePeen-soothing FPS metrics, so Nvidia has that Bubba market in the bag. Nvidia may have its Pascal development costs fully amortized, but JHH has his shareholders to worry about, and AMD is not serious enough about ROPs to force Nvidia's hand to bring any newer GPU designs to market until Nvidia absolutely has to.
AMD has its Epyc CPUs that will save the company, and AMD is more interested in those Vega 10 dies that make the grade to be included in the Radeon Instinct MI25 AI SKUs and Radeon Pro WX 9100s for the compute market, where the professional-market Vega revenues/markups will more than pay for large amounts of HBM2.
AMD will get more of the consumer integrated graphics market at first, and that's good enough if you look at Intel's GPU market share, with integrated graphics being a larger market than AMD's and Nvidia's discrete GPU markets combined. So AMD can work its way back into the consumer GPU market faster with its Radeon/Vega Raven Ridge APUs before Navi is available, at which point AMD will have some fully scalable multi-GPU-die products to create its entire line of Navi SKUs, low end to flagship and professional markets, like AMD has done with its modular, scalable Zen/Zeppelin dies in the CPU arena!
Nvidia can sell a Pascal/GP102-based GTX 1080 TX with 96 ROPs, which is some ways above the GTX 1080 Ti's 88 ROPs, and Bubba Gamer will eat that up for bragging rights alone. Bubba Gamer thinks of Nvidia more as a piece of jewelry, and of Nvidia GPUs' FPS metrics as a status symbol, but if it sells Nvidia's GPUs, then JHH/stockholders are happy.
I'm using a 55-inch LG B7 OLED with HDR as my primary computer monitor. It's 4 feet from my eyes and the best display I've ever used. The computer monitor industry is a joke; good luck finding OLED or local-dimming LCDs, which are required for HDR, in computer monitors in 2018. These Asus/Nvidia "monitors" are just rebadged panels with a G-Sync module strapped on. I'd still rather get OLED.
You know that OLED has burn in problems similar to what Plasma had, right? While their color accuracy and black levels are exceptional, they do not make good monitors because of static elements on screen (backgrounds, task bar, icons, etc).
Are these panels OLED as well? Hopefully a next-gen OLED without burn-in issues?
Price and availability in Canada too please 🙂
I thought the lowest refresh rate panels could go with G-Sync natively was about 37 Hz.
So how come they speak of G-Sync working at 25, 24, and even 23.976 fps? Or is it actually doubling the refresh rate at those rates, so the panel runs at, for example, 50 Hz?
Yes, it doubles (or more) between frames when below the minimum panel refresh limit.
How it works.
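A minimal sketch of that logic, assuming a low-framerate-compensation scheme like the one described above (the exact G-Sync algorithm isn't public, and the 37 Hz panel minimum is taken from the question): repeat each frame enough times that the effective refresh lands back inside the panel's supported range.

    # Hypothetical low-framerate-compensation sketch: below the panel's
    # minimum refresh rate, repeat each frame enough times to bring the
    # effective refresh back inside the supported range.
    def effective_refresh(content_fps, panel_min_hz=37.0):
        repeats = 1
        while content_fps * repeats < panel_min_hz:
            repeats += 1
        return content_fps * repeats, repeats

    for fps in (23.976, 24, 25):
        hz, n = effective_refresh(fps)
        print(f"{fps} fps -> panel runs at {hz:.3f} Hz ({n}x frame repeat)")
    # 23.976 -> 47.952 Hz (2x), 24 -> 48 Hz (2x), 25 -> 50 Hz (2x)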