We've been talking about the benefits of 4K for a while, most recently with the Samsung U28D590D, which added single-stream 60Hz support to the mix, but there have certainly been some drawbacks with 4K monitors to date. Between usually low refresh rates and the general problem of getting smooth images on the screen (not to mention the high price of entry into 4K), there have been some legitimate questions about when to upgrade. Well, an interesting new product announcement from a surprising source might change things.
With a logo like that, who needs product photos?
Today, Acer is announcing an interesting alternative: the world’s first 4K monitor with integrated NVIDIA G-SYNC technology.
The XB280HK will be a 28" display, and (provided you have an NVIDIA graphics card and were looking to make the move to 4K) the benefits of G-SYNC – which include minimizing stutter and eliminating tearing – seem ideal for extremely high-res gaming.
We’ll be eagerly awaiting a look at the performance of this new monitor. (Or even a look at it, since Acer did not release a product photo!)
The details are scarce, but Acer says this will be a part of their “XB0” series of gaming monitors. Here are some specs for this 28” 3840×2160 display, which features three proprietary technologies from Acer:
- “Flicker-less” which Acer says is implemented at the power supply level to reduce screen flicker
- “Low-dimming” which sounds like an ambient light sensor to dim the monitor in low light
- “ComfyView” non-glare screen
Of interest, the Acer XB280HK is likely using a TN panel given the claimed "170/170 degree" viewing angle.
The hardware requirements for good 4K frame rates are definitely up there, and with G-SYNC onboard the XB280HK will probably not be in the low end of the 4K price range, but we shall see!
It likely is TN at 170/170. If it were VA or IPS it would be 178/178.
Agreed, many TN monitors handle 170/170 (though 170/160 is more common). Not sure where PCP got that idea from.
Corrected, thanks. A true 170/170 wouldn't likely be TN, but this is not a claim of distortion-free performance, only a marketing claim.
same 4K panel as the Samsung UD590 (4K TN/60Hz).
Most likely. If the Samsung is selling for $599-749, and the G-Sync module is a $100 (OEM, not MSRP) upgrade… minus the Acer branding… carry the three…
I’m guessing a $699-799 MSRP.
I’d still like to hold out for 120Hz 4K displays with DP 1.3.
The module was $100 before. But it won’t be the same built in. You are replacing tech that was in the OLD style monitors (scaler) so the cost should drop because that is no longer being replaced. You don’t pay for that part now right? Considering their reluctance to make a better scaler (even today, AMD has to wait on these just like NV got TIRED of waiting), they are not that cheap I guess.
So a little more math says subtract whatever the OLD tech was that is no longer needed (removed, when you DIY) with the NV solution. I doubt it will be expensive because of NV; it will be expensive because 4K+gsync can make them some extra premiums, and NV can’t really stop them from jacking it up to whatever the market will pay.
Even at $100 it baffles me people say it was expensive. I’d pay $100 for smooth gaming over the 7 years or so I own my monitor (my current one, a Dell 24in, is in year 7, and so is my 22in). I’d give $14/yr for much better gaming no matter what I’m playing. As this stuff all shrinks etc, I doubt it’s anywhere near $100 now.
It was announced at $100 but was sold at $150 and now it’s at $200.
$200 for 1080p. Knowing Nvidia we are looking at a $400 mark-up at 4k for G-Sync.
Hopefully we see FreeSync at the same time as G-Sync.
4K is too expensive to drive. I would be more interested if this monitor offered 1080p@120Hz with strobing, but I doubt that it does.
I’m more likely to get that Asus monitor that’s coming out soon rather than this one. I’m that guy who likes to do a bit of gaming in 3D from time to time, and a 27″ 1440p 3D Vision-certified display that also happens to do G-Sync in 2D mode instantly piqued my interest. (I’m currently on a 22″ 1680×1050 display and could never justify the high cost of the marginal upgrade to a 3D-capable 1080p display, but 27″ 1440p is finally a more substantial upgrade option I could see myself shelling out for that just hasn’t existed until now.) 4K would certainly be nice, but if this is IPS then there’s no way it’d have the response time necessary to work in 3D.
Regardless of whichever monitor I wind up grabbing at some point, I’m gonna need a GPU upgrade first, as my old 560Ti is still kicking but certainly won’t cut it at 1440p, let alone 4K. I’ve started putting some money aside to eventually pick up an 870 whenever that becomes a thing, AND a new monitor at the same time.
I’m guessing those monitors were already manufactured, and now they’re just gonna empty the stock, because if Nvidia chooses to stick to a private ecosystem while there is an open tech out there, it would be greedy and stupid of them.
If I cared about such things I would prefer a monitor that supported both G-Sync and FreeSync, not just one.
Also it’s not as if the place is lousy with FreeSync support one can take advantage of now.
The “open” alternative will not appear in monitors for at least 6-12 months, and even then there are fundamental differences between the two techs that would make one better than the other for many people.
Predictive like FreeSync, meaning the graphics card has to predict how long before the next frame is to be displayed… which could cause problems with mispredicts…
Or Reactive like GSync, meaning the graphics card holds the current frame until next is ready… Zero prediction necessary.
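The predictive-vs-reactive distinction above can be sketched as a toy model. To be clear, everything here is invented for illustration: the frame times, the tolerance, and the naive "last frame time" predictor are assumptions, and neither NVIDIA's nor AMD's actual implementation is specified in this detail.

```python
# Toy model contrasting the two adaptive-sync approaches described above.
# All numbers and the prediction rule are hypothetical illustrations.

def reactive_display(render_times):
    """G-Sync-style (as described): hold the current frame until the
    next one is ready, then scan out. No prediction, so no mispredicts
    in this model."""
    stutters = 0
    for t in render_times:
        pass  # the panel simply waits t ms for the frame, then refreshes
    return stutters  # always 0 here

def predictive_display(render_times, tolerance_ms=2.0):
    """FreeSync-style (as the commenter describes it): guess the next
    frame's arrival from the previous one; a guess off by more than
    tolerance_ms counts as a mispredict (a visible hitch)."""
    stutters = 0
    prediction = render_times[0]
    for actual in render_times[1:]:
        if abs(actual - prediction) > tolerance_ms:
            stutters += 1
        prediction = actual  # naive "last frame time" predictor
    return stutters

# Smooth content predicts well; a sudden spike (e.g. an asset load) does not.
smooth = [16.7] * 10
spiky = [16.7] * 5 + [40.0] + [16.7] * 4
print(reactive_display(spiky))    # 0
print(predictive_display(smooth)) # 0
print(predictive_display(spiky))  # 2 (the spike itself, then the recovery)
```

The point of the sketch is only that a predictor pays twice for each surprise: once when the frame arrives late, and again when the following frame arrives "early" relative to the inflated prediction.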
May be more expensive.
If you disregard open vs proprietary, which do you think people would prefer? And why?
More uninformed nvidiots who don’t know how vblanking works.
And you do? Please enlighten us then.