What RTG has planned for 2016
The Radeon Technologies Group hosted a select few media members in Sonoma last week to lay out its plans for 2016.
Last week the Radeon Technologies Group invited a handful of press and analysts to a secluded location in Sonoma, CA to discuss the future of graphics, GPUs and of course Radeon. For those of you that seem a bit confused, the RTG (Radeon Technologies Group) was spun up inside AMD to encompass all of the graphics products and IP inside the company. Though today’s story is not going to focus on the fundamental changes that RTG brings to the future of AMD, I will note, without commentary, that we saw not a single AMD logo in our presentations or in the signage present throughout the week.
Much of what I learned during the RTG Summit in Sonoma is under NDA and will likely be so for some time. We learned about the future architectures, direction and product theories that will find their way into a range of solutions available in 2016 and 2017.
What I can discuss today is a pair of features that are being updated and improved for current generation graphics cards and for Radeon GPUs coming in 2016: FreeSync and HDR displays. The former is one that readers of PC Perspective should be very familiar with while the latter will offer a new window into content coming in late 2016.
High Dynamic Range Displays: Better Pixels
In just the last couple of years we have seen a spike in resolution for mobile, desktop and notebook displays. We now regularly have 4K monitors on sale for around $500 and very good quality 4K panels going for something in the $1000 range. Couple that with the increase in market share of 21:9 panels with 3440×1440 resolutions and clearly there is a demand from consumers for a better visual experience on their PCs.
But what if the answer isn’t just more pixels, but better pixels? We already discuss this weekly when comparing render resolutions in games, 4K at lower image quality settings versus 2560×1440 at maximum IQ settings (for example), but the truth is that panel technology has the ability to make a dramatic change to how we view all content – games, movies, productivity – with the introduction of HDR, high dynamic range.
As the slide above demonstrates, there is a wide range of luminance in the real world that our eyes can see. Sunlight crosses the 1.6 billion nits mark while basic fluorescent lighting in our homes and offices exceeds 10,000 nits. Compare that to the most modern PC displays, which range from 0.1 nits to 250 nits, and you can already tell where the discussion is heading. Even the best LCD TVs on the market today have a range of 0.1 to 400 nits.
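One way to put those luminance figures in perspective is to convert each span into photographic "stops," where every stop is a doubling of luminance. A quick sketch (the `stops` helper is just for illustration):

```python
import math

def stops(lo_nits, hi_nits):
    """Dynamic range in photographic stops: each stop is a doubling of luminance."""
    return math.log2(hi_nits / lo_nits)

# Figures quoted above
print(f"Typical PC display (0.1 - 250 nits): {stops(0.1, 250):.1f} stops")  # ~11.3
print(f"Best LCD TVs       (0.1 - 400 nits): {stops(0.1, 400):.1f} stops")  # ~12.0
```

Going by the article’s own 1.6-billion-nit sunlight figure, the visible real world spans over 30 stops, so even a 12-stop TV captures only a slice of it.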
Coming in time for the holidays of 2016, AMD believes we’ll see displays capable of 2,000 peak nits on LED LCD technology and as high as 1,000 nits (with perfect blacks) on HDR OLEDs. A new industry encoding standard, 10-bit SMPTE ST 2084, will provide room for future displays as high as 10,000 nits (though a time frame for those is unknown).
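The encoding standard in question, SMPTE ST 2084 (the "PQ" curve), maps absolute luminance onto 10-bit code values with headroom all the way up to 10,000 nits. A sketch using the constants published in the standard (the `pq_encode` name is ours):

```python
def pq_encode(nits):
    """SMPTE ST 2084 'PQ' inverse EOTF: absolute luminance (cd/m^2) -> 10-bit code value.
    Constants are from the published standard; 10,000 nits maps to full code."""
    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875
    y = nits / 10000.0
    v = ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2
    return round(v * 1023)     # quantize to 10 bits

print(pq_encode(10000))  # 1023: full scale
print(pq_encode(100))    # ~520: roughly SDR reference white, only about half the code range
```

Note how the curve spends its codes where the eye needs them: everyday SDR brightness sits near mid-scale, leaving the upper half of the range for highlights no SDR encoding can represent.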
Chances are most of you remember a lot of commotion many years ago from game developers claiming to have included HDR support in their engines. Though that’s true, the images had to be tone mapped back down to the SDR color space to be displayed on current monitors, washing away much of the advantage the rendering feature provided. With the ranges of upcoming HDR TVs and monitors in the second half of 2016, that will no longer be the case – as long as game developers and GPU drivers are up to the task.
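For a concrete sense of what SDR tone mapping throws away, here is the classic Reinhard operator (one common choice; engines vary) applied to a few relative scene luminance values:

```python
def reinhard(l):
    """Reinhard tone mapping: compresses [0, inf) relative HDR luminance into [0, 1) SDR."""
    return l / (1.0 + l)

# A 10x jump in scene luminance from 5.0 to 50.0 moves the SDR output by less than 0.15,
# which is exactly the highlight detail an HDR display could have preserved.
for l in (0.5, 5.0, 50.0):
    print(f"{l:5.1f} -> {reinhard(l):.3f}")
```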
The good news is that HDR support doesn’t necessarily require a new GPU. AMD claims that all Radeon R9 300-series cards will be able to support HDR up to 4K 30FPS at 10-bits through HDMI 1.4b and DisplayPort 1.2. To fully support 4K at 60 FPS with 10-bit color, the next generation of Radeon GPUs will be required.
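Those limits are easy to sanity-check with back-of-the-envelope math. Counting active pixels only (blanking intervals add overhead on real links, so actual ceilings are somewhat lower):

```python
def payload_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed video payload in Gbit/s, active pixels only (no blanking)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 10-bit RGB = 30 bits per pixel
print(payload_gbps(3840, 2160, 30, 30))  # ~7.5 Gbps: fits HDMI 1.4b's ~8.16 Gbps video rate
print(payload_gbps(3840, 2160, 60, 30))  # ~14.9 Gbps: beyond HDMI 1.4b, within DP 1.2's 17.28 Gbps
```

The HDMI 1.4b and DP 1.2 figures in the comments are the links’ effective video rates after 8b/10b encoding; with blanking added, 4K60 10-bit pushes DP 1.2 close to its ceiling, which is why the next link revisions matter.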
Based on what I saw and how RTG is talking, most believe that 1080p + HDR will be a better user experience than 4K + SDR. If so, that means that GPU horsepower requirements would be lower than expected in the coming year – a good thing for gamers that don’t have an unlimited budget.
FreeSync: HDMI Support, Mobile and Future Resolutions
The Radeon Technologies Group was quick to talk about the successes of FreeSync for the Radeon brand as well. Though the table above is not a total picture of the current ecosystem, it shows the solid direction AMD is on – more partners, more displays, and more flexible displays with multiple inputs. The table does not mention unit sell-through or how many people are actually using these monitors with VRR technology, something that NVIDIA will likely follow up on in the coming weeks.
The Radeon group went over the progress of FreeSync as well, starting with the initial release in March of this year, support for Overdrive in some models in June, a 30-144Hz panel in August and, more recently, support for low framerate compensation (LFC) added in the Radeon Software Crimson Edition release last month.
LFC is solid, works mostly as advertised and is a needed addition to the FreeSync software suite that, honestly, AMD should have thought about prior to release. Regardless, it’s here now, as a free software upgrade for users of compatible monitors, and gaming is better because of it.
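LFC’s underlying trick is frame repetition: when a game’s frame rate drops below the panel’s minimum VRR refresh, the driver shows each frame two or more times so the effective refresh stays inside the panel’s supported window. A simplified sketch (not AMD’s actual driver logic; real LFC also requires the panel’s max refresh to be well above twice its minimum):

```python
def lfc_refresh(fps, panel_min, panel_max):
    """Low framerate compensation sketch: pick a frame-repeat multiple that keeps
    the effective refresh inside the panel's VRR window [panel_min, panel_max].
    Returns (repeat_multiple, effective_refresh_hz). Simplified illustration only."""
    if fps >= panel_min:
        return 1, min(fps, panel_max)   # already in range, no repetition needed
    m = 2
    while fps * m < panel_min:          # repeat frames until we clear the floor
        m += 1
    if fps * m > panel_max:
        return 1, panel_min             # window too narrow: true LFC isn't possible
    return m, fps * m

print(lfc_refresh(25, 40, 144))  # (2, 50): each frame shown twice, panel runs at 50 Hz
```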
AMD also wanted to point out that FreeSync is available this holiday in a notebook! The Lenovo Y700 that includes the AMD FX-8800P Carrizo APU and the discrete Radeon R9 M380 GPU has VRR support on its 15.6-in 1080p IPS panel. I asked what ranges the monitor supported FreeSync at but no one seemed to have a solid answer for me yet. I’ll try to get hands-on with the hardware at our local Best Buy to find out. Either way, it’s great to see AMD make a move in this market, though with very minimal market share in discrete notebook graphics, it may be hard to find.
A new feature coming in 2016 is support for FreeSync over HDMI. Using vendor-specific extensions of the HDMI specification, AMD has worked with scalar vendors to add support for variable refresh to the protocol, something that does not exist in the standard today. This method allows HDMI FreeSync to be forwards and backwards compatible with shipping FreeSync-ready graphics cards, though you will need a new display controller (and monitor) to support it.
As an interesting aside, when AMD was asked if it would open up this HDMI extension to the ecosystem for others to integrate, openness often being claimed as the company’s primary direction, the answer was that it hadn’t yet made that determination since it had done the development work. The irony in that statement is ripe if you have followed the NVIDIA G-Sync versus AMD FreeSync debate.
AMD already has a list of 9 upcoming displays that will support FreeSync over HDMI, adding to its lead in design wins over G-Sync.
An obvious question about FreeSync over HDMI is… why? With DisplayPort a royalty-free connection option (HDMI carries a licensing fee), what is the benefit to having FreeSync on HDMI, and on NEW upcoming monitors at that? AMD claims that monitor vendors aiming for low-cost solutions still lean towards a single HDMI connection and tend to leave DisplayPort off the feature list. Though this doesn’t make sense to me in this modern age, and makes even less sense in 2016, we have definitely seen the sub-$200 monitor market with a distinct lack of DisplayPort connections. If that continues, and FreeSync improves on its pricing advantage over G-Sync, it could be a solid win for Radeon users.
Finally, to round things out, AMD touted the pending advantages of DisplayPort 1.3 for new resolutions and new FreeSync configurations. The HBR3 iteration of DisplayPort offers an 80% bandwidth improvement over HDMI 2.0, up to 32.4 Gbps. It uses existing cables and connectors and offers new display options that will get some of our readers’ attention.
How do 4K 120Hz 4:4:4 screens with VRR in Q4 of next year sound to you? Good. Or maybe single-cable 5K 60Hz panels with 4:4:4 color? No problem.
And even if you don’t think you need all that bandwidth for your usage scenario, the upgrade to HDR panels will eat up a lot of that bandwidth. For example, DisplayPort 1.3 will support 3440×1440 resolution 21:9 screens at up to 190Hz in SDR mode or 144Hz in HDR.
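Those DisplayPort 1.3 figures check out against the raw numbers: HBR3 runs four lanes at 8.1 Gbps each for 32.4 Gbps total, and 8b/10b encoding leaves 25.92 Gbps for video. A quick feasibility check, again ignoring blanking overhead:

```python
def fits_dp13(width, height, refresh_hz, bits_per_pixel, effective_gbps=25.92):
    """Does an uncompressed mode fit within DP 1.3's effective bandwidth?
    HBR3: 4 lanes x 8.1 Gbps = 32.4 Gbps raw; 8b/10b encoding leaves 25.92 Gbps.
    Ignores blanking, so real-world limits are somewhat tighter."""
    return width * height * refresh_hz * bits_per_pixel / 1e9 <= effective_gbps

print(fits_dp13(3440, 1440, 190, 24))  # True: SDR (24-bit) ultrawide at 190 Hz
print(fits_dp13(3440, 1440, 144, 30))  # True: HDR (10-bit) ultrawide at 144 Hz
print(fits_dp13(3840, 2160, 120, 24))  # True: 4K 120 Hz SDR
```

Pushing 4K to 120 Hz at 10-bit color fails this check, which lines up with the article’s point that HDR eats into the new headroom quickly.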
This is just the beginning for Radeon as we approach 2016. The Radeon Technologies Group has plans laid out for the next couple of years, though implementing them correctly to build back market share and consumer mind share is a different matter altogether. What I can tell you from personal experience and my interactions with the team at RTG last week is that everyone believes they can do amazing things. And for me, with Raja Koduri at the helm, I am tempted to believe them.
#1 “For those of you that seem a bit confused,..”
get outta my head, and ty for being kind rofl
#2 ” I will note, without commentary, that we saw not a single AMD logo in our presentations or in the signage present throughout the week.”
my comment… “duly noted, and ty very much.”
#3 “…though implementing them correctly to build back market share and consumer mind share is a different matter altogether. What I can tell you from personal experience and my interactions with the team at RTG last week, everyone believes they can do amazing things. And for me, with Raja Koduri at the helm, I am tempted to believe them.”
i think a lot of folks still saw radeons as a separate thing, competing in their own sub-industry, after the ati-amd merger. and i also think a lot of folks are inclined to have similar sentiments as yourself, for the future.
#4 obligatory grammatical correction:
“Thought today’s story is not going to focus on…”
it’s ‘though’, not ‘thought’. good read!
Thanks for the edit point! 🙂
One more small mistake “vendors aiming for lost cost solutions”
scalar or scaler?
scalar is correct
very interesting new tech coming up.
I’m on the verge of getting a new 1440@144hz VRR monitor and gfx card, and while this new tech is interesting, I just don’t see it as that compelling to want to wait another year or two.
I’m running an Acer XG270HU freesync monitor right now with an R9 290 and it is amazing. I run Dirt 3 maxed at around 100fps, Battlefield 4 & Hardline (minus high AA) at around 80-90, and Dirt Rally on high around 65fps.
Even though the monitor is TN, it’s not as bad as other TN monitors I’ve bought. I also keep a 1080p60Hz IPS monitor as a secondary for video, which helps.
Looking back at what I wrote, I can see it wasn’t clear enough.
What I meant is that the new tech covered in this article is just not compelling enough to wait another year or two before I upgrade to a 1440p@144hz VRR monitor and compatible GPU.
The jump from 75hz to 144hz is awesome, but will the jump from 144 to 270 feel like that much of an improvement again? I suspect the law of diminishing returns says that while you will experience an improvement, it won’t be much.
270hz means older games won’t have issue with vsync at the upper range since hardware is so powerful these days. But how often do you think new AAA games are going to hit that 270 limit?
It just seems to me that if you already have a high refresh rate monitor of 144 or even 120 (that also does VRR), you won’t get much, if any, improvement by going to one of those newer monitors next year (or whenever they finally come out).
The high refresh numbers they are giving on the last slide are demonstrating the maximum numbers that the interface is capable of. I doubt we will see LCD based panels capable of real 270 Hz. The extra bandwidth is mostly to support higher resolutions and color depths rather than significantly higher refresh rates. For 5k it looks like they will not be able to run HDR over a single cable. It seems like they could handle that with some lossless compression though. It may be possible for some other display technology, like OLED, to reach much higher refresh rates than LCD, but it is unclear whether these are necessary. I think Ryan indicated that he noticed a difference between 120 and 144 or 160 Hz. It might be interesting to do some blind tests to see how high it needs to be before it is unnoticeable. It would be difficult to isolate though. To run high refresh rates, you would need a TN panel. The colors and brightness would not remain the same at different refresh rates. Also, different people have different sensitivities to refresh rate, and peripheral vision is more sensitive to frame rate as well.
2016 is looking bright!
(Note: Pun absolutely intended) 🙂
No AMD listed means that AMD is trying to push the Radeon Technology Group name recognition. AMD is TOO wedded to the GPU technology with its APUs for Radeon Technology Group to ever become totally independent from AMD. That FX8800P in the Y700 better be able to RUN at 35 Watts or this laptop is way overpriced, even with a discrete AMD GPU. Every laptop is using eDP for variable refresh anyways, so what’s with the FreeSync branding, does it mean that any external monitors plugged into the Y700 via DP that support VESA’s DisplayPort Adaptive-Sync will have variable refresh also?
Well I’m off to check out the Y700 at BB’s website, there better be a listing of the FX8800P’s Wattage parameters, or it’s no sale! I’m serious about that ability to run the FX8800P at 35 Watts, and it’s not for any crappy CPU cores that I want the 35 watts, it’s for the GPU ACE units to get more wattage/thermal envelope at the 35 watts for rendering workloads!
I don’t see a TDP spec for the FX-8800P processor on the Lenovo Y700 listing at Best Buy, but a quick search led me to this page at NotebookCheck.net which states the TDP is 12 – 35 Watts. I hope this helps.
Most of the OEMs have the FX8800P stuffed into Thin and Light/Ultrabook style laptop form factors and the APU is OEM limited to 15 Watts. There needs to be some more information listed for the Lenovo or potential customers will not be buying the Y700, because the Carrizo FX8800P is sometimes severely down clocked in order to make that 15 watt thermal envelope that OEMs are limiting the FX8800P to in those thin and light laptop abominations!
Lenovo better provide a link to the proper technical specifications that explicitly lists the FX8800P APU’s maximum usable wattage envelope in the Lenovo Y700 SKU that ships with the Carrizo FX8800P! I’m seeing that Best Buy is the only one with the FX8800P/Lenovo Y700 laptop, with some opportunists offering the same SKU on e-Bay for a premium price gouge. It does ship with an AMD discrete GCN GPU to pair with the integrated GCN graphics on the FX8800P, so hopefully PcPer/others will get one to benchmark and test for the APU’s top wattage ability, i.e. whether the FX8800P can run at 35 watts.
This Y700 being a Gaming SKU gives me hope for a laptop with the proper cooling solution to run the APU part at at least 35 watts, and along with the discrete GPU in this laptop it may be good for gaming and rendering workloads, Blender 3d now has Cycles Rendering support for AMD’s GCN based GPUs, Thanks to AMD’s new commitment to driver support and the Blender Foundation.
I really want to see there be a line of AMD based laptops that are made for running the Full Linux OSs, including Steam Machine branded Laptops that can utilize the Carrizo FX8800P with Steam OS/Other Full Linux Distros and the Next Carrizo Laptop/Desktop refresh coming from AMD before the Zen SKUs make it to market. The FX8800P at 35 watts has about 30% better performance over an FX8800P OEM limited to a 15 watt thermal target.
If it is a laptop with a discrete GPU, then running the APU at 15 W (mostly CPU only) is probably fine.
No the F it’s not! I want the extra Watts for the integrated APU graphics to have that 35 watts, I don’t give a F about the CPU! I want the Integrated graphics to get a full 35 watts, and the discrete GPU to get its full wattage also.
CPUs F-ing suck for rendering! What person doing Blender Cycles rendering on the GPU (both integrated on the APU and discrete on that GPU) cares about the CPU at all. I want the FX8800P for its GPU ACE units, and whatever discrete AMD GPU for its ACE units, with both getting the maximum amount of watts the parts can handle. Folks doing rendering workloads on the GPU only care about the GPU; the CPU is just there to manage the OS and run the few parts of the graphics application that do not concern graphics.
I sure as hell can not afford a Pro Graphics card, so the Y700 has potential for some low cost rendering hardware, but I want that FX8800P to have that 35 watts ability, no 35 watts for the APU(mostly for the APU’s graphics cores NOT THE CPU’s cores) or it’s no sale.
So you want a mobile 3D rendering workstation? That isn’t a common use case. When games can take advantage of both the dedicated GPU and the integrated GPU simultaneously, then I would want more power for the APU. I don’t think we are there yet though, so if I have a dedicated GPU, I probably wouldn’t care if the APU is 15 watt limited. Also, I would rather have a powerful APU with HBM than a DDR3 or DDR4 based APU with a dedicated GDDR5 GPU. That is assuming the GPU is comparable. I don’t know if HBM APUs will be good for 3D rendering workstation type workloads since the amount of HBM memory will probably be relatively low.
Do you render using 32-bit floating point on the GPU or 64-bit? The 64-bit capabilities have been reduced significantly in recent GPUs. AMD went from 1/8 down to 1/16 of 32-bit speed going from Hawaii to Fiji. That will be reduced in the APUs eventually. Nvidia went down to 1/32 from 1/24 for their consumer cards; it was 1/3 for some of their Titan products in the 700 series, but I believe they are 1/32 across the board for consumer level 900 series cards. If you need 64-bit FLOPs, you are going to have to pay more for it. They cut a lot of 64-bit processing power out of consumer products since 64-bit doesn’t get used that much for games, it takes a lot of die area, and it takes a lot of power.
I can not afford a Pro setup, and Blender 3d is open source and free, and the Blender 3d Cycles renderer now has support for AMD GCN GPU’s. And lots of people render on laptops, and I want the integrated GPU on the FX8800P laptop APU(the integrated GPU on the APU’s die) to damn well be able to have the 35 watts for more rendering power, That 35 watt usage would give the FX8800P about 30% more rendering performance. And Add to that a GCN discrete GPU to give its GPU wattage and more ACE units for Blender Cycles rendering workloads.
That Lenovo Y700, if it lets the FX8800P run at 35 watts UN-throttled will make a great poor man’s Blender Cycles Rendering laptop. There may be another(1) Excavator(Bristol Ridge) based mobile/laptop APU with a 25-45 watt design metric, and maybe at 14nm(?). So Hell yes to laptops running the FX8800P/successor part at top wattage! Hell yes to Blender Cycles rendering done on the GPU(integrated and discrete), even my current laptop’s quad core i7 takes hours to render what can be done on the GPU in minutes, because all my Cycles rendering is done on the CPU for lack of Cycles rendering support on the laptop’s non GCN AMD discrete GPU(Thames pro).
Having a non Nvidia GPU option for Blender Cycles rendering means that now if I get any laptop with an AMD GCN based GPU and I can have a much lower cost laptop based rendering solution.
(1) “Rumor: AMD will bring back Excavator core for one last dance”
someone go to best buy and load CPUZ onto it, give us screen shots.
Not going to happen at Best Buy, you’ll have to have the ability/permissions to download and install or even save for stand alone use of CPUZ! Hopefully the Geek Squad will offer some Windows 8.1 Pro custom Installs for those that do not want windows 10.
“AMD will bring FreeSync to HDMI early next year
Support for UHD content and DisplayPort 1.3 is coming, too”
There are NO, NONE, ZERO articles from Ashraf Eassa where he is not attacking AMD, talking about how it will go bankrupt next week. Still he is on one of AMD’s slides. That’s funny 😀
And there is not one single comment where you don’t suck AMD’s d*ck,
pics or it is you that is the …
He’s posted plenty actually. If you were to get your nose out of Nvidia’s short’n’curlies for a few minutes here and there you might see some.
DP 1.3. It’s about GD time.
Really nice to hear that HDR and DP 1.3 are in the works for future monitor + GPU upgrades. Those seem like the most easily implemented display improvements at this time. Now we just need the perfect blacks of OLED and expanded color space of quantum dots. Unfortunately, extremely high luminance OLEDs with expanded color space at a large panel size at a reasonable cost is an extremely difficult challenge that likely won’t happen in the next 10 years.
OLEDs and quantum dots are not currently used together. Quantum dots are just being used to make backlights with very narrow range RGB light. This allows the color filters to be tuned more specifically for higher efficiency. OLEDs don’t use a back light. LG is doing some strange stuff, but they claim they will be able to produce high luminance screens. They will not be as bright as the quantum dot based LCD tech, but the LCD tech cannot produce OLED black levels. LG makes their OLED televisions by layering the red, green, and blue OLEDs to make a “white” OLED. They then put color filters over the top of it. This is a bit wasteful since all sub-pixels produce red, green, and blue light, but the color filter only allows one color through. Each sub-pixel can be dimmed or turned off independently though. It seems to be necessary to manufacture large screens at reasonable cost. With the OLED I still worry about screen burn in and longevity. Neither one is clearly better since they still involve trade offs. I think the OLED will probably give the best image quality when you purchase it since black levels and high contrast really make images pop. I believe the current OLED TVs do not support HDR connectivity though.
Correct, my post doesn’t say that OLEDs and quantum dots are used in the same display. OLEDs should eventually match quantum dots in the brightness and accuracy of the light emitted (red, green, and blue respectively). Combined with the inability to individually turn off pixels with quantum dots, that makes OLED the clear end goal for display tech.
Samsung is developing “true” RGB OLED displays while LG uses a White OLED backlight with RGB color filters. Do you have any links to more detailed descriptions of LG’s OLED implementation? The ones I’ve seen haven’t been as clear as desired.
I don’t want to get into a semantic discussion on the meaning of “and”, but your post implied to me that you may have been talking about them in the same display so I wanted to clarify to make sure.
Anyway, I don’t think I have seen anything more detailed about LGs process than what I have already stated. It does make sense that it would be easier to build up 3 layers with different chemical composition, with all cells being identical, than it is to make adjacent cells with different chemical composition.
OMG OLED HDR VRR monitor goodness can’t come soon enough.
HDR for the 300 Series, but last time I checked they are the same GPUs as the 200 Series? Is AMD trolling us?
The 300 and 200 series may support relatively the same GCN micro-architecture, but it’s the other functional blocks on the newer GPUs that allow the newer video standards to be supported. The GCN on the 300 series part may be the same as the GCN supported on the 200 series, but the other on-die functional blocks like hardware based video encoding/decoding standards and other newer support may be absent on the 200 series parts. There is a lot of functionality on die with the GPU that is not directly tied into the GPU’s GCN version or dependent on the GPU’s GCN/other current/past micro-architectures! A lot of other video/graphics functionality on the GPU is through dedicated specialized hardware that may not be available for even the same version of GCN labeled GPU micro-architecture. That’s why you go and read the GPU’s/APU’s data sheet from the manufacturer; the re-brands are confusing enough, and then there are the OEM-only SKUs and the GPU makers’ updates to sift through for feature sets implemented.
That GCN 1.0/1.1/1.2 naming does not come from AMD, so some good disambiguation tables are in order to know the difference between the 300/200 and other marketing naming schemes and the actual engineering names, and code names that are used to describe the actual feature sets of the parts.
Here are some tables to help under the sub-heading “Decoder ring for engineering vs marketing names”, It’s not complete but it has a lot of info!
Got inside info that they are the same silicon. If you believe that they are different then they sure fooled you.
I’m not arguing about whether it is or is not the same version of GCN on the 200/300 series; that part may be true, take that up with AMD’s marketing department! I’m saying that there may be some on-die functional blocks, unrelated to the GPU core’s micro-architecture version, that contain improvements in the 300 series parts, and that includes firmware and other enhancements to the on-card chip-sets as well as faster base clocks, more/better memory etc. GPU SKUs are not only rated/labeled by what’s on the core, but also what’s on the uncore and the card, and even if the core micro-architecture is the same, those other improvements may mean that the part receives a new name from the marketing department.
I personally never trust any companies’ marketing departments when it comes to labeling new parts for the next year’s GPU/CPU/other new product lines, so I do my research before buying! I do not expect that a GPU maker should be radically updating its GPU micro-architecture/ISA every year unless the old micro-architecture is now really under performing relative to the competition’s products. So what is called AMD’s GCN “1.2” has a lot of software improvements to come, and even AMD does not recognize more than 2 generations of GCN currently. If the GCN GPU micro-architecture is not broke, then don’t fix it! I’d rather see a stable GPU micro-architecture that has lots of software/driver improvements to make for a more useful lifespan before obsolescence!
Arctic Islands is supposed to have an entirely new micro-architecture/ISA, so I’d expect it will be getting a different GCN number/name; AMD has GCN Generation 1 and GCN Generation 2 as its unofficial designation for its GCN improvements. I’ll bet even Nvidia’s naming schemes amount to about the same amounts of GPU micro-architecture improvements as AMD’s, year on year, with some years offering more radical improvements than others. Nvidia got all that power savings in its GPUs by stripping out more DP/other functionality while AMD did not sacrifice as much on the hardware side to get the power savings. It took a while for AMD to reach relatively close power usage metrics to Nvidia, but AMD did not sacrifice its hardware ability as much to get the power savings. Now AMD’s commitment to HSA and having more asynchronous compute/FP/other functionality fully implemented in the GPU’s hardware is paying off for the next generation of graphics APIs like DX12 and Vulkan that can make use of the hardware that AMD has on its GCN based GPUs!
@Ryan Shrout or anyone who has experienced 34″ widescreen displays first hand on TOP of 40-43″ 4k displays for pc use and gaming…
Which do you prefer?
It seems like the 3440×1440 will be able to get ALL the goodies of higher refresh and resolution AND HDR, but 4k @120Hz uses too much bandwidth to ALSO allow HDR… so what would you choose?
I know Ryan literally has dozens of companies RAIN high end displays upon his head, he is one of the few people in the world that has looked into the windows of eternity with top end displays, so to him, and others like him that have experienced BOTH the widescreen 34″ monitors and the 40-43″ displays, what do you prefer for general use? For gaming? Which is more immersive? Do you enjoy watching tv shows on one more than another?
cant speak to this whole bandwidth issue, in regards to HDR etc etc. But for plain ol’ immersion while in-game??? the 4k display, hands down.
unless you really really really like cock-pit esque games (ie racing/flying sims) you would prefer the extra resolution over the 21:9 aspect ratio of the 34″ display. moreover, aspect ratios, like 21:9, can be achieved on any panel, regardless of size/resolution, via software. there’s only one way to get more pixels….
ps im not ryan shrout
lol when Ryan Shrout posts in the comments, it’s pretty obvious that it’s him.
I’m old. I still remember walking into a circuit city and seeing 720p hi def displays for the first time. It was like looking through a window compared to the 15 yr old tube I had.
I’m pretty happy with what I’ve got. A 24″ 144hz gaming monitor next to a 29″ultra widescreen IPS for webbing, and a chromecast on my big screen for videos (and a oculus rift DK2 for giggles). I would like to swap the 29″ to a 34″ though…
10,000 cd/m² holy shit. I just bought a new monitor which has 350 cd/m² and i can hardly look at it at 50 percent brightness and contrast. What will 10,000 cd/m² do to your eyes then?
21:9 but larger than the 34″ currently, maybe 36-38, doesn’t have to be 40
10-bit full color; high color usefulness is only set to go up, and LED full AdobeRGB gamut backlights are not as expensive as several years ago
DisplayPort 1.3 to drive the color levels and higher resolutions
I know this will all come, but you are building a future list here so there you go.
I just bought a Samsung CF591 monitor. Do I need to buy a DisplayPort cable to take fuller advantage of FreeSync, or is the HDMI cable that came with the monitor enough? Sorry for my bad English.