So NVIDIA has announced their next generation of graphics processors, based on the Pascal architecture. They introduced it as “a new king,” because they claim that it is faster than the Titan X, even at a lower power. It will be available “around the world” on May 27th for $599 USD (MSRP). The GTX 1070 was also announced, with slightly reduced specifications, and it will be available on June 10th for $379 USD (MSRP).
Pascal is built on TSMC's 16nm process, which gives them a lot of headroom. The GTX 1080 has fewer shaders than the Titan X, but a significantly higher clock rate. It also uses GDDR5X, an incremental improvement over GDDR5. We knew it wasn't going to use HBM2, like Big Pascal does, but it's interesting that they did not stick with old, reliable GDDR5.
The full specifications of the GTX 1080 are as follows:
- 2560 CUDA Cores
- 1607 MHz Base Clock (8.2 TFLOPs)
- 1733 MHz Boost Clock (8.9 TFLOPs)
- 8GB GDDR5X Memory at 320 GB/s (256-bit)
- 180W Listed Power (Update: uses 1x 8-pin power)
We do not currently have the specifications of the GTX 1070, apart from it being 6.5 TFLOPs.
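Those TFLOP numbers follow directly from the core count and clock, since each CUDA core can retire one fused multiply-add (two floating-point ops) per cycle. A quick sanity check:

```python
# Peak FP32 throughput = CUDA cores x 2 FLOPs (one FMA per cycle) x clock.
def peak_tflops(cuda_cores: int, clock_mhz: float) -> float:
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

print(f"base:  {peak_tflops(2560, 1607):.2f} TFLOPs")  # ~8.23, the listed 8.2
print(f"boost: {peak_tflops(2560, 1733):.2f} TFLOPs")  # ~8.87, the listed 8.9
```

The quoted 6.5 TFLOPs for the 1070 would similarly pin down its cores-times-clock product, even though the individual specs haven't been released.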
It also looks like it has five display outputs: 3x DisplayPort 1.2, which are “ready” for 1.3 and 1.4, 1x HDMI 2.0b, and 1x DL-DVI. They do not explicitly state that all three DisplayPorts will run on the same standard, even though that seems likely. They also do not state whether all five outputs can be used simultaneously, but I hope that they can be.
They also have a new SLI bridge, called the SLI HB Bridge, that is supposed to have double the bandwidth of Maxwell's. I'm not sure what that will mean for multi-GPU systems, but we'll probably find out soon.
As usual, prices have been jacked up.
Yeah, $379 for a card faster than a Titan X is a huge wallet killer. …wait, what?
I believe he’s referencing the price relative to the previous generation for the same item in the lineup. E.g., the 970 was $299 MSRP, whereas the 1070 is $379. The 980 was $549; the 1080 is $599.
Correction: the GTX 970 had an MSRP of $329.00.
But those weren’t their prices at launch.
It did not launch at $299.00. Most retail prices were a bit higher, but $329.00 was the launch MSRP.
You neglect to mention that the 770 launched at $399. An architecture change will always be higher than a refinement.
We’ll have to see actual benchmarks.
If benchmarks are your favorite games, these cards are for you…
Nvidia really struck marketing gold with the titan idea: if you create a ludicrously priced joke card, people will praise your other prices no matter how high they are. $600-700 single GPU cards didn’t exist not too long ago (aside from professional cards), but now, not only are they accepted, some people even call them a good value.
https://en.wikipedia.org/wiki/Door-in-the-face_technique
Maybe AMD should try something similar. That should help their margins. Or, better yet, maybe GPU customers could stop being such absolute pushovers.
“Maybe AMD should try something similar.” – Pro Duo says hello!
That’s dual GPU. It also runs workstation drivers, and those cards have always been expensive.
But somehow they advertise it as “For Gamers Who Create and Creators Who Game.” That doesn’t sound like a workstation-only slogan to me.
With the option of using the pro drivers without having to pay $4000+ for the pro SKUs, it’s a great deal for a creator to use and save some serious dosh! Those pro drivers cost a lot to develop and certify for the pro graphics packages, so that in itself is reason enough. The gaming drivers are gimped down for speed at the cost of accuracy, and without the pro drivers, targeting the pro applications and pro VR markets is impossible!
So if you want to game on that AMD Pro Duo SKU, install the gaming drivers; for pro development work, install the pro drivers. There will be plenty of pro VR applications for engineering work in 3D virtual environments! Look at the Boeing 777: lots of 3D VR simulations, even for maintenance routines, where maintenance workers simulated tasks in 3D virtual environments and designs were changed to make maintenance work easier and faster!
So that AMD dual Pro is only $1500 and can be used to develop software for cards that cost $4000+. The only difference is that the costly FirePro versions come with error correction and other reliability features in the FirePro SKU’s hardware. So you cannot use the Radeon Pro Duo for production work, but it can be used for developing software that will run on the more costly FirePro-branded SKUs, which can/have to be used for production work (engineering and other work where errors can cost lives and that error correction is needed/required).
Yeah, the Titan was always a card for pros who still want to game, but it still got slammed for its price, so the overpriced Fury Pro can take the same lumps.
It’s still lower than what that Titan Z ($3000) SKU cost when it was introduced! So where are your lumps of cash now? In JHH’s bank account! The AMD dual Pro is a good deal more affordable than the Titan Z (even now)!
Well, if you put it this way, I would happily buy TWO Fire Xes instead.
Yep that’s precisely the way you structure pricing. You make the top end way more expensive for little more performance, you make the low end way too expensive for no performance and you guide everyone towards the mid-size chips with the highest yield which you make the most profit on. The high and low end could be way cheaper for the performance they give, but that would mean your mid-size chips didn’t look as good value.
Sure, the top end was always relatively poor value, but Nvidia really took that to another level with the titan cards. Also, in the past, people complained about the $600 GPUs instead of praising them as a good value. Some of the comments on this article demonstrate just how effective the titan marketing was.
Don’t forget about that other part of Nvidia’s product line segmentation: Nvidia reducing the compute on its consumer SKUs and marketing the power savings. That one will not work for the VR gaming segment, where the gaming engines will be accelerating on the GPU more of the non-gaming compute traditionally done on the CPU! So the VR gaming engines, in order to get the CPU-to-GPU communication-induced latency, inherent over PCI/other protocols, down to as little as possible, will be moving as much of the VR games’ graphics/gaming compute onto the GPU as they can. Those VR frame rates have to be as high as possible to make those cookie-tossing incidents as nonexistent as they can realistically be!
Nvidia has already taken some steps towards improving the fine grained GPU processor thread scheduling and dispatching on its P100 line of HPC GPU accelerators and just how much of that improvement is going to make it into the consumer SKUs is still unanswered. AMD has not been skimping on the asynchronous compute features of its GCN based SKUs and has in fact been improving those GCN ACE units with every new update/generation of GCN.
With DX11 and before, Nvidia has not been at much of a disadvantage for not having the asynchronous compute features fully implemented in its consumer GPU’s hardware, but with DX12/Vulkan and the VR gaming market just starting up Nvidia will have to bring those hardware features down into its consumer SKUs to compete in the VR gaming market.
There is only so much latency hiding that can be done in software/VR middleware outside of having the asynchronous compute features Fully implemented in the GPU’s hardware. There is much less room for any latency in the high VR frame-rates that are necessary for VR gaming and for keeping folks off of Dramamine!
P.S. You “technology reporters” shouldn’t use the acronym SMP (symmetric multiprocessing)(1) for Nvidia’s Simultaneous Multi-Projection; it’s going to lead to lots of confusion with Google searches that turn up articles on SMP (symmetric multiprocessing), even some Nvidia white papers on SMP (symmetric multiprocessing)!
Call Nvidia’s Simultaneous Multi-Projection S-MP, or something else for short, at least until Wikipedia gets a proper disambiguation entry for SMP covering the projection kind in Nvidia’s new nomenclature!
Even on Anandtech they are calling it SMP for short, and the damn overloading of computing acronyms is going to cause confusion even for folks with good Google-Fu skills!
(1) https://en.wikipedia.org/wiki/Symmetric_multiprocessing
That market segmentation is somewhat of a separate thing from releasing something like the TitanX. There was almost no reason for any gamer to buy a Titan X. It mostly just got them a massive amount of publicity that makes AMD cards look bad in comparison, even though the Nvidia part wasn’t a comparable priced product, even if it can be considered a real product. How much of the market is even 980 TIs, much less Titan Xs? It is also designed to reinforce the idea of Nvidia as the high end maker and AMD as the low end. That is nothing but marketing though. For almost all price points, AMD represents the better value most of the time. This is like comparing an auto company that makes a sports car that cost hundreds of thousands of dollars against one that does not. The existence of that sports car might be good marketing, but it is foolish to let it sway your opinion when shopping for a reasonably priced sedan.
Tell that to my wallet when I spent that sum on an 8800 GTX. Memory is short…
Um, the 8800 Ultra launched at $830+ USD. And if you go back even further, the Radeon X850 XT Platinum was $750 USD, and that was in like 2005.
Fair enough, they did exist. But people didn’t praise them as a good value the way some do today. Nvidia’s marketing is still undeniably successful.
No one is exactly praising these as a good value by themselves; they are a “good value” compared to current products because they offer a significant performance increase at a minimal cost increase. It’s normal for the next generation of any product to be a better value than the previous.
You pasted the same comment in many other places; are you working for NV marketing?
If not, then you should never compare the old gen GPUs with the new ones; of course it MUST be faster, and the GTX 970 was faster with much lower power consumption that the original Titan card.
$380 for the market that it’s targeting is way too much!
*than the original titan
WTF no they haven’t. The 80 series cards have been that price for years, and if you look at REAL inflation numbers and the purchasing power of the dollar, they’re cheaper.
It’s almost as if they have to recoup the cost of new architecture R&D or something. OUTRAGEOUS!!!
No word on DX12 game performance ?
They only mention/show Tomb Raider for DX12 games, which is one of the DX12 titles where GeForce can be faster than Radeon right now. DX12 probably doesn’t change much; Nvidia might be using brute force to negate the performance difference in DX12. Also, depending on which side is sponsoring the games, we will see the tide go between Nvidia and AMD from time to time; meaning Nvidia will be faster in the DX12 titles that are sponsored by them.
Any mention of asynchronous compute specs?
The 1080 is so amazing. I want it now!
http://www.amazon.com/1080-Snowboarding-nintendo-64/dp/B00000DMAO
Soooo will a low-profile 950Ti only be slightly faster than a TITAN X?
Nvidia just took a dump in AMD’s mouth
AMD Polaris price $13.99 and a hug good bye
Any news on support for Adaptive Sync or do you still have to buy the other half of the card with a monitor?
If you read the Nvidia 1080 page, I don’t think they’ll be in a rush to support it, since it’s not certified for 1.2a or above.
“Ready” usually means compatible, and DisplayPort is backwards compatible anyways.
This confuses me, is it DP 1.4 or not?
Um, Adaptive Sync isn’t required. It’s an optional part of DP, so a card can be certified for DP 1.4 without it.
But it’s not certified for 1.3 or 1.4.
The 1080 at 2x Titan X performance and 3x efficiency, that’s a bold claim. I hope this is not another PR coup that ends up with underwhelming results.
The show was terrific though. AMD is in full panic mode, I bet; it’s a pretty hard sell for them now. Their efficiency and prices need to reflect the vast shift in performance, and they need to undercut the 1070’s price even if they somehow manage to outperform it with Polaris 10.
AMD needs to play it very smart and sweep the low/mid range, and size up their Vega.
OMG, this is so misleading. I was playing LoL while listening to the stream, peeking from time to time, but now that I’ve seen the slides, it actually says twice the performance on VR games. So this is taking into account the 60% boost from their projection thingy, which means the 1080 is not twice the performance of a Titan X but around 40% faster, and even efficiency won’t be 3x on regular work without that projection thingy. Feel so cheated, lol.
That is actually major, though. I can foresee lots of use cases for that tech that can make games better.
First, their own examples of VR and multiple monitors, of course.
Next, how about playing Paragon in a two-monitor setup, one top-down and the other first-person?
Or creating large virtual rooms for parties.
How about a minimap that is really just a zoomed-out view of the map?
Now, it shouldn’t affect most previous titles, unless there is a way to play them in a VR headset, but this is some very cool stuff.
I need to watch the stream again, but I think that is VR-specific, not Surround, and it doesn’t boost the performance of VR by 60%; it just doesn’t spend the extra performance used to display on two screens. That’s why it says VR games. The projection works differently in Surround: in VR you get one image that you shift with projection to get the second eye; in Surround you still need to render the added landscape.
This is how I understood it, but again, I was just listening and need to rewatch. There is a lot of misleading stuff; anyone just listening to the audio of that event would have a completely different take on it once they see the slides.
Besides, VR games won’t need that much power, because the leading platform isn’t the PC; it’s the PS4, and 90% of the games will be ports from the PS4. Even those made on PC will be tailored to be ported to the PS4 later on. You just cannot compare 1 million Vive/Oculus users to 50 million PSVR users; once again, VR games will be stuck at whatever level the dominant market’s performance is (PS4).
The VR specific implementation uses 8, 4 per eye, projections that are angled to allow them to compute less of the world for the same or similar experience.
It is a simple use case for the 16 projections available in the card. Similar tricks could be done in other use cases, such as single split screen multi-player, where you know the size of the projection has changed, so fit the projection for the appropriate amount of performance.
Games developers are notorious for adapting new tech in ways it was never intended for. These projections are much more versatile than just using them in this one trick for VR. It remains to be seen how it will ultimately affect games, however.
Just read a report of one game using multiple projections for 4K. They segment the outer portion of the screen to a lower resolution, while keeping the detail in the center of the screen, where your eyes are focused most of the time.
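Rough numbers on why that helps so much (a sketch with assumed proportions, not that game's actual settings; the real screen split and scale factors aren't public):

```python
# Hypothetical split: center region at full resolution, outer band at
# half resolution per axis (so 1/4 the pixels). Proportions are assumptions.
def shaded_fraction(center_frac: float, outer_scale: float = 0.25) -> float:
    return center_frac + (1.0 - center_frac) * outer_scale

frac = shaded_fraction(center_frac=0.5)  # half the screen kept sharp
print(f"pixels shaded: {frac:.0%} of a full 4K frame")  # ~62%, i.e. ~37% saved
```

Even a conservative split like this cuts the shading work by more than a third, which is where the headroom for higher frame rates comes from.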
I saw a video of a Sony VR conference for developers, explaining to them how VR games should be done and what to avoid. They talked about something similar to this; I think Valve did also. I don’t think this is it; it might be more complicated than that.
True.
“All games maxed out”, but at 1080p I guess.
The slide shows like 15%-20% more than the Titan X in relative performance. Let’s wait for the benchmarks.
Price is still high, IMO.
Also, at $600 USD we should all be asking for 4K@60 on every game on the planet.
AMD certainly is not in panic mode. nVidia released >300mm2 parts. AMD is working on reportedly 232mm2 and some ~120mm2 parts.
Therefore nVidia and AMD are effectively going to split the market. And AMD now knows nVidia priced new cards pretty high, which gives them opportunity to price higher (=profits).
Also note the press release is written so that only the $700 1080 is going to be available on 27 May.
Um, price them pretty high? They are only 10-15% higher than the previous gen cards that these replace. That price bump is due to the chips being new and more expensive to produce. It’s not an older, mature node like the 970/980 cards using widely available GDDR5 RAM; the 1080 uses GDDR5X, which is not yet produced in mass.
He was referring to 2x the performance of the Titan X in VR using the multi blah rendering blah blah blah. The 1080 is absolutely not 2x the performance of the Titan X; it is around 25% faster on Nvidia’s website when benchmarking DX11/12 games.
What about the fine-grained GPU processor thread scheduling on these consumer SKUs? The P100 has improved thread scheduling; what about these consumer SKUs? It’s not only about the benchmarks: more actual hardware information, please!
What is the Founders Edition that is $100 more? Is that instead of a higher-specced Ti version?
I think it’s just clocked higher, or some kind of overclocking card; it’s most likely the same core. It could also come with a box of swag or something.
Reference design price.
From what I could gather, it’s not just a “reference design price”, it’s the price for the new reference cooler which will not be available from anyplace other than Nvidia. So the AIB partners will be able to take the reference design and slap custom coolers on them and charge less, or more, or whatever they want to do.
Looks like I was right, as confirmed today by Gamers Nexus.
The Founders one has more unlocked overclocking possibilities and likely a better-binned chip to hit higher clocks.
shiiiiit im getting a 1080
I have a 650 watt psu. Do you think with the lower TDP of these cards that SLI is now feasible with that PSU?
Why don’t you just get an 800-1000 watt power supply? You are talking about spending $1200 on video cards. What’s an extra $120 for a more than powerful enough PSU?
Seriously, what a stupid thought process. And no, you sure as fuck SHOULDN’T run SLI 1070’s, or 1080’s on a single 650w PSU Dipshit, duh.
I think you could have made your point without all the personal insults, which there is no call for.
The 1080 has a single 8-pin power connector, so each card can pull at most 150 W from the connector plus 75 W from the slot: 2 x (150 + 75) = 450 W worst case, and only 2 x 180 = 360 W at the listed TDP.
An i7-6700K is about 91 W.
Your other components are going to use less than 100 W.
So a single 650 W PSU would handle two 1080s or two 1070s, dipshit.
Also go fuck yourself for being a champion asshole.
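Budgeting it out with the listed TDPs (a rough sketch; the "rest of system" figure is an assumption, and transient draw can spike above TDP, so leave some margin):

```python
# Rough SLI power budget for a 650 W PSU, using listed TDPs.
budget_watts = {
    "2x GTX 1080 (180 W TDP each)": 2 * 180,
    "i7-6700K": 91,
    "rest of system (assumed)": 100,  # board, RAM, drives, fans
}
total = sum(budget_watts.values())
print(f"estimated draw: {total} W on a 650 W PSU ({total / 650:.0%} load)")
```

At roughly 85% load the 650 W unit fits on paper, though reviews with measured whole-system draw are still the safer guide.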
Given that the GTX 1080 is rated considerably lower TDP than a 980Ti I would say you will be fine running SLI at 650W PSU given the few benchmarks that I found online.
http://www.eteknix.com/nvidia-geforce-gtx-980ti-6gb-sli-review/14/
http://us.hardware.info/reviews/6292/16/nvidia-geforce-gtx-980-ti-sli–3-way-sli–4-way-sli-review-ultra-hd-in-ultra-quality-test-resultsnpower-consumption
http://www.overclock3d.net/reviews/gpu_displays/asus_gtx980_ti_matrix_sli_review/4
Several of the results show the entire system draw of a rig with 2x 980Tis in SLI is in the 500-550W range depending on the benchmark/game running. The last source shows the ASUS MATRIX cards actually requiring 721W but the Strix cards in SLI result in 644W. The TDP of the 1080 is considerably lower than the 980Ti, 180W vs 250W.
Of course, I would wait for reviews. I am sure somebody will have benchmarks up soon, and there should hopefully be one or two sources that will show an entire system draw in SLI.
You seem to be in the know…if you have a minute..
I have a new 2016/Alienware X51 R3 with the GTX 970. My question is will the 1070 have the same or lower power draw than the 970? The 970 is 140 W
The efficiency of the PSU should be considered too, though it’s not like it’s going to be pulling the full 650 watts at all times just because the cards are running.
Considering I’m running a GTX 690 at this point, I think this will be a great upgrade to take. Honestly, it would be great if they did a 1090, but I doubt that will happen, as the Titan Z I believe was the last dual-chip Nvidia GPU. Also, since I’m not yet running full 4K, it should be a very nice bump for games running at 3440x1440.
One question about the DP: if something is “ready” and not certified, does that mean they can drop features? Really the exciting thing is the ability to actually see HDR; even if the games are using it, we still can’t see it, as monitors and Windows(?) don’t support HDR at this point.
To be certified, you have to be compliant with all the latest features. DP is backwards compatible; you can’t skip features and thus be certified with the latest and not the ones below.
G-Sync works on DP 1.2, and the way they are using it might be causing them issues getting certified with anything above that.
Yeah cause SLI works in what, 3 games now? LOL I mean, I have 3/4-way Titans, and 2-3-way 980Ti’s. Unless I’m playing BF4 on my 144Hz BenQ XL2420G, I’m not enabling SLI.
Works in more games, and better, than CF does.
You have 3/4 cards and 2-3 cards? I don’t think you have either.
One 1080 is like 2 690s in quad sli if not better.
I was low-key kinda hoping for benchmarks today. What’s good pcper?
Since Ryan is on vacation, even if they had the cards, and I wouldn’t expect that before a week before release, they haven’t had time to do the benchmarks yet. Ryan ran out of time to review all the games he wanted to for the Radeon Pro Duo, let alone a card that has just been announced.
Wasn’t Ryan at this event? Vacation for Ryan means going to Nvidia events.
Nope, Ryan went to a beach somewhere, at least according to last week’s podcast.
Ryan could be at some BENCH, doing some sort of BENCH like things under some sort of you are not authorized to say YET sort of agreement! He may be playing volleyball at this BENCH, but it’s mostly involving volleys of various sized and shapes of objects that may or may not be actual volleyballs, but could in fact be volleyball shaped things, or other more streamlined objects with various nougat like fillings wrapped in full metal jackets that when set off expand at beyond the supersonic/hypersonic speed regiments associated with NOT staying in one piece!
In fact Ryan could in fact be at the beach, at some BENCH, and doing various things one does at the beach, at some BENCH! And while at that BENCH at the Beach his activities may involve some or all of the things stated in the first paragraph of this post!
That cooling shroud is Transformers af but the rear plate is damn sexy.
Nah, it looks like something Batman would drive.
Lucius Fox would never make such an atrocity.
The question for me is whether or not the performance warrants an upgrade from my existing 980ti or not.
Probably not, unless this card overclocks extremely well, which is a slight possibility as we don’t yet know much about this new process. Upgrading to consecutive generations is never really “worth it”.
Unless you’re going VR right away, it would be better to wait for GP100/HBM2 based cards, if you’re going to upgrade.
I have a 980Ti as well… And plan to wait… though that new 1080 is really tempting for lower heat generation…
The VR 2x claim has an asterisk: when using Pascal special features. This means a game must support those features. So, just like their early VR SLI claim, this is also not going to be seen anytime soon. 980 Ti to 1080 will be no different in VR than on the flat screen for a while, IMO. Wait for the 1080 Ti.
With a 980 Ti? No, it doesn’t. You would have to wait for the 1080 Ti before you could say it’s worth it.
I believe this card is targeted more towards the 980 and below, but there are some people willing to upgrade even if the performance increase is only a mere 10%. I personally own a 970 (gaming on a 1080p monitor), but I wonder what I would even play with that much performance, lol. Some of the games I’ve played simply have performance issues regardless of how powerful your hardware is (XCOM 2), and right now I’m replaying Dishonored. Maybe I won’t upgrade until the 2018 time frame.
Exactly my thoughts. As long as it allows me to play smoothly at (near-)max settings, no reason to upgrade.
I expect to upgrade GPU when I upgrade to 4K panel.
It seems, based on cores and frequency, the 1080 is about 25% better than a 980 Ti and 15% better than a non-reference 980 Ti like my AMP Extreme. But I’m pretty sure the non-reference 1080s will be beastly.
Disappointing to see only DVI-D on this card. Some people still use VGA-only monitors as secondary displays.
It would be nice to have some kind of modular/customizable/DIY display outputs, since people’s needs can vary so widely and cards usually converge on one jack-of-all-trades setup.
Those some people need to dump their VGA-only monitors. You can’t even get a sharp picture above a certain resolution on VGA, and digital monitors are cheap.
The DVI / VGA adapters work well enough though.
That’s not really true. As with all analog electronics, the quality depends on the hardware and implementation. Cheap monitors might not run well at high resolution (for a variety of reasons), but some equipment (high-end CRT projectors) could run nearly 4k (3200×2560) over VGA. I’m typing this right now at 1440p over VGA.
VGA-only monitors may not be worth buying today, and almost anyone buying a new mid or high-end GPU will have something better to use as a primary display, but that’s no reason to discard a monitor that still works fine and can be used as a secondary or tertiary display. Cheap monitors exist, but they have their own problems and nothing is cheaper than just continuing to use what you already have. Monitors become obsolete very slowly, especially because you can use several at a time, so you can just relegate older ones to less demanding roles. Also, modern monitors look terrible when displaying anything but their native resolution, so if you play any older games, you would want to have another monitor than can run at their resolution.
Converters can be expensive, and only support limited resolutions (generally 2048×1536 at 60hz).
The great thing about PC building is that there are generally countless options so you can set up exactly what you want, and nothing more. This is just one area that needs some improvement in terms of options.
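For a sense of where those analog limits come from, a back-of-envelope pixel-clock estimate (the ~25% blanking overhead is an assumption; real timings vary by mode):

```python
# Approximate analog pixel clock: active pixels x refresh rate x blanking overhead.
def pixel_clock_mhz(w: int, h: int, hz: int, blanking: float = 1.25) -> float:
    return w * h * hz * blanking / 1e6

print(f"2048x1536@60: ~{pixel_clock_mhz(2048, 1536, 60):.0f} MHz")  # ~236 MHz
print(f"3200x2560@60: ~{pixel_clock_mhz(3200, 2560, 60):.0f} MHz")  # ~614 MHz
```

That first figure sits near the DAC limit of typical converter chips, which is one plausible reason adapters tend to top out around 2048x1536@60, while the near-4K CRT case needed the much faster DACs found on high-end hardware.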
Is it possible to use a second GPU that has VGA and output to a second monitor using it?
On AMD, yes; probably on Nvidia as well, if you have the space, power, and money for a second GPU, and if both cards use the same driver set. That’s probably the best solution, if it’s possible.
I used to be able to do it on my Intel iGPU, but this motherboard (for Devil's Canyon) has digital-only outputs. Also, Skylake retired VGA support, so it won't be an option going forward.
They can buy active or passive adapters to fill that gap. I find it a little odd to use such a card with an older spec, but if you’ve got legacy hardware laying around…
Here I go imagining how I’ll play games to justify getting one again.
I like PC gaming, but I can honestly say that I loathe most of you ignorant know-it-alls who dominate internet comment boards all over the net.
Reading these comments is about as useful as jumping into a dumpster that’s on fire.