As we have talked about on several occasions, altcoin mining (anything that is not Bitcoin specifically) is a force on the current GPU market whether we like it or not. Traditionally, miners have bought almost exclusively AMD-based GPUs, due to their performance advantage over the NVIDIA competition. However, with continued development of the cudaMiner application over the past few months, NVIDIA cards have been gaining performance in Scrypt mining.
The biggest performance change we've seen yet has come with a new version of cudaMiner released yesterday. This new version (2014-02-18) brings initial support for the Maxwell architecture, which was just released yesterday in the GTX 750 and 750 Ti. With support for Maxwell, mining starts to become a more compelling option with this new NVIDIA GPU.
With the new version of cudaMiner on the reference GTX 750 Ti, we were able to achieve a hash rate of 263 KH/s, impressive when you compare it to the previous-generation, Kepler-based GTX 650 Ti, which tops out at about 150 KH/s.
As you may know from our full GTX 750 Ti review, the GM107 overclocks very well. We were able to push our sample to the highest configurable offset of +135 MHz, with an additional 500 MHz added to the memory frequency and a 31 mV bump to the voltage offset. All of this combined for a ~1200 MHz clock speed while mining and an additional 40 KH/s or so of performance, bringing us to just under 300 KH/s with the 750 Ti.
When we compare the 750 Ti to AMD GPUs and previous-generation NVIDIA GPUs, we start to see how impressively this card stacks up considering its $150 MSRP. For less than half the price of a GTX 770, and roughly the same price as an R7 260X, you can achieve the same performance.
When we look at power consumption based on the TDP of each card, the comparison becomes even more impressive. At a 60W TDP, no other card comes close to the 750 Ti's mining performance at that power level. This means you will spend less to run a 750 Ti than an R7 260X or GTX 770 for roughly the same hash rate.
Taking a look at the performance per dollar ratings of these graphics cards, we see the two top performers are the AMD R7 260X and our overclocked GTX 750 Ti.
However, when looking at the performance per watt differences across the field, the GTX 750 Ti looks even more impressive. While many miners may think they don't care about power draw, it can help your bottom line. By being able to buy a smaller, less efficient power supply the payoff date for the hardware is moved up. This also bodes well for the future Maxwell-based graphics cards we will likely see released later in 2014.
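For those who want to reproduce the math, here is a minimal sketch of how those two ratios work out, using figures quoted in this story: the 750 Ti hashrates and TDP from above, the R9 270X hashrate and TDP from the build section below, and the $280 base price we used for the 270X in the performance-per-dollar chart. All of the prices are snapshots and will drift with the market.

```python
# Rough KH/s-per-dollar and KH/s-per-watt comparison using figures quoted in
# this article. TDP is used as a stand-in for measured power draw, which (as
# several comments below point out) is only an approximation, especially for
# an overclocked card.
cards = {
    # name: (hash rate in KH/s, price in USD, TDP in watts)
    "GTX 750 Ti (stock)": (263, 150, 60),
    "GTX 750 Ti (OC)":    (300, 150, 60),   # "just under 300 KH/s" after the overclock
    "R9 270X":            (450, 280, 150),
}

for name, (khs, price, tdp) in cards.items():
    print(f"{name:20s}  {khs / price:5.2f} KH/s per dollar   {khs / tdp:5.2f} KH/s per watt")
```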
To illustrate this example, we put together two builds of mining computers that should be capable of similar hashrates:
| | R9 270X Mining Rig | GTX 750 Ti Mining Rig |
|---|---|---|
| Processor | AMD Sempron 145 - $55 | AMD Sempron 145 - $55 |
| Motherboard | GIGABYTE GA-990FXA-UD3 - $135 | GIGABYTE GA-990FXA-UD3 - $135 |
| System Memory | Kingston ValueRAM 4GB 1333MHz - $40 | Kingston ValueRAM 4GB 1333MHz - $40 |
| PCIe Riser Cards | 1 x 16X-to-1X Converter - $10 | 3 x 16X-to-1X Converter - $30 |
| Power Supply | Corsair CX750 Builder Series - $80 | Corsair CX500 Builder Series - $50 |
| Graphics Cards | 4 x Radeon R9 270X - $1200 | 6 x MSI GTX 750 Ti (N750Ti) - $990 |
| Total Price | $1520 - Full Cart on Amazon.com | $1300 - Full Cart on Amazon.com |
In these two builds, the core platform stays the same, built around the single-core AMD Sempron 145. While this processor would be essentially useless for many other tasks, coin mining on a GPU is not a CPU-intensive task, so we can get away with one of the cheapest CPUs on the market.
We chose the GIGABYTE GA-990FXA-UD3 because it was the cheapest motherboard we could find for this platform with six PCI Express slots.
As you have probably noticed, not all of the PCIe slots on this motherboard accept an x16 card, and even the ones that do don't have proper spacing for dual-slot coolers. To remedy this, we have included the appropriate riser adapters.
Because we are using PCIe risers, there is no case included. You would most likely be best served by building an open-air test bed for the system out of milk crates, shelving, wood, or some other building material. Just remember, it doesn't have to look pretty to be effective!
There is also no storage option included. You could use any spare hard drive you have lying around, or even install a Linux distribution to a thumb drive, so we consider the cost of storage negligible.
First, we have a more traditional build using 4 x R9 270X cards, which we found available on Amazon right now for just above $300 each. With four of these cards running at about 450 KH/s each, we should have a 1.8 MH/s machine. With a power draw of 150W from each card, we get a total of 600W for the GPUs alone. Throwing in another 75W for the 45W TDP processor and any additional overhead, we come to approximately 675W of power draw for the entire mining rig.
At a total cost of around $1520, this machine would have a payoff period of about 113 days at current Dogecoin rates and 1.8 MH/s.
Our second build is based on the GTX 750 Ti. This time we instead opted for 6 x 750 Ti cards for a total of $990, which is still significantly lower than the $1200 for four 270Xs. With six 750 Ti cards, the estimated GPU power draw is only 360W, just over half the power draw of the 270X machine. Adding in the same 75W for additional system components, the total estimated power draw works out to 435W, which allows us to purchase a cheaper power supply.
At a total cost of around $1300, this machine would have a payoff period of about 97 days at current Dogecoin rates and 1.8 MH/s.
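For transparency, here is a minimal sketch of the payoff arithmetic behind those two estimates. The revenue figure is an assumption backed out from the numbers above (roughly $7.45 per MH/s per day at the Dogecoin rates in effect when this was written) and will move with the coin price and network difficulty; electricity cost is shown separately since the estimates above ignore it.

```python
# Hypothetical payoff-period estimate for the two rigs described above.
# ASSUMPTION: ~$7.45 of revenue per MH/s per day is simply the figure that
# reproduces the 113- and 97-day estimates; it changes constantly with the
# Dogecoin price and network difficulty.
REVENUE_PER_MHS_PER_DAY = 7.45   # USD/day per MH/s (assumption)
POWER_COST_PER_KWH = 0.10        # USD/kWh (assumption, not part of the original estimate)

def payoff_days(hardware_cost, hashrate_mhs, draw_watts, include_power=False):
    """Days until mining revenue covers the hardware cost."""
    daily = hashrate_mhs * REVENUE_PER_MHS_PER_DAY
    if include_power:
        daily -= (draw_watts / 1000.0) * 24 * POWER_COST_PER_KWH
    return hardware_cost / daily

print(f"R9 270X rig:    {payoff_days(1520, 1.8, 675):.0f} days (ignoring electricity)")
print(f"GTX 750 Ti rig: {payoff_days(1300, 1.8, 435):.0f} days (ignoring electricity)")
print(f"R9 270X rig:    {payoff_days(1520, 1.8, 675, include_power=True):.0f} days (net of power)")
print(f"GTX 750 Ti rig: {payoff_days(1300, 1.8, 435, include_power=True):.0f} days (net of power)")
```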
As you can see, by cramming more of the lower-end but impressive GTX 750 Tis into a single machine, you can build a similarly performing machine for less money than the AMD alternative, which runs contrary to all the coin-mining advice given up to this initial release of Maxwell. In addition, performance of the Maxwell-based machine should only improve as the Maxwell kernel for cudaMiner is developed further, whereas OpenCL performance for AMD mining is likely about as optimized as it will ever be.
An additional factor to keep in mind is the fluctuating cryptocurrency market. Just because today's payoff estimates say you could be making a profit in 80 days doesn't mean that will remain true in the future. While the estimate could get better, it also could get a lot worse, leaving you with a lot of hardware to sell off down the road.
While no one is sure how profitable mining will be when the high-end Maxwell GPUs hit the market, NVIDIA could end up with the same stock shortages and inability to deliver GPUs to gamers that we see AMD dealing with today.
Oh yeah sure make all the GPU prices go up!!!
While I’m impressed, I don’t like it. I’d like to upgrade my GTX 670 when the higher-end Maxwell chips hit, but the way things could and probably will go, I won’t be able to unless I plan on spending more than the MSRP, just like anyone who wants an AMD card at the moment. It’s gonna be really bad if both NVIDIA’s and AMD’s cards are way above MSRP.
Great review, and I recommended the card to a friend instead of a 7790 (he has an OEM system like the ones you tested with the 750 earlier).
Better start reserving your 800 Series cards early. I’m talking pre-order and pad the delivery man’s wallet early.
Yay,
Now coin mining can ruin the price of all brands of graphic cards. Lol
It’s the greedy resellers and dumbasses that are willing to pay the extra that raise prices, not mining 😉 If no one bought at a higher price, no one would sell at a higher price.
Then nobody would get a mid range graphics card.
@pt3322
Overclocking the 750Ti is handled via the GPU Boost 2.0 feature:
– nVidia GPU Boost 2.0 only overclocks within the limits of the TDP design of the card.
– most 750 Ti card designs draw their power exclusively from the PCI express bus (no 6 pin power connector provided), so this overclocking limit cannot currently be raised above 100% TDP (maybe possible later with a modded VBIOS).
Other nVidia GPU models will allow you to raise the TDP limit for overclocking significantly above 100%, but this one refuses to do so.
I’m obviously glad that Nvidia finally offers good mining cards.
But if I were you, I’d look at how much power those rigs pull from the wall with a Kill-A-Watt or similar before presenting a bogus “Mining performance per Watt” graphic based on TDP.
It’s obviously nonsense to claim that a stock and an OC’ed 750 Ti use the same power yet give more KH per W. But somehow you missed that.
Miners also tend to undervolt their cards to reduce noise, heat and power cost.
Edit: you are suggesting that the current prices are $150 for a 750Ti and $300 for a R9-270x? In Europe, a 750Ti and a Gigabyte or Sapphire R9-270 cost the same.
Edit2: Oh, I guess you picked the cheapest 750Ti and the most expensive, worst-rated R9-270x at Amazon by accident too.
We aren't using European prices here. And in reality, the focus of the story is much more on the power efficiency side of things than actual cost. As we have seen in recent months, GPU prices swing WILDLY from day to day.
Also, just to point it out, our Performance per Dollar graph uses a base price of $280 for the R9 270X. Only the link to Amazon uses a higher-priced unit.
And to be honest, you failed miserably on that. TDP is not real power consumption. AMD and Nvidia use TDP in a different way. AMD TDP is more like a theoretical maximum value you can almost never measure in real applications. Nvidia TDP is more like a typical value you actually can measure in real applications. Sorry to say that but your charts are quite worthless.
Your concern about power measurement versus power rating is valid, but our power testing gear only works "at the wall" and as such takes the entire system into account. That can be interesting for some testing and some analysis, but for this story we decided to use TDP.
Mining is among the most GPU-intensive workloads there is, and in my experience the TDP is a very close approximation of the expected power draw for each card.
I have to agree you should have tested the power draw yourself, who knows what Nvidia and AMD classify their TDP as, and if scrypt mining is considered part of that evaluation.
I certainly don’t mind the at the wall readings, as long as the PSU stays the same over the whole test, idle vs. load will still tell a close enough story about the real life power usage. I hope you’ll add it for the next overview!
AMD has been known to underrate their draw, at least on the CPU side, where a CPU with a claimed 125 W TDP ends up being more like 150 W.
Good article, only gripe is the comparison rigs. Should have used regular 270’s for ~250 each and rated at 470+ kh/s since that’s the standard most miners are building around. The x versions aren’t any faster when overclocked.
Very happy to see more competition in this space. The 750 ti and r7 265 should help reduce the premiums on all the other r9 cards (I hope)
I think your hashrates for the older Nvidia cards are off. They seem to be pulled from the litecoin wiki, which doesn’t represent the newest version of cudaminer from 12/19 onward.
Here are some better numbers:
http://www.reddit.com/r/dogemining/comments/1wuvny/new_cudaminer_released_20140202_1020_increase_in/
We actually re-ran all the GeForce results just yesterday to make sure they were accurate with the same version of the CUDA miner.
You are still quite off from what others are reporting. I am getting around 350kh/s stock with a 770 and over 390kh/s overclocked. You can’t leave cudaminer settings on auto as it rarely finds the correct setting.
Could be those were the numbers at the time the tests were done; cudaMiner may have been optimized to get better numbers out of the 770 since it was tested.
For you to achieve the same desired hash rate, it will cost more to build with overclocked GTX 750 Tis while consuming the same amount of electricity. (Comparing to an undervolted R9 280X at 750 KH/s, $420 before tax and $475 after tax at 13%.)
I like the fact that nvidia has made some effort to make their cards mine better. This new release looks good in a sense that it doesn’t require big power supply cos’ it uses less power. 6x 750 ti seems solid option now, I guess they are pretty quiet too, only problem is the room.
If you can achieve 294 KH/s with 640 CUDA cores, we will probably be way over 1000 KH/s with high-end cards when they arrive. That is, if hashrate scales well.
I have to point out that in this article you have unoptimized settings, at least for the 290/290X. My 290s (non-X) do 830 (Elpida) and 841 (Hynix) KH/s at stock (using Windows 7). I use the settings from here http://tinyurl.com/pyu7elz, except the overclocks; these things squeal enough at stock.
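For what it's worth, here is a back-of-the-envelope version of that scaling argument, assuming hashrate scales linearly with CUDA core count at similar clocks; the larger core counts below are purely hypothetical stand-ins for unannounced high-end Maxwell parts.

```python
# Naive linear extrapolation from the GTX 750 Ti's 640 CUDA cores.
# ASSUMPTION: hashrate scales linearly with core count at similar clocks and
# memory bandwidth never becomes the bottleneck -- both are big "ifs".
BASE_KHS, BASE_CORES = 294, 640

for cores in (1280, 1920, 2560):   # hypothetical larger Maxwell configurations
    print(f"{cores} cores -> ~{BASE_KHS * cores / BASE_CORES:.0f} KH/s")
```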
Great write up!
That you would even validate that this is an issue by basing your article on the *coin world is ridiculous and destroys any respectability you might have had. Get back to real articles and leave these scams to others.
Wow,
How could one write about GPU hardware and not acknowledge the influence of cryptocoin mining? Putting blinders on doesn’t help you, and it definitely wouldn’t help PCPER. Their articles about mining performance just show how progressive and open PCPER writers are about what’s happening in computer hardware. Would you really want PCPER to suffer the consequences of not keeping up with changes in the market?
PCPER, do you think prices of AMD cards will drop now that nVidia offers something competitive for mining? Or will it really be inflated cards across all GPU hardware as people are predicting and commenting on here?
Yaaay! Now No one will offer good cards at a good price!
Fuck you all
lol Nvidia’s already been milking the market since the day the 600 Series dropped. Only cards that were somewhat decent were the 660 and 680. Everything else got milked and has continued ever since.
I have that feeling we are in for some more rude pricing schemes with R-300 and 800 Series cards to follow.
YAY I don’t need a new top of the line gaming gpu every 3 months fucktard!
I am waiting for your article about
“How selling more cards to miners is bad for Nvidia”.
It made Ryan curse on the last podcast. It was a pleasant surprise to see the anger and frustration finally boil over. 😀
So an OC’ed 750Ti has the same rated TDP as a stock one?
You might want to revisit those hash/watt numbers and maybe use actual values instead of stated ones
The value used was based on TDP, and if they say it was the same TDP, it probably assumes the wall draw was the same.
I really enjoy having a mining option from Nvidia.
But for this article, did you actually build the two rigs, or did you just spec them out on paper?
You show a screenshot of a 750 Ti hashing away, but that is only one card.
Would have been a nice article if it was based on 2 real rigs with power consumption measured with a kill-a-watt.
Cold? turn yer machine into a mining rig, problem solved.
die crypto coin just die
“By being able to buy a smaller, less efficient power supply the payoff date for the hardware is moved up.”
Are you insane? If you are any type of miner, you know that you will be using multiple cards and multiple rigs. In this case you will have to build even MORE rigs because you are running such a gimped card, which multiplies your costs for motherboards, PSUs, and RAM. That moves your payoff date even FURTHER back.
It’s hilarious seeing all these articles pop up about this crap card and the writers clearly don’t have a clue what they are talking about.
There is one obvious flaw with your evaluation. The board you are using in both rigs cannot run six NVIDIA 750 Ti cards, only three. The power for each card (60 W) has to be provided through the PCIe slot, and you can only physically fit three cards because of the dual-slot coolers.
Most of these cards do not have an external 6-pin power connector, so powered PCIe risers are out of the question too, as they can’t provide enough power (unlike 270 or 270X cards, which can be used with risers just fine).
Casual users that want to run one, maybe two cards would be better off with 750ti indeed, as the price/performance/power ratio is great.
You’re not reading the article closely enough. Look at the list of parts and see the risers:
http://www.amazon.com/gp/product/B00I6NC5OC/?tag=pcper0a4-20&tag=pcper0a4-20
They let you use all six slots, and the Molex connectors on the risers power the cards, so there’s no need for a 6-pin connector.
They include risers, but since the cards do not have PCIe power receptacles, they have to draw power from the motherboard. Please, someone correct me if I am wrong, but six cards pulling 60 watts each seems way out of spec for the PCIe power a motherboard can supply, which is 75 watts.
Replying to self: I see now they are powered risers. But that still sounds dangerous… I have read reports of powered risers feeding power back into the PCIe power from the ATX connector and causing all sorts of issues. Again, please correct me if I’m wrong!
It’s not 75 watts for ALL the slots; each slot can provide 75 watts. If it were 75 watts across all slots, you would have seen problems with SLI/CrossFire rigs.
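To put some numbers on that, here is a quick sanity check of the per-slot budget, using the 750 Ti's 60 W board power from the article and the 75 W allowance the PCIe spec gives a full-size graphics slot; the powered, Molex-fed risers in the parts list shift that load off the motherboard entirely.

```python
# Per-slot power check for the 6 x GTX 750 Ti build discussed above.
# ASSUMPTION: each card draws roughly its 60 W rated TDP through its slot or
# riser; the PCIe spec allows up to 75 W of slot power for a graphics card.
CARD_DRAW_W, SLOT_LIMIT_W, NUM_CARDS = 60, 75, 6

print(f"Per slot:  {CARD_DRAW_W} W of {SLOT_LIMIT_W} W allowed "
      f"-> within spec: {CARD_DRAW_W <= SLOT_LIMIT_W}")
print(f"All cards: {CARD_DRAW_W * NUM_CARDS} W total, spread across {NUM_CARDS} slots/risers,")
print("so no single slot exceeds its limit; powered risers feed the cards from the PSU instead.")
```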
How would the other 3 cards hold in place? What kind of computer case would you need to house all 6 cards? I have a problem seeing how this is even possible.
Just saw this. Here’s one possible solution:
Riggit Revision 9 Custom Mining Frame & Testbench 6 Triple Slot Cards XL-ATX E-ATX Dual PSU Mount
http://www.ncix.ca/products/?sku=94366
It’s fascinating to see this 750 Ti perform at the same level as my 680.
I wonder how it would perform with Blender or other CUDA compute benchmarks.
What part of Blender is using CUDA? Blender mostly uses OpenGL (in the 3D editor) and the CPU for CPU rendering, though it does have the Cycles renderer, which uses CUDA in some builds AFAIK. I would love it if Blender would use OpenCL and get more 3D editing functions running on more CPU cores, with OpenCL acceleration of mesh editing!

Rendering is not so bad as far as workflow is concerned in Blender; you just start the render and come back when it’s done. But editing high-poly-count meshes, man, waiting for a simple mesh relax and other mesh operations can really slow down the creation process. I cannot wait for Blender to start taking advantage of HSA-type systems/drivers and to use OpenCL and the like to speed up mesh editing.

Photoshop gets more benchmarking than any of the open source alternatives, and I wish Blender benchmarking would become standard for any new graphics card review, including OpenCL, OpenGL, and other tests, even something as simple as loading a high-resolution scene with lots of meshes in the editor to see the maximum polygon count a graphics card can comfortably handle in Blender’s 3D editor (OpenGL). Graphics-wise, most benchmarks are about rendering, but few if any cover complex high-poly mesh editing performance and its completion times. Image filtering gets its share of benchmarking too, but mesh editing tasks can be just as processor-intensive.