Having started your journey with Ryan's quick overview of the performance of the 1800X, and anxiously awaiting our further coverage now that we have both the parts and the time to test them, you might want to take a peek at some other coverage. [H]ard|OCP tested the Ryzen 7 1700X, the processor many may be looking at due to its more affordable pricing. Their test system is based on a Gigabyte AX370-Gaming 5 with 16GB of Corsair Vengeance DDR4-3600, which ran at 2933MHz during testing; Kyle reached out to vendors, who assured him that an update making 3GHz reachable will arrive soon. Part of their testing focused on VR performance, so make sure to check out the full article.
"Saying that we have waited for a long time for a "real" CPU out of AMD would be a gross understatement, but today AMD looks to remedy that. We are now offered up a new CPU that carries the branding name of Ryzen. Has AMD risen from the CPU graveyard? You be the judge after looking at the data."
Here are some more Processor articles from around the web:
- AMD's Ryzen 7 1800X, Ryzen 7 1700X, and Ryzen 7 1700 CPUs @ The Tech Report
- AMD’s moment of Zen: Finally, an architecture that can compete @ Ars Technica
- AMD Ryzen 7 1800X CPU Review: The Wait is Over @ Modders-Inc
- The AMD Ryzen 7 1800X Performance Review @ Hardware Canucks
- The AMD Ryzen 7 Performance In 3D Rendering & Video Transcoding @ TechARP
- AMD Ryzen 7 1800X @ Kitguru
- AMD Ryzen 7 1800X @ Guru of 3D
- AMD Ryzen 7 1800X, 1700X, and 1700 Processor Review @ OCC
- AMD Ryzen 7 1800X Linux Benchmarks @ Phoronix
Too bad that, unfortunately, some self-proclaimed "tech experts and reviewers" out there gave in to those Intel e-mails in the end… including…
Can you please elaborate?
This is the kind of ugly truth that sites such as "Tom's Hardware" and people like Jay don't want you to find out about… by any means "necessary". Including…
Neither of these articles provides any evidence of the tactics you are describing, only "alleging" that such messages were sent. Could you link any verified leaks?
“Ignorance/delusional denial of the harsh truth is bliss”, huh? Like I’d expect anything else from you people.
You're pretty gullible, believing anything without any proof.
Also, get off the circle jerk with Jay and try contributing something other than ignorance.
>> You're pretty gullible, believing anything without any proof.
Except Intel was fined years ago for doing the same thing. Maybe you are just too young to remember.
The 1700 results from OCC look pretty good.
It's wild that they got a better OC on the $329 model than on the $499 1800X. Not by much, but it makes you wonder what kind of binning AMD is doing…
Not many games were tested, but it shows that with DX12, engines might plateau quickly.
The difference between an FX-8370 and an i7-7700K is less than 3fps at 1050p.
I wonder if this will become the norm?
Vulkan also seems to show Ryzen doing better… in contrast, it is horrible with OpenGL.
This seems to indicate Intel has the better memory subsystem.
And then I wonder, might faster RAM on Ryzen make a huge difference in some games?
Alright, put on your tin hats.
Doesn't it seem weird that Microsoft delayed their monthly update this month? An update that could have optimizations for Ryzen? Maybe that could be why some results seem odd.
From the Reddit AMA we know that there are a lot of optimizations that need to be made, especially with regard to SMT. Right now, game engines and OSes have optimizations for Intel's HT that they don't have for SMT. That leads to the difference with SMT on and off, as well as Ryzen performing worse in games while it hovers at 30-50% utilization.
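As a rough illustration (my own sketch, not anything from the AMA) of one SMT-aware choice an engine can make: size a pool of heavy, CPU-bound workers by physical cores rather than logical threads, so two hot threads aren't contending for one core's shared execution resources. The `smt_ways=2` default is an assumption matching the 2-way SMT of both Ryzen and Hyper-Threading.

```python
# Hedged sketch: pick a worker count for CPU-bound jobs based on
# physical cores, not logical threads. smt_ways=2 assumes 2-way SMT.
import os

def cpu_bound_worker_count(smt_ways=2):
    logical = os.cpu_count() or 1   # logical CPUs (threads); may be None
    # One heavy worker per physical core, never fewer than one
    return max(1, logical // smt_ways)
```

On a stock 1800X (8 cores, 16 threads) this would size the pool at 8 heavy workers instead of 16.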
Optimizations that AMD is responsible for, either directly or through collaboration with, for example, Microsoft.
These are things that should have been ready months before release.
Not very easy when AMD's engineering samples had some features that were not working properly, making it hard to develop for them months in advance. There is a lot of very new CPU thread-priority IP in the Zen core (including SMT for the very first time), and new Infinity control fabric IP in AMD's Zeppelin die, which is used for both the Zen server and Ryzen consumer desktop variants. That fabric IP handles power management (CPU and on-die), connects up the CCX units, and allows for cross-CCX CPU core coherency.
By the time Zen+ is released to market, AMD and its motherboard, gaming/firmware/software, and OS partners will have had a full first generation of Zen platform and micro-architectural tweaking experience, which will make complaints about release issues more valid. This first Zen/Ryzen micro-architecture and its motherboard platforms are brand new for the desktop system-builder market, the server market (Zen/Naples and other Opteron SKUs), and the lower-tier consumer (6- and 4-core) SKUs, and they are still under active tweaking and development.
AMD is damned if they do and damned if they don't, but only by the simple minds that do not understand the real problems every processor maker experiences with a brand-new CPU/motherboard/software ecosystem at RTM. Then there are always the folks with an agenda who do know better but are tasked with spinning negative on the competition.
Really conflicted about Ryzen. My plan was to buy the 1700X today (to replace my i7 4740K), but the gaming benchmarks are just not where I had hoped.
And I kind of feel like buying a 7700K is a dumb thing to do right now.
Not 4740K, meant 4770K. Stupid numbers.
The gaming numbers will improve with better software development if you are gaming at low res, and I doubt you will notice much difference, if any, in the real world. Otherwise, fucking eight killer cores and 16 threads to run so many things at once, or for programs that take advantage of multiple cores and threads. I think it will be great for years and only get better, so I think it is well worth the premium over the 7700: I guess $99 if you are thinking about the 1700X. Enjoy whichever way you decide to go.
Too early to tell, as too many games and other programs are still not optimized for Ryzen, but if all you do is game then maybe the 7700K is what you want. Kaby Lake does have a wider decoder and a little fatter FP back end in its core design, plus higher clocks.
But if you are planning on doing any tasks that like lots of cores, then the 1700X may be what you need, and there will be improvements to all the just-released Ryzen SKUs after more optimizations are done for the games and other software that are currently not optimized for Ryzen.
There are some features that were planned for Zen/Ryzen that could not be certified in time for this release, but they will be updating Ryzen with tweaks/steppings as usual for any CPU product. Also, there will be Ryzen2 and Ryzen3 releases pretty much on a fixed schedule, as there were for previous AMD micro-architectures.
I think it will be very easy for AMD to tweak the Zen/Ryzen design and add things like fatter FP units, or maybe fatten the front-end decoder resources, with Ryzen2 and Ryzen3, so at least AMD is closer to maybe being able to jump fully past Intel if it keeps its nose to the grindstone!
Keep your eyes out for Ryzen performance improvements from the necessary tweaking that comes with any new CPU/motherboard IP from any maker, and also watch for new Ryzen steppings and improvements in GF's 14nm fab process that will help with Ryzen's clock speeds.
I'd go read Anandtech's review of the major Ryzen features as they compare to Intel's competing SKUs; there are plenty of detailed tables that list the feature-for-feature CPU core differences between Ryzen, Kaby Lake, and older Intel and AMD micro-architectures! Some of those charts/tables are great for future reference, so bookmark the article or download the charts!
I have a Q6600 right now, and I'm picking the 1700 over the i7-7700K to replace this PC.
This choice was easy for me because this PC is not just for gaming.
And if a game sustains 60fps, I'm happy.
I also recall getting the Q6600, and it was not the fastest gaming CPU at the time… but it got better with age.
I can even run games the $1000 X6800 can't even start.
I'm not saying the i7-7700K won't be able to run games a few years from now, but I expect the R7 1700 to age better with Vulkan/DX12 titles than the i7.
From the Reddit AMA, AMD officially said that Ryzen chips and the AM4 socket support ECC, but (a) they haven’t done server-grade validation and (b) it needs to be enabled in the BIOS.
Any chance of you guys trying this on the various motherboards you have available?
Anandtech corrected that too, so ECC support seems to be there officially, but has no one tested it? (The 1800X is NOT a gaming CPU.)
Ryzen R7 is a workstation CPU for people who also play games. Which is me, so I’m happy.
Also, I’m still running an FX 8150, so I really deserve an upgrade.
You are very correct! All of that low-resolution gaming nonsense over a few gaming titles that have issues is just some draw-call-at-the-O.K.-Corral drama. AMD and its gaming/OS partners can fix most of that if given the time.
Enjoy your all-around workstation/gaming build with that R7. I'll be looking at maybe the 1700X and dual RX 480s (Blender rendering) after the 500-series Polaris refresh and Vega arrive. The RX 480 is becoming even lower priced with each passing day.
Wait until after AMD has had a chance to tweak its Zen/Ryzen platform before you damn the 1800X for gaming usage. ECC support is there on Ryzen, but it's not certified for ECC usage at this time. And AMD's consumer SKUs have come with ECC support, while Intel wants you to pay to play for ECC support.
There are plenty of disabled server features on Ryzen that AMD has the option of enabling, and some more time is needed for the usual new-product tweaking necessary to fix any performance regressions in low-resolution gaming.
P.S. The server market is what is really going to save AMD's bacon, and the Zen micro-arch will get AMD plenty of Zen/Naples server business and Zen/Ryzen consumer business. Consumer Ryzen is selling out quickly after each restock at a lot of retail channels. It looks like AMD is back for real this time.
If they haven’t done validation, there’s no point in enabling it.
Except for the fact that it works, sure.
So from all those reviews, the i7-7700K is still the gaming CPU to get.
(Unless you game at 4K… or even 1440p, where the CPU is not the bottleneck.)
And the 6900K lost all its luster to the 1800X.
$500 is still a lot… but considering the R7 1700 is fully unlocked and seems to overclock better than the 1800X, AMD is delivering something for 2D and 3D artists on a budget.
No more 6900K envy, that's for sure. But i7-7700K owners can still brag 🙂
I sound like an AMD fanboy, but frankly, I have been super happy with my Q6600 (absolute best CPU I ever got) and have been on the sidelines for years waiting for an upgrade… (I was also gifted an AM3 server and put an FX-8320 in it.)
Anyway, I don't know what voodoo Intel is doing with their memory system, but since Sandy Bridge it's been beyond fantastic.
(I never understood why streaming sequential data ran at less than 50% of rated speed. I got lots of excuses that it was not possible… and then SB made it possible…)
So I have an R7 1700 + MSI Tomahawk on order (replacing my P35 + Q6600). Exact same price as an i7-7700 upgrade would have been. Hopefully five years from now, I won't regret it.
I have noticed that some websites are doing Ryzen and Blender Cycles rendering tests, and Cycles can use the GPU if the GPU supports it (OpenCL- or CUDA-accelerated). So the system usually defaults to the GPU for Cycles, and I wish the reviewers had provided more information on how they went about turning off any GPU use in Blender's Cycles settings on their test platforms, so that only the CPU cores were used for rendering.
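For what it's worth, here is one way a reviewer could force CPU-only Cycles rendering; a sketch using Blender's Python API (it must be run inside Blender, since `bpy` only exists there):

```python
# Run from Blender's scripting tab or Python console; bpy is Blender-only.
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'   # select the Cycles renderer
scene.cycles.device = 'CPU'      # force the CPU path, ignoring any CUDA/OpenCL GPU
```

Stating something like this in a review's methodology section would remove any doubt about whether the GPU helped with the "CPU" render scores.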
No, Cycles defaults to CPU
That's what I thought on another post, but someone informed me that Cycles rendering has a back end that will run on the CPU if no supported GPU is available. There are plenty of systems with pre-GCN AMD GPUs that have no support for Cycles rendering because of large monolithic GPU-kernel issues with Blender's Cycles. Nvidia has supported Blender Cycles rendering for longer via its proprietary CUDA API.
Now, Blender's regular "Blender Render" option is done 100% on the CPU and is a damn good way to stress test a CPU's ability to perform, with all cores and threads pegged at 100% usage for however long the rendering run takes.
For GPU-accelerated rendering, Blender uses OpenCL for compatible GCN GPUs or CUDA for Nvidia's GPUs. But Cycles rendering can be done on the CPU and the GPU! So I would rather not trust any Cycles rendering test of a CPU's ability if the system has a GPU that might help with things and skew the results.
There are some CPU-only rendering applications and plug-ins that do very well on the current Ryzen SKUs.
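To make the "all threads pegged" point concrete, here is a minimal stand-in for a CPU-only render (my own illustrative sketch, nothing to do with Blender's actual code): one CPU-bound worker per logical CPU, the same saturation pattern a tiled CPU render produces.

```python
# Illustrative CPU-saturation sketch: the workload (summing squares)
# is a stand-in for rendering one tile on each logical CPU.
import multiprocessing as mp

def burn(n):
    # CPU-bound chunk of work for one worker
    return sum(i * i for i in range(n))

def render_like_stress(chunks=None, work=200_000):
    # Default to one chunk per logical CPU so every thread is busy
    chunks = chunks or mp.cpu_count()
    with mp.Pool(processes=chunks) as pool:
        results = pool.map(burn, [work] * chunks)
    return sum(results)

if __name__ == "__main__":
    print(render_like_stress())
```

While this runs, a 1800X should sit at or near 100% on all 16 threads, which is the behavior the comment describes for a CPU-only render.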
Jesus, why is everyone focused on the 1800X? 1700, man. Of all of them, that's the one people are likely to buy, and hardly anyone is covering it. Everyone is missing a trick there.
Oh yeah, 1700 is where it's at.
All the information about OC so far indicates that the 1700/1700X/1800X all clock to the same 4.0/4.1GHz, and it does not depend on the model.
As long as you have basic OC know-how, the 1700 will be at the same level as the rest for almost $200 less.
I have a feeling the R7 1700 will be my "new Q6600".
I don't think I will overclock past 1.3V, as I plan to use the stock cooler. So I might have to stay at 3.8GHz.
It seems the biggest gain is from 2400+ DDR4.
Some reviews show massive gains from DDR4 clocking.
The R5 1500X might be less in need of high-clocked RAM, since the exact same bus will serve half the cores…
The 1600X is what matters: cheap and all-round exceptional for gamers both now and long-term, and it will probably clock 200-300MHz higher than the 1700 and 1800 series. Eight cores always overclock worse than 4-6 cores; look at Intel HEDT.
If I really needed 8c/16t I would take the 1700 non-X and OC it; they all reach pretty much the same clocks. The 1800X is a halo product, a waste of money like the 6950X.
Not sure any Ryzen will clock higher. But the 1600X will need nimble cooling.
I personally find the R7 1700 gaming results more than adequate.
That I get 140FPS vs 170FPS at 1600×900 with a GTX 1080 is irrelevant to me.
I care that I can get 60FPS at 1440p on a 1070-class GPU.
And the 1700 seems to deliver the perfect balance of gaming and raw horsepower.
I think most people knew that Ryzen would not match current-gen Intel when it comes to IPC, but what surprised people are the low overclocks. Most people (myself included) were hoping to overclock Ryzen a bit to negate the IPC deficit compared to Intel.
I was wondering how soon we could expect a new stepping/tweak of GlobalFoundries' 14nm process?
If it's a top-end/top-binned part and it still has overclocking headroom, then something is wrong, or intentional, in the binning process from Intel/others. Overclocking has traditionally been for users who save money by getting the lower-binned part at a great bargain and then overclocking it to perform like the highest-binned part!
If Intel's top-end SKUs have so much extra "overclocking" headroom, then consumers should raise some flags as to why Intel is not shipping the parts with higher base and boost clocks to begin with.
If a top-binned flagship "gaming" SKU has any overclocking headroom, then I am very suspicious of that maker's intentions! But if a lower-binned part has a relatively high amount of overclocking headroom and I can get that lower-binned/lower-cost part to perform like the higher-binned part, well, that is the golden goal of all real overclockers.
All that LN2 overclocking is just for show and has no real overclocking value other than the WOW factor!
The goal is to find that golden lower-binned part that may have failed only one of the many tests required for the top bin, then make it perform like a top-binned part while still having some money left over to get the best motherboard for overclocking purposes.
Yes, I have been saying for a long time that Intel has been selling underclocked processors.
I'm staying with my 4700K at 4.4GHz on water; that and my 980 Ti play all I want at 1440. Eventually we reach peak CPUs: sure, you get more PCI bus, but in reality how many need it? Perhaps streamers and vid editors; the average guy is OK with a 3-to-4-year-old CPU.
Not to mention that a lot of games take years of development and are optimized for the processors that were around near the beginning of development. Have fun waiting two years or so for the first Ryzen-optimized games. It will be a better value then, just not so much now. $500 is way too much for mainstream.
I'm a gamer who splurged on an i7 4770K at $330 when I last upgraded. I could last at least two more years, even at 4K.
The video card is most important anyway. I'm looking at splurging again when the 1080 Ti launches at $699. Last time I spent $320 on a 4-gig 760 EVGA FTW. I got a 4K monitor last year, and I'm looking to get my use out of it now.
But hey, I do get the same frame rates at 1440 and 4K that I get at 1080 (on graphics-intensive newer games). Obviously I'm GPU-bound, but AMD fans will still use that as an excuse to ignore Ryzen's deficiencies. The comparison is only valid for a single card at 1440/4K. LOL