Just recently, we posted a story claiming that NVIDIA was preparing to launch high-end Maxwell in the October/November time frame. Apparently, that was generous. The graphics company is now said to be announcing its GeForce GTX 880 in mid-September, with availability coming later in the month. The card is expected to be based on the GM204 chip (which previous rumors claim is manufactured on 28nm).
It is expected that the GeForce GTX 880 will be available with 4GB of video memory, with an 8GB version possible at some point. As someone who runs multiple (five) monitors, I can tell you that 2GB is not enough for my use case. Windows 7 agrees: it kicks me out of applications to tell me that it does not have enough video memory. That alone would be reason enough for me to get a card with more memory.
We still do not know how many CUDA cores will be present in the GM204 chip, or whether the GeForce GTX 880 will have all of them enabled (though I would be surprised if it didn't). Without a way to derive its theoretical performance, we cannot compare it against the GTX 780 or 780 Ti. It could be marginally faster, significantly faster, or anywhere in between.
But we will probably find out within two months.
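For reference, once the core count and clock do leak, the back-of-the-envelope comparison is simple: each CUDA core retires one fused multiply-add (two FLOPs) per clock. Here is a minimal sketch in Python, using the published base clocks of the Kepler parts and deliberately no guessed GTX 880 figures:

```python
# Theoretical single-precision throughput: 2 FLOPs (one fused
# multiply-add) per CUDA core per clock cycle.
def theoretical_tflops(cuda_cores, clock_mhz):
    return 2 * cuda_cores * clock_mhz * 1e6 / 1e12

print(theoretical_tflops(2304, 863))  # GTX 780    -> ~3.98 TFLOPS
print(theoretical_tflops(2880, 875))  # GTX 780 Ti -> ~5.04 TFLOPS
```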
AMD users mourn while Gsync users rejoice.
That’s trying way too hard.
Why would AMD users, or anyone for that matter, mourn a GPU release? It doesn’t affect them in the slightest unless they choose to buy it.
Oh look. The fanboy just came out of its cave.
Newer and better products are good news for everybody. It’s called competition and leads to lower prices and higher performance.
As for G-Sync, what a joke. Months after its release, there are people who bought a G-Sync monitor only to notice extra lag when playing fast FPS games.
You’re the fanboy that needs to go back into his cave. IF there is any lag, it’s only a few ms at best: 1-2ms, maybe 3ms. Frame time at 144fps is about 6ms, so. Wanna talk about BS? AMD claims its tech is a VESA standard, only to turn around in their FAQ and say it’s not proprietary, AKA straight-up lying to end users yet again.
So I am the fanboy because I posted something that you yourself admit is true? Good one.
I said IF. You are one of those people all over AMD’s every word like they can say and do no wrong. I never said there was a delay, as there is no proof of it besides AMD claiming more crap, and more often than not that is AMD being full of it or not exactly telling the whole story. As far as we know, the work could be sent with the video frame, and the delay, IF (and I said IF) there is one, is sub-1ms. So it’s so small it won’t impact anything; under 5-6ms won’t be noticeable by anyone, even if AMD makes such a large deal about it. I am no expert designer that worked on G-Sync, so I can’t say whether there is a delay. 60fps is a 16.67ms frame time, 120fps drops it to about 8ms, and bumping to 144Hz, which most G-Sync monitors set to release are, only puts frame time around 6-7ms.
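For what it’s worth, the frame-time numbers being thrown around in this thread are simple arithmetic, one second divided by the frame rate. A quick sanity check:

```python
# Frame time is the reciprocal of the frame rate, in milliseconds.
for fps in (60, 120, 144):
    print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
# 60 fps -> 16.67 ms, 120 fps -> 8.33 ms, 144 fps -> 6.94 ms
```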
So now we just create a false image of the other person, to have an excuse to make him look bad. You are doing great.
I was reading about the lag from people who are using G-Sync, and their rigs are full of Nvidia hardware.
This is my third post and until now I haven’t written the word “AMD” once. On the other hand, in two posts you mentioned AMD 5 times. They must have done some pretty bad things to you when you were very young.
Your avatar isn’t helping
Yes I know, but people should focus on the text, not on the avatar. Unfortunately, most people see the avatar and then just approve or reject whatever is written based on that avatar.
http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/
Arbiter's point is: that standard is NOT FreeSync. It is used by FreeSync, wrapped in proprietary bits.
That just means VESA didn’t adopt everything AMD gave them…
The DockPort standard from AMD also went to VESA…
Meh.
“How are DisplayPort Adaptive-Sync and Project FreeSync different?
DisplayPort Adaptive-Sync is an ingredient DisplayPort feature that enables real-time adjustment of monitor refresh rates required by technologies like Project FreeSync. Project FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync protocols to enable user-facing benefits: smooth, tearing-free and low-latency gameplay and video. Users are encouraged to read this interview to learn more.”
http://support.amd.com/en-us/kb-articles/Pages/freesync-faq.aspx
AMD claimed months ago that their solution would be a VESA spec, but like almost every other AMD claim, they ended up not being 100% truthful and had to backpedal because they couldn’t live up to their claim.
Haha, actually you threw the first stone: “Oh look. The fanboy just came out of its cave.” And judging by your other replies, I see you go with the tactic of “if you can’t attack the argument, attack the person”.
Setting that aside and going back to the original post, and even to your original off-topic rant about G-Sync:
The GTX 800 series is coming soon and we should all be excited, even AMD fans; it’s progress in an industry which, on the GPU side, has been slowing down as of late. Really looking forward to some new GPUs.
Also, on a side note, AMD has been doing the “reuse/re-brand” method too, and is just as guilty of it as NVIDIA. I personally don’t have that big of an issue with it; they basically make last year’s flagship cheaper and move it down the product stack.
As for G-Sync, I think it is the right direction to go in. Do I wish it were more open? Sure. But I can also understand NVIDIA protecting its $$$ investment. And at its core it is a technology that improves the life of EVERY gamer: no more tear lines and no more frame stutter.
🙂 Feel free to object and reply.
First paragraph.
Nope and nope.
Second paragraph.
Rant about G-Sync? Nope. Facts.
Third paragraph.
You are copying me.
Fourth paragraph.
Who mentioned rebrand? I think you are the first.
Last paragraph. G-Sync was good for waking up the industry. The same is the case with Mantle. But both should go away the day free techs replace them. And NO, proprietary standards are NOT for EVERY gamer, but for some: those who pay a specific price to a specific manufacturer.
What do you prefer?
DirectX 12 (OpenGL), OpenCL, Adaptive Sync, some open Physics engine
or
Mantle, CUDA, GSync, PhysX
This guy is delusional. Even the moderator over at ASUS ROG Forums is realistic while responding to SWIFT G-Sync issues.
It can’t undo the fundamental effect of very low frame rates and it doesn’t do anything below 30FPS, but it does make it far more tolerable and the transition from high to low smooth. IMO it’s not so suitable for very fast action games where you’re better off using the ULMB option with normal or extreme pixel response setting instead.
Because dat 6GB GTX 780 Ti in the wild >=)
8GB lol
Not impressed at all, because 28nm is going to have a severe impact on Maxwell, which has already lost everything that made it interesting over the past 4 years.
We can keep dreaming though.
Making the efficiency improvement over Kepler on the same 28nm node is already impressive. But I know what you mean. Then again, Nvidia can’t keep delaying their next product just because the tech they need (TSMC’s die shrink) doesn’t arrive in time. They still need to show investors that they have some sort of progress, despite things not going as they had hoped/expected in the past. Me? Maybe I’ll wait for their third-gen Maxwell before pulling the trigger.
I am sure there is some loss, but Nvidia has proven what can be done with 28nm in the 750 Ti.
Soo far.. you really want to defend Nvidia.. I smell fanboyism, ahahaha!!
Anyway.. G-Sync or FreeSync, whatever you call it, I’ll just go for something that does the same with no extra hardware cost.
so you’re waiting for a solution that will give you the capability without the need to upgrade your current monitor and gpu?
G-Sync is Nvidia; the not-so-“free” FreeSync is AMD.
At least with G-Sync, if you have a mid-range or higher GPU based on Kepler, you just need the monitor.
AMD, on the other hand: even though they claim some monitors just need a firmware update, which I doubt, you need a new monitor, and only 3 models of their dedicated Radeon series support it, so you more than likely need a new GPU as well. AMD’s solution, in the end, depending on what you’ve got, isn’t gonna be cheaper.
I don’t know how reliable this is, but it seems Gigabyte has confirmed that they will be releasing the GTX 880 in late September:
http://videocardz.com/51133/gigabyte-launch-geforce-gtx-880-g1-gaming-september
Booyah!!!
The 880 Maxwell specs are finally here!!!
2560 SP / 1050MHz / 7GHz / 256-bit / 4GB/8GB / 230W TDP
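Taking those rumored numbers at face value (they are an unconfirmed leak), the usual rules of thumb give the theoretical shader throughput and memory bandwidth:

```python
# Rumored GTX 880 figures from the comment above (unconfirmed).
cores, core_mhz = 2560, 1050
mem_gbps, bus_bits = 7.0, 256  # effective memory clock, bus width

tflops = 2 * cores * core_mhz * 1e6 / 1e12  # ~5.38 TFLOPS
bandwidth_gb_s = mem_gbps * bus_bits / 8    # ~224 GB/s
print(tflops, bandwidth_gb_s)
```

On paper, that would land about 7% above the GTX 780 Ti’s ~5.04 TFLOPS, if the leak is accurate.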
The new rip-off is on its way.
Hopefully the driver “quality” will be the same as the 600/700 series, meaning one release out of two will crash and the other will produce artifacts.
nvidia FTW !
Take Nvidia outta the battle; you think AMD wouldn’t pull the same crap if they didn’t have to compete with a company beating them down in the market?
lol, do you blow the nvidia logo every morning when you wake up?
Judging by his rationale in that last one, I believe he does.
Judging by your 2 moronic comments, you blow the AMD logo every morning and confirm the whole “AMD can do no wrong” viewpoint.
Might want to wipe that JHH love juice off your chin there, buddy. You think people haven’t noticed.
Anyone that regularly reads the comments here on PCPer can tell you do. I don’t know why you’re surprised when someone says it.
Whoa people. This has gone way too far.
Yes, AMD would pull the same crap.
And I would be defending nVidia.
Question is, would the current nVidia defenders switch to defending AMD? After all, if they defend the philosophy of expensive slow crap, they should.
That would be an interesting test.
Yea, you know what you’re talking about…
I really hope they don’t milk Maxwell like Kepler.
I’m afraid to even get the first cards, since 6 months later there will be better ones.
I promise they will. After everyone suckled the teat of the Titan because of the shock-and-awe factor, nobody started questioning what Nvidia was doing until the Titan Z came out and people realized the delicate bullshit game Nvidia has played throughout the Kepler generation.
You will see another Titan, most likely at the $1,000 price mark, on 20nm. You will see a 20nm refresh with slight improvements, and definitely an 880 Ti at $649 at least, if the 880 isn’t already priced there, screwing those owners just like the 700 series did for $150 of net profit.
The massively overhyped whale known as the Maxwell 880 is over.
Now it’s official. GTX 880 confirmed weak.
No chance the weak 880 is faster than the 780 Ti.
From what I heard, a Titan Z part 2 will be coming next year. I’m still running a GTX 560 Ti SC and have no intention of moving from that card until I think it’s really necessary. Really glad I didn’t run out and buy any of the 600 or 700 series cards as yet; same goes for the Titans.
What you are saying there makes some sense, but not necessarily in the way you present it. The 560 Ti SC is a good card and can still play any game you want, so unless you have a bunch of extra money burning a hole in your pocket, why spend it?
Here’s hoping that these cards keep the compute gains that were seen in the 750 and 750 Ti. I know that NVIDIA isn’t as fast at compute if you are running a simple task, but unfortunately in the rendering industry it is usually the only option, as CUDA actually works for complex path-tracing algorithms, unlike OpenCL.
Mostly, compute hasn’t been needed by normal end users, hence why they don’t focus on making their cards do that work. Usually it was all limited to the pro end.
This is the PCPer comments section now?
I think this crazy hot summer is getting to everyone. We all seem to be bitching at each other, A LOT.
There are too many off-topic people here… most are those fanboys, ahahaha!
I don’t care if you love AMD or Nvidia.. please, if you are a fanboy, just go out and camp at Nvidia or AMD HQ to support them and stop ranting here. You guys are destroying a good article here.
The massive AMD lovers can only attack anything Nvidia does, even when AMD does the same thing. When you point out the truth or the facts, they start throwing verbal abuse and slander at people.
I used to get excited about these graphics releases back in the old days, but what is the point now? Most upcoming games are total failures. No real progress has been made in game development. The so-called physics engines are so bad, you don’t even feel like you’re doing anything when playing a game… Game audio still sucks, and the visuals ain’t that good either for the amount of money you pay for those graphics cards. I mean, tessellation isn’t used to a noticeable extent in games as of yet, and I haven’t seen any decent texture design in those modern games, including the upcoming ones.
BTW, I haven’t played a game in over two years; I think I’ve already quit this boring activity that I used to enjoy years ago.
Back in the day they had nice releases; now they try to nickel-and-dime loyal customers with fragmented releases. On top of it, you get lazy-ass devs porting over broken, half-assed games from consoles. My 780s are it for me; I’m done after this unless something revolutionary comes out, but you still have to deal with incompetent game devs.
Ah, another 28nm refresh for the big 2 GPU vendors. GG, TSMC.
It’s not a refresh; the GPU is based on Nvidia’s new Maxwell arch, which has power and performance improvements over Kepler. If you want to get an idea of what Maxwell brings, look at the GTX 750 Ti. It competes with AMD’s 260X and 265 parts while using about half the power: the 260X has a 115-watt TDP, the 265 is a 150-watt TDP, and the Nvidia 750 Ti is listed at a 60-watt TDP.
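Put in numbers, and assuming those three cards really do trade blows (the commenter’s premise, not a measurement), the perf-per-watt gap looks like this:

```python
# Vendor-listed TDPs (watts) cited in the comment above.
tdp = {"GTX 750 Ti": 60, "R7 260X": 115, "R7 265": 150}

# With performance treated as roughly equal, relative perf-per-watt
# reduces to the inverse ratio of TDPs against the 750 Ti.
for card, watts in tdp.items():
    print(f"{card}: {tdp['GTX 750 Ti'] / watts:.2f}x the 750 Ti's efficiency")
```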
Nice to see some new GPUs, even if they are still on the 28nm node. The Maxwell architecture has shown impressive power efficiency with the 750 Ti, though, so we should see a reasonable performance increase with the 870 and 880 over the 780 and 780 Ti.
I’m rather content with my GTX 780 for the time being; it handles all my games at 2560×1440 perfectly. Might be looking at one of those ASUS ROG Swift monitors though. 😀
The childish arguments are lame too, just saying.
@Mandrake Stay away from the ASUS ROG Swift monitors. Why? Because of G-Sync; sorry, but FreeSync will kill that as soon as it hits the market later this year.
Yes, I’m an Nvidia fanboy, but you would have to listen to the Maximum PC podcast to know what I’m saying.
http://www.maximumpc.com/no_bs_podcast_226_-depth_interview_amd_graphics_guru_richard_huddy
PhoneyVirus
https://twitter.com/PhoneyVirus
https://phoneyvirus.wordpress.com/
How is G-Sync dead? Because AMD, a company that only sells to a third of the market, is coming up with a knock-off? “FreeSync” will only work with AMD GPUs, and AMD only has a fraction of the dGPU market.
Or do you think that G-Sync is dead because a function of the eDP standard has been added to the desktop DP standard????? News flash: the ability to change the refresh rate has existed for years as part of the eDP spec. And no one to this day, not even AMD themselves, was using it to give a G-Sync-like experience. No one.
So how is moving this eDP feature to the desktop supposed to have any effect on G-Sync? People need to get their facts straight. There is a heck of a lot more to FreeSync than the spec change. No one is using the spec to create a G-Sync experience now, and that part of the spec has been a part of eDP for years. The only one trying now is AMD. With their small fraction of the market, there is no way G-Sync is automatically dead.