 Message Boards » Tech Talk » Finally Some Real Kepler (Geforce GTX 680) Info Page [1]
BlackDog
All American
15654 Posts

First off, why the HELL did you not include BF3 in the benchmarks?!?!

This card should be out March 22nd, and it will be a hard launch, not a paper one.

http://wccftech.com/nvidia-kepler-geforce-gtx-680-benchmarked-blows-hd-7970/


Here are some random benches found in the link:

Good to finally see what the first model will be like; can't wait for March 22nd to see the reviews.

3/16/2012 3:55:15 PM

Shaggy
All American
17820 Posts

when will they be available and what will the price be?

3/16/2012 4:15:32 PM

BlackDog
All American
15654 Posts

from the OP

Quote :
"March 22nd"


and I believe $549, depending on what AMD does with their prices.

[Edited on March 16, 2012 at 4:48 PM. Reason : _]

3/16/2012 4:43:27 PM

Prospero
All American
11662 Posts

no 670 launch for us poor folk?

3/16/2012 5:28:49 PM

BlackDog
All American
15654 Posts

not at launch

3/16/2012 5:34:11 PM

JBaz
All American
16764 Posts

You could have just fucking linked the article...

Also, those are piss-poor numbers right there. Disappointing, to say the least. I thought NVIDIA was saying 2x the performance over the current Fermis and that it would just smoke the 7970.

3/17/2012 1:29:24 AM

JBaz
All American
16764 Posts

Also, 680s are supposed to be $649, and the GTX 670 and 660 should be released in the same time frame at $499 and $349 price points, but all of it is just speculation; I haven't seen anything from NVIDIA about this.

3/17/2012 1:47:57 AM

BlackDog
All American
15654 Posts

JBaz, you are behind the news; I didn't think I had to cite the price ref:

http://videocardz.com/31020/zotac-geforce-gtx-680-2gb-available-for-pre-order-for-e507

Why just post a link when you can see all the info that matters right here?


Also, I'd say that for beta test drivers (remember, the 460 gained almost 40% from one driver update), coming from a random website with no indication of test method, those benchmarks are favorable.

[Edited on March 17, 2012 at 12:47 PM. Reason : _]

3/17/2012 12:45:30 PM

BlackDog
All American
15654 Posts

http://videocardz.com/31046/geforce-gtx-680-sli-performance-results

http://vr-zone.com/articles/nvidia-geforce-gtx-680-sli-performance-preview/15273.html

Quote :
"VR-ZONE posted benchmark results of the GeForce GTX 680 running in an SLI configuration. The cards were overclocked for the tests.

VR-ZONE was the first to grab two GTX 680s. They posted benchmarks of the GTX 680 running in SLI and promised to post 3-way SLI performance as soon as they get a third GTX 680 in hand, which will be after the GTX 680's release.

The benchmark results are quite solid and back up everything NVIDIA has said about the upcoming SLI performance of the GTX 680. As they said, the SLI configuration raises 3DMark11 results by almost 100%: it's twice as fast as a single card.

GPU-Z 0.5.9 reports the default clock of the GTX 680 as 706 MHz (it is actually a 1006 MHz base clock). The cards were overclocked to 1150 MHz on the core and 1803 MHz (7.2 GHz effective) on the memory. The default memory clock is 6 GHz effective. GPU-Z also reports a memory bandwidth of 230.8 GB/s (earlier rumors said it was 192 GB/s).

Tests were performed with a Core i7 3930K processor, which was also overclocked to 5 GHz, so it is hard to compare any of these benchmarks against a reference Radeon HD 7970."
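
Quick sanity check on those GPU-Z figures: theoretical memory bandwidth is just the effective memory clock times the bus width in bytes, so the 230.8 GB/s reading reflects the memory overclock rather than a wider bus. A minimal sketch, assuming the 256-bit bus from the rumors above:

def mem_bandwidth_gb_s(effective_clock_ghz, bus_width_bits):
    # Theoretical bandwidth: effective clock (GT/s) times bus width in bytes.
    return effective_clock_ghz * bus_width_bits / 8

print(mem_bandwidth_gb_s(6.0, 256))   # 192.0 GB/s at the stock 6 GHz effective clock
print(mem_bandwidth_gb_s(7.2, 256))   # 230.4 GB/s at the 7.2 GHz overclock, close to GPU-Z's 230.8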

3/19/2012 3:56:12 AM

MiGZ
All American
2314 Posts

Serious (not a troll) question.

What about this video card was thread worthy? I mean if we made threads for every new vid card, that's all this section would be.

I really am interested to know why this one stands apart from the rest.

3/19/2012 10:06:37 AM

neodata686
All American
11577 Posts

Because it's the first major Kepler card that's been benchmarked. Big releases like this only come out every couple of years.

So it's not just another card.

3/19/2012 10:36:30 AM

JBaz
All American
16764 Posts

Yeah, it's a whole new chip design.

Quote :
"Jbaz you are behind the news, I didn't think I had to site the price ref:"

Read your stupid link again, idiot... That's still speculation about the $549 price point. Even that news link had the euro price at €507, which is above the expected $649 USD price point... It would be foolish on NVIDIA's part to position them right next to the 7970s if they are indeed much faster, especially considering this chip has been in development a long time with some setbacks and low yields, so their cost on this line is enormous, especially compared to Fermi.

Quote :
"why just post a link when you can see all the info in here that matters?"

Because those charts are stupid to look at: you linked them in this thread at 2x the resolution of the source page, for not a whole lot of information. It irks me to have that much wasted space for like three numbers. Besides, I'm just here to troll you since you used to post all of the DT news caps from their blog in here... lol

Quote :
"Also I'd say for beta test drivers (remember the 460 went up almost 40% with one driver update) that is from a random website showing benchmarks with no indication of test method that they are favorable. "

This is true, but given that the person in question had a 680 to bench, and that we've seen demos where the 680 is indeed powerful with whatever drivers they were using, one could make the logical leap that this person had access to those drivers too. But I'll wait till I see proper benchmark testing from a verified source rather than some Chinese forum. The lower power usage is nice though.

3/19/2012 12:30:13 PM

BlackDog
All American
15654 Posts

http://videocardz.com/31052/gigabyte-geforce-gtx-680-pictured-and-tested

Hmmm, why is this thread worthy? Maybe because new GPU architectures (not die shrinks, brand new designs) only come around every 2-3 years.

3/19/2012 1:37:00 PM

smoothcrim
Universal Magnetic!
18927 Posts

Does this architecture allow for more than 2 displays without a second card?

3/19/2012 3:04:04 PM

JBaz
All American
16764 Posts

Yes, it should. They noted a while back that they were trying to do three-monitor gaming on a single card to go up against AMD's cards, but I haven't heard too much about it in the specs or listings lately.

3/19/2012 3:55:31 PM

neodata686
All American
11577 Posts

I got an iPad so no more need for a desktop computer.

3/19/2012 4:11:03 PM

JBaz
All American
16764 Posts

I got an iPad a pen so no more need for a desktop computer gun.

3/19/2012 4:31:30 PM

neodata686
All American
11577 Posts

Come on pick up on my sarcasm.

Truthfully, I may pick up a second 580 down the road, but I'm happy with my single 580 at the moment. I've packed Skyrim with a shit ton of HD texture mods and I'm constantly hitting 2GB+ of VRAM. I doubt the first few 680s will have 3GB of VRAM.

-Like that Gigabyte only has 2GB of VRAM.
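
For a rough sense of why HD texture packs chew through VRAM, here is a back-of-the-envelope sketch; the sizes are the usual uncompressed-vs-DXT estimates, not numbers measured from Skyrim:

def texture_mib(width, height, bytes_per_pixel, mipmaps=True):
    # A full mipmap chain adds roughly one third on top of the base level.
    size = width * height * bytes_per_pixel
    return size * (4 / 3 if mipmaps else 1) / 2**20

print(texture_mib(4096, 4096, 4))  # ~85 MiB for one uncompressed RGBA8 4K texture
print(texture_mib(4096, 4096, 1))  # ~21 MiB for the same texture DXT5-compressed
# A few hundred of these resident at once and 2GB of VRAM disappears quickly.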

[Edited on March 19, 2012 at 4:40 PM. Reason : s]

3/19/2012 4:36:26 PM

Prospero
All American
11662 Posts

I'm still trying to figure out why it's at 256-bit instead of 384-bit?!?!! or at least 320-bit!?!?!

3/19/2012 4:52:50 PM

BlackDog
All American
15654 Posts

When your VRAM runs at 6 GHz you can save a lot of money by going with a 256-bit bus; the question is how much this hurts performance. We can simulate it by overclocking the memory further and seeing how much gain we get, or wait for the GTX 685 with its rumored 384-bit bus and 6 GHz+ memory.
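
To make the overclock-as-simulation idea concrete, here is a rough sketch; the overclocked memory speed and the frame rates below are made-up placeholders, not measurements:

def bandwidth_gb_s(effective_clock_ghz, bus_width_bits=256):
    # Theoretical bandwidth: effective clock (GT/s) times bus width in bytes.
    return effective_clock_ghz * bus_width_bits / 8

stock_clock, oc_clock = 6.0, 7.0   # GHz effective (7.0 is a hypothetical overclock)
stock_fps, oc_fps = 60.0, 64.0     # placeholder frame rates from some game

bw_gain = bandwidth_gb_s(oc_clock) / bandwidth_gb_s(stock_clock) - 1
fps_gain = oc_fps / stock_fps - 1
print(f"bandwidth +{bw_gain:.0%}, fps +{fps_gain:.0%}")
# If the fps gain tracks the bandwidth gain, the 256-bit bus is the bottleneck;
# if fps barely moves, the core is the limit and a 384-bit bus wouldn't help much.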

3/19/2012 5:56:46 PM

JBaz
All American
16764 Posts

Quote :
"Come on pick up on my sarcasm. "

Come on... pick up on my trolling...

And 680s will never have 3GB; it's 2 or 4.

And I agree with BlackDog (oh god): they don't need to run wider memory channels if they can push the effective memory clocks insanely high. But don't be fooled, they aren't saving the money BD is suggesting; it reduces complexity for an already complex GPU. I'd wait till we get official reviews/specs rather than rely on Chinese leaked stuff. There's plenty of speculation that it's supposed to be a wider bus (anywhere from 256 to 512 bit is what I've seen) and nothing is confirmed.

3/20/2012 1:59:34 PM

Prospero
All American
11662 Posts

I understand they increased the memory clock; what I'm asking is why not also widen the bus. It would seem that could make this architecture really scream and show those 200% increases NVIDIA suggested it would have (which it does not).

If anything it seems like NVIDIA is just sitting on the bandwidth so they can squeeze even more money out of people, since they know people would buy it even if it were only 20% faster... so they'll slowly increase memory bandwidth over the next two years until they release Maxwell.

[Edited on March 20, 2012 at 3:16 PM. Reason : .]

3/20/2012 3:15:16 PM

BlackDog
All American
15654 Posts

The 685 is rumored to have a 512-bit bus; we know for sure, without a doubt, that the 680 is 256-bit. GK110 is what you are talking about, and it's detailed here:

http://videocardz.com/31126/geforce-kepler-gk110-specification

3/22/2012 12:19:49 AM

Stimwalt
All American
15292 Posts

Interesting, but I'm passing on this. Maybe in a year or two.

3/22/2012 9:28:02 AM

neodata686
All American
11577 Posts

Well yeah, you have two 3GB 580s. You'll still have the advantage over a single 680 or 685. Which reminds me, I need to pick up a second one before they're all gone.

3/22/2012 9:36:30 AM

JBaz
All American
16764 Posts

I'm just hoping that once the 600s come out, the water blocks for the 500s go uber cheap like the 400s' did; I'll grab three at MC while I'm in DC or something next month. Damn, three cards stacked on top of each other run too hot, especially when my case doesn't have a side vent fan.

3/22/2012 4:17:07 PM

JBaz
All American
16764 Posts

Official previews are out! It's confirmed: it only sports a 256-bit memory bus. It performs a little better on average than the 7970, but not by the huge margin most people were expecting; the good news is that it's going to hit the $499 price point! The GTX 685 is what the 680 was supposed to be, but since NVIDIA wanted to counter the 7970 ASAP, they just pulled this out of their ass; not too shabby, though. And as everyone has mentioned from the leaked previews, it does in fact sip much less power than the 580 and even the 7970: a 195W TDP!?!?!

http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review

3/22/2012 5:49:59 PM

BlackDog
All American
15654 Posts

http://www.xbitlabs.com/articles/graphics/display/nvidia-geforce-gtx-680_2.html

http://www.guru3d.com/article/geforce-gtx-680-review/1



[Edited on March 22, 2012 at 8:01 PM. Reason : _]

3/22/2012 7:58:40 PM

BlackDog
All American
15654 Posts

BIG PICKTUR: http://images.anandtech.com/doci/5699/GeForce_GTX_680_F_No_Thermal.jpg



[Edited on March 22, 2012 at 8:36 PM. Reason : _]

3/22/2012 8:34:22 PM

JBaz
All American
16764 Posts

GPGPU computation is said to be double the performance of the 580, so this has everyone in the folding community going ape shit. These chips should also find a good home in the Jaguar supercomputer in TN.

3/22/2012 8:41:57 PM

JBaz
All American
16764 Posts


3/23/2012 3:38:48 PM

BlackDog
All American
15654 Posts

I thought compute was less of a concern on Kepler and that AMD holds the lead right now in GPGPU performance benchmarks.

http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17


Quote :
"As always our final set of benchmarks is a look at compute performance. As we mentioned in our discussion on the Kepler architecture, GK104’s improvements seem to be compute neutral at best, and harmful to compute performance at worst. NVIDIA has made it clear that they are focusing first and foremost on gaming performance with GTX 680, and in the process are deemphasizing compute performance. Why? Let’s take a look."

[Edited on March 24, 2012 at 6:16 PM. Reason : _]

3/24/2012 6:14:37 PM

JBaz
All American
16764 Posts

First of all, GeForce cards use the limited 16-bit FPUs, so their GPGPU performance won't be as high for double-precision floating point operations. This is why you pay the $$$ for Quadro and Tesla cards; even then, the older Fermi Quadros are much faster than the 7970, though at a steep price/performance cost, and you can string multiple Quadros/Teslas together for better performance per box than AMD's cards when absolute speed is what matters.

Second, GPGPU performance numbers can be misleading since they depend on what you are rendering and how you are rendering it. Some scenes could be limited by a slow CPU rather than the GPU, and vice versa. It depends on what software you are running, what you are rendering, and how efficient the code is.

Third, AMD has been known to have very poor performance in certain OpenCL applications, even though their theoretical throughput is technically higher than NVIDIA's. They had these issues with the 5k series because of how the FPUs were designed: the stream processors would get starved, and the cards were terribly complicated to code for since the micromanagement is software based, not hardware based like NVIDIA's CUDA or their later 6k/7k series.

Lastly, CUDA is a much more widely used platform for GPGPU apps right now because the standard has been around a bit longer and a lot of pro-level software took advantage of CUDA straight off the bat. There's still some hesitation to switch over to OpenCL even though both NVIDIA and AMD support it; it's just a relatively new standard that only got a full release in 2009, so it has taken a while to see apps that use it, and it's still being adopted at a slow pace.

But none of this means that the GeForce line, like the 680, is terrible for GPGPU applications; it's just weaker for double-precision floating point calculations, which need a higher level of math and take 4 or 8 times as many cycles per operation, like what the SmallLuxGPU plugin for 3ds Max requires for ray tracing. In some applications, like physics and modeling (like folding), CUDA is just much more advanced and faster than OpenCL right now, but that could change once OpenCL gets better, more advanced, and more streamlined, with hardware built specifically for its code.
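
To put the double-precision gap in rough numbers, here is a small sketch of the usual peak-FLOPS estimate: two ops per core per clock for fused multiply-add, scaled by a per-card double-precision ratio. The core counts, clocks, and DP ratios below are the commonly cited figures for these cards, so treat them as assumptions rather than vendor specs:

def peak_gflops(cores, clock_ghz, dp_ratio=1.0):
    # Peak throughput assuming one fused multiply-add (2 FLOPs) per core per clock,
    # scaled by the card's double-precision rate.
    return 2 * cores * clock_ghz * dp_ratio

print(peak_gflops(1536, 1.006))          # GTX 680: ~3090 GFLOPS single precision
print(peak_gflops(1536, 1.006, 1 / 24))  # GTX 680: ~129 GFLOPS double precision
print(peak_gflops(512, 1.544, 1 / 8))    # GTX 580: ~198 GFLOPS double precision

Which is one reason a 680 can lose to a 580 in DP-heavy compute benchmarks while still winning at games.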


tl;dr

3/24/2012 7:20:28 PM

BlackDog
All American
15654 Posts

you aren't responding to any of the data from the review or their opinions:

Quote :
"Remember when NVIDIA used to sweep AMD in Civ V Compute? Times have certainly changed. AMD’s shift to GCN has rocketed them to the top of our Civ V Compute benchmark, meanwhile the reality is that in what’s probably the most realistic DirectCompute benchmark we have has the GTX 680 losing to the GTX 580, never mind the 7970. It’s not by much, mind you, but in this case the GTX 680 for all of its functional units and its core clock advantage doesn’t have the compute performance to stand toe-to-toe with the GTX 580.

At first glance our initial assumptions would appear to be right: Kepler’s scheduler changes have weakened its compute performance relative to Fermi."

Quote :
"Redemption at last? In our final compute benchmark the GTX 680 finally shows that it can still succeed in some compute scenarios, taking a rather impressive lead over both the 7970 and the GTX 580. At this point it’s not particularly clear why the GTX 680 does so well here and only here, but the fact that this is a compute shader program as opposed to an OpenCL program may have something to do with it. NVIDIA needs solid compute shader performance for the games that use it; OpenCL performance however can take a backseat"



Sounds to me like AnandTech believes Fermi is a better GPGPU card than Kepler (since they flat out say it) and that NVIDIA focused on in-game compute performance. This is why I question why people would be going "ape shit" over anything having to do with GPGPU performance.

3/25/2012 9:50:14 PM

JBaz
All American
16764 Posts

Quote :
"you aren't responding to any of the data from the review or their opinions"

What... Are you really that stupid not to see that I did in fact reply to you? To the wealth of a few select benchmarks? One of which I explained is down to the 16-bit FPUs?

And you do realize that they are going to use Kepler in other fucking products... you dipshit... It's called Quadro and Tesla...

3/26/2012 8:28:42 AM

ThatGoodLock
All American
5697 Posts

get a room already you two

3/26/2012 9:29:39 AM

HockeyRoman
All American
11811 Posts

It looks pretty baller thus far according to Guru3D and Tom's Hardware. I may wait until closer to the launch of Diablo 3 before doing this next build; that way they can shake out all of the bugs and, more importantly, get some in stock.

3/29/2012 8:32:57 PM

JBaz
All American
16764 Posts

Diablo 3 will not stress the 680. Shit, a cheapo 550 ti will work

4/12/2012 6:47:34 AM

ComputerGuy
(IN)Sensitive
5052 Posts

I wonder if the original StarCraft can run on high with these?

4/12/2012 7:30:46 AM

ComputerGuy
(IN)Sensitive
5052 Posts

I wonder if the original StarCraft can run on high with these?

//found an egg..double post..

[Edited on April 12, 2012 at 7:31 AM. Reason : egg]

4/12/2012 7:30:46 AM
