The Official Graphics Card and PC gaming Thread
AfterDawn Addict
7 product reviews
26. April 2010 @ 02:18
I read a couple of reviews on Newegg saying that the 480 doesn't run as hot as people claim. Or at least with proper case cooling, they do just fine ;) I'll never know though. By the time I can afford another card, whether it's nvidia or ATI, there will likely be something else out...
To delete, or not to delete. THAT is the question!
AfterDawn Addict
4 product reviews
26. April 2010 @ 04:55
Quote: it seems that when anyone says anything against ATI you get very defensive and try and pin something bad on nvidia, why is that?
Just quickly, you've been doing the exact opposite for nvidia. Jeff and I can see the GTX400 series for the pile of wasted silicon it is. If you disagree, you have no basis to criticise the 4GB HD5970 for its extra power consumption or price.
More importantly, there is no logic to using that argument against the 4GB HD5970. No pretentious fanboyism here, just look at the details.
HD4870 original retail £160, HD4870X2 original retail £380 -> X2 card is 18% more expensive than two cards -> benefits of double memory per GPU, benefits of single card, same cooler used, noisier
HD5850 current retail £230, HD5970 current retail £520 -> X2 card is 13% more expensive than two cards -> no double memory benefit, benefits of single card, same cooler used, noisier
HD5870 current retail £320, 4GB HD5970 expected retail £750 -> X2 card is 17% more expensive than two cards -> benefits of double memory per GPU, benefits of single card, benefits of pre-overclock, better cooler used, quieter
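If anyone wants to check the maths, the premium is just the X2 price against two copies of the single card. A quick Python sketch using the prices quoted above (the percentages in my list are rounded):
[code]
# Premium of a dual-GPU card over two copies of the single-GPU card.
# Prices in GBP as quoted above; purely illustrative arithmetic.
def x2_premium(single_price, x2_price):
    return (x2_price / (2 * single_price) - 1) * 100

print(f"HD4870X2 vs 2x HD4870:   {x2_premium(160, 380):.1f}%")  # 18.8%
print(f"HD5970 vs 2x HD5850:     {x2_premium(230, 520):.1f}%")  # 13.0%
print(f"4GB HD5970 vs 2x HD5870: {x2_premium(320, 750):.1f}%")  # 17.2%
[/code]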
On paper the 4GB HD5970 is better value than the HD5970 and HD4870X2 that have gone before it. By your logic, then, the HD4870X2 and HD5970 were duds too. They certainly weren't.
The 4GB HD5970 is no everyman's card. Aside from the huge price tag placing it outside the reach of most gamers, the 4GB of total memory only benefits 30" monitor users and Eyefinity owners. As a 30" monitor user, I can clearly see scenarios where 1GB of video memory is limiting; there are at least 3 or 4 occasions where this is currently true. This situation will only get worse.
Crysis Warhead on max detail uses about 1.4GB or so. That doesn't leave much headroom on the GTX480; plenty left on the 4GB HD5970, though. Crossfire scaling is no problem: it achieves the same 77-78% in Bad Company 2 as normal cards. The substantial lead it has over the pair of 480s in the Bad Company 2 benchmark (13%) will apply to other games, and either place it on top performance-wise or at least greatly reduce the bias against it.
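To put that headroom in numbers: the GTX480 carries 1536MB per GPU, and the 4GB HD5970 works out to 2048MB per GPU. Taking the rough 1.4GB usage figure above:
[code]
# VRAM headroom after Crysis Warhead's ~1.4GB at max detail.
usage_mb = 1434  # ~1.4GB, the rough figure quoted above
for card, vram_mb in [("GTX480", 1536), ("4GB HD5970, per GPU", 2048)]:
    print(f"{card}: {vram_mb - usage_mb} MB to spare")
[/code]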
On top of this, what about Eyefinity users? Sure, there may not be a big pack of titles yet that need more than 1GB per GPU at 4.096 megapixels. But what about 5.292? 6.912? Heaven forbid, 12.288 or 13.824! I'm sure there are games out there that two slightly overclocked 5870 GPUs could take on at such gargantuan resolutions, without the frame rate dropping anywhere near the levels you get when you run out of video memory.
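For reference, those megapixel counts match the obvious single-screen and Eyefinity layouts exactly. My mapping of figure to layout is an educated guess, but the arithmetic is exact:
[code]
# Megapixels for the layouts the figures above appear to correspond to.
layouts = [
    ("1 x 2560x1600", 2560, 1600, 1),
    ("3 x 1680x1050", 1680, 1050, 3),
    ("3 x 1920x1200", 1920, 1200, 3),
    ("3 x 2560x1600", 2560, 1600, 3),
    ("6 x 1920x1200", 1920, 1200, 6),
]
for name, w, h, n in layouts:
    print(f"{name}: {w * h * n / 1e6:.3f} MP")
# -> 4.096, 5.292, 6.912, 12.288, 13.824
[/code]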
Two GTX480s for £900, using four PCIe connectors, with a maximum power draw of 710/532W, running at 95ºC at 95% of their maximum fan speed.
Or an HD5970 4GB for £750 or so, using two PCIe connectors, with a maximum power draw of 670/488W, running at 65ºC at 84% of its maximum fan speed, which in turn is half that of a normal graphics card. This is in Bad Company 2, and the thing is running quiet. It's pulling the same amount of power above the GTX480 as the GTX480 is above the GTX470, yet it runs 30ºC cooler than its rivals without blowing your ears off.
The fact that this card is £750-ish is almost immaterial; it's the best gaming graphics card there has ever been, and that's the end of it. They could charge what they like. But instead, they've priced it competitively. No matter how ridiculous £750 for one graphics card sounds, considering all of the above, and the competition, it's frankly almost good value for money. Sad but true.
A dud? I lol'ed. If it ran as hot and as loud as the GTX480, I'd still disagree, given the card's sheer ability. The thing's a monster. I fully intend to get my mitts on one. As for it being hard to find, I'm doing my best to get in on the pre-orders. I've spoken to Scan, who say they'll let me know as soon as it's available to pre-order, and I'll shortly be doing the same with other stores as well.
AfterDawn Addict
15 product reviews
26. April 2010 @ 05:08
Quote: All of the above
Agree 100%. Excellent thrashing! lol
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
AfterDawn Addict
4 product reviews
26. April 2010 @ 11:44
As for the comment about these being hard to find, people seem to keep citing Sapphire as saying the cards are a limited edition. For all my research, I can't find one quote stating that the Sapphire card will be a limited edition, only the Asus and XFX. The Asus makes sense, as the last ROG card they made was also a limited edition, and the XFX card states it's a limited edition in its title.
Remember that the last ROG card Asus made was a limited edition of 1000, not just because it was exclusive and expensive, but also because it was a technological experiment. Dual GTX285s on the same card was never an official product from nvidia, just another of Asus' many experiments, such as the HD3850X3. The HD5970 4GB is an official ATI product; ATI themselves have set the MSRP and given it its own codename. Got to love ATI's sense of humour: their midrange product is given the name of one of the tallest trees in the world, the high-end product gets the name of a smaller tree, their flagship dual card the name of a poisonous plant, and their ultra-high-end dual-GPU offering, a small garden flower!
Although they may not make many more than 1000 units, I expect the Sapphire card to be the most numerous, because the 4GB HD5970 is worthy of a full product. The MARS295, laughable as it was at the time, cost a whopping 60% more than the two cards it was based on (albeit with extra memory), and this on top of the fact that the GTX285 was already absurdly overpriced.
The 4GB HD5970 carries a much more modest 20-30% premium over the two cards it replaces (also with the additional memory), and the single card it clones is a well-priced offering already. Couple this with the disappointment that was the GTX400 series, and it makes pretty good sense for this card to be popular. I imagine the number of people who would buy two GTX480s in SLI, given availability, is well over 1000: apart from nvidia fanboys being nvidia fanboys, that amount of graphical horsepower, and the dual-graphics prestige alone, convinces numerous people to buy stuff like that. So, as a direct competitor, coming in at a lower price, running at half the heat and a fraction of the noise, having more memory, and still leaving the other 16x slot free for expansion, making the 4GB HD5970 a limited edition across the board would be a missed opportunity.
Make no mistake, I'd still much rather single graphics cards be sufficiently powerful to be able to do away with multi graphics altogether, but look what happened to the last card that tried to do that...
With the ever-increasing pile of games that require hugely powerful GPUs to run, I don't think CF/SLI based products are going anywhere any time soon. Look at how much more popular SLI and Crossfire have become in the last 3 years. I suspect Crysis has had a lot to do with that.
AfterDawn Addict
4 product reviews
26. April 2010 @ 12:05
Grand Theft Auto 4: Episodes from Liberty City (AA excluded)
Medium Textures, 60 View Distance, Otherwise Maximum. Multi-GPU assumes 1.80x, 2.52x, 3.24x scaling (for 2, 3 and 4 GPUs respectively)
Minimal: Radeon HD2900XT/HD3800 series/HD4670/HD5570 or above, Geforce 8800 series/9600GSO G92/GT220 or above
Reduced: Radeon HD3850X2/HD4750CF/HD4770/HD4830CF/HD4850/HD5670CF/HD5750 or above, Geforce 8800GT/9800GT/GTS250 or above
Moderate: Radeon HD4850X2/HD5770CF/HD5850 or above, Geforce 8800GTS G92 SLI/GTX280/GTX275/GTX470 or above
Good: Radeon HD5830CF/HD5970 or above, Geforce GTX260-216 SLI/GTX295/GTX470 SLI/GTX480 or above
Optimal: Radeon HD5850 Tri-CF/HD5970QCF or above, Geforce GTX280 Tri-SLI/GTX295 QSLI/GTX470 SLI or above
Extreme: Geforce GTX470 OC QSLI / GTX480 QSLI
Very High Textures, 20 View Distance, Otherwise Maximum
Minimal: Radeon HD5570 or equivalent, Geforce GT240 or equivalent
Reduced: Radeon HD5670CF/HD5750 or equivalent, Geforce GTS250 or equivalent
Moderate: Radeon HD4860CF/HD5770CF/HD5850 or equivalent, Geforce GTX260 SLI/GTX295/GTX470
Good: Radeon HD5830CF/HD5970 or equivalent, Geforce GTX275 SLI (not 280 or 295)/GTX470 SLI
Optimal: Radeon HD5850Tri-CF or equivalent, Geforce GTX280 QSLI/GTX295 QSLI/GTX470 Tri-SLI/GTX480 SLI
Extreme: Geforce GTX480 OC QSLI
CPU Requirement
Maximal settings for 1GB applied (Very High Textures, 1920x1080, 63 view distance)
Limited to M27, A41
Clock speeds based on Yorkfield architecture
M10: Single core 3.25GHz, Dual core 1.6GHz, Tri-core 1.35GHz, Quad core 1.3GHz
M15: Single core 4.25GHz, Dual core 2.35GHz, Tri-core 2GHz, Quad core 1.95GHz
M20: Dual core 3.25GHz, Tri-core 2.7GHz, Quad core 2.6GHz
M25: Dual core 4.25GHz, Tri-core 3.95GHz, Quad core 3.7GHz
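To spell out the scaling note at the top: multi-GPU tiers are estimated as single-card speed times the factor, assuming the three factors map to 2, 3 and 4 GPUs, which is how the list uses them. The 40fps card in the sketch is hypothetical:
[code]
# Multi-GPU performance estimate using the scaling factors above.
SCALING = {1: 1.00, 2: 1.80, 3: 2.52, 4: 3.24}

def effective_fps(single_gpu_fps, n_gpus):
    return single_gpu_fps * SCALING[n_gpus]

for n in sorted(SCALING):
    print(f"{n} GPU(s): {effective_fps(40.0, n):.0f} fps")  # hypothetical 40fps card
[/code]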
Senior Member
26. April 2010 @ 18:14
Originally posted by Estuansis: That's extremely weird that Realtek onboard would have an issue. It's one of the most used sound chips in gaming PCs... and worldwide for that matter. If everything else you try fails, a soundcard isn't a bad idea anyway. An Audigy SE can be had for under $20 and will kick the living crap out of integrated sound; the quality is absolutely no match.
Though if you decide to get a soundcard, let me know which model you decide on. Many of the cheaper cards have most of their features "locked out" under Vista and 7 as part of Creative's plan to force everyone onto X-Fi cards. There are hacked drivers available which re-enable everything and have worked problem-free in every game I've seen them used with. Specifically, one of my best friends is using a Creative Audigy SE with hacked drivers on Win 7 x64 with a single 5850, and he plays BC2 with me several times a week. With great performance to boot.
It sounds like an Audigy SE is the way to go. I remember I had issues with one before and returned it, exactly because some of the features were locked out.
AfterDawn Addict
4 product reviews
26. April 2010 @ 18:15
The daniel_k drivers don't work on all systems. They are completely unusable for me in Windows 7: sound becomes corrupted and garbled as soon as they are installed.
AfterDawn Addict
15 product reviews
26. April 2010 @ 18:42
That's actually pretty rare AFAIK. It seems to depend on your card. The very old Audigy SEs might need an older driver package, but I'm not entirely sure. I've installed the Daniel K drivers for 5 different cards and I've never seen an issue...
AfterDawn Addict
4 product reviews
26. April 2010 @ 18:43
Tested on two different Audigy SEs, one from 2006 and the other from 2010.
AfterDawn Addict
7 product reviews
27. April 2010 @ 13:09
From what I've been reading, the nvidia 480 doesn't necessarily run hot. If one has good cooling, it does not. Or perhaps the hot ones simply need their blocks reseated. I've heard that watercooling is a sweet spot for those cards; there are overclockers running barely over 80C. I imagine if I had 4 120mm fans in the side of my HAF932, 2 480s would do quite nicely. I'll probably never know. Those cards are way too power hungry for my liking LOL! The idea of needing a 1KW PSU does not sound appealing. Now their next revision could be a good thing. Depends on how ATI vs nvidia are doing in a few months, when I can afford to upgrade ;)
AfterDawn Addict
4 product reviews
27. April 2010 @ 13:14
In bad cases they run hot, in good cases they run warm, same as any high-end graphics card. The difference is, hot for a typical card like an HD4870 is 87C, and warm is 70C or so; hot for a GTX480 is 96C, and warm is 85C or thereabouts. If nvidia cards were built to withstand such temperatures, it would be no issue; none of their past cards, however, were. In the case of an ATI card, if you need emergency extra cooling, you can turn the fan speed right up. Typically an HD5870 may run at 70C with a fan speed of 2200rpm; turn it up to its maximum of 5000rpm and it will run in the 40s, definitely no problem there. The GTX480 in many scenarios already runs at nearly 90C with the fan speed at 3000rpm or higher. The max fan speed of the card is 4000rpm, which is likely to get you down to the high 70s or low 80s. In a cramped environment, or in SLI, maximum fan speed is still going to be pushing 85-88C. That leaves very little breathing room.
One HD5970, which is two high-end DX11 GPUs, uses less power at load than one GTX480. That should sum it up really.
AfterDawn Addict
7 product reviews
27. April 2010 @ 13:20
Power requirements are definitely the major turn-off LOL!
I have no doubt I could cool the beast, but the power requirement is absurd...
AfterDawn Addict
4 product reviews
27. April 2010 @ 14:22
There's not much reason to spend 270W running a GTX480 when an HD5870 does an almost identical job on 170W!
Senior Member
27. April 2010 @ 17:19
AfterDawn Addict
4 product reviews
27. April 2010 @ 17:58
Not unless in a quad-threaded game. There aren't very many of those yet.
AfterDawn Addict
7 product reviews
27. April 2010 @ 18:01
Originally posted by sammorris: Not unless in a quad-threaded game. There aren't very many of those yet.
LOL! I love it. Quads aren't even fully supported, and they're releasing yet another monster. 6-core behemoths LOL! And a 12-core in the not-so-distant future ;)
Senior Member
27. April 2010 @ 18:10
It was going to be either getting a new processor or buying a JTAG'd Xbox. I guess I will go with the Xbox.
I am still having issues with Bad Company 2 as well :(
It seems I can play as long as I want if the game doesn't have PunkBuster.
(I asked about the processor because BC2 supports it.)
AfterDawn Addict
7 product reviews
27. April 2010 @ 18:25
Hey Sam. Remember once upon a time, when I said "I don't believe the Samsung 2433BW would support 2 simultaneous video signals"? Well... now I'm thinking it can. I just looked once again at its capabilities, and it does have the ability to switch from D-sub to DVI. Tonight I'll find out if I'm in hog heaven :D I just lost my other display. It was actually my mother's. She was ready for it. I'm bordering on tears welling up LOL! Now I REALLY want a larger display :p
AfterDawn Addict
4 product reviews
27. April 2010 @ 18:28
Eh? All monitors that have two inputs can do this...
AfterDawn Addict
7 product reviews
27. April 2010 @ 18:30
But you more or less just shrugged last time. See... the Asus monitor actually listed this on Newegg's site. The Samsung did not. However, now they list it. Convenient, eh? LOL!
AfterDawn Addict
4 product reviews
27. April 2010 @ 18:40
Perhaps I misunderstood :S When you have multiple inputs, there is always a switch; you don't have to unplug the cable for it to recognise the other input.
harvrdguy
Senior Member
27. April 2010 @ 22:07
Originally posted by sam: The performance discrepancy in the HardwareCanucks article between the HD5970 and two HD5850s is that the HD5970 shares a 16x slot between the GPUs so they get 8x each; the HD5850 Crossfire gets 16x each (assuming they're using a 32x board, which they will be).
Sam, that answer is interesting as hell - I didn't think of that, and it makes perfect sense. And at the same time, that answer brings me completely back to the discussion about the impact on the 5970 (not your 4870X2 cards) of being put on an 8x slot versus a 16x slot.
I realize your current motherboard does not provide dual 16x slots for Crossfire (correct me if I am wrong). So presumably the two 5850s with 2 gig of memory each that you could buy, versus a 4 gig petunia, would end up at 8x each, just as the two GPUs on a 5970 will end up with 8x each.
So now, what you're saying makes sense - for you.
For me, on the other hand, or for anybody who might have a Crossfire board with two 16x slots - then I hear you agreeing with the idea of two 5850s instead of a 5970.
At the same time, doesn't this bolster the argument HardwareCanucks was making, that the 5970 on a PCIe 2.0 motherboard would take a bad hit if it had to be put on an 8x slot, thereby reducing the bandwidth to each GPU to only 4x?
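Just to spell out the lane maths we're both describing - a tiny sketch:
[code]
# PCIe lanes each GPU actually sees: the slot's lanes are split
# between the GPUs on the card.
def lanes_per_gpu(slot_lanes, gpus_on_card):
    return slot_lanes // gpus_on_card

print(lanes_per_gpu(16, 2))  # HD5970 in an x16 slot: 8 lanes per GPU
print(lanes_per_gpu(8, 2))   # HD5970 in an x8 slot: 4 lanes per GPU
print(lanes_per_gpu(16, 1))  # single HD5850 in its own x16 slot: 16 lanes
[/code]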
Anyway, yes, I picked the worst game. Here's how all the games on that page stacked up - drop in fps, average and minimum, at 2560x1600, from moving the 5970 to an x8 instead of an x16 slot on a PCIe 2.0 motherboard:
Fallout 3: average -5%, minimum -10.5%
Far Cry 2: average -16%, minimum -36%
Hawx (DX10): average -2.9%, minimum -13.6%
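(Those percentages are just the relative fps drop. A sketch of the formula - the fps pair below is a made-up placeholder, not HardwareCanucks' actual data:)
[code]
# Percentage fps drop going from an x16 to an x8 slot.
def pct_drop(fps_x16, fps_x8):
    return (fps_x16 - fps_x8) / fps_x16 * 100

print(f"{pct_drop(50.0, 32.0):.0f}%")  # e.g. a 50 -> 32 fps minimum is a 36% drop
[/code]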
Hmmmm. Hahaha. Well, clearly Far Cry 2 was the very worst. But Sam, you are quite clear that you pay strict attention to minimums - so drops of 10% and 13% are not exactly trivial, and could make the difference between something that's playable or not.
Anyway, I concede the point - it's not as bad as the Far Cry 2 data looked. LOL
Originally posted by shaff: mmm that's true lol. well I shall wait then, or till I am forced to upgrade again
You and me both Shaff!
If Miles gives me that 9450 4 gig with 8800 card, does that count?
Wow, hyperthreading can kill some applications! I was just playing around with an m4v-to-avi converter for DivX movies - my Mac brother downloads legit podcasts from iTunes, and we picked up a little thing that plays DivX on the TV off a thumb drive or USB disk drive. Anyway, I tried the converter on a variety of computers - the 4GHz P4 was getting 140 frames per second (the laptop with its 1.6GHz Pentium M had scored 155 - both of them with 2MB of L2 cache) and I noticed it wasn't pulling 100% CPU load like the other computers - only about 50% on each hyperthreaded logical core. I turned off HT, and CPU usage jumped to 100% and frames jumped to 230 per second, up from 140 - that's 65% faster with HT off! I left it off!
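(Checking my own maths there:)
[code]
# Speedup from disabling Hyper-Threading in that converter,
# using the frame rates quoted above.
fps_ht_on, fps_ht_off = 140, 230
speedup = (fps_ht_off - fps_ht_on) / fps_ht_on * 100
print(f"{speedup:.0f}% faster with HT off")  # ~64%, which I rounded to 65%
[/code]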
Well I have to run. That was quite a nice thrashing you administered to Shaff as Jeff noted, Sam - lol - that's what he gets for complaining about me never upgrading. Hahaha.
(So look guys, if Miles gives me that 4 gig 9450 and I throw a 2 gig 5850 in it, what kind of 3DMark06 score will I get on stock clocks? And then, how high should I try to overclock it? I don't know what motherboard he's got until I get my hands on it.)
Rich
AfterDawn Addict
15 product reviews
28. April 2010 @ 02:35
So I have 2 5850s now. Watch for benchies in the next few days :D
AfterDawn Addict
7 product reviews
28. April 2010 @ 02:43
Originally posted by sammorris: Perhaps I misunderstood :S When you have multiple inputs, there is always a switch; you don't have to unplug the cable for it to recognise the other input.
Confirmed. My Samsung is even more awesome than I originally realized :p
Jeff, I'm jealous. I can't upgrade for probably quite some time :(
AfterDawn Addict
15 product reviews
28. April 2010 @ 02:56
Quote: Jeff, I'm jealous. I can't upgrade for probably quite some time :(
I'm in an even worse spot! The 6-core Phenom II is out and it's awesomely priced. I have the board, I have the high-performance memory, I have the video horsepower, but I'm not sure I can justify the money. As mentioned elsewhere it's great value for money, but many games don't fully take advantage of the two extra cores. If it shows good benefits in the games I play (namely BFBC2, Crysis, L4D2, Metro 2033), then I might be seriously tempted to get it. But there still remains the question of whether I should or not. And if so, will I part with my faithful old 940 or put it in a budget box?