The Official Graphics Card and PC gaming Thread
AfterDawn Addict
18. November 2010 @ 22:51
The brief, generalised graphics card buyer's guide - November 2010

What's on offer, how it performs versus the 4870, and how much it costs (UK prices are converted directly, do not consider any UK market markup)

GT 220: 025%, 058W, $70/£50
GT 430: 045%, 049W, $80/£60
HD5670: 055%, 064W, $80/£60
HD5750: 075%, 080W, $110/£80
GTS450: 095%, 106W, $130/£95
HD5770: 095%, 108W, $140/£105
GTX460: 125%, 150W, $160/£120
HD5830: 130%, 165W, $165/£125
GTX460 1GB: 135%, 165W, $190/£135
HD6850: 150%, 127W, $200/£150
HD5850: 155%, 151W, $240/£175
GTX470: 160%, 225W, $260/£190
HD6870: 170%, 151W, $260/£190
HD5870: 180%, 188W, $330/£240
GTX480: 190%, 275W, $450/£330
GTX580: 230%, 270W, $530/£390
HD5970: 270%, 294W, $500/EOL
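Reading raw percentages against prices gets tedious; a minimal Python sketch (a handful of figures transcribed from the chart above, US prices only) makes the value comparison explicit:

```python
# Relative performance (HD4870 = 100), board power (W) and US price ($),
# transcribed from the chart above.
cards = {
    "GT 430": (45, 49, 80),
    "HD5770": (95, 108, 140),
    "GTX460": (125, 150, 160),
    "HD6850": (150, 127, 200),
    "HD6870": (170, 151, 260),
    "GTX580": (230, 270, 530),
}

for name, (perf, watts, price) in cards.items():
    print(f"{name}: {perf / price:.2f} perf per $, {perf / watts:.2f} perf per W")
```

By this crude metric the HD6850 tops both the value and efficiency lists of the cards sampled, which lines up with the verdicts in the prose.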


As it stands then the GT220 is completely obsoleted by the GT430, which is nearly twice as powerful, uses less power and costs almost the same amount.
The HD5670 vs GT430 is a relatively close battle: the HD5670 is faster but uses a bit more power. 15W is splitting hairs at this end of the market, though, and at the same cost, the HD5670 is the better card.
The HD5750 is in a relative class of its own as there is no direct rival for it from nvidia. It is a fair bit slower than the GTS450 and only $20 cheaper, but $20 makes a difference in low midrange, so it still proves itself useful.
The HD5770 vs GTS450 is another close battle, even power consumption and even performance. Right now the GTS450 is actually $10 cheaper, so is ultimately better value. A good deal on an HD5770 may nullify this, but there'll be equally as many of those from the other side. 2-1 to AMD.
The HD5830 vs the GTX460 pair is an interesting one. The HD5830 sits between the two for performance, but sits closer to the bigger 1GB 460 for power requirements. Ultimately though the HD5830 is as cheap as the 768MB 460, is marginally faster and has more memory. For the sake of the relatively minor power increase (after all, all three cards are dual 6-pin), the 5830 still seems attractive, while it lasts. 3-1.
The HD6850 vs the GTX460 1GB is no contest, sadly. The HD6850 uses so much less power it actually needs one less connector from the PSU. It's almost as cheap, despite the price increase, and considerably faster. 4-1.
The HD5850 and HD6870 vs the GTX470 is also no contest. The HD6870 is faster than the 470, the same price, and 75W more efficient. The HD5850 is similar to the 470 and also 75W more efficient, but $20 cheaper. 5-1.
The HD5870 vs GTX480 is a comparison I don't really like as I don't consider the cards true rivals. Yes, the GTX480 is faster, substantially so in the more biased arena, but in real terms it's 5-10% for an extra 90 watts and $120. That hurts. 6-1.
The HD5970 vs GTX580 is another comparison I'm not fond of as one GPU versus two is rather variable. Taking common scaling of 75% and ignoring the plethora of issues with dual graphics, the 5970 sits a good 15% above the 580, though it does use 25W less power, ish. For $30 less though, with the performance boosts you often see, I'd say for now at least, another win for AMD, assuming you can indeed get hold of a 5970 for the $500 figure (they are becoming EOL in the UK and cost a lot more. But why not just buy two HD6850s?) 7-1.
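The dual-GPU arithmetic in that last comparison can be sketched as follows. The 75% scaling figure is from the text; the ~155 single-GPU figure for the 5970's down-clocked 5870-class GPUs is an assumption for illustration, not from the chart:

```python
def dual_gpu(single_perf, scaling=0.75):
    """Two GPUs: the second contributes `scaling` of a single GPU's performance."""
    return single_perf * (1 + scaling)

# HD5970 ~= two slightly down-clocked HD5870-class GPUs (~155 each against
# the HD4870 baseline of 100 -- an assumed figure).
hd5970 = dual_gpu(155)          # 271.25, close to the chart's 270
gtx580 = 230                    # single GPU, from the chart
print(hd5970 / gtx580)          # ~1.18: "a good 15% above the 580"
```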

When you turn these cards from single to SLI/Crossfire pairs, it used to be such that nvidia gained more of an advantage with SLI scaling typically being a bit better than crossfire scaling. Now however, certainly with the HD6 series at least, the reverse is true, so on that note I see no reason to change any of these figures.
I won't refer to the older cards in crossfire as they are being phased out and there's no reason not to use the HD6s in CF instead.
I also won't refer to the smaller cards in crossfire/SLI as it's frankly pointless. One big GPU is still always a better buy than two small ones.

For now at least then, even if you ignore the scandals, the reliability issues and the general corporate behaviour of nvidia, I'd still be recommending AMD GPUs.
Who knows, this may change with the GTX560/570's release, we shall see.

Therefore, these are my recommendations:
$80: HD5670. PSU: 250W Stock or 380W Antec EarthWatts
$120: GTS450 or HD5770. PSU: 380W Antec EarthWatts
$160 (already own a PSU): GTX460 768MB or HD5830. PSU: 550W Corsair VX
$200 (do not own a PSU): HD6850. PSU: 380W Antec EarthWatts - the savings here mean these two options are about the same, hence the 6850 being the better option, assuming you don't already have a bigger unit.
$250: HD6870. PSU: 550W Corsair VX
$400: 2x HD6850. PSU: 550W Corsair VX
$500: 2x HD6870. PSU: 650W Corsair HX
$600+: Wait for the HD6900 series. If PC is needed now, see $500.

Note: The Corsair CX 430W has been stricken from the recommendations list upon closer inspection. It uses a low-tier OEM, rather than the high-end CWT or midrange Seasonic units Corsair have used previously, and accordingly it has low reliability ratings and only a 2-year warranty, indicative of potential issues.
The low-end unit of choice is now the 380W Antec EarthWatts Green.
The 400W version of the CX is a Seasonic unit and is no longer produced, though it is still on sale in Europe for now.



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13


harvrdguy
Senior Member
21. November 2010 @ 00:15
Quote:
ATI have the exact same software that nvidia have - individual adjustment of red/green/blue and hue/saturation
Wow! Well, if that is the case, then I happily am wrong - but I'll have to have you help me find it when I get another ati card installed. The last time I looked through catalyst, on my p4 which is now set up in the sunroom, looking for an equivalent to the nvidia wizard, I could not find anything like that! Is that something that they have always had, or did you see that come into being recently in the last year or so?

WELL THAT'S THE BEST NEWS I HAVE HEARD ALL DAY! THANKS!!

Haha - DXR quoted my whole post. That's funny somehow. The PBY Catalina - the A10 of WWII. Wait - the A10 is the Warthog, right? That Catalina actually reminded me more of the one that we got to shoot from in COD4 - the one that leans over a bit and flies in a giant circle. Google says it's the AC-130 gunship.

But I'll take your word for it, if you say it's closer to the Warthog. I'm surprised no other WWII game has featured that plane up to now, to my knowledge.

Originally posted by ddp:
DXR88, the catalinas were flying boats. the closest equivalent to an a10 for the allies would have been the RAF typhoons & tempests. the german equivalent would have been the tankbuster stuka mounting 2-37mm cannons.


Originally posted by dxr:
they were flying boats with enough fire power to sink a Civilian Class Ship. i'm pretty sure they wouldn't have a problem with flattening enemy Armour.

Originally posted by ddp:
i think you are talking about british coastal command's Short Sunderland flying boats, referred to by the german uboat crews as flying porcupines because of all the guns they carried.
Hey hey, I think ddp and dxr know their WWII. I am just amazed I never heard of these planes until the last couple of weeks playing World at War. I thought I was a WWII buff, but I guess not, lol.

Well, I'll happily fly one of those Short Sunderlands next game!

Speaking of WWII, did anybody else notice that Hulu.com has a free HD series, WWII in Color? I watched the first 4 or 5 I think, of about a dozen. I'll go back later and finish all of them. Some amazing footage, including footage from some of the WaW battles that I just finished - like the battle of Peleliu.

Originally posted by estuansis on BC/2:
Oh yeah it's a whole new ballgame when all 4 squadmates are in a group chat working out the strategy. We were dominating :)
You bast*rds - well I'll get dxr on my team, Capt Biggs, and then you and your 3 buddies, can "run run run far away" lol.

Just kidding Jeff. Congrats on all that gold - and for coming in at #1! Don't let it go to your head! You'll be out there on the firing range with your 100-round AK, blowing all the targets to smithereens!!

Group chat - on a squad level? Is that true private chat that the other guys can't listen to?

It used to bug me on Counter-strike, that they always had open chat, like how are you supposed to plot strategy - "they're going long A" - when the terrorists are hearing every word? Same thing on Day of Defeat. (Maybe I'm wrong - maybe the Day of Defeat chat is team based.)

Anyway, if it is true PRIVATE group chat, on a squad basis - that is awesome!

Nice article, Shaff, about Black Ops. I see the 5970 performed very well. And I noticed this: "In other words, if your graphics card was capable of playing World at War you will have no difficulty with Black Ops." Hey hey hey!!!!


Originally posted by Sam:
assuming you can indeed get hold of a 5970 for the $500 figure (they are becoming EOL in the UK and cost a lot more. But why not just buy two HD6850s?

Great idea! Maybe that's what I'll do, total wattage around 250, total cost around $400, total performance vs 4870, around 300%!
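A quick sanity check on those numbers (single-HD6850 figures from the chart earlier in the thread; the ~95% HD6-series crossfire scaling is an assumption):

```python
# Single HD6850 figures from the chart: 150% of an HD4870, 127 W, $200.
perf, watts, price = 150, 127, 200
cf_scaling = 0.95               # assumed HD6-series crossfire scaling

pair_perf = perf * (1 + cf_scaling)   # ~292 -> "around 300%"
pair_watts = watts * 2                # 254 W -> "around 250"
pair_price = price * 2                # $400
print(pair_perf, pair_watts, pair_price)
```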

Originally posted by sam:
$600+: Wait for the HD6900 series. If PC is needed now, see $500.
Ok, on second thought I'll wait. Regarding your power supply reqts. I'll have to settle for something that works with this toughpower 750 - I hope the 6900s aren't power hogs.

Rich
AfterDawn Addict
21. November 2010 @ 06:59
I don't know if they've always had it, but it's been there for at least the last 3 years or so.


Note the color section above, which is where the individual color adjustment is.
Now this is the old style catalyst control center which hasn't existed for about a year already, and given the age of the card in this picture, this is probably taken in 2007.

In counter-strike rich, alltalk is a server option that can be on or off. If you know the admin, you can request it be changed from one state to another :P

Quote:
In other words, if your graphics card was capable of playing World at War you will have no difficulty with Black Ops.

I will hasten to add, this is because BlackOps looks no better than World at War.

The HD6900s will be big GPUs, well over 200W each. However, a decent 750W unit should handle two of them ok, as long as you have enough power connectors. It's entirely possible that 6970s will use a 6+8-pin connector, so you may need two of each.
My system draws around 700-720W with two 4870X2s rated at 286W each. The HD6970s are likely to be something like 220-230W each I suspect, so you shouldn't exceed around 650W full load with two 6970s.
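That estimate is simple subtraction; a sketch using the figures from this post (the midpoints are a choice made here for illustration):

```python
measured_total = 710            # midpoint of the observed 700-720 W
tdp_4870x2 = 286                # rated TDP per card
rest_of_system = measured_total - 2 * tdp_4870x2   # ~138 W for CPU, board, drives

est_6970 = 225                  # midpoint of the guessed 220-230 W per card
est_total = rest_of_system + 2 * est_6970
print(est_total)                # 588 W, comfortably under the ~650 W ceiling
```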



Senior Member
21. November 2010 @ 16:11
Originally posted by sammorris:
I will hasten to add, this is because BlackOps looks no better than World at War.
Black Ops uses the WaW engine. The reason the textures look worse is the increase in map size - larger maps simply eat up more video memory.



Red_Maw
Senior Member
21. November 2010 @ 16:28
Rich, for private chat in BC2 I usually use Skype. It's especially nice with friends since the connection isn't server dependent.


harvrdguy
Senior Member
22. November 2010 @ 00:20
Originally posted by sam:
Note the color section above, which is where the individual color adjustment is.


Ok, so you're pointing up to the color area. Let me create a graphic so I make sure I understand.

Is this the place you're talking about?





And about WaW:

Originally posted by sam:


Quote:
In other words, if your graphics card was capable of playing World at War you will have no difficulty with Black Ops.


I will hasten to add, this is because BlackOps looks no better than World at War.

Yeah, but "no better than" to me means, no better than the taste of cherry pie!

I loved the way WaW looked - best graphics I have ever experienced!!!

Blame it on my noobness coming off the 6000 points p4 to the 13000 points 9450 w 8800GTX, running 2560x1600 with 4XAA - IT LOOKED GORGEOUS! I've been saying it was the nvidia gamma wizard, but maybe it was THAT, yes, plus full pixel matching and 4xAA. For example, I tried running Far Cry 2, and I played it for a while, but when I went back to it after finishing WaW, I saw that I couldn't turn on AA at all, since it dropped me down to 24 or 25fps. Without AA, the game has major aliasing. So I just made a decision that I won't try to play that game until I get a better graphics card.

So anyway, I did fine with WaW at about 33 fps. Occasionally when I threw smoke, then the fps dropped down to the teens - but that was rare. I only had 4 smoke grenades, lol.

So I am sure I'll be happy with BO - I'm watching it right now on ebay to see what it sells at - I don't feel like forking out $60 if $30 will do.

What about Modern Warfare 2 - how demanding is that compared to WaW and Black Ops?

Will I be able to play it right now? I'll be playing through Modern Warfare 1 pretty soon again, with this faster rig, but I can probably pick up MW2 for maybe $25 or so since it's a year old already, if you think I can play it full 2560x1600 with 4xAA, and get 33 fps - in other words about the same stress level as WaW. Or is it a more demanding game?

Originally posted by sam:
The HD6900s will be big GPUs, well over 200W each. However, a decent 750W unit should handle two of them ok, as long as you have enough power connectors. It's entirely possible that 6970s will use a 6+8-pin connector, so you may need two of each.
My system draws around 700-720W with two 4870X2s rated at 286W each. The HD6970s are likely to be something like 220-230W each I suspect, so you shouldn't exceed around 650W full load with two 6970s.

EXCELLENT!
You quoted the 6850 at 150% of the 4870, and the 6870 at 170%. What are you guessing for the 6970, at the least, over 200% right? So each 6970 should match a 4870x2, is that correct? So two of them would match your current graphics power, with 2 x 4870x2.

Where do you think the dual gpu 6990 would come in, performance-wise, and power-consumption wise - just a guess is all I'm asking.

I looked at the current newegg reviews of the overclocked 4gb XFX 850mhz 5970 for $1200, and some of the guys said they were running 2560x1600 on crysis, highest settings, with good frame rates. Were they exaggerating, or maybe running high, but not ultra high?



Originally posted by dxr:
Black Ops uses the WaW engine. The reason the textures look worse is the increase in map size - larger maps simply eat up more video memory.

Oh, so worse textures. Hmmmm. Well, that makes sense - yes it's another Treyarch game. Ok, thanks for the heads up - I won't expect it to be quite as pretty as WaW was.

Originally posted by red:
Rich, for private chat in BC2 I usually use skype. It's especially nice with friends since the connection isn't server dependent
Skype! Now that makes sense!

I have never used skype, so maybe a little bit of research is needed, not to mention downloading the application and client. Sounds like a good idea.

Rich
AfterDawn Addict
22. November 2010 @ 06:17
Yes, that's the area I mean for colour.
As for WAW, I think there are plenty of games that have better graphics, but in terms of looks vs. performance the COD engine is still comparatively efficient, so on your current hardware, WAW is perhaps as good looking as games will get, as better looking games won't run smoothly.
MW2 and BO are both largely the same for performance.
The HD6970 is estimated at around 250-260%, so well above the performance level of a 4870X2, even with 100% scaling.
Given that 2 GPUs scale better than 4, there should be at least a 40-50% performance improvement going from two 4870X2s to two 6970s, even in games that worked well with Quad CF.
I did include the 6990 in one of my charts, perhaps you didn't see it. The HD6970 I estimate at 330% (based on 95% scaling on two 6870s, as that's what the 6990 will be based on) at 295W.
The problem with Crysis as a benchmark is that people are clueless about how to test it. An HD5970 probably could get a reasonable 25-30fps in the beginning section of Crysis on maximum detail, but as soon as you progress through the game that would drop to about 10.
HD5970s are still very powerful cards, but they aren't enough to take on Crysis at 30" resolution.
In one of the more demanding areas of the game where the bit-tech benchmark is run, an HD5970 manages a minimum fps of 18 and an average of 31 with AA disabled. With 4xAA enabled [though AA is pretty useless in Crysis, it only really works in Warhead] that drops to a minimum of 9fps and an average of 24. The minimum fps tanks because the cards run out of video memory at this setting; it requires around 1200-1300MB of video memory to run Crysis at this detail level.
Fortunately that's something the HD6900 series will be taking care of from launch, at long last.



harvrdguy
Senior Member
22. November 2010 @ 21:24
Quote:
As for WAW, I think there are plenty of games that have better graphics, but in terms of looks vs. performance the COD engine is still comparatively efficient, so on your current hardware, WAW is perhaps as good looking as games will get, as better looking games won't run smoothly.
MW2 and BO are both largely the same for performance.


Okay, that makes sense. Then I'll go try to find a deal on both of those COD games - great!!

Quote:
I did include the 6990 in one of my charts, perhaps you didn't see it. The HD6970 I estimate at 330% (based on 95% scaling on two 6870s, as that's what the 6990 will be based on) at 295W.


In the above quote you put HD6970 but I assume by context that was a typo and you meant 6990. It will be interesting to see where it is priced. Two of them would push past my toughpower 750, sounds like.

If I threw this mobo in another case with a bigger PSU, and adequate ventilation, given that my P5E x38 provides a full 16 lanes for each crossfire board, with three slots between boards, do you think the mobo and q9450 (overclocked) with 4 gigs DDR2 would work, or would it really make more sense to budget another $1000 for X58 i5 or i7 and DDR3?

Shaff's recent link to that techspot article on BO performance, showing how dependent the frame rates were not just on the gpu, but also on the cpu, was very interesting.

By the way, if one has two 6990s, and Crysis only supports 3 gpus, are you able to turn off crossfire on one of the cards?

So let me ask you Sam, a few pointed questions about the 6990. [i](The next part reminds me of that 50-Cent song, 21 questions.)[/i]

Are you considering upgrading to two 6990s, depending on where they are priced?

How much memory do you think they will come with to support 30" gameplay?

Would that put you at 50% more performance than what you have now, plus the graphics memory to support 30" play?

Considering that only 3 of the 6870 gpus would be enabled, with the extra memory would you think that a dual 6990 setup would allow you finally to run Crysis maxed out - or are you going to have to wait for the 7000 family?

And thanks for that explanation of why some of the guys thought Crysis was working well on that overclocked 5970 at 30." They probably wrote those comments in the early part of the game.

Rich
AfterDawn Addict
23. November 2010 @ 09:27
Yeah sorry it was a typo.
I wouldn't use two 6990s on a 750W unit, that is pushing it. Two 6970s should be OK though.
To be honest, a Q9450 would be a fair bottleneck for two 6970s, let alone two 6990s. If you go the quad graphics route, a new CPU is fairly mandatory, and it needs to be overclocked as well.
You can't select which GPUs you want to use in crossfire. All you can do is disable catalyst A.I. which shuts down crossfire completely [but does not necessarily fix all the problems with having it] or disable the physical crossfire connection between the dual cards, leaving you with just two GPUs.

I have absolutely no interest in using two 6990s, as I foresee it being slower than two 6970s in most games, due to how much slower the 6990s are per GPU. There is absolutely no evidence so far to show that the increased crossfire scaling with the HD6800 series affects systems with more than 2 GPUs.
Did you spot that the HD6850 and HD6870 only have one crossfire connector on them, so you can only use two cards maximum?

From what I can gather at the moment, the HD6970 and HD6990 should both be 2GB per GPU cards, i.e. 2GB and 4GB respectively, but that's all speculation at this point.
The benefits of using two HD6970s would depend on scaling.
Let's take some fairly typical scenarios:
HD4870X2 at 0%: 100%
HD6970 at 0%: 260%
HD6990 at 0%: 170%
I gain 160% here, but only 70% with a 6990.

HD4870X2 at 70% (Dual): 170%
HD4870X2 at 120% (Quad): 220%
HD6970CF at 95%: 510%
HD6990 at 95%: 330%
HD6990CF at 120% (Quad): 375%
I gain 130% here using two 6970s, but only 70% with two 6990s.

HD4870X2 at 100% (Dual): 200%
HD4870X2 at 300% (Quad): 400%
HD6970CF at 100%: 520%
HD6990CF at 100%: 660%
In the absolute best case scenario, I gain 30% with two 6970s, but 65% with two 6990s.
This is relatively rare though.
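To decode the chart: every figure is relative to a single HD4870 (= 100), "at X%" is the crossfire scaling assumed, and each "gain" is the percentage increase over the current two-4870X2 setup. A small sketch of that arithmetic, using percentages from the scenarios above:

```python
def gain(new, current):
    """Percentage performance increase going from `current` to `new`."""
    return round((new / current - 1) * 100)

# Zero-scaling scenario: only one GPU does any work.
print(gain(260, 100))   # one HD6970 GPU vs one 4870X2 GPU: +160%
print(gain(170, 100))   # one HD6990 GPU: +70%

# Typical-scaling scenario: quad 4870X2 = 220, 6970 CF = 510, 6990 CF = 375.
print(gain(510, 220))   # ~+130% with two 6970s
print(gain(375, 220))   # ~+70% with two 6990s
```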

I think Crysis will be playable for a reasonable part of its length on two HD6970s at max, but I can see AA having to be disabled once you reach the alien caves, and it's still going to be largely unplayable without cutting a few detail levels once you reach the 'Reckoning' chapter.
For a fluid experience throughout, I think we'll be waiting until at least the HD7 series, if not further. You have to remember that the frame rate in that final section quarters - i.e. you'd need four times the graphics power to achieve the same frame rate in it as previous parts of the game.

Crysis Warhead may end up being easier to max first with it supporting 4 GPUs. I don't know if it has any ludicrous sections like Reckoning as I never finished the game.



AfterDawn Addict
23. November 2010 @ 11:04
but doesn't crysis use motion blur pretty effectively, so that lower framerates don't seem that bad?

PS your chart confuses me haha.



MGR (Micro Gaming Rig) .|. Intel Q6600 @ 3.45GHz .|. Asus P35 P5K-E/WiFi .|. 4GB 1066MHz Geil Black Dragon RAM .|. Samsung F60 SSD .|. Corsair H50-1 Cooler .|. Sapphire 4870 512MB .|. Lian Li PC-A70B .|. Be Quiet P7 Dark Power Pro 850W PSU .|. 24" 1920x1200 DGM (MVA Panel) .|. 24" 1920x1080 Dell (TN Panel) .|.
AfterDawn Addict
23. November 2010 @ 11:26
All the percentage figures are relative performance to a single 4870, which is an easy baseline to use.

Crysis does use motion blur to the extent that lower frame rates are acceptable. However, the game is so demanding, and thus frame rates so low, that the realm of playability is still extremely difficult to reach. My statement above does consider this: whereas I would normally advocate a minimum of at least 40-50fps, preferably 60, for a first person shooter, in Crysis a minimum of 25-30 is acceptable. This, however, is no mean feat. On a 30" monitor [let alone eyefinity] it takes two 2GB HD5870s to reach a minimum of just 15. Two 6970s are therefore going to be looking at around 21-22fps minimum, which is borderline enough for the area of the game bit-tech use. That is admittedly one of the more demanding levels, but it is certainly not equivalent to the last hour or so of gameplay, which is much more demanding still.
For that reason, Crysis will still not be playable at max settings on two 6970s.
However, with AA disabled (AA is really poor in the original Crysis anyway), 1GB per GPU will suffice, at least at 2560x1600, and the minimum fps achievable with two 6970s will be about 29-30 for that section. It should therefore be relatively smooth up until said final chapter, which is difficult to judge, but I expect it will mean either suffering 30 minutes or so of poor frame rates, or lowering detail levels.
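The 21-22fps figure appears to come from scaling the bit-tech 5870 result by relative GPU performance; a one-line sketch (180 is the HD5870 figure from the chart, 260 the HD6970 estimate used elsewhere in the thread):

```python
min_fps_5870_cf = 15                            # two 2GB HD5870s, bit-tech minimum
est_min_6970_cf = min_fps_5870_cf * 260 / 180   # scale by relative performance
print(round(est_min_6970_cf, 1))                # 21.7 -> "around 21-22fps minimum"
```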

It has been too long since I've seen a competent website bench Warhead, so I couldn't offer any estimates for that unfortunately.



Senior Member
23. November 2010 @ 13:14
Warhead's last mission... forgot what it's called... isn't as demanding as the ship mission in the first game. Basically you're taking on the same type of boss that attacked the ship, only on an airstrip.

It's in the open, the explosions are spread out, and there is no water pouring into the airstrip. It's a bit more demanding than the rest of the game due to the shadowing from all the machines blotting out the sun, and of course the big bad boss. Other than that it plays as well as the rest of Warhead.


AfterDawn Addict
23. November 2010 @ 13:32
So it is somewhat more demanding then? Not a particularly good sign :P



Senior Member
23. November 2010 @ 13:38
it helps that all the action isn't crammed up in a boat that's sparking, exploding, soldiers dying, head shaking, past the tac cannon that would have made my life easier... oh, did I mention it's taking on water as well?


ddp
Moderator
23. November 2010 @ 13:59
blub, blub, blub!!!
harvrdguy
Senior Member
23. November 2010 @ 17:48
Very interesting analysis. So in most cases, while scaling with two gpus is pretty good these days, scaling with quad cf is not so good on most games. Additionally, two gpus on a single card are usually throttled back to keep power down, so that is one more factor that reduces the attractiveness of 4 gpus, versus cf using two single cards like the 6970.

So, depending on price, it sounds to me like you may well be stepping up to two 6970s I take it - for what you estimate to be a 40-50% gain over your 2 X 4870x2.

But I hear you that Crysis will certainly not be "raped" with that kind of a system. It's shocking that frame rates actually quarter in one part of the game versus another. Not only that, but as dxr and ddp point out - "blub blub blub" - you are simultaneously trying to manage the very low frame rates and not drown at the same time, lol.

Still, kind of like how you never finished Warhead, with two 6970s, it will be fun to run around in the jungle in the earlier parts of the game at full 30" and maxed out, and enjoy all the special effects that so many have raved about, even if one can't really finish the game in comfort until the 7000 family arrives. That's an amazing game - launched in the 3000 family, and unplayable until the 7000 family - hahahaha.

By the way, would I be missing a lot playing the earlier parts of the game on xp? The q9450 rig has 4gb, and maxes at 8gb. I have a beta copy of vista that I never installed, and similarly a beta of windows 7. Would I want to skip vista but install windows 7 on an available partition? One of your posts talked about 12gb on vista, and 8 gb on windows 7. Would I need 8 gb to enjoy the earlier parts of the game 2560x1600 fully maxed, on dual 6970s and windows 7, and Q9450?

And if you already quoted it somewhere, I missed it - what do you estimate will be the power reqts. of the 2 gb 6970? I think you mentioned that my toughpower 750 should be able to support two of them, but did you say it was something like around 230 watts each?

Rich
AfterDawn Addict
23. November 2010 @ 18:49
Well, they aren't necessarily throttled back (the HD5970 was the first case of that), but you no longer see two of the very best GPUs on a single reference card because frankly, if you do, the architecture's not efficient.
Using dual graphics or more is always a 'last resort', so to speak, for more performance: it's obviously less efficient than having one GPU that can attain that level of performance, as dual graphics scaling is never guaranteed.
Using 4 GPUs is more of a problem as in the vast majority of systems it requires using dual cards, so the above applies automatically. Secondly, Quad CF is still much more of an infant technology than dual card CF. Crossfire has been around since the Radeon X series in 2005/2006, whereas QuadCF has only been available since the HD3800 series at the beginning of 2008. The methods by which crossfire works also make the use of more than 2 GPUs inherently more problematic.

The ideal-world solution is to have the fastest single GPU and run four of them, as you can arguably never lose: turning Crossfire off eliminates the interlink entirely, so you never get any CF-related bugs, you get the highest possible performance in non-CF environments, and the highest possible performance when Crossfire scales as well.
The issue with the modern generation is that this requires a motherboard with four slots at single spacing or better, plus, usually, a power supply with eight 6-pin connectors or more. On top of effectively meaning two PSUs and a very large case for cooling, it costs an absolute fortune.

The reason I ran two 4870X2s back then was that, apart from not being able to totally isolate Crossfire, there were no downsides to using two of the dual card: GTX280 notwithstanding, the 4870X2 used the most powerful GPU out there (and the GTX200 series did not support more than 3 GPUs until the release of the GTX295, a card plagued by insufficient memory to run properly at 2560x1600). With a relatively excusable combined TDP of 572W, an 8+8+6+6 power configuration that would run on any good 850W unit, and only two PCIe slots required, it was relatively straightforward to set up - at least with a motherboard that could handle the stress on the PCIe bus.
It also didn't cost very much, by comparison at least. The RRP for the pair was £700 at the time, but I got them for £540 posted.
Right now, four GTX580s cost an outlandish £1560, nearly three times what I paid for 2008's top combination.
One can argue that crossfire was inferior to SLI at the time I bought the cards, but that's for another debate :P

As it happens Crysis Warhead was released a month after the 4870X2, but you weren't far off :P

You can't use more than 4GB of combined memory in Windows XP, so if your graphics card has 1GB, that's 3GB maximum of system RAM. Considering even an unrestricted 4GB of system RAM is woefully inadequate for Warhead, I'd strongly recommend against playing it until you have a 64-bit version of Windows 7 installed (and, at least if you're maxing it out, a card with 1.5GB of video memory or more).
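The 3GB figure follows from the rule of thumb in the post: 32-bit XP has a 4GB total address space, and hardware mappings like video memory come out of it. A minimal sketch of that budget (simplified - real mappings include more than just VRAM):

```python
# 32-bit Windows XP: one 4GB address space shared between system RAM and
# memory-mapped hardware. More VRAM = less addressable system RAM.
ADDRESS_SPACE_GB = 4.0      # 32-bit addressing limit
vram_gb = 1.0               # graphics card memory mapped into the space

usable_system_ram_gb = ADDRESS_SPACE_GB - vram_gb
print(usable_system_ram_gb)  # -> 3.0
```

With a hypothetical 1.5GB card the same budget leaves only 2.5GB of system RAM, which is why big-memory cards and 32-bit XP are such a poor match.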

The HD6970 should be somewhere between 215 and 250W TDP. I'm guessing around the 225W mark of the GTX470/570.
A 750W PSU will run two, but it will be stressing the unit quite hard.
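A rough power budget shows why two cards would stress a 750W unit. The GPU figure is from the estimate above; the rest-of-system draw is an assumption for illustration only:

```python
# Rough PSU budget for two ~225W cards on a 750W unit.
psu_watts = 750
gpu_tdp = 225              # estimated HD6970 TDP (from the post)
rest_of_system = 200       # CPU, board, drives, fans - assumed figure

total_draw = 2 * gpu_tdp + rest_of_system
headroom = psu_watts - total_draw
print(total_draw, headroom)  # -> 650 100
```

About 100W of headroom on a 750W unit means it runs, but the PSU spends its life near full load - exactly the "stressing it quite hard" scenario.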



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
AfterDawn Addict

15 product reviews
_
23. November 2010 @ 21:15 _ Link to this message    Send private message to this user   
For the record Crysis maxed is nothing for my rig anymore... like seriously.



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
AfterDawn Addict

4 product reviews
_
23. November 2010 @ 21:53 _ Link to this message    Send private message to this user   
1920x1200 is a fair bit easier than 2560x1600. Two 6850s at 1920x1200 will run games marginally better than two 6970s at 2560x1600, so that gives some indication.
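The pixel counts back this up - a quick sketch of the relative pixel load:

```python
# 2560x1600 pushes ~78% more pixels than 1920x1200, which is why the
# lower resolution is 'a fair bit easier' on the same graphics hardware.
low = 1920 * 1200
high = 2560 * 1600
print(low, high, round(high / low, 2))  # -> 2304000 4096000 1.78
```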
I'd like to see how you fare in Reckoning though. Play the chapter through without AA, then play it through at 4xAA - no edited .inis, just the game's default Very High profile.



AfterDawn Addict

4 product reviews
_
23. November 2010 @ 22:13 _ Link to this message    Send private message to this user   
Is this your first post?



Senior Member

4 product reviews
_
23. November 2010 @ 22:17 _ Link to this message    Send private message to this user   
Originally posted by hahajohn:
I see.
What do you see?


Red_Maw
Senior Member
_
23. November 2010 @ 22:21 _ Link to this message    Send private message to this user   
Originally posted by DXR88:
Originally posted by hahajohn:
I see.
What do you see?
Incoming lightning most likely XD


This message has been edited since posting. Last time this message was edited on 23. November 2010 @ 22:22

ddp
Moderator
_
23. November 2010 @ 22:35 _ Link to this message    Send private message to this user   
i don't see him now.
AfterDawn Addict

4 product reviews
_
23. November 2010 @ 22:38 _ Link to this message    Send private message to this user   
IMO, AfterDawn should have a little 'show deleted posts' option (except in cases of profanity), as when reading back on instances like this it can get very confusing! :P



AfterDawn Addict

7 product reviews
_
23. November 2010 @ 22:54 _ Link to this message    Send private message to this user   
hahajohn? What kind of name is that for a spammer? Well, lightning took care of it LOL! ;)



To delete, or not to delete. THAT is the question!
 