|
The Official Graphics Card and PC gaming Thread
|
|
AfterDawn Addict
4 product reviews
|
2. April 2009 @ 08:43 |
Link to this message
|
15% VAT is included on all our items (except food and kids' clothes; it's 5% for utilities like gas and electricity), but some sites advertise ex-VAT and some don't. Either way, the after-VAT price ALWAYS has to be printed somewhere on the page. If you bought the 7800GTX when it first came out, I'm not surprised you paid that much; they used to cost that much in pounds here as well, back in the 2005 glory days... haha.
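For anyone juggling the two price formats, the conversion is trivial; a quick sketch (the 15% figure is the UK standard rate mentioned above, temporarily cut from 17.5% at the time; the function names are just mine):

```python
# UK standard VAT rate quoted above
VAT_RATE = 0.15

def add_vat(ex_vat_price):
    """After-VAT price a UK shopper actually pays at the till."""
    return ex_vat_price * (1 + VAT_RATE)

def strip_vat(inc_vat_price):
    """Recover the ex-VAT price from an advertised inclusive price."""
    return inc_vat_price / (1 + VAT_RATE)

# A card advertised at 200 ex-VAT really costs 230
print(round(add_vat(200.00), 2))   # 230.0
```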
This message has been edited since posting. Last time this message was edited on 2. April 2009 @ 08:44
|
|
|
|
AfterDawn Addict
4 product reviews
|
2. April 2009 @ 08:58 |
|
A proper benchmark for the HD4890 has been upped. I'll post some detailed results in this post later this afternoon. One of the things I notice is that in some games the step from HD4870 to HD4890 is utterly ineffectual (meaning the GTX275 wins), but in others I'm seeing bigger differences than preliminary reports indicated; for instance, the minimum frame rate in HAWX with the detail maxed is 27 for an HD4870 and 35 for an HD4890.
Also, this will please hardware geeks no end:
http://techreport.com/discussions.x/16670
OK, Here's the full scoop on the GTX260 vs HD4870 VS GTX275 vs HD4890, courtesy of DriverHeaven.net!
Minimum Frame Rate Graph:
http://img14.imageshack.us/img14/7636/minimum4890.gif
Average Frame Rate Graph:
http://img27.imageshack.us/img27/681/average4890.gif
Comments to note:
The HD4890 architecture does not change the fact that ATI cards perform poorly in Call of Duty: World at War compared to their nVidia rivals. What's interesting, though, is that they respond relatively well to clock speed increases, as the XFX card shows. Since the newer architecture only offers a 100MHz overclock over the 4870, less than the increase XFX added on top, the newer Radeon architecture must actually perform worse overall at this game. Perhaps the cards are being optimised in a way COD:WAW does not work well with. Crossfire performance for this game is, after all, mediocre at best, with numerous graphical glitches as well as unsatisfactory performance.
In GTA4, on average frame rate ATI sit marginally ahead, by around 2-3%. However, when the going gets tough, once again the Geforces win out, as we know already. On a more positive note, 19 to 22fps minimum is a fairly distinct improvement in minimum frame rate, even if it does mean you have to buy the overclocked card to reach it.
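That GTA4 minimum-frame-rate gain can be put in percentage terms with a one-liner (the 19 and 22fps figures are the ones above; expressing the jump as a simple ratio is my choice):

```python
def pct_gain(old_fps, new_fps):
    """Percentage improvement from one frame rate to another."""
    return round((new_fps / old_fps - 1) * 100, 1)

# HD4870 -> overclocked HD4890 minimum fps in GTA4, as quoted above
print(pct_gain(19, 22))  # 15.8
```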
Left 4 Dead mirrors the same problems COD:WAW encounters; the performance gain from going 4870 to 4890 is negligible. The HD4890 competes with the GTX275 solely on the basis that the HD4870 solidly beats the GTX260 at this game. Again, some overclocking does as much good as the change in card. Intriguingly though, none of these results are what you would consider unplayable. There are far smoother experiences to be had with dual or quad graphics, but it shows that if you want to play Left 4 Dead on high with a 30" monitor and only one GPU, you can.
Empire: Total War carries an SSAO feature which at the time of the test does not function in DirectX9 with ATI cards, so it won't be used for the comparison. For reference though, with it enabled the GTX260 drops from 22-28 to 12-16, and the GTX275 from 29-33 to 16-20. A shocking decrease in performance, it must be said. Perhaps this is one for the SLI configs and, with luck in future drivers, Crossfire as well. Overall, what's most important in this game is not how the ATI vs nVidia comparisons run at each end, but the huge gap between the GTX260 and GTX275: nearly 20% at average, and nearly 25% at minimum. The HD4870 is definitely a better choice than the GTX260 here, but the GTX275 can hold its own against the HD4890 rather well - higher minimum, lower average, as we see so often from nVidia (realistically this is what I prefer to see in games). It's worth noting that at this resolution performance is not exactly stellar on these cards. To run Empire on a 30" screen with AA, dual graphics are mandatory.
World in Conflict: Soviet Assault is an interesting title. At 1920x1200 the HD4870 and GTX260 are on almost dead even ground, and the GTX275 and HD4890 aren't far apart either. Up the res and the situation remains the same for the 4870 and GTX260, but the GTX275 pulls ahead of the HD4890, being a better match for the XFX overclocked card than for the stock product. Overall, advantage nVidia, but not by very much.
The trend continues with HAWX: ATI have a relatively significant lead in average frame rates, but the minimum frame rates are still very much in nVidia's favour, especially with the older generation hardware. In addition to these test results, HAWX offers a 'Very High' SSAO mode utilising DirectX 10.1. Since nVidia do not produce DX10.1 cards, this is strictly ATI territory. Performance drops from 27-53 to 27-46 for the HD4870, 35-57 to 35-52 for the HD4890, and 35-61 to 37-54 for the overclocked HD4890. On the whole then, no impact on minimum performance, but a slight drop in average frame rates. Interestingly, the XFX HD4890 card is bundled with this game. A truly fitting demonstration of the card's performance.
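The cost of that DX10.1 'Very High' SSAO mode is easiest to see as a percentage of average frame rate; a quick sketch using the figures above (which numbers a review would itself weight is an assumption on my part):

```python
def pct_drop(before_fps, after_fps):
    """Percentage of frame rate lost when a feature is enabled."""
    return round((1 - after_fps / before_fps) * 100, 1)

# Average fps before/after enabling Very High SSAO, per the figures above
print(pct_drop(53, 46))  # 13.2 - HD4870
print(pct_drop(57, 52))  # 8.8  - HD4890
print(pct_drop(61, 54))  # 11.5 - overclocked HD4890
```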
Lastly, the review features a Crossfire section. It is not made entirely clear whether the HD4890 has been combined with the HD4870 or whether it's two HD4890s, but I'm almost certain it is two HD4890s, as three different brands were on test for the review. It is assumed they are using the two stock-clocked cards for these tests. Lacking two GTX275s on hand, they have no SLI results as of yet; we'll have to wait until another site like Bit-Tech posts a review. However, since we all know at aD that SLI is only worth bothering with for i7, there's no huge benefit to having them here yet anyway, other than for interest purposes.
HD4890 CF results as follows:
Crysis Warhead 1680x1050 Enthusiast
HD4870: 14-21
HD4890: 15-25
HD4890CF: 26-41, but with display corruption, expected to be fixed in Catalyst 9.4. 65% scaling is relatively impressive though, given the difficult circumstances.
Lost Planet Colonies 1920x1200 4xAA
HD4870: 16-28
HD4890: 16-31
HD4890CF: 32-58
About as close to 100% scaling as you get, but notably the performance improvement between the generations is not enough to raise the minimum frame rate by the 6% required to see 17 on a single card.
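For what it's worth, both scaling figures fall out of a simple ratio against the single-card numbers above (exactly how DriverHeaven computed their 65% is an assumption):

```python
def scaling_pct(single_fps, dual_fps):
    """Extra performance the second GPU adds, as a % of one card."""
    return round((dual_fps / single_fps - 1) * 100)

# Crysis Warhead averages: HD4890 25fps -> HD4890 CF 41fps
print(scaling_pct(25, 41))  # 64, in line with the ~65% quoted

# Lost Planet Colonies: minimum 16 -> 32, average 31 -> 58
print(scaling_pct(16, 32))  # 100, perfect scaling on the minimum
print(scaling_pct(31, 58))  # 87 on the average
```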
As per usual, ATI are ahead in HD video playback performance, but by now the difference is never going to be enough to save you many watts or cure any lag; there shouldn't be any with either card.
The one difference is that CUDA saves a lot of processor activity when playing MKVs, so HD video users who have the CoreCodec will see benefits there, but only if they have a weak CPU: even though the nVidia setup has less than half the CPU usage of the ATI setup in this instance, the maximum CPU usage ever reached was 10.5% in the latter case.
My final word on the speed of the cards
1st. XFX Radeon HD4890 XXX
2nd. GeForce GTX 275
3rd. Radeon HD4890
4th. Radeon HD4870
5th. GeForce GTX 260
This message has been edited since posting. Last time this message was edited on 2. April 2009 @ 11:11
|
AfterDawn Addict
4 product reviews
|
2. April 2009 @ 15:23 |
|
After all that work, the HardOCP benchmarks completely conflict with what I just wrote. How fortunate.... :S
Check these results out:
Main chart details maximum playable settings. Comparisons shown underneath main chart.
(Minimum / Average fps)
Crysis Warhead
HD4870: 1920x1200 Gamer 1/29
GTX275: 1920x1200 Gamer (Enthusiast Water & Textures) 7/29
HD4890: 1920x1200 Gamer (Enthusiast Water & Textures) 8/32
GTX280: 1920x1200 Gamer (Enthusiast Textures, Water, Volumetric Lighting, PostProcessing) 7/29
GTX280: 1920x1200 Gamer 2xAA 16/25
GTX275: 1920x1200 Gamer 2xAA 17/26
HD4890: 1920x1200 Gamer 2xAA 17/28
HD4890 OCMax: 1920x1200 Gamer 2xAA 20/32
Fallout 3 Max
HD4870: 2560x1600 4xAA 27/44
GTX275: 2560x1600 4xAA 34/49
GTX280: 2560x1600 8xAA 28/44
HD4890: 2560x1600 4xAA 33/50
HD4890: 2560x1600 8xAA 28/48
HD4890 OCMax: 2560x1600 8xAA 33/53
Far Cry 2 Max
HD4870: 2560x1600 30/40
HD4890: 2560x1600 2xAA 30/41
GTX275: 2560x1600 4xAA 34/41
GTX280: 2560x1600 8xAA 32/44
HD4870: 1920x1200 4xAA 32/46
HD4890: 1920x1200 4xAA 37/49
HD4890: 2560x1600 4xAA 13/35
GTX275: 2560x1600 4xAA 15/38 (Test Variance due to Game bug)
HD4890 OCMax: 2560x1600 4xAA 21/40
FEAR 2 Max
HD4870: 2560x1600 4xAA 29/50
GTX275: 2560x1600 8xAA 23/48
HD4890: 2560x1600 8xAA 29/52
GTX280: 2560x1600 8xAA 25/47
HD4870: 2560x1600 4xAA 29/50
GTX280: 2560x1600 4xAA 30/57
HD4890: 2560x1600 4xAA 33/55
HD4890: 2560x1600 8xAA 25/52
GTX280: 2560x1600 8xAA 12/37 (Test Variance to note difference in other game areas)
HD4890: 2560x1600 8xAA 15/37 (Test Variance to note most demanding section)
HD4890 OCMax: 2560x1600 8xAA 25/42
This message has been edited since posting. Last time this message was edited on 2. April 2009 @ 15:24
|
AfterDawn Addict
15 product reviews
|
2. April 2009 @ 16:33 |
|
I remember paying $550 USD for my 8800GTX back in the day ;P
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
|
AfterDawn Addict
7 product reviews
|
2. April 2009 @ 16:56 |
|
Wow guys! Is bleeding-edge performance really worth that price? I guess to each his/her own. I'm not a huge gamer. However, certain games have tried pushing me in that direction, LOL! Grand Theft Auto, for instance. Quite possibly the best game I've ever played. I must admit though, The Legend of Zelda: Ocarina of Time was the big one in its day. I still like it a lot to this very day! However, I've never played Far Cry, Crysis, etc. Heck, I've only played Gears of War briefly. I was quite impressed with that one! Probably the sharpest thing I've seen on my screen.
What would a person like me need to be wowed, and stay CLOSE to the times for at least a year? What I mean is: play any game released in the next year with an agreeable picture and little to no stutter, without spending more than $150. Is that possible? Keep in mind what I'm running now, as well as you guys being substantially more critical than myself :)
To delete, or not to delete. THAT is the question!
|
AfterDawn Addict
4 product reviews
|
2. April 2009 @ 17:10 |
|
You can get an HD4870 for about that these days in the US (or do you mean in your own currency?) and that falls into that category perfectly.
|
AfterDawn Addict
7 product reviews
|
2. April 2009 @ 17:17 |
|
Ok, now how about a comparison. I've been reading a little about PhysX, and from what I've read, it's mainly Nvidia-based.
Let's just say I find PhysX intriguing! Now, what do you think between the two of these? I know, I know, I'm a cheap SOB, right? LOL! As I said, I'm not TOO picky. There's totally nothing wrong with your being picky though. I'm picky about other things :D
Will either of these cards get me by for a year or 2, and be substantially better than the 8600GT? And how much better?
http://www.newegg.com/Product/Productcom...N82E16814130445
|
AfterDawn Addict
4 product reviews
|
2. April 2009 @ 17:22 |
|
You can ignore PhysX really; it's just a fad that only really works in Mirror's Edge, and even then with only minor benefits...
The HD4830 is miles better than the 9600GT.
|
AfterDawn Addict
7 product reviews
|
2. April 2009 @ 17:28 |
|
Originally posted by sammorris: You can ignore the PhysX really, it's just a fad that only really works in Mirrors Edge, and even then only minor benefits...
The HD4830 is miles better than the 9600GT.
What makes you come to that conclusion? FORGET the PhysX! I read a similar comment.
|
AfterDawn Addict
4 product reviews
|
2. April 2009 @ 17:29 |
|
Because the HD4830 is the competitor to the 9800GT, not the 9600GT. The HD4830 is just a much faster card than the 9600GT.
|
AfterDawn Addict
7 product reviews
|
2. April 2009 @ 17:40 |
|
Something tells me I'm just gonna be better off spending a little more money! Here's an interesting question: how much better, in your opinion, is the 4870 compared to the 4830? Disregarding the X2 model, obviously, LOL!
To the naked eye, KEEP IN MIND
This message has been edited since posting. Last time this message was edited on 2. April 2009 @ 17:41
|
AfterDawn Addict
4 product reviews
|
2. April 2009 @ 17:40 |
|
Maybe 50% faster...
|
AfterDawn Addict
7 product reviews
|
2. April 2009 @ 17:48 |
|
Hmmm... that's rather considerable! I'm gonna grit my teeth on this one and just do it here in a week or so. But one last question. LOL, I think I've asked this one once before, not certain. One more time: is the 1GB version worth the few extra pennies? Is it really only beneficial for dual monitors?
http://www.newegg.com/Product/Productcom...N82E16814102810
|
AfterDawn Addict
4 product reviews
|
2. April 2009 @ 17:50 |
|
Not really for dual monitors; it's beneficial if you use anti-aliasing, though only at 1920x1200 and above.
|
AfterDawn Addict
7 product reviews
|
2. April 2009 @ 17:54 |
|
My monitor's max is 1920x1200 (Samsung 2433BW). Do you think it's worth it?
|
AfterDawn Addict
4 product reviews
|
2. April 2009 @ 18:13 |
|
Depending on what games you play, probably yes.
|
Member
|
2. April 2009 @ 20:35 |
|
edit: nothing to see here, carry on.
Gigabyte 3d Aurora 570 | Coolermaster Real Power M1000 | Core i7 920 @ 4.0ghz w/ Coolermaster V10 | DFI LanParty DK-X58-T3EH6 | Evga GTX275 SLI @ FTW clocks | 3x2GB Patriot 1600mhz | Seagate 500GB x2, 1.5TB, FreeAgentPro 500GB | Dell 2408WFP | Windows 7 Professional x64
This message has been edited since posting. Last time this message was edited on 3. April 2009 @ 00:57
|
AfterDawn Addict
15 product reviews
|
2. April 2009 @ 22:48 |
|
LOL @ your perceived hate of the 9600GT. You don't like that card, Sam XD
The 9600GT is fast as hell, but not really for 1920x1200. You'll be a lot happier with, say, the 4870 or, if you can stretch to it, the 4890. Again, the 4830 is lacking for high-res gaming and will struggle a bit with newer games at 1920.
|
AfterDawn Addict
7 product reviews
|
3. April 2009 @ 00:35 |
|
Ok, let's say I do buy the 4870. How long do you think it will last me? Ahh heck! They could release a game next week that would only run on the 4890, for all we know, LOL! Makes the world go round, right? But nah, seriously. You think I'd be happy for AT LEAST 2 years?
|
Member
|
3. April 2009 @ 00:56 |
|
You'll be fine with a 4870 for a while; if you've lasted this long with an 8600GT then I'd say you can't go wrong. It depends on how much the 4890 is, though - it looks like about 10% more performance, so if it's not too much more then I'd suggest stretching just that little bit extra for one of those instead. Either way, both of those cards would be a biblical improvement on an 8600GT at high res.
On another note, I've been trawling the internet all day and I still can't pick between Crossfire HD4890s and GTX275 SLI... which do you think would be better?
This message has been edited since posting. Last time this message was edited on 3. April 2009 @ 01:13
|
AfterDawn Addict
7 product reviews
|
3. April 2009 @ 01:14 |
|
I'm not positive, because I've only recently got sucked into the gaming side of things, LOL! But you might check whether your board supports BOTH Crossfire and SLI before you consider both. I know my last board didn't mesh well with ATI due to an nVidia northbridge! You might look into that first. But if you already know... I'm not sure which would be better.
Thanks for your opinion by the way :D
This message has been edited since posting. Last time this message was edited on 3. April 2009 @ 01:16
|
Member
|
3. April 2009 @ 01:24 |
|
No worries, champ.
Yeah, it supports both; I'm getting a new rig with a DFI X58 - as far as I know most X58 boards are SLI and Crossfire compatible. Might have to wait for sammorris to arrive, lol. He is quite the helper.
This message has been edited since posting. Last time this message was edited on 3. April 2009 @ 01:25
|
AfterDawn Addict
7 product reviews
|
3. April 2009 @ 01:45 |
|
Originally posted by rubixcube: No worries champ.
Yeah it supports both, getting a new rig with a dfi x58 - as far as I know most x58 boards are sli and crossfire compatible. Might have to wait for sammorris to arrive lol. He is quite the helper
Ehhh... he's OK! LOL *sarcastically speaking, of course* Totally jerking your chain, Sam! ROFL
|
AfterDawn Addict
4 product reviews
|
3. April 2009 @ 07:41 |
|
I don't hate the 9600GT at all, but when it's the same price as a far superior card, it isn't worth buying. If you were to go back and search for posts from me when it first came out, I was quite fond of it for being the first card to show near-100% SLI scaling, and it was better value back then too. nVidia just seem to be overpricing everything these days (except the GTX275).
For reference, the HD4870 is a full DOUBLE the performance of the 9600GT, and only about 40-50% more expensive, going by some of the cards I've seen.
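To make the value argument concrete: double the performance at roughly 1.45x the price (the midpoint of that 40-50% premium; the midpoint choice is mine) works out like this:

```python
def value_ratio(perf_multiple, price_multiple):
    """Relative performance-per-pound of one card versus another."""
    return round(perf_multiple / price_multiple, 2)

# HD4870 vs 9600GT: ~2x the performance at ~1.45x the price
print(value_ratio(2.0, 1.45))  # 1.38, i.e. ~38% more fps per pound
```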
Omegaman: they won't release a game that leaves the HD4870 out in the cold for a long time yet. The only 'you can or you can't' situation that exists right now is DirectX 10.1, as used in H.A.W.X., and both the 4870 and 4890 can pull that off; no Geforce card can yet. You should be fine for at least a year at full 1920x1200 on high settings, and in the second year you certainly won't have any trouble running games, but you may have to knock the detail or resolution back a bit.
Rubix: Until I see the GTX275 SLI benches in full I'm not sure, but the HD4890 CF combo is a force to be reckoned with, that much is certain.
As for SLI/CF cross-compatibility, you're essentially right: nForce chipsets for Socket 775 (and any other recent-gen socket) are strictly SLI-only (and also crap). Intel and AMD chipsets are strictly Crossfire-only, but obviously any of them will work with a single card from either manufacturer.
i7 boards can use EITHER Crossfire or SLI, but not both simultaneously, and I believe only boards with the nForce 200 bridge chip (which is where all the problems start) can run Triple-SLI, but don't quote me on that. Triple-SLI is so outrageously expensive I don't do much research on it...
|
|
|
|
Member
|
3. April 2009 @ 08:35 |
|
Well, I recently read an old review that showed Crossfire 3870s, and the average increase in FPS was 70% or so... which I believe at the time was much better scaling than SLI (the reference cards were 8800GTs). But that being said, the GTX275s look a little faster to begin with, so I'm still no closer to making a decision. Lol. Let me know when you find some good benchmarks!
This message has been edited since posting. Last time this message was edited on 3. April 2009 @ 08:41
|
|