The Official Graphics Card and PC gaming Thread
AfterDawn Addict
15 product reviews
25. January 2011 @ 09:02
Read my edit as I tear Xfire a new one :P
I tend to keep my OS running lean and clean. A few background apps for essential functionality and that's it. If I'm not using it, it sure as hell don't need to be open :P
Currently have:
MSN
Steam
FRAPS
Rocketdock
CCC
That's my usual. Warhead itself takes about 1.45GB
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
AfterDawn Addict
4 product reviews
25. January 2011 @ 09:23
Bit-Tech Analysis
Reference points (A = AMD-scale index, N = nVidia-scale index):
HD6850: 150 (AMD scale)
HD6870: 171 (AMD scale)
HD6950: 192 (AMD scale)
HD6970: 212 (AMD scale)
GTX470: 160 (nVidia scale)
GTX480: 190 (nVidia scale)
GTX570: 194 (nVidia scale)
DiRT2
1920x1200 NoAA: 235A/173N
1920x1200 4xAA: 215A/168N
2560x1600 NoAA: 206A/165N
2560x1600 4xAA: 201A/164N
CoD:Black Ops
1920x1200 NoAA: 164A/165N [Approaching CPU Limit]
1920x1200 4xAA: 173A/166N [Approaching CPU Limit]
2560x1600 NoAA: 172A/165N
2560x1600 4xAA: 177A/160N
Just Cause 2
1920x1200 NoAA: 185A/181N
1920x1200 4xAA: 188A/170N
2560x1600 NoAA: 157A/163N [Below playability threshold]
2560x1600 4xAA: 171A/167N [Below playability threshold]
Bad Company 2
1920x1200 NoAA: 200A/175N
1920x1200 4xAA: 205A/173N
2560x1600 NoAA: 172A/163N [Below fluidity threshold]
2560x1600 4xAA: 187A/165N [Below fluidity threshold]
Idle consumption
HD6870: 20W
HD6950: 29W
HD6970: 31W
GTX560Ti: 26W
Load consumption
HD6870: 151W
HD6950: 159W
HD6970: 203W
GTX560Ti: 185W
Idle temperatures (assume 25C room)
GTX560Ti: 33C
HD6870: 39C
HD6970: 39C
HD6950: 53C
Load temperatures (assume 25C room)
GTX560Ti: 62C
HD6870: 69C
HD6950: 79C
HD6970: 79C
GTX470/480: 97C
Conclusion
The GTX560Ti, in a mildly AMD-biased environment like CoD: Black Ops, is a direct rival to the HD6870. In relatively neutral titles like Just Cause 2, it essentially equals the HD6950, up to 1920x1200.
At higher resolutions, the GTX560Ti falls flat and ends up comparing to weaker AMD GPUs. However, since these resolutions are beyond the ability of cards at this price point, this is a little irrelevant.
In mildly nvidia-biased titles such as Bad Company 2, the GTX560Ti stays on the heels of the HD6970, a considerably more expensive card ($370 vs $275).
In the further-biased DiRT2, the GTX560Ti clearly surpasses the HD6970 at reduced resolutions. However, frame rates here are quite high, so this may not have the impact the numbers indicate.
With additional stress on the card, the 560 sits just behind the HD6970.
The cooler on the 560 is every bit as adept as that on the 460. Temperatures are well within spec; so far within spec, in fact, that the card could safely run quieter.
Power-wise, Fermi still hasn't caught the Radeons, but the efficiency of the GTX560 is admirable: a 185W card performing at HD6950 level on average means Fermi efficiency has now in fact surpassed that of the Radeon HD5870.
Heat gremlins conquered, noise gremlins conquered, and now efficiency gremlins conquered. It may have taken 16 months since the HD5 series release, but there's now no real hardware disadvantage to owning a Fermi, ignoring nvidia's other ills as a company in general.
AfterDawn Addict
4 product reviews
25. January 2011 @ 09:39
HardOCP analysis
F1 2010 @ 2560x1600 4xAA
GTX560Ti: 142 [170 at 1920x1200 8xCS]
HD6870: 171
HD6950: 190 [176 with 8xAA]
Result: Landslide to AMD. The HD6870 far surpasses the GTX560, able to run at 30" res versus the 24" of the GeForce.
Civilization V @ 2560x1600 8xAA
HD6870: 171 [203 at 2xAA]
GTX560Ti: 195 [233 at 4xAA]
HD6950: 209 [217 at 4xAA]
Result: Minor win for nvidia. The GTX560 can surpass the marginally more expensive HD6950 at 4xAA and below. Only at 8xAA with a mediocre frame rate does the HD6950 take over.
Metro 2033 @ 2560x1600 Hi NoAA
HD6870: 171 [224 at 1920x1200 AAA VH]
GTX560Ti: 177 [240 at 1920x1200 AAA VH]
HD6950: 196
Result: Minor win for AMD. The GTX560Ti beats the HD6870 by 3.5% at 30", 7% at 24", but costs 15% more.
Bad Company 2 @ 2560x1600 4xAA
HD6870: 171 [251 at 1920x1200 8xADMS]
HD6950: 205 [238 at 2xAA]
GTX560Ti: 233 [252 at 1920x1200 8xTRMS]
Result: Very close, win for AMD. The GTX560Ti cleans up at 2560x1600, a resolution in which even it isn't very playable. At more sensible settings it's a dead heat between the GTX560Ti and the cheaper HD6870. The more expensive HD6950 can still pull off 2560x1600 if AA is reduced.
Mafia II @ 2560x1600 AA -NoAPEX
HD6870: 171 [265 at 2560x1600 NoAA Med-APEX]
GTX560Ti: 176
HD6950: 199
Result: Minor win for AMD. The GTX560Ti only surpasses the HD6870 by 3%, versus its 15% higher price. The ability to run Apex at 30" requires the removal of AA.
Overall, mostly wins on the AMD side.
HardOCP do note that the stock cooler on the GTX560 is both quieter and cooler than that of the HD6 series cards. Given how quiet the HD6 cards already are though, I don't really see this as much of an advantage.
Senior Member
4 product reviews
25. January 2011 @ 09:49
I really need to upgrade; I've got my Phenom II X4 965 sitting in an AM2+ board. Just afraid that when I do upgrade to AM3, two weeks later they'll have AM4 with DDR5, which is my typical experience.
Been eyeballing the 890FX, but the only thing different from my current 790FX, it seems, is USB3 and SATA3 support. I'm just wondering if the 890 chipset offers better performance.
AfterDawn Addict
25. January 2011 @ 10:49
I do hate the GTX 560!
I have the money to burn right now, about £350. Was ready to get an SSD and a 6950 (which I wanted to BIOS mod to a 6970), but now, damn, this makes it hard to choose, esp as it OCs like a beast.
Hopefully this starts a price war, and these cards drop to below 150 soon?
Xfire has never auto-updated my games, though Steam wants to any time I open it, but that's not an issue to me. Would like it if Xfire did that. No RAM leakages, no problems with either one. Never use either to VoIP with; always use Skype or Vent or Mumble or TeamSpeak.
MGR (Micro Gaming Rig) .|. Intel Q6600 @ 3.45GHz .|. Asus P35 P5K-E/WiFi .|. 4GB 1066MHz Geil Black Dragon RAM .|. Samsung F60 SSD .|. Corsair H50-1 Cooler .|. Sapphire 4870 512MB .|. Lian Li PC-A70B .|. Be Quiet! P7 Dark Power Pro 850W PSU .|. 24" 1920x1200 DGM (MVA Panel) .|. 24" 1920x1080 Dell (TN Panel) .|.
AfterDawn Addict
4 product reviews
25. January 2011 @ 11:10
Waiting forever for my PC to boot with Xfire installed is what killed it off for me.
AfterDawn Addict
15 product reviews
25. January 2011 @ 11:12
Quote: Been eyeballing the 890FX, but the only thing different from my current 790FX, it seems, is USB3 and SATA3 support. I'm just wondering if the 890 chipset offers better performance.
Better performance? Probably not except for some minor benchmark gains owing to DDR3. Typically though 890FX will have better power management, better OCing, likely run cooler etc. Heard nothing but good things about it, but AFAIK it's just 790FX with some new features.
Quote: Xfire has never auto-updated my games, though Steam wants to any time I open it, but that's not an issue to me. Would like it if Xfire did that. No RAM leakages, no problems with either one. Never use either to VoIP with; always use Skype or Vent or Mumble or TeamSpeak.
Ahh but Steam has sheer simplicity. Also, I already use it for a great many games, so it means I don't have an extra program sitting there. For me the choice is obvious. Everybody get on Steam, join a group chat. Simple, simple, simple. Makes managing large groups of people easy when there's a central program we all have that needs no special configuration. Open the window, click chat, done.
Quote: Waiting forever for my PC to boot with Xfire installed is what killed it off for me.
Same here. At first I thought I had hardware issues =/
AfterDawn Addict
4 product reviews
25. January 2011 @ 22:50
The monster GPU performance index chart
HD6970: 211
HD6950: 191
HD5870: 179
HD6870: 170
HD5850: 157
HD6850: 149
HD5830: 132
HD4890: 114
HD4870: 100
HD5770: 92
HD4860: 85
HD4850: 80
HD5750: 77
HD4770: 79
HD4830: 60
HD5670: 50
HD3870: 45
HD5570: 42
HD4670: 40
HD2900XT: 40
HD3850: 37
HD2900Pro: 35
HD3690: 25
HD4650: 25
HD2900GT: 25
X1950XT-X: 24
X1900XT: 20
HD3650: 18
X1800XT: 16
HD4550: 15
HD5450: 14
X1800GTO: 12
HD2600XT: 12
HD4470: 11
HD4450: 10
HD3470: 10
HD4250IGP: 9
HD2400XT: 7
HD2600 Pro: 6
HD2400 Pro: 3
HD3450: 3
GTX580: 230
GTX570: 194
GTX480: 190
GTX560Ti: 180
GTX470: 161
GTX4601GB: 134
GTX285: 128
GTX465: 128
GTX460: 122
GTX275: 121
GTX280: 112
GTX260+: 100
GTX260: 91
GTS450: 90
GTS250: 76
9800GTX+: 76
9800GTX: 70
8800GTX: 67
8800GTS512: 65
9800GT: 58
8800GT: 57
GT240: 50
9600GT: 43
8800GTS640: 38
9600GSOG92: 37
8800GS: 37
GT430: 32
GT220: 25
9600GSOG94: 23
9500GT: 19
8600GTS: 19
8600GT: 15
9400GT: 5
8500GT: 5
8400GS: 3
G210: 2
Yes, these charts are comparable as the HD4870 and GTX260-216 (denoted GTX260+) are direct performance equals, and both have been used as the reference points of 100.
GTX560Ti scores are derived from the new benchmark from HardOCP released today, and are not an approximation.
Index 191 HD6950 1GB for $273 shipped, or Index 180 GTX560Ti for $258 shipped? I think I know what I'd choose :P
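For what it's worth, that choice is closer than it looks on value. A quick performance-per-dollar calculation from the index values and shipped prices above (the script itself is just an illustrative sketch):

```python
# Performance-per-dollar from the index values and shipped prices above.
cards = {
    "HD6950 1GB": {"index": 191, "price": 273},
    "GTX560Ti": {"index": 180, "price": 258},
}

for name, card in cards.items():
    per_dollar = card["index"] / card["price"]
    print(f"{name}: {per_dollar:.3f} index points per dollar")
```

Both land at roughly 0.70 index points per dollar, so the HD6950's premium buys its extra performance at essentially the same rate.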
For dual config setups, apply roughly the following figures:
X1 series crossfire: x1.4
HD2 series crossfire: x1.6
HD3 series crossfire: x1.6
HD4 series crossfire: x1.8
HD5 series crossfire: x1.75
HD6 series crossfire: x1.95
HD4/5/6 series TriCF: x2.5
HD4/5/6 series QuadCF: x3.1
8800 series SLI: x1.7
9000 series SLI: x1.7
GTX2 series SLI: x1.75
GTX470/480 SLI: x1.8
GTX460/GTX5 series SLI: x1.85
GTX2/3/4/5 series TriSLI: x2.6
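Applying those factors is just a multiplication against the single-card index. A minimal sketch using a few of the factors above (the table is abbreviated, and the helper name is my own):

```python
# Scale a single-card index by the rough multi-GPU factors listed above
# (abbreviated to a few entries here).
SCALING = {
    ("HD6", "CrossFire"): 1.95,
    ("HD5", "CrossFire"): 1.75,
    ("GTX5", "SLI"): 1.85,
}

def multi_gpu_index(single_index, series, config):
    """Estimate a dual-GPU setup's index from one card's index."""
    return single_index * SCALING[(series, config)]

# Two HD6950s (index 191 each) in CrossFire land around 372,
# well past a single GTX580's 230.
print(multi_gpu_index(191, "HD6", "CrossFire"))
```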
Senior Member
4 product reviews
26. January 2011 @ 00:32
Originally posted by Estuansis: Better performance? Probably not except for some minor benchmark gains owing to DDR3. Typically though 890FX will have better power management, better OCing, likely run cooler etc. Heard nothing but good things about it, but AFAIK it's just 790FX with some new features.
If I'm not mistaken, isn't DDR3 slower than DDR2 but capable of higher bandwidth? Still makes you wonder why it's still called Double Data Rate.
AfterDawn Addict
15 product reviews
26. January 2011 @ 00:50
Quote: If I'm not mistaken, isn't DDR3 slower than DDR2 but capable of higher bandwidth? Still makes you wonder why it's still called Double Data Rate.
Nope, DDR3 is faster but has higher latencies. Most benchmarks will show it to be marginally faster overall than DDR2. In practice they are mostly the same. Though AMD CPUs will show a slightly larger difference with lower latencies due to the nature of their memory controller.
It's called double data rate because the memory has a base clock that transfers data twice per cycle. My 1066 is running at 533MHz DOUBLED, not a true 1066MHz.
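That doubling is easy to see as plain arithmetic. A sketch of how the marketing numbers fall out (function names are mine; the 64-bit channel width is the standard DDR figure):

```python
# DDR moves data on both the rising and falling clock edge, so the
# effective transfer rate is twice the I/O clock: "DDR2-1066" really
# means a 533MHz clock doubled.
def ddr_transfer_rate(io_clock_mhz):
    """Effective mega-transfers per second on a DDR bus."""
    return io_clock_mhz * 2

def peak_bandwidth_mb_s(io_clock_mhz, bus_width_bits=64):
    """Theoretical peak MB/s for one channel."""
    return ddr_transfer_rate(io_clock_mhz) * bus_width_bits / 8

print(ddr_transfer_rate(533))    # 1066 MT/s
print(peak_bandwidth_mb_s(533))  # 8528.0 MB/s, i.e. the PC2-8500 rating
```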
Senior Member
4 product reviews
26. January 2011 @ 01:33
Originally posted by Estuansis: Quote: if i'm not mistaken isnt DDR3 slower than DDR2 but capable of higher Bandwidth. still makes you wonder why its still called Double Data Rate.
Nope, DDR3 is faster but has higher latencies. Most benchmarks will show it to be marginally faster overall than DDR2. In practice they are mostly the same. Though AMD CPUs will show a slightly larger difference with lower latencies due to the nature of their memory controller.
It's called double data rate because the memory has a base clock that transfers data twice per cycle. My 1066 is running at 533MHz DOUBLED, not a true 1066MHz.
yeah but that whole doubling business is so inefficient.
AfterDawn Addict
15 product reviews
26. January 2011 @ 02:00
Oh I agree. But because the speeds of computer components can vary so wildly, you need the multipliers. If there were a more efficient way to do it, I think they'd be doing it. Who knows, maybe some radical changes are in store for the next decade. In 10 years, we'll have truly photorealistic VR simulators with holographic haptic feedback interfaces. Don't tell me the internal hardware won't be different too.
AfterDawn Addict
7 product reviews
26. January 2011 @ 16:41
Thanks for the performance chart Sam. Always appreciated. It's a very straightforward, simple way to understand generally which card is in the lead.
Sorry about the off-topic, but does a PS3 (1st generation) take a typical computer PSU cord for its power? I don't wanna fry the PS3 any more than it already is. A buddy gave it to me and told me if I can fix it, I can have it.
To delete, or not to delete. THAT is the question!
AfterDawn Addict
4 product reviews
26. January 2011 @ 16:47
Yes.
AfterDawn Addict
7 product reviews
26. January 2011 @ 16:50
Looks can be deceiving. I just wanted to be sure. Thanks :) Essentially, it may want a precise amperage. I'm afraid to hook it up to the wall, fearing full-on juice LOL! Yes, the console says 120V @ ~3.2 amps...
AfterDawn Addict
4 product reviews
26. January 2011 @ 16:56
3.2A is the rated maximum, not what it actually draws; that's just a limit. Typically it will use about 1.8-2.0A on a 110V supply.
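In watts, that's simply P = V × I, which puts the label figure and the typical draw in perspective (a quick sketch using the numbers above):

```python
# Power = volts x amps. The 3.2A on the label is a maximum rating;
# actual draw sits much lower.
def power_watts(volts, amps):
    return volts * amps

print(power_watts(120, 3.2))  # rated maximum, 384 W
print(power_watts(110, 2.0))  # top of the typical range, 220 W
```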
AfterDawn Addict
7 product reviews
26. January 2011 @ 17:00
Thanks! I probably won't be plugging it in til this weekend. The guy says he had it repaired before. Something about the way the GPU is connected; it's like it's constantly pulling on its connection. Sounds kind of goofy to me. I've heard that it only costs 100-150 to have Sony repair it. So if the problem isn't obvious to me, I'll just have them repair it and save money on owning my PS3 :D
AfterDawn Addict
15 product reviews
26. January 2011 @ 17:02
The power is drawn, not pushed ;P
AfterDawn Addict
7 product reviews
26. January 2011 @ 17:05
Not all transformers are created equal. I bought a 3 amp 12V converter for 120V AC, and it fried an amplifier I connected it to. I could feel the juice it put out; it felt similar to 120 :S
AfterDawn Addict
4 product reviews
26. January 2011 @ 17:09
Say what? It was obviously faulty then. Jeff is right, the laws of physics don't suddenly bend if you buy a powerful PSU :P
AfterDawn Addict
7 product reviews
26. January 2011 @ 17:12
Hmmm, I wonder if I somehow had it hooked up wrong then. I should not have felt 12V @ 3 amps through skin, eh?
AfterDawn Addict
4 product reviews
26. January 2011 @ 17:15
It wouldn't be 3 amps through your skin. 3 amps through your skin would probably kill you. 3 amps is only the maximum current producible. Remember, the resistance of skin is very high, so it takes a high voltage to get the few milliamps of current through your body that cause shocks. Typically it takes around 50V to produce a shock that is dangerous to a healthy adult.
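The skin-resistance point is just Ohm's law. A hypothetical worked example (the ~100kΩ dry-skin resistance is a common ballpark figure, not a measurement):

```python
# Ohm's law: current = voltage / resistance. Dry skin's resistance is
# very high, so low voltages push only a tiny current through you, no
# matter how many amps the supply could deliver into a dead short.
def current_milliamps(volts, resistance_ohms):
    return volts / resistance_ohms * 1000

print(current_milliamps(12, 100_000))   # ~0.12 mA: imperceptible
print(current_milliamps(120, 100_000))  # ~1.2 mA: a noticeable tingle
```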
AfterDawn Addict
7 product reviews
26. January 2011 @ 17:17
I've felt full on 120 before. It was slightly more potent than the aforementioned converter :p I was of course 12 yrs old, so I was resilient :p I've felt it since too. My dad was showing me a weird trick with an outlet...
Senior Member
4 product reviews
26. January 2011 @ 19:06
Originally posted by omegaman7: I've felt full on 120 before. It was slightly more potent than the aforementioned converter :p I was of course 12 yrs old, so I was resilient :p I've felt it since too. My dad was showing me a weird trick with an outlet...
Don't worry, I've plugged a lamp in in the dark, only to find out what happens when you don't hit the holes right... quite the shocking experience, if I do say so myself.
AfterDawn Addict
7 product reviews
27. January 2011 @ 00:51
Originally posted by DXR88: Originally posted by omegaman7: I've felt full on 120 before. It was slightly more potent than the aforementioned converter :p I was of course 12 yrs old, so I was resilient :p I've felt it since too. My dad was showing me a weird trick with an outlet...
Don't worry, I've plugged a lamp in in the dark, only to find out what happens when you don't hit the holes right... quite the shocking experience, if I do say so myself.
LOL! I can imagine!