The Official Graphics Card and PC gaming Thread

AfterDawn Addict (4 product reviews)
29. February 2012 @ 02:23
Fans on the nForce 6 chipsets were still rare; mine didn't have one. A thermal probe put the heatsink of my 650i (less power-hungry than the 680) at 72C, which puts the internal chip temperature at 80C or more. The highest the board's own sensor ever read was 42C.



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
AfterDawn Addict (7 product reviews)
3. March 2012 @ 14:33
Ha ha! The new EVGA Precision has a frame limiter. The GPU can run even cooler now :p Limiting to 80 FPS drops the temps 10C in GRID. I know, I'm overly cautious, eh? Perhaps... but I like things to run cool, having had Nvidia chips overheat in the past ;)



To delete, or not to delete. THAT is the question!
harvrdguy
Senior Member
4. March 2012 @ 22:22
Originally posted by buzzkill:
LOL sorry to burst your bubble. Like other games though, play it anyway. For the record, the first game is actually quite decent, and I enjoyed it. It's the second game that was a let-down.
Originally posted by Sam:
I'd recommend Bioshock rich, It's a good laugh.
Well thanks Jeff and Sam. That sounds better than "sucks." Haha.

Originally posted by omega:

The GPU can run even cooler now :p Limiting to 80 FPS drops the temps 10C in GRID. I know, I'm overly cautious, eh? Perhaps... but I like things to run cool, having had Nvidia chips overheat in the past ;)
I'm with Kevin. That Zalman cooler (was it the V1000?) you told me about, Jeff, dropped the 3850 by so many degrees - and the very quiet fan always ran 100% since it wasn't adjustable. I don't think I could ever bring it to hit 80 after that aftermarket cooler. Prior to that - many times in the 90's - bordering on 100, and I too was a nervous wreck. The only thing that helped was I kept telling myself I only paid $110 for the card on ebay, not like the $330 that Kevin just forked out!

And on the current rig - the 8800GTX - it goes above 90 in warm weather in FurMark (I know that's an artificially heavy load), even using RivaTuner to fix the fan at 100%, which I still do when I game. The Antec just doesn't have any decent airflow. I started to fix that somewhat with the Kama Bay in front - especially when I put a 3000 RPM Kaze in there! You said, Sam, that I'd hear it through the headphones, but when I'm gaming I really don't hear it. (I play it pretty loud - people will come into the trailer and stand right in front of me, and I don't know they're there, lol.)

And I think I posted about six months ago - Miles said something to me that after a year, it was unlikely anybody would ask for that rig back. I didn't tell him but that sounded like I could drill some holes.

So I mounted a side intake - another 3000 rpm kaze - blowing right on the graphics card, to complement the front intake. That right there dropped temps 10 degrees. Very rarely do I bust over 80 - but as you guys have just posted - when I do hit in the 80s, I may be in for some crashing, which happens from time to time. With those two kazes blowing, when I crank up the fan controller, a lot of wind comes out the back of that antec.

So I think that my comfort factor is more like Kevin's - mid to low 70's makes me happy - above that, no. :D

Rich
AfterDawn Addict (15 product reviews)
5. March 2012 @ 02:17
Quote:
Well thanks Jeff and Sam. That sounds better than "sucks." Haha.
Well, certainly don't let my opinions sway your decision to buy the game. A lot of people had fun with it, and it is a really interesting game. Part of my opinion is about keeping people from expecting too much from over-hyped games, and part of it is that I don't agree with lazy game-making. At least for its time, Bioshock 1 was creative, technically competent, and good-looking. There was no excuse for that steaming pile of a sequel.

Sadly it's a problem stemming from the current generation of consoles lasting as long as they have. Any multi-platform release WILL be limited to match the consoles; it's up to the developers to decide whether the game is limited FOR consoles or limited BY consoles. Regretfully, any company wishing to make a technological impact either develops for PC only or is heavily PC-centric. Even console gaming's biggest franchise, Call of Duty, started quite firmly on the PC. What's more interesting is that games designed as console or even platform exclusives tend to end up being better quality. God of War 3, Metal Gear Solid 4, and Halo Reach come to mind.



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388


AfterDawn Addict
22. March 2012 @ 11:51
No discussion on any of the other new generations?



MGR (Micro Gaming Rig) .|. Intel Q6600 @ 3.45GHz .|. Asus P35 P5K-E/WiFi .|. 4GB 1066MHz Geil Black Dragon RAM .|. Samsung F60 SSD .|. Corsair H50-1 Cooler .|. Sapphire 4870 512MB .|. Lian Li PC-A70B .|. Be Queit P7 Dark Power Pro 850W PSU .|. 24" 1920x1200 DGM (MVA Panel) .|. 24" 1920x1080 Dell (TN Panel) .|.
AfterDawn Addict (4 product reviews)
22. March 2012 @ 17:27
So the GTX680 is out. It's faster than the HD7970 (albeit the lead only really shows at 1920x1200), it's cheaper than the HD7970, and it uses no more power than the HD7970. Long and short, it's better unless you run 2560x1600+, in which case it's about even.

HD7970 price cut incoming or AMD are heading for trouble...


Card / Performance Index @ 1920x1200 / Performance Index @ 2560x1600 / Idle power consumption / Load power consumption:

HD6970/195/210/20W/210W
GTX580/230/230/50W/250W
HD7970/240/310/10W/210W
GTX680/300/320/10W/210W
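
To put those numbers another way, here is a minimal Python sketch of performance per load watt at 1920x1200, using only the index values and wattages quoted above (the script itself is purely illustrative):

# Rough performance per load watt from the figures quoted above
# (index @ 1920x1200, load power in watts).
cards = {
    "HD6970": (195, 210),
    "GTX580": (230, 250),
    "HD7970": (240, 210),
    "GTX680": (300, 210),
}

for name, (index_1920, load_w) in cards.items():
    print(f"{name}: {index_1920 / load_w:.2f} index points per load watt")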




Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13


AfterDawn Addict
23. March 2012 @ 00:16
Wow, Nvidia did bloody well. The power consumption is shockingly low for the performance, considering it's from Nvidia.



MGR (Micro Gaming Rig) .|. Intel Q6600 @ 3.45GHz .|. Asus P35 P5K-E/WiFi .|. 4GB 1066MHz Geil Black Dragon RAM .|. Samsung F60 SSD .|. Corsair H50-1 Cooler .|. Sapphire 4870 512MB .|. Lian Li PC-A70B .|. Be Queit P7 Dark Power Pro 850W PSU .|. 24" 1920x1200 DGM (MVA Panel) .|. 24" 1920x1080 Dell (TN Panel) .|.
AfterDawn Addict (7 product reviews)
23. March 2012 @ 00:19
Wish my hand wasn't forced to buy a new card. My GTX 260 still had some life in it :( I guess it still could. But of course I won't know for a while. I'd hate to pop my PSU! I guess I could use an external power supply, and see what happens. I have a cheapy laying around ;)



To delete, or not to delete. THAT is the question!
AfterDawn Addict (15 product reviews)
23. March 2012 @ 01:10
Have recently discovered the Crysis 1 port for Xbox 360. Built on CryEngine 3 and, iirc, released shortly after Crysis 2. I almost cried thinking the PC had lost its elite status, but LOL, the differences had me rolling on the floor. It's a competent version of Crysis, but it is not even in the same ballpark as the original. It does seem to run very well though.

Have rented it for a few days just to try it, and it's definitely Crysis, but the entire game is stripped down or otherwise graphically compromised. The lighting, textures, draw distance and poly count are all akin to Low settings in the PC version, with something like medium shaders and shadows. Basically as bland as possible. For the most part it does resemble Crysis, though they also adjusted the time of day in a few levels, so it's hard to make a fair comparison in some areas. The particle effects are still generally strong, but the physics are stunted: nothing is quite as breakable or interactive, and the plants don't blow in the wind, though they do move as you push through them. The framerate does dip a bit but is surprisingly a stable 30 for the most part. Objects also have very noticeable pop-in, akin to the console version of Oblivion, i.e. s***.

So basically this proves that, even extremely optimised, consoles simply cannot run Crysis properly. As a stand-alone console game it would be bearable, but it lacks most of what made Crysis impressive: the graphics. It looks like a slightly more polished 360 game, which it is. It's not terrible, but it's no credit to consoles either, because it's a 5-year-old game that barely runs at settings we could manage years ago, and only at 720p, mind you.

A lot of reviewers are saying it's a better use of CryEngine 3 than Crysis 2 was because it's a better game besides. I'd have to agree. It is still Crysis ie a damn good game. I'm enjoying it. I will say it does have better sound effects than Crysis 1 on the PC. It sounds nice. 100/100 for effort on the port. 7.5/10 as a stand-alone game compared to the 9 to 9.5 of Crysis PC. Maybe 7.5 to 8.5 for Crysis 2, for console and PC respectively. The PC version scoring higher only because of the high res texture pack and Dx11 which are dang pretty.

Also, woohoo Charter.

www.speakeasy.net/speedtest
Download Speed: 35453 kbps (4431.6 KB/sec transfer rate)
Upload Speed: 4250 kbps (531.3 KB/sec transfer rate)



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388


AfterDawn Addict (4 product reviews)
25. March 2012 @ 15:29
From what I can gather, the trend comes out something like this:

Figures stated for 1920x1200 / 2560x1600
HD6950: 175/190
HD6970: 195/210
GTX570: 195/195
GTX580: 225/230
GTX590: 360/350
HD6990: 380/400

HD7950: 250/260
HD7970: 295/305
GTX680: 305/300

Not had a look at the lower end HD7 series yet, but as far as I can tell they're roughly analogous to their predecessors (i.e. 7870~6970, 7850~6950, 7770~6870, 7750~6850)
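
For illustration, a small Python sketch of the generation-on-generation uplift implied by those indices - pure arithmetic on the figures above, nothing more:

# Generation-on-generation uplift implied by the indices above (1920x1200, 2560x1600).
pairs = [
    ("HD6950 -> HD7950", (175, 190), (250, 260)),
    ("HD6970 -> HD7970", (195, 210), (295, 305)),
    ("GTX580 -> GTX680", (225, 230), (305, 300)),
]

for label, old, new in pairs:
    gains = [f"{(n / o - 1) * 100:.0f}%" for o, n in zip(old, new)]
    print(f"{label}: +{gains[0]} at 1920x1200, +{gains[1]} at 2560x1600")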



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
AfterDawn Addict (7 product reviews)
25. March 2012 @ 15:51
Wow, my GPU's been punked big time.



To delete, or not to delete. THAT is the question!
AfterDawn Addict (15 product reviews)
26. March 2012 @ 06:17
Likewise, I'm considering a serious GPU upgrade myself, but hesitating due to the lack of immediate need. I do hover at the edge in some games, and it's only going to go downhill from here. Not to mention AMD's lack of timely driver updates, which is a ballbuster in a lot of situations - particularly Mass Effect 3, which will dip into the high 40s/low 50s during intense cutscenes on a single card, but has graphical issues in Crossfire (like super-concentrated micro stutter). (I'll be damned, a driver just came out to fix that.)

Also likewise having erratic scaling in Battlefield 3. Enabling the in-game graphing function shows that my CPU is not the limit - my video cards are, and by a long shot - so Crossfire scaling is to blame. Wondering if I'm hitting a memory limit in places, as this game is intensive. Will do some tweaking. I will note that Crossfire scaling in BFBC2 fluctuated with every release: some driver versions would be exceptional and others would suck. I've always had better luck with a single card in Battlefield.

Am considering a single, powerful card, possibly in the form of a 7970. Possibly dual 6970s, which a friend is trying to trap me into buying, though he doesn't want to upgrade away from them yet. The cards would come at a good price, but he expects me to make a promise I can't keep. Likewise, he would be a good source for a Phenom II X6 1090T, but he doesn't want to upgrade from that yet either. Getting sick of waiting and about to make a purchase regardless. Either video cards or an X6 for now would be a good jump. Just can't bring myself to change platforms yet, considering the relatively good gaming performance of an OC'd Phenom II X6 (not to mention far better overclocking besides). My video cards and drivers have so far been more of an obstacle than my CPU. Wish I had stayed with the 5850s, as they had much more reliable scaling, but I'm not missing the heat, noise and power draw.



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388


AfterDawn Addict (4 product reviews)
26. March 2012 @ 06:55
It could be memory. I've never had a single issue with my 6970s in Battlefield 3, which is a far cry from the CTDs, green flashes, black screens and general glitching I used to put up with running 4870X2 quad Crossfire in Bad Company 2.



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
AfterDawn Addict (15 product reviews)
27. March 2012 @ 05:20
Seems I've discovered a real use for post-processing FXAA. It doesn't seem to tip the memory limit in BF3, allowing me to use Ultra textures on all maps. Of course the blurred effect of FXAA is a long way off from true MSAA, but it's better than none in this particular case. Using the Low setting, it cleans up BF3 without the horrid frame drops I'm seeing with true AA, and because it's kept at Low, I avoid the horrible blurring that usually accompanies FXAA.

Still experimenting, but basically I can get away with Ultra textures if I don't use AA, and tbh textures matter much more to me. I would be willing to play entirely without AA if it meant better textures and that's just what I do on some games. A memory limit can be damning even if the hardware has the necessary power.

Still might opt out of using FXAA, and run without AA entirely, but the FPS drop is minimal so it's worth experimenting with. Some games, particularly Skyrim, only have one level of FXAA which usually means "grease coated lens" blurry. BF3 has the unusual advantage of having it in different levels, so I can get some of the benefit without all the drawbacks.

Guess I'm lucky AA isn't essential for me to enjoy a game. My monitor helps with that a lot :P
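
To put a rough number on why multisampling a deferred renderer strains video memory while post-process FXAA barely costs anything, here is a back-of-the-envelope Python sketch. The buffer counts are assumptions (a handful of 32-bit targets plus depth), not DICE's actual G-buffer layout; the point is only how the cost scales with sample count:

# Back-of-the-envelope VRAM cost of multisampling a deferred G-buffer at 1920x1200.
# Assumes four 32-bit colour targets plus a 32-bit depth buffer; the real BF3
# layout differs, but the scaling with sample count is what matters.
width, height = 1920, 1200
bytes_per_pixel = 4
render_targets = 4 + 1  # assumed G-buffer targets + depth

def gbuffer_mb(samples):
    return width * height * bytes_per_pixel * render_targets * samples / 2**20

for samples in (1, 2, 4):
    print(f"{samples}x: ~{gbuffer_mb(samples):.0f} MB of G-buffer")
# FXAA runs on the single resolved frame, so its memory cost stays at the 1x figure.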



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388


AfterDawn Addict (4 product reviews)
27. March 2012 @ 06:09
All max at 2560 means FXAA for me as well, I think, and no HBAO.



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
AfterDawn Addict (15 product reviews)
27. March 2012 @ 06:16
That was a clincher for me. I tried getting away without ambient occlusion and then simply using SSAO. But HBAO is basically all of the graphical depth in one package so it's rather important for the looks, even if it has a large performance hit. The effect was less performance intensive and more subtle in Bad Company 2, but I found it important for the graphics there as well. I'd agree that SSAO is better than none though.

Luckily you, like myself, have good pixel density, so AA could be considered a non-essential in some games. Would personally choose other graphical options over AA in a lot of games. A few games I have sacrificed options in order to use AA, but BF3 is so damn pretty I'd rather fiddle with post processing FXAA than neuter the amazing graphics.

Worth noting is that shadow quality and use of ambient occlusion go hand in hand. They are important for creating depth and scale, and work in tandem with the large draw distances. For example, turning shadows from Ultra to High doesn't affect shadow detail much, but it greatly affects how far out detailed shadows will render. Ambient Occlusion applies to all shadows in the game no matter the distance... so you can see how the effect would be drastic if either are turned down. Basically, anything below Ultra Shadows is nerfed, because detail stays the same while distances grow shorter. Likewise it's better to have any kind of ambient occlusion than none, but HBAO is miles better quality than SSAO.


Current settings:

Texture Quality - Ultra
Shadow Quality - Ultra
Effects Quality - Ultra
Mesh Quality - Ultra
Terrain Quality - Ultra
Terrain Decoration - Ultra
Antialiasing Deferred(MSAA) - Off
Antialiasing Post(FXAA) - Low
Motion Blur - Off
Anisotropic - 16x
Ambient Occlusion - HBAO

Rarely goes below 40 as long as I leave AA off. Also it performs MUCH better on indoor or urban levels like Grand Bazaar and Operation Metro. To break it down:

General Gameplay(Flying/Driving): 50-70FPS

High Action(Firefight/Explosives and Dust): 40-60FPS

Indoors General Gameplay: 70-90FPS

Indoors Action : 50-70FPS

If I had to characterize it with a single average FPS range I would say 50-60 pretty steady. Really wish I had some extra horsepower but it's within my tolerances. Am more than willing to have a few moments of lag here and there for the overall graphical effect. And again, willing to drop FXAA but am experimenting to see how large the drop is. If it's negligible, it's worth using.

Motion Blur was rather a personal setting more than a performance tweak. Basically makes it hard to operate in fast action and in moving vehicles. One of the settings I've experimented with but will probably leave off. Bad Company 2 was fine without it, and all it really does is serve to obscure the screen and cause hitching when I'm flying overhead at 200MPH on a minigun. Motion blur should be reserved for games with guaranteed performance like Source Engine or single player games where it doesn't matter. The technology is not ready for fast-paced competitive gameplay yet as it doesn't interact well with fluctuating framerates. Particularly in BF3 it has been responsible for heavy frame drops in certain situations and hitching in general. Many have told me Vsync clears up a LOT of the motion blur issues but I am not currently in the performance bracket where Vsync is practical.

Koroush Ghazi wrote a wonderful tweak guide with explanations, performance differences and direct screenshot comparisons for every setting:

http://www.geforce.com/Optimize/Guides/...-tweak-guide/#7

Basically, to turn anything lower than Ultra is to neuter the game. Motion Blur and AA method aside, the core graphics settings don't scale well. Essentially when you turn down settings, instead of a slight drop in fidelity or quality, you lose entire features of the engine or objects are simply not drawn. So again, would rather lose AA and FPS to maintain Ultra settings. Battlefield 3 is one of those "whole package" games like Crysis, wherein no single graphical setting makes all the difference, but they all have to be maxed to get the full effect. Maybe even more so, because Crysis had proper scaling and a ton of room for tweaking where you didn't lose effects.

BF3 is certainly one of the prettiest and most challenging games to run I have found. Am hoping some more patches come out soon to clear up some of the more glaring issues.



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388


AfterDawn Addict (7 product reviews)
28. March 2012 @ 18:24
Regarding moving the Steam and Steamapps directories to another hard drive.

Does this seem correct? I have a feeling that in the near future I'll have more games than my VelociRaptor can accommodate ;) This process seems a bit too simple (too good to be true LOL!). I recently installed VirtualBox, assuming it would need C:\ for its VHDs. Not true - I was able to use one of my other storage drives. Game backups/restores seem pretty easy, but I hope moving Steam is this easy :)
https://support.steampowered.com/kb_art...=7710-TDLC-0426



To delete, or not to delete. THAT is the question!


AfterDawn Addict (15 product reviews)
28. March 2012 @ 18:34
Steam installed to another hard drive just fine for me. It might need to re-update the client, but it's minor compared to re-installing 100GB+ in games. It basically is that simple.



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388


AfterDawn Addict (4 product reviews)
28. March 2012 @ 18:37
My Steam is on E:
If I ever reformat, I just open E:/Games/Steam/Steam.exe and let the program figure out the rest. I don't even bother deleting the client folder unless something goes wrong.

To save me needing to copy everything should I lose E:, almost all my games are also installed on my server, which also enables said machine to play most of my steam games if it's taken to a LAN :)
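
For anyone who would rather script the move than drag folders around, a minimal Python sketch of that procedure - the paths are examples only, and Steam should be closed before running it:

# Move an existing Steam install (client + steamapps) to another drive, then
# launch it from the new location so the client can sort itself out, as described above.
# Paths are examples only - adjust to your own setup, exit Steam first, and run
# with enough permissions to touch Program Files.
import shutil
import subprocess
from pathlib import Path

old_steam = Path(r"C:\Program Files (x86)\Steam")
new_steam = Path(r"E:\Games\Steam")

new_steam.parent.mkdir(parents=True, exist_ok=True)
shutil.move(str(old_steam), str(new_steam))       # carries steamapps along with it
subprocess.Popen([str(new_steam / "Steam.exe")])  # relaunch from the new location

The same idea covers the reformat case described above: as long as the Steam folder survives on the other drive, launching Steam.exe from there lets the program figure out the rest.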



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13


AfterDawn Addict (7 product reviews)
28. March 2012 @ 19:10
Thanks guys. Good info :)



To delete, or not to delete. THAT is the question!
AfterDawn Addict (15 product reviews)
29. March 2012 @ 17:08
Just an update: I finally sat down with MSI Afterburner and did some overclocking. Custom fan profiles - lol, I suddenly get it. Nothing too aggressive, all stock voltage and tested stable with repeated runs of Unigine Heaven maxed at 1920x1200 with 4x AA and Normal Tessellation. The two cards aren't equal, one being an XXX Edition card, but voltage and BIOS are identical, so both were capable of a decent OC.

775MHz core / 1000MHz memory (effectively 4000MHz) -> 940MHz core (+21%) / 1175MHz memory (+17%, effectively 4700MHz)
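
Worked out to one decimal place, with the effective figure simply being four times the real memory clock (as the 1000MHz -> 4000MHz figure above implies) - a trivial sketch using only the numbers quoted:

# Overclock percentages and effective memory clock from the figures above.
core_old, core_new = 775, 940
mem_old, mem_new = 1000, 1175

print(f"Core: +{(core_new / core_old - 1) * 100:.1f}%")    # ~21%
print(f"Memory: +{(mem_new / mem_old - 1) * 100:.1f}%")    # ~17-18%
print(f"Effective memory clock: {mem_new * 4} MHz")        # 4x the real clock, per the line above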

I ran two complete runs of Unigine Heaven Benchmark v3.0 on both and took the second set of results.

Average: 56.9 -> 67.8(+19%)

Minimum: 17.3 -> 25.1(+45%)

Maximum: 119.2 -> 142.0(+19%)

Heaven Benchmark Score: 1434 -> 1707(+19%)

Also I can support the minimum with recorded results at other clock speeds. It was a linear and repeatable increase. Holy crap.

Fan speeds update on a curve so I don't notice it suddenly throttling up and down.

Minimum up to 50*C: 40%

50*C: 60%

70*C+: 80%

90*C+ 100%
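
Expressed as code, the curve above amounts to a linear ramp between those points - a sketch of the behaviour only, not Afterburner's actual implementation:

# Fan speed vs GPU temperature, following the curve points listed above.
def fan_percent(temp_c):
    points = [(50, 60), (70, 80), (90, 100)]  # (temperature C, fan %)
    if temp_c < 50:
        return 40        # "minimum up to 50C: 40%"
    if temp_c >= 90:
        return 100
    # linear interpolation between the two surrounding points
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

for t in (45, 55, 65, 75, 85, 95):
    print(f"{t}C -> {fan_percent(t):.0f}% fan")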

Tops out at 81*C on the top card and 79*C on the bottom card after repeated Heaven Benchmark runs. I do not mind roaring fans with headphones on or the speakers blasting. I want to find a good balance though that allows low noise at idle while maintaining sane temps. 100% is a bit too much, but having it set for 90 ensures my cards don't start getting too hot. I think 80-85 is what others have been seeing even with some stock cards so I'm pretty confident I'm safe.

Top card idles around 45-ish degrees. Bottom card 38, 39. The fan doesn't seem to kick on much.

Idle clocks are sadly limited too high for my 2D profile so I'm currently stuck at 400/525 vs 300/300. Wish there was a way to get Afterburner to go lower. It would be easier to maintain 40% fan speed with good temps.

XFX's hardware is saving the day here. I'm basically getting a direct 20% increase in video processing power for a little patience. That was worth my time. From a couple of glances at review sites, an OC'd card like mine is reliably faster than a 6870, so I think I just upgraded to 6870s for free. lol Interestingly, the 6870 doesn't seem to OC very well.

Am also experimenting using my CoolIt fans as intake, and relying solely on the top fan for exhaust. I may be investing in some more 120mm Scythe fans, and replacing the top one entirely. Any particular models worth looking at as case fans? The chipset runs cool as ever, even after removing the additional AMD HSF fan I had added. Plus I went back and refreshed my OC, now at 2400MHz NB and 2200MHz SB, and still the chipset is staying nice and cool. Chipset temp staying squarely in the mid-40s after several hours of gaming. CPU is idling at 32-34*C according to the motherboard sensor, 36-38*C according to the cores. Essentially rarely touching 50*C in games. I think with some better exhaust fans I could take that down a degree or two.

Everything is running better than ever and now with a healthy video OC(with drastically increased minimums now proven in BF3) I am very happy. Have finally caught back up with my 5850s for minimums and Crossfire scaling. Seems these cards were starved for clockspeed! LOL I feel like a newb for not trying it sooner. Even if I have to drop a few MHz to stay stable, I am reliably beating 6870s.



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388


harvrdguy
Senior Member
2. April 2012 @ 01:51
Originally posted by shaff on gtx680:
Wow, Nvidia did bloody well. The power consumption is shockingly low for the performance, considering it's from Nvidia.

Shaff made an appearance! Hey dude! Is your brother still breaking your equipment? LOL Just kidding. Say hi to him - he was playing your steam id once upon a time and I think I teamed up with him on Left 4 Dead.

Originally posted by sam:
HD7970 price cut incoming or AMD are heading for trouble...

Ah hah!

Well, then, here's hoping that price slides a bit by summer.

Originally posted by estuansis:
Koroush Ghazi wrote a wonderful tweak guide with explanations, performance differences and direct screenshot comparisons for every setting:

http://www.geforce.com/Optimize/Guides/...-tweak-guide/#7

Basically, to turn anything lower than Ultra is to neuter the game. Motion Blur and AA method aside, the core graphics settings don't scale well. Essentially when you turn down settings, instead of a slight drop in fidelity or quality, you lose entire features of the engine or objects are simply not drawn. So again, would rather lose AA and FPS to maintain Ultra settings. Battlefield 3 is one of those "whole package" games like Crysis, wherein no single graphical setting makes all the difference, but they all have to be maxed to get the full effect. Maybe even more so, because Crysis had proper scaling and a ton of room for tweaking where you didn't lose effects.
Wow, Jeff, a brilliant writeup of BF3. I bought Koroush's tweak guide for XP, and it was a world of good tips (for only $4.)

I'll have to stay closer to some of Koroush's more recent guides. He's a very good writer, and he puts a lot of effort into what he is trying to do for the community.

Getting back to BF3 - your review is all the more reason I'll have to muscle up to get into playing that game. It sounds like it looks good enough to maybe make up for some of the beating I'll undoubtedly take in the early months until I ramp up, not being anywhere near Shaff's skill level.

From what I remember, I was thinking it would take crossfire 7970's to get good framerates for 30" gaming on that title - your strong recommendation to use only "ultra" settings is heard loud and clear.

With Sam saying that the 7970 will now have to take a price drop to be competitive, maybe we'll see something like $400 in a few months.

Edit - holy crap I missed Jeff's last post. Do I trust my eyesight - overclocking a graphics card? Great!!

What say you, Sam, about all this? Are you also in the mood for overclocking a graphics card? I have to say it enabled me to squeak by on my 3850, and I think I've always gotten about 5 FPS more out of my 8800GTX.

I like how you put it, Jeff: "I suddenly get it. Nothing too aggressive, all stock voltage and tested stable with repeated runs of Unigine Heaven."

Rich


AfterDawn Addict (4 product reviews)
2. April 2012 @ 02:35
The HD7900s overclock like nutcases. It doesn't get you a corresponding level of extra performance like it does with CPUs, but it can still sometimes provide a tangible difference in gameplay.



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
AfterDawn Addict (15 product reviews)
2. April 2012 @ 03:32
I'd say the difference for me is pretty extreme. Some +10-15% average FPS in most games is nothing to laugh at. It effectively gives me two more expensive cards for free. So AMD have been making some champion overclockers and unlockers lately. They have always done so in my experience.

Most prominent was my first performance video card ever, a Sapphire X800GTO2 256MB (R480 chip, same as the X850XT). I was able to unlock all 16 pixel pipes with a BIOS flash and overclock it to X850XT PE specs. It would probably have run those clocks forever, but Sapphire, in all their infinite wisdom, decided to take the X850XT cooler off and use their own proprietary single-slot design made only for stock-clocked X800GTOs. This led to an early heat death, as I was a noob at the time.

The cards were literally locked/downclocked X850XTs with weaker coolers to clear stock after they went OOP. After my X800GTO2 died its early death I replaced it with a genuine X850XT for about $150. The PCBs and memory chips were identical. The cards were identical even down to being able to swap the coolers.

What's even more interesting is that, while the 6800GT and 6800 Ultra were about $300 and $500 respectively, I got my X800GTO for $140; at stock it outperformed the 6800GT, and unlocked and overclocked it outperformed the 6800 Ultra. The X800GTO was truly an excellent card!

Just some old timey trivia:

The X800GTO was the same PCB and clocks as the X800XL, but with 4 of the pipes locked. It could normally unlock all 16 pipes and match an X800XL perfectly.

The X800GTO2 was identical in clocks to the X800GTO, but instead of an X800 PCB, it used an X850 PCB. Almost all of these could unlock and overclock back to an X850XT without much issue.

The big change made was for power consumption only. Even with different PCBs and chips, both models of X800GTO were identical performance-wise. This generation in particular used about 2 or 3 common PCBs for the entire product line.



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388


harvrdguy
Senior Member
16. April 2012 @ 00:21
Wow, got a new real estate customer - several trips down to San Clemente (where Nixon had a house) in order to save him from a foreclosure. My negotiator says we have our work cut out for us with the worst junior lien holder money could buy, plus a federal tax lien. All in a day's work in this weird world of short sales that I find myself in. I'll get back to regular real estate one day when ............ ??????? I don't know. I'm reading a very cool book called The Power of Now. Incredible. If Buddha were around today, and wrote in English, then ........... You know, Buddha - that fat guy. Yeah, him. Only I heard he wasn't really fat like that. But who am I to argue with 100,000 statues. LOL

Oh - gaming thread. I forgot.

With some good old printing from Chrome to color PaperPort, I captured some great BF3 setup tips, courtesy of Jeff, not to mention some nice graphics card overclocking tools. It's all right there in the correct PaperPort game folder and PaperPort graphics card folder (7000 family) for quick future reference. Nice forum.

And I love Jeff's walk down memory lane, in his post above, regarding my old x850xtpe that I forked over $500 for when it first came out! PE of course means, Platinum Edition baby!!! hahahaha


Originally posted by estuansis:
Most prominently was my first performance video card ever, a Sapphire X800GTO2 256MB(R480 chip same as X850XT). Was able to unlock all 16 Pixel Pipes with a BIOS flash and overclock it to X850XTPE specs. Would have probably run those clocks forever but Sapphire, in all their infinite wisdom, decided to take the X850XT cooler off and use their own proprietary single slot design made only for stock clock X800GTOs. This led to an early heat death as I was a noob at the time.

The cards were literally locked/downclocked X850XTs with weaker coolers to clear stock after they went OOP. After my X800GTO2 died its early death I replaced it with a genuine X850XT for about $150. The PCBs and memory chips were identical. The cards were identical even down to being able to swap the coolers.
Hmmmm - I paid $500, he paid $150. I guess those two letters, PE, cost me an extra $350. You can keep your Platinum Edition, baby!!

LOL

Regarding that x850xtpe, I had to pull it out of one of my business computers, since it was coming loose in the socket and giving me blank screens - (bad case on that one machine allows too much flex.)

My current main business desktop is the one that Miles used when he animated Left 4 Dead when it was under development over at TurtleRock. It's just a p4 3.0 ghz - but faster memory and better HT implementation than my 4 ghz p4, so just 15% shy of that computer's performance. (That one is doing duty out in the sunroom for whoever comes over and wants to turn it on and log on - about 6 user profiles.) Here in the trailer / office, on the TurtleRock machine, I have gained appreciation for well-designed cases, like this Dell case, that do not flex. Flexing is a problem since graphics card slots are part of the motherboard mounted on the side panel, but graphics cards are held down from the back panel.

I almost hated to move the machine off my test bench, since it was great for all kinds of hard drive testing, but I ended up putting a two-port Power over Esata card into the Dell, and I can now test any hard drive by plugging the raw drive into this one cable in front of me - without taking the sidecover off the case.




I now test hard drives 100% of the time. But that did not include laptop hard drives until last weekend.

I was up in LA recently to do everybody's taxes, and one of the two Dell D610 laptops I had gotten the relatives up there, for about $300 each refurbished, was said to be running super slow.

I was skeptical because a couple month's prior I had upgraded both to 2 gigs of memory, and the one that was being used for business every day was performing brilliantly. (This all started because I had bought one for myself, and liked it so much, I bought one for another LA relative. But a year ago, I ended up leaving mine and taking the old slow hp laptop in trade - I reasoned I could make it work for my laptop needs, which are not too strenuous. It mostly works okay, but it has an early super-slow usb 1 port, 40 times slower than usb 2, so I carry a tiny powered ethernet switch with me, and try to mostly move data laptop to laptop, without ever using the external usb drive. I also carry a crossover cable - but to use that I have to set manual ip addresses on both machines, so it's almost easier to plug in the switch.

I left the much faster dell, so if the business laptop went down, like it did before, due to a virus or whatever, that person, namely Vanessa, the sister of Miles the animator, could switch over to the other identical machine, only a mile away at another relative's house, who uses it only very lightly mostly for email. Vanessa's whole livelihood depends on being able to get work done on that Dell latitude d610 laptop, so I reasoned that there is nothing better than having an exact duplicate machine nearby in case of an emergency!! When Vanessa logs onto her user name, all her familiar stuff is in the same place.)


A week before coming up, my slow hp had already been upgraded to sp3 and .net framework 4 that the new turbo tax required. So I started one set of taxes on my hp, while I loaded the tax program into the supposedly slow one. Wow - they were right - it was crawling!

I watched as the fast usb 2 connection behaved exactly like the snail-paced usb 1 on my ancient hp, trying to copy over 300mb of sp3, another 300 mb of .net framework 4, and 110 mb of turbo tax 2011.

Less than 1 gig, which normally takes about one minute, took almost an hour. I reasoned it had to be the disk subsystem - it couldn't be a virus - well, maybe it could, but I didn't think that was likely.

I would have gladly abandoned that laptop, and worked on the remaining dell, but for some reason I could not get service pack 3 to install on that one. It failed, and then it failed again, forcing me to go into safe mode each time to uninstall sp3. The third failure left me unable to boot the machine at all! For everybody who hates sp3 - you have a new comrade!!

Anyway, I found that I fortunately had brought many of my gaming tools, so I installed my hard drive test programs, hdtune, hdtach, wd lifeguard, and seagate seatest, after first seeing what normal results I was getting from the other laptop drives, which I had never tested before.

I was shocked to see, not 25MB/sec, which was the normal on the other laptop drives, but only 2.5 MB/sec on the slow machine. That's right, only one-tenth normal transfer speeds!!!! Holy moly!!! I laughed, and the non-tech relatives thought I was nuts!!

Prior to this, the only experience I had ever had with a hard drive slowing down to a crawl like that - which I think I posted about, maybe 3 or 4 years ago - was an old 3.5" 160 gig Seagate, my former XP gaming drive, which had developed bad sectors on the outer edge of the drive. That was before I acquired hard drive test tools; the only thing I ever used was SpeedFan with SMART data. That 160 drive still worked, and it sits today in a box as a clone of some computer, the one I worked on last spring I think, but the 160 is very slow.

But last weekend, this 2.5" laptop drive, at least, slow as it was, was successfully installing sp3, unlike the fast drive on the other identical dell latitude. So I loaded service pack 3, which took forever, and then I loaded turbo tax 2011, which took two forevers. I then ran all my testing software, but none of my testing programs, including wd lifeguard, and seagate seatest, could find anything wrong with the drive - it passed every test!
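
For what it's worth, the kind of raw sequential-read sweep those benchmark tools perform can be sketched in a few lines of Python - purely an illustration, assuming a Windows box with admin rights and that the suspect drive shows up as \\.\PhysicalDrive1 (an example path, not one from this story):

# Minimal sequential-read sweep over a raw disk, timing each chunk so a region
# that drops from ~25 MB/s to ~2.5 MB/s stands out. Run as administrator;
# \\.\PhysicalDrive1 is an example device path - point it at the suspect drive.
import time

DEVICE = r"\\.\PhysicalDrive1"
CHUNK = 4 * 1024 * 1024  # 4 MiB, a multiple of the sector size

with open(DEVICE, "rb", buffering=0) as disk:
    offset = 0
    while True:
        start = time.perf_counter()
        data = disk.read(CHUNK)
        if not data:          # end of the device
            break
        elapsed = time.perf_counter() - start
        rate = len(data) / elapsed / 2**20
        if rate < 10:         # flag anything well below the healthy ~25 MB/s baseline
            print(f"slow region near {offset / 2**20:.0f} MiB: {rate:.1f} MB/s")
        offset += len(data)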

Finally, I stripped all the non-essentials from the drive, got the thing down to 32 gigs, and cloned back to the original 40 gig little Toshiba drive which had been sitting in an external enclosure in a kitchen drawer for 3 years. What should have been a 30 minute clone took 4 hours, but finally it finished successfully.

Then I put the "bad" hard drive into the external case, and the good one into the laptop. The laptop was back to 25MB/sec, and so was the external drive.

What!!!!!!!

After actually finishing everybody's taxes - two different sets - I was going to try to clone back to the "bad" disk, just to see what was up and if there was some problem in how the disk had been inserted into the laptop.

But as I started to clone, with the "bad" disk sitting out in the USB external case, Acronis said "you can't clone - you only have one drive." I shortened the cable to the "bad" disk, thinking my cable must be too long.

Then a wonderful thing happened. Acronis saw both drives, but gave me a warning I have never seen before: "Error, cannot read sector 63 on disk drive two."

Wonderful! Good old Acronis found what none of the testing software could find - not the SMART data, nothing. I had even run CheckDisk on a full disk scan "Attempt to recover bad sectors" and it passed that Windows test with flying colors! But good old Acronis found at least one bad sector at the beginning of the disk, no doubt affecting the master boot record, etc. and responsible for slowing down the transfer rate of the disk to 1/10th normal speed.

So I am a fanatic about testing every single hard drive - laptop or desktop - running them through my testing suite. That all came from that big project last spring, where I ruined two 1TB WD enterprise drives, which each developed pending sectors, each ultimately failed the WD Lifeguard advanced tests, and got RMA'd back to Newegg. The fault at that time was trying to run the drives in RAID 1 off the Koutech power-over-eSATA card, which was getting 5 volt power from the PSU floppy connector. I can't believe the PSU's floppy 5 volt line could be so restricted amperage-wise, straight from the PSU, but maybe that is the case. Otherwise, it is a Koutech card design failure. Somehow, the floppy power supplied only enough amperage to power one disk's logic board, but not two - as soon as you powered the other disk up, the RAID volume failed, and the affected disks started acting strange.

[Image: the Koutech power-over-eSATA card and cable mentioned below]
Nothing like power outages to cause problems with bad sectors on hard drives, and that is effectively what I was doing to those drives with the weak 5 volt supply. But when I attached the included USB connector, bringing in 5 volts that way, the problem went away - as I discovered too late.

I gave the Koutech 5 eggs on Newegg, since I still love it, and as I mentioned, a few months ago I bought another and installed it in my Dell business machine. The USB connector wouldn't work - the motherboard had no matching set of pins - so I used an adapter to pull the 5 volts from one of the Molex hard drive connectors. But I will never try running two drives at the same time from two of the power-over-eSATA cables; the second cable is safely put away. (I also notice that Newegg no longer carries the card, just the cable. Maybe they got tired of RMA'ing expensive hard drives. But the cable will work with any eSATA port into which you can plug a USB cable - meaning that the port is a "powered port".)

I loved the Koutech card, but my several detailed reviews on the card and on the cable pictured above, mentioning in detail the care one has to take about the 5 volt power, probably scared away a lot of potential buyers, lol.

Rich


 