The Official Graphics Card and PC gaming Thread

harvrdguy (Senior Member) | 26. April 2009 @ 23:35
For $65 I should have bought that instead of changing out my power supply the last time for the 3850.
I'm going to buy a little extra power this time, but it's nice to know that something like this is around if you later need just a bit more. They say that power supplies eventually fade out due to capacitor aging.
Anyway, Kevin, how did you ever find the thing?
Sam, I took note of those new Kaze Jyuni fans you are thinking of picking up - you're saying more CFM per dB even than the Scythe S-Flex? Very interesting!
Yeah Shaff, initially I was all over the idea of ceiling intake. I will definitely have to do some testing. If I tried the east/west orientation and was able to fit push-pull that way, then, if Sam is right about the short-circuiting, I could maybe prevent it by fashioning a duct and blowing the top intake directly down on the VRMs.
On the Hardware Canucks open bench, adding a fan blowing down on the northbridge PWM made a huge difference in temps - their chart showed about 20 degrees.
I will have two side-door fans for the upper CPU section, and that's all. Standard is 4 exhausts if you include the one behind the mobo. The single front intake will probably be directed toward the GPU compartment - they have a sliding fan rack and I could point it toward the motherboard instead, but that leaves me a bit weak on inlet air for the GPU section (well, I do have two more side inlets for that part, plus a bottom - so maybe I'll be okay).
Shaff - a new card on a 40nm process, the 4770. Does that mean a new family of cards is coming soon? Isn't the current generation at 55nm - isn't 40nm a huge leap!
Rich
AfterDawn Addict (7 product reviews) | 27. April 2009 @ 00:05
Originally posted by harvrdguy: Anyway, Kevin, how did you ever find the thing?
It's been a while, but I think I ran a search for "independent power supply" on http://www.google.com/products - it's kind of like PriceGrabber, only using Google. QUITE handy! :)
harvrdguy (Senior Member) | 27. April 2009 @ 00:11
Hmmmm. Good job. Google is coming out with all kinds of stuff - their own browser, and somebody said their own OS??
The battle of the giants, lol.
AfterDawn Addict (7 product reviews) | 27. April 2009 @ 00:23
Originally posted by harvrdguy: Hmmmm. Good job. Google is coming out with all kinds of stuff - their own browser. Somebody said - their own OS??
The battle of the giants, lol.
LOL. They also have another kind of search - you can find just about anything with the filter turned off too ;) http://images.google.com/
AfterDawn Addict (4 product reviews) | 27. April 2009 @ 05:59
Yes they do - two very small, very loud fans. Also, if they're really 450 watts I'll <insert something silly here>.
The fact that your PSU is undervolting as much as it is suggests it's going to become a big problem later on. Don't bother spending all this money on working around the problem; just buy a decent power supply to start with. With only one GTX260 in your system, a 400W Corsair CX would actually power it - that should show you how badly the OCZ is performing.
Remember, 450W is only the requirement for a PSU to run the whole system, not just the graphics card. The GTX260 itself will draw no more than 150W or so.
Ultimately, one better PSU is ALWAYS a better bet than two. Fewer weak links for stability, guaranteed quality control, lower noise, and even lower cost.
Rich: I don't remember completely banhammering the concept of top intake. For a start, when I only had an 80mm top fan in the Lexa, intake is what I used it for. For a standard configuration with normal X2 heatsinks, though, top exhaust is the best way to do it, as the side fan(s) need to be intake. If the GPUs were switched to internal-exhaust coolers like the Arctic Accelero Xtreme, the side fans would be turned around to exhaust, and the top fan converted to intake.
Realistically, I never saw the point in lapping CPUs or coolers. Lapping a cooler, fair enough - what have you got to lose? Heatsinks are cheap, and they tend not to go wrong. Lapping a CPU, though, I wouldn't want to risk, just in case. Ultimately you're shaving a pretty small amount off the CPU temperature unless the CPU and cooler have been lapped together so the surfaces mate properly - and realistically, that's what thermal paste is for: filling the gaps between the uneven surfaces...
Shaff: Asking nvidia for a price drop is like asking a lazy student to do their homework. Only at the last minute, when it's absolutely necessary, will you see any degree of co-operation.
Rich: The extra graphics PSUs have been around for years. They came out as a stopgap when SLI was first being touted with the 6800 series (that's right, around the era the X850 had just been released), before 500/600/700W PSUs were commonplace in the market. The fact they've lasted in most people's builds is probably because they're so under-loaded - people think graphics cards draw more power than they do. Remember, all the bay PSU can do is provide the power that runs to the PCI-E connectors; a lot of the power a graphics card uses comes through the PCI Express slot, which still has to come from the main power supply. Even with a GTX260, the extra PSU is perhaps taking 70 or 80W off the main unit. That's about it.
The extra fan on the PWMs is something I considered, as they do get hot on the Maximus II, but I thought better of it: realistically, the temperature isn't actually an issue - it doesn't create any problems, it's just a number sitting there. Once again, PWM temps are something that have come down sharply since moving to a better room.
AfterDawn Addict | 27. April 2009 @ 07:10
Well, the PCI-E slot gives 75W, and each 6-pin gives 75W as well, and with two PCI-E 6-pin connectors attached I'd assume the draw from them is going to be over 75W, possibly 100-130W. Anywho, it'd be nice to see some real-world testing of them :). They are made by a very reputable company.
AfterDawn Addict (4 product reviews) | 27. April 2009 @ 07:12
Well, 75W is the theoretical maximum. As far as I'm aware, slot power takes preference over connector power until it's overburdened, at which point the connectors take over. Ultimately, short of plugging two 4870X2s into it, there's no way of getting more than a 300W load out of that thing - and if you do that, you're drawing almost as much through the slots as a GTX260 uses in total in the first place...
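To put rough numbers on that slot-vs-connector split, here's a minimal Python scratchpad using the 75W figures quoted above (theoretical ceilings only - real cards draw well under them, and how a given board splits its draw varies):

```python
# Theoretical PCI-E power ceilings, per the figures quoted above:
# 75W from the x16 slot and 75W per 6-pin connector.

SLOT_W = 75       # PCI Express x16 slot
SIX_PIN_W = 75    # per 6-pin PCI-E connector

def card_ceiling(six_pins):
    """Max theoretical draw for a card: slot plus its 6-pin connectors."""
    return SLOT_W + six_pins * SIX_PIN_W

def bay_psu_offload(six_pins):
    """Most a bay PSU can ever take off the main PSU: the connectors only.
    The slot's share always comes through the motherboard."""
    return six_pins * SIX_PIN_W

print(card_ceiling(2))     # 225W ceiling for a two-connector card like a GTX260
print(bay_psu_offload(2))  # 150W at most -- ~70-80W in practice for a GTX260
```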
AfterDawn Addict (15 product reviews) | 27. April 2009 @ 12:49
Yeah, that's the general idea. The cards draw almost completely from the slot until they're under load. And the real power usage of most cards is nowhere near 450W. I'm running my entire system - an OC'd 125W quad core, two high-end cards, two HDDs, a DVD-RW, a floppy, two sticks of RAM, plus several odd peripherals and lights - off a 620W PSU. Don't tell me you need a separate PSU just for your video cards.
Though back in the day of the 6800s, SLI wasn't even worth it to begin with. Most times the gains were next to nothing, even at high res. Two X850s, though, is a different story - in the games it worked with, Crossfire was already showing 80%+ scaling. Crossfire was just better from the get-go.
This coming from a fan of both ATi and Nvidia :P
AfterDawn Addict (4 product reviews) | 27. April 2009 @ 13:50
Yup, and I managed an overclocked 95W Q6600 with a 4870X2 and three HDDs with no issues off a 520W HX.
Agreed on the SLI scaling - CF had it better from the get-go, but it was only with the HD series that they moved to internal bridge connectors rather than external DVI links.
AfterDawn Addict (15 product reviews) | 27. April 2009 @ 15:22
LOL @ the external links. What a bother - the number one reason I never tried Crossfire with my X1800XT. The performance was right up there though, or so I've read. Agreed that it didn't really become practical until the HD cards, though.
Nvidia also had it working a lot better and more reliably with the GeForce 7 series, especially the lower-end cards. My friend with the SLI 8800GTS 640s used to have SLI 7600GTs; the scaling was quite good in a lot of games - a few saw close to double. Crossfire was still more stable and scaled a bit better, though.
AfterDawn Addict (4 product reviews) | 27. April 2009 @ 15:22
Yeah, not as good as it is today, but not far off. Certainly better than SLI was at the time.
AfterDawn Addict (15 product reviews) | 27. April 2009 @ 15:29
read my edit...
AfterDawn Addict (7 product reviews) | 28. April 2009 @ 02:55
Well... something SOME people may find interesting: I'm running 8GB of RAM, and GTA IV only uses 46% of it (about 3.7GB). This is after an hour of intensive gaming :D
I'm gonna strongly recommend 4GB to the general public for Windows 7. While 2GB is fine, 4GB is wonderful - although it ran OK on a 512MB system of mine. 2 and 3GB is becoming the minimum these days, which is why programmers need to concentrate on FULLY embracing 64-bit processing. It's time to move forward, people, LOL!
AfterDawn Addict (4 product reviews) | 28. April 2009 @ 07:55
Omegaman: Have you applied the -norestrictions and -nomemrestrict switches to the game so you can max every slider at 100%? Try that, then tell me what your TOTAL system RAM usage is. I wouldn't mind betting it's at least 6GB - closer to 7 if you have a lot of background apps open like I do.
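For anyone who hasn't used those switches: as far as I know, they go in a plain commandline.txt in the GTA IV install folder (the same directory as the game's executable), e.g.:

```
-norestrictions
-nomemrestrict
```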
AfterDawn Addict (4 product reviews) | 28. April 2009 @ 10:20
HD4770 just been released.
Price: £80 (80% of the HD4850, 60% of the HD4870 512MB, 50% of the HD4870 1GB, 42% of the HD4890)
Performance: anywhere from 5% slower to 10% faster than the HD4850 (c. GTS250 performance)
Power consumption: idle roughly that of the HD4850; load midway between the HD4670 and HD4830, at around 60W.
Conclusion: Awesome product. New #1 card of choice for midrange PC systems. A 9800GTX+ and GTS250 beater for £80 with the power consumption of an 8600...
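To spell the price comparison out, here's a quick Python scratchpad that turns those percentages back into implied prices (rounded - street prices will obviously wander):

```python
# Back out implied prices from the HD4770's £80 tag and the
# percentages above. Rounded; actual street prices will vary.

HD4770_PRICE = 80.0  # GBP

relative = {  # HD4770's price as a fraction of each card's price
    "HD4850":       0.80,
    "HD4870 512MB": 0.60,
    "HD4870 1GB":   0.50,
    "HD4890":       0.42,
}

for card, frac in relative.items():
    print(f"{card}: ~£{HD4770_PRICE / frac:.0f}")
# -> HD4850 ~£100, HD4870 512MB ~£133, HD4870 1GB ~£160, HD4890 ~£190
```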
AfterDawn Addict | 28. April 2009 @ 11:16
Well, there's no need to buy a 4850 any more, but in a lot of games the GTS250 thwamps it. Some games it loses, but most it wins, and some of the wins are by a big margin.
This for £85-ish, or the GTS250 for about £99-ish....
If you can afford £90+, I'd say go up to a GTS250, but if you can't, then the 4770. Either way, right now the 4850 has been taken off the map.
(Boy am I glad I picked up a 4870 for £125 instead of a 4850 for £100 :D)
AfterDawn Addict (4 product reviews) | 28. April 2009 @ 11:52
Except Far Cry 2, where the HD4770 owns the GTS250 :P
I'm starting to veer off Bit-Tech as they're showing a bit of bias these days, but taking their results:
Crysis
HD4830: 284/327
9600GSO XXX: 195/253
HD4770: 337/391
9800GT: 356/425
HD4850: 375/439
GTS250: 449/531
Here's the lowdown on the Hexus test:
Call of Duty 4 4xAA:
HD4670: 135.7
9600GT: 162.7
HD4830: 176.2
9800GT: 205.2
HD4770: 223.7
HD4850: 230.3
Company of Heroes Opposing Fronts:
HD4670: 111.3
9600GT: 149.6
HD4830: 195.9
9800GT: 205.9
HD4850: 223.3
HD4770: 237.6
Enemy Territory: Quake Wars 4xAA:
HD4670: 174.9
9600GT: 204.3
9800GT: 236.7
HD4830: 250.4
HD4850: 254.9
HD4770: 278.5
Far Cry 2 DX10:
HD4670: 152.1
9600GT: 164.6
HD4830: 193.2
9800GT: 198.5
HD4850: 221.0
HD4770: 222.5
GRiD:
HD4670: 125.4
9600GT: 144.0
9800GT: 184.3
HD4830: 214.8
HD4770: 227.7
HD4850: 248.2
Idle (est figures)
HD4670: 12W, 44ºC
HD4830: 24W, 28ºC
9600GT: 29W, 33ºC
HD4770: 32W, 43ºC
HD4850: 39W, 70ºC
9800GT: 40W, 40ºC
Load (est figures)
HD4670: 40W, 82ºC
HD4770: 49W, 60ºC
9600GT: 59W, 44ºC
HD4830: 72W, 50ºC
9800GT: 74W, 55ºC
HD4850: 85W, 81ºC
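If you want the efficiency story as a single number, here's a rough Python perf-per-watt calculation from the Far Cry 2 results and the estimated load figures above (card-only estimates, so take it as indicative):

```python
# Rough performance-per-watt from the Hexus figures above:
# Far Cry 2 DX10 fps divided by estimated card-only load power.

cards = {
    # name: (Far Cry 2 fps, est. load watts)
    "HD4670": (152.1, 40),
    "HD4770": (222.5, 49),
    "9600GT": (164.6, 59),
    "HD4830": (193.2, 72),
    "9800GT": (198.5, 74),
    "HD4850": (221.0, 85),
}

for name, (fps, watts) in sorted(cards.items(),
                                 key=lambda kv: kv[1][0] / kv[1][1],
                                 reverse=True):
    print(f"{name}: {fps / watts:.2f} fps/W")
# The HD4770 leads at ~4.5 fps/W; the HD4850 manages ~2.6.
```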
AfterDawn Addict | 28. April 2009 @ 12:46
why is it biased?
AfterDawn Addict (7 product reviews) | 28. April 2009 @ 13:01
No, I don't have "no restrictions" turned on in GTA IV - I was unaware of it, actually, LOL. And another thing: is AA controlled by the Nvidia control panel for this game? Go ahead, you can laugh. LOL. I'm still not much of a gamer :P
AfterDawn Addict (4 product reviews) | 28. April 2009 @ 13:25
Afaik, AA is permanently disabled for the game - it's simply not supported.
The Bit-Tech results look biased because, in their recent tests, they have consistently shown outlier results in the nvidia vs AMD match-ups. Most sites show a trend where X competes with Y; with Bit-Tech the results are so unusual that those trends don't always hold.
AfterDawn Addict | 28. April 2009 @ 13:31
TBH it may be the merger with Custom PC that has caused that, as I have always found Custom PC to be biased (and this is coming from a subscriber :D).
AfterDawn Addict (7 product reviews) | 28. April 2009 @ 13:31
So... attempting an AA setting in the Nvidia control panel would have no effect? If not, do you think either Rockstar or M$ will release a plugin/patch for this ability? Hearing you guys talk of AA, it sounds like SERIOUS eye candy LOL!
AfterDawn Addict (4 product reviews) | 28. April 2009 @ 13:41
AA in GTA4 would be amazing - the game really badly needs it. However, given the game's pathetic performance, and the fact that the main thing AA eats is video memory (oh no!), you're unlikely to ever see it implemented.
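To illustrate why AA eats video memory: with plain multisampling, the colour and depth buffers are stored once per sample. A crude Python estimate (it ignores driver-side compression and any extra render targets, so treat it as an upper-bound sketch):

```python
# Crude MSAA memory estimate: multisampled colour + depth/stencil buffers.
# Ignores compression and extra render targets -- upper bound only.

def msaa_buffers_mb(width, height, samples,
                    bytes_per_sample=4 + 4):  # 32-bit colour + 32-bit depth/stencil
    return width * height * samples * bytes_per_sample / 2**20

for s in (1, 2, 4, 8):
    print(f"{s}x at 1920x1200: ~{msaa_buffers_mb(1920, 1200, s):.0f} MB")
# ~18 / 35 / 70 / 141 MB -- painful when GTA4's textures already
# fill a 512MB-1GB card on their own.
```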
AfterDawn Addict (4 product reviews) | 30. April 2009 @ 20:36
Some more HD4770 facts:
Crossfire performance (total scores - scaling factors worked out in the sketch below):
Quake Wars:
HD4770 Solo: 240
HD4890 1GB: 352
HD4770 Crossfire: 443
Left 4 Dead:
HD4770 Solo: 285
HD4890 1GB: 391 (CPU bound testing)
HD4770 Crossfire: 417 (CPU bound testing)
Call of Duty 4:
HD4770 Solo: 240
HD4890 1GB: 391
HD4770 Crossfire: 508 (Yes, I know)
HAWX:
HD4770 Solo: 356
HD4890 1GB: 504
HD4770 Crossfire: 612
Power consumption (Whole system AC)
HD4770: 153W idle, 164W video, 230W Gaming, 215W Furmark
HD4890: 185W idle, 197W video, 333W Gaming, 377W Furmark
4770CF: 190W idle, 205W video, 310W Gaming, 316W Furmark
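And the promised scaling sketch - CF score divided by solo score, using the totals above:

```python
# Crossfire scaling factors from the totals above: CF / solo.

results = {
    # game: (HD4770 solo, HD4770 Crossfire)
    "Quake Wars":     (240, 443),
    "Left 4 Dead":    (285, 417),  # CPU-bound, so scaling is understated
    "Call of Duty 4": (240, 508),  # >2x -- the "Yes, I know" result
    "HAWX":           (356, 612),
}

for game, (solo, cf) in results.items():
    print(f"{game}: {cf / solo:.2f}x scaling")
# Roughly 1.5-1.8x where the GPU is the limit; CoD4's 2.12x
# is almost certainly a quirk of the test, not real scaling.
```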
harvrdguy (Senior Member) | 30. April 2009 @ 22:27
But what about the 4770's 40nm die and GDDR5 that Shaff mentioned? Is it really a smaller die? Does that mean the 5000 family is just around the corner?
Ah hah, Sam, you actually used a ceiling intake on the Lexa. Very interesting!
Here's my earlier concept of the Spedo flow - the 4 side intakes are what you did to your HAF. I think this design needs some more work. For one, unlike the HAF, the Spedo only has that one front intake; I could fix that with some Kama Bays (... see bottom picture.) Another thing that bothers me is that I suspect I don't really have positive pressure in both sections of the case, which is my over-riding dust-banning modding philosophy (quick tally sketch below). Of course I could experiment with different fan speeds - these are all going to be 120s I think, probably several of them that new one you discussed, the 9-bladed Kaze Jyunis.
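Here's the quick tally I mean - sum the rated intake CFM against the exhaust CFM per compartment and see which way the balance falls (rated free-air figures, so filters and grilles will knock them down; the fan counts and CFM numbers below are placeholders, not my actual picks):

```python
# Quick positive-pressure check per case compartment:
# compare total rated intake CFM against total exhaust CFM.
# Rated free-air figures only; filters/grilles reduce real flow.
# All fan counts and CFM values here are placeholders.

def pressure(intakes, exhausts):
    diff = sum(intakes) - sum(exhausts)
    sign = "positive" if diff > 0 else "negative"
    return f"{sign} by {abs(diff)} CFM"

# Hypothetical upper (CPU) section: two side intakes vs rear + top exhaust
print("CPU section:", pressure([45, 45], [60, 60]))     # negative by 30 CFM

# Hypothetical lower (GPU) section: front + bottom + two side intakes vs one exhaust
print("GPU section:", pressure([60, 45, 45, 45], [60]))  # positive by 135 CFM
```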
Originally posted by sam: I don't remember completely banhammering the concept of top intake. For a start when I only had an 80mm top fan in the Lexa that's what I used. For a standard configuration though with normal X2 heatsinks, that's the best way to do it as the side fan(s) need to be intake. If the GPUs were switched to be internal-exhaust coolers like the Arctic Accelero Extreme, the side fans would be turned to be exhaust, and the top fan converted to intake.
So if I understand you, if I were to pick up the Asus X2 cards on eBay, for example, which don't have the standard turbine cooler (the cooler is 3 fans venting mostly inside the case), then I might want to consider turning the lower set of side fans into exhaust fans rather than intake fans, as I show in this picture:
In this picture I am also questioning whether the case ceiling fan should be intake or exhaust, and thinking of adding some more front intake fans by way of a Kama Bay. The picture also indicates the partitioning. I am very sold on the concept, initially advanced by you, Sam, that we have to think of the top of the case as being separate from the GPU section at the bottom.
After you introduced that idea, I later took particular note, when I traded my Antec 1200 for the Spedo, that Thermaltake, in the "advanced" Spedo design, actually puts a slim partition above the first graphics card to keep GPU heat away from the motherboard. (I think I could easily mod that.) You pointed out, Sam, that even without such a barrier, the X2 card, due to its size, effectively partitions the case.
So I have to ask this question: if, pursuing the ideal design, one is actually able to create a truly partitioned case, then WHY WOULD THE ORIENTATION OF THE CEILING FAN DEPEND ON THE TYPE OF COOLING ON THE GRAPHICS CARDS? Ideally, what is happening in the top of the case shouldn't have anything to do with the bottom of the case, right?
Regarding the lapping - I agree, what harm in getting the TRUE lapped? For the CPU, it's that little edge again - those extra 3 or 4 degrees, for which one risks voiding the warranty. It appears that the D0 stepping which Rubbix alerted me to on the 920 is a winner - it may end up performing so well that no lapping will ever be needed. I hear what you're saying about thermal paste - obviously it works, and it's designed to fill in the voids - but lapping is designed to eliminate the voids, right? LOL
Originally posted by sam back to kevin: Omegaman: Have you applied the -norestrictions -nomemrestrict tag to the game so you can max every slider at 100%? Try that, then tell me what your TOTAL system RAM usage is. I wouldn't mind betting it's at least 6GB, closer to 7 if you have a lot of background apps open like I do.
Sam, why would you have a lot of background apps open while trying to play GTA4? I know you have another computer that is your server. When I get ready for GTA4 I will undoubtedly be using "end-it", which the PunkBuster people told me about, to kill off everything, so as to try to run the game at 2560x1600. "A lot of background apps open" - not me!! LOL (When is the 8 gigs coming, Sam?)
Speaking of hard on the graphics, I loaded that RTS you and Estuansis talked me into - World in Conflict. What a beauty!! But all hell is breaking loose!!
I kept crashing my GPU in Boot Camp until I finally backed off to 1440x900 and non-high settings. But just like Crysis and the others, this game is too pretty to play like that. (Finally getting past the tutorial, I dishonored myself advancing through the Berlin Wall - I couldn't hold the key bridge. So I was told that besides being decommissioned, I'll undoubtedly face a tribunal for my disgraceful performance on behalf of the motherland. With that crushing humiliation, I think I'll put the game away until the new build, lol.)
And furthermore, I don't think I'll play it on super hard until I get the hang of it - reinforcements, artillery, bombing runs, air defenses - there is a helluva lot more going on than just blasting somebody with my P90!! LOL! (I think I can see why RTS is popular - I'm not exactly sure it's for me, but I'll give it another try later on the new equipment. If I like it, there's another WWII RTS I've been reading about on Steam - I think it's that one, Company of Heroes, from the Bit-Tech benches you just posted.)
Rich