|
The Official PC building thread - 4th Edition
|
|
|
Senior Member
|
26. July 2013 @ 12:30
|
Originally posted by sammorris: Heh, I wish. Going to be a long hard slog to save up that sort of money...
You never know Sammy, in six months' time, the way things have been going, the price might be cut in half.
|
|
|
|
Senior Member
|
26. July 2013 @ 13:28
|
Someone say they have drugs for sale? What's the price per gram on your finest hashish?
|
Senior Member
|
26. July 2013 @ 13:38
|
Originally posted by Deadrum33: Someone say they have drugs for sale? What's the price per gram on your finest hashish?
LOL, I haven't tried that since the '60s. Here in the Philly area I don't think it has been available for years; pot and hard drugs are what's available, if that's anybody's thing. I haven't even messed with pot for years, and the other stuff, I'm too old, I'd most likely get a heart attack. It was fun years ago, but today I'm stuck being sober. My new high has been messing with computers: less expensive and a whole lot less trouble.
|
AfterDawn Addict
4 product reviews
|
26. July 2013 @ 15:40
|
Originally posted by FredBun: Originally posted by sammorris: Heh, I wish. Going to be a long hard slog to save up that sort of money...
You never know Sammy, in six months' time, the way things have been going, the price might be cut in half.
Which would still make it two months' salary :D
|
AfterDawn Addict
7 product reviews
|
26. July 2013 @ 16:46
|
Yeah, my current part time job would take a VERY long time to save funds. Hopefully I'll be full time in months to a year. 3 years after I become full time, I'll make 70-90K :D Some people make more. Can't wait. There are so many good things I can do with that money. And I'm not just talking about myself ;)
To delete, or not to delete. THAT is the question!
|
AfterDawn Addict
4 product reviews
|
26. July 2013 @ 18:26
|
I'm getting there with the skill set for a decent salary, it's just the experience being built now. Current plan is to be on the equivalent of $35,000 by the end of the year. Am hoping to be on the equivalent of $50,000 by the end of my 20s (not so long away now! :S)
|
Senior Member
|
26. July 2013 @ 18:29
|
Originally posted by sammorris: I'm getting there with the skill set for a decent salary, it's just the experience being built now. Current plan is to be on the equivalent of $35,000 by the end of the year. Am hoping to be on the equivalent of $50,000 by the end of my 20s (not so long away now! :S)
A go-getter, that's what I like to hear!!!!
|
AfterDawn Addict
15 product reviews
|
27. July 2013 @ 00:26
|
Got the heatsink swapped out to the heatpipe cooler from the 790X-UD4P. About a 2-3*C difference I would say. Can't tell if it's the new TIM application or the heatpipe, but there is definitely a difference.
64*C on the northbridge is still a little warm for me, but most sources are saying it's within the realm of what I should expect under Prime 95. Good thing I don't run Prime 95 24/7, haha. Am going to be adding a 40mm fan to it very soon as well. Should bring it down below 60 at the least :D
1.5v I believe is also the highest they recommend trying on these chips without liquid cooling, but my core temps at 1.525 aren't touching the stability wall of 55*C so I think I'm okay.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
This message has been edited since posting. Last time this message was edited on 27. July 2013 @ 01:31
|
AfterDawn Addict
4 product reviews
|
27. July 2013 @ 08:08
|
45ºC on the cores? Which of those three sensors is the CPU temp, the 36 or the 54?
Normally you'd expect core temps to be 10-15ºC higher than TCase, if the reverse is true, I think the Tjunction must be set wrong, surely?
Nice little collection of hard disks in that PC too, one of each :D
As long as it's stable at that temp and not overvolted, the chipset will probably run at 64ºC for years. I personally don't like to have a chipset over 60ºC as I've started seeing issues there, but I've often had to overvolt chipsets to get the best overclocks, hence the high temps in the first place. If it is overvolted, I think you might be shortening its lifespan quite considerably at that high a temp.
|
Senior Member
|
27. July 2013 @ 10:44
|
Originally posted by FredBun: Originally posted by sammorris: Well, what I've been waiting for has finally been announced!
http://www.techspot.com/news/53343-dell...ving-in-q4.html
Time to start saving :/
I'd be surprised if this falls short of $3000 upon release.
Sammy I hope you have a great paying job, I too would love to have one of these bad boys, but I would have to sell drugs on a street corner to afford one.
This is definitely designed as a business monitor, though it may not fit into drug sales on your local corner? LOL Great call Fred....
Stevo
|
AfterDawn Addict
15 product reviews
|
27. July 2013 @ 22:43
|
Quote: 45ºC on the cores? Which of those three sensors is the CPU temp, the 36 or the 54?
Normally you'd expect core temps to be 10-15ºC higher than TCase, if the reverse is true, I think the Tjunction must be set wrong, surely?
Temp0 is Ambient, Temp1 is the on-board CPU temp, Temp2 is the Northbridge. AMD's on-core sensors are notoriously badly calibrated. I pay much more attention to the Temp1 sensor than to the cores. It is working properly. Remember it's the max temps we need to look at.
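Since, as noted above, it's the maximum readings that matter, a temperature log can be reduced to per-sensor maxima and checked against a stability wall. A minimal sketch (purely illustrative; the sensor names and the 55/60 figures are just the ones discussed in this thread, not output from any real monitoring tool):

```python
# Hypothetical helpers for reducing a temp log to per-sensor maxima.
# "samples" is a list of readings, e.g. one dict per polling interval:
#   {"Temp1": 50.0, "Temp2": 58.0, ...}

def max_temps(samples):
    """Return the highest value seen for each sensor across all samples."""
    maxima = {}
    for sample in samples:
        for sensor, value in sample.items():
            if sensor not in maxima or value > maxima[sensor]:
                maxima[sensor] = value
    return maxima

def within_wall(maxima, sensor, wall):
    """True if the sensor's maximum stayed below the stability wall."""
    return maxima.get(sensor, float("-inf")) < wall
```

For example, a Prime95 session where the on-board CPU sensor (Temp1) peaks at 54 passes a 55-degree wall, while a northbridge (Temp2) peaking at 64 fails a 60-degree one.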
Quote: Nice little collection of hard disks in that PC too, one of each :D
Ha most of those should be in the Q6600 filebox but they always seem to float around depending on what I need to do. The FALS is my OS drive and there are 2 more Seagates below those. The FALS and two newer Seagates make up this system's normal drive array. The HAF is currently full up on drives :P
Quote: As long as it's stable at that temp and not overvolted, the chipset will probably run at 64ºC for years. I personally don't like to have a chipset over 60ºC as I've started seeing issues there, but I've often had to overvolt chipsets to get the best overclocks, hence the high temps in the first place. If it is overvolted, I think you might be shortening its lifespan quite considerably at that high a temp.
The NorthBridge is currently not overvolted. All at stock volts. I have the cooler swapped out from the older board and the 40mm fan for it is next on my list. I want to get it back below 60 or, as you say, it's not going to last long-term.
Take comfort in the fact that it (the Northbridge) never even goes to 55 under gaming and other loads. Only Prime 95 takes it that high. I played some 4 hours of Max Payne 3 today and it only hit 51*C.
Under gaming load, which is the hottest this PC is going to get daily, the temps are VERY comfortable. Not a single hint of being too hot. In fact, quite cool compared to similar AMD systems.
My current temps sitting idle with a few browser windows open, a movie playing, and some torrents downloading.
As you can see, there is no cause for alarm here.
This message has been edited since posting. Last time this message was edited on 27. July 2013 @ 23:22
|
AfterDawn Addict
4 product reviews
|
28. July 2013 @ 05:02
|
To be honest, my temps get a bit silly when doing a proper stress test, primarily because the heat the GPUs generate when being stressed (500W between them in an all-transistors test environment) warms the PCH, which sits directly underneath them; it reaches 65C odd unless I max the side fans out at 1900rpm. I've lost two of those fans since last doing that (one failed fan, one failed channel on the Scythe controller). In normal usage though, even in the most hardware-demanding (not performance-demanding) games like Left 4 Dead 2, it's mostly 50s on anything that isn't the GPUs. Annoyingly, since finally updating from an old 2012 set of drivers to a 2013 one (13.6 maybe?), maybe once in every 10 or so L4D2 games I'll get a hardware lockup mid-game. The timing correlates too well with the graphics driver update for me to think it's hardware, and it doesn't happen in any other titles that I've seen.
Given that crossfire on the HD7 series is still quite poor to actually use, I was at one point considering moving to SLI for my next upgrade (which'd still be a long way off at the moment), but now that a 4K monitor is definitely on the shopping list, that option's been removed, as nvidia still don't support 4K60 properly due to the displayport debacle. AMD it is.
|
AfterDawn Addict
15 product reviews
|
28. July 2013 @ 08:37
|
The final steps in my little cooling upgrade will be to make sure that the temps do not get silly at all. Obviously, practicality must take priority here. My PC is not going to be running Prime 24/7, so it's never going to be pinned at max temp like the screenshots show. But I would like to have that ability nonetheless, if for nothing but peace of mind...
I think every performance user and OC'er would like to be able to run a stress test 24/7 and never have an issue. Impractical, maybe, but it sure is a bullet-proof way to know your cooling will always be enough.
As for stress testing GPUs, I think it's absolutely stupid. No load, not GPU compute, not rendering, nothing, will ever push them to that kind of power draw or heat output. It's completely unrealistic, much more so than using Prime or IBT on a CPU, and will never serve a purpose except to destroy good hardware and to test liquid loops.
As far as drivers, AMD have been dragging their feet for about 4-5 months now. I am currently considering an Nvidia card for my next upgrade as AMD have severely let me down with their driver support for the HD6800 series. Basically no Crossfire support for any new games until they're far beyond old news. Still waiting for a proper CAP for Metro Last Light as my current scaling is only some 40%, when Metro 2033 has about 90%.
Not to mention that in the past my raw performance has been dictated by AMD's mood that month. Right after I got these cards, a driver update necessary for Crossfire scaling in some newer titles caused their single GPU performance to take a 10-15% dive and it was never remedied. Excuse my language but that's absolute BULLSHIT. My hardware got directly nerfed to sell newer cards. I believe you've recorded this change yourself.
I'll also mention that several companies have blatantly halved the performance of AMD cards vs the equivalent Nvidia. TWIMTBP isn't stagnating, it's growing. Don't even get me started on PhysX. Basically removing features from the game that have nothing to do with PhysX, and withholding them until you buy an Nvidia card.
http://www.youtube.com/watch?v=VafzR7JqO2I
I rest my case. Blatant BS. Not a single effect here actually requires PhysX. Havok is capable of all of it with the same general precision and performance, and CPUs are currently underutilized in games, so there's easily enough power in most gaming rigs to render it. Those are some pretty spectacular effects as well, and I feel I am truly being cheated out of the full game. Try enabling PhysX on an AMD card, and you get the effects, but at 1/4 the framerate...
This message has been edited since posting. Last time this message was edited on 28. July 2013 @ 10:04
|
AfterDawn Addict
4 product reviews
|
28. July 2013 @ 12:12
|
Quote: Right after I got these cards, a driver update necessary for Crossfire scaling in some newer titles caused their single GPU performance to take a 10-15% dive and it was never remedied. Excuse my language but that's absolute BULLSHIT. My hardware got directly nerfed to sell newer cards. I believe you've recorded this change yourself.
This did happen but as I understand it, this was due to the fact that the HD6800 series performance was actually artificially raised by an image quality hack at launch, which was quickly removed when it was spotted by reviewers etc. In all honesty, I'd prefer it gone and have lower performance, as historically, image quality over performance was what owning an ATI was all about.
AMD's attitude re: TWIMTBP is 'If you can't beat them, join them' and there is now a small number of titles where roles are reversed, but not many. AMD have never been anything like as good at marketing as nvidia, and it shows when you look at how many fanboys there are of either side. I'd probably say it's at least a 4:1 or even 5:1 ratio of nvidia fanboys versus AMD fanboys, which is what makes querying (or even advising) AMD hardware all the more difficult in open forum.
Code was found in nvidia's PhysX implementation years back that artificially limits the frame rate by capping the PhysX process at c. 30% of one core of your CPU (and as far as I know it has never been removed). I'll also point out that this code, though crippling on any CPU, means owners of AMD FX CPUs, with their lower per-core performance, are hit even harder.
PhysX is perfectly capable of running on the CPU if you've got the spare cores to handle it, but extra performance and visual effects are not what PhysX is about - it's all about degrading the performance of your opposition through, effectively, bribing game developers.
It's a disgusting practice which I've long tried to counter by avoiding the purchase or recommendation of nvidia hardware unless it makes an obvious economic case. But frankly, people who actually do the research into this sort of stuff and properly understand what's going on are in a real minority: a small drop in an ocean of 15-year-old Call of Duty players with blue LED-lined cases who think nvidia are god's gift to the earth. It's all marketing; nobody's born to think that way.
I could understand nvidia's reluctance to embrace displayport, as the implementation of adapting it to fit DVI displays (for eyefinity purposes) is absolutely disgusting, and by far the worst attribute of AMD's driver standards. That said, they have now at least added it to the recent crop of Geforce cards. It's still only a single full-size connector rather than multiple mini-displayports though, and further, nvidia's implementation of 4K is quite poor still.
Really, going for ultra high-res displays is still AMD's territory - nvidia aren't going to develop cards to work well with a platform that leaves their GPUs decidedly second best (Geforce performance drops off quite rapidly above 2560x1440).
Thing is though, the way things currently are, SLI delivers an enjoyable gaming experience, Crossfire doesn't. Unless that changes, one single HD7970 is as good as it gets on the AMD side, and for 3840x2160 gaming, even if you disable AA due to the higher DPI on the screen, that's looking a bit thin.
I'll re-assess the situation at such time I can actually afford a 4K display, so probably a little under a year from now. By then, if Crossfire still suffers the same issues it does today with the HD7 series, it'll be time to start looking at the Geforce lineup. Otherwise, it's still going to be an AMD solution next upgrade.
|
AfterDawn Addict
15 product reviews
|
28. July 2013 @ 12:57
|
I resent the Blue LED remark :P I like flashy stuff as long as it's tasteful.
That said, I agree with the entirety of that post, especially AMD's absolutely piss-poor management of Crossfire. More often than not I'm stuck gaming on a single card lately, because AMD haven't released a relevant CAP for about 6 months.
I will certainly be going single card this time around. Crossfire was dead awesome when AMD actually gave a crap and kept their CAPs updated. The ones they have released lately affect games that don't need it or games that nobody plays. Look at the latest 13.5. Still tweaking CoD4 performance? Really? You mean that game that runs at over 100FPS maxed on every card since the GeForce 8800s came out? Basically zero effort from AMD on their driver situation for quite a while now.
I also agree that SLI is currently much better than Crossfire. Being hardware based means it simply works in most things. Crossfire depends entirely on AMD to release new drivers. The HD7s have been out for quite a while now. I don't think they plan to put any more effort into supporting Crossfire as a serious technology.
It really sucks that you basically need to go Nvidia these days, or you get screwed. It doesn't seem like AMD gives much of a damn about their customers any more.
This message has been edited since posting. Last time this message was edited on 28. July 2013 @ 12:58
|
AfterDawn Addict
4 product reviews
|
28. July 2013 @ 14:29
|
Oh I quite agree, but it's rarely tasteful. Unfortunately you very rarely see it with other colours. I was very fond of blue LEDs when they first came out and even went with a blue themed PC for a while, but it became too synonymous with that sort of cheap tacky system, so I scaled back the lighting and went looking for other colours. I've abandoned blue backlit keyboards, never had a blue mouse, but ultimately there are blue LEDs on all my monitors, my Z-5500 console, all my external hard disk docks bar one, all the hotswap trays on my server, and the HDD LEDs on my PCs. That's plenty :D
SLI is every bit as profile and driver bound as Crossfire, and nvidia don't always get it right - there were a fair few complaints of titles that went 5 months before a driver update. The difference is, with Geforces this happens on a fair few occasions, whereas with Crossfire it's the norm, and it's an unusual case to see otherwise.
There's still much I resent about nvidia and the PhysX situation is the tip of the iceberg, but I've been 'voting with my wallet' for almost 10 years and they're most definitely still the number 2 manufacturer.
I still genuinely feel I'll compromise my system's longevity if I install a Geforce card, as on average they still don't seem to last as long, but if it means a decent running system for 2-3 years, that'll probably do.
The mentality of 'bad software is fixable, bad hardware isn't' is starting to wear a bit thin now...
The HD6900s have always been fairly good performers in crossfire; they're not going anywhere for now. If I went with HD7900s though, different story, and one HD7970GE is still not an upgrade from two HD6970s. Two GTX780s are currently way out of my league cost-wise anyway, and as said, until I push even higher up the resolution scale I'm not in any dire need of more graphics power until I get round to Far Cry 3, Crysis 3 and Metro, which is still a long way off. Too much else to play that'll run well on my existing hardware first!
|
AfterDawn Addict
15 product reviews
|
28. July 2013 @ 15:08
|
Quote: Oh I quite agree, but it's rarely tasteful. Unfortunately you very rarely see it with other colours. I was very fond of blue LEDs when they first came out and even went with a blue themed PC for a while, but it became too synonymous with that sort of cheap tacky system, so I scaled back the lighting and went looking for other colours. I've abandoned blue backlit keyboards, never had a blue mouse, but ultimately there are blue LEDs on all my monitors, my Z-5500 console, all my external hard disk docks bar one, all the hotswap trays on my server, and the HDD LEDs on my PCs. That's plenty :D
Haha but no LED fans or CCFLs in the case? For shame :P
No I agree that flashy lights and gizmos have become the norm for cheaply built junk PCs. Blue is EVERYWHERE. I know quite a few people who fancy themselves "PC Gamers" because they got a cheap case with some lights or a bargain bin PSU with an LED fan. It does begin to grate on one's senses when people choose to illuminate shoddy workmanship and crap components.
I do prefer that my PC have some dedicated lighting however. Call it a diehard habit from my noob days. I also don't mind LED fans if they're decent quality. I like it flashy so as to illuminate my attention to detail and build quality. I take a great amount of pride in my clean builds as I spent many years developing my wire management skills :)
Quote: SLI is every bit as profile and driver bound as Crossfire, and nvidia don't always get it right - there were a fair few complaints of titles that went 5 months before a driver update. The difference is, with Geforces this happens on a fair few occasions, whereas with Crossfire it's the norm, and it's an unusual case to see otherwise.
The HD6900s have always been fairly good performers in crossfire; they're not going anywhere for now. If I went with HD7900s though, different story, and one HD7970GE is still not an upgrade from two HD6970s.
I agree entirely on driver releases. You don't see near the complaints leveled at Nvidia's driver department. And when a game is lacking proper SLI support, Nvidia make it a priority, as they know people will be eagerly awaiting the next driver. AMD haven't released a non-beta driver since 13.4; it's going to be nearly 4 months since they've released a proper driver, and it still probably won't fix anything. In that time, Nvidia have released nearly a dozen driver updates, all addressing issues with the newest games. I haven't seen AMD put that kind of priority on a driver release since 2011... Not to mention the ENTIRE CROSSFIRE USERBASE IS STILL WAITING TO PLAY METRO LAST LIGHT.
Yes, the HD6900s never got shafted as badly as the HD6800s and 7900s. AMD made a major screwup there. My HD5850s and HD4870s were much less problematic as well.
Quote: There's still much I resent about nvidia and the PhysX situation is the tip of the iceberg, but I've been 'voting with my wallet' for almost 10 years and they're most definitely still the number 2 manufacturer.
Oh I definitely agree. I'm sure if we both sat down and compiled a list of known biases, bribes, and outright cheating, it would be several forum pages long. Nvidia has a LOT of things not to like. PhysX is most certainly one of many, many complaints.
Quote: I still genuinely feel I'll compromise my system's longevity if I install a Geforce card, as on average they still don't seem to last as long, but if it means a decent running system for 2-3 years, that'll probably do.
The mentality of 'bad software is fixable, bad hardware isn't' is starting to wear a bit thin now...
Nvidia's longevity issues are indeed still prevalent. I can't help but agree, though, that Nvidia hardware simply seems to work better on average and faces significantly fewer issues with performance, scaling, functionality, etc. Of course, the argument of bad hardware vs bad software still holds weight, but AMD still have yet to fix their bad software...
Quote: I'm still not in any dire need of more graphics power until I get round to Far Cry 3, Crysis 3 and Metro, which is still a long way off, too much else to play that'll run well on my existing hardware first!
Crysis 3 I'll certainly give you. Far Cry 3 is much more reasonable and quite well optimised but still very demanding. Depending on which Metro you're referring to I'm not sure what to say. Metro 2033 would actually be quite doable considering my own performance with the game. 60+ at almost all times, excellent Crossfire scaling. Metro Last Light on the other hand... refer to Crysis 3...
-----------------------------------------------------------------------
On another note, Crysis 2 was a step in the wrong direction but actually managed to be an excellent game otherwise. Crysis 3... they reintroduce Psycho with a whole different personality, a new face model, and a new voice actor. They completely ruined the best character in the game. Also, Nomad, the main character from Crysis 1, gets a retconned death in some obscure, poorly made comic. Don't even get me started on their treatment of the story. The first games set everything up for an amazing sequel, and they basically scrapped everything for generic enemies and the lamest FPS story in years.
How the hell do they manage to put out solid gold like Crysis and Warhead then make the sequels such a steaming pile? They take the story, quality, ambition, meticulous attention to detail, art direction, and even some of the technology completely out of the sequels and throw it down the drain. Crysis 2 and 3 are so much worse quality and differently made than Crysis and Warhead that I'm starting to wonder if they were even made by the same team.
Crysis 1 and Warhead managed near photorealism at times and had gigantic, sprawling environments packed with hand-placed details... and Crysis 2 and 3 are horrible CoD-clone corridor crawls with way too much gloss, uninspired level design and mediocre graphics. Crysis 1 still looks miles better than 3.
This message has been edited since posting. Last time this message was edited on 28. July 2013 @ 15:32
|
AfterDawn Addict
4 product reviews
|
28. July 2013 @ 15:33
|
Nah, the only LED fan I have in use now is a red Antec Tricool as the CPU fan in my gaming PC.
The LED is useful as it means I can see if the motherboard fan controller has activated the fan (once every 100 or so boots it 'forgets', leading to an overheat pretty rapidly; a 200W+ CPU doesn't work with no fan!), but really it's because it's substantial enough to survive attached to the cooler. The Slipstream fans I use won't work on hot surfaces as they're too fragile. I tried a Gentle Typhoon high-speed for the extra grunt, but the bearing noise was horrendous. The S-Flex fans are still the best I've had for balance of quietness, airflow and longevity, but 1500rpm doesn't quite cut it when the room is hot.
The HD4800s were definitively the last 'just works' AMD cards, none of the GSODs the HD5s had, none of the crossfire woes the HD68s and HD7s have (different issues I know), consistent performance, solid reliability etc.
All that said, the HD4870X2 pair was high maintenance: once they got old (2 years+), the coolers that were perfectly adequate at 3000rpm when new were at 6000rpm and no longer really keeping up, even with a good clean. The 60dB+ noise level, and having to remember to crank all four 1900rpm side case fans up to max as well, is not something I miss, nor is the 800W power consumption/heat output.
The HD6970s have been inconspicuous: with the lack of light in my case you can't even see them through the window, and they make minimal noise even in games, having just last month surpassed the age my HD4870X2s were at when they were retired. The performance is all there, slightly more of it in fact, since you're not relying on 4-GPU scaling or a cap of 80% (something the HD6 series did improve on, upping it to the current 95%), but with none of the heat/noise/fuss.
Apart from lack of new game profiles, they work great together. Less than can be said for the HD7s unfortunately.
I can't say the HD7s are bad as singular cards though - the MSI HD7770 I bought for my LAN PC (see http://www.newegg.com/Product/Product.aspx?Item=N82E16814127687 ) was cheap (<£100), is very quiet (<20dB) even when gaming, and has so far worked perfectly since I bought it, even if it isn't used often. The power usage and heat output are tiny for the level of performance you get, which is effectively an HD5850/HD6870's worth, in a proper slot-length card with a single power connector.
Just a shame that multi-GPU technology hasn't advanced at the same pace...if at all...if not regressed.
This message has been edited since posting. Last time this message was edited on 28. July 2013 @ 15:52
|
AfterDawn Addict
15 product reviews
|
29. July 2013 @ 00:38
|
For the record, Bioshock Infinite runs absolutely perfectly maxed with 4xAA. 60FPS locked with Vsync(with some minor tweaking of course). Damn near 100% scaling. I sure wish more attention would be paid to big name titles. There are a great many I could run beautifully if Crossfire simply worked properly. War Thunder, on the ACES engine, is another example. The engine has already been firmly established in several big name flight sim releases. Crossfire works wonderfully in them. War Thunder, however, needs a CAP. It's already capable of multi-GPU as SLI works for it, and enabling AFR makes Crossfire work with some 80% scaling, but with visual artifacts. It's all down to AMD to release a proper driver that supports it. It's a big name competitive simulation MMO with 3 million players. You'd think some attention would be paid to it. Nothing for months and months now. Been playing since March.
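For anyone wondering how the scaling percentages in this thread are derived, they're just the extra frame rate gained, relative to the theoretical gain from the added GPU. A quick sketch (the FPS numbers below are made up for illustration, not benchmarks from this thread):

```python
def scaling_percent(fps_single, fps_multi, n_gpus=2):
    """Multi-GPU scaling efficiency: the extra performance gained,
    expressed as a percentage of the ideal gain from the extra GPUs.
    100% means the second GPU added a full GPU's worth of frame rate."""
    gain = fps_multi / fps_single - 1.0   # e.g. 0.8 means +80% frame rate
    ideal = n_gpus - 1                    # each extra GPU can add at most +100%
    return 100.0 * gain / ideal
```

So a game that goes from 50FPS on one card to 90FPS on two is scaling at 80%, and one that goes from 50FPS to 100FPS is the "damn near 100%" ideal case.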
|
AfterDawn Addict
4 product reviews
|
12. August 2013 @ 14:42
|
Rather gutting news: the next generation Nexus 10 will be manufactured by Asus. I'm pretty attached to my current-gen Nexus 10, it's pretty indispensable, so it looks like I now have to look into getting another current-gen 10 before they go out of production, in case anything happens to my current one. Since I'm still waiting for anyone but Apple or Google to produce a high-res tablet, it looks like in the worst case I'll have to keep two of the new ones on me at all times and ensure everything on them is cloud-based, so that when one fails I can take over on the next one. Such a nuisance; Asus seem to be making almost everything these days.
|
AfterDawn Addict
|
13. August 2013 @ 04:38
|
What's your beef with ASUS? Or are you just speaking from a market diversity point of view?
I think it was the obvious commercial move, and the N10 did a lot worse than its smaller counterpart.
|
AfterDawn Addict
4 product reviews
|
13. August 2013 @ 04:40
|
I think a primary reason for that is the price though. If an Asus-manufactured N10 is cheaper (which it could well be) then that's as good a reason as any.
An Asus product is no good to me personally because I really don't want the time and effort of sending stuff back every 6 months. The 10 has become such an integral device for me at work that I can't afford to be in a position where at any minute I could be left without use of it. Carrying two 10" tablets around for the sake of contingency really isn't very practical.
This message has been edited since posting. Last time this message was edited on 13. August 2013 @ 04:40
|
Senior Member
|
13. August 2013 @ 22:59
|
Originally posted by sammorris: I think a primary reason for that is the price though. If an Asus-manufactured N10 is cheaper (which it could well be) then that's as good a reason as any.
An Asus product is no good to me personally because I really don't want the time and effort of sending stuff back every 6 months. The 10 has become such an integral device for me at work that I can't afford to be in a position where at any minute I could be left without use of it. Carrying two 10" tablets around for the sake of contingency really isn't very practical.
ASUS or not, that could just happen; no one is covered 100% of the time unless you have great redundancy.
I don't think ASUS tablets are bad. I've sold plenty and haven't seen much come back unlike some other tablets.
|
ddp
Moderator
|
13. August 2013 @ 23:48
|
Mr-Movies, what "other tablets"?
|
|
|
|
Senior Member
|
14. August 2013 @ 00:10
|
Some time ago, if anybody remembers, I made friends with a computer repair shop owner in my neighborhood. I often visit my friend Boris, the owner, and hanging around on weekends I sometimes see tons of laptops come in. I have seen so many different brands come in for repair, and people often ask Boris if he'd be interested in buying their used laptops, which he does; he repairs and sells them. Only once did I see an Asus come in. I asked him why, and he said they last longer, are better made, and most of all are easier to repair than all the others: nice layout and roomier to work with.
I'm sure everybody else has their own opinion on which is best, but watching with my own eyes, the story speaks for itself. I've also learned that when a Mac comes in is when I see Boris's face cringe. Again I asked why; he answered that there's hardly any room to work with, and then I saw for myself he was not bull crapping: taking them apart looked like a nightmare.
|
|