The Official PC building thread - 4th Edition
In case you want to ask something like "What components should I pick for my new PC?", start a new topic in our PC building forum.
AfterDawn Addict

4 product reviews
14. May 2012 @ 02:51
Originally posted by theonejrs:
Originally posted by sammorris:
I have seen hardly any evidence of FX-8150s selling at all from the online community. They haven't been quite as much of an economical flop as a technological flop, but they still haven't really made much of an impact.

Llano on the other hand has been selling pretty well, and as a low-end all-in-one system for HTPCs and the likes, it's one of the best options out there.

Ivy Bridge has turned out to be a failure; there is literally no reason to buy one if you already have any i5/i7, let alone a Sandy Bridge CPU. The performance per MHz is roughly the same, around a 5% gain, and the base clock speeds are the same too, so out of the box they're no faster. They use less power, but due to a badly designed heatspreader they run hotter than their Sandy Bridge counterparts, and are therefore less easily overclocked, so all the advantages of the lower power consumption of 22nm silicon are lost. In effect, Ivy Bridge is almost like Intel's Bulldozer.

Sam,

It's funny, the FX chips sell well in the US, yet don't seem to sell well in the UK. There are 6 available models of the Zambezi (not Bulldozer), and 4 of those sell very well in the US. There have been heat issues with the FX-4170 4.2GHz quad core, and the FX-6200 3.8GHz six core generally doesn't overclock very well. As you know, I have a lot of faith in Newegg's reviews, because most times the number of reviews very closely matches the actual number of sales. Sorting those reviews between fact and fiction requires a lot of practice, to weed out purchasers who don't know what they are doing even though they claim to be highly experienced. Newegg also lists the actual number of verified purchases. Here are the Newegg sales numbers for the Zambezi, in sales order.

Zambezi
#1 FX-4100 3.6GHz Quad core 486
#2 FX-8120 3.1GHz Eight core 481
#3 FX-8150 3.6GHz Eight Core 337
#4 FX-6100 3.3GHz Six core 314
#5 FX-4170 4.2GHz Quad core 44
#6 FX-6200 3.8GHz Six core 32
That equals 1694 total sales

Sandy Bridge
#1 i5-2500K 3.3GHz Quad core 2016
#2 i7-2600K 3.4GHz Quad core 1358
#3 i5-2500 3.3GHz Quad core 232
#4 i5-2400 3.1GHz Quad core 204
#5 i7-2600 3.4GHz Quad core 200
#6 i7-2700K 3.5GHz Quad core 146
#7 i5-2300 2.8GHz Quad core 50
#8 i5-2550K 3.4GHz Quad core 34
#9 i5-2405S 2.5GHz Quad core 15
#10 i5-2320 3.0GHz Quad core 13
#11 i5-2400S 2.5GHz Quad core 11
#12 i5-2380P 3.1GHz Quad core 6
#13 i5-2310 2.9GHz Quad core 4
#14 i7-2600S 2.8GHz Quad core 3
#15 i5-2450P 3.2GHz Quad core 2
total 4294

As you can see, after #6 Sandy Bridge pretty much falls off the table in terms of sales numbers. I didn't bother with the dual cores at all, but there are 13 of them, with 9 showing sales of 50 or less: cheap dual cores, most with graphics inferior to Llano. That's 17 CPUs that total 242 sales, or an average of about 14.2 sales per chip.

The bottom line is that AMD's Zambezi sold right at 40% of what the Sandy Bridge quads sold, with far less overhead, and Sandy Bridge has been out much longer than Zambezi, so there has to be some impact felt, especially since the US is the largest market. I've run my figures by friends who work at Newegg, and while they won't give me the actual numbers, they tell me I'm right in the ballpark. Intel needs to drop a number of the 17 chips that aren't selling and make some concessions to the dealers that have them in stock, and they need to do it right away, because every day they don't just costs Intel more money. The stockholders are not going to like it, but with the biggest sellers, the i5-2500K and the i7-2600K, price-dropped to almost no profit, they are going to have to do something, given the failure of Ivy Bridge. Someone at Intel made a bad decision; the damage is done and can't easily be undone. There will be a fire sale to lower inventory. There has to be, otherwise they won't sell!
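Russ's "right at 40%" claim checks out against the tables above; a quick sketch, using the review counts as the sales proxy the post describes:

```python
# Newegg review counts from the tables above, used as a proxy for sales.
zambezi = {"FX-4100": 486, "FX-8120": 481, "FX-8150": 337,
           "FX-6100": 314, "FX-4170": 44, "FX-6200": 32}
sandy_bridge_quads = {"i5-2500K": 2016, "i7-2600K": 1358, "i5-2500": 232,
                      "i5-2400": 204, "i7-2600": 200, "i7-2700K": 146,
                      "i5-2300": 50, "i5-2550K": 34, "i5-2405S": 15,
                      "i5-2320": 13, "i5-2400S": 11, "i5-2380P": 6,
                      "i5-2310": 4, "i7-2600S": 3, "i5-2450P": 2}

amd_total = sum(zambezi.values())               # 1694, as stated
intel_total = sum(sandy_bridge_quads.values())  # 4294, as stated
ratio = amd_total / intel_total                 # ~39.4%, i.e. "right at 40%"
print(f"AMD {amd_total} vs Intel {intel_total}: {ratio:.1%}")
```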

Best Regards,
Russ
I don't really see it as a problem that Intel aren't selling that many of those CPUs in the wild; they undoubtedly won't have made as many. I'd imagine the majority of that sort of inventory is going into stock Dell and HP systems rather than to end-users at Newegg. You are right though, nobody I know has bought anything other than the i3-2100, i5-2500K or i7-2600K.



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
AfterDawn Addict

4 product reviews
19. May 2012 @ 11:26
A fairly full, fragmented 5400rpm drive over USB3 is pulling just under 85MB/s on a transfer. Not bad :)



AfterDawn Addict

15 product reviews
24. May 2012 @ 13:58
Okay, so I have an interesting one... I reformatted a laptop for my dad's friend about a year and a half ago. Now he's all over me because Windows is telling him the copy isn't legitimate and giving him the license-key BS.

Well, this is a perfectly working corporate key for Windows 7 x64 Professional for small businesses. I have paid for the key in full, and I pay for as many copies as I need for my purposes. This includes reformatting customers' machines without making any profit from the OS at all. This technically falls under MS's acceptable use policy, i.e. using the key privately for any machine I choose, so long as I am not charging for it. Microsoft tech support has told me the same: as long as it is used privately, and not being sold or distributed to a separate business, everything is well within my rights. Each key has a maximum of 10 installs, and the key this machine is using is still valid.

So my dilemma comes in, how the hell did Windows Genuine Advantage get flagged to begin with? I am using the same distro of Windows on my main machine with another legit Microsoft corporate key and have never had an issue. I have also used these keys for customer machines previously, and this is the only machine ever to have this issue. So will it be as simple as reformatting the machine with a new key? Or are all machines with that key now borked because of it?

My main concern comes from the fact that until more recent times, I was using less-than-legitimate copies of Windows for my purposes, and all installs done with the less-than-legit distro are all still WGA passed ie genuine. One of the ones I've actually paid for, is now flagged for being illegitimate.

I also called my associate to see if anything happened to my license deal, or if any of the keys were leaked, causing MS to flag any distro with that corporate key. He says no, everything on the business side is kosher. So I'm stuck wondering what could have caused WGA to be flagged. Both the company (Renaissance Learning) and Microsoft are well aware of what my keys are being used for: private maintenance of any machine that comes into my possession. These keys are cleared for my usage, but they're no good to me if they aren't going to be reliable. What does paying for keys actually do for me, if using a cracked distro with no need for any key at all works better and passes WGA more cleanly than a legit copy??



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388

This message has been edited since posting. Last time this message was edited on 24. May 2012 @ 14:15

Senior Member
24. May 2012 @ 14:28
All you need to do is call MS from the flagged machine and they will take care of it. You can also ask how many machines are active on a key, which might help you figure out what is going on. There are too many things that can cause WGA to flag your machine, and it isn't always legitimate; sometimes MS screws up, and they know it, but they err in their own favor.

I've had to deal with this problem too, and I have to say MS has always been good about straightening it out, unless it is through someone like Dell, HP, Sony, and so on; then you have to deal with the reseller, and that is never good. But that isn't your situation, so you should have no problem. Just get the machine from the person and re-activate it through MS again.

You are always better off buying your OS than hacking it.

This message has been edited since posting. Last time this message was edited on 24. May 2012 @ 14:30

Red_Maw
Senior Member
24. May 2012 @ 14:55
Originally posted by Estuansis:
What does paying for keys actually do for me, if using a cracked distro with no need for any key at all works better and passes WGA more cleanly than a legit copy??
You get the pleasure of having to call MS and straighten out problems that should never have occurred :P

I have a similar issue with my laptop, where it occasionally complains that the copy of Windows is not "genuine"; it typically works itself out in under an hour though.

Originally posted by Mr-Movies:
You are always better off buying your OS over hacking them.
To be honest, the only difference I have ever noticed between the legal and illegal copies of Win7 is that with the hacked version I do not need to spend an hour on the phone whenever MS decides my license is no longer valid.


AfterDawn Addict

15 product reviews
24. May 2012 @ 16:45
I can't help agreeing with both sides. This is my dad's friend, so it's not a big deal, but other people have machines using that key, and I hope it doesn't come back to bite me. Those keys have no time limit either, they are simply corporate copies of the stock standard OS, with multiple keys.



harvrdguy
Senior Member
30. May 2012 @ 21:49
Haha - that's too funny.

Jeff tries to go legit and gets busted anyway!!!

The guy who wrote up how I could stop XP telling me at boot that "I must be the victim of software piracy" had run into the same problem with Windows Genuine Advantage himself.

This was about 5 years ago, when I turned off automatic updating on all my machines. I have my cracked XP (actually a beta copy from when they were first testing XP) on at least 10 or 20 machines all over the place, but I also have the full OEM versions of SP1, SP2 and SP3, so I can start with a fresh install and not need to go online.

The long and the short of it is that it worked great, until one day 5 years ago my computer wouldn't boot up and did that 5-4-3-2-1 countdown. That unnerved me, but a Google search quickly turned up the fix.

The guy with the solution had a totally legit copy of XP on several machines, but MS screwed up - AND THE MS TECH SUPPORT COULDN'T FIX IT! (Completely true, Jeff, so don't be too sure they'll always be able to fix it for you.) So the guy got so mad that he went into the registry, found the 4 places Windows Genuine Advantage hides, and saved all the guys like me.

I have a, uh hum... questionable copy of Windows 7 on the gaming machine. But I don't run auto updates, and I don't browse on that machine, so unless the hackers can get to me while I'm in a multiplayer game, I think I'm reasonably safe from new threats. (My thinking is that auto update mainly exists to patch new virus and hacking holes as they emerge.)

Rich
AfterDawn Addict

15 product reviews
31. May 2012 @ 03:43
Letting auto-update run is not the issue. There is a single update which checks for Windows activation circumvention. Simply disable and hide that update; it's right near the bottom of the list :P Windows should still be able to use all the other updates without complaint. Many of the updates are rather essential I'd imagine, security and all :S

BTW I disable this update on legit, activated machines as well. It's unnecessary. Believe me though, I have paid for Windows many times over: personally, Windows 7 twice, both Ultimate and Pro, and 3 copies of XP, both Home and Pro. So I am not shy about using a less-than-legit copy, especially since any major hardware change makes a reformat inevitable. I can't be tacking on a $200 OS every time I upgrade or rebuild my PC ($150 for the multi-install corporate keys). I've even had activation trigger just from removing my motherboard for re-TIMing and plugging everything back in. It senses that the hardware has changed or shifted in some way and prompts a license renewal, as it counts as a "new PC".

AFAIK Ultimate is largely useless fluff for most people, so I prefer Pro on my own machines. Actually saves a bit of HDD space :S




This message has been edited since posting. Last time this message was edited on 31. May 2012 @ 03:52

AfterDawn Addict

4 product reviews
31. May 2012 @ 04:22
I use a legit 7 Pro but often find myself using RemoveWAT to stop WGA shenanigans.
It's a fairly solid tool for the job.



AfterDawn Addict

15 product reviews
31. May 2012 @ 04:39
Sam, give Wargame: European Escalation a look. If you liked Supreme Commander you might like this; I'm very impressed with the depth of the gameplay myself. Only mentioning it because I'm currently playing it. Fairly demanding as well, though not too bad.




This message has been edited since posting. Last time this message was edited on 31. May 2012 @ 04:54

harvrdguy
Senior Member
31. May 2012 @ 05:26
Hmmm, that RemoveWAT tool sounds handy. Well, I don't know - maybe those other updates, the security things, are good. Maybe some of them help game performance - but I doubt it. It certainly sounds like a pain to have it detect a mobo change and demand a new license - and then the inevitable phone call to MS to get it all cleared up. LOL

Hey guys, I am super close to taking the plunge and getting a 7970. Now, I know you said GTX 670, Sam. And others say 7950 if AMD. But here's a thought. I don't have an SLI-certified rig for now, but I do have a CF rig with a full 16 lanes of PCIe for each of two cards, though I know PCIe lanes don't matter that much. BUT - just suppose that I put a GTX 670 or 7970 in, and on some particular title, as unlikely as it may be, I am still somewhat GPU bound - this is of course after overclocking the CPU up to 3.6.

So - as unlikely as it would be - IF I ended up slightly GPU bound, say on BF3, well then another 7970 would solve that, wouldn't it? But that option would not be available if I opted for a GTX 670.

I guess a GTX 670 is $400, same as the 7950, but the 7970 is $480. I'm not too concerned over $80 - I can eat rice for a while and save $80.

Let me ask - for 30" gaming, is there any single title on which a 7970 beats a GTX 670, or do you have to get to Eyefinity with 6 million pixels before it ever wins?

Rich
Senior Member
31. May 2012 @ 10:51
The 7970 is better than the 7950, and if you're not concerned about the $80, go with the better card. The GTX 670 is a very good card too, so if you prefer Nvidia then go that way.

The update you should hide and avoid is KB971033, even if you are using a legit OS. I've had that update pooch good copies of Windows, and that is why MS doesn't automatically install it now; you normally have to review the updates and check the box for that one if you want to install it.

If an update corrupts your machine, MS won't be able to fix it; you would need to restore an image or re-install. MS can only fix activation issues on a machine that isn't corrupted and doesn't have a physical issue - to address the earlier concern.

RemoveWAT is only one way to cheat, and an old one at that; there are better ways to do the same thing.

This message has been edited since posting. Last time this message was edited on 31. May 2012 @ 10:53

harvrdguy
Senior Member
31. May 2012 @ 23:06
Hey gang - Movies - well, I am starting to change my mind about some things. I am looking at CF like a lot of you have been running for a long time, Sam and Jeff for example, and my Q9450 is not SLI-certified, so that obviously points to an AMD solution. Let me see if you guys think I am on the right track.

I haven't done a lot of personal research (I should, lol) but I'm hearing some things. If the GTX 670 or GTX 680 is better than the 7970 for everything except Eyefinity, and that's what I am hearing, then I assume that while it might be better, the difference is probably not more than 10%, right?

For sure it can't be better than a 7950 crossfire setup, which I would guess would be at least 50-60% more powerful for 30" gaming. Do you guys agree?

So, I'm kind of looking at EITHER $800 for a cpu upgrade, or $800 for a gpu upgrade.

But I am already GPU bottlenecked. So my thought is to put the CPU upgrade off until next year and load up with crossfire, if that would allow me to play BF3 at ultra settings at 2560x1600. I would take the GPU upgrade in two steps, starting with one card, working with it for a while, and making sure that I still had a GPU bottleneck on the most stressful titles. (For example, I could go through the single-player of BF3 like I already did with the 8800GTX. I would set it to ultra settings and log GPU load vs CPU load. Hopefully I'll see GPU load at 100% and CPU something significantly less than that - then I would add the second card to eliminate the GPU bottleneck.)
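The load-logging test described here can be sketched roughly as below. The sampling itself is left out (GPU-Z or MSI Afterburner can export usage logs); the `bottleneck` helper and its 95% threshold are illustrative assumptions, not anything from the thread:

```python
# A minimal sketch of the logging idea: record (gpu %, cpu %) load samples
# while playing, then decide which side is holding the framerate back.

def bottleneck(samples, threshold=95.0):
    """samples: list of (gpu_pct, cpu_pct) readings taken during play.
    Returns 'gpu', 'cpu', or 'neither' based on average load."""
    gpu_avg = sum(g for g, _ in samples) / len(samples)
    cpu_avg = sum(c for _, c in samples) / len(samples)
    if gpu_avg >= threshold and gpu_avg > cpu_avg:
        return "gpu"
    if cpu_avg >= threshold and cpu_avg > gpu_avg:
        return "cpu"
    return "neither"

# The hoped-for BF3 result: GPU pegged, CPU with headroom -> add a second card.
print(bottleneck([(99, 60), (100, 65), (98, 58)]))  # -> gpu
```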

After that, the bottleneck would of course shift over to the CPU, but if I can pull 30 fps in BF3 at ultra settings, I'll move forward and strengthen the CPU next year.

What do you guys think? (And for my needs, am I right - 7950 is just about the same?)

Rich
Senior Member
1. June 2012 @ 00:10
You could use your Q9450 with a GTX 690, for which you wouldn't need SLI, and you should have plenty of GPU punch. The card is steep in price, but it would perform better than two lesser cards at about the same combined price. And if you upgrade your MB/CPU later you could keep using this card, or even add another if you wanted to play with SLI.

The Radeon HD 7970 should perform better spec-wise than the GTX 680, but you could be right that the Nvidia performs better, maybe?

At 2560x1600, resolution won't be an issue for any of these GPUs, so you should be fine.

Since you are a gamer I would stick with the higher-end cards and wouldn't go with a 7950; go with the 680, 690 or 7970. I think Nvidia is a little stronger with its CUDA cores versus AMD's stream processors.

Well, those are my thoughts; Sam or Russ may have some good ideas that differ, or that I didn't bring up here.

EVGA 04G-P4-2690-KR GeForce GTX 690 4GB 512-bit GDDR5 PCI Express 3.0 x16 HDCP Ready SLI Support Video Card

XFX Double D FX-797A-TDBC Radeon HD 7970 Black Edition 3GB 384-bit GDDR5 PCI Express 3.0 x16 HDCP Ready CrossFireX Support Video Card

harvrdguy
Senior Member
1. June 2012 @ 02:41
Movies, that is an interesting idea - I was seriously thinking of a gtx690 once I learned that the sli compatibility issue wouldn't come up.

BUT - the very scary idea surfaced that the card might not even boot up on my rig.

I chased a thread on Tom's Hardware and then posted a question - and that's when the issue of compatibility with older Core 2 platforms surfaced. Then I dug through all the Newegg reviews, and one guy with a newer i7 platform mentioned that he wasn't able to get the card to boot initially, until he made some BIOS adjustments.

So not only is the card expensive at about $1000 or more, but IF IT WON'T BOOT on the Core 2 Quad that I own, I'm out a $150 restocking fee. On the other hand, I know the 7970 will run, because I found a guy with a Q9550 posting his 7970 3DMark scores. And so I can start with a smaller $400 chunk and take it in little steps.

I have read some interesting things about the overclocking abilities of the 7950. It's actually a bit cooler-running than the 7970, and pulls less power - about 32 watts less under load when clocked the same as a 7970. It's volted slightly lower, but you can move that up to 7970 specs, and then it really overclocks like crazy. The reviewer said he thought it was better balanced architecturally - whatever he meant by that - but I'm thinking that as a slightly trimmer version of the 7970, it might be better for my 4-megapixel 30" gaming needs, since 6-megapixel Eyefinity is not what I want to do.

(Normally Eyefinity is very widescreen, 5760x1200, but I saw an interesting custom Eyefinity setup on YouTube, with three Samsung 1920x1080 monitors set in portrait mode with the bezels removed and a mere 6mm gap between monitors - that's 3240x1920. It looks good on YouTube; he's running it on 7970 CF playing BF3.)
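For reference, the pixel counts of the resolutions being compared here work out as follows (plain arithmetic, nothing assumed):

```python
# Pixel counts for the setups discussed above.
resolutions = {
    "30-inch (2560x1600)": 2560 * 1600,
    "standard Eyefinity (5760x1200)": 5760 * 1200,
    "portrait Eyefinity (3240x1920)": 3240 * 1920,
}
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.2f} megapixels")
# 2560x1600 is ~4.1 MP; both Eyefinity layouts land in the 6-7 MP range.
```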

Plus my Toughpower 750 PSU would more easily handle two of those in CF, pulling a combined total under load of 64 watts less - closer to 220 watts each instead of 250 watts for the 7970 - and I might overclock them way past what the reviewer did. If the 7970 were getting close to pulling 300 watts on a major overclock, two of them at 600 watts might start to push my 750W PSU, whereas at the same extreme clocks I should see a roughly 60-watt savings from the 7950s.
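Rich's wattage reasoning, laid out explicitly. The per-card figures are his estimates from the post, not measurements, and the 200W rest-of-system budget is a hypothetical assumption added for illustration:

```python
# Estimated load draw per card, from the discussion above (not measured).
hd7970_watts = 250
hd7950_watts = 220
system_overhead = 200   # assumed CPU/board/drive budget, not from the post
psu_capacity = 750

cf_7970 = 2 * hd7970_watts   # 500W for a 7970 pair at stock-ish clocks
cf_7950 = 2 * hd7950_watts   # 440W for a 7950 pair
print(f"7950 CF saves {cf_7970 - cf_7950}W over 7970 CF")
print(f"7970 CF headroom: {psu_capacity - cf_7970 - system_overhead}W")
print(f"7950 CF headroom: {psu_capacity - cf_7950 - system_overhead}W")
```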

And lastly, there is the money savings: it's $80 cheaper, so $160 less for the two of them. (I'm referring to the dual-fan Sapphire at $399, with what looks like a good non-reference cooler.)

Rich
Senior Member
1. June 2012 @ 13:12
I've dug around some and found some of the complaints, but they seem to be user issues more than true compatibility issues. The one issue that could be due to compatibility is related to the PCI Express 3.0 slot, but that is a backward-compatible spec, so it shouldn't be an issue. That doesn't mean it can't be a problem, but it would really be more of a driver problem, which is fixable. Most problems I found were with people OC'ing, which makes sense. Plus I've looked at the specs from EVGA, and the card IS PCI Express 2.0/3.0 compatible, so that won't be a problem for you; your rig is OK for use with it.

Running SLi or Crossfire will not give you a big boost in performance as you run the video cards in master/slave configuration which means the second card will only handle about a quarter of the load regardless of having two 16x slots or not. So you will pay more for less. But you will gain power over just having one card so if you wanted to get one 7970 now and another later when you get a Crossfire rig that would be OK as well. Keep in mind that OC'n two cards is much more difficult too. That BF3 was pretty nice with the two cards but I wonder how well it worked with just one of the 7970's, probably close to the same. My card is older so like you I need to upgrade too and haven't played with the 7970 yet.

As to heat, typically you always have more heat with more power, which is why the 50 runs cooler than the 70 - unless you OC the 50, of course.

Hope that helps some,
Stevo

This message has been edited since posting. Last time this message was edited on 1. June 2012 @ 13:14

AfterDawn Addict

15 product reviews
1. June 2012 @ 13:48
Quote:
Movies, that is an interesting idea - I was seriously thinking of a gtx690 once I learned that the sli compatibility issue wouldn't come up.
Dual-GPU cards still rely on Crossfire and SLI scaling respectively. They are two GPUs on a single PCB, not a single GPU with two cores. The scaling issues are lessened by there being less electrical and mechanical linkage/resistance between the chips, but to say Crossfire/SLI doesn't apply is wrong.

As far as actual SLI compatibility motherboard-wise, most modern boards should be able to use a dual-GPU card problem free, even if only equipped with a single slot.

Quote:
Running SLi or Crossfire will not give you a big boost in performance as you run the video cards in master/slave configuration which means the second card will only handle about a quarter of the load regardless of having two 16x slots or not.
Movies, you're almost a decade behind the times. Dual-GPU configurations haven't used Master/Slave configs for several years; it hasn't been done for 6 or 7 generations of hardware now. The last TRUE Master/Slave cards were the X1800/X1900 series. Nvidia was even quicker off the draw than that, never actually having had Master/Slave SLI. Master/Slave Crossfire/SLI is positively ancient (centuries in computer terms) and an outdated idea.

Both of my video cards are identical. Neither one is a master or a slave. Also, I HAVE NO CLUE where you keep getting these convenient performance numbers you never seem to want to show. I can show you some real performance numbers on a wide variety of games. With a second card my framerates are typically 70-90% higher. Both cards normally hit 80-90% usage per card, and sometimes close to 100%. Far from "a quarter" of the work.

ALSO, slot bandwidth matters very little. Most cards will perform identically at as low as 4x, even the very highest-end models. I also have numbers and PROOF for this.

Quote:
Keep in mind that OC'n two cards is much more difficult too.
Not really. I would know, as I own two video cards and they are overclocked. The only trick is to link the clocks of the two cards. Otherwise, it's exactly the same as OCing anything else: turn the clocks up till they won't test stable, then back them down until they do. The only catch is that you're limited by your lowest-clocking card. Again, linking their clocks eliminates the guesswork. When the benchmarks test stable, both cards are stable. It's not any more difficult AT ALL.

Quote:
That BF3 was pretty nice with the two cards but I wonder how well it worked with just one of the 7970's, probably close to the same.
Battlefield 3 gets almost perfect 100% scaling. The framerate is nearly double with a second card. I have proof, both from my own testing and from THOUSANDS of other people.
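The scaling figures being argued over here can be turned into rough expected framerates; the 70-90% and near-100% fractions are the claims from this post, not benchmark results, and the 40fps baseline is an arbitrary example:

```python
def dual_card_fps(single_fps, scaling):
    """Expected framerate with a second card, where scaling is the
    fraction of a full extra card's worth of performance gained."""
    return single_fps * (1 + scaling)

# Claims from the post above: typical games scale 70-90%, BF3 near 100%.
print(dual_card_fps(40, 0.70))   # typical low end of the claimed range
print(dual_card_fps(40, 1.00))   # BF3-like near-perfect scaling
```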

I only emphasize my points as such because I swear we've visited these exact subjects two or three times now...




This message has been edited since posting. Last time this message was edited on 1. June 2012 @ 14:09

Senior Member
1. June 2012 @ 14:15
Your wrong Jeff! But I'm off on the trade-off; it turns out it is as high as 50% now, so they can share the load equally. There IS a Master/Slave relationship!

Originally posted by Wiki:
Implementation

SLI allows two, three or four graphics processing units (GPUs) to share the workload when rendering a frame. Ideally, two cards using identical GPUs are installed in a motherboard that contains two PCI-Express slots, set up in a master-slave configuration. Both cards are given the same part of the 3D scene to render, but effectively half of the work load is sent to the slave card through a connector called the SLI Bridge. As an example, the master card works on the top half of the scene while the slave card works on the bottom half. When the slave card is done, it sends its output to the master card, which combines the two images to form one and then outputs the final render to the monitor.

AfterDawn Addict

15 product reviews
1. June 2012 @ 14:34
That information is outdated. Crossfire/SLI now uses a tiling method instead of rendering two halves of the screen. The cards don't SHARE a load, they are given two separate loads. The Master/Slave relationship explained there is being used as a demonstration tool. Notice it doesn't mention which hardware generation is being used for its example. YES, one card is going to have a bias simply because you have the monitor plugged in, but that's as far as the bias goes. Properly coded SLI/Crossfire has both cards at or near 100% load. Typically 80-90%. Both cards calculate separately and almost all inter-communication between the two is done by the CPU and the memory sub-system.

Also, wikipedia is not the most reliable for hardware information. Notably their CPU/GPU lists are missing large chunks including entire families of hardware.

I will also mention that Nvidia and AMD have taken two entirely different approaches to dual-card, further separating current tech from the Master/Slave configs of old. SLI is hardware based and Crossfire is software based, thus the need for Crossfire Application Profiles. Whatever the relationship may be, Master/Slave does not exist anymore as it once did. I'm making the distinction because Master/Slave was an entirely different way of making the hardware connection. It would imply that one card has special circuitry allowing it to control the link between the two, making it effectively a "Master" card. But that is not the case anymore. Both cards are identical.

I'm not trying to argue; I'm trying to set things straight. You might have been misled, or misinterpreted something, or simply understood it differently than I did. I'm all for a bit of rousing hardware debate as long as we can keep it sensible. In this case, the facts are ever so slightly different from what you've read (or seem to be explaining, to my biased ears).



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388

This message has been edited since posting. Last time this message was edited on 1. June 2012 @ 15:20

AfterDawn Addict

7 product reviews
_
1. June 2012 @ 14:36 _ Link to this message    Send private message to this user   
"You're" :p

That was a correction to your post, not a stab.



To delete, or not to delete. THAT is the question!

This message has been edited since posting. Last time this message was edited on 1. June 2012 @ 14:36

Senior Member
_
1. June 2012 @ 15:22 _ Link to this message    Send private message to this user   
It isn't being used as a demo; that was the actual usage a few years back. Originally the split was 25%; in 2004 the spec changed to 50%, but even in 2007 systems were still using 25%, and I built thousands of them back then. It recently changed to independent control, as you say, which makes sense now that CPUs are tasked harder - which is what Sam has complained about with his game bottlenecks. I haven't played much with dual/quad card configurations since it was such a poor way to go, but now that they have taken a better approach I could see the benefit, as long as you have a powerful CPU to handle the extra burden.

It still would be better getting the fastest card you can for gaming and then down the road when they come down in price getting a second of the same card to improve your performance even more. That would be the path I would go...

Thanks Jeff,
Stevo
AfterDawn Addict

15 product reviews
_
1. June 2012 @ 15:41 _ Link to this message    Send private message to this user   
Quote:
It still would be better getting the fastest card you can for gaming and then down the road when they come down in price getting a second of the same card to improve your performance even more. That would be the path I would go...
I couldn't agree more. Unless you actually need two cards right away to make the upgrade worthwhile (as was the case with myself), one higher-performance card is far and away the superior option. And, as you said, price drops make Crossfire a more viable option later :P



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
harvrdguy
Senior Member
_
1. June 2012 @ 16:30 _ Link to this message    Send private message to this user   
Hey Stevo and Jeff and Kevin,

Wow, that turned into a nice little discussion! LOL

Stevo, thanks for the information. On the specifics of Crossfire, I think we have no choice but to sit back and defer to Jeff's expertise. Along with Sam, he's the man on Crossfire. He's accumulated a great deal of personal experience over the last several years, and he knows his hardware - especially if it's hardware that he's running and working with every day. (Not only hardware - the way Jeff digs in and discovers the patches and mods to tweak a particular game to max eye-candy perfection is inspiring.)

Jeff, I keep thinking of you, every time I think of crossfire 7950 instead of 7970. You did the same thing - which family was it, 6000, 5000, 4000? It's been a while, but then I think I recall that you more recently upgraded, is that right?

You have had spectacular results on 2.3 megapixel gaming, running everything in full settings, while I have watched on the sidelines for three years, with my 4 megapixel requirement.

But I wouldn't trade my big 30" Dell for anything - it's the best thing Sam ever suckered me into. :)

But at 4 megapixels, I have had to forget about running anything with max settings unless I was willing, financially, to remain on the bleeding edge of technology a step or two behind Sam. I watched his frustration with quad CF on the 4000 family, and I thought "Go Sam go!" He went through a lot running those two 4870X2 cards - on the verge of quitting just before he figured it out. Whew! Better him than me! That was painful to watch. At the time he said it would be three more generations before we could play Crysis at Very High, and he was right.

Thank God Asus hardware can be quirky at times, and that the P5E acted up and reset the SATA drive spec in the BIOS, lol, so at the beginning of 2010, after only two weeks of use, it wouldn't boot up for Miles the modeler/animator. They shipped him a 1366 i7 instead. The Sonata sat in his garage for 9 months until he said to me one day, "Hey, I've got this computer that doesn't work - you want to see if you can get it running? If so, you can use it for a while." It's been almost two years.

That Q9450 and 8800GTX churned out 13,500 3DMark06 points, way above the 6,000 points I was getting from my P4 and 3850, and moved me into some serious gaming - but not serious like you. So here I am, ready to move past the GPU bottleneck. I want to join you guys out there on the BF3 battleground, but you have raved about ultra textures, so that's what I want too, lol.

I never thought I would go for anything but the Cadillac of a particular family, meaning the 7970, but more and more I am thinking that the 7950 is a slightly trimmer card, better suited to my "mid-range" 4 megapixel 30" needs, halfway between 2.3 megapixels on 24" gaming and 6-7 megapixels for Eyefinity. At identical clocks it pulls 32 watts less under load, 220 vs 252, about 12% less, which matches the 12% fewer stream processors.

I have the sneaking suspicion that, as a cooler chip without "the baggage" of those extra processors, it might actually overclock better than the 7970. The review suggested as much.

At identical clocks, it seems to perform only about 2-3% worse in fps on demanding titles, and I could probably make up for that with a slightly higher clock. I could equal the 7970's performance while still using less wattage. What I am saying is that the extra processors on the 7970 are not really needed for my mid-range 4 megapixel load, and only get in the way by adding wattage and heat.
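The arithmetic behind this theory is easy to check. Here's a quick sanity check in Python; the stream-processor counts (2048 vs 1792) and the 800 MHz reference 7950 clock are the commonly quoted figures for these cards, stated here as assumptions rather than taken from the post:

```python
# Sanity-checking the 7970 vs 7950 numbers from the post.
# Power figures (W under load) come from the post itself; shader
# counts and the 800 MHz base clock are assumed reference specs.

power_7970, power_7950 = 252, 220          # watts under load
sp_7970, sp_7950 = 2048, 1792              # stream processors

power_saving = (power_7970 - power_7950) / power_7970
sp_deficit = (sp_7970 - sp_7950) / sp_7970

print(f"Power saving:   {power_saving:.1%}")   # ~12.7%
print(f"Shader deficit: {sp_deficit:.1%}")     # 12.5%

# If the 7950 trails by ~2-3% fps at identical clocks, and fps scales
# roughly linearly with core clock in GPU-bound scenarios, closing the
# gap needs only a small overclock:
base_clock = 800                               # MHz, assumed 7950 reference
for gap in (0.02, 0.03):
    print(f"{gap:.0%} gap -> ~{base_clock * (1 + gap):.0f} MHz")
```

So a bump from 800 MHz to roughly 816-824 MHz would, under the linear-scaling assumption, erase the fps deficit while keeping the card's power advantage.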

How's that for a radical theory!! Hahahaha

Rich
AfterDawn Addict

4 product reviews
_
1. June 2012 @ 17:21 _ Link to this message    Send private message to this user   
Much like some other 'debates' we've had, I suspect the experience Stevo is drawing on is a personal one, rather than a factual one. We know from experience how crossfire works because we use it. Stevo is going by his beliefs, which as we know, supersede proven evidence.

I'm not really getting into the debate; it's been a long and difficult month of work, so I haven't been around to post often. It does amuse me, though, to see the heated arguments flare up even in my absence, which goes some way to dispelling the commonly held belief that I'm the catalyst for all the negative conversations around here.

To add some worth to this post in a succinct manner, here are some facts:

1. The GTX690, as a dual-GPU card, does not require an SLI-licensed chipset/motherboard to operate. It is, however, being a modern GeForce, liable to conflict with older LGA775-based boards - a problem Radeons don't have.
2. The GTX690 is overpriced. At $1100 where you can find one (1-month+ waiting list at the moment), it's fully twice the cost of the GTX680, which is itself massively overpriced compared to the near-identical GTX670, a comparative bargain.
3. Crossfire scaling, in all properly supported titles, is 85-95%. That's a given nowadays; if you get less, it's a bug, and it may or may not be fixed, depending on the title. SLI scaling falls into the same region, but toward the lower end rather than the higher - CF typically takes a 5-10% lead in scaling. However, SLI is more reliable (fewer games bug out with low/no scaling) and has a shorter lead time (typically nil to 3 weeks, versus 2 to 24 weeks with Crossfire).
4. Power consumption and performance per watt etc. are out the window this gen - both cards are basically on an even footing here, so that argument is, to the delight of nvidia fans throughout the world, history.
5. This statement still holds true: It is better to buy one single-GPU card that is the performance-equal or performance-approximate of an existing dual-GPU configuration, even if it costs more. Single GPUs are simply better, when they can provide enough power.
6. The new generation of hardware improves things, but neither it nor the next few generations will allow every modern title to be maxed out at 2560x1600.
7. PCI Express bandwidth remains a non-issue for any modern graphics card (excluding dual-GPU cards) all the way down to 4x inclusive. Do not try to operate a dual-GPU card in a 4x slot, or any card in a slot operating at 1x, but anything else goes. There were no tangible effects proven when running two HD5970s or HD6990s in a dual-8x system - providing 4x per GPU - nor were there when testing four HD5870s in 4x slots each.
8. PCI Express 2 & 3 backwards compatibility will not cause any problems other than the potential stumbling block with nvidia cards and old chipsets already stated.
9. RemoveWAT may be old, and perhaps there are better methods - but it's reliable, requires no install or registry edits, and works with one simple click. It's as easy as it gets.
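The scaling figures in point 3 translate directly into framerates: with N% scaling, the second card contributes N% of a single card's performance, not a full doubling. A quick illustration (the single-card figure of 40 fps is made up):

```python
# Effective dual-card framerate at a given scaling factor.
# "Scaling" here means the fraction of one card's performance that
# the second card actually adds (per point 3: 85-95% for Crossfire).

def dual_card_fps(single_fps, scaling):
    """Two cards deliver one card's output plus `scaling` of another's."""
    return single_fps * (1 + scaling)

single_fps = 40  # hypothetical single-card result
for scaling in (0.85, 0.90, 0.95):
    print(f"{scaling:.0%} scaling: "
          f"{dual_card_fps(single_fps, scaling):.0f} fps")
```

So a 40 fps single-card setup lands around 74-78 fps with two cards, rather than the 80 fps a naive doubling would suggest.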



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
AfterDawn Addict

7 product reviews
_
1. June 2012 @ 17:32 _ Link to this message    Send private message to this user   
Originally posted by harvrdguy:

But I wouldn't trade my big 30" Dell for anything - it's the best thing Sam ever suckered me into. :)
Shouldn't take much to sucker anybody into a wonderful monitor like the Dell IPS monitors! :) I love my 24" Dell, but I sure do want the 30" now. I honestly wouldn't mind a monitor with even more resolution, and larger :P Of course, if it were available now(probably), it would be crazy expensive ;)



To delete, or not to delete. THAT is the question!
 