Two weaker Crossfire video cards or one better one?

jeanpave
Member
17. August 2010 @ 19:51
Hello there.
I'm trying to build a very nice PC (for the money I have to spend), and I'm in a bit of a conundrum.
Should I get a regular configuration with one video card, the Radeon HD 5850 (which is supposedly better than even the nVidia GTX 465, even though the 465 is listed higher at videocardbenchmark.net), or a Crossfire configuration with two Radeon HD 5770s (which would cost a bit more), or maybe two 5750s (which would cost exactly as much as the 5850)? Money is a bit of a factor, too.
I understand that:
5850 - has core clock of 725 MHz, 1 GB of memory, and 256-bit memory interface
5770 - has core clock of 850 MHz, 1 GB of memory, and 128-bit memory interface
5750 - has core clock of 700 MHz, 1 GB of memory, and 128-bit memory interface, but also 2-DVI, unlike the 5770.
It would be very nice to have 2 GB of video memory, and two GPUs working in tandem, but I'm still not sure whether it's an upgrade over just the one, better video card. It is, right?
I'm also not 100% sure what kind of motherboard I should get for the Crossfire option, if I decide to go with that. (But I know I wouldn't want to pay more than 150 dollars for just the mobo. Oh, and it has to be capable of full HDMI, of course, otherwise what's the point, right?)
How's the Asus AMD785G M4A785TD-V EVO? Is it good for Crossfire?
Oh, and having two GPUs... would that need some extra cooling fans in the computer case, too?
Sorry for asking so many details.
Thank you very, very much.
sammorris
AfterDawn Addict
4 product reviews
18. August 2010 @ 08:56
One better one, every time. The only reason to use Crossfire, in my mind, is to get more performance than the fastest single GPU can offer. It makes a tempting 'I'll do it later' upgrade for people with lower-end cards, but one faster card is universally better, if a little more expensive, since Crossfire often doesn't scale to 100%, or anywhere near it.
An HD5870 is about the same as two HD5770s, but it delivers that performance 100% of the time. An HD5850 is about 70% faster than an HD5770. A GTX460 gains a bit less, depending on which version you buy: more like 35% for the 768MB version, and 45-50% for the 1GB version. Either way, not as fast as the HD5850.
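To put rough numbers on the scaling point, here's a quick sketch in Python. The index values are invented for illustration, loosely based on the percentages above; the three scaling factors are the cases worth thinking about:

```python
# Toy model: a second card adds only a fraction of its own performance,
# depending on how well the game's CrossFire profile scales.

def dual_card_score(single_score: float, scaling: float) -> float:
    """Effective score of two identical cards at a given scaling factor."""
    return single_score * (1.0 + scaling)

HD5770 = 100.0   # one HD5770 normalised to 100
HD5850 = 170.0   # ~70% faster, as above
HD5870 = 205.0   # roughly two HD5770s at perfect scaling

for scaling in (1.0, 0.8, 0.0):   # perfect / typical / no CrossFire support
    score = dual_card_score(HD5770, scaling)
    print(f"2x HD5770 at {scaling:.0%} scaling: {score:.0f} "
          f"(vs HD5850 {HD5850:.0f}, HD5870 {HD5870:.0f})")
```

At 0% scaling (no profile) you're left with a single HD5770, which is the case the HD5850 wins outright.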
I'd advise against buying an Asus board; they're pretty low quality and don't last very long. For AMD I'd say something like a high-end Gigabyte UD board on the 785 chipset.
jeanpave
Member
19. August 2010 @ 01:41
Thank you very much, Sammorris.
I have to say, though, people usually tell me that getting two video cards that can beat a more powerful card is better than just buying that more powerful card.
In my case, I was thinking of getting the 5850, not the 5870, which is more expensive. If the two 5770s can beat the 5870, too, then they would be significantly better than the 5850, no? And I don't plan on opening up the case and maybe adding another 5850 later on, when they become less expensive. I buy this configuration, and then that's that until the next PC.
Do you still think I should go with the 5850?
Somebody just suggested this to me: http://techreport.com/articles.x/19404 and it seems like kind of good advice. May I ask what you think of those comparisons?
And, of course, thank you very much for the advice about Asus mobos.
I only have six options for a Gigabyte motherboard, though:
- Gigabyte AM3 790X Motherboard (GA-790XT-USB3)
- Gigabyte AM3 AMD770 Motherboard (GA-770T-USB3)
- Gigabyte AMD770 (AM3) Motherboards (GA-770TA-UD3)
- Gigabyte AMD790 Motherboard (GA790FXTAUD5)
- Gigabyte AMD790 Motherboard (GA790XTA-UD4)
- Gigabyte Motherboard AM3 GA-MA785GT-UD3H
(I think the fourth one - UD5 - may be too pricey for me, in any case.)
Would you mind telling me which one you would get for CrossFireX? And which one for a single video card, please?
Thank you again, very much!
sammorris
AfterDawn Addict
4 product reviews
19. August 2010 @ 06:47
Originally posted by jeanpave: I have to say, though, people usually tell me that getting two video cards that can beat a more powerful card is better than just buying that more powerful card.
That's never the case. It's cheaper to do it that way, but it's never better.
The problem with using two cards is as I explained before. Two HD5770s are barely better than an HD5870 (we're talking roughly 5% here) when their scaling is absolutely perfect, and this doesn't happen very often. Most games scale at 80% or less, which would put the pair of HD5770s on a par with the HD5850. However, games often don't support Crossfire at all, and in those games two HD5770s are only as fast as one HD5770, and the HD5850 would obliterate them. This happens with a fair few games, and it happens to all titles for a period of 1-3 months after release, before Crossfire support is enabled for the game.
As for The Tech Report benchmark, it's quite weak really, since The Tech Report still don't bother including a fundamental part of graphics testing: minimum frame rate. They still take an average. It's no use having an average of 100fps if a certain part of the level causes it to drop down to 15. This happens sometimes, and it happens far more often with SLI/Crossfire than it does with a single card.
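To show why averages hide this, here's a minimal sketch (Python, with invented frame times): two runs with comparable averages can have wildly different minimums.

```python
# Two runs with similar average FPS can play very differently once you
# look at the minimum. Frame times below are in milliseconds, made up
# purely for illustration.

runs = {
    "steady": [10.0] * 100,              # a constant 100fps
    "spiky":  [8.0] * 95 + [66.0] * 5,   # fast, with brief ~15fps dips
}

for name, frame_times in runs.items():
    avg_fps = 1000.0 * len(frame_times) / sum(frame_times)
    min_fps = 1000.0 / max(frame_times)
    print(f"{name}: average {avg_fps:.0f}fps, minimum {min_fps:.0f}fps")
```

The "spiky" run still averages about 92fps, but it stutters down to 15fps, which is what you actually notice while playing.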
In addition, for whatever reason The Tech Report has posted more favourable numbers for Crossfire than I've seen from other sites. Usually in Aliens vs Predator, for example, Crossfire is pretty much useless.
I put down Crossfire, even though I use it, because it's far from perfect, though it does work well in a few titles. I use it myself because it got me far more power than I could ever get from a single card at the time. When you have the option of buying a single-GPU card instead, I'd take that option every time.
If you still want to go crossfire, I'd probably say go with something like this:
http://www.newegg.com/Product/Product.aspx?Item=N82E16813128435
For a single GPU, I'd probably say the same.
jeanpave
Member
21. August 2010 @ 01:01
Thank you very much, again, Sammorris.
But see, I just found out about this: some mobos, like the one I mentioned, and also the one you suggested, only allow the two cards working in tandem to run at partial bandwidth (i.e. at 16x/0x, 16x/4x, or 8x/8x). (The GIGABYTE GA-890GPA-UD3H AM3 AMD 890GX allows either 16x/0x or 8x/8x.) However, I could buy this motherboard:
MSI 790FX-GD70
This would allegedly permit both CrossFireX cards to be used at 16x. (Right?)
And that would significantly increase the performance of those cards. Am I getting this properly?
Maybe then I'll get the most from the CrossFire configuration, and get something better than just the 5850 for roughly the same price.
What do you think?
Thanks again, very much, for replying. I'm glad I found a guy with whom I can have a conversation about this.
P.S. It's okay if I don't have USB 3.0 support on the mobo...
sammorris
AfterDawn Addict
4 product reviews
21. August 2010 @ 06:49
8x has no effect on the performance of current graphics cards at all. Even at 4x, it's usually a 5-10% impact or less. The only boards that can run 16x/16x on the AMD side are 890FX chipset boards, and they cost a fortune - if that's all you need one for, it isn't worth it.
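For context, a back-of-envelope look at the raw bus numbers, assuming PCIe 2.0's roughly 500 MB/s per lane per direction:

```python
# Approximate PCIe 2.0 bandwidth per direction for common slot widths.
PCIE2_MB_PER_LANE = 500  # ~500 MB/s per lane, per direction

for lanes in (16, 8, 4):
    gbps = lanes * PCIE2_MB_PER_LANE / 1000.0
    print(f"x{lanes}: ~{gbps:.1f} GB/s each way")
```

Even the x4 figure is more than most current cards actually stream across the bus mid-game, which is why the measured difference is so small.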
jeanpave
Member
21. August 2010 @ 18:35
Originally posted by sammorris: 8x has no effect on the performance of current graphics cards at all. Even at 4x, it's usually a 5-10% impact or less. The only boards that can run 16x/16x on the AMD side are 890FX chipset boards, and they cost a fortune - if that's all you need one for, it isn't worth it.
You're right, I have to get this one:
GIGABYTE GA-890FXA-UD5
to get the two video cards running at 16X.
The MSI I mentioned can do it, too, according to its newegg.com page, but the problem with that one is that it doesn't support Phenom II X6 processors. (The GIGABYTE does, though.)
A small fortune is okay. Well, it's all relative, I guess. I can spend up to $1400, but not more, and I can afford that mobo - the 890FXA-UD5.
'Cause, you know, I'm thinking: if a single card is almost always run at 16x, but this motherboard allows two cards to run at 16x too, then the comparisons those websites make, with two cards working together to beat one more powerful card, should be accurate in my case as well.
They probably ran those cards at maximum bandwidth (16x), too.
(And if they didn't - even better.)
P.S. Sorry if it seems that I'm trying to get you to change your point of view. I'm not. Obviously, you tried it and didn't like it. But the thing is, I still think two video cards would work better for me. (You're not offended, hopefully, right?)
sammorris
AfterDawn Addict
4 product reviews
22. August 2010 @ 09:58
I use four GPUs in tandem, let alone two, and have done since 2008. I know all about the system, and it does work. It's just always better to use a single GPU, for the reasons I've stated. It makes no odds to me whether you go with the single card or the lower-end Crossfire setup; I'm just making you aware that you're buying the inferior option - plenty of games will see the pair of 5770s running slower than the 5870, whilst using quite a lot more power.
The 16x/16x bandwidth makes no odds at all. For what it's worth, I use 4870X2 cards, and 4870 GPUs are similar to the 5770s you want to buy (they're slightly faster, but there's not a lot in it). On my i5 system I can only use 8x/8x, and since each X2 card carries two GPUs, that means each GPU is only getting 4x bandwidth. Despite this, the performance loss is undetectable - I don't see any loss in graphics performance from it.
8x vs 16x simply isn't an issue, and any money spent with the sole benefit of getting a higher PCI Express bus speed is money wasted.
jeanpave
Member
24. August 2010 @ 01:37
Hmm, I see.
Well, okay, thank you very much. I will definitely keep your advice in mind. I have lots to consider now.
Could I ask you just one more thing, though, please? Does the 1 GB (from the first card) + 1 GB (from the second card) vs. the 1 GB from the lone card make any kind of difference (in gaming, I guess)?
[In all fairness, though, I'm not just interested in gaming. I'm also interested in VMWare emulation (of OSs) and complex video editing.]
But, anyway, thanks again, Sammorris. I see you definitely know what you're talking about, and I really appreciate the effort to explain stuff to me.
sammorris
AfterDawn Addict
4 product reviews
24. August 2010 @ 06:15
When you use multiple graphics cards, each card must have the memory you would normally use, and a little more to account for the transfers between them. Say a game uses 975MB of video memory and you have a 1GB card. That would be fine on its own, but when you use two 1GB cards in Crossfire, the extra overhead may push you over the 1GB limit, causing a drop in performance.
If you're not interested in gaming, multi-GPU solutions definitely aren't for you. You don't even need a midrange graphics card if you're not gaming.
jeanpave
Member
24. August 2010 @ 17:08
Hi again.
Oh, I see. So, if the game uses more than 1 GB of video memory, then it's better to have two video cards, because they would have more memory, combined. (No need to reply if I got it right.)
I am interested in gaming, too, for sure, yes. And who knows what requirements games will have in two years? (I'm not buying this system to buy another one next year - that's another important consideration for me.)
Well, thank you very much, Sammorris. Everything you said made perfect sense, and was extremely useful to me. Thanks again. (I wish I could repay you somehow.)
sammorris
AfterDawn Addict
4 product reviews
24. August 2010 @ 18:20
No, you got it wrong, sorry. The video memory from cards used in Crossfire does not combine. You need as much memory as you would need on one card on every additional card you use, plus a little more, due to the way Crossfire/SLI work.
e.g. if you need 975MB on one card and you want to use two cards, both of the cards you buy would need to be about 1000MB or bigger. You don't get to use the extra memory, as both cards are essentially doing the same job. Unfortunately, just because they each only render half the frames doesn't mean they need any less memory, as they are still rendering pretty much the entire scene. This is a little complex to explain, so I'll leave it out unless you want to know more :P
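If it helps to see the rule written down, here's a tiny sketch (Python; the 25MB transfer overhead is an assumed figure, just for illustration):

```python
# CrossFire/SLI memory rule of thumb: every card must hold the full
# working set; VRAM does not add up across cards.

def fits_in_vram(game_needs_mb: int, per_card_mb: int, n_cards: int,
                 transfer_overhead_mb: int = 25) -> bool:
    """True if each card can hold the scene (plus an assumed inter-card
    transfer overhead when more than one card is used)."""
    needed = game_needs_mb + (transfer_overhead_mb if n_cards > 1 else 0)
    return needed <= per_card_mb

print(fits_in_vram(975, 1024, n_cards=1))   # True: one 1GB card is fine
print(fits_in_vram(975, 1024, n_cards=2))   # True, but only just
print(fits_in_vram(1215, 1024, n_cards=2))  # False: two 1GB cards still won't do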
jeanpave
Member
27. August 2010 @ 03:01
Oh, crap!
Then could you just explain this, please?
If a game is created that requires, let's say, 1215 MB of video memory, do I have to take the two video cards I'd have, throw them out, and install a newer card with more than 1.22 GB of memory (or else forget about playing that game)?
The 2 GB that I would have from these two cards (1 GB + 1 GB) would not suffice, even though there are two cards, and each has 1 GB?
That's some perverse stuff!...
Thanks, though, for making it clear.
EDIT: Oh, may I ask another, unrelated question? What happens to the second card in a CrossFireX configuration if the first video card dies, or malfunctions, or something of the sort? Does the surviving card still work independently, or something? Is the computer going to continue working properly? Thanks again.
sammorris
AfterDawn Addict
4 product reviews
27. August 2010 @ 05:43
The amount of video memory games require is largely unrelated to how much system memory (RAM) or hard disk space they need. It is impossible to calculate exactly how much video memory a game requires, as it varies with every frame that's rendered, in addition to the detail level, the resolution, and whether AA/AF are being applied, and by how much. It is extremely rare for any game to use in excess of 1GB of video memory at resolutions of 1920x1200 or lower. Typically that only occurs at top-end resolutions like 2560x1600.
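For a rough feel of the resolution side, here's a framebuffer-only estimate (Python; the buffer count and bytes per pixel are simplifying assumptions, and textures/AA, which dominate real usage, are left out entirely):

```python
# Framebuffer-only VRAM estimate: resolution x bytes per pixel x buffers.
# Textures, geometry and AA samples come on top and are scene-dependent,
# which is why total usage can't be calculated exactly.

def framebuffer_mb(width: int, height: int,
                   buffers: int = 3, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel * buffers / 2**20

for w, h in ((1920, 1080), (1920, 1200), (2560, 1600)):
    print(f"{w}x{h}: ~{framebuffer_mb(w, h):.0f} MB before textures")
```

The framebuffers themselves are small; it's the texture and AA load at 2560x1600 that pushes totals past 1GB.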
It doesn't matter how many graphics cards you have: whatever memory you need on one card, you need on all of them. The memory on multiple graphics cards does not stack; since each of them does the same job, each of them needs the full amount of memory.
If one card in a Crossfire X config fails, you will get the same problems you would get if it was your only card and it failed. You are just left with the job of working out which card it is that has failed.
jeanpave
Member
27. August 2010 @ 20:00
Thank you very much.
(So, I assume that, yeah, to that question I asked: I would indeed need a video card with more memory, as the one with 1 GB wouldn't work for a game that requires more than 975-1000 MB.)
But do you think in the future (let's say the next 5 years), the top games are going to switch to more than 1 GB of video memory? Because I see the 5970 card has 4 GB, for example.
What's your estimate on that transition, Sammorris, please?
P.S. I suppose they're going to stick with 1920x1080, or 1920x1200, as the supported resolution for PC games for a good while, due to the HD televisions currently being made and sold to consumers. Right?
sammorris
AfterDawn Addict
4 product reviews
28. August 2010 @ 05:01
The 5970 doesn't have 4GB. The standard 5970 has 2GB and is two HD5870s on one board (downclocked to HD5850 speeds), so it only has 1GB per GPU, same as normal cards.
The 4GB 5970s are special cards with 2GB per GPU, but are very expensive and very unreliable.
Games do slowly increase in how much video memory they use, but so far 1GB is only insufficient for 2560x1600 (30" monitors) and 3 monitor eyefinity/surround setups.
1920x1200 will be around for a while. 1920x1080 is still maturing, as it is relatively new for games to support (the last 3-4 years), but it will be around for many, many years due to the popularity of using HDTVs as monitors.
Xplorer4
Senior Member
4 product reviews
28. August 2010 @ 20:42
Originally posted by jeanpave:
[In all fairness, though, I'm not just interested in gaming. I'm also interested in VMWare emulation (of OSs) and complex video editing.]
This has little impact on your GPU choice.
vmware - If you plan to run a lot of OSes in VMware then you need to invest in more RAM, and I don't mean VRAM. VMware takes up a good amount of memory when in use, even under simple conditions. If you're only going to run one virtual OS then 4 GB may be sufficient. And you don't want to run Windows as your virtual OS, or you're bound to run into driver issues in gaming. I can't imagine why you would need to run many virtual OSes other than for programming, and I'm assuming you're not programming anything visually intensive, so the GPU will make little difference here.
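To put rough numbers on the RAM point (every figure below is an illustrative assumption, not a VMware requirement):

```python
# Rough host RAM budget when running a single VM. Every number here is
# an illustrative assumption.

host_ram_gb  = 4.0   # total installed RAM
host_os_gb   = 1.5   # Windows host plus background apps
guest_ram_gb = 2.0   # allocated to the guest OS
overhead_gb  = 0.5   # hypervisor / virtual device overhead, roughly

remaining = host_ram_gb - host_os_gb - guest_ram_gb - overhead_gb
print(f"~{remaining:.1f} GB left over")  # not much headroom on 4 GB
```

That's why 4 GB is only "may be sufficient" for one VM, and why more guests means more RAM, not more VRAM.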
video editing - Of course you need some GPU power, but it may not be as important as you think. Certain apps can make use of the GPU during encoding, but they usually take advantage of either ATI or Nvidia cards, not both. CUDA support (Nvidia exclusive) seems to be favored by software vendors, though there are apps that will use ATI cards as well. Nonetheless, it gives a slight boost in your encoding speeds, but nothing worth bragging about. Anyway, a single card is able to handle the decoding. My 4890 can handle 720p playback in VLC Media Player just fine, and even when I skip, it picks up from the new position without a hitch. So the GPU is important here, but even a 5670 is capable of playing 1080p. I don't know how well it would do if you skipped positions in the movie, but the basic playback power is there.
So the main things your GPU will be used for are video playback and gaming, and not all programs will use the GPU for decoding. While VLC does, Windows Media Player relies more on CPU power.
jeanpave
Member
29. August 2010 @ 02:24
Originally posted by sammorris: The 5970 doesn't have 4GB. The standard 5970 has 2GB and is two HD5870s on one board (downclocked to HD5850 speeds), so it only has 1GB per GPU, same as normal cards.
The 4GB 5970s are special cards with 2GB per GPU, but are very expensive and very unreliable.
Games do slowly increase in how much video memory they use, but so far 1GB is only insufficient for 2560x1600 (30" monitors) and 3 monitor eyefinity/surround setups.
1920x1200 will be around for a while. 1920x1080 is still maturing, as it is relatively new for games to support (the last 3-4 years), but it will be around for many, many years due to the popularity of using HDTVs as monitors.
Oh, I see now. Thanks for explaining that about the 5970. (I saw that 4-GB GPU pairing as an option on ibuypower.com, and it said "4 GB" next to the card name, so I thought...)
And also thank you very much for the reassurance about the resolution support and HDTVs.
I guess I've kept you busy a lot with this thread, so thank you very much for the advice, the effort, and the patience!
Originally posted by Xplorer4: Originally posted by jeanpave:
[In all fairness, though, I'm not just interested in gaming. I'm also interested in VMWare emulation (of OSs) and complex video editing.]
This has little impact on your GPU choice.
vmware - If you plan to run a lot of OSes in VMware then you need to invest in more RAM, and I don't mean VRAM. VMware takes up a good amount of memory when in use, even under simple conditions. If you're only going to run one virtual OS then 4 GB may be sufficient. And you don't want to run Windows as your virtual OS, or you're bound to run into driver issues in gaming. I can't imagine why you would need to run many virtual OSes other than for programming, and I'm assuming you're not programming anything visually intensive, so the GPU will make little difference here.
video editing - Of course you need some GPU power, but it may not be as important as you think. Certain apps can make use of the GPU during encoding, but they usually take advantage of either ATI or Nvidia cards, not both. CUDA support (Nvidia exclusive) seems to be favored by software vendors, though there are apps that will use ATI cards as well. Nonetheless, it gives a slight boost in your encoding speeds, but nothing worth bragging about. Anyway, a single card is able to handle the decoding. My 4890 can handle 720p playback in VLC Media Player just fine, and even when I skip, it picks up from the new position without a hitch. So the GPU is important here, but even a 5670 is capable of playing 1080p. I don't know how well it would do if you skipped positions in the movie, but the basic playback power is there.
So the main things your GPU will be used for are video playback and gaming, and not all programs will use the GPU for decoding. While VLC does, Windows Media Player relies more on CPU power.
Thank you very much, Xplorer4.
I'm probably going to have to emulate Mac OS X 10.6 in VMware if I can't install it natively. That's the only OS I intend/need to emulate (for my work), and I have to do it using Windows as the base operating system.
But I did get 8 GB of (DDR3) RAM, for that purpose.
What do you think?
I would have liked to invest in CUDA, I must say, but the ATI cards were more affordable in the end. (And I just couldn't do without DirectX 11 support, so the 2xx nVidia cards were out of the question.)
As for video editing, I know many cards are good for that, but I wanted to get the best for the maximum amount I could afford to spend.
Having said that, though, many companies are famous for making bloated applications: Adobe (Premiere) first and foremost, FCP more or less, and even the Nero editing tools sometimes.
So, why get the best processor and a lot of RAM, and skimp on the video card? Right? One of the best video cards can't hurt with the bloated software out there, no?
Thanks again.
sammorris
AfterDawn Addict
4 product reviews
29. August 2010 @ 05:27
Even an HD5450 can play 1080p fine. In your instance, buying a powerful CPU and lots of RAM but a weak video card is the right way to go.
jeanpave
Member
1. September 2010 @ 00:57
Might I not rue the day if, in three years or so, I get a nice video game I really want to play, and the video card isn't good enough for that game?
I don't exactly buy a computer every year. (Not every other year, either...) So I thought that, as long as I'm spending good money, why not get one of the top 20 or 25 of everything? You know? Well, that's my reasoning, flawed though it may be.
sammorris
AfterDawn Addict
4 product reviews
1. September 2010 @ 05:43
Well, do you play games often? If you do, you certainly want a powerful graphics card.
jeanpave
Member
4. September 2010 @ 22:39
Not extremely often, but I get the FIFA game every year, for example.
sammorris
AfterDawn Addict
4 product reviews
5. September 2010 @ 09:59
FIFA probably isn't that demanding, so a relatively basic card like a 5670 will probably do.
Xplorer4
Senior Member
4 product reviews
6. September 2010 @ 00:47
Originally posted by jeanpave: Having said that, though, many companies are famous for making bloated applications: Adobe (Premiere) first and foremost, FCP more or less, and even the Nero editing tools sometimes.
So, why get the best processor and a lot of RAM, and skimp on the video card? Right? One of the best video cards can't hurt with the bloated software out there, no?
Thanks again.
You can buy the most expensive video card on the market, but that bloatware is going to be using RAM, not the video card, so it isn't going to make a difference.
TMPGEnc Authoring Works 4 works better than Nero IMO, without the bloatware.
jeanpave
Member
7. September 2010 @ 20:58
Originally posted by sammorris: FIFA probably isn't that demanding, so a relatively basic card like a 5670 will probably do.
Oh, I'm sure it will do right now, but how about in 5 years?
(On the (old) computer I'm currently using for gaming, I tried to install the latest FIFA. The game installed without complaints, but when I tried to play it... black screen; nothing would happen. And the game exits okay, too, so it wasn't frozen. All these game franchises, once they see people are buying, seem to ratchet up their system requirements exponentially. But enough complaining!)
Yeah, I'm trying to think a little about the future, too.
When do you think FIFA makers might decide a 5850 is obsolete?
Originally posted by Xplorer4: TMPGEnc Authoring Works 4 works better than Nero IMO, without the bloatware.
Thank you very much for the suggestion.
I've been using TMPGEnc Xpress for a couple of years now, and I am very satisfied. (The sizes of the files it outputs are a little big, but other than that...)
I was, in fairness, curious about their new Authoring Works package. I'm probably going to give it a try on the new PC.