|
What graphics card?
|
854x4
Newbie
|
16. February 2010 @ 22:36
|
I got my Devil May Cry 4 game, but they had to ship my HD4670 card from a different location, so I will get that tomorrow. I installed the game just to see what the 6150SE integrated chipset could do, and it runs at 5fps. When I get the card tomorrow I will install it and let everyone know how it runs.
|
AfterDawn Addict
4 product reviews
|
18. February 2010 @ 12:40
|
That sort of performance doesn't surprise me. Devil May Cry 4 isn't a very demanding game by modern standards, which is why the 6150 could even manage 5fps; plenty of newer games would run far worse than that, hence why we recommended the HD4670.
|
854x4
Newbie
|
19. February 2010 @ 21:57
|
Sorry for the late reply, but I got my card and installed it. The fan is so big it blocks off one of my PCIe x1 slots, but I wasn't using it anyway. When I started it up it installed the drivers, and now my RAM reads 3.00GB instead of the 2.75GB it used to, because it had to share with the 6150. I used the disc that came with it to install the drivers and control centre, and then I upgraded the drivers and Catalyst Control Center from the ATI website. I haven't had any problems with my 250W power supply, and I played Kane & Lynch all the way through with no problems. As for Devil May Cry 4, my fps are around 60-80. When I check my temperature on the Overdrive tab of the Catalyst Control Center, I run at about 60 degrees C. Is that good? Thanks for all the help.
|
Erzengel
Junior Member
|
19. February 2010 @ 22:03
|
Originally posted by sammorris: The HD4350 can't play any modern games at all, but it will run using almost no power. The HD5670 uses more power but is a much more powerful card, probably by a factor of 10 at least. It should still work fine on a 250W unit. It's 200W and below that I'd be concerned about, or if you have a very power-hungry CPU like an AMD.
Yeah, I have an HP a6203w with an AMD Athlon 64 X2 4400+ at 2.3GHz. I don't know the exact wattage, but where can I find specs on my computer that would list the wattage draw on the PSU? I am assuming the 250W unit I have is reliable, but according to this site http://www.atxpowersupplies.com/hp-power-supplies.php the HP Pavilion a6203w desktop I own has a 300W power supply?
I'm just trying to find out the exact wattage of the PSU I own, the draw of the processor and the rest of the computer, and whether it's reliable. I trust that the cards listed in this thread, like the HD4670, will run on it.
Thanks.
|
AfterDawn Addict
4 product reviews
|
20. February 2010 @ 02:17
|
854: That sounds fine. 60C is perfectly reasonable for a graphics card; anything up to around 95C is safe.
Erzengel: If in doubt, look inside.
|
Jinkazuya
Member
|
20. February 2010 @ 19:41
|
Sam... I just don't get it. How do you determine what power supply is needed for certain components? How do you know which power supply will work with this or that?
|
AfterDawn Addict
4 product reviews
|
20. February 2010 @ 20:05
|
Quite simple, really:
Brand: Only use good brands, or you risk component damage or even safety hazards.
Wattage: Work out from reviews how much power you'll use, then allow at least 20% breathing room for future upgrades, lower noise and increased reliability (a rough sketch of this calculation follows below).
Power connectors: Work out how many power connectors of each type you need, and make sure the unit you pick has enough. If not, choose a bigger PSU.
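To make the wattage step a bit more concrete, here is a minimal sketch of the sort of arithmetic involved. It isn't an exact science, and every figure below is an illustrative placeholder rather than a measured value; you'd substitute numbers taken from reviews of your own components.

```python
# Rough PSU sizing sketch. All component wattages here are illustrative
# placeholders, not measurements; substitute figures from reviews.

HEADROOM = 0.20  # at least 20% breathing room for upgrades, noise and reliability

def recommended_wattage(component_draw_watts):
    """Sum the estimated draw of each component and add the headroom."""
    total_draw = sum(component_draw_watts.values())
    return total_draw * (1 + HEADROOM)

def psu_suitable(psu_watts, psu_pcie_connectors, component_draw_watts, pcie_needed):
    """Check both the wattage and the PCIe power connector count."""
    enough_power = psu_watts >= recommended_wattage(component_draw_watts)
    enough_connectors = psu_pcie_connectors >= pcie_needed
    return enough_power and enough_connectors

# Hypothetical mid-range system (assumed numbers)
system = {"cpu": 95, "gpu": 110, "board_ram_drives_fans": 60}
print(round(recommended_wattage(system)))           # ~318 -> a quality 400W unit covers it
print(psu_suitable(400, 1, system, pcie_needed=1))  # True
```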
|
Jinkazuya
Member
|
21. February 2010 @ 23:15
|
Well... I know this. But take a video card whose label says something like: if you want SLI or Crossfire, you need a PSU of at least 500 watts (assuming it says something like that). Should you really go for 500 watts? Because according to what you suggested to the person who made this thread, even though a card's label might claim you need 500 watts for Crossfire or SLI, you said something like 300 watts is enough (just ASSUMING that's what you said earlier; I know it isn't exactly what you said).
This is what you said, Sam:
Yes. A 250W unit will handle the HD5670. It will not, however, handle a card that requires a PCIe power connector. The 400W estimate is based on higher-performance components in the rest of the system than you're using.
This is what I'd like to learn: even though the label on a given graphics card claims it needs this or that wattage, you actually don't need that much. How do you know that? I'd like to learn this.
|
AfterDawn Addict
4 product reviews
|
22. February 2010 @ 11:42
|
Unfortunately, most people who are new to PC building perceive everything about it as being an exact science. It isn't.
The problem with power supplies stems from the cheap, useless units in circulation, which I never recommend people buy; if I spot one in somebody's shopping cart, I strongly advise them to change it. Cheap, tacky PSUs are not only prone to breaking, they can also cause fires and damage your PC components.
Many people have terrible 500W units that go pop trying to deliver 200W. Rather than blame the brand, they assume (wrongly) that 500W isn't enough, buy a 750W unit from a better brand, and everything works, which bolsters their assumption that 500W wasn't enough. The wattage had nothing to do with it; they could have dropped to a 400W unit and still been fine. It was buying a proper-brand PSU that did the job.
This also causes the discrepancies between my advice and that of hardware manufacturers. So that graphics card producers don't have to waste tech support time on issues caused by low-grade power supplies, they overstate the power requirements to cover their bases. On top of that, if someone happens to be running a dual-processor mega workstation with loads of hard disks, they may come very close to using the 500W the box states with a normal graphics card, not because the graphics card used that much, but because their other components are power-hungry. This is another example of covering bases.
A third reason is that PSUs come with a limited number of power connectors, as the manufacturers deem appropriate for how much power the unit has. It is rare to see units as low as 450W with three or more PCI Express power connectors, as you would only use that many with multiple high-end graphics cards, which is too much for a 450W unit to supply (in most cases).
While people can easily read the wattage figure on the side of their PSU, it may not occur to them to check how many connectors it has before buying their graphics card. Cards like the HD5850 use two connectors, even though they only have one GPU. While a 450W PSU would be ample to power such a card in a normal system, many 450W power supplies, even good ones like the Corsair VX, only have one connector. Thus, when people buy the card and can't use it for this reason, they moan at the manufacturer. Yet another 'covering bases' example.
Fourthly, there is the matter of absolute usage. My system, for instance, will use around 700W in games due to the four GPUs working, so I use an 850W PSU: it has the four PCIe connectors I need and produces enough power with a little to spare. Yet ATI recommends I use 1kW. However, games, even when they are coded well, do not use every single processing core or every single byte of memory on your card all of the time. That's not inefficient code; there is simply no need for every single pixel to change every time a new frame comes up, which would only ever produce a jumble of colours you could never understand.
If you do force that, however, as part of a 'stress test' (useful for checking the stability of graphics overclocks), the amount of power it uses is astronomical. My 700W rises to around 860-870W, above the rating of my PSU. I can run it because there is usually a little overhead in PSUs (a good 520W unit, for example, may deliver up to 560-570W before shutting down), but if I add a CPU stress test to the mix, it's 900W+. That's more than I'd like to see, so I don't run stress tests. :P From this, though, it follows that if I were to do work that used all of my processing cores all of the time (distributed computing programs like SETI and Folding@Home might do this), I would be far better off with a kilowatt PSU.
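To put rough numbers on that, here is a quick sanity-check sketch. The draw figures are the estimates quoted above, and the roughly 8% delivery margin is an assumption based on the 520W example, not a specification.

```python
# Sketch of the stress-test arithmetic above. The draw figures are estimates
# quoted in the post, and the ~8% margin is an assumption, not a guarantee.

PSU_RATING = 850   # rated output in watts
MARGIN = 0.08      # decent units often deliver a little above their rating before shutting down

def fits(draw_watts, rating=PSU_RATING, margin=MARGIN):
    """Does an estimated draw fit within the rating plus its small delivery margin?"""
    return draw_watts <= rating * (1 + margin)

print(fits(700))   # True  - normal in-game draw, comfortably inside the rating
print(fits(870))   # True  - GPU stress test, only survives thanks to the margin
print(fits(900))   # True, but only just - add a CPU stress test and it's too close for comfort
```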
In a system with normal hardware and a proper-brand PSU, you will therefore be using less than half as much power as most people think. With a quad-core processor and a high-end graphics card like an HD5850, most people would think they need at least a 600W PSU; in reality, such a system will use no more than around 290-300W.
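As a worked example of that claim, using assumed per-component figures (a typical gaming draw for an HD5850 is well below its worst-case rating; the rest is a rough guess):

```python
# Assumed draw figures for the quad-core + HD5850 example above; not measurements.
example = {"quad_core_cpu": 95, "hd5850_gpu": 130, "board_ram_drives_fans": 65}
total = sum(example.values())
print(total)               # 290 - in line with the ~290-300W mentioned above
print(round(total * 1.2))  # 348 - even with 20% headroom, a quality 400-450W unit is plenty
```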
A lot to read, but hopefully this gives some understanding of how PSUs work.
|
Jinkazuya
Member
|
22. February 2010 @ 15:55
|
OK... I think I know what you mean, Sam. But the question is, for example, if you have a 450W unit, how do you know whether that would be enough to supply the components of the PC? Do you calculate the wattages of all the components and add them up to get the result? Say I have an i7 processor, which uses 130W, and my graphics card, for example, uses 200W, and so on; does that mean you add them up, such as 200 + 130 = 330W? If that's the case, you have to check the wattages of all your components before you make your purchase.
Besides that, the manufacturer says, OK, if you want to use my video card, you have to have a PSU that's at least 300W, but in fact 200W is enough. This is what I don't understand: HOW do you determine that 200W is enough? Unless you do a thorough test of the PSU... It really confuses me.
|