NVIDIA GeForce FX 5700LE - Good or Bad?
Member
2. July 2006 @ 10:13
I own an NVIDIA GeForce FX 5700LE graphics card. Note the 'LE' in the name (I guess it means Limited Edition).

Now, I have read on some websites that the 'LE' model is not so good. Is that really true? I spent Rs. 6000 (Indian Rupees), which is nearly 130.50 USD, on it. Some days ago, I exchanged my card with my friend for his ATI Radeon 9200 (which costs Rs. 2500, i.e. 54.35 USD), and believe me, it works better than my 5700LE!! Here is a short overview of performance in Quake 4 on the SAME machine, with only the graphics cards differing.


NVIDIA GEFORCE FX 5700LE WITH LATEST DRIVERS
Q4 Settings: Low Quality
Resolution: 640x480
Advanced Settings: AntiAliasing OFF, VSync OFF, all others ON
Resultant Speed: Very Fast, playable.

Q4 Settings: Low Quality
Resolution: 800x600
Advanced Settings: AntiAliasing OFF, VSync OFF, all others ON
Resultant Speed: Moderately fast, and occasionally VERY slow, especially when looking at far off areas, or when a door opens, etc.

Q4 Settings: Medium Quality
Resolution: 640x480
Advanced Settings: AntiAliasing OFF, VSync OFF, all others ON
Resultant Speed: Quite slow, and very slow when looking at far off areas, or when a door opens.

Q4 Settings: Medium Quality
Resolution: 800x600
Advanced Settings: AntiAliasing OFF, VSync OFF, all others OFF
Resultant Speed: Terribly slow, considerable drop in framerate, not playable at all. (this was so obvious ;)

ATI RADEON 9200 WITH THE LAST DRIVER THAT SUPPORTED THIS PARTICULAR MODEL
Q4 Settings: Low Quality
Resolution: 640x480
Advanced Settings: AntiAliasing OFF, VSync OFF, all others ON
Resultant Speed: Very Fast, playable.

Q4 Settings: Low Quality
Resolution: 800x600
Advanced Settings: AntiAliasing OFF, VSync OFF, all others ON
Resultant Speed: Very Fast, playable.

Q4 Settings: Medium Quality
Resolution: 640x480
Advanced Settings: AntiAliasing OFF, VSync OFF, all others ON
Resultant Speed: Very Fast, playable. <- !!!

Q4 Settings: Medium Quality
Resolution: 800x600
Advanced Settings: AntiAliasing OFF, VSync OFF, all others OFF
Resultant Speed: Quite slow, and slower when looking at far off areas.


As you can see, the ATI Radeon 9200 is giving better performance than the NVIDIA GeForce FX 5700LE. For those of you who don't know, the ATI Radeon 9200 is more or less equivalent to the NVIDIA GeForce FX 5200.

Also, using this card (GeForce), my system sometimes restarts on its own, but this is not a problem with the ATI card.

Please give some suggestions. If this card is really bad, I must seriously think about exchanging it for a new one. How's the GeForce 6600?
PeterG969
Member
2. July 2006 @ 12:36
Are both graphics cards shared or dedicated?

I have an ATI Radeon 9000 and it can run Doom 3 pretty well.

If you are looking for a new graphics card, I suggest you wait just another few weeks/months until NVIDIA introduces their 7900 512MB.
Member
3. July 2006 @ 08:14
Quote:
Are both graphics cards shared or dedicated?

What do 'shared' and 'dedicated' mean?
Quote:
I have an ATI Radeon 9000 and it can run Doom 3 pretty well.
Well, if you look on the internet, some users (including me, with the ATI card) are experiencing problems with Quake 4 textures appearing green. However, they don't have this problem with Doom 3. That goes to prove that, although both these games use the same engine, there are some major differences in how the games are programmed. Also, the system requirements of Quake 4 are slightly higher than Doom 3's. Maybe that's why your Radeon 9000 runs Doom 3 well.
Quote:
If you are looking for a new graphics card, I suggest you wait just another few weeks/months until NVIDIA introduces their 7900 512MB.
Well, whether I'll get it or not depends on the cost of the card. But as for the 512MB, there is -no- 512MB card available in India. The maximum available is 256MB.

For now, I just want to know what kind of performance boost I can expect with the 6600 card compared to the 5700LE.

Also, what are the latest Pixel Shader and Vertex Shader versions available in cards?

This message has been edited since posting. Last time this message was edited on 3. July 2006 @ 08:28

Member
3. July 2006 @ 13:21
I believe that shared memory means the card takes memory from your system RAM, and dedicated means it has its own memory built in. Not 100% positive though.
wazid360
Suspended due to non-functional email address
3. July 2006 @ 18:38
you're right


Member
3. July 2006 @ 21:34
Well, my graphics card has 128MB of dedicated memory. I'd never buy a graphics card that uses shared memory ;8)
Michael73
Newbie
6. July 2006 @ 09:41
The 5700LE is a good card, but it's already outdated. The clock speeds are very low and there are only 2 pipes. To get the most out of this card, clock it to 420/500. It can be done easily with RivaTuner, and I believe NVIDIA has a great clocking utility to work with it. I have had it for over 2 years. Still good. The 6600GT is a much better card out of the box. My friend has it and it rocks.
Member
6. July 2006 @ 21:21
Quote:
The 5700LE is a good card, but it's already outdated. The clock speeds are very low and there are only 2 pipes. To get the most out of this card, clock it to 420/500. It can be done easily with RivaTuner, and I believe NVIDIA has a great clocking utility to work with it. I have had it for over 2 years. Still good. The 6600GT is a much better card out of the box. My friend has it and it rocks.

I have overclocked this card to 340/525 with RivaTuner. If I overclock it any more than that, either the 'Test' fails or I get horrible artifacts (or whatever they are called) in Hitman: Blood Money.
BTW: how do you overclock the core clock to above 375? RivaTuner doesn't allow clock speeds beyond 375, as the slider ends at 375.
Also, I wanted to know: when you click the 'Test' button, what exactly happens? And how does it know whether your card can run those speeds? Because sometimes on the same card, when I set a particular core and memory speed, the test fails, and other times, at the very same speeds, it succeeds.

Finally, what are the latest Pixel Shader and Vertex Shader versions available in cards? I really need to know this.

This message has been edited since posting. Last time this message was edited on 6. July 2006 @ 21:23

Michael73
Newbie
7. July 2006 @ 06:12
Install an NVIDIA ForceWare driver newer than 81.85, then install the Coolbits registry hack and it will unlock hidden overclocking features in the NVIDIA display panel. If you go into the card's properties under this utility, it will give you clock frequency settings for memory and core. It's an excellent tool; you can test on the spot. Also, I use a total of 3 fans on my card. The cooler you keep it, the higher you can clock.
http://downloads.guru3d.com/download.php?det=815
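
For anyone curious what the Coolbits hack actually changes: it amounts to writing a single DWORD value into the registry. The minimal sketch below (Python, using the standard winreg module) assumes the key path and the value of 3 that were commonly circulated for ForceWare drivers of that era - verify them for your own driver version, back up the registry, and run it as Administrator on Windows.

import winreg

# Registry location commonly reported for the Coolbits tweak on ForceWare
# drivers of this era (an assumption - check it against your driver version).
KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    # A value of 3 is the setting usually quoted for exposing the core and
    # memory clock controls in the NVIDIA display properties.
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)

print("CoolBits written; reopen the NVIDIA display properties to see the clock controls.")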
Member
9. July 2006 @ 04:14
Quote:
I use a total of 3 fans on my card. The cooler you keep it, the higher you can clock.
How did you do that? Are they really small ones, so that 3 of them can fit, or is your heatsink that big? I assume that heatsink is a custom one that you have installed, not the factory-shipped one. Are custom heatsinks available for all cards? (I am in particular talking about the GeForce 6600xx and the 6800xx - where xx is either blank, or letters like GE, GV, GX, etc.)

The following is quoted from my previous post:
Quote:
Also, I wanted to know: when you click the 'Test' button, what exactly happens? And how does it know whether your card can run those speeds? Because sometimes on the same card, when I set a particular core and memory speed, the test fails, and other times, at the very same speeds, it succeeds.

Finally, what are the latest Pixel Shader and Vertex Shader versions available in cards? I really need to know this.

Someone please answer.

All you guys, thanks a million for the help.
Michael73
Newbie
9. July 2006 @ 07:38
When you select 'Test', the program starts to benchmark the clock cycles of the chip. This is not an exact test; it is engineered to settle on the most stable clock speeds. If you force the test to a specific number, it may succeed or it may hang up the card. You need to have a feel for your card (increase the clocks by 10 per test). Keep it cool and you have a better chance of stability. The hotter the card gets, the more erratically it will perform. I use the stock heatsink with the OEM fan and added 2 other small aftermarket fans over the transistors on both sides of the card. They get burning hot - so much so that I've heard stories of them desoldering themselves and just falling off the card. I've overclocked and played KOTOR 2, Battlefield 2, Call of Cthulhu, and Far Cry all successfully on the 5700LE. The good thing about the card is that it has great overclocking potential. If you don't clock it, then it's just another card. The 6600GT is already maxed out on its clock speeds right out of the box. If you're really serious about keeping your card cool, they make dedicated cooling kits for video cards - ARCTIC COOLING AVC-NV5R3 NV Silencer Rev. 3.

Peace out.
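
Michael73's "increase by 10 per test" routine boils down to a simple loop: raise the clock one step, test, and fall back to the last stable value when the test fails. The sketch below is purely illustrative - set_core_clock() and passes_stability_test() are hypothetical stubs standing in for RivaTuner's slider and its 'Test' button, and the 380 MHz cutoff is an arbitrary example, not a measurement.

def set_core_clock(mhz: int) -> None:
    """Stub: in practice the clock was applied by hand in RivaTuner's GUI."""
    print(f"applying core clock: {mhz} MHz")

def passes_stability_test(mhz: int) -> bool:
    """Stub: stands in for a short benchmark run while watching for artifacts.
    The 380 MHz cutoff is an arbitrary example value."""
    return mhz <= 380

def find_stable_core_clock(start_mhz: int, max_mhz: int, step: int = 10) -> int:
    """Raise the core clock in small steps and keep the last stable value."""
    stable = start_mhz
    clock = start_mhz
    while clock + step <= max_mhz:
        clock += step
        set_core_clock(clock)
        if not passes_stability_test(clock):
            set_core_clock(stable)  # fall back to the last known-good clock
            break
        stable = clock
    return stable

print("highest stable core clock:", find_stable_core_clock(300, 450))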
Member
11. July 2006 @ 01:11
Michael73,
Thanks for the info. But you have not mentioned anything about the 6800xx series of cards. Have they reached their overclocking potential too, or can they be overclocked decently?

Also, about the cooling kit, have YOU used this kit on the 5700LE? I'm asking because my heatsink has only 2 holes for the screws (clips, actually... SIGH!), and if you take a look at the pictures on the site below, that heatsink has 4 screws at the bottom.

http://cgi.ebay.com/FREE-SHP-AVC-NV5R3-ARCTIC-COOLING-NV-Silencer...

Just wanted to make sure, thanks!

EDIT: Could somebody arrange the abbreviations GS, GT, GV and Ultra in ascending order of performance? (for NVIDIA graphics cards)

This message has been edited since posting. Last time this message was edited on 11. July 2006 @ 01:47

Michael73
Newbie
11. July 2006 @ 11:53
I rigged the cooling on the 5700LE myself. I don't believe they make a specific cooling kit for this card. I am still using the card. If I were to use the 6800 and decided to overclock it, I would buy the cooling kit; it's a good price for the product. I know that the GT on the cards designates overclocked from the factory. I tested several of them and they are pretty much dead on, with no real significant room to increase clock speeds (like the 6600GT). I am sure one of the other designations is underclocked like the LE was. You will have to do some research to find out. Bottom line: the best bang for your buck is the 6600GT. Plug it in, don't worry about overclocking or cooling, and start playing some cool games. I saw that card handle Oblivion, defaulted to medium settings, on my friend's comp (AMD X2 4200) and it looked good. The 5700LE, even overclocked, worked but could not handle it (my opinion: not playable).

Peace out.
mark5hs
Suspended due to non-functional email address
11. July 2006 @ 12:50
The 5700 was decent in its time (3 years ago) but now it's completely outdated. I understand that technology is expensive in India, but $130 for a 5700 is ridiculous. Here in the US the best card for that price would be the 7600GS (GT if you pay a bit more). The 5700 is from the 5 series; there is a 6 series that is a bit old, the 7 series is the current one from NVIDIA, and there's an 8 series coming out soon. I'm guessing you'd have to pay a ton to have one shipped to India from the US, so I'm not sure what to say. Do you have any relatives in the US, Canada, or UK?

X2 3800, A8n deluxe motherboard, 7800GT, Xion 2 case
2GB Patriot DDR 400, Audigy 2 ZS gamer, 400GB SATA RAID 0
Games I'm looking forward to: BioShock, SSBB, Enemy Territory: Quake Wars, Halo 3
Currently playing: Command & Conquer 3, Supreme Commander, Counter-Strike, Shivering Isles
akuma96
Junior Member
11. July 2006 @ 13:07
I have a 5700 and I got it when it came out for $350. Now my gf has a 6200 and it's good, but for some reason WoW plays better on mine... my PC is slower than hers? I have a 1.80 and she has a 2.80, go figure... but I want SLI now, and that means more $$$ - mobo, CPU and the SLI cards :(
Member
11. July 2006 @ 21:20
Michael73,
Which one of these is the best - LE, GE, GT, GV, GS, Ultra?
Quote:
I understand that technology is expensive in India, but $130 for a 5700 is ridiculous.

mark5hs,
That was the cost when I purchased it in October 2004. The current cost is a ridiculous 4000 Rupees ($86.95).
Quote:
Do you have any relatives in the US, Canada, or UK?

And I don't mind the cost, but I want to buy it in India. Although I do have relatives in the US, if something were to go wrong with the card within its warranty period, I would have to send it back to the US, which would cost me huge shipping charges and lots of time (I'm guessing 3-4 weeks).
Quote:
I have a 5700 and I got it when it came out for $350. Now my gf has a 6200 and it's good, but for some reason WoW plays better on mine... my PC is slower than hers? I have a 1.80 and she has a 2.80, go figure... but I want SLI now, and that means more $$$ - mobo, CPU and the SLI cards :(
akuma96,
If you see my first post in this topic, I was facing a similar situation with the NVIDIA 5700LE and the ATI Radeon 9200. I haven't figured out the reason; tell me if and when you do.

This message has been edited since posting. Last time this message was edited on 11. July 2006 @ 21:22
