C&C Generals Playing Errors
Shoey
Suspended due to non-functional email address
24. August 2003 @ 05:03
Quote: Any good?
A) Yes, although the price seems high. I bought my video card about 4 months ago and paid 90 buckaroos (total). "Google" up Jayton or eVga.
Shoey :)
Applecorp
Member
24. August 2003 @ 05:32
Well I'm in the UK so prices are bound to be higher!
Praetor
Moderator
24. August 2003 @ 05:42
I'm in Canada and the price seems about $40 CAD too high, but well within a reasonable range. Then again, if I think about it a little more, Abit is a decent manufacturer and it may be worth the extra bit of moolah.
Praetor
Moderator
24. August 2003 @ 07:10
Ah.... finally... manual found... now to clarify some stuff:
FROM THE MSI MANUALS
GF4Ti4600
- Features the nVidia nfiniteFX II engine
- Dual Programmable Vertex Shaders <-- This is indicative of DX8/DX9 cards... DX7 cards had no programmable shaders at all
- Lightspeed Memory Architecture II
- Accuview Antialiasing
- 4 dual-rendering pipelines
- 8 texel/cycle
- Dual cube environment mapping
- 10.4GB/s memory bandwidth
- 136M vertices/sec
- 4.8G AA samples/sec fill rate
- 1.23T operations/sec
- DX8.1 Card
GF4Ti4400
- Features the nVidia nfiniteFX II engine
- Dual Programmable Vertex Shaders <-- This is indicative of DX8/DX9 cards... DX7 cards had no programmable shaders at all
- Lightspeed Memory Architecture II
- Accuview Antialiasing
- 4 dual-rendering pipelines
- 8 texel/cycle
- Dual cube environment mapping
- 8.8GB/s memory bandwidth
- 125M vertices/sec
- 4.4G AA samples/sec fill rate
- 1.12T operations/sec
- DX8.1 Card
GF4Ti4200
- Features the nVidia nfiniteFX II engine
- Dual Programmable Vertex Shaders <-- This is indicative of DX8/DX9 cards... DX7 cards had no programmable shaders at all
- Lightspeed Memory Architecture II
- Accuview Antialiasing
- 4 dual-rendering pipelines
- 8 texel/cycle
- Dual cube environment mapping
- 8.0GB/s memory bandwidth
- 113M vertices/sec
- 4.0G AA samples/sec fill rate
- 1.03T operations/sec
- DX8.1 Card
GF4MX460
- 2nd Generation T&L Engines
- Non-programmable Shading rasterizer with 24 of 26 DX8 pixel shading functions <-- so is this really a DX8 card?
- 38M Triangles/sec
- 1.2G texel/sec fill rate
- 600M pixel/sec fill rate
- Single cube environment mapping
- 8.8GB/sec memory bandwidth
- According to MSI, this is a DX8.1 capable card
GF4MX440
- 2nd Generation T&L Engines
- Non-programmable Shading rasterizer with 24 of 26 DX8 pixel shading functions <-- so is this really a DX8 card?
- 34M Triangles/sec
- 1.1G texel/sec fill rate
- 540M pixel/sec fill rate
- Single cube environment mapping
- 6.4GB/sec memory bandwidth
- According to MSI, this is a DX8.1 capable card
GF4MX420
- 2nd Generation T&L Engines
- Non-programmable Shading rasterizer with 24 of 26 DX8 pixel shading functions <-- so is this really a DX8 card?
- 31M Triangles/sec
- 1.0G texel/sec fill rate
- 500M pixel/sec fill rate
- Single cube environment mapping
- 2.7GB/sec memory bandwidth <-- OUCH!
- According to MSI, this is a DX8.1 capable card
GF3Ti200Pro
- nfiniteFX engine for full programmability <-- something the GF4MXs don't have
- Lightspeed Memory Architecture <-- yet another thing the GF4MXs don't have
- Programmable Vertex Shader <-- and again
- Programmable Pixel Shader <-- and again
- The manual says "integrated hardware T&L" so I would imagine only 1st generation T&L
- 2.8G AA samples/sec fill rate
- 6.4GB/sec memory bandwidth
- DX8.1 capable card
Interesting that they don't specify the pixel and texel fill rates.
FROM THE ASUS MANUALS
GF3Ti500Pro
- nfiniteFX engine for full programmability <-- something the GF4MXs don't have
- Lightspeed Memory Architecture <-- yet another thing the GF4MXs don't have
- Programmable Vertex Shader <-- and again
- Programmable Pixel Shader <-- and again
- The manual says "integrated hardware T&L" so I would imagine only 1st generation T&L
- 3.84G AA samples/sec fill rate
- 6.4GB/sec memory bandwidth
- DX8.1 capable card
Interesting that they don't specify the pixel and texel fill rates.
Summary
All the GeForce4 Ti model cards are based on the NV25 core
All the GeForce4 MX model cards are based on the NV17 core
I have no idea what the chip model is for the GF3Ti cards (not even sure they made GF3MX cards)
If you have the budget, jump to a GF4Ti; otherwise hunt around for a GF3Ti.
According to MaximumPC's August issue, they state in their final summaries that any nVidia GeForce3 and any nVidia GeForce4 card is a DirectX 8 compliant card; however, their first paragraph states that DirectX 8 cards were the first to include programmable shaders on the GPU.
Furthermore, in MaximumPC's June issue, they compared some of the older video cards:
- GF3: "...Although quite anemic compared to today's DirectX 9 accelerators, the GeForce3 holds its own in the poor-boy scene. But will a core speed of 200MHz and 230MHz DDR be enough to trump the GeForce4 MX 460?"
- GF4MX460: "When you see the 'MX' designator, you think 'budget'. But while nVidia's budget video card, the GeForce4 MX 460, doesn't run programmable shaders in hardware, it does feature higher core and memory speeds than the programmable GeForce3 (300/275MHz as opposed to 200/230MHz), and more memory bandwidth than the GeForce4 Ti4200. We didn't think the nForce2 could match the raw power of the GeForce4 MX 460."
- The Contest: "...And despite our conjecture that the highly clocked GeForce4 MX 460 would be the overall champ, the old GeForce3 nosed by it to steal victory. It turns out that in this contest, the GeForce3's 57 million transistors and four pipelines beat the GeForce4 MX 460's 29 million transistors and two pipelines - core and memory speeds notwithstanding."
Personally, I tend to agree with the June issue's judgement about the GF4MX and the GF3 (and they were comparing with the MX460 -- odds are, the one you will find is an MX440; they did not, however, specify whether it was a GF3Ti200 or GF3Ti500). It seems their August issue made a slight goof-up in classifying ALL GF4s as DirectX 8 capable. (Even according to the specs, the MX cards only feature 24 of 26 DX8 pixel shading functions.) So in summary (hehe, this has turned out to be quite the rant): if you have the money, hunt down a GF4Ti; if you don't, then (if you have the time, as they are incredibly hard to come by, in my experience) hunt down a GF3Ti; and as a last resort, grab a GF4MX.
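(Side note, not from the magazines: if you ever want to check what class of card you actually have, you can query Direct3D's capability bits directly. A minimal, hypothetical C++ sketch against the DirectX 8 SDK - a GF3/GF4Ti should report pixel shader 1.1 or better, while a GF4MX should report none:)

```cpp
// Sketch: query the primary adapter's caps via Direct3D 8.
// Build against the DirectX 8 SDK and link with d3d8.lib.
#include <d3d8.h>
#include <stdio.h>

int main()
{
    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Ask the HAL (the video card's driver) what it supports in hardware.
    D3DCAPS8 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // The version DWORDs encode major/minor in the low two bytes.
    printf("Vertex shader: %lu.%lu\n",
           (unsigned long)((caps.VertexShaderVersion >> 8) & 0xFF),
           (unsigned long)(caps.VertexShaderVersion & 0xFF));
    printf("Pixel shader:  %lu.%lu\n",
           (unsigned long)((caps.PixelShaderVersion >> 8) & 0xFF),
           (unsigned long)(caps.PixelShaderVersion & 0xFF));

    // A true DX8-class card (GF3, GF4Ti) exposes programmable pixel
    // shaders in hardware; a GF4MX, despite the name, should not.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        printf("Hardware pixel shaders present: DX8-class card\n");
    else
        printf("No hardware pixel shaders: DX7-class part\n");

    d3d->Release();
    return 0;
}
```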
Applecorp
Member
24. August 2003 @ 08:07
I think I'm going to go for the card I found. I know I can find it cheaper, not by much, but as you said, Abit are good; my GF3 Siluro is an Abit and I am very happy with its performance.
Shoey
Suspended due to non-functional email address
24. August 2003 @ 09:28
Asus K8N nVidia nForce3 Pro 250 GB, Athlon 64 3200+, Hitachi 80 GB SATA-150, Corsair XMS 1 GB PC4000, ATI Radeon Sapphire 9600 Pro (256 MB DDR), Windows XP Pro (64-bit), Lite-On SOHD-167T, Plextor PX-712SA, BenQ 1640.
Praetor
Moderator
24. August 2003 @ 10:11
Yes indeed, an excellent card! Beware the MSI drivers though - they tend to be a bit dated... and finicky at times. I would recommend you use the nVidia reference drivers unless you have a specific reason not to. :-)
Praetor
Moderator
26. August 2003 @ 20:38
Finally dug it up... now to put this DX7/DX8 thing to rest hehe. According to PC Gamer (April 2002):
Quote: The GeForce4 Ti is NVIDIA's next-generation 3D technology. It handles all the DirectX 8 features (and more) in hardware and is faster and more powerful than the GeForce3. Good stuff. No problems here.
GeForce4 MX, on the other hand, is a DirectX 7 part, and unlike GeForce3 or GeForce4 Ti, it has no programmable pixel or vertex shaders. In fact, based on transistor count and feature set, the chip looks to be roughly equivalent to an exceptionally fast GeForce2.
So it's decided... I'm surprised Maximum PC made such a goof-up :S ...all the GF4Tis use the NV25 core while the GF4MXs use an NV17 core.
Applecorp
Member
27. August 2003 @ 06:25
I was surprised the GF3 series didn't last that long; the GF2 series seemed to go on indefinitely.
But the leap between 2 and 3 is quite immense, don't you agree?
My GF3 card quite happily runs today's most demanding games.
Also, what are your thoughts on DirectX 9.0b for my GeForce2 Pro and later cards?
Praetor
Moderator
27. August 2003 @ 10:08
Yes indeed, I would agree the jump from GF2 to GF3 is immense... the introduction of the programmable pixel/vertex shader will account for that. Also, the GF3s are DX8-hardware cards whereas the GF2s are DX7 cards. All this fancy crap just means that the hardware is natively capable of handling DX7/DX8 instructions. If I am not mistaken, if your hardware does not support, say, DX9 instructions, then those instructions are relegated to your CPU, which kinda negates the point of having a video accelerator in the first place hehe. This of course only applies to specific application calls for DX9 effects and not necessarily to the entirety of the game (I hardly believe that the main menu is 3D-rendered and requires extensive hardware DX9 support hehe).
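(To illustrate that fallback idea with a hypothetical sketch, not something from the thread: a Direct3D 8 game typically checks the caps at device-creation time and, if the card can't transform and light vertices itself, asks for software vertex processing, i.e. the CPU does that work. Here 'd3d', 'hWnd' and 'pp' are assumed to exist already:)

```cpp
// Sketch: choose hardware vs software vertex processing in Direct3D 8.
D3DCAPS8 caps;
d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

// If the GPU lacks hardware T&L/vertex shaders, the vertex work is
// "relegated" to the CPU, exactly as described above.
DWORD vp = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
               ? D3DCREATE_HARDWARE_VERTEXPROCESSING
               : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

IDirect3DDevice8* device = NULL;
d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd, vp,
                  &pp, &device);
```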
Quote: My GF3 card quite happily runs today's most demanding games.
That it may, but with the coming of HL2 (which I'm not looking forward to hehe) and DOOM 3 (which I am looking forward to), your current hardware setup may be pushed just a little bit further than even some of the more demanding games now hehe. I would venture to guess that to play HL2/D3 at an "enjoyable" level one would need a GF4Ti-series or better card... a GF3 may do it, but your enjoyment factor will be heavily dependent on the other aspects of your hardware.
Applecorp
Member
14. September 2003 @ 05:43
I eventually bought a used GF4 Ti4200 for £60 on eBay; not bad, I think.
I too am very much anticipating HL2 and DOOM 3, as I would prefer to play them on PC rather than my Xbox. Is HL2 coming to Xbox?
Would my new card be sufficient to play these games with a decent framerate, or would my CPU (Athlon 1GHz), RAM (256 MB) etc. drag it down? I'm pretty sure I need more RAM at least.
Shoey
Suspended due to non-functional email address
14. September 2003 @ 05:49
Quote: RAM (256 MB) etc. drag it down? I'm pretty sure I need more RAM at least.
You'll get the best performance upping your video card RAM if you're a PC gamer, m8. Sure, there are "tweak" programs out there to help.
Seriously consider upping your RAM to at least 512 MB. I'm running 512 as it is and I'm not comfortable, and I'm soon to up to 1 gig.
Shoey :)
Asus K8N nVidia nForce3 Pro 250 GB, Athlon 64 3200+, Hitachi 80 GB SATA-150, Corsair XMS 1 GB PC4000, ATI Radeon Sapphire 9600 Pro (256 MB DDR), Windows XP Pro (64-bit), Lite-On SOHD-167T, Plextor PX-712SA, BenQ 1640.
Applecorp
Member
14. September 2003 @ 06:18
Thanks, I'll probably add another 512mb.
So no CPU problems then? If I had to change it, it would mean buying another mobo. My CPU seems to be performing very well all things considered; I recently did a benchmark/stress test and it passed.
Praetor
Moderator
14. September 2003 @ 10:21
Athlon 1GHz... I take it that's the TBird or the original Athlon... in either case, you may find that HL2 and D3 run slowly (more so if you ramp up the quality settings hehe). Consider upgrading the memory, dude. I'm running 1GB and it's not enough hehe...
About upgrading the CPU... you won't have to upgrade the mobo if you stay within the same CPU class (generally). However, since the Athlon/TBird is ancient as far as the newer systems go, if you want to upgrade the CPU to something newer (a Palomino/TBred/Barton) you will have to upgrade the mobo... look around, but you should be able to find the ASUS A7X8X-X mobo (same as me) and a TBred 2400 fairly cheap.
Shoey
Suspended due to non-functional email address
15. September 2003 @ 01:27
Quote: I'm running 1GB and it's not enough hehe...
Wonders why? (just kiddin' m8)
You can do wonders if your mobo supports the AthlonXP 2100+, as this has a multiplier of 13x (I think) and you can overclock. If your mobo doesn't support this high of a CPU, maybe a BIOS flash "might" get you there, but I seriously doubt it. Look at some Asus & MSI (Micro Star) mobos, as you can find one to suit your needs at a reasonably fair price.
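(For reference, assuming the usual 133MHz FSB for that chip: 13 x 133MHz = roughly 1733MHz, which is the XP 2100+'s stock clock, so raising the FSB or the multiplier scales the clock up from there.)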
Shoey :)
Asus K8N nVidia nForce3 Pro 250 GB, Athlon 64 3200+, Hitachi 80 GB SATA-150, Corsair XMS 1 GB PC4000, ATI Radeon Sapphire 9600 Pro (256 MB DDR), Windows XP Pro (64-bit), Lite-On SOHD-167T, Plextor PX-712SA, BenQ 1640.
Applecorp
Member
15. September 2003 @ 03:53
My mobo is several years old; it doesn't support CPU speeds above 1.4 GHz. That's why I mentioned I would have to change it to get a decent CPU upgrade.
Praetor
Moderator
15. September 2003 @ 08:53
Well the 1GB just kinda gets used lol. Yea on a more serious note... you should be able to find a decent MSI/ASUS board for a good price nowadays.
Applecorp
Member
28. September 2003 @ 06:35
Back again, lol.
My ABIT Siluro GeForce4 Ti4200 128MB OTES (yes, the suspect thermal exhaust) card, which I received the other day, is faulty; read on.
Unfortunately it is rubbish because it doesn't work properly, and loads of people have had problems with it.
Here is a link to the abit forum where you can see a screenshot.
http://forum.abit-usa.com/showthread.php?s=a57200474fb44209b26d5a84d9c8e83e&threadid=20450
What do you think of this? People are having various problems (worse than mine!) and the common opinion is that the card has bad RAM. My exact issue matches the one 'equilibrium' is having on that forum, i.e. dots all over the screen in 3D games. Otherwise it performs OK.
Praetor
Moderator
28. September 2003 @ 07:24
Abit makes nice mobos, dude... not video cards! (I doubt that card was less than the MSI FX card heehee). Seriously though, does this happen all the time or only when, say, aniso is enabled?
Shoey
Suspended due to non-functional email address
28. September 2003 @ 07:28
Google up nVidia Detonator and download the latest version of the drivers to see if this might be a fix, m8.
Shoey :)
Asus K8N nVidia nForce3 Pro 250 GB, Athlon 64 3200+, Hitachi 80 GB SATA-150, Corsair XMS 1 GB PC4000, ATI Radeon Sapphire 9600 Pro (256 MB DDR), Windows XP Pro (64-bit), Lite-On SOHD-167T, Plextor PX-712SA, BenQ 1640.
Applecorp
Member
28. September 2003 @ 08:51
OK, Praetor first: when you say aniso, do you mean anisotropic filtering? I'm not sure if it's enabled or not, but it happens on every single game I play, 3DMark included (and I'm not sure how to en/disable aniso?), though I don't think that is the problem.
Shoey, I have tried many drivers - Dets, Abit's, all sorts!! No difference. I have lowered core and memory clock speeds, all without any improvement. :(
I think that basically the card is bad, and there is no fix.
Praetor
Moderator
28. September 2003 @ 11:09
To get to the aniso settings:
1. Control Panel
2. Display
3. Settings
4. Advanced
5. The tab named after your video chip
6. Performance & Quality Settings
7. Set aniso to off and turn off texture sharpening
I am inclined to believe your vid card is el toasto though.
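(For the curious, that setting just toggles a texture-filtering mode. A rough, hypothetical Direct3D 8 sketch of what a game does under the hood - 'device' is assumed to be an already-created IDirect3DDevice8*:)

```cpp
// Anisotropic filtering ON for texture stage 0 (D3D8 exposes this
// through texture stage states; D3D9 later moved it to sampler states):
device->SetTextureStageState(0, D3DTSS_MINFILTER, D3DTEXF_ANISOTROPIC);
device->SetTextureStageState(0, D3DTSS_MAXANISOTROPY, 8);

// Anisotropic OFF again (fall back to plain linear filtering):
device->SetTextureStageState(0, D3DTSS_MINFILTER, D3DTEXF_LINEAR);
device->SetTextureStageState(0, D3DTSS_MAXANISOTROPY, 1);
```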