The Official Graphics Card and PC gaming Thread

AfterDawn Addict
4 product reviews
21. February 2013 @ 17:14
Interesting that you should say that about the graphics, given that you're a far more discerning graphics enthusiast than anyone else I know. It's in contrast to several reviews I've read, which suggested that Battlefield 3's visuals were still slightly superior and found Crysis 3's graphics disappointing.
Yeah, I don't see AMD going away any time soon. It's important to remember that they have won all three games console contracts this generation, which will provide them with an enormous amount of income. The HD7970 1GHz flagship is already quite reasonably priced, with some examples available for just under £300 ($380+tax equivalent) over here. Not a lot for what AMD are suggesting is the best single GPU out there. Excluding the $1000 Titan, they may well be right.
The 1GB HD7850, a bit short on memory I know, is being pushed as a good 'value' card, at just £120, or $150+tax. Even the 2GB example is pretty inexpensive considering it's basically the modern equivalent of an HD6970.
What would you see the prices at?
AfterDawn Addict
15 product reviews
21. February 2013 @ 17:18
Crysis is a higher quality game than Crysis 3 by far. Actual graphical quality is no substitute for atmosphere and art direction. In its best form, Crysis 3 is a shadow of Crysis 1. Crysis and Warhead remain undefeated, for now.
Battlefield 3 continues to take my breath away. It and Crysis are only a hairsbreadth apart.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
This message has been edited since posting. Last time this message was edited on 21. February 2013 @ 17:21
AfterDawn Addict
4 product reviews
21. February 2013 @ 17:21
Fair enough, that's kind of disappointing. Still, will give it a try at some point.
AfterDawn Addict
15 product reviews
21. February 2013 @ 17:30
As far as prices go, I'd like a pair of 7970s at $300 each, which is what I picked up the 4870s for only a month after release. The 4870 was, at the time, AMD's single fastest GPU, relative performance to the competing product aside. The 7900 series is especially strong, but launched a bit more expensive than recent generations for its price bracket. That has changed somewhat recently, but I'd like to see it lower.
This message has been edited since posting. Last time this message was edited on 21. February 2013 @ 17:32
AfterDawn Addict
4 product reviews
21. February 2013 @ 18:14
Interestingly, HD7970s seem a bit more expensive in the US. You can get HD7950s for $300, but of course that's a fair bit away from the bleeding edge, as the HD7970 1GHz edition is a fairly substantial 32% faster than the HD7950. (The HD4870 was only 25% more powerful than the HD4850; what we're really comparing here is more like an HD4890 to an HD4850, and those were 40% apart.)
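For reference, those percentage gaps work out like this. A minimal sketch in Python; the frame rates used are illustrative placeholders, not benchmark results:

```python
# Relative performance expressed as a percentage gap.
def pct_faster(a_fps: float, b_fps: float) -> float:
    """How much faster card A is than card B, in percent."""
    return (a_fps / b_fps - 1.0) * 100.0

# If an HD7950 managed 30 fps (placeholder number), a card 32% faster
# would land around 39.6 fps.
hd7950 = 30.0
hd7970_ghz = hd7950 * 1.32
print(f"{pct_faster(hd7970_ghz, hd7950):.0f}% faster")  # prints "32% faster"
```

The same one-liner covers the 4850/4870/4890 comparisons above: a 40% gap just means the ratio of the two frame rates is 1.40.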
Also, on the Crysis 3 subject - with the release version optimisation, it's not actually looking too cringe-inducingly demanding:
CRYSIS 3 (Very High) - No AA - Expected Min/Ave fps
HD6850: 1280x720 30/34fps
HD6870: 1440x900 28/32fps
HD6950: 1600x900 30/33fps
HD6970: 1600x900 34/39fps
HD6990: 1920x1200 45/49fps / 2560x1600 30/34fps
HD7750: 1280x720 29/31fps
HD7770: 1366x768 33/36fps
HD7850: 1680x1050 29/33fps
HD7870: 1920x1080 30/34fps
HD7950: 1920x1200 31/34fps
HD7970: 1920x1200 36/39fps
HD7970GE: 1920x1200 41/45fps
HD6850CF(EST): 1920x1080 30/33fps HYPOTHETICAL - VRAM REQUIREMENT NOT MET
HD7870CF(EST): 2560x1440 43/47fps
HD7950CF(EST): 2560x1440 48/52fps / 2560x1600 44/47fps
HD7970GECF(EST): 2560x1600 58/64fps
CRYSIS 3 (Very High) - 4x SMAA - Expected Min/Ave fps
HD6950: 1440x900 27/30fps
HD6970: 1600x900 30/35fps
HD6990: 1920x1200 35/39fps / 2560x1440 26/29fps
HD7770: 1280x720 29/34fps
HD7850: 1600x900 29/32fps
HD7870: 1680x1050 28/32fps
HD7950: 1920x1080 27/30fps
HD7970: 1920x1080 31/34fps
HD7970GE: 1920x1200 33/36fps
HD7870CF(EST): 2560x1440 32/35fps
HD7950CF(EST): 2560x1440 36/38fps / 2560x1600 33/35fps
HD7970GECF(EST): 2560x1440 48/50fps / 2560x1600 44/46fps
This message has been edited since posting. Last time this message was edited on 21. February 2013 @ 18:14
AfterDawn Addict
15 product reviews
25. February 2013 @ 23:06
Quote: HD6850CF(EST): 1920x1080 30/33fps HYPOTHETICAL - VRAM REQUIREMENT NOT MET
Thanks for the ballpark estimate. I will be exploring some tweaks and other things but it seems like video memory has finally caught up with me.
Crysis 3 has proven to be slightly worse quality than Crysis 2. It's optimized worse (many examples), which I thought was impossible, but has more eye candy thrown on. It looks damn good, as did Crysis 2 with the official add-ons. Crysis 3 is almost a graphical rival to the original Crysis, but Warhead certainly edges it out. Crysis 3 includes much of what got left out of Crysis 2, but still lacks the overall polish and art direction of the original. It has every bit the eye candy, just not put together nearly as well.
Crysis & Warhead: Graphics 9/10 Gameplay 9/10
Crysis 2: Graphics 8/10 Gameplay 7.5/10
Crysis 3: Graphics 8.5/10 Gameplay 7/10
FarCry 3: Graphics 8.5/10 Gameplay 8.5/10
Worth saying that all of the Crysis series have good graphics mods. I have compiled collections for Crysis 1 and 2. With the mods present, Crysis 1 is easily the best-looking game ever made - currently my 10/10 standard. Crysis 2 doesn't have nearly the community. Will note that Crysis 2 is improved noticeably by the official add-ons.
None of them are fundamentally bad games, but Crysis 3 suffers from CoD syndrome in a bad way. We're not talking Infinity Ward's usual heart-pounding thrill ride, we're talking Treyarch's on-rails shooting gallery with historical references. Both Crysis 2 and 3 suffer from bad writing and level design while Crysis 1 was well written and well paced.
I am finding myself continually impressed with FarCry 3. Well optimised but certainly requiring a significant upgrade. I may be buying a pair of 2GB 7850s and keeping with tradition. I am keeping to mostly very high settings right now. Ultra high is damn pretty, but demanding. Still lacks some variety like its predecessor, but such a major improvement. Great game.
This message has been edited since posting. Last time this message was edited on 26. February 2013 @ 15:18
AfterDawn Addict
15 product reviews
1. March 2013 @ 16:09
Just an update: The Catalyst 13.2 Beta driver fixes a lot of glitches for my cards and helps performance considerably in FarCry 3 and Crysis 2. Have also experienced generally better stability.
AfterDawn Addict
4 product reviews
1. March 2013 @ 17:52
Interesting, I'll keep that in mind, been a long time since I updated my drivers. If it ain't broke etc...
harvardguy
Member
3. March 2013 @ 01:25
Hmmm. Terrific information guys.
Sam, I should quote your 8782 post, with highlights of your history with quad CF, in its entirety, but I will save that post by printing it out to my appropriate PaperPort folder. I found one part interesting: when you went to an "insane test" but stopped the CPU part of the test when you saw the wall power draw climb to dangerous levels, haha.
You are definitely the pioneer and you blazed an amazing trail.
I might follow you into quad CF territory. But even if I don't, with an upgrade of "just" dual 7970s I think I am beyond the capabilities of this Toughpower 750, especially if I put the Nehalem board in the Spedo and then try to overclock it. Would you agree?
So what do you guys like for an 850 or 1000 watt PSU? I did take note of your comments, Sam, about that digital Corsair 1200 maybe being a bit over the top. But if I have to pick up another PSU, I might as well invest in one big enough for quad CF in case I end up going that route. My reasoning is, a 1200 won't burn any more juice than an 850, but it unlocks the potential for me to go all the way to quad CF. And I doubt the price differential would be much more than $125. The Corsair 1200 digital had about a $330 price point, as I recall.
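A rough sketch of that sizing arithmetic. All the wattage figures below are ballpark assumptions for illustration, not manufacturer specs:

```python
# Back-of-the-envelope system load for an N-GPU build.
def system_load_watts(gpu_watts: int, gpu_count: int,
                      cpu_watts: int, rest_watts: int) -> int:
    """Peak draw: GPUs plus CPU plus everything else (board, fans, drives)."""
    return gpu_watts * gpu_count + cpu_watts + rest_watts

# Assumed figures: ~200W per card under gaming load, ~140W for an
# overclocked CPU, ~100W for the rest of the system.
load = system_load_watts(gpu_watts=200, gpu_count=4,
                         cpu_watts=140, rest_watts=100)
psu_rating = 1200
headroom = (psu_rating - load) / psu_rating
print(f"~{load}W estimated load, {headroom:.0%} headroom on a {psu_rating}W unit")
```

Under those assumptions a quad-CF rig lands around 1040W, which is why an 850W unit is out of the question and even a 1200W unit is only comfortable, not generous.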
HERE'S WHERE I'VE BEEN FOR A MONTH
I have not touched Far Cry 3 for a month (sorry about no new screenies, Kevin, lol), as I have been updating my 3 business computers, each of which runs W98 and XP in dual boot. I might need them if I get back into real estate the way I used to do it, with the help of high school seniors sending out mail, putting call lists together, etc. That little project led me to pick up 5 used IDE drives, some with significant hours (power-on hours count from SMART data). One of them is identical to 3 500GB IDE WD drives that I picked up from Newegg 4 years ago. The interesting thing there is that two of those 500GB drives have a few thousand hours on them, but one had a total of only 17 hours - however, that one had a significant number of bad sectors.
It turns out you aren't supposed to put hard drives in a box on a shelf somewhere - they don't really like that. Anyway, Western Digital's thorough write-zeros test actually improved several drives that were giving me problems and brought their test scores up to perfect, eliminating pending sectors, offline uncorrectable sectors, etc.
My small amount of research on archival data storage indicates that you should really keep your drives active, like running a car once in a while. So I now plan to exercise all dozen or so outside-the-case drives every 3-6 months at a minimum, for a day or two each, to keep them "in shape."
That is great news to hear you guys mention about AMD getting those lucrative console contracts - yay - not wanting to see them disappear.
MY UPGRADE PATH IN THIS FULL-TOWER SPEDO
First of all, I ordered all silverstone 140mm intake filters, six of them, to allow my case to breathe better - the silverstone older filters have a much looser mesh. I put the Demciflex very fine mesh filters (silverstone also introduced one like that) away, as it's a compromise between airflow and dust, so I am going to allow a little bit of dust for better airflow. These filters have magnets and stick to the case - so Sam - maybe you shouldn't run unfiltered side intakes, lol.
Anyway, the 7950 purchase was largely due to the mid-tower case - they consume less energy and put off less heat. I was VERY concerned about heat in that smaller case. So I made the big move, up to the full tower, and now with the new filters I have taken off the speed reducer from the ceiling fan, because I definitely have positive case pressure, so that ceiling exhaust can now run at full speed.
So I think my upgrade path is to settle on MSI Lightning 7970s, at about $500 each, getting two to start with, and possibly a third in the future. And I need a new PSU.
Jeff, before you buy those 7850s, I think we should talk about whether you could take advantage of these 7950s, depending on your timing. If I am going to upgrade, I don't know why I would keep them, and I would almost rather see you max out your 2.3 megapixel gaming rig now with your 6-core, to keep me up on games that I might like to play, like you did with Rage, the two OF games, and Metro 2033 (remember when I panned the game - "oh that thing in the subway with those radioactive freaks" haha.) So if you have any interest, we should talk.
The thing for me, regarding timing, is picking up the Nehalem rig that is sitting up there in LA, waiting for me to go up there and fix the photo server that I fixed two years ago with the dual RAID mirror setup. It's a registry problem. That was almost scheduled for this past week, but some family illness up there derailed that trip. I already invested in the Silver Arrow Extreme sitting in the cabinet above my head. With your great charts, Sam, and the fine-tuning Crytek has done since release, and that new Catalyst, the Gigabyte 1366 motherboard in the Nehalem rig should work well for me if I can get 4GHz out of it. That would boost me by 43% over where I am now with the 9450 at 3.343GHz, adding the 20% clock-for-clock speed advantage that you mentioned.
On the graphics side, I see that newegg has the dual gpu powercolor 7990 for $900.
For the 7970, I really like the MSI Lightning 7970, at $500, which newegg discontinued, but Tiger has them. They don't have dual link dvi so you have to add an extra $95 active adapter - BUT YOU ONLY NEED ONE FOR ONE MONITOR.
The 1366 gigabyte board, assuming it works, which comes with the nehalem chip and 12 gigs ram, is the GA-X58A-UD3R I believe. It's definitely the tri CF mobo, I remember that. There are 800 reviews on newegg, but a lot of them are bad, lol.
QUAD CF, OR JUST TRI CF?
Anyway, I could run tri cf, rather than quad cf I guess.
Let me ask you guys. Even with one dual gpu card, one still has the option of running tri cf, is that correct? If you get one 7990, you can add a 7970 to that. Am I right? So I could be looking at $330 for a psu, $900 for a 7990, and $500 for a 7970, plus $95 for a dongle for the lightning card, or about $2,000, less whatever the 7950s fetch back.
Or like I say, 3 discrete msi lightning 7970s clocked at 1070 each, running $500 apiece and no state tax at tiger, plus one $95 dongle. That might be better than a 7990 plus a single 7970. However, they would be crammed right up next to each other, but still - two gpus on the same board have major heat issues, right?
The only advantage of the 7990 plus a 7970, is that you still have the ability to trade out the single 7970 and get another 7990, for quad cf. So there is still some graphics headroom if you need it. But I suppose that tri cf should be plenty of power for quite a while.
Anyway, I would say that for starters, a corsair 1200 psu at $330 is a requirement - I started to say that cards in the future will burn less power, but who knows - as Sam said, they are burning MORE power, not less, lol.
I guess my timing is going to be when I pick up the nehalem and make sure I reseat the stock heat sink, and verify that it does not exhibit the high temps that it was displaying. Also, that motherboard has a lot of bad newegg reviews - that could be the problem.
FINAL THOUGHTS ON LIGHTNING 7970 CARDS
Anyway, the MSI is $500 at Tiger Direct, clocked at 1070. By removing the "reactor" on the back, you can run 3 of them right next to each other. So: use two of them at first, with the reactor, and crank from 1070 to 1200 like some do, for about 25% more GPU power than I have now. Later (if Crysis 3 is OK for now with two), add a 3rd card, remove the reactor, drop back to stock clocks, and pick up about 45% more power than I have now, assuming 80% scaling for the 2nd card and 60% scaling for the third. That's a nice upgrade path that should last 3 years.
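A quick sketch of that scaling arithmetic. The per-card factors (100% / 80% / 60% for cards 1 through 3) are this post's rough assumptions, not benchmarked numbers:

```python
# Effective GPU count under diminishing multi-GPU scaling.
def effective_gpus(per_card_scaling):
    """e.g. [1.0, 0.8, 0.6] -> 2.4 'cards worth' of performance."""
    return sum(per_card_scaling)

two_way = effective_gpus([1.0, 0.8])
three_way = effective_gpus([1.0, 0.8, 0.6])
print(f"2-way: {two_way:.1f}x, 3-way: {three_way:.1f}x a single card")
print(f"3rd card adds {three_way / two_way - 1:.0%} over 2-way")
```

Under those assumptions, a third card adds about a third more performance over a two-card setup, before any per-card overclock is factored in.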
Rich
This message has been edited since posting. Last time this message was edited on 3. March 2013 @ 03:47
AfterDawn Addict
7 product reviews
3. March 2013 @ 12:51
Indeed. I exercise my hard drives at least once a month :p
My 3Tb drives have pretty much been running since day one :D
To delete, or not to delete. THAT is the question!
This message has been edited since posting. Last time this message was edited on 3. March 2013 @ 12:51
AfterDawn Addict
7 product reviews
3. March 2013 @ 14:54
Sorry about the double post, but I want to ensure this gets seen ;)
Do GPUs require full power at start-up? E.g. the GTX 260? Because I believe the following power supply is enough for a simple test.
http://www.newegg.com/Product/Product.aspx?Item=N82E16817151072
I'm gonna see if it's dead once and for all! LOL! I'm betting that it's merely the fan that ramps up to max, and not the cores themselves. And anyhow, the seasonic can probably handle the max watts for a limited time anyway.
It's going in an older board for testing. That way my current/modernish board remains safe ;)
This message has been edited since posting. Last time this message was edited on 3. March 2013 @ 14:55
harvardguy
Member
3. March 2013 @ 17:12
Kevin,
I think you're right. Big fan noise only - not heavy core activity.
(So were you saying you never turn any of your disks off, lol.)
I have several computers that exhibit big fan noise on re-boot. That way I can tell, when I press Ctrl-Alt-Del, whether the CPU has actually restarted the POST - for example, when I unplug an IDE drive that it didn't detect, to reset a master/slave jumper on non-WD disks, or older WD disks with very cryptic jumper diagrams. Instead of just hitting the power button off like I used to, I have begun to do my plugging and unplugging while the power is on, to reduce the power-on cycles of the other connected and/or internal drives, some of which are still CSS technology (contact start/stop). I don't personally think there's any harm in doing that while I'm still in POST.
(By the way, I had a crazy short circuit while messing with one of the computers - I had an extra molex hanging down, and just pushed it into the case to get it out of the way, when BING - the computer shut off. WTF!!@$@# There was a tiny flash of light - some F#@%#ing random case screw was lying there hiding itself in the bottom of the case, and it had touched the inside of that molex connector for an instant. Shame of me for not knowing it was there - I'm usually much more tidy than that! Anyway I could detect no permanent harm - a little instant-fuse in the power supply I guess, good for that allied power supply.)
By the way, speaking of allied, they are super cheap, but actually somewhat heavy indicating decent iron inside, and my pc builder friend Mo used to use them all the time - maybe you could check and get one for closer to $30. Wow, the egg has one.
Edit: It's putting out slightly less amps on the 12 volt rail, 14 amps vs 17 on the seasonic. I think still enough for your test, if that's all you really want it for.
I was doing a lot of that kind of testing this past month, waiting for that big fan surge. "Ok, here we go, let's see if it will detect it now."
The fans will max just sort of to test them, but not the cores. And even if they do max for an instant or so, that will not burn out the power supply - it won't have time to heat up and destroy the components in just a few seconds - Sam may want to correct me - but PSUs can, in small few-seconds chunks of time, safely put out much more than rated power.
That's my understanding. Anyway, yes, we need to see once and for all if that card is any good, lol.
Rich
This message has been edited since posting. Last time this message was edited on 3. March 2013 @ 17:18
AfterDawn Addict
7 product reviews
3. March 2013 @ 17:56
The PC I want to test it in, already has the Seasonic.
My internal HDDs are always running. I don't allow them to idle, though I believe the heads still unload. And since my PC is running nearly 24/7... they haven't gotten much of a break LOL!
I'll be testing the GTX 260 today. As soon as I can find a double molex to 6 pin adapter. I believe I have several. Somewhere...
I just fear that it may be inconclusive, if the seasonic can't power it :S I guess I'll know immediately, whether or not.
AfterDawn Addict
7 product reviews
3. March 2013 @ 19:36
It works!!! I was really leaning this way. I inspected the GPU with a magnifying glass more than twice. No burned spots, nothing appeared broken or bulging. I was just so afraid to plug it back in. When the leak occurred there was a POP sound, and the video got garbled. Since the GPU was drenched in the fluid, it was logical to assume it was fried somehow. But so was the power supply. So I'm assuming either the CPU overheated, or the power supply shut down in order to protect itself. Causing a visual error in windows ;)
Since it works, I can reapply thermal paste and sell it :D
harvardguy
Member
3. March 2013 @ 20:48
Very cool! Congratulations!!!
So all that kidding we gave you was for nothing - ddp saying - what did he say? "Fizz" or something like that, lol.
"HEY DDP, HE DIDN'T FRY HIS CARD, WHAT DO YOU THINK ABOUT THAT?"
Actually I don't think ddp wanders over here very much because we don't have the type of uncivilized arguments that characterize the builder thread. Haha.
Your card pulls 260 watts at full power and 160 at idle. That psu was putting out 200 watts on the 12 volt rail.
I imagine you didn't run the card too long. But - did you at least run the card for a few seconds in 3d? The fact that you get a 2d picture on the screen is not the same as the card going into 3d mode and rendering something. But you can't run anything for more than 10 seconds tops I would say.
I guess you could open gpu-z logging, real time, make sure you don't exceed about 10% load, and run the tiniest furmark window possible with no AA for about 10 seconds. Or something in unigine, tiny window. Anything 3d.
If it jumps to 50% load, count to 5 and then close the window. That will be good enough.
Rich
AfterDawn Addict
7 product reviews
3. March 2013 @ 21:16
That the GPU is putting out a signal at all is VERY good. That's what I wanted to know. Since it passed test one, now I can plug it into my main tower, which has a superior PSU. I allowed Windows to run for more than a minute, then restarted, then shut down. If the other computer ran it without trouble, I can trust it in my main rig ;) I just didn't wanna plug it into my main and blow a northbridge or something extreme like that!
I don't wanna stress it just yet, because when I took it apart, the thermal paste got all roughed up. I'd like to reapply it before stressing it ;)
This message has been edited since posting. Last time this message was edited on 3. March 2013 @ 21:18
AfterDawn Addict
15 product reviews
3. March 2013 @ 21:22
Just an FYI for anyone remotely interested in combat aviation, there is a new game in Open Beta called War Thunder made by Gaijin, the owner of the legendary IL-2 flight-sim franchise. It is a free to play WWII-based competitive flight combat game. It is in direct competition with a game made by World of Tanks creator WarGaming called World of War Planes. There are several modes of realism ranging from the deep, industrial-grade flight-sim of old requiring rudder pedals, stick, throttle, every key on the keyboard, and some basic flight experience, to a very simple, but powerful, "FPS style" keyboard interface with arcadey physics and plane realism that anyone can pick up and play. It uses the well established ACES engine which has been under the hood of every IL-2 game in the last several years so the overall quality of the game is excellent.
===================================================================
Oldschool IL-2 fans will rejoice as this is a true hardcore flight-sim, but it offers a much more widely appealing arcade mode which is a great game in its own right. It follows a free-to-play model with optional premium content somewhat similar to World of Tanks, including full-blown single-player campaigns (which WoT does not have) that can be played to earn bonuses, some excellent premium planes, and a premium credit system that can be used to purchase extra plane slots, premium-only planes, etc. It does have a cheap subscription fee for "Premium Status" which offers a bonus to plane and crew experience as well as credit income, greatly increasing the rate at which you can earn and purchase new planes. It is greatly similar to World of Tanks in many ways.
I have been playing hardcore IL-2 for years already so this arcade mode is really refreshing. The mouse and keyboard interface are fantastic and make for a very fun flight game without the intense plane management and decent joystick required of a full monty sim.
The majority of the players are in Arcade mode, and this is where the real fun is. Very fun casual game, very quick matches, usually 10-20 minutes. You have an ever-improving stable of planes from several major nations that took part in World War II, and you take your chosen 4 or 5 best ones into combat. Whenever a plane is damaged, you can try to land and repair, or it crashes, and you simply respawn in a new plane until you run out of planes or the planes you have left might not be profitable to use. Obviously landing and repairing earns experience and money and is more economical. You have a random objective, map, weather, and team composition (within competitive aircraft tiers), and you simply go at it. You make your best effort to work as a team and play strategically, and it really flows together smoothly, chaotically, gloriously.
Luckily my long-time learning process in IL-2 and the newer releases (Wings of Prey, Cliffs of Dover) has left me well versed in the basics, so I often get matches full of casual players and clean up quickly. There are just as many guys with a general idea about combat aviation though, so skill is everything. It's not brutal, but a good player in a powerful plane is a force to be reckoned with.
===================================================================
Both companies are attempting to make a land, air and sea combat game with all 3 game types happening on the same server: WarGaming with World of Tanks and World of War Planes (World of War Ships planned), and Gaijin with the recently released War Thunder (other components planned).
World of Tanks is already heavily entrenched in gaming culture after a year and a half of real exposure, and is a finely-honed game that I have spent countless hundreds of hours in since very early beta and maybe a couple hundred dollars on for premium vehicles and status. World of War Planes, on the other hand, is a poor quality, steaming pile that has no business attempting air combat and every facet of the gameplay is a joke compared to War Thunder's objective-based ballet of death. War Thunder is by far the better flight sim and has miles better mouse controls to boot. Gaijin understand flight combat, even a very arcadey game needs the right feel to be authentic. Gaijin does just this by giving you a very accessible arcade interface draped over the hardest of core flight sims ever made.
World of War Planes fails in many departments. I am currently still under an NDA about it (WoWP), so I can't give details or screens, but it simply isn't as good or as authentic/nostalgic as War Thunder by any stretch - and War Thunder is based on a 12-year-old engine I played as a kid, with all the modern trappings.
===================================================================
I recently spent $20 on a special offer in War Thunder just for the hell of it. That bought me two separate 4-5 hour campaigns that are worth in-game credits to complete, two excellent, high-level premium planes which earn bonus credits and XP on top of any other bonuses, and a full month of premium status which gives +100% XP and +50% credits.
In World of Tanks, $15 gets you a single month of premium which mean +50% credits and XP income. $30-60 will get you a nice, high-level premium tank which earns bonus credits and XP. $5-20 will get you a low to mid level one.
See the disparity of content to price here? World of Tanks is fantastic and worth every penny I've put into it, which includes $35 for a premium tank, and several months worth of premium over the last year. But War Thunder has given me what would easily be $80-100 worth of content in World of Tanks for $20.
This message has been edited since posting. Last time this message was edited on 3. March 2013 @ 22:16
harvardguy
Member
4. March 2013 @ 02:39
Hey Kev, I have a post put together about hard drives, and using SMART data to decide how to set your power scheme, but I'll post that some other time - I want to comment on Jeff's post above, and then talk about GTX680s - yeah the green guys!
First off, - man, Jeff, are you a gamer's gamer or what. Your level of expertise and the range of genres that you negotiate and enjoy and know how to play is astonishing. I refuse to even think about it! LOL
I can totally understand that would be a kick, and why it makes sense to put a little money in and get a better tank, or a better plane. That's the new formula, free to play but make it appealing to put a little money in. I think it's an extremely interesting marketing concept. Like TF2 which I have still never tried but I own it, and it is installed - it is now a free game. They sell you hats and things which makes the gaming even more fun, I hear.
But I would face a steep learning curve. Somehow, in Pacific Assault, my favorite Medal of Honor of all time, I managed to survive the dogfights, but it was tough. Second of all, it was tough. Did I mention that it was tough? Well, yes it was fun, but it was tough.
And in addition - "No no no no no" - I have to get my butt going and start to sell some houses.
So - yes I will get Crysis 3 and start playing it, eventually, and I'll get back into Far Cry 3 and do the new reset, which is a higher level of hardness, and repopulates all the enemy camps - that might be fun, we'll see.
Meanwhile, I spent all day on the internet today thinking about yesterday's Upgrade Path post. I also looked at a couple of PSUs, like a 1200 watt toughpower for $250.
Also, I read a lot of reviews about the dongle that converts a DisplayPort to a dual-link DVI port for the MSI 7970 Lightning. Then, monkeying around, I noticed that MSI also built a Lightning model based on the GTX680. I looked at that. It comes with two dual-link DVI ports - no dongle needed. How on earth did they decide to cripple the 7970 version of the Lightning with NO dual-link DVI ports? No wonder the egg doesn't carry it anymore. What a bunch of jerks!
So, for 2560x1600 gaming, I'm looking at the green guys now. Especially since as Sam recently mentioned, Nvidia, although still a shady company, has reasonable products lately at fair prices, and the GTX 680 Lightning at newegg is only $500, no dongle needed. Plus AMD, in cancelling their 8000 launch, are temporarily at least giving up the high end - and they are doing okay financially with those new console contracts.
So: two and a half years ago I got the Q9450 with an 8800GTX attached, which opened up World at War for me, and a few other great titles - World in Conflict, and even Rage at 60Hz, thanks to Carmack's adjustment magic. Then these two 7950s of 8 months ago took me through all the Crysis games, and Sleeping Dogs, and lately Far Cry 3. So maybe I'm ready to go back and try some Nvidia products again, like Kevin, whose 260 is not as hosed as he thought it was. :P
That Gigabyte board with the i7 on it is a triple-CF or triple-SLI board.
On the MSI GTX680 Lightning, guys are saying that for triple SLI you might have some trouble slapping them together that closely, and water blocks for the video cards might be the only practical way to do it. But I noticed tonight, on the other side of my trailer/office, that my Spedo case has two water hose inlets in the back, and I guess the radiator is meant to mount on top, above the 200mm ceiling exhaust. As long as I used all good parts, no plastic, water for the video cards could work, I suppose.
I ALMOST BOUGHT TWO CARDS TONIGHT THAT I CAN'T RUN TOGETHER YET
So does it sound like I might be seriously leaning toward those Lightning 680s? I very nearly ordered two of them tonight! Oh, but wait, jeeesus - my present P5E is not SLI-certified. Glad I held off - I would have had no way of testing them together. Well, I could have tested them individually, though.
I still don't like "only" 2 gigs of memory per card. But ... they seem to perform pretty well anyway. Still ... 2 gigs? Maybe one dongle is not so bad.
Tiger wants $680 for the 680 Lightning, which is only $500 at the egg, but Tiger wants only $500 for the 7970 Lightning, which the egg doesn't carry anymore.
The dongle is only $100. It's an active device, powered from the USB port. Most of the time people don't have problems with it - but sometimes they do. Damn MSI anyway for not putting at least one dual-link DVI port on their 7970! That pushes me toward Nvidia. The adapter is called the Accell UltraAV B987B-002B DisplayPort/DVI-D Dual Link Adapter.
Do I need 3 gigs? The Lightning GTX680 appears to be a major overclocker, and when overclocked it seems to more than hold its own at 2560x1600 versus the GHz 7970 models. After evaluating Gigabyte, XFX, and PowerColor, and returning 3 video cards before settling on these two great HIS cards, I am a little wary of buying ordinary cards - that was a lot of testing, and I wound up with some good cards, thank heavens, but the ones I returned gave me overheating, artifacts, and sparkling.
So now I'm thinking - since I want stuff that will last 3 years as games get more and more advanced - that I need some solid cards that will really overclock, which is one way to keep improving the system besides adding a third card. Adding a third card AND overclocking, and maybe using water to keep the cards cool - that might carry me along for quite some time, and help me avoid micro-stutter while doing it.
So I just really like the Lightning series: the military-class specs, the intense engineering, the extra heavy-duty components, the power reactor.
What does anybody think about that, the nvidia with no dongle needed, the 2 gigs not 3, the green guys vs AMD?
Rich
EDIT Why didn't nobody say nothin!!?? I don't have to give up 3 gigs - there's a bunch of 680s out with 4 gigs!! Yahoooooo!
This message has been edited since posting. Last time this message was edited on 4. March 2013 @ 05:45
|
AfterDawn Addict
15 product reviews
|
4. March 2013 @ 13:53 |
|
As far as PSUs go, Corsair, Seasonic, and Antec are all as solid as ever. Highly enjoying my Corsair 750HX. Very quiet and so far quite stable.
As for GPUs, it seems a lot of different things are driving your thoughts. I wish I could offer a solid piece of advice. I think though, after 3GB most games aren't going to need more video memory. AMD's offerings, including your 7950s, remain quite valid. I do agree though that the latest Nvidia offerings are very competitive.
I do try to play a wide variety of games. Driving, flying/sims, strategy, FPS, RPG. Some require simulator gear, at least a basic flight stick with rudder controls; others require the trusty old Xbox 360 pad. Most of my gaming remains firmly keyboard and mouse, though.
I am a large fan of simulators and sim-style flight games. Space, WW2, modern, you name it.
Freespace 1 + 2
FreeLancer
Independence War
X2, X3, X Rebirth
IL-2 + Expansions, Wings of Prey, Cliffs of Dover, War Thunder
Lock-On Modern Air Combat, Lock-On Flaming Cliffs
MechWarrior 4, MW4 Black Knight, MW4 Mercenaries (fantastic giant robot simulator)
Star Wars X-Wing, TIE Fighter, X-Wing Alliance (fantastic Star Wars flight sim)
Those are just off the top of my head, games I have put over 100 hours into using solely my flight stick. The stick itself, a very well-used Saitek X45, has over 1000 hours on it.
Sadly, I do have some trouble using the most realistic flight physics in the IL-2 games. I can fly a plane just fine, but throw in gaining speed and altitude on an opponent, banking and high-speed turns, trick maneuvers like cutting engines and flopping over, etc., and I simply have a hard time keeping the plane under control.
I do use 90% of the realistic physics but I generally disable "Turn Stalling" if I plan to dogfight. It still allows a regular stall, like climbing too fast and losing engine power, falling into an uncontrollable twisting dive. But it prevents your plane from twisting itself into a stall in turns and other high speed maneuvers. That's normally something a pilot spends a lot of time compensating for.
I can manage it just fine when cruising around, but not with 3 fighters on my tail while climbing up a vertical cliff, going 400MPH, on fire, with one engine starting to die and only half of my hydraulics working. That kind of sim is for a greater man than I. I already have a hard enough time managing the engines, super charger, fuel/air mixture depending on altitude, rudder and aileron trim, engine cooling, emergency fire control systems, several different weapons systems, communications, you name it.
The trade-off when I play in sim mode is that the players who do all that detailed plane management get a good payoff in handling capability. I lose out a bit.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
This message has been edited since posting. Last time this message was edited on 4. March 2013 @ 15:24
|
ddp
Moderator
|
4. March 2013 @ 14:52 |
|
harvardguy, I still read the threads when there is an update.
|
AfterDawn Addict
4 product reviews
|
4. March 2013 @ 15:19 |
|
I haven't had the chance to write a response to the posts in this thread as of yet (I got halfway through responding to the first post in half an hour, then came back to find three more similarly lengthy posts!) but what I will say is that if you're using multiple displays, either use Nvidia or have all DisplayPort-compatible monitors. With multiple-display setups, you should just assume DisplayPort->DVI dongles don't exist - they have never been compatible with Eyefinity and therefore do not work!
Of course the flipside of this is that AMD cards handle the higher resolution better than the Geforces, even when considering the 3GB vs. 4GB on some GTX680s.
This message has been edited since posting. Last time this message was edited on 4. March 2013 @ 15:19
|
AfterDawn Addict
7 product reviews
|
4. March 2013 @ 15:20 |
|
Ahh, you don't wanna read all of that? :p
To delete, or not to delete. THAT is the question!
|
AfterDawn Addict
4 product reviews
|
4. March 2013 @ 15:32 |
|
I will, but that saves people from wondering why I haven't responded, while I get a chance to read it all!
|
AfterDawn Addict
15 product reviews
|
4. March 2013 @ 16:32 |
|
Sad to say, I often skip large tracts of conversation if they don't immediately appeal to me. Not that I'm ignoring anyone - more that I'm looking for things I can give somewhat intelligent input on.
|
harvardguy
Member
|
4. March 2013 @ 16:50 |
|
Awwww - I forgot to check before hitting reply - hey Kevin and Sam, you don't have to go back and re-read any of my posts; it's summarized here. I don't want multiple displays - the dongle was just to work around the lack of a dual-link DVI port on the MSI R7970 Lightning - but I am not going to do that anyway.
-----------------------------------------------
Jeff, you are a connoisseur! I just want to participate in one small part of your giant world - the first person shooter segment.
You are right - a lot of elements driving my thinking regarding video cards.
I now feel that for me, the massive graphic requirement of 4 megapixel 30" gaming means I made a mistake in getting the 7950s. You were right all along of course.
I don't entirely regret it, because that purchase - based initially on lower power usage and less heat buildup than the 7970 in the midtower case - forced me to finally pull that brand new Spedo out of the garage "warehouse" from 4 years prior. But essentially, with Sleeping Dogs and Far Cry 3, I have found that the 7950s are holding me back for 30" gaming. 2560x1600 isn't a sweet spot - it's not as bad, for heaven's sake, as Eyefinity, I don't quite need to be surrounded by enemies on all sides, lol - but it is definitely a demanding resolution that doesn't forgive many compromises.
I have found my beloved HIS IceQ cards to be rock stable at 975 core, 1087 VDDC. Never a hiccup, 8 to 12 hours at a time. Fairly cool - I don't like them to ever even hit 80.

Lately in Heaven testing, letting it run and watching temps, I have gotten 1025 stable at 1137 VDDC. Above that voltage: more heat, and less stable - that surprised me. I thought more voltage meant more stability, at a heat penalty. Apparently not true, and it wasn't heat that suddenly turned off the display. Well, maybe it was, but it wasn't GPU core heat. Below that: less voltage, cooler, but less stable. So that seems to be a sweet spot, and unless it is really hot here in the trailer, they still run cool at that setting.

But I have found that no setting will run Heaven at 1050 for more than an hour, no matter how cold the night was as I sat bundled up here in the trailer this past winter. The lower core speed, 1025, has not been exhaustively tested, but several hours of Heaven on several occasions, plus testing in Far Cry 3, showed it stable, with good temps well below 80. So in my mind, my cards max out at about 1000, a nice even number.
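All that sweet-spot hunting boils down to a simple rule: among the clock/voltage combos that survived Heaven, take the highest clock that stayed stable and under 80C. A toy sketch in Python - the result table is hypothetical, standing in for my hand-written notes, not logged data:

```python
# Each entry: (core MHz, VDDC mV, stable?, peak temp C) from manual
# Heaven runs. The values below illustrate my notes, not real logs.
results = [
    (975, 1087, True, 74),
    (1025, 1137, True, 78),
    (1050, 1137, False, 79),   # crashed after about an hour
    (1025, 1180, False, 82),   # more voltage: hotter AND less stable
]

TEMP_LIMIT = 80  # I don't like the cards to ever even hit 80

def best_stable(results, temp_limit=TEMP_LIMIT):
    """Highest core clock that passed testing and stayed under the cap."""
    ok = [r for r in results if r[2] and r[3] < temp_limit]
    return max(ok, key=lambda r: r[0]) if ok else None

print(best_stable(results))
```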
The 975 core clock is supposed to match the 925 on the 7970 - so I guess the 7970 is about 5% faster at the same clocks, with 12% more stream processors, about 250 more. I just saw one review of the Gigabyte 7970 that comes clocked at 1100, where a 2560 gamer said he got it stable at 1250. If that held for me, that would be roughly 1.05 x 1.25, close to a 30% performance gain - so 30 fps goes to about 40 fps. But his Newegg review wasn't talking CF. And I personally sent my similar multi-fan Gigabyte 7950 back - though that was in the smaller case, and all the heat dumped inside the case did not help things.
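For what it's worth, that back-of-the-envelope math looks like this in Python. Every number here is an assumption - the 5% per-clock advantage is my guess from the spec sheets, the 1250MHz overclock is one reviewer's claim, and the ~1000MHz ceiling is just my own cards:

```python
# Rough scaling estimate: a 7970 at 1250MHz vs my 7950s at ~1000MHz.
# Assumes performance scales linearly with core clock, which is
# optimistic - memory bandwidth usually caps real-world gains.

arch_advantage = 1.05      # 7970 vs 7950 at equal clocks (assumed)
clock_ratio = 1250 / 1000  # claimed stable OC vs my observed ceiling

speedup = arch_advantage * clock_ratio
print(f"Estimated speedup: {speedup:.2f}x")
print(f"30 fps would become ~{30 * speedup:.0f} fps")
```

So "close to 30%" checks out, with the usual caveat that real games rarely scale perfectly with clocks.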
The MSI Lightning R7970 comes at a 1070 core, and lots of people have gotten it to "only" 1300 - some of them disappointed, lol. Anyway, it seems like a strong overclocker - but there are some negative reviews of the $95 active dongle, so that seems iffy. It really looks hit-and-miss with that adapter - one more item in the mix that might not work.
The iffy adapter, the 8000 cancellation, the console contracts, and comments in this thread about Nvidia's offerings have sent me over to the GTX680 side - to the same series I was looking at, the Lightning.
There have been some interesting things I didn't know about, like TXAA - which you guys are probably familiar with - which supposedly provides the effect of 4x or 8x AA with only a 2x performance hit. If true, that could be a big help. One Newegg reviewer talked about liking it.
I have the screens - quite a few, in fact - which I might post one day, showing 4x AA is virtually the same as 8x in Far Cry 3. Not so with 2x - the jaggies pop out at you from time to time and are distracting. So in my mind 2x is unplayable - and like I said before, being able to run only 2xAA on the 8800 GTX caused me to put Far Cry 2 away for two years, until this past fall with the 7950s, when I discovered the beautiful river boating.
But back to the Lightning 680: I am bothered by only 2 gigs. I have a hard time mentally embracing the idea of reducing the VRAM - that sounds like compromise - and I have an uncompromising screen resolution if, like yourself, I am unwilling to sacrifice graphical fidelity.
That's why I seem to have reached a decision point: yesterday, a lot of troubling issues appeared to be resolved by the "discovery" of multiple reasonably-priced 4-gig 680 solutions. The EVGA has a turbine blower that exhausts the heat out the back - probably loudly, lol. They clock at just under 1200. Their video shows 2 of them side by side - the turbine intake tilts in slightly so it can still draw air even with another card right above it.
In my choice of card I want the ability to slip a third card in there. While scaling is not so great with the 680s, I guess, you probably get at least 40% scaling on the third card. I am pretty sure the Nehalem 1366 Gigabyte board that I am going to inherit supports tri-CF AND tri-SLI. I definitely saw with my own two eyes the three large slots (if not 4), and I was pleasantly surprised, making a mental note about it. I doubt it is CF-only without SLI, as there didn't seem to be that many Gigabyte 1366 boards around at that time - the beginning of 2010. The one I think it is, is the GA-X58A-UD3R. I guess there are 4 16x slots in total, supporting tri-CF or tri-SLI.
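To put that third-card guess in numbers - a hedged sketch, assuming roughly 90% scaling on the second card (a typical review figure for two-way SLI) and the 40% guess above for the third; none of these are measured:

```python
# Multi-GPU throughput estimate. Each element of `scaling` is the
# fraction of one card's performance the next card adds. The 0.9 and
# 0.4 figures are assumptions, not benchmarks.

def total_throughput(base_fps, scaling):
    """Estimated fps with extra cards, each adding a diminishing share."""
    return base_fps * (1 + sum(scaling))

base = 40  # hypothetical single-card fps at 2560x1600
print(f"2-way SLI: ~{total_throughput(base, [0.9]):.0f} fps")
print(f"3-way SLI: ~{total_throughput(base, [0.9, 0.4]):.0f} fps")
```

On those assumptions a third card takes a hypothetical 40 fps single-card game from ~76 fps to ~92 fps - worthwhile, but clearly diminishing returns.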
So when I discovered the 4-gig cards yesterday, that calmed me down and showed me the light at the end of the research tunnel - "Ah hah, that's what will work for me" - pulling me away from the 2-gig Lightning. But MSI has a similar card, the 4-gig Twin Frozr, which might be just about the same card, but with no "reactor."
Today is supposed to be massive research day: reading benchmarks, etc., all the stuff you guys are expert on. I am definitely out of the knowledge loop, especially on what Nvidia offers. As Sam points out, with the Nehalem I will still be several generations behind, and with a hot, energy-hogging solution. For another $600-800 I could get a fresh i5 or i7, with mobo, CPU, and memory. But the Nehalem, fully populated at 12 gigs, is free. That savings buys me the third video card. So... if that goes sideways, then I'll do the fresh build.
Being forced to move into that Spedo case, like all of you guys with your HAFs, was a good forced move for me - it's had a major impact on my viewpoint. Now I'm thinking long-term, VERY powerful graphics solutions. I have the foundation - the case - to support that. For the time being I'll stay on air, but who knows what tri-SLI will require - I see that EVGA provides some nice water blocks.
- - - - - - - - - - - - - - -
DDP!!!
haha - hi ddp, just pulling your chain, lol
nice to know you're around! You can't believe what I'm doing with W98 these days - still locked and loaded with my DBM2 (DOS) mail system.
every time I mess with it I think of that customer of yours who needed to run that one game where only W98 would work - it's very streamlined. All the beautiful graphics I used to create were done on that OS - XP would run most of them okay, but in a few cases the colors and/or shapes were slightly off, and I had to figure out how to do it on the "original W98": reboot, get a fresh 98 with no memory leakage, test Word print preview to make sure part of the image wasn't missing, then print out the stock expired-listing stationery with the color house picture in the upper right corner.
I have been almost exclusively XP the last 5 years, but I am now reviving the W98 and refreshing my memory on all the old procedures that worked really well before real estate entered "strange times" 7 years ago.
Rich
This message has been edited since posting. Last time this message was edited on 4. March 2013 @ 16:57
|
|