|
The Official Graphics Card and PC gaming Thread
|
|
bradford86
Newbie
|
14. January 2011 @ 08:51 |
Link to this message
|
Originally posted by sammorris: You only need one space between the slots to run SLI. The cards will be next to each other, so they'll run hotter and noisier, but they'll still be within spec, and will still work fine.
This is the problem with 3-slot boards which is why I've never advocated the use of one. It's a lot easier to just use two slots, as far as cooling and noise goes.
X58 is the chipset I was referring to earlier, with the first generation i7s. It is for socket LGA1366 and will not fit the new LGA1155 processors.
If you need all 5 monitors as a single display space for something, you're better off with AMD's eyefinity, as it works with 5 displays as long as you have the outputs. There are 5 outputs on a Radeon HD6950 or HD6970, which can be used to run 5 displays at once [only up to 1920x1200 resolution max, but that's not an issue here]. You will need some active displayport to DVI dongles unless you buy two displayport monitors though.
Cheapest 1080p displayport monitor I can find is this one:
http://www.newegg.com/Product/Product.aspx?Item=N82E16824260019
which does still have the portrait-tilt.
However, if you run some monitors portrait and others landscape, you aren't going to be able to run a single display space, as you won't have a rectangular display area; it'll be taller on the outsides, so H-shaped. If this is how you want the monitors set up, you will have to abandon the idea of a single 5-monitor resolution and simply stretch the program across all the monitors manually, wasting the extra space above and below on the outside monitors. Does that make sense?
i'm aware that the space will end up looking like an H. i have an H right now. it kind of looks like starwars
here's a printscreen of what i've got, note that windows blacks out the dead space:
http://i.imgur.com/UKOgj.jpg
i looked at the motherboard you recommended again, you're right, it should all fit.
it does say this:
1 x PCI Express x16 slot, running at x4 (PCIEX4)
* The PCIEX4 slot shares bandwidth with the PCIEX1_1 and PCIEX1_2 slots. When the PCIEX1_1 slot or the PCIEX1_2 slot is populated with an expansion card, the PCIEX4 slot will operate at up to x1 mode.
what is x1 like?
right now i have:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814187083R - sparkle card
http://www.newegg.com/Product/Product.aspx?Item=N82E16813121385 - mother board
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125311 - graphics card
and the sparkle card monitor lags.. pisses me off. the sparkle card is a PCI card
based on my brief analysis referencing PCI and PCIe x1 stuff the bandwidth to this auxiliary card is going to be around 500MB/s
http://en.wikipedia.org/wiki/PCI_Express
My current PCI slot gets 533 MB/s MAX, as I operate at 64-bit
http://en.wikipedia.org/wiki/Conventional_PCI
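here's the back-of-envelope math i'm working from, by the way (just a rough sketch plugging the published spec clocks and widths into python, nothing measured from my actual board):

# peak theoretical bandwidth = clock x bus width (PCIe figures are per lane, per direction)
def parallel_bus_mb_per_s(clock_mhz, width_bits):
    return clock_mhz * (width_bits / 8)  # MHz x bytes per transfer = MB/s

print("PCI 32-bit / 33MHz :", round(parallel_bus_mb_per_s(33.33, 32)))  # ~133 MB/s, the common desktop slot
print("PCI 64-bit / 66MHz :", round(parallel_bus_mb_per_s(66.66, 64)))  # ~533 MB/s, the best case quoted above
print("PCIe 1.0 x1        :", 250)  # serial link, roughly 250 MB/s each direction
print("PCIe 2.0 x1        :", 500)  # roughly 500 MB/s each direction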
I have a feeling that it's going to slow down my system...
right now, when i put video on that monitor or try to type into Word, or Outlook for an email, it lags and makes it incredibly difficult to deal with... so i just use it to put up tickers only, because i don't need to enter anything, just display stuff, and it's ok if it lags a little then.
just found this one:
http://www.asus.com/product.aspx?P_ID=wurRaDZ8lo4Ckukj&content=specifications
http://www.newegg.com/Product/Product.aspx?Item=N82E16813131614
what is your take on that motherboard? then all i need is an ATX case I think? I dunno.
I make big things happen fast.
Glen
|
|
|
|
AfterDawn Addict
4 product reviews
|
14. January 2011 @ 09:07 |
Link to this message
|
Here's the layout of the 5 monitors you wish to use. The picture at the top is obviously of them all landscape, and the bottom is of the outer monitors portrait.
If you have a program running on all 5 monitors at once, the best you can hope for is the black area in the bottom picture. You can't max the image on the outer monitors because it would mean you lose half of the picture above and below the centre three monitors.
As far as connections go, it runs like this:
This means you need the first GTX580 to run the yellow and green monitors [DVI], the second GTX580 to run the blue monitor [DVI], and a third card to run the red and purple monitors.
Displayport is not needed, nor is it provided with the nvidia setup.
As far as AMD are concerned, despite the HD6970 having five connectors on it, two of those connectors are derived from the same output on the card, so it is actually only a 4-display card. Crossfire requires all the outputs to come from the master card, so you can't plug anything into the second card, meaning you will also need three cards to pull this off.
One way you could do it is the following:
22"/23"
Red & Purple Monitors - Dell U2211H with displayport ($250 each)
Yellow, blue & green monitors - LG W2363D 3D ($370 each)
or Acer GD235HZbid
24"
Red & Purple Monitors - Dell U2410 with displayport ($550 each, these are pro-grade monitors)
Yellow, blue & green monitors - Zalman ZM-M240W 3D ($560 each, from provantage.com) [These are not 3D Vision monitors, they work using nvidia's stereoscopy driver. The Acer monitors are the only fully 3D vision certified 24" displays at this time]
or Acer GD235HZbid
Graphics cards
1: http://www.newegg.com/Product/Product.aspx?Item=N82E16814162068
2: http://www.newegg.com/Product/Product.aspx?Item=N82E16814162068
3: http://www.newegg.com/Product/Product.aspx?Item=N82E16814131358
[Running nvidia and ATI graphics at the same time is not going to be easy, but it should be possible. If necessary, just use a separate operating system for the ATI hardware]
Connections
Red: Displayport1 from HD5770E5
Yellow: Displayport2 -> HDMI from HD5770E5 AND DVI1 from GTX580-1 [if using LG monitor, you may need to swap DVI cables over each time, as the HDMI ports are on the side. Not necessary on the Acer due to the HDMI ports being on the rear]
Green: Displayport3 -> HDMI from HD5770E5 AND DVI2 from GTX580-1
Blue: Displayport4 -> HDMI from HD5770E5 AND DVI1 from GTX580-2
Purple: Displayport5 from HD5770E5
re: Second Post
That's another X58 board. See before!
|
bradford86
Newbie
|
14. January 2011 @ 10:04 |
Link to this message
|
I make big things happen fast.
Glen
|
AfterDawn Addict
4 product reviews
|
14. January 2011 @ 10:23 |
Link to this message
|
|
bradford86
Newbie
|
14. January 2011 @ 11:22 |
Link to this message
|
I make big things happen fast.
Glen
|
AfterDawn Addict
4 product reviews
|
14. January 2011 @ 11:41 |
Link to this message
|
The P67A-UD7 makes use of a PCI Express bandwidth multiplier, the nForce 200. These have been a source of unreliability in the past.
What's wrong with the third card running at 1x if you're only going to use the desktop with it?
I'm no big fan of Acer monitors - they're very low quality - but it doesn't look like you really have a choice for the centre displays to be anything but Acer if you want 24", so you may as well make all 5 Acers I suppose.
Now that we're using three cards, you're going to need a bigger PSU than that 850. To be on the safe side I'd probably say this:
http://www.newegg.com/Product/Product.aspx?Item=N82E16817139014
or at the very least, this:
http://www.newegg.com/Product/Product.aspx?Item=N82E16817139007
[This one's very loud. The AX1200 is quite a lot quieter]
I assume you've considered, but are ignoring, the warning about the video memory on the 570 versus the 580. GTX580s are a fair bit more expensive, but only they can really run the majority of games out there at 5760x1080 (and even then not quite all of them at max settings). The GTX570s sometimes have to be cut down to 5040x1050, even without any anti-aliasing, to be playable, because they only have 1.25GB of video memory each.
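To put a rough number on the memory difference (purely a back-of-the-envelope sketch of the render targets alone; real usage is dominated by textures and varies hugely per game, so treat the figures as illustrative):

# Approximate render-target footprint: colour + depth buffers, each scaled by the MSAA sample count
def render_targets_mb(width, height, msaa_samples=1, bytes_per_pixel=4):
    colour = width * height * bytes_per_pixel * msaa_samples
    depth = width * height * 4 * msaa_samples
    return (colour + depth) / (1024 ** 2)

for w, h in [(1920, 1080), (5040, 1050), (5760, 1080)]:
    print(f"{w}x{h} with 4xAA: ~{render_targets_mb(w, h, 4):.0f} MB")

# 5760x1080 with 4xAA is roughly 190MB of render targets before a single texture, shadow map
# or post-processing buffer is loaded - which is why 1.25GB runs out well before 1.5GB does.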
You'll want a high-end case, given the enormous amount of heat this system will be putting out.
High end cases:
http://www.newegg.com/Product/Product.aspx?Item=N82E16811146067 (Also available in red or black)
http://www.newegg.com/Product/Product.aspx?Item=N82E16811119160 (Also available in blue LED form)
http://www.newegg.com/Product/Product.aspx?Item=N82E16811119225
http://www.newegg.com/Product/Product.aspx?Item=N82E16811129100
http://www.newegg.com/Product/Product.aspx?Item=N82E16811146062 (This one's a bit smaller so will be quite cramped)
You're missing a CD/DVD drive - I assume you have one from an old build you're willing to use?
This message has been edited since posting. Last time this message was edited on 14. January 2011 @ 11:42
|
bradford86
Newbie
|
14. January 2011 @ 13:10 |
Link to this message
|
here is my greatest fear:
right now on my 3 monitor setup, my rightmost monitor lags.. it's SOOO annoying.. i want to AVOID that and I am willing to PAY for it... lol here's the setup i have now: the PCI sparkle card is like a splinter... it's just aggravating and i'd do anything to get rid of the lag.. like, if i'm dragging an open notepad document around on the sparkle monitor, the rest of my computer lags.
back to the 1x board:
it does say this:
1 x PCI Express x16 slot, running at x4 (PCIEX4)
* The PCIEX4 slot shares bandwidth with the PCIEX1_1 and PCIEX1_2 slots. When the PCIEX1_1 slot or the PCIEX1_2 slot is populated with an expansion card, the PCIEX4 slot will operate at up to x1 mode.
what is x1 like?
based on my brief analysis referencing PCI and PCIe x1 stuff the bandwidth to this auxiliary card is going to be around 500MB/s
http://en.wikipedia.org/wiki/PCI_Express
My current PCI slot gets 533 MB/s MAX, as I operate at 64-bit
http://en.wikipedia.org/wiki/Conventional_PCI
(I actually have no idea what I operate at, or what the bandwidth is or anything. I just know that when I try to scroll websites, it lags.)
trying to figure out whether pci-e x1 is going to lag or not.. not much research on the topic.
I make big things happen fast.
Glen
|
AfterDawn Addict
4 product reviews
|
14. January 2011 @ 13:16 |
Link to this message
|
Your desktop shouldn't lag even if your card was plugged into the PCI bus, let alone PCIe x1. (Last I checked, high-speed 64-bit PCI was rare.) That sounds like a driver bug to me, i.e. software rendering. I don't think the sparkle card is even operating properly.
|
bradford86
Newbie
|
14. January 2011 @ 13:42 |
Link to this message
|
I make big things happen fast.
Glen
|
AfterDawn Addict
4 product reviews
|
14. January 2011 @ 14:00 |
Link to this message
|
No point testing if there's no driver installed for it, which is what I suspect is the case.
|
bradford86
Newbie
|
14. January 2011 @ 14:13 |
Link to this message
|
Originally posted by sammorris: No point testing if there's no driver installed for it, which is what I suspect is the case.
nah, the drivers for the video card are working, it detects nvidia geforce 9500 GT, which is what the sparkle is.
but for example, i have the monitor in portrait mode... and when i switch between tabs on a website or do anything on the monitor, the entire monitor sweeps the next picture onto it from right to left (it would be top to bottom if i pivoted it back to landscape).
but yeah, so far i've confirmed the video card was 32-bit.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814187083R
32 stream processors...
then the question is the speed of the PCI bus: 133 MB/s (33MHz) or 266 MB/s (66MHz)
|
bradford86
Newbie
|
14. January 2011 @ 14:21 |
Link to this message
|
so i'm 133,
133 MB/s is the capacity at these settings. and that's one monitor.
so if i switch to x1 pci express that's 500 MB/s for 2 monitors, haha.. going to be tough
what's your take? think i should just do the PCI express 2.0 x16 slots at 1x?
i honestly don't know what added bandwidth does for video cards... it is weird to me, though, that i can't easily scroll a website that's mostly white space and black text on that monitor.
I make big things happen fast.
Glen
|
AfterDawn Addict
4 product reviews
|
14. January 2011 @ 14:26 |
Link to this message
|
I'm still almost certain the bandwidth is not the issue. You'll be fine with a third slot running at 1x.
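For a sense of scale (a quick back-of-the-envelope sketch, assuming the framebuffer lives in the card's own memory so the bus only carries drawing commands and updated regions, which is how any properly working driver behaves):

# Worst case: the entire 1920x1200 desktop pushed across the bus 60 times a second
width, height, bytes_per_pixel, fps = 1920, 1200, 4, 60
full_redraw_mb_s = width * height * bytes_per_pixel * fps / 1e6
print(f"full-screen redraw stream: ~{full_redraw_mb_s:.0f} MB/s")  # ~553 MB/s

# Ordinary 2D desktop work (scrolling a page, typing in Word) only ships the changed regions
# plus drawing commands - a small fraction of that figure - so even a PCIe 2.0 x1 link
# (~500 MB/s) has headroom to spare. If the desktop still stutters, the slot isn't the culprit.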
|
bradford86
Newbie
|
14. January 2011 @ 14:50 |
Link to this message
|
I make big things happen fast.
Glen
|
AfterDawn Addict
4 product reviews
|
14. January 2011 @ 15:15 |
Link to this message
|
You still need the extra card to drive five displays:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814131358
It's this that's going in the "1x" slot.
You won't be able to use a joined desktop space otherwise; you will have to configure programs to use all five displays yourself.
As for the main geforces, it really depends on how much you're willing to spend on performance. The 1GB on GTX460s (and 560s when they come out) is woefully inadequate to run games at high settings at 5760x1080. 1.25GB on the GTX470 and 570 is passable but not ideal. 1.5GB on the GTX480 and GTX580 is not perfect, but it's better.
As an example (none of these tests incorporate 3D which will decrease performance further, but otherwise they are run at maximum settings)
F1 2010 - very smooth at 5760x1200 with 4xAA on HD6970s. Acceptable on GTX580 SLI. Unplayable on GTX570s unless at 5040x1050 with no AA, at which point performance is barely adequate for a racer. To run smoothly on 570s, a low resolution like 3840x800 may be needed.
Civilization V - Acceptable at 5760x1200 on HD6970s. GTX580 SLI allows 4xAA to be used as well. GTX570 SLI is fine at 5760x1200 but without AA.
Metro 2033 - Mediocre performance at 5760x1200 on HD6970s, only using AAA. GTX580s are marginally faster, but would only be smooth-ish with AA off. GTX570s achieve borderline acceptable performance even with AA off. With proper 4x MSAA like other games, GTX580s are effectively unplayable, whereas HD6970s are just playable, barely. Advanced Depth of field, one of the game's extra features, is disabled as it makes the game too slow to be fun on almost anything.
Bad Company 2 - Barely acceptable performance at 5760x1200 on HD6970s with 4xAA. GTX570s fare about the same but with fewer (though worse) dips when the action gets heavy. GTX580s are able to maintain solidly playable frame rates throughout, even with much more anti-aliasing applied. GTX570s are simply unplayable with more than 4xAA due to memory restrictions.
Mafia II - Difficult at 5760x1200 for Radeon HD6970s even without PhysX, which they can't use. Relatively smooth on GTX570s with medium PhysX. GTX580s allow the use of High PhysX.
|
AfterDawn Addict
4 product reviews
|
16. January 2011 @ 10:21 |
Link to this message
|
Ran Bad Company 2 and Crysis/Crysis Warhead on the 6970s.
Bad Company 2: Laguna Presa
2560x1600 8x MSAA HBAO - Minimum fps 41 Average fps 47 - Unpleasant
2560x1600 4x MSAA HBAO - Minimum fps 59 Average fps 68 - Smooth
Crysis: Reckoning [Inside the ship, haven't advanced the save to the epic bit yet]
2560x1600 8x MSAA Very High - Minimum fps 63 Average fps 70 - Smooth-ish
Crysis Warhead: Call me Ishmael [First section through the river]
2560x1600 8x MSAA Enthusiast - Minimum fps 18, typical Minimum fps 36 Average fps 42
2560x1600 4x MSAA Enthusiast - typical Minimum fps 43 Average fps 48. Not tested the EMP yet. Will do so shortly.
Either way, while the Crysis tests will get worse in different sections, this is good playability with 4x MSAA at otherwise maximum detail, and I'm happy to settle for 4xAA instead of 8; the difference is minimal.
|
AfterDawn Addict
15 product reviews
|
16. January 2011 @ 13:51 |
Link to this message
|
Good now PLAY BOTH ALL THE WAY THROUGH lol. Seriously the entire game is different when you go through with maxed settings.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
|
AfterDawn Addict
4 product reviews
|
17. January 2011 @ 11:50 |
Link to this message
|
Made a slightly more colourful and user-friendly version of a required performance chart. This time specifically focusing on 60fps for minimums/averages, for the behemoth Metro 2033.
(11,10,9=DirectX Version. H=High, V=Very High, T=Tessellation, D=Depth of Field)
Will post more if people are interested.
|
harvrdguy
Senior Member
|
18. January 2011 @ 21:15 |
Link to this message
|
For such a demanding game - is Metro 2033 any good? Jeff said a few words about it, but it didn't shine in his review the way that Dragon Rising did. And since it's so demanding, I guess I'll never experience it unless I visit somebody with an xbox.
Wow, that was some kind of amazing project you and Bradford were collaborating on. Five monitors - good for him - pushing the state of the art in his field.
I didn't follow too closely since that is way more than I ever want to do - I pretty much lose myself in the 30" of monitor real estate 6-10" in front of my nose - I can't imagine being more immersed (other than force feedback, which I should check into, lol.)
And with that level of immersion, that's another reason I don't know if I would want to "enjoy" the Metro 2033 experience, of wandering around in a dark subway system 90% of the time waiting to get pounced on!
LOL
But Crysis and Warhead - that's a different story:
So you're finding that 4xaa yields virtually just as good a picture as 8xaa on Warhead - and your minimum 43, average fps 48 is providing: "good playability." Excellent!
Jeff wants you to go through now and finish the whole thing - and get the "true enthusiast experience."
Sam, I took note that in your frag soc, one of the Sandy Bridge guys with only one 6970 pushed past you on 3DMark06 all the way to 34000. Do you think your fps would improve at all with a Sandy Bridge platform for your 12 gigs and dual 6970s, or do you feel you are still GPU bound? (And by the way, how are you utilizing your 40gig SSD - did you put Warhead on it - would doing so help out?)
Rich
|
AfterDawn Addict
15 product reviews
|
19. January 2011 @ 01:11 |
Link to this message
|
Metro is a great game and totally worth the play. Also the PC version is superior in every way. I doubt the Xbox version looks anywhere near as good, if not downright ugly in comparison. The maxed DX11 graphics are incredible, and really must be seen to be believed. A good chunk of the graphics are pretty bland, but it's extremely sharp-looking bland XD Of note is that the lighting engine is one of the most advanced I've seen yet, and is a hair better than Crysis.
Overall it really is in Crysis territory for performance and graphics, though I find Metro runs a hair better, probably owing to better Crossfire scaling. I know Sam might argue, but sometimes my maxed settings aren't always "maxed" :P In Metro particularly I don't enable Advanced Depth of Field. The game already has fantastic Depth of Field and the performance hit isn't worth a slightly prettier blur shader. As long as Advanced DoF is turned off the game runs quite smoothly, even leaving tessellation on (which does look pretty good IMO). I'd say 40-50 FPS in action, averaging 60+ running around.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
This message has been edited since posting. Last time this message was edited on 19. January 2011 @ 01:30
|
Red_Maw
Senior Member
|
19. January 2011 @ 02:56 |
Link to this message
|
Those Metro fps were with dual 5850s, right? So I'd guess us single-GPU people should continue putting off playing Metro 2033 for a few more years if we want to max it :(
This message has been edited since posting. Last time this message was edited on 19. January 2011 @ 02:56
|
AfterDawn Addict
7 product reviews
|
19. January 2011 @ 03:01 |
Link to this message
|
I'd like to play it. But I'm sure my GTX 260 wouldn't run it at levels that SHOULD be experienced ;) Besides, time seems to be an issue for me right now. As well as money :(
To delete, or not to delete. THAT is the question!
|
AfterDawn Addict
4 product reviews
|
19. January 2011 @ 06:51 |
Link to this message
|
I could put Warhead on the SSD, but honestly, with 12GB of RAM I don't really notice the sequential loading spikes. When they occur now they're so slight I don't think an SSD would make any difference. I have to agree with something Jeff said to me over messenger: the game seems to throw a fit if you run out of system RAM, claiming to use tens of gigs and generally being a bit unstable, but as long as you have enough RAM to start with (which in my case is turning out to be about 6GB, occasionally slightly over) then you seem to be fine. I've not once reached the 8GB-used level yet.
As far as CPU limitations go, I'd get more frames in CPU-bound stuff like CSS and TF2, but I really don't see Crysis frames going up. As CPU-demanding as Crysis is, a 4.1GHz i5 is at the point where, really, a faster CPU means higher synthetic benchmarks and nothing else.
The people I know who're upgrading to Sandy Bridge CPUs are one who pursues relatively cutting-edge hardware (and his previous i7 920 was a very poor overclocker, as they go), and people who still had Core 2 Quads and Phenom IIs (the unlocked dual-core kind, not the true Phenom kind) - to them, of course, it's a fairly noticeable upgrade.
Right now I sit with what is known as the 'flip' upgrade, whereas Sandy Bridge is the 'flop' upgrade (this is actually a recognised industry term): with each generation, a 'flip' occurs with a breakthrough new technology that adds a fairly considerable amount of performance, then the 'flop' comes a bit later, refining the architecture and adding a bit more performance.
As for Metro's depth of field, HardOCP summed it up when they basically went "we're not benchmarking depth of field. We'll cover it in a single graph to show what it does, but we're not enabling it in any of our 'maximum playable' benches. It's hugely demanding, and doesn't really do very much" (Not a quote, just a rough paraphrase)
The chart posted sums this up pretty well: the increase at 1920x1200 goes from two 6970s to FOUR, and from two 5850s to three 6950s. Geforces have a bit of an easier time, from two 580s to three 570s, and from two 460s to two 570s. Still hefty increases though.
At 2560x1600 it borders on the ridiculous, from four future 28nm GPUs, to four future 20nm GPUs, or in nvidia's case, two future 28nm GPUs, to three, possibly more. For an average of 60fps it's still an increase from three 6970s to three 20nm GPUs, two generational leaps, and from two GTX580s to three.
4x MSAA is also a fairly big hit, and as HardOCP also covered, it can decrease image quality. This was meant to be fixed pretty shortly after the game came out, but I never read anything definitively saying they had/hadn't done it.
Either way, given the ludicrous demand of the game, AAA is a pretty safe bet, as it is still some AA, and might allow the game to otherwise be maxed at 2560x1600 with four HD6970s - but maybe not, depending on which reference you use. Likewise, the best that nvidia can offer, three GTX580s, still may or may not be enough. (This is $1500 of GPUs in each case, with one of the details off and AA at the lowest setting where it's still on, and we're not sure it'll work!) Just going for an average of 60fps, however, allowing for big tanks in performance in places, we can get away with the (comparatively speaking) lightweight option of two HD6970s, or two GTX580s, maybe. The two 6970s aren't guaranteed, you might need three!
As far as running the game on a GTX260 goes, for nvidia owners there's no real extra performance to be had by cutting to lower DirectX levels, so you can stick with DX10 at least, but realistically you'll be looking at an average of 30fps and a minimum of 20 at best, at 1920x1200 with no AA or depth of field, though everything else maxed. Up to you whether that's worth it or not.
|
AfterDawn Addict
4 product reviews
|
19. January 2011 @ 09:22 |
Link to this message
|
Bit of a marathon one to compile, this.
As you see, Arma II has many configurable settings that have an appreciable impact on performance.
'VHAF' refers to the game's inbuilt Anisotropic filtering setting set to very high, rather than forcing it to 16x in the driver.
Vxxxx is the visibility distance.
T = Textures
Tr = Terrain
O = Objects
S = Shadows
PP = Postprocessing
Acronyms used for SLI/Crossfire should be obvious.
"But what is the TCF HDX 800R?"
This refers to what would be known as three of the HD 10870. Since it is very unlikely by the time this comes around that AMD would move to five figures, they'll likely be using a new name. Since we have no idea what that name would be, I made one up. HDX 800 is merely an extension of the move AMD made 7 years ago when they reached the 9800 series, moving to the X800. The letter R represents that it is the most powerful (single) card in the series; I did not use XT, as AMD are likely to have left that nomenclature far behind.
This, by the way is just the original, not the far more demanding expansion.
Quite the ridiculousness going on here: with a modest 4xAA, even when we reduce terrain, objects and shadows down to medium, running Operation Arrowhead at 2560x1600 (let alone eyefinity!) with a very high draw distance requires three GPUs from FIVE generations away!
Just to put that into perspective, five proper generation gaps behind us is the Radeon 9600/9800 series, or the Geforce FX series, back before PCI Express had even been invented.
This is so we can use three cards as well, not just a single one.
Suppose we:
Wanted one GPU instead of 3? -> Two more generation gaps
Wanted more AA, or supersampled? -> At least another generation gap
Wanted to run 7Mp eyefinity, or even 12Mp? -> At least two generation gaps, possibly three
Wanted to turn the rest of the details up to max? -> At least one generation gap, probably two
Wanted to run in 3D with the same fluid frame rate? -> Two more generation gaps
Add all these together and Arma II: Operation Arrowhead may not be maxing out until the PCI Express interface, or possibly even the ATX standard, is long dead and buried. In the politest sense, a lot of the older forum members may not still be alive to witness the coming of this.
This is very unlikely to occur in this decade, probably in the mid, or possibly late, 2020s. Scary stuff.
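(The rough arithmetic behind that guess, with my assumptions spelled out: the 'suppose we' gaps above don't all stack cleanly since some of them overlap, and a new GPU generation arrives roughly every 1.5 to 2 years:)

# Very rough timeline arithmetic - assumptions, not predictions
start_year = 2011
scenarios = [
    ("base case above (three cards, 4xAA, medium terrain/objects/shadows)", 5),
    ("single card, more AA, higher res, remaining details maxed", 9),  # my guess at how the extra gaps combine
]
for label, generations in scenarios:
    earliest = start_year + generations * 1.5
    latest = start_year + generations * 2
    print(f"{label}: ~{earliest:.0f} to {latest:.0f}")

# The base case lands around the turn of the decade; pile on even some of the extras and
# you're into the mid-to-late 2020s, which is where the estimate above comes from.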
I've long suspected it, but never really analysed it: Arma II: Operation Arrowhead truly is the new champion of hardware demand.
Oh, by the way, the view distance probably goes higher than 5161; that's just the highest I've seen tested yet, probably because it's the limit of 1.5GB per GPU.
This message has been edited since posting. Last time this message was edited on 19. January 2011 @ 10:56
|
|
|
|
Red_Maw
Senior Member
|
19. January 2011 @ 15:30 |
Link to this message
|
So in summary, in about another decade I should be able to play Arma II: Operation Arrowhead with satisfactory settings? lol
This message has been edited since posting. Last time this message was edited on 19. January 2011 @ 15:31
|
|