|
The Official Graphics Card and PC gaming Thread
|
|
AfterDawn Addict
4 product reviews
|
14. September 2013 @ 08:41 |
|
Originally posted by Estuansis: Good luck on your OC Rich. The 1090T wrestled with me all the way to 4GHz. Never had a more stubborn chip.
Apart from CPUs that can't do 4GHz at all, of course :D
I perhaps wasn't the most determined ever, but I had serious grief getting the 'expected results' out of the Core 2s in the X38/X48 days, because at the time I wasn't aware of just how much difference it made having both PCIe slots filled with GPUs. In the end I settled for values that gave good enough performance without pushing the boundaries of stability for that last couple of hundred MHz (3.24GHz for the Q6600 and 3.65GHz for the Q9550).
Still super impressed with my i5 750 - it will turn 4 in February and is still running the 54% overclock to 4116MHz I set on day one. I've had a couple of BSOD scares, but they seem to have disappeared after ditching the WHQL graphics driver for the current beta. At this speed, the current Haswell i5s at their stock clocks are barely more than 10% faster than my CPU. Sure, I could buy a Haswell system with a new board and CPU, but for that sort of gain I've literally no enthusiasm for it. If I'm going to need any more performance, it's going to be graphical, and even there, there's a lot of stuff I can still run at 2560x1600 with acceptable frame rates on the HD6970s.
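As a quick sanity check on those numbers, here's a minimal Python sketch (the 2.66GHz stock clock is assumed from Intel's spec for the i5 750, not stated above):
    # Overclock arithmetic for the i5 750 (stock clock assumed from Intel's spec).
    stock_ghz = 2.66
    oc_ghz = 4.116
    print(f"overclock: {oc_ghz / stock_ghz - 1:.0%}")  # ~55%, in line with the quoted 54%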
Considering the days when I used to perpetually be upgrading things here and there, dunno what's happened to me! :D - Should something come along that completely revolutionises performance in the GPU or CPU area, I'll probably buy it, but for now, really, I'm likely to be running this set of hardware until something goes wrong with it.
|
AfterDawn Addict
7 product reviews
|
14. September 2013 @ 13:15 |
|
To delete, or not to delete. THAT is the question!
|
harvardguy
Member
|
20. September 2013 @ 21:47 |
|
Jeff and Christa have been busy - faster cpu arrived, and is now overclocked!!
Originally posted by Jeff: Good luck on your OC Rich. The 1090T wrestled with me all the way to 4GHz. Never had a more stubborn chip.
I copied that to the clipboard, then advanced a page here in the forum, and noticed that Sam had already picked up on the quote. It's such a great Estuansis line - as you see, I couldn't help quoting it again. :)
That's great to hear, Jeff. That gives me added encouragement. I haven't started any reading or any further testing, but as I get ready to make this evga i7 my main system, a few things are going my way on that Lian Li.
I started out with zero molex connectors in the case - sata power connectors only, and two fans running on motherboard fan headers. There were two pci-e 6+2 connectors coming out of the harness with the motherboard power, powering the gtx285 card, but I needed power for my second 7950, which takes two 6-pin pci-e connectors. I had 3 unused psu modular 6-pin outlets, one black for accessories, and two red for pci-e power.
But no help from Antec.
There was nothing at their store, and when you open a ticket to buy parts they do eventually get back to you - but in this case I asked for a molex harness and two modular pci-e power supply cables, and they wrote back after a few days: the model has been discontinued and nothing is in stock.
Fortunately, I lucked out - on ebay a guy in Canada had a number of antec molex harnesses for about $5 each by the time you add in the shipping - not bad at all. I had bought one for the black psu jack, and it was powering the kaze high-speed fan.
I tried to modify some cables from thermaltake, figuring that these modular power connectors are probably all wired about the same. The picture clearly showed the red end as 6 pins, like the cables I had bought for my Toughpower (which were out of stock), but when the order arrived, the red end was 8 pins. Yes, I could have returned them, but for $20 I took a chance and modified them instead, using a box cutter to cut away the two unpopulated extra plastic pin slots on the psu side of the cable so it would plug into the power supply.
They plugged in just fine. But apparently I wasn't engaging the correct sensor on the power supply somehow, and the computer would not power up at all if I had either one of them connected to the gtx285 card. The thermaltake service tech had warned me that he'd tried the same kind of thing on his personal computer and it didn't work. He was right.
Then I had the bright idea of using the molex harness I had already bought from Canada, and so I plugged it into one of the red psu connectors. Okay, so far so good - it plugged right in. I thought - "well I'll just use a couple of molex to 6-pin adapters that I got with the 7950s, and see if those work."
They worked!
I powered up the gtx285 with that one harness and two adapters, and ran it for about 5 minutes in a small furmark window, getting hotter and hotter before the 285 turbine fan came up to its full 3400 rpm per gpu-z - wow is it loud at full speed - and temps leveled off at 85°C.
So I figured that was a good test of that molex harness. Then I ordered two more of those harnesses from the Canadian ebay guy. They just arrived. I tested again, this time with all harnesses plugged in and one adapter per red psu jack, which is how I'll actually use them - spreading the power around in case they use separate rails. The furmark test was fine as expected, so I have my full crossfire power requirements, and with a bunch of molex splitters I'll be able to drive all those 11 spedo fans (plus 2-3 more fans on the TRUE silver shadow hsf, which I'll install when I put the i7 over in the spedo case).
All that's left is to get this i7 up to cranking speed on the little Arctic Cooling 13 hsf, so that I can conclude that making the swap-out will be justified - current target 3.6 ghz, up from the present 2.91 (yielding roughly a 24% increase in cpu clock speed). Shouldn't be hard at all!
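To double-check that figure, a quick sketch using the clocks quoted above:
    # Planned i7 overclock gain, clocks as quoted in the post.
    current_ghz = 2.91
    target_ghz = 3.60
    print(f"gain: {target_ghz / current_ghz - 1:.0%}")  # ~24%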
Originally posted by Sam: I perhaps wasn't the most determined ever, but I had serious grief getting the 'expected results' out of the Core 2s in the X38/X48 days because at the time I wasn't aware of just how much difference it made having both PCIe slots filled with GPUs.
That is a very good point, Sam. In my current testing, I'll be aiming for the 3.6 with just the one gtx285. When I get there, at that time I'll put both 7950s in the Lian Li - which will be a tight fit, but with a bottom psu, the extra 1/2 slot of the 7950 is the turbine fan, which will be jutting out just past the power supply. If the sonata had been a bottom-psu case, I could have run both 7950s in that case - although I am glad I was forced to pull the full tower spedo out of the box in the garage where it had sat, brand-new, for 4 years. LOL
Anyway, that will be a lot of heat in a small space, with the kaze 3000 rpm kama bay blowing like mad. The HIS 7950 turbos vent to the back, which is a help - but of course the northbridge (and its vregs) has to handle all that pci-e bandwidth, which as you say, Sam, affects cpu stability. Hmmm. I would imagine that I can still do it, as 3.6 is such a modest target, I feel. I really want to have the whole crossfire setup at 3.6, and run the identical Far Cry 3 wave-runner run in the crocodile river - and watch while the cpu's second core is no longer a bottleneck. That will be the clincher!!
Quote: Considering the days where I used to perpetually be upgrading things here and there, dunno what's happened to me! :D
Well, I think what happened is you graduated, got a full-time job, and had enough of the "fun" of being a pioneer out there on the "bleeding edge" - and look what happened in the graphics industry! It seems AMD has the same thought you have - they were happy to develop the 7000 family, take the lead with a slight bump in the clocks after selective binning, and cancel plans for - what was it to have been called - "southern islands" or something like that?
Meanwhile, the thing that started to bug me - micro-stuttering - was "solved" first by RadeonPro, and now by "frame pacing" - and AMD secured the console contracts - so the company should be healthy and profitable for the next few years at least. I would say your only reason to upgrade, as you just said, would be on the graphics side - and since you always resell your hardware later - it seems to me that you should swap out those 2 gig 6970s for a couple of 3 gig 7970s - or at least 7950s - for a total outlay of what - $600? That should hold you for at least two more years, wouldn't it?
Originally posted by Omega: LOL! Hey Rich, did you know there are Nude mods for Tomb Raider?
http://www.youtube.com/watch?v=KVcYsWrg5RA
No, I haven't played the game yet, but I'm strongly considering it. Not because the nude mod... :p
For crying out loud, Kevin. You have to wait until I'm 70% done with the game before letting me in on the nude mods!! Hahahahaha.
Rich
|
ddp
Moderator
|
20. September 2013 @ 22:11 |
|
wasn't me on ebay as i just buy model warships & computer parts.
|
harvardguy
Member
|
21. September 2013 @ 01:14 |
|
hahahaha - ddp - are you a canadian guy on ebay ?????
But in this case it wasn't you?
Oh, yeah, that's right - you mentioned how proud you were of being Canadian. I remember now. Let's see. "Canada was invaded 3 times but you guys never won."
Hahaha. That was what you said, right?
Well, that canadian ebay guy, who wasn't you, was a godsend - enabled me to get my Lian Li i7 all hooked up sos I don't have to go buy a power supply from some other canadian guy on ebay. :P
Rich
|
AfterDawn Addict
15 product reviews
|
21. September 2013 @ 06:47 |
|
Rich, I wish you the best and hope you can find a setup to stick with and really enjoy for a while! You do an awful lot of jumping around on certain pieces of hardware. A younger me in my earlier years at this site would have given you a run for your money though ;)
-----------------------------------------------------------
Something Sam might be interested in. I've finally given Grid 2 a try... WOW.
I have noticed some pretty important things about Grid 2:
- The series has steered away from arcadiness a bit and is focusing much more on the racing than on being novel or overly accessible. This is also evidenced by rumors of Codemasters working on a new rally game in the vein of Colin McRae. As fun as it is, the Dirt series is not as pure or realistic as its predecessors. I particularly dislike being forced into multi-discipline racing, as I play solely for the rally and competitive crossover driving.
- The handling is significantly improved. Codemasters have created a new technique to simulate tire grip, including tread depth, surface area, deformation, and suspension travel. As a result, the cars feel much less floaty and much more firmly planted. I believe Dirt 3 has some elements of this new technique as well, though not nearly as well developed. The previous games had a nasty habit of simply pivoting the car on its center as opposed to turning at the wheels. It made most of the cars feel really weird and uncontrollable in certain situations.
- The graphics are effing gorgeous and the game is stupendously optimized. A single 6850 does 40-60FPS absolutely maxed with 4xAA (better than it manages in Dirt 1, a pretty old game now). The 6970 is more in the realm of 60-80. I can't test Crossfire 6850s as one is long gone now, but I would imagine it runs better than the 6970, if Grid 1's performance is any indicator. Did I mention the game is beautiful? They removed a lot of the filters and post effects from Grid/Dirt and the game looks MUCH cleaner. The texture work is also fantastic. Really impressive from a technical standpoint.
- The only con worth noting is that they've removed the helmet/interior cam entirely. Despite the protests of many fans and even some of the development team, there's apparently too much graphical load and not enough people using that camera view to justify its inclusion. I personally call bullsh*t. The game runs like butter, and it's an entirely optional camera view. The real issue is that the cars don't have heavily detailed interiors, and Codemasters wanted to save time and money by not outfitting each individual vehicle. Lazy devs mean the game gets screwed out of another layer of immersion and authenticity. I personally find this unacceptable, as it's the only camera view I use other than the hood cam.
Being that it's a racing game and I don't need to play its entirety to judge it, I think I'll just score it out.
Graphics: 8/10
The game is beautiful and detailed. Period. Loses a whole point for lack of car interiors though. Really a detail that should have been included in such a high profile release.
Performance: 9/10
For as good as this game looks, it runs flawlessly. My friend is running it just fine on his HD7640G APU. It runs better than games several years older. Anyone who posts regularly to this thread can simply max it no questions asked.
Gameplay: 8/10
Still not perfect, as it's really not a truly serious racing game. It includes racing series that don't exist mixed in with those that do, and the handling still needs a bit of work to be up to the standard of Gran Turismo. Gran Turismo is a whole other beast though, because it's the only game that particular studio produces on a regular basis.
Sound: 9/10
Standard CodeMasters work here. Top notch sound recorded from real cars. Nothing truly spectacular that sticks out, but all very satisfying.
Overall: 8/10
An amazingly good game, but not as hardcore as the series' roots, and not detailed enough from a technical standpoint to compete with modern masterpieces like GT5. Anyone who has enjoyed the series is in for a treat, but more hardcore racing fans would do better to look at other offerings, or maybe CodeMasters' Formula 1 games (which are much more focused and detailed). The lack of a helmet cam really strikes a blow against realism and makes me wonder which side of the fence the game wants to fall on. It's technologically advanced, but its potential is severely hindered by the developers' attempts to keep it accessible. I'm stuck with hood/bonnet view now, instead of being seated in the damn car...
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
|
harvardguy
Member
|
21. September 2013 @ 15:19 |
|
Wow, another great review:
Originally posted by Estuansis: The lack of a helmet cam really strikes a blow against realism and makes me wonder which side of the fence it cares to fall on. The game is technologically advanced, but its potential is severely hindered by the developers' attempts to keep the game accessible. I'm stuck with hood/bonnet view now, instead of being seated in the damn car...
Jeff, that is totally incredible to me that they took the "first person shooter" aspect out of the situation - you're not driving the car. Absurd.
On the other hand, having played some amazing 3rd-person shooters, like Spec Ops: The Line and Assassin's Creed 3, and now most of the way through Tomb Raider, I have found that shooters don't necessarily have to be first person to include accurate aiming - with the exception of Assassin's Creed 3, the titles I just mentioned had very first-person-like shooting.
But driving - well, I don't know. Maybe it's okay. But obviously you don't think so - and I would tend to defer to your judgment on the matter.
Kind of changing the subject: I like Codemasters, because one of their studios created the two Operation Flashpoint games that you turned me onto, which I thoroughly enjoyed. Those were the two titles Dragon Rising, and then Red River with the foul-mouthed (realistic) platoon sergeant. The weapon handling, realism, and somewhat open maps in those games are close to what I see now in the beta of Arma 3. I still remember the thrill of leading the two helicopter pilots, and my small team, to safety, crossing through narrow valleys, around ridges, trying to avoid the main group of Chinese pursuers. We reached our helicopter just as the Chinese crossed the river and started closing in. But a couple of long-range 100+ yard grenade launcher rounds took some of the fight out of them and allowed us to board and fly out of there. Cool!!! (Those were the most powerful and longest-shooting grenade rounds I have ever encountered - I asked you at the time if that was realistic and you seemed to think it was.)
I still need to go back and play some of those various "rescue the pilot" scenarios. I had bought the additional map pack. Many challenging hours.
But we know that Codemasters closed down those studios to focus on racing games. Their marketing was not successful enough - to my knowledge they never made a demo, and I wonder if one would have helped. I think back to the original Far Cry, which launched Crytek, with the Fort demo - if that didn't hook you on the game, nothing would, and it must have helped the sales numbers. So I am guessing that the Red River sales numbers were disappointing and that the games were ultimately not profitable.
Coming from a marketing background, myself, I would imagine that the dev team was great, but they needed some better suits in there who knew how to hook players and create more of a buzz in the gaming community. And maybe at the same time, they needed a PG13 version, like some of the Steam games - where they cleaned up the Sergeant's speech a bit - sure, less realistic, but eliminating the profanity wouldn't have bothered me that much.
It would have bothered Kevin, who likes profanity, and especially nudity, but not me. LOL
Rich
|
AfterDawn Addict
7 product reviews
|
21. September 2013 @ 16:46 |
|
You're a funny guy Rich... lol!
To delete, or not to delete. THAT is the question!
|
ddp
Moderator
|
21. September 2013 @ 18:07 |
|
harvardguy, the states invaded canada twice & the fenians invaded canada once & we beat you back to your side of the border all 3 times.
|
AfterDawn Addict
15 product reviews
|
24. September 2013 @ 06:30 |
|
Eagerly awaiting my 4GB GTX760...
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
|
AfterDawn Addict
4 product reviews
|
24. September 2013 @ 06:36 |
|
I bet :) - longer post coming but needs to wait until I have time at work - I tried doing a response to Rich's posts on the train this morning but the interface on a mobile device is just too bad to write more than a few lines.
|
AfterDawn Addict
15 product reviews
|
25. September 2013 @ 23:30 |
|
GTX760 here tomorrow or Friday. For the price of the 4GB card I could have gone with a 7970 by now, so I'm a bit disappointed. A little buyer's remorse, but we'll see. Nvidia has a major advantage in a few games I play regularly, and their general quality seems slightly better than in generations past. Certainly not as bad as the GTX400s.
I have a feeling the performance will be quite adequate for me: 2.5x a single 6850, or about 30% faster than the 6970 (off the top of my head, forgive the approximation). Benchmarks look very promising. There is currently no game I want to play that it won't be able to run maxed. Crysis 3 will be the most challenging, but it still manages a solid 40FPS at the settings I want to run. Plenty for a CryEngine game. Stalker will also be a challenge; I hope to see a large boost there, as it is very strongly biased and memory hungry.
http://www.techpowerup.com/reviews/MSI/GTX_760_HAWK/26.html
According to this review, it might be possible to close the gap with the 7970 with a small OC. The 760 isn't capable of much extra, but it's already firmly ahead of the 7950. The Gigabyte card I ordered has an exceptionally good cooler, so it's worth a shot to see what it can do.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
|
AfterDawn Addict
4 product reviews
|
27. September 2013 @ 08:15 |
|
Ok, now that I have a little time to reply to all this lot:
- Power supply modular connections are not all built alike, and should not be treated alike, even if they look the same. In some cases the polarity can even be reversed, which could be downright dangerous. Obviously the PCIe and Molex standards make it near-impossible and moderately difficult, respectively, to get the connections the wrong way round, but the PSU end could be anything.
The problem you will have had adapting existing connectors to the PSU is that, as you saw, there are sensors to protect against exactly this kind of thing - and that also applies to graphics cards, by the way. Even though neither of the two extra pins on an 8-pin connector is live, they communicate to the GPU that a PSU of a high enough rating is connected, and thus enable either the card itself or the overclocking facilities accordingly. Some GPU brands will not allow a 6+8 card to even POST with only 6+6 attached.
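To picture what those sense pins do, here's a toy Python model - purely illustrative, with hypothetical pin names; real cards implement this in hardware and behaviour varies by vendor:
    # Toy model of PCIe power sense-pin detection (pin names hypothetical;
    # real detection happens in hardware, not software).
    def power_state(plugged):
        six_ok = {"12V", "GND", "SENSE0"} <= plugged  # basic 6-pin wiring present
        eight_ok = six_ok and "SENSE1" in plugged     # extra 8-pin pins grounded
        if eight_ok:
            return "full power, overclocking enabled"
        if six_ok:
            return "may refuse to POST or lock out overclocking"
        return "card stays off"
    print(power_state({"12V", "GND", "SENSE0", "SENSE1"}))  # proper 8-pin lead
    print(power_state({"12V", "GND", "SENSE0"}))            # only 6 pins wired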
Molex-PCIe adapters, as long as they're the right ones (6 vs 8 pins), will always work, but be careful. Molex is a very weak connection method for sending a lot of current through - if you've ever done much work with cheap Chinese case fans/lighting etc. and had pins fall out of the plastic housing, or push out the other side when connecting cables, you'll know what I mean. Would you want wiring of that standard handling multiple amps in a hot ambient environment on a daily basis?
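The numbers behind that caution, as a rough sketch - the 75W figure is the PCIe spec rating for a 6-pin lead; the Molex pin ratings in the comment are commonly quoted values, not from the post:
    # Rough current through a Molex-to-6-pin adapter at full load.
    # Quality Molex pins are often quoted near 11A; cheap ones far less.
    watts = 75   # PCIe spec rating for a 6-pin lead
    volts = 12
    print(f"{watts / volts:.2f}A per fully loaded adapter")  # 6.25A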
3400rpm is not loud for a graphics card. Remember the peak for old AMD graphics cards (up to the HD5 series) is 5000rpm; as of the HD6 series it's 6000rpm, with the same fan. THAT is loud, but fortunately it's almost always completely unnecessary.
The flip-side is of course the situations where such a speed is necessary. AMD cards become insanely loud. Nvidia cards, with their lower maximum fan speeds, simply overheat and throttle and/or fail. Yes, the coolers are better, so nvidia can handle more heat at the same fan speed, but a geforce cooler at 3500rpm still doesn't equal an AMD one at 6000. At 3500 versus 5000 they're probably quite similar, bar the considerable noise difference.
For the i7 chipset scenario: the Mk. 1 Core i7s were a 'halfway house' between old-style systems like AMD/Core 2 and the more recent i5s, as the memory controller moved into the CPU rather than the board, but the PCIe controller is still on the board in the X58 northbridge. The i5s did away with the PCIe bit and put that in the CPU too, along with (where it was used) the integrated graphics themselves.
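That rundown, restated in code form as a quick reference (nothing added beyond the above):
    # Where the memory and PCIe controllers live, per platform generation.
    platforms = {
        "AMD / Core 2":    {"memory": "northbridge", "pcie": "northbridge"},
        "Core i7 (X58)":   {"memory": "CPU",         "pcie": "northbridge"},
        "Core i5 onwards": {"memory": "CPU",         "pcie": "CPU"},
    }
    for name, c in platforms.items():
        print(f"{name}: RAM controller in {c['memory']}, PCIe in {c['pcie']}")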
Further down the line still, newer architectures like Skylake (Mk. 6 Core i5/i7, the current Haswell being Mk. 4) will reportedly do away with CPU sockets, so the CPU and board come together as one item, and will also start doing away with PCI Express, instead favouring onboard graphics only. (Note: this is not a good thing - it means gamers will be forced to go AMD or purchase the high-end platform, the modern equivalent of X79.)
Graphics
Southern Islands was released - it's the HD7 series. Sea Islands was the 'cancelled' HD8 series platform, which is now known as the R9/200 series. Rather than the anticipated HD8970, we will instead have the AMD R9 290X. Sea Islands did in fact release, by the way, but as a rebrand of the HD7 series cards - exactly the same products, different sticker. The reason you didn't see them is that they were OEM-only: prebuilt systems require a generational leap periodically as part of their contracts. Since the new architecture was so far behind, the HD8 series name was simply assigned to the same old same old, to keep the system builders happy and the contracts in place. So sorry - if you bought a prebuilt machine with a "new" HD8950 in it, it's just an HD7950 I'm afraid.
Anyway, back to the R9 290X, and the new 'Volcanic Islands' Series - unsurprisingly, not all of the new Rx 200 series cards will be based on the Volcanic Islands GL silicon.
It's almost not a stupid naming scheme, as it borrows from Intel's similarly questionable naming practice - R9, R7 and so on will be to Radeons as i7, i5 and i3 are to Intel processors: spec-level groups, where a big number is a big card and a small number is a small card. The first digit in the second group denotes the generation, again the same as Intel - 290X, 390X and 490X are to Radeons as the 2500K, 3570K and 4570K are to Core i5 CPUs.
The next number is the subdivision level: the 270X, 280X and 290X replace the expected HD8950, HD8970 and HD8990. The X suffix? It just seems to have been added 'because X is cool' - even though that stopped being the case around 2006 with the release of the XFX GeForce 8800GTX XXX Edition (which did exist, believe it or not).
In reality, it's likely to denote that something without an X is a basic card. It's also an opportunity to insert cards in between two existing models by stripping the X from one, or adding it, as appropriate. Let's face it, 250, 250X, 260, 260X would be less confusing than things like LE and Ti that nvidia drop into their product lineup.
Change for the sake of change, in my opinion, because it could have been done more simply, but the new nomenclature should become fairly easy with time. Now we just need AMD to follow suit with their CPU lineup, and understanding product names will become easy! Hmm, let's not jump to conclusions about that one, I suppose.
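The scheme really is regular enough to parse. A toy Python sketch of the rules as described above - AMD publishes no such grammar, so this is purely illustrative:
    import re
    # Toy parser for the Rx 200 naming scheme described above.
    def parse_radeon(name):
        m = re.fullmatch(r"R(\d) (\d)(\d0)(X?)", name)
        if not m:
            return None
        tier, gen, model, x = m.groups()
        return {
            "tier": f"R{tier}",      # spec-level group, like i7/i5/i3
            "generation": int(gen),  # cf. the 2 in 2500K
            "model": int(model),     # subdivision: 70, 80, 90
            "uprated": x == "X",     # no X marks the basic card
        }
    print(parse_radeon("R9 290X"))
    # {'tier': 'R9', 'generation': 2, 'model': 90, 'uprated': True}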
The general consensus seems to be that the flagship R9 290X will be 40-50% faster than the current HD7970GE, which puts it a small but not insignificant amount above the GTX Titan - this is good because it will likely arrive at $700 or under, compared to the current exorbitant cost of the Titan.
One of the potential advantages, but more likely disadvantages, is that a custom API is required for a lot of the performance gains of the new architecture. You know what that means folks, no driver support, no performance upgrade. Where have I heard that one before?
Talking of drivers, AMD has just released its first WHQL-certified driver since April. The catch? It's the same driver as April's, just recertified for Windows 8.1 (at its release, Windows 8.1 didn't have a single WHQL-certified AMD driver, so that's what this is). Because frame pacing does not meet WHQL standards, any AMD driver that does anything useful will remain beta status indefinitely.
Incidentally, that last WHQL driver they made caused me BSODs. The beta drivers don't do that, and neither did the old 2012 drivers I had before. So really, I'd say avoid this one too.
Games
GRiD 2 has me interested. Needless to say, I haven't got round to playing it yet. The lack of a cockpit view is lamentable, pretty much unforgivable even, but in a period when the quality of games is steadily decreasing year by year, it's to be expected.
Personally speaking, if I'm to do anything more than casual messing about, I use the bumper view - low to the ground, no obstructions. That was the only angle that got me through doing endurance races in GT3 and GT4 - once I got into them I found them quite therapeutic actually :D
|
harvardguy
Member
|
27. September 2013 @ 20:19 |
|
Originally posted by ddp: harvardguy, the states invaded canada twice & the finians invaded canada once & we beat you back to your side of the border all 3 times.
Hmmm, just to double-check that we are in the same dimension - what are Fenians?
So Kevin, how's the nude mod going? :)
Well, Jeff, don't forget that at 4GB you have more memory than any 7970, and having been starved for memory all that time, that seems like a great idea to me.
I think it will hold you for quite a while, until the whole situation turns on its head when 4K really becomes mainstream - Sam is intending to take the leap next year in 2014, and I might follow in 2016, lol.
By then you will probably be looking at 2560x1440 gaming, at about half the pixels of 4K (if Christa lets you spend the money on hardware), which will probably be your sweet spot until 2020.
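For the curious, the pixel counts behind "about half" (taking 4K to mean 3840x2160):
    # 2560x1440 versus 4K by raw pixel count.
    qhd = 2560 * 1440  # 3,686,400 pixels
    uhd = 3840 * 2160  # 8,294,400 pixels
    print(f"2560x1440 is {qhd / uhd:.0%} of 4K")  # ~44%, near enough to half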
@Sam: Bumper view! - like you're riding out in front of the car, holding onto the radiator. Great! LOL
What was that you said about graphics cards being eliminated in the not-too-distant future - I sure hope you're wrong about that.
Rich
|
ddp
Moderator
|
27. September 2013 @ 20:37 |
|
harvardguy, it is obvious you don't know your american history as the fenians were irish americans. i still remember that from my grade 13 american history class over 34yrs ago.
|
AfterDawn Addict
15 product reviews
|
27. September 2013 @ 22:21 |
|
GTX760 here, and in good condition this time. Performance so far is quite promising. Skyrim balls-out maxed with 4xAA and tons of memory-hungry mods is locked at 60FPS - even the 6970 choked with mods. Stalker, Crysis 2 and Metro are likewise improved :) It craps all over the 6850s and the 6970. Not even close. Excellent performance. Very large card, as it uses a GTX 660/670/680/770 PCB. 8+6 pin power.
The cooler is exceptionally good and very quiet. The highest I've seen it get is 69°C after 4 hours of Skyrim. It idles in the low to mid 30s. So far it looks to be a very decent video card. Haven't tried OCing yet, but there's no reason I can't squeeze a bit more out of it. It's already a factory OC'd card, so it has a slightly higher stock voltage.
Gigabyte GTX760 OC 4GB
First Impression: 8/10
A little disappointed I was a week late to catch the 7970 for the same price. We'll see how it fares across the gamut of tests. I might actually get a 990 board and another one of these. Solid construction, very high quality packaging from trusty ol' Gigabyte, good performance, great cooling. I can't fault it for anything. Points off for price only. Really quite tickled :)
Something is slightly different about the colors on the screen. Not worse or wrong, just different. The Nvidia card may or may not have better image quality. More of that to come.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
|
AfterDawn Addict
4 product reviews
|
28. September 2013 @ 05:34 |
|
From what I can tell, the GTX760 sits roughly on a par with the HD7950, and is therefore about 20% faster than the HD6970 at 1920x1200. Obviously the gap with the latter closes a bit at higher resolutions, as the HD6900s were never particularly well optimised for the lower resolutions. In titles like Alien Rage, the comparative staying power of the HD6970 reverses the situation at 2560x1600, with the HD6970 sitting above.
However, in titles like Rome 2 Total War, 2GB doesn't cut it at that resolution, and the GTX760 ends up offering nearly double the performance - possibly moot, as that's only double of 11fps :P
In Arma 3, the GTX760's lead over the HD6970 grows to 50% at 1920x1200, and it sits almost exactly equal to the HD7970, presumably due to vendor optimisation. At 2560x1600 the lead drops to 20% (more for the average frame rate) and it's equivalent to an HD7950 (but with a higher average). Again moot, however, as just one of any of these cards is not sufficient to play the game at Ultra detail at 2560x1600 - even 1920x1200 would be a bit unpleasant. Dropping back to Very High, the frame rates are more reasonable but the same trends remain.
|
AfterDawn Addict
15 product reviews
|
28. September 2013 @ 06:17 |
|
Consider that mine is a factory OC'd card on a GTX680 PCB (reference cards are on a much smaller PCB) and is now decently overclocked above that as well - see my sig. Certainly above what a 7950 is capable of.
The overclock nets ~10% performance over a stock card - more or less depending on the application - plus a few percent due to the different PCB. The reference GTX760 is 980/1502, the factory setting is 1085/1502, and mine is 1200/1700.
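The clock arithmetic behind that estimate, as a quick sketch (performance won't scale exactly 1:1 with clocks, so treat it as first-order only):
    # Clock gains for this GTX760, using the figures above.
    ref_core, ref_mem = 980, 1502
    factory_core = 1085
    my_core, my_mem = 1200, 1700
    print(f"core vs factory:     {my_core / factory_core - 1:.0%}")  # ~11%
    print(f"core vs reference:   {my_core / ref_core - 1:.0%}")      # ~22%
    print(f"memory vs reference: {my_mem / ref_mem - 1:.0%}")        # ~13%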
BTW, Crysis 3 runs at 30-50FPS absolutely maxed with no AA and 16xAF - say an average of 35-45FPS in typical gameplay, which is quite smooth. Gotta say Crysis 3 is damn pretty all cranked at steady frames. They certainly put more effort in than with Crysis 2. REALLY pleased with that result :)
UPDATE: So far with the OC it benches about like a non-GHz 7970. Average 43FPS on Metro Last Light Benchmark, identical settings to this article.
http://www.guru3d.com/articles_pages/me...enchmark,6.html
Very very nice results. In other stuff it's even closer to a 680, though the 7970GHz is going to be faster no matter what.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
|
AfterDawn Addict
|
28. September 2013 @ 15:37 |
|
i do wonder if gtaV will come out on PC, having clocked the game on a console, i really want to unleash it on the PC, esp with MODS
MGR (Micro Gaming Rig) .|. Intel Q6600 @ 3.45GHz .|. Asus P35 P5K-E/WiFi .|. 4GB 1066MHz Geil Black Dragon RAM .|. Samsung F60 SSD .|. Corsair H50-1 Cooler .|. Sapphire 4870 512MB .|. Lian Li PC-A70B .|. Be Queit P7 Dark Power Pro 850W PSU .|. 24" 1920x1200 DGM (MVA Panel) .|. 24" 1920x1080 Dell (TN Panel) .|.
|
AfterDawn Addict
7 product reviews
|
28. September 2013 @ 15:40 |
|
There's no reason to assume they won't. I heard about leaked code as well, suggesting that it's already ready for the PC - the code pointed to a PC version.
To delete, or not to delete. THAT is the question!
|
AfterDawn Addict
15 product reviews
|
28. September 2013 @ 16:39 |
|
GTAV was confirmed for the PC I thought? Requirements have been tossed around as well. I heard November 22.
Certainly hope I'm ready to run it with this video card. Ultra excited about my overclocking results and the performance - now well above what the stock-clocked GTX760 2GB I was planning to purchase is capable of. From looking at reviews, going 2GB -> 4GB on these cards nets a tiny bit of performance as well, so add another 1-2%.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
|
harvardguy
Member
|
28. September 2013 @ 22:37 |
|
Originally posted by ddp: harvardguy, it is obvious you don't know your american history as the finians are irish americans. i still remember that from my grade 13 american history class over 34yrs ago.
"American history" - what the hell?? I am Iranian. You mean this forum is over in frick**n AMERICA - you bloodsuckers!!! - when we get the BOMB you are going to be in BIG TROUBLE!!!
======================================
Just kidding.
I am fairly sure no American history class I ever took said anything about Fenians. Besides, aren't you Canadian? We use different history books over here when we study American history - with Americans winning all the battles, and Canadians losing all the battles. You know - true American history - not made-up Canadian American history :P
Jeff, that's great about your new card - you're rocking it with all the games that gave you trouble before, and getting another one of those cards will enable you to dominate! In fact, I believe that you will eventually settle into your new 2-year-from-now sweet spot of 2560x1440 - that's what I predict.
In the meantime, that is great. Is Christa still around - is she enjoying those titles with you?
Rich
|
ddp
Moderator
|
28. September 2013 @ 22:45 |
|
if the states won all the battles then how come canada is still an independent country like the states & not part of the states, explain that.
|
harvardguy
Member
|
28. September 2013 @ 23:10 |
|
You know, I think I asked that question too, and it was something like - "No, 51 states doesn't make a good row of stars on the flag."
|
ddp
Moderator
|
28. September 2013 @ 23:20 |
|
the 49th & 50th states came into being in 1959, long after we last fought you in the 1800's, so that is not it. try again. guam or puerto rico, but mainly puerto rico, will be the next state.
|
|