The Official Graphics Card and PC gaming Thread
AfterDawn Addict (7 product reviews)
10. December 2013 @ 00:30
34 yrs old, if you're curious. It's just one of the few things I have from childhood, that I care a great deal about :)



To delete, or not to delete. THAT is the question!
AfterDawn Addict (4 product reviews)
10. December 2013 @ 15:03
Older than I realised - I assumed you were about mine and Jeff's age. Dunno why :/



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
AfterDawn Addict (7 product reviews)
10. December 2013 @ 15:06
Originally posted by sammorris:
Older than I realised - I assumed you were about mine and Jeff's age. Dunno why :/
Yup. I'm pretty old LOL! My job certainly makes me feel it :S



To delete, or not to delete. THAT is the question!
ddp (Moderator)
10. December 2013 @ 15:38
i'm even older than you!!
AfterDawn Addict (7 product reviews)
10. December 2013 @ 15:51
Originally posted by ddp:
i'm even older than you!!
No offense, but I figured. There's a certain wisdom/experience in your words :)



To delete, or not to delete. THAT is the question!
ddp (Moderator)
10. December 2013 @ 15:54
bummer!!!!
AfterDawn Addict (4 product reviews)
10. December 2013 @ 17:00
Just had a quick play with the HD7770 on the UP3214Q. I say quick play, as the short DisplayPort cable means the monitor has to sit at a crazy angle on the desk with the cable pulled taut.
Getting an Eyefinity display group to come up is, there's no other word for it, a thorough pain in the behind. When you exhaust all the sensible options, just repeat the correct procedure ad nauseam and eventually, without warning, it'll suddenly work as expected. It's rather tragic that a technology first introduced a little over 4 years ago is still in such a beta-like state, but there you go, that's AMD software development for you.

Once the display group was set up though, the desktop experience on the HD7770 was sublime. None of the desktop lag and general cursor stuttering I see on the HD6970s, whether crossfire is enabled or otherwise, which leaves only the age of the cards as the cause: the HD7770 is slower and has less video memory, and the other PC's CPU, operating system and primary SSD are fairly similarly matched. Time then, when the bill for the UP3214Q is paid in the new year, to save up for an R9 290. Just the one to start with, I think. That may not be enough power to max out games at 3840x2160, but it should be enough to replicate and slightly exceed the performance of my pair of HD6970s in a contemporary fashion suitable for a 4K monitor, without too much heat and noise. Two 290s would be nice, but I'd like to consider my 70+dB PC days, when I had to manually tinker with fan speeds before every gaming session, to be over. The HD6970s showed me a PC can be very powerful but still suitably quiet and autonomous; if I commit to any large expenditure on graphics, that has to continue. Besides, I don't much fancy the idea of putting two 290s' worth of current through a 5-year-old power supply.

Here's an interesting tidbit I discovered doing these 4K tests. When I started, I became worried about high levels of ghosting on the UP3214Q: you could see a significant number of ghost cursors follow the pointer around that weren't really there on the 3008WFP. When I connected the HD7770 and saw them still there, I thought the monitor was to blame. However, having run some tests on the U2312HM next to it (famed for having good response times and minimal ghosting, though that's not the reason I bought 2 of them), if I move the mouse at the same speed in cm/s there's far less ghosting on the 23", but if I move it at the same speed in pixels/s the effect is very similar. It would seem the pace at which you move the mouse around a 4K desktop is so high, in pixel terms, that Windows draws several ghost cursors to convey rapid panning, even when on a screen that large you're simply sliding at a moderate pace. Put that one down to Windows then; if there were a way to turn it off, I'd probably do it. Otherwise I'll just ignore it. It'll probably become second nature in a few weeks/months.
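To put rough numbers on that cm/s-versus-pixels/s distinction, here is a quick sketch (the panel densities are assumed from the dpi table later in the thread; the 20 cm/s sweep speed is an arbitrary example):

```python
CM_PER_INCH = 2.54

def cursor_px_per_s(screen_speed_cm_s, dpi):
    """On-screen cursor speed in pixels/s for a given physical sweep speed."""
    return screen_speed_cm_s / CM_PER_INCH * dpi

sweep = 20.0  # cm/s, an arbitrary example sweep across the glass
for panel, dpi in [('U2312HM 23" 1080p', 95.8), ('UP3214Q 31.5" 4K', 139.9)]:
    print(f"{panel}: {cursor_px_per_s(sweep, dpi):.0f} px/s")
# U2312HM: ~754 px/s; UP3214Q: ~1102 px/s. The same physical sweep covers
# ~46% more pixels per second on the 4K panel, so Windows' pointer renderer
# crosses far more pixels between refreshes and leaves more ghost images.
```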

As a fluidity test on the HD7770 I ran Half-Life 2 - a game I considered old enough to actually run smoothly at 4K on a pretty basic graphics card, yet one that still looks fairly good. It ran pretty smoothly (I'd say at least 50fps, probably a lot more most of the time), and the graphics - wow. It may be missing modern shader effects, but texturally the game is stunning. Sitting in front of a 32" display with textures that good running at 140dpi is just one of those things you can't stop grinning at. The game may now be over 9 years old, but it's still beautiful, especially at that resolution.



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
harvardguy (Member)
13. December 2013 @ 22:10
Originally posted by Sam:
As a fluidity test on the HD7770 I ran Half-Life 2 - a game I considered old enough to actually run smoothly at 4K on a pretty basic graphics card, yet one that still looks fairly good. It ran pretty smoothly (I'd say at least 50fps, probably a lot more most of the time), and the graphics - wow. It may be missing modern shader effects, but texturally the game is stunning. Sitting in front of a 32" display with textures that good running at 140dpi is just one of those things you can't stop grinning at. The game may now be over 9 years old, but it's still beautiful, especially at that resolution.
Sam, you're doing it again. You're going to talk me into investing into one of those things - moving so far out of Jeff's sweet spot that I'll never have a powerful enough machine to max anything other than Kevin's favorite game. Angry Birds.

HOW ARE YOU GETTING 4K TEXTURES IF THE GAME DOES NOT PROVIDE 4K TEXTURES?
Again, you may have already answered this - but doesn't the game itself have to draw the textures for you at your 4K resolution? I read recently that one game takes 50 gigs on the hard drive because it comes with 4K textures. In fact, checking Google, it looks like that's the game I'm currently addicted to, thanks to Jeff: Assassin's Creed 4.

[image]
The above image is available at full size at: http://www.computerandvideogames.com/vi...7,311146,311148

I quite doubt that Half Life 2 supports 3840 x 2160. So how are you getting 140dpi of texture resolution? 'Splain please, once more. I'll listen better this time.

It is extraordinary that you found the 7770 able to run 4K better than the 6970s. As you say, less memory, fewer shaders, etc. When you disable crossfire, are you very sure that is the same as having only one of the cards physically installed in your machine? Other than being produced on a smaller fabrication process, every feature of the 7770 is much inferior to the 6970. http://www.hwcompare.com/11921/radeon-hd-6970-vs-radeon-hd-7770/

The concern I have is with your conclusion that the 7000 family is inherently more compatible - why would that be? I guess if it were me I would yank out one of the 6970 cards just to be double-sure.

The other thing, the ghosting of the cursor, good work on that analysis. I assume you have the appropriate mouse setting enabled - without mouse trails of course. So it ghosts it anyway? Hmmmm. Well, yes, you'll become accustomed to that, I'm sure.


Originally posted by ddp:
i'm even older than you!!

That didn't surprise me at all. I knew ddp was quite a bit older even than Kevin, since he mentioned being personally humiliated a long while ago when a drunken group of Irish Americans crossed the border, invading his small village, only to conclude "For crissake it's cold up here - what are we doing - let's return to civilization." Or as ddp put it, "We drove them back."

Living in frigid conditions where brain processes run more slowly has the advantage of prolonging life by reducing normal meat decay. My chicken-cooking once each 6 months provides me a small piece of protein for breakfast every day, the bulk sitting out in the garage freezer.

Rich


AfterDawn Addict (4 product reviews)
14. December 2013 @ 07:03
To play a game and see benefits at 4K, Rich, you don't need the textures themselves to be 4K in resolution - that would only be necessary if one texture occupied the entire screen (e.g. if you were standing facing a wall at zero distance).
What 4K allows is to see standard-res textures at considerably greater distance: because there are so many pixels on the screen, a texture can be rendered at its full resolution from much further away. If you're standing right up close to something on a 32" display and the texture is low-res, no amount of display resolution will cure that. However, look at a scene from a pretty reasonable distance (a few feet, say) and you see a great improvement, because textures you're supposed to be close to when you walk past will be of sufficiently high resolution that when they make up a small part of your screen, they look very crisp.

The HD7 series is listed as compatible with 4K on AMD's site, the HD6 series is not, so by rights it shouldn't work at all. I imagine it's something to do with the actual final output stage. Performance seems undiminished, comparatively speaking, in terms of pixels per second output in games. 100fps-ish from one GPU in Counter-Strike: Global Offensive pairs fine with the 160-180-ish I'd get with one at 2560x1600 (two don't go much higher due to the CPU limit). Likewise 18-22fps in Battlefield 4 at 4K with AA/HBAO off on two GPUs tallies about right, because that would mean 40-45fps at 2560x1600, which again tallies with what I'd have expected from BF4.
It seems therefore that the stage sending the output to the screen might be where the issue is - even though it's the same DisplayPort standard, there may be something different about how it's actually connected to the output buffer. It could also, of course, have something to do with the Eyefinity support in the HD6 series. Eyefinity is, after all, required to produce 3840x2160 at 60Hz until we see 4K displays support single streams at that level of bandwidth. In reality, I suspect that needs to happen before 4K monitors become commonplace.
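As a rough cross-check of that "performance per pixel is undiminished" claim, the BF4 numbers scale almost exactly with raw pixel count. A sketch, using the frame rates quoted above:

```python
def scaled_fps(fps_ref, ref_res, target_res):
    """Scale a frame rate by raw pixel count, assuming pure GPU limitation."""
    return fps_ref * (ref_res[0] * ref_res[1]) / (target_res[0] * target_res[1])

# BF4 with AA/HBAO off runs ~40-45 fps at 2560x1600 on two GPUs:
for fps in (40, 45):
    print(f"{fps} fps @ 2560x1600 -> "
          f"{scaled_fps(fps, (2560, 1600), (3840, 2160)):.1f} fps @ 4K")
# -> 19.8 and 22.2 fps, right in line with the observed 18-22 fps at 4K.
```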

With regard to mouse cursors, this is with pointer trails off; it seems to be standard Windows behaviour. With pointer trails on, the effect is far more pronounced.



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
harvardguy (Member)
15. December 2013 @ 00:20
Originally posted by sam:
To play a game and see benefits at 4K, Rich, you don't need the textures themselves to be 4K in resolution - that would only be necessary if one texture occupied the entire screen (e.g. if you were standing facing a wall at zero distance).
What 4K allows is to see standard-res textures at considerably greater distance: because there are so many pixels on the screen, a texture can be rendered at its full resolution from much further away. If you're standing right up close to something on a 32" display and the texture is low-res, no amount of display resolution will cure that. However, look at a scene from a pretty reasonable distance (a few feet, say) and you see a great improvement, because textures you're supposed to be close to when you walk past will be of sufficiently high resolution that when they make up a small part of your screen, they look very crisp.
I'm still not quite getting it - but maybe a hair closer. The key is when you say "to play a game."

I have a standard distance I sit from my screen - pretty close up which gives me a good immersive experience - maybe 14-16 inches from the screen. It is pushed as close to me as possible, but still allowing the keyboard to sit just in front of the feet of the monitor - I used to have the keyboard tilted up on the feet, but I didn't do that when the new back-lit keyboard, like Jeff's, came in.

So back to Half Life 2. Let's say you are gaming. Gordon Freeman and Alyx are standing there, with their textures, the ground and mountains are behind them, with their textures. You are the normal distance from your screen - in my case about 14-16 inches I would guess when I am gaming.

The image size that you are looking at is full screen - the same size as on your old 30 inch dell. But your monitor is spreading that information over twice as many pixels.

You are not standing 3 feet away, where it would look super crisp - yes - but you are sitting at gaming distance, 14" away, where you (at least for me) normally sit.

At normal gaming distance, does the game, Half-Life 2, look crisper or not? I think not - in reading your post for the third time, I notice you say as much: staring at it from close up, no amount of res will fix low textures, but from a couple of feet away - yes, very crisp. So at normal gaming distance, no, correct?

Why did Assassin's Creed 4 provide 4K textures? At normal gaming distance for me, 14", how will Assassin's Creed 4 look on a 4K screen vs how it looks now (just under twice as many pixels on roughly the same surface area)? Twice as crisp? If yes, that will be awesome indeed - (except for Jeff's sweet spot argument.)

Originally posted by sam:
Eyefinity is after all required to produce 3840x2160 at 60Hz until we see 4K displays support single streams at that level of bandwidth. In reality, I suspect that needs to happen before 4K monitors become commonplace.
Hmmm. Thanks for that - so something different in the final output stage. AMD lists 7000 as compatible with 4k, but not 6000. When you say single streams - are you saying that eye-finity involves multiple streams? I guess you are.

======================================

I was wrong about Assassin's Creed 4 being maxed - I found out that there are 4 levels of EQAA, whatever that is. There is EQAA 2X(4), then EQAA 4X(8), then EQAA 4X(16), then EQAA 8X(16). They also have MSAA anti-aliasing.

I was able to see a small, subtle, but noticeably better picture on each higher setting, until NOW I AM FULLY MAXED, at EQAA 8X(16), and my average framerate is only 22. The graphics cards are running at 78% load.

I can go to lower AA and get an average of 32, for example at EQAA 4X(8), but the game doesn't look as good. This isn't like Far Cry 3, where the 4x looked quite a bit better than the 2x but the 8x was almost no better; here there is a perceptible improvement at each step.
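For reference, AMD's EQAA (enhanced quality anti-aliasing) decouples full colour samples from cheaper coverage-only samples, much like NVIDIA's CSAA. The mapping below is an assumed reading of those in-game labels, not something stated in the game:

```python
# Assumed mapping of AC4's EQAA labels: "NX(M)" = N full colour samples
# plus M coverage samples per pixel. Coverage samples store no colour or
# depth data, which is why the (M) step costs relatively little.
EQAA_MODES = {
    "EQAA 2X(4)":  (2, 4),
    "EQAA 4X(8)":  (4, 8),
    "EQAA 4X(16)": (4, 16),
    "EQAA 8X(16)": (8, 16),
}
for mode, (colour, coverage) in EQAA_MODES.items():
    print(f"{mode}: {colour} colour + {coverage} coverage samples")
```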

I thought the game looked good before, but now it looks exceptionally good, and I'll suffer the slight lagginess as the price I have to pay. I am cpu bound. Somewhere in the image processing, the higher EQAA involves more cpu, but not necessarily more graphics card horsepower. I would have thought it would all be on the graphics cards.

Anyway, the game looks totally unbelievable. (Sometimes I do get 30 fps - but sometimes in heavy fighting it drops to 17, and moving my character to the next person to sword fight with can be a little tricky due to lag. This happens when I throw a smoke bomb. The sulfuric smoke makes it easy to fight the guys, as they all gasp and cough, but the clouds of smoke drop my frame rates.)

Here, from a few days ago when I had convinced myself to run at the 4/8 setting, are 3 comparisons - good, better and best.

Lowest EQAA is EQAA 2X(4), but this is the second level, EQAA 4X(8):

[screenshot]

This is the third level, EQAA 4X(16):

[screenshot]

And this is what I am running at now, max level, EQAA 8X(16):

[screenshot]
The differences are subtle - but the effect begins to be felt. The ship appears darker in the top picture, brighter in the second, and finally brighter still, and more polished, in the third.

Something happened to me - maybe it was when you talked about smiling at the 4k screen. I decided to investigate the higher settings for some reason. Oh, yeah, I was looking for treasure - I finally figured out the treasure maps - and I was doing less sailing. Also I pursued the main game arc, and started walking around the new town of Kingston. For some reason I decided to re-visit those higher levels of AA. And then I was floored when I started walking around the EQAA 8X(16) world.

For example, look at the old Adewale, from a former post, compared to the new Adewale:

[screenshots]
And then look at these 3 new shots with the max AA: 1. sailing, 2. my home port at night with the two town drunks (even at night the pictures have a warmth and a glow that is very appealing), and 3. a daytime shot of the new town of Kingston.

[screenshots]
So, yeah, I'll put up with the 22 average frame rate for that kind of ambience. Maybe I'll get around to trying to produce a reasonable overclock on the i7. The i7 of course has hyperthreading, whereas the 9450 does not.

What do you guys think? Does the doubling of the thread handling that you get from HT produce any benefit - would that also help me get better frame rates?

Rich


AfterDawn Addict (15 product reviews)
15. December 2013 @ 04:55
When gaming my head is roughly 2 feet from the screen, and both monitors adequately fill my immediate field of view on their own. That being said, 4K vs 2K textures is significant for me, as I DO spend some of my time examining things up close. Skyrim in particular benefits hugely from texture and mesh mods, as the game is very up-close and personal; it actually encourages scrutinizing the environment around you to look for small details. You don't need a 4K monitor to get the effect of 4K textures, though a monitor with good pixel density, such as my 1920 x 1200 24", or better is recommended. A 1080p monitor is somewhat limited in that regard - okay for gaming, but the sharpness and clarity aren't even in the same ballpark. High pixel density makes good textures really SHINE. 2560 x 1600 in particular is great for that. I have spent a bit of time gaming on a 30" 3007WFP. They are no joke, even though my 2407WFP is already far better than the average monitor.

As far as FXAA goes, no way, ewww. The only implementation of it that I like so far is being able to turn it to its lowest setting in the Battlefield games. Then it looks quite good and can be a passable substitute for real AA.

Every other game simply has it turned up WAY too high by default, which makes it seem as if someone smeared Vaseline over the screen. It's a useful effect, but it needs to be used extremely sparingly, as it blurs EVERYTHING on the screen, in contrast to AA, which just blends edges.

Mind you FXAA is noticeably less terrible with my 2407WFP vs the ASUS 1080p display. Low pixel density + FXAA = terrible. In that case, real AA is the only way to go.



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388


AfterDawn Addict (4 product reviews)
15. December 2013 @ 06:19
We're not talking about viewing distance here. Any image obviously looks crisper the further away you get. If you're playing with the same source resolution, the monitor resolution makes little difference. I say little and not none because, as you might expect, upscaling 1920x1080 film content will look slightly better at 3840x2160 than at 2560x1600, as no interpolation is required: 3840x2160 is an exact multiple of 1920x1080, so you simply get a 2x2 square of the same colour for each original pixel, rather than having to interpolate half-pixels, which 'smudges' the image in the same way as zooming in on an image in Windows Photo Viewer.
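A minimal sketch of that 2x2 duplication, assuming a frame held as a NumPy array:

```python
import numpy as np

def integer_upscale_2x(frame):
    """Nearest-neighbour 2x scale: each source pixel becomes a 2x2 block."""
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # placeholder frame
frame_2160p = integer_upscale_2x(frame_1080p)
print(frame_2160p.shape)  # (2160, 3840, 3): every original pixel intact
# Fitting the same frame to 2560x1600 instead lands source pixels on
# fractional positions, so the scaler must blend neighbours and the
# image 'smudges' exactly as described above.
```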
As it stands, the higher pixels per inch of the 4K display, assuming you are looking at a 4K image (such as a 2160p video, or more likely your desktop), allows you to sit closer to the screen and get the 'far away' effect. Everything like the start menu and desktop icons is smaller and, as a result, crisper to start with. Sitting in front of the UP3214Q with the standard 23" 1080p U2312HM beside it, the latter looks very fuzzy and low-res. There's nothing wrong with it; it's just that seeing the two side by side makes you realise what a difference it makes.

Now, back to the game analogy I used. Think about this for a minute. Say there's a wall texture that's 1024x432 pixels, which you're standing fairly close to, but not right in front of. Where your player is currently standing, it might make up a certain proportion of your screen - say 32% of your screen's width and 24% of its height.
If you had a 1920x1080 monitor, standing in that place you'd be seeing that 1024x432 texture at only 614x259, because that's 32% of your screen's width resolution and 24% of its height resolution. Meanwhile at 3840x2160 you can see the whole texture, which will in fact be upscaled, as you're now viewing a 1024x432 texture at 1229x518.
Apply this to something further away in your game that only makes up 1% of your screen's size in each direction - certainly large enough to be, for example, a unit heading in your direction that you can see. At 1080p that object is 19x11 pixels: not much to go on about its detail. At 3840x2160, however, that same object will be rendered at 38x22 pixels. That may well be enough detail to get a clear view of what's approaching.
The source resolution of this object's texture is clearly going to be much higher than either of these values, but having the higher screen resolution means you get a much better render of it from a distance. Since, as you say, 4K textures are rare, most objects will be 'sharpened' by a 4K monitor beyond a couple of feet in front of you. Even if most textures are sub-1000 pixels wide and you upgrade from a 1920-wide to a 3840-wide monitor, all you need is for the texture to take up less than those 1000 pixels on your screen and you're seeing an improvement. For most games a very short distance achieves this, as most individual textures aren't covering half your screen at once!
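The same worked example as a few lines of arithmetic (the percentages and texture size are the ones used above):

```python
def on_screen_px(screen_w, screen_h, frac_w, frac_h):
    """On-screen footprint of an object covering given fractions of the screen."""
    return round(screen_w * frac_w), round(screen_h * frac_h)

# Wall texture (1024x432 source) covering 32% x 24% of the screen:
print(on_screen_px(1920, 1080, 0.32, 0.24))  # (614, 259): texture undersampled
print(on_screen_px(3840, 2160, 0.32, 0.24))  # (1229, 518): full texture, upscaled
# Distant unit covering 1% of the screen in each direction:
print(on_screen_px(1920, 1080, 0.01, 0.01))  # (19, 11): barely identifiable
print(on_screen_px(3840, 2160, 0.01, 0.01))  # (38, 22): four times the pixels
```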
On the other hand, as Jeff said in his post, there may be occasions when you want to view stuff really up close - as in, walk up to a wall so you can go no further, such that the wall texture fills more than 100% of your screen. Most games don't actually provide wall textures at very high resolutions at all - most evident when you see safety notices on the walls, walk up to them and discover they're too blurry to read. The advantage of having 4K textures in a game for people who don't have 4K monitors is that you can do things like this and still have a clear image in front of you. I am very much in favour of better textures for games, as they're still the most tangible indicator of 'good graphics' you can get, and far from the most demanding on hardware. Trouble is, they're also the most costly for the devs.

Now to the displayport debate again, so you understand what you're dealing with.

DisplayPort allows daisy-chaining multiple monitors off one port using a hub, much the same way as USB hubs expand the number of ports you have. This technology is called MST (Multi-Stream Transport). Previously, you could buy MST hubs and run three DisplayPort monitors off one port on your graphics card (as it's rare for graphics cards to come with three DisplayPort connectors). This was useful for people using Eyefinity, as it meant they could use DisplayPort for all three monitors. Mixing monitors with and without DisplayPort in Eyefinity was always troublesome because different timing methods are used, and the displays could go out of sync with each other.
As it stands right now, there is no image-handler chip that can handle a single 3840x2160 stream at 60Hz, only 30. The way the Dell UP3214Q overcomes this is an internal MST hub: as far as your PC is concerned, a UP3214Q (along with the other 31.5" 4K monitors on sale from Asus and Sharp) is two 1920x2160 monitors, one for the left half of the panel and one for the right.
Now, the "left" display processor supports 3840x2160 at 30Hz, so if you set the monitor to 30Hz mode (DisplayPort 1.1) in the menu, you can still get 4K without using Eyefinity, just with a 30Hz refresh rate. This isn't very nice to use, but if you need to connect a device that doesn't support the newer DisplayPort standard, it works.
In order to get 60Hz though, you have to use both sides - to get them to appear as a single monitor you need to use Eyefinity to merge the two displays together.
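Rough link-budget arithmetic shows why the split is needed. A sketch, under my own assumptions (DisplayPort 1.1 offering 4 lanes x 2.7 Gbit/s with 8b/10b coding, about 8.64 Gbit/s usable; blanking overhead ignored):

```python
def stream_gbit_s(w, h, refresh_hz, bits_per_px=24):
    """Raw pixel bandwidth of a video stream, ignoring blanking intervals."""
    return w * h * refresh_hz * bits_per_px / 1e9

DP11_USABLE = 8.64  # Gbit/s, assumed: 4 lanes x 2.7 Gbit/s after 8b/10b coding
for label, (w, h, hz) in {"3840x2160@60": (3840, 2160, 60),
                          "3840x2160@30": (3840, 2160, 30),
                          "1920x2160@60 (one MST half)": (1920, 2160, 60)}.items():
    need = stream_gbit_s(w, h, hz)
    print(f"{label}: {need:.2f} Gbit/s ({'fits' if need < DP11_USABLE else 'too much'})")
# Full 4K60 needs ~11.9 Gbit/s, beyond a DP 1.1-class stream; 4K30 and each
# 1920x2160@60 half need ~6.0 Gbit/s, hence two halves merged via Eyefinity.
```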

With regard to AC4, two observations:
- Increasing Anti-Aliasing never increases the CPU load. If you're seeing higher CPU usage, something else may be going on with the game such as your current location. If you are seeing a drop in GPU load making you think the CPU is working harder, the GPUs may be running out of video memory.
- It's impossible to distinguish any quality difference in the thumbnails provided. To really judge the differences in quality between the screenshots you'll have to post them at their original resolution, rather than 900x600 odd.

As for Jeff's comments on dpi, DPI is king, it really is -

42" 1920x1080 (e.g. HDTV): 52.5dpi -> 2.75kPsi
32" 19201080 (e.g. HDTV): 68.8dpi -> 4.74kPsi
31.5" 3840x2160 (e.g. UP3214Q): 139.9dpi -> 19.6kPsi
30" 2560x1600 (e.g. 3007WFP): 100.6dpi -> 10.1kPsi
27" 2560x1440 (e.g. U2711): 108.8dpi -> 11.8kPsi
24" 1920x1200 (e.g. 2407FPW): 94.3dpi -> 8.9kPsi
23" 1920x1080 (e.g. U2312HM): 95.8dpi -> 9.2kPsi
21.5" 1920x1080 (e.g. S2240L): 102.5dpi -> 10.5kPsi

You'd be amazed what a difference even 102.5dpi makes on a 21.5" vs a 23" at 1080p. I set an LG 21.5" up for a customer at my office once, and it sat next to my 23" Dell; I remember thinking 'wow, that's much crisper than mine' - and that was only a 14% difference in pixels per square inch.
Going from a 30" 2560x1600 to a 31.5" 3840x2160 is a 94% increase. That's a big deal.



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
harvardguy (Member)
25. December 2013 @ 17:14
Hi Guys - happy holidays to all!!

You're right - I could not tell any difference in those images. Furthermore, running around the game yesterday after a one-week absence (building a W7 + XP tower for my black roommate Daryl), I said to myself, "Well, that's really nice - it must be at the EQAA 8x(16) setting." But it was back at my earlier compromise, 4x(8), two settings down, and the game still looks quite good.

How do I link to the full-size image? I am using photobucket per Shaff's recommendation of 5 years ago. Do I load the image full-size into there, and then just display the link, plus maybe the 900 width thumbnail?

So I hear what you guys, Jeff and Sam, are saying about dpi. Your explanation of being able to see more of the wall texture, because of the high res monitor, even if the texture is still back at the 2560x1600 size - that made a lot of sense to me - good job Sam - I think I followed that. Also, thanks for explaining the way eyefinity works - I did follow that very well.

The customer's monitor looking crisper - that made sense, and the 94% increase of the 4k monitor - huge jump in crispness. Maybe 2 years from now when the foundries have made further reductions in die size the 8000 or 9000 family AMD cards will be fully up to the challenge.

(I just heard about a new thing - graphene - a 2d graphite structure one atom thick - which IBM used to make an experimental integrated circuit. Maybe that's what will be required eventually to run 4k without using enormous amounts of graphics card power and heat.)

I haven't been totally scientific about this, but I was logging Core Temp and GPU-Z. At one point on a Core Temp once-per-second log there is a line that shows CPU load at 100,100,100,100 - can't ask for more than that from the struggling 9450. GPU load maxes at around 78% on each GPU. Framerates are down at the higher graphics settings. But reduce the EQAA and framerates rise, GPU load drops, and so does CPU load, to mid-to-high 90s - no more maxing out at 100%.

So .... if this data is correct .... is this counter-intuitive? I remember on Sleeping Dogs that I set framerates to 30, to stop the cpu from maxing at 100% on all cores, which was producing wild framerate variations as I drove around town. So higher framerates, equals higher processing load, right?

Yet the AC4 data seems to indicate that despite reducing framerates, increasing EQAA has the effect of increasing cpu load.

As to whether I am exceeding my vram limit of 3gigs - maybe - I didn't check that. But even if I am - how does that cause my cpu to max? Anyway, when you add it all up, obviously I am cpu limited.

The lagginess started to bug me, and I dropped back to the lower setting to get the frames up. I discovered that the MSI Afterburner OSD costs me about 2 fps for the GPU information, and another 2 fps for the CPU load information, which comes from enabling RivaTuner - which tucks the CPU core temps and loads inside the MSI OSD. BUT... the hide hotkey now works quite well, and as soon as I hide the OSD I get all my framerate back.

This works particularly amazingly on FurMark, which I use to test my OSD. I have Fraps now, for $37, well worth it, and I can run Fraps and the MSI OSD at the same time in a 2560x1600 window. FurMark and Fraps show 25 fps. I hide the MSI OSD, and now FurMark and Fraps show 50 fps - what I used to get before when I was displaying the MSI OSD.

I don't know what happened on my system, but I never noticed the MSI OSD having any kind of effect on framerates - certainly not cutting FurMark in half. I was always getting 50 - I put that information on the shortcut as part of the name, as my quick test to verify that both cards were truly running in crossfire.

But suddenly - half the framerate with the MSI OSD. I uninstalled 2.2.3 and re-installed the former 2.2.2, but no change. Maybe a registry key got modified and stayed modified. Anyway, as long as it hides quickly, why should I care? So as of yesterday, when all this became apparent, I now hide it in AC4 unless I want to know what time it is, and just rely on Fraps for whether or not I can boost my graphics setting.

As I think I already posted, I had bought the Fraps program because MSI didn't work at all for a couple of recent games, COD: Ghosts and Battlefield 4. Both of them were not as bad as I thought they were going to be, and ultimately I decided they were worth the coin. Ghosts single player had some amazing space station shooting, flying around in zero gravity - twice. And they had some underwater scuba shooting - a similar type of feeling - with special bullets and special guns - and I got killed a bunch of times before I mastered it. So there was some good challenge. Those guns are real - I had a hard time believing it - but the skinny long bullets are effective at some little distance - like 30 or 40 feet - which as I say is hard to believe. Each one, like a little torpedo or spear-gun dart, actually dashes along for up to 30 or 40 feet according to the game.


The Russians designed one that carries regular bullets at the same time, and automatically knows when you leave the water to change ammo.

I should look up the exact specs.

Oh my god, I did. The small bullets are effective up to 15 meters - roughly 50 feet; the AK-sized ones to 75 feet; and a very large armor-piercing dart round can maintain its effectiveness up to 60 meters - almost 200 feet. What the hell!!

As for Battlefield 4, the single player again wasn't too bad. I enjoyed parts of it. The multiplayer - I did put about 20 hours into it. But the black uniforms against black shadows render the enemy invisible - often - and I began to get bugged by that. On the hotel map I had some good success staying outside and fighting on the perimeter - that was actually fun. But the frenetic inside activity was bleak and more of a twitch-shooter feeling - it wasn't my style.

I actually went back to the Medal of Honor multiplayer and had great success with the repeating sniper weapon. I think I posted about this before. So if I want multiplayer, I'll go back to that, or to BF:BC2 - or even to the BF2 demo, or once in a while to one of the CODs.

The challenge is running out on AC4 - I'm about 75% through the game arc. Early on I ran around the map and looted soooo many boats, and consequently upgraded my ship so much before coming back to the main story arc, that the things they throw at me now are a piece of cake. Well - still a bit of a challenge - but I'm glad to have the strongest hull, the extra-strength mortars, and the golden super-powerful repeating swivel cannon. I might be ready to take on one of those legendary ships one day. They are very hard to kill, but if you disable a few ships close by, that you can leave sitting there ready to board in order to repair your ship - like a giant health pack :P - that might be a way to finally beat one of them.

Happy holidays again to all,
Rich
AfterDawn Addict (4 product reviews)
25. December 2013 @ 18:39
I use imgur for my image hosting at the moment, as it's not cluttered with useless features and ads - you just have to learn to ignore all the 'internet culture' images it offers on the homepage from other users.
I found ImageShack became pretty much unusable without paying for an account, and Photobucket often only keeps images a few months before deleting them.

Graphene's been a hot topic for a year or two now, and might potentially offer something to replace silicon, but we won't reach the 'silicon's not good enough' stage for another 5-10 years I don't think.
Remember the R9 290/290X are still built on a 28nm silicon process, whereas Intel's Haswell CPUs are already on 22nm. Intel only expect issues to crop up once going below 10nm.

As for GPU/CPU usage and limitations, I think you're pushing both boundaries at once, which makes analysis difficult. The game you're playing overstretches your CPU, and the detail/resolution settings you're using overstretch your GPUs at the same time.
When your GPU is hammered by AA, the frame rate may be lower but more consistent, such that the CPU is always at 100%. When you reduce the work on the GPU, the graphical load may be more variable, such that the CPU periodically slows down slightly as graphically intensive operations come in fits and starts.
Remember also that there are some CPU operations that are not frame rate-dependent.
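A toy bottleneck model of that point (an illustration only, not a profiler): whichever side is slower paces the frame, and the CPU can only be busy for its own share of each frame interval. The millisecond costs are invented for the example:

```python
def frame_stats(cpu_ms, gpu_ms):
    """Idealised pipeline: the slower stage paces the frame; the CPU idles otherwise."""
    frame_ms = max(cpu_ms, gpu_ms)
    fps = 1000.0 / frame_ms
    cpu_load = cpu_ms / frame_ms  # fraction of each frame the CPU spends working
    return round(fps), round(cpu_load * 100)

print(frame_stats(cpu_ms=30, gpu_ms=25))  # CPU-bound: (33, 100) -> 33 fps, 100% CPU
print(frame_stats(cpu_ms=30, gpu_ms=45))  # heavy AA, GPU-bound: (22, 67) -> 22 fps, 67% CPU
# In this idealised model, extra AA adds no CPU work per frame; a logger that
# still reads 100% under heavy AA is catching the variability described above,
# not a genuine increase in CPU cost.
```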

Exceeding 3GB VRAM at 2560x1600 is very unlikely. I don't currently know of any games that can do that without mods unless you are supersampling.
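Back-of-envelope support for that, as my own arithmetic: even a deliberately generous render-target budget at 2560x1600 is small beside 3GB, so texture data would have to do almost all the damage.

```python
W, H, BYTES_PER_PX = 2560, 1600, 4            # 32-bit RGBA
buffer_mb = W * H * BYTES_PER_PX / 2**20      # one colour buffer: ~15.6 MB

# Deliberately generous budget: colour + depth targets, 8x multisampled,
# triple buffered -- still only about a quarter of a 3GB card:
budget_mb = buffer_mb * 2 * 8 * 3
print(f"one buffer: {buffer_mb:.1f} MB; worst-case render targets: {budget_mb:.0f} MB")
# ~15.6 MB and ~750 MB respectively; the remaining headroom is textures.
```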

Frame rate drops like that with OSD applications highlight why I try to avoid using them at all costs :) There is no reason why such applications should cause that level of performance loss. Combined, you're losing a good 10% there, which is the difference between models in a product range.

NEVER use FurMark as a means of testing frame rate. It will not react to changes in graphical load/capability the same way games do, since it is designed to stress-test the GPU, not provide a measure of performance. The frame rate counter is really there to show the GPU is not dropping frames inconsistently due to overclock- or fault-related instability.

You're not the only one to have complained about the darkness in BF4 multiplayer. I'm personally hanging on until there are fewer bugs, and possibly until I have a card more adept at playing it, but we'll see. For me to enjoy Battlefield games I really need to be in a squad on voicechat with friends.

Happy christmas to all etc :)



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
ddp (Moderator)
25. December 2013 @ 18:43
same to you guys.
AfterDawn Addict (15 product reviews)
26. December 2013 @ 21:50
Happy holidays to everyone. Have fun and stay safe :)

Treated myself to a new pair of headphones. The Creative Fatal1tys were finally getting used up - I've repaired the flimsy spaghetti cable about 10 times, and they were getting physically and functionally beat. Too bad, as they were broken in wonderfully and had beautiful sound.

Got a nice deal on a Logitech G230 analog headset over at Best Buy with my discount card. Normally $60 retail; a small after-holiday discount plus a discount from my card brought it to $42. Very nice, considering the Creative Fatal1ty headset was about $50. Similar drivers and performance specs to my previous headphones; they look a little nicer, are more comfortable, and are maybe a bit more solidly built. Time will tell there, as the Creative headset really stood up to some abuse despite the fragile cable.

Sound quality so far is quite comparable, though they maybe start to distort at a lower volume than the Creative unit. Bass is a little more powerful, but at the cost of some sharpness. Time will also tell there, as these 40mm headphone drivers always require a little breaking in before their true quality becomes apparent. The Creatives were the same way.

The mic is not quite as good: it picks up more background noise, and the sound card does not have noise cancelling. Not that bad or annoying, but something worth noting for some users.

$50 at time of purchase. Quite an old design.
http://www.newegg.com/Product/Product.aspx?Item=N82E16826158082

$42. Normally $60, even at Newegg.
http://www.newegg.com/Product/Product.aspx?Item=N82E16826104840

The 7.1-capable G430s were available for the same price, but I've heard them in some detail before, and IMO the low price + attempting proper 7.1 (i.e. trying to do too much) = low-quality components and sound. The identical, but stereo-only, G230s seemed a much better buy. Also, they're analog. I have good reason to use my analog :)



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388


AfterDawn Addict
29. December 2013 @ 16:20
Comfortably the youngest regular still here checking in.

Hope everyone had a good Christmas!


AfterDawn Addict (15 product reviews)
29. December 2013 @ 16:25
How old are you Ripper? I joined the site at like 15.



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
AfterDawn Addict (4 product reviews)
29. December 2013 @ 18:03
I too joined the forum at age 15, looking for support. Sufficiently disillusioned, my first signature I believe read 'my computer's crap, yours?'
How times change...
In February I believe I celebrate a decade at aD.



Afterdawn Addict // Silent PC enthusiast // PC Build advisor // LANGamer Alias:Ratmanscoop
PC Specs page -- http://my.afterdawn.com/sammorris/blog_entry.cfm/11247
updated 10-Dec-13
ddp (Moderator)
29. December 2013 @ 18:16
you've been here a bit longer than i have as i joined oct 15 2004.
AfterDawn Addict (7 product reviews)
29. December 2013 @ 19:15
February makes 6 yrs for me. But I've been aware of AfterDawn for significantly longer ;) I wanna say discovery was around 2004 - could be 2005, though. 2003-2004 is when I really got involved with my first computer. Dial-up internet :S Patience builder!

Though certain "essential" software was introduced to me around 2002. Obsolete now... WinMX!



To delete, or not to delete. THAT is the question!
AfterDawn Addict
30. December 2013 @ 06:27
My reg date is Feb 2006 and I'm 22 in the first half of 2014, making me 13 when I joined. So 8 years coming up!

I distinctly remember people assuming I was a lot older though, so I didn't correct them!

Edit: (Which would explain why I was probably a total PITA to begin with, right ddp!)



AfterDawn Addict (4 product reviews)
30. December 2013 @ 10:17
Feb 15 2006 is when I joined... yes, it's been a long time since my last visit... lol
AfterDawn Addict (15 product reviews)
30. December 2013 @ 11:57
Joined in January 2006. This January will be 8 years for me :D



AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
ddp (Moderator)
30. December 2013 @ 15:06
Ripper, not as bad as doggybot was.
 