|
The Official Graphics Card and PC gaming Thread
|
|
AfterDawn Addict
7 product reviews
|
18. October 2010 @ 03:53 |
Link to this message
|
If it's true that they lie about their sensors, I'm very disappointed. I've been monitoring a ~42C mobo/NB temp on the HTPC. If that's a lie at idle, I can only imagine what it is under load. I would DEFINITELY imagine it needs better cooling then.
To delete, or not to delete. THAT is the question!
|
AfterDawn Addict
4 product reviews
|
18. October 2010 @ 03:57 |
Link to this message
|
Sad but true, other people have reported the same.
|
Senior Member
4 product reviews
|
18. October 2010 @ 13:38 |
Link to this message
|
Originally posted by sammorris: Originally posted by DXR88: its a shame and a sham you have to have an nforce chip to have SLI.
Not true with LGA1156 or LGA1366 for Intel. Intel were so peed off at nforce chipsets they banned them, so now motherboard manufacturers pay nvidia a license fee to legally be allowed to put SLI in the BIOS of the motherboard. Cheap boards don't get SLI, not because they lack any equipment, but because the maker didn't pay the quite considerable license fee.
Omega: I'd take that Foxconn over any ASRock board any day. ASRock is the cheap, low-quality arm of Asus, so what does that tell you?
Maw: Yes they are removable.
Rich: Lmao have I played it. About the only multiplayer game I currently play at all is HoN, which is effectively the current incarnation of DoTA until DoTA2 comes out. I've played at least 300 hours of it I would think.
ASRock has stepped up their quality and customer service since '02; they've even received best-of awards and PCMag choice awards.
ASUS is the shitty one. Pretty sad when your low-end arm destroys the parent company.
|
AfterDawn Addict
4 product reviews
|
18. October 2010 @ 13:42 |
Link to this message
|
ASRock stuff was known for being low quality even before Asus dropped the ball. Quite frankly I don't have any confidence in either.
|
Senior Member
4 product reviews
|
18. October 2010 @ 13:51 |
Link to this message
|
Originally posted by sammorris: ASRock stuff was known for being low quality even before Asus dropped the ball. Quite frankly I don't have any confidence in either.
I've had issue after issue with ASUS boards - something never worked right or didn't work at all. I was hesitant to get an ASRock board for my HTPC, so I read reviews, compiled what the interwebs had to say, and ruled out the paid-for-posting drones that seem to bash anything with negative reviews.
Bought one of the Socket 939 boards: six years, not one issue. It's even an nForce board, which is a shocker - I didn't expect more than 3-4 years out of it.
|
harvrdguy
Senior Member
|
18. October 2010 @ 18:08 |
Link to this message
|
Originally posted by Sam: Rich: Lmao have I played it. About the only multiplayer game I currently play at all is HoN, which is effectively the current incarnation of DoTA until DoTA2 comes out. I've played at least 300 hours of it I would think.
Sh****t!
One more game that Sam is going to rape me on!
- - - - - - - - - - - - - - - - -
LOL
All right, Sam or anybody, help me out on this game, please.
First of all I have to confess that I have never even played Team Fortress 2, which is a complete sin since that was the game Miles specialized in when he lived up in Seattle and first went to work for Valve - he and Ariel and Moby were the TF2 team.
And I have the game downloaded in the My Games steam folder - but I haven't gotten to it yet. Someday. Someday. LOL (I just know there's going to be a long learning curve called "getting raped.")
So Dota2 sounds like TF2, and they even mention the TF2 gamer community.
So, help me out, Sam, with 300 hours of DotA under your belt: how does it compare to Left 4 Dead? Just a short paragraph. I never would have played Left 4 Dead except you talked about it a lot, and it appeared miraculously in my games folder one day during that free week they had. Then I tried it and was amazed. I will be playing a lot of that game at the end of this month on my three-day once-a-month gaming weekend.
So in one paragraph (or more if you feel like it): why are you playing DotA, to the tune of 300 hours so far? What is the gameplay experience like, particularly compared to Left 4 Dead, which I am familiar with?
Rich
This message has been edited since posting. Last time this message was edited on 18. October 2010 @ 18:28
|
Red_Maw
Senior Member
|
18. October 2010 @ 23:32 |
Link to this message
|
Originally posted by sammorris:
Maw: Yes they are removable.
Thanks sam :) I'll most likely pick one up and put it in this weekend.
This message has been edited since posting. Last time this message was edited on 18. October 2010 @ 23:33
|
AfterDawn Addict
4 product reviews
|
19. October 2010 @ 03:39 |
Link to this message
|
Dota is nothing like TF2, nor is it anything like Left 4 Dead.
First of all, TF2 and L4D are first-person shooters. You have a first-person camera perspective, you use WASD to move your character around, moving the mouse to adjust your aim, and you click to fire or melee people - unless you're playing as infected, but it's a similar idea.
TF2's controls are much the same, it's still an FPS, the difference being rather than the objective being to survive to the end, it's to win the multiplayer scenario.
There are a few standard multiplayer scenarios for FPS games, that many titles use. TF2 does not use many, largely owing to the fact that it's a class-based game, instead of a generic shooter.
i.e. instead of an arcade shooter where everybody can pick up and use any weapon they find, in a class-based shooter you're limited to a particular set of weapons for your role - not unlike a tactical military shooter like Bad Company 2, where for example you can only really snipe as Recon, and you can only really heal people as Medic.
However, BFBC2 differs in that, as would be realistic, there's nothing stopping you from killing an enemy and taking whatever items they have, regardless of whether they're your class or not. You can quite easily go Medic, down an enemy sniper (which you can even do with the defibrillator, pretty hilarious) and take his sniper rifle and use it.
Team Fortress 2 locks you into the specific class you chose:
Scout - To my knowledge the only class that can double jump, moves the fastest, has a shotgun and a baseball bat [all classes have a melee weapon, and the other two are typically ranged]. Scouts are useful as they capture objectives with the speed of two people.
Soldier - main weapon is an RPG launcher. Also carries a shotgun secondary. Melee is spade
Heavy - main weapon is a large minigun. Also carries a shotgun secondary. Melee is fists
Pyro - main weapon is flamethrower. Also carries a shotgun secondary. Melee is axe
Engineer - carries a shotgun and a pistol. Melee is a wrench. Engineer can erect sentry guns and health/ammo dispensers, as well as teleporters. These can be upgraded over time to become more powerful (sentry) or faster to recharge (teleporters)
Demoman - grenade launcher and sticky bomb (remote mine) launcher. Melee is beer bottle.
Sniper - Sniper rifle as primary, secondary submachine gun, melee is a kukri blade.
Medic - syringe gun as primary, Medigun as secondary - heals teammates and builds charge. When fully charged, activates ubercharge (temporary invincibility for both the medic and the healee). Melee is a bonesaw
Spy - Revolver as primary, electro-sapper as secondary (disables and damages enemy buildings), knife as melee - backstab is an instant kill. Can disguise himself as enemy players, or turn (almost) invisible.
To make things more complicated, TF2 employs an item drop and item craft system whereby you can get certain new weapons for your class from time to time, or for other classes. If you accumulate sufficiently many useless items [e.g. you already have them or you never play the class they're for] you can combine them to create more useful things.
The game modes TF2 employs are:
Capture the flag [called capture the intelligence here] - standard-fare game mode that involves stealing an item, normally a flag, from deep inside the enemy base, making it across the battlefield and landing it at your base to score a point.
If a player carrying the flag is killed, the flag drops to the ground. Another team-mate can pick up the flag and continue, assuming the enemy haven't recovered it. In some games, the enemy merely touching their downed flag returns it to their base; in others, they need to carry it back. In many titles, a team cannot score a capture of the enemy flag if their own has been taken. I don't think TF2 is one of these games.
Territory capture / Land Grab:
Not sure what the actual names for these are; those are the titles from the Xbox series Halo.
These two modes are very similar: a point is captured by a player standing on or around it for a certain period of time. In TF2, this process is sped up the more players are standing on it. Typically, a point can't be captured if an enemy is also standing on it.
The round is won when a team has captured every point.
How territories differs from land grab (I'm not sure which one is which; in TF2 the mode used is decided by the map, whereas in Halo games both modes are available for every compatible map) is that in one mode, both teams can capture points.
The TF2 way of doing this is to have 5 points, Base->Outside base->Centre->Outside base->Base from one team to the other. Say we call them 1,2,3,4,5 for simplicity. If you're capturing 4 but the enemy are capturing 3, and the enemy capture 3 first, you can no longer capture 4; it is given back to the enemy - all the captured points are linked together (see the sketch below).
I believe the Halo system is that whoever has the most points after all 5 have been captured wins the round.
The other method is simply a case of base defense: only one team is able to capture points, so a team can only either attack or defend. The attackers have to capture as many points as possible within the time limit, or all of them as fast as possible; then the teams are switched, and the opposite team has to beat the record.
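To make the linked-points rule concrete, here's a toy sketch in Python (purely illustrative, hypothetical code - not anything from TF2 or Halo) that works out which point each team is currently allowed to capture, with indices 0-4 standing in for points 1-5:

def capturable(owners, team):
    # owners is one entry per point, e.g. ["A", "A", "B", "B", "B"].
    # Team A pushes left-to-right, team B right-to-left; the only legal
    # target is the enemy point bordering your own unbroken chain.
    other = "B" if team == "A" else "A"
    order = range(len(owners)) if team == "A" else reversed(range(len(owners)))
    for i in order:
        if owners[i] == other:
            return i
    return None  # you own every point: round won

state = ["A", "A", "B", "B", "B"]
print(capturable(state, "A"))  # 2 - A can only attack the centre point
print(capturable(state, "B"))  # 1 - B must attack the point next to its chain

Losing point 3 while you're capturing 4 just moves that boundary back towards you, which is exactly the "given back to the enemy" behaviour described above.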
Payload run:
This I believe is largely unique to TF2. The map consists of a railway track that runs from your base to the enemy's. You start in your base with a bomb on a cart, which has to be pushed to the enemy base; when it reaches the end, it detonates and you win the round.
The speed with which the cart is pushed depends on how many people are pushing it (again, scouts push double). The cart also regenerates the health of people near it. If the cart remains unattended for a certain time interval (around 30s) it begins to roll back slowly.
A checkpoint system is used in case a team don't make it to the end before time runs out; checkpoints act as time extensions.
One particular map actually features a payload race where both teams get the cart and the first one to reach the other team's base wins.
So that's TF2 then, in a relatively sizeable nutshell.
How does DoTA differ? Totally.
DoTA is an RTS-style game, played from an overlord's perspective. There is only one objective, and in DoTA itself, only one map. HoN has three, but the others are very rarely played, as they seem not to live up to the exacting standards of extremely picky players.
You start with a prebuilt base which you cannot alter. It consists of a fountain, a shrine, a few random useless buildings, three sets of barracks (one for each lane) and 11 defense towers, 2 guarding the shrine, and 3 in each lane.
The objective of the game is to destroy the enemy shrine, which is only possible once most of the towers have been destroyed (all of the towers in a given lane, plus at least one of the towers defending the shrine); otherwise it is invulnerable to attack, to stop people teleporting in and grinding it down.
You play as a hero character. Heroes are sort of designed into classes, but the boundaries are not so clear cut. There are a little over 60 heroes in HoN (the current modern adaptation of DoTA), there were 100+ in the original DoTA, and there will be in DoTA 2 as well.
Heroes are characters with a large amount of health (400-600 to start with, which can be in excess of 4000 on occasion towards the end of a game), which you control RTS-style. While you can change class whenever you like in TF2, in DoTA you are locked to the same character for the course of the game, as your hero levels up.
With the exception of certain heroes' abilities, you only have one character to control.
The barracks spawn creeps every 30 seconds - weaker units which automatically charge at the enemy base. Until barracks are destroyed, the creeps are evenly matched, so you won't find them pushing the enemy base by themselves. However, leave a lane unguarded and a hero can easily wipe out the enemy creeps and help the friendlies push the lane towards the base.
Every hero has 4 unique abilities. These have all kinds of functions: some stun enemies, some slow them, some do damage or a combination of the above, some heal, and some grant other effects such as bonus damage to teammates.
One of the most fundamental parts of the game is in choosing a good team of 5 heroes for your team (the vast majority of games are 5v5, as it is the best mechanic for the map)
Heroes are divided into 3 primary attributes.
Strength
Agility
Intelligence
Every hero has these as a statistic, but every hero has one as a primary, which helps define their role.
Strength heroes are built like tanks. They have lots of health and are generally designed to try and soak up damage, allowing weaker heroes to survive.
Agility heroes are damage dealers. They attack hard and fast and most of the game's carries are agility (A carry is a hero that, when sufficiently levelled, can carry a weak team to victory, even if the game seems lost)
Intelligence heroes are spellcasters. These typically have the most exotic unique abilities, and are very useful to have in a teamfight.
Every hero has a few basic characteristics that define their usefulness:
Health - pretty obvious: when it runs out, the hero dies. Respawns are not instantaneous; the time they take scales with how long the game has been going and how high a level you are. You can 'buy back' for a large sum of gold if you're desperate and/or have the money, which provides an instant respawn from when you click.
Mana - Required to cast spells and use certain items.
Armor - the more you have, the less damage you take (see the quick calculation after this list)
Gold - the more you have, the more cool items you can buy. Items grant stats (+strength/agi/int) as well as some other raw stats (health, mana, damage) and in some cases, special abilities, for example, a heal. Everybody receives 1 gold per second, you get gold for killing creeps, towers and heroes, as well as for assisting with kills.
Damage - the more you have, the harder enemies get hit
Attack speed - the more you have, the faster you attack enemies.
Experience - experience means levels, you get it from being near creeps or enemies when they are killed. When you level up, apart from gaining more base stats, you are also able to level one of your unique abilities.
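As a quick illustration of how those stats interact, here's a toy calculation in Python. The 0.06-per-point armor formula is the one commonly quoted for DotA-style games; treat the constant as an assumption rather than gospel:

def effective_health(health, armor):
    # Each point of armor reduces physical damage taken by a diminishing
    # fraction: reduction = 0.06 * armor / (1 + 0.06 * armor)
    reduction = 0.06 * armor / (1 + 0.06 * armor)
    return health / (1 - reduction)

# A late-game strength hero with 4000 health and 20 armor effectively
# soaks ~8800 physical damage before it dies:
print(round(effective_health(4000, 20)))  # -> 8800

Which is why strength heroes can wade into a teamfight and soak damage while the squishier agility and intelligence heroes do the killing.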
I'd go on, but I think it's probably best to continue from a question/answer perspective :P
|
AfterDawn Addict
4 product reviews
|
19. October 2010 @ 13:24 |
Link to this message
|
Second textwall of the day, some hardware updates.
Coming in December, the Geforce GTX485.
Of course, it's not called the GTX485, it's called the GTX580. This is nvidia after all.
Price will be similar to that of the GTX480 launch, so around $600 or £440.
The HD6970's price is likely to be similar as well.
All the GTX580 is, is a 512-shader GTX480 with 2GB of memory.
Meanwhile all of the GTX400 series are being replaced with rebrands on a 384-bit bus width, so there will be a GTX570, 560, and probably a 576-core GTX590/680 within the next 6-8 months.
Now, the full version of Arcania: Gothic 4 as a benchmark, both on CPUs and on GPUs (HD6 series results are of course speculative). GT430 results are now more accurate, but still extrapolated.
Results for resolutions 1680x1050, 1920x1200 and 2560x1600 are extrapolated.
Low Quality
1280x1024 M10: Radeon X1950XT/HD2900GT/3690/4670/5570, Geforce 8800GS/9600GSO/GT220/430
1280x1024 M20: Radeon HD3870/4830/5670, Geforce 8800GT/9800GT/GT240/GTS450
1280x1024 M30: Radeon HD3870X2/4770/5750, Geforce 9800GTX+/GTS250/450
1280x1024 M40: Radeon HD3870X2/4870/5770, Geforce 9800GX2/GTX280/460
1280x1024 M50: Radeon HD4890/5830/6850, Geforce GTX285/460
1280x1024 M60: Radeon HD4850X2/5850/6850, Geforce GTX295/470
1680x1050 M10: Radeon X1950XT-X/HD2900Pro/3850/4670/5570, Geforce 8800GS/9600GSO/GT240/430
1680x1050 M20: Radeon HD3870/4830/5670, Geforce 8800GTS 512/9800GT/GTS250/450
1680x1050 M30: Radeon HD3870X2/4770/5770, Geforce 9800GX2/GTX260/GTS450
1680x1050 M40: Radeon HD4890/5830/6850, Geforce GTX275/460
1680x1050 M50: Radeon HD4850X2/5830/6850, Geforce GTX295/470
1680x1050 M60: Radeon HD4870X2/5850/6850, Geforce GTX295/480
1920x1080 M10: Radeon HD2900Pro/3850/4670/5570, Geforce 8800GS/9600GSO/GT240/GTS450
1920x1080 M20: Radeon HD3870X2/4830/5750, Geforce 8800GTX/9800GTX/GTS250/450
1920x1080 M30: Radeon HD3870X2/4870/5830/6850, Geforce GTX280/460
1920x1080 M40: Radeon HD4850X2/5830/6850, Geforce GTX295/460 1GB (not 465)
1920x1080 M50: Radeon HD4870X2/5850/6870, Geforce GTX295/480
1920x1080 M60: Radeon HD6950/2x5830/4890, Geforce GTX480
1920x1200 M10: Radeon HD2900XT/3850/4670/5570, Geforce 8800GS/9600GSO/GT240/GTS450
1920x1200 M20: Radeon HD3870X2/4770/5750, Geforce 9800GTX+/GTS250/450
1920x1200 M30: Radeon HD4870/5830/6850, Geforce GTX280/460
1920x1200 M40: Radeon HD4850X2/5830/6850, Geforce GTX295/470
1920x1200 M50: Radeon HD4870X2/5870/6870, Geforce GTX295/480
1920x1200 M60: Radeon HD6950/2x5830/4890, Geforce GTX580/2x460
2560x1600 M10: Radeon HD3870/4830/5670, Geforce 8800GT/9800GT/GTS250/450
2560x1600 M20: Radeon HD4870/5830/6850, Geforce GTX280/460
2560x1600 M30: Radeon HD4850X2/5850/6850, Geforce GTX295/470
2560x1600 M40: Radeon HD6950/2x5830/4890, 2xGeforce GTX460. GTX580 possible
2560x1600 M50: Radeon HD5970/6970, 2xGeforce GTX470
2560x1600 M60: 2xRadeon HD5870/6870, 2xGeforce GTX480
Medium Quality
1280x1024 M10: Radeon HD2900Pro/3850/4670/5570, Geforce 8800GS/9600GSO/GT240/GTS450
1280x1024 M20: Radeon HD3870X2/4830/5750, Geforce 8800 Ultra/9800GTX/GTS250/450
1280x1024 M30: Radeon HD3870X2/4870/5830/6850, Geforce GTX280/460
1280x1024 M40: Radeon HD4850X2/5830/6850, Geforce GTX295/470
1280x1024 M50: Radeon HD4870X2/5850/6870, Geforce GTX295/480
1280x1024 M60: Radeon HD6950/2x5830/4890, Geforce GTX580/2x275
1680x1050 M10: Radeon HD2900XT/3850/4670/5570, Geforce 8800GT/9600GT/GT240/GTS450
1680x1050 M20: Radeon HD3870X2/4770/5750, Geforce 9800GX2/GTX260/GTS450
1680x1050 M30: Radeon HD4890/5830/6850, Geforce GTX285/460
1680x1050 M40: Radeon HD4850X2/5850/6850, Geforce GTX295/470
1680x1050 M50: Radeon HD4870X2/5870/6950, Geforce GTX480
1920x1080 M10: Radeon HD2900XT/3850/4670/5570, Geforce 8800GT/9800GT/GT240/GTS450
1920x1080 M20: Radeon HD3870X2/4860/5770, Geforce 9800GX2/GTX260-216/460
1920x1080 M30: Radeon HD4850X2/5830/6850, Geforce GTX295/470
1920x1080 M40: Radeon HD4870X2/5870/6870, Geforce GTX295/480
1920x1080 M50: Radeon HD6950/2x5830, 2xGeforce GTX465
1920x1080 M60: Radeon HD5970/6970, 2xGeforce GTX470
1920x1200 M10: Radeon HD2900XT/3870/4670/5570, Geforce 8800GT/9800GT/GT240/GTS450
1920x1200 M20: Radeon HD3870X2/4870/5830/6850, Geforce 9800GX2/GTX280/460
1920x1200 M30: Radeon HD4850X2/5850/6850, Geforce GTX295/470
1920x1200 M40: Radeon HD6950/2x5830/4890, Geforce GTX480
1920x1200 M50: Radeon HD6970/2x5830, 2xGeforce GTX470
1920x1200 M60: Radeon HD5970/2x6870, 2xGeforce GTX480
2560x1600 M10: Radeon HD3870X2/4770/5750, Geforce 9800GTX+/GTS250/450
2560x1600 M20: Radeon HD4850X2/5850/6850, Geforce GTX295/470
2560x1600 M30: Radeon HD6950/2x5830, 2x Geforce GTX460. GTX580 possible
2560x1600 M40: Radeon HD5970/2x6870, 2x Geforce GTX480
2560x1600 M50: 2x Radeon HD6950/3x5850, 2x Geforce GTX480
2560x1600 M60: 2x Radeon HD6970/3x5870, 3x Geforce GTX480
High Quality
1280x1024 M10: Radeon HD2900XT/3850/4670/5570, Geforce 8800GT/9800GT/GT240/GTS450
1280x1024 M20: Radeon HD3870X2/4770/5770, Geforce 9800GX2/GTX260/GTS450
1280x1024 M30: Radeon HD4850X2/5830/6850, Geforce GTX295/460
1280x1024 M40: Radeon HD4870X2/5870/6870, Geforce GTX295/480
1280x1024 M50: Radeon HD6950/2x5830/4890, Geforce GTX580/2xGTX285
1280x1024 M60: Radeon HD5970/6970, 2x Geforce GTX470
1680x1050 M10: Radeon HD2900XT/3870/4830/5670, Geforce 8800GT/9800GT/GTS250/450
1680x1050 M20: Radeon HD4870/5830/6850, Geforce GTX260-216/460
1680x1050 M30: Radeon HD4850X2/5850/6850, Geforce GTX295/470
1680x1050 M40: Radeon HD6950/2x5830/4890, Geforce GTX480
1680x1050 M50: Radeon HD6970/2x5830, 2x Geforce GTX460 1GB
1680x1050 M60: Radeon HD5970/2x6870, 2x Geforce GTX470
1920x1080 M10: Radeon HD3870X2/4830/5670, Geforce 8800GTS 512/9800GTX/GTS250/450
1920x1080 M20: Radeon HD4890/5830/6850, Geforce GTX275/460
1920x1080 M30: Radeon HD4870X2/5870/6870, Geforce GTX295/480
1920x1080 M40: Radeon HD6970/2x5830, 2x Geforce GTX465
1920x1080 M50: Radeon HD5970/2x6870, 2x Geforce GTX470
1920x1080 M60: 2x Radeon HD6950/3x5830, 2x Geforce GTX480
1920x1200 M10: Radeon HD3870X2/4830/5750, Geforce 8800GTS 512/9800GTX/GTS250/450
1920x1200 M20: Radeon HD4850X2/5830/6850, Geforce GTX295/460
1920x1200 M30: Radeon HD4870X2/5870/6950, Geforce GTX295/480
1920x1200 M40: Radeon HD6970/2x5830, 2x Geforce GTX460 1GB
1920x1200 M50: 2xRadeon HD6870/5870, 2x Geforce GTX480
1920x1200 M60: 2xRadeon HD6950/3x5850, 2x Geforce GTX480
2560x1600 M10: Radeon HD3870X2/4860/5770, Geforce 9800GX2/GTX260-216/GTS450
2560x1600 M20: Radeon HD4870X2/5870/6950, Geforce GTX295/480
2560x1600 M30: Radeon HD5970/2x6850, 2x Geforce GTX470
2560x1600 M40: 2x Radeon HD6850/3x5850, 2x Geforce GTX480
2560x1600 M50: 2x Radeon HD6970/3x5870, 3x Geforce GTX480
2560x1600 M60: 3x Radeon HD6950/4x5870, 4x Geforce GTX480
Very High Quality
1280x1024 M10: Radeon HD2900XT/HD3870/4830/5670, Geforce 8800GT/9800GT/GTS250/450
1280x1024 M20: Radeon HD4870/5830/6850, Geforce GTX280/460
1280x1024 M30: Radeon HD4850X2/5850/6850, Geforce GTX295/470
1280x1024 M40: Radeon HD6950/2x5830/4890, Geforce GTX480
1280x1024 M50: Radeon HD5970/6970, 2x Geforce GTX460 1GB
1280x1024 M60: 2xRadeon HD5870/6870, 2x Geforce GTX470
1680x1050 M10: Radeon HD3870X2/4830/5670, Geforce 8800GTS 512/9800GTX/GTS250/450
1680x1050 M20: Radeon HD4890/5830/6850, Geforce GTX285/460
1680x1050 M30: Radeon HD4870X2/5870/6870, Geforce GTX295/480
1680x1050 M40: Radeon HD6970/2x5830, 2x Geforce GTX460
1680x1050 M50: Radeon HD5970/2x6870, 2x Geforce GTX470
1680x1050 M60: 2x Radeon HD6950/3x5850, 2x Geforce GTX480
1920x1080 M10: Radeon HD3870X2/4770/5750, Geforce 8800GTX/9800GTX/GTS250/450
1920x1080 M20: Radeon HD4850X2/5830/6850, Geforce GTX295/465
1920x1080 M30: Radeon HD6950/2x5830/4890, Geforce GTX480
1920x1080 M40: Radeon HD5970/2x6850, 2x Geforce GTX470
1920x1080 M50: 2x Radeon HD6950/3x5830, 2x Geforce GTX480
1920x1080 M60: 2x Radeon HD6970/3x5850, 3x Geforce GTX470
1920x1200 M10: Radeon HD3870X2/4770/5750, Geforce 9800GTX+/GTS250/450
1920x1200 M20: Radeon HD4850X2/5850/6850, Geforce GTX295/470
1920x1200 M30: Radeon HD6950/2x5830, Geforce GTX580/2x GTX285/460
1920x1200 M40: Radeon HD5970/2x6870, 2x Geforce GTX470
1920x1200 M50: 2x Radeon HD6950/3x5850, 2x Geforce GTX480
1920x1200 M60: 2x Radeon HD6970/3x5870, 3x Geforce GTX480
2560x1600 M10: Radeon HD4890/5830/6850, Geforce GTX280/460
2560x1600 M20: Radeon HD6970/2x5830, 2x Geforce GTX285/460
2560x1600 M30: 2x Radeon HD5870/6950, 2x Geforce GTX480
2560x1600 M40: 2x Radeon HD6970/3x5870, 3x Geforce GTX480
2560x1600 M50: 3x Radeon HD6950/4x5870, 4x Geforce GTX470
2560x1600 M60: 3x Radeon HD6970, 4x Geforce GTX480
Low Quality CPU test
Single core
M10: Single core of QX9770/i7 920/i5 750
M20 and above: N/A
Dual core:
M10: Pentium E2160, Core 2 Duo E4300, Athlon64 X2 3800+, Pentium D 950 (3.2GHz)
M20: Core 2 Duo E8400, E6850 OC @ 3.3GHz, any Core i5 dual core, Phenom II X2 555 OC @ 3.3GHz
M30: Core i5 dual core @ 3GHz with HT enabled, 3.9GHz without
Tri-core:
M10: Any 3-core processor
M20: 3 cores of any Intel quad core CPU, any Athlon II X3
M30: 3 cores of Q8400+, QX6850+, any Core i series quad core
M40: 3 cores of Q9650 OC @ 3.45GHz, any Core i series quad core
M50: 3 cores of i7 950 without HT, any Core i7 with HT
M60: 3 cores of i7 950 with HT
Quad-core:
M10: Any quad core processor
M20: Phenom 9650 or above
M30: Phenom 9850 or above
M40: Core 2 Quad Q9450/Phenom II X4 940/QX6850
M50: Core 2 Quad Q9650 OC @ 3.3GHz/any Core i series quad core CPU (HT off), Core i7 930 with HT on
M60: Core i7 960 @ 3.33GHz with HT on, Core i7 920/i5 750 with HT off
Oh, and by the way, I found the source of the parasitic glitch with ATI drivers: it's the Steam installer. If you install an ATI driver using Steam, should a driver install ever bug out, you'll be stuck. Avoid the Steam drivers!
|
AfterDawn Addict
|
19. October 2010 @ 13:31 |
Link to this message
|
Man, the GTX580, the whole 6-series naming waste, and the possibility of a 5770-to-6770 rebrand. What a sad "generation"
MGR (Micro Gaming Rig) .|. Intel Q6600 @ 3.45GHz .|. Asus P35 P5K-E/WiFi .|. 4GB 1066MHz Geil Black Dragon RAM .|. Samsung F60 SSD .|. Corsair H50-1 Cooler .|. Sapphire 4870 512MB .|. Lian Li PC-A70B .|. Be Queit P7 Dark Power Pro 850W PSU .|. 24" 1920x1200 DGM (MVA Panel) .|. 24" 1920x1080 Dell (TN Panel) .|.
|
AfterDawn Addict
4 product reviews
|
19. October 2010 @ 13:33 |
Link to this message
|
Rebrands come from stagnation, and that's exactly what this is. Without a process drop, there's little else that can be done, save expanding the die size to the maximum - something nvidia did right from the off with the GTX480, but something ATI can glean a large amount of performance out of. The irony is it makes a mockery of their ad campaign touting how efficient their cards are. The 6970's going to be a big 250W+ hog like the 480. Hopefully, though, it'll be substantially faster, and quieter.
|
harvrdguy
Senior Member
|
19. October 2010 @ 16:47 |
Link to this message
|
Originally posted by sam: The 6970's going to be a big 250W+ hog like the 480. Hopefully though, it'll be substantially faster, and quieter.
Sounds like you don't yet have the performance numbers. What kind of price do you think they'll come up with - it's a two-GPU board, right?
On another note, Miles mentioned that he playtested Dota2 just once, last week, and found it way complicated. He mentioned that he is sure it is great, since the guys at Valve who are into it are totally addicted, but he found it Meh.
Hey Sam, how about a one-paragraph description of HoN/DotA, which you have logged 300 hours on. Why is it not Meh?
LOL
|
AfterDawn Addict
4 product reviews
|
19. October 2010 @ 16:49 |
Link to this message
|
No, the 6970 is a single GPU. It will be about 45% faster than a 5870.
|
harvrdguy
Senior Member
|
19. October 2010 @ 18:23 |
Link to this message
|
Excellent!
|
AfterDawn Addict
7 product reviews
|
20. October 2010 @ 14:32 |
Link to this message
|
Hey guys, what do you use to monitor temperatures while playing games, or perhaps watching movies? Is there a program that can display a northbridge/motherboard temperature while playing back a movie in fullscreen mode? The computer I'm working on crashes, and I'm uncertain what the temperature is when it crashes. It completely freezes when it does crash. So I'm confident that if the temperature is being displayed at the time of the crash, I'll know then ;)
Apparently the HTPC is suffering from inadequate airflow; the PSU's exhaust is not cutting it. I'm gonna fashion a double-fan exhaust near the CPU to pull that area's heat out of the case. If I leave the panel off the HTPC, under load it only reaches 44-45C. But if I put it on, it overheats. I believe the GPU is blocking the northbridge from getting adequate air.
I removed the GPU (GF210) and set a 70mm fan on top of the NB sink. It drastically improved temps. How I'm gonna adapt VGA D-sub to HDMI is beyond me. VGA D-sub is different than HD-15 VGA. At least I think there's a difference; I was comparing adapters before I bought the GF210. Surely there's a way to adapt analog VGA straight to HDMI?????
To delete, or not to delete. THAT is the question!
This message has been edited since posting. Last time this message was edited on 20. October 2010 @ 16:43
|
Red_Maw
Senior Member
|
20. October 2010 @ 18:28 |
Link to this message
|
I just set the program to log the temperature to a file, so I can go back after the crash and see if there's any correlation. For short periods of time I usually use Everest, since you can log pretty much everything.
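If you'd rather roll your own logger than leave Everest running, here's a rough sketch. It assumes Open Hardware Monitor is running (it publishes its sensors over WMI in the root\OpenHardwareMonitor namespace) and that Python with the third-party wmi package is installed; the file name is arbitrary:

import csv, time
import wmi  # third-party package: pip install wmi

w = wmi.WMI(namespace="root\\OpenHardwareMonitor")
with open("templog.csv", "a", newline="") as f:
    out = csv.writer(f)
    out.writerow(["time", "sensor", "temp_C"])
    while True:
        for s in w.Sensor():
            if s.SensorType == "Temperature":
                out.writerow([time.strftime("%H:%M:%S"), s.Name, s.Value])
        f.flush()  # force the last reading to disk before a hard freeze
        time.sleep(1)

The flush is the important bit: after a hard lock-up, the only readings you'll see are the ones that already hit the disk.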
|
AfterDawn Addict
7 product reviews
|
20. October 2010 @ 18:45 |
Link to this message
|
That's useful. I thought to do that with SpeedFan, but it's not capable of monitoring NB temps :(
I'll look for real-time temperature monitors tonight. I'd really prefer a real-time readout on the screen. I'm surprised Fraps hasn't done this yet :/
To delete, or not to delete. THAT is the question!
|
AfterDawn Addict
15 product reviews
|
20. October 2010 @ 21:32 |
Link to this message
|
If you have a separate monitor anywhere in the house, that would work, at least long enough to find out the issue.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
|
harvrdguy
Senior Member
|
21. October 2010 @ 00:00 |
Link to this message
|
Logging to a file sounds good - I was doing just some very short-term logging tonight comparing HD upscaling - but that Everest might work for you.
But on analog VGA to digital HDMI, I think you're talking about a $300 converter. Going the other direction, from digital to VGA analog, which we do all the time when a graphics card comes only with DVI ports, is of course just a $2 piece of hardware that shorts some connection and tells the graphics card to shove analog out the port, not digital. But when I looked it up about a year ago, to find out why I couldn't get a cheap connection to do the opposite, like you're asking now, I encountered a whole different ball of wax, and the solutions involved active components that are expensive.
On another note, Kevin, since you're really into HTPC, I want to ask you what software you think I should get for DVD upscaling.
I was fooling around with the Phenom today, the 9750 - wondering why DivX Media Player Plus won't play DVDs. The DivX player is wonderful about upscaling a video to HD - we have an avi called Earth that displays so beautifully on the 1920x1080 47" Toshiba that when you're running it on the DivX player, you almost think it's HD.
Anyway, I did a lot of reading today, and two players were mentioned the most: ArcSoft, and PowerDVD 10 with 3D imaging. There was a lot of talk about ffdshow - which I guess is for the more sophisticated guys like you. ArcSoft was originally just for nvidia CUDA, and not for ATI Stream processing, but they said they have added support for ATI cards. However, on the P4 with the 3850, and on the Phenom with the 3650, ArcSoft downloaded an ActiveX control that said their product, called HDSim (high-def simulation), would not run on either computer. Weird! (I know the HD3850 is about twice as powerful as the HD3650, but I have read all over the net that the 3650 is quite a capable DXVA card.)
So that made me suspicious that maybe PowerDVD was lying when they said they supported ATI cards. But they let you download the 120MB, $50 PowerDVD 10 for free, for a 30-day trial.
To my eyes and others' eyes, the picture is much better than, for example, VLC player, which is playing the western DVD Broken Trail just fine. (Great DVD, by the way, if you haven't watched it.)
I used GPU-Z to log about 60 seconds of GPU load, and then later I logged 30 seconds of CPU load using Task Manager.
The first surprise was that the GPU load of VLC player was nearly twice the GPU load of PowerDVD - maybe 20% or so, holding pretty steady, versus maybe 10-14% for PowerDVD, jumping up and down. Here's the GPU load of VLC:
And here's the GPU load of the PowerDVD player:
And here's the comparison with both graphs magnified about 8 times:
That was wayyy counter-intuitive to me, because I knew PowerDVD was doing a lot of processing - you could definitely see better quality in the upscaling it was doing.
And it wasn't my imagination. A guy named Thomas walked in when the VLC player was running, then saw me crank up the same part of the DVD with PowerDVD, and blurted out loud, "That is a great picture."
I thanked him for commenting and told him I was trying to compare the current player, PowerDVD, to a player that likely wasn't doing much if any upscaling. Tommy said, "Was that what was just playing when I first walked in?"
I said, "Yes, I'll show you the difference." He said, "You don't need to show me - I have an eye for that and I noticed the difference right away." And then I remembered - he used to install high-end home theater systems. He said that he could easily see that the picture quality was far superior with the PowerDVD player.
Then I showed him the results of the GPU-Z logging, where I had saved both sets of graphs in a rich text file. I told him I couldn't understand why the player with hardly any upscaling was loading the GPU twice as much as the better-quality player.
I said, "I'm going to take a look now at the CPU loading." Tom isn't too computer savvy. I showed him the Task Manager.
I said, "I'm gonna widen this thing, which is going to give me about 30 seconds of logging, and then I will place the graphs for safe keeping in this text file with the GPU charts. Maybe the CPU logging will explain how PowerDVD is getting all the work done to upscale that DVD to HD."
Tom said he had other things to do, but he'd be back to see the results. After 30 seconds, here is how VLC player loads the quad-core Phenom 9750:
Then I ran PowerDVD on roughly the same scene:
Well, that tells the whole story right there!! (The keyboard sits in a dark cabinet - you see some spikes and a delay as I turn off the player and try to correctly hit the Alt-PrintScreen keys.)
Jeeez Loueeeeez! VLC loads only one core, at about 40%; the other three cores are idle. PowerDVD, on the other hand, loads all four cores at about 30% each. So PowerDVD is using at least three times more CPU processing power, balanced quite evenly over all four cores, and doesn't need to load the 3650 much at all.
Anyway, I found that interesting for some reason.
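For what it's worth, that per-core pattern is easy to capture without squinting at Task Manager. A rough sketch using Python's third-party psutil package (the 30-sample window just mirrors the Task Manager graphs above):

import psutil  # third-party package: pip install psutil

# One utilisation figure per core, sampled once a second for 30 seconds,
# while the player under test is running.
samples = [psutil.cpu_percent(interval=1, percpu=True) for _ in range(30)]

for core in range(len(samples[0])):
    avg = sum(s[core] for s in samples) / len(samples)
    print("core %d: %.1f%% average load" % (core, avg))

One busy core with the rest idle is the single-threaded VLC pattern; four roughly even loads is what a multithreaded player like PowerDVD shows.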
So, Kevin, what player should I really get? Do you like the PowerDVD player? It has a little 3D button on the bottom, and when I pressed it, the image turned sort of black and white on the screen, which tells me that if I had a pair of 3D glasses, I might have seen something interesting. We saw Avatar in 3D - where did I put those colored glasses, lol?
Rich
This message has been edited since posting. Last time this message was edited on 21. October 2010 @ 00:15
|
AfterDawn Addict
15 product reviews
|
21. October 2010 @ 00:34 |
Link to this message
|
I really like Media Player Classic with the K-Lite codec pack installed. The upscaling is beautiful and you DON'T HAVE TO PAY. Of course VLC's image quality sucks, I could've told you that. There is no comparison between VLC and a proper media player.
FFDshow is the media engine which media players like Windows Media Player, Media Player Classic, PowerDVD and many others use. It is not a media player on its own, like you guessed it was. Usually, when you install codecs or play with video color settings in the graphics control panel, the effects are meant to be applied through FFDshow. So when you see something mentioning FFDshow, it is simply referencing the engine, not a specific codec or media player.
VLC is very useful for files that no other media player will play. Beyond that it is entirely useless due to the aforementioned image quality and several compatibility issues with many types of features. The inability to properly utilize dual-audio MKVs is one that stuck out in my mind. VLC is an entire video engine on its own though. It is completely isolated from FFDshow and like media engines, and as such will not be affected by video enhancements or codecs. It simply plays the raw file. This of course also means it does no true upscaling.
And for the record, paying for a media player is for chumps. The most powerful, flexible, compatible media player that exists is absolutely free. Try installing Media Player Classic with the K-Lite codec pack, then go into MPC's options and enable DXVA. I believe it's View -> Options -> Playback -> Output. Set everything to VMR9 (Renderless) or DirectX 9, then set it to "Use texture surfaces and render video in 3D." This should fully enable DXVA.
Also keep in mind that using AA with DXVA enabled causes unnecessary strain on the GPU with no visual effect at all. If you are going to use DXVA, disable AA in the graphics control panel. This applies to all DXVA-capable media players except VLC.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
This message has been edited since posting. Last time this message was edited on 21. October 2010 @ 00:35
|
AfterDawn Addict
7 product reviews
|
21. October 2010 @ 00:38 |
Link to this message
|
I haven't used PowerDVD in quite some time, and when I started playing BD discs, a lot of my searching seemed to point to ArcSoft TMT. I tried it, and loved it. DVDs never look like HD on my screen, but I am only 2 feet from it, and I'm highly critical about how DVDs look. I've become spoiled by true full HD :p Even 720p is much better than DVD. In fact, I really gotta look to tell the difference between 720p and 1080p on my screen.
ArcSoft TMT rarely crashes. It has happened, but I think PowerDVD caused me more trouble in the past. That was some time ago, though. Perhaps I'll try them again ;)
I've never seen a really good upscale. The Blu-ray/DVD upscaler in the living room supposedly upscales; I don't see it. Looks like a DVD to me :p
Using a second monitor is too much trouble...
To delete, or not to delete. THAT is the question!
This message has been edited since posting. Last time this message was edited on 21. October 2010 @ 00:39
|
Senior Member
4 product reviews
|
21. October 2010 @ 03:23 |
Link to this message
|
DVI-D and HDMI work the same way signaling-wise; hence there are DVI-D to HDMI adapters. VGA and DVI-A do not work the same way. DVI-A's wiring is the same as, or close to, DVI-D's, but the topology is different.
So to get VGA to HDMI you need a converter box, and those are expensive; you'll also lose HDCP and other HDMI enhancements. Not the way to go, for sure.
|
AfterDawn Addict
7 product reviews
|
21. October 2010 @ 03:37 |
Link to this message
|
Originally posted by DXR88: DVI-D and HDMI work the same way signaling-wise; hence there are DVI-D to HDMI adapters. VGA and DVI-A do not work the same way. DVI-A's wiring is the same as, or close to, DVI-D's, but the topology is different.
So to get VGA to HDMI you need a converter box, and those are expensive; you'll also lose HDCP and other HDMI enhancements. Not the way to go, for sure.
Are you 100% sure about the expensive converter box? I've been looking at an alternative that would only cost me a fraction of that :p It may not be as crisp as HD, but it would still blow DVD out of the water ;)
To delete, or not to delete. THAT is the question!
|
AfterDawn Addict
4 product reviews
|
21. October 2010 @ 10:09 |
Link to this message
|
D-sub and VGA are one and the same; VGA doesn't use any other connector.
VGA is, I would argue, the slang term, since VGA officially refers only to a resolution of 640x480.
Second monitors are mega handy for this sort of thing.
Rich has it pretty much spot on with the video connectors. 'Digital to analog' is a bit false, as graphics cards can output analog natively anyway, just through a digital port like DVI. An analog-only output, however, needs a digital interpreter to become a digital signal, and that's what is expensive.
I have used, but dislike, FFDShow. It's now included as standard with Media Player Classic HC - annoying, as it has several flaws.
I typically prefer CoreAVC, but it is not as compatible (for example, no native FLV support).
GPU decoding is miles more efficient than CPU decoding, but it obviously turns out that whichever program is using GPU decoding here is using a naff codec to do it.
|
AfterDawn Addict
7 product reviews
|
21. October 2010 @ 12:44 |
Link to this message
|
I realize that VGA and D-sub are the same. But HD-15 and D-sub are not the same, apparently. You cannot hook an HD-15 connector to standard VGA and expect it to work. I have an HD-15 to HDMI adapter cable that will not relay a video signal from a D-sub port. Because it's an analog signal? Can the port itself be taught to send digital?
I have in my possession an 11-pin DVI male to VGA adapter that came with my Asus 8600GT, which would connect to the following DVI female to HDMI adapter:
http://www.newegg.com/Product/Product.aspx?Item=N82E16812270288
Would that not work? I wonder if I'm missing something. :S
I'm sorry. I'm trying to understand, but clearly this is eluding me...
My brother's last HDTV was a Dynex. It would accept standard VGA analog signals. I miss that TV, LOL! If he still had it, I wouldn't have to worry about the damn GF210 - I would never have bought it! The signal was only slightly fuzzy; they thought it looked fine. I, however, am critical ;)
To delete, or not to delete. THAT is the question!
This message has been edited since posting. Last time this message was edited on 21. October 2010 @ 14:43
|
|