The Official PC building thread - 4th Edition

AfterDawn Addict
15 product reviews
13. October 2011 @ 09:19
Russ, I don't see any bias there. They're testing CPUs at stock speeds. Also, they're testing the fastest CPUs AMD has ever made against decidedly not Intel's best, disregarding the 990X. If anything, the test is biased heavily in AMD's favor. The AMD CPUs are just slower. This is a fact. I don't know what you keep looking for, Russ. Finding the right graph won't suddenly make AMD CPUs faster.
Also, I see nothing about the boards you mentioned at all, never mind the fact that none of the CPUs in the tests shown would even work in those boards. So I really don't know what angle you're going for.
I mean, why do you keep going on about it not being a fair test? Is it not a fair test unless AMD wins? If I were shown a graph with AMD winning, common sense tells me that's the unfair test. Every single benchmark we've seen so far has shown Bulldozer getting its ass handed to it. Are you trying to say that every single reputable review site is in Intel's pockets and that's why Bulldozer sucks? Sorry Russ, but Intel bribing a review site doesn't suddenly make a CPU better or worse. If Bulldozer sucks, there's nothing a slightly skewed graph is going to do to change it.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
AfterDawn Addict
13. October 2011 @ 09:30
GigaByte 990FXA-UD5 - AMD FX-8320 @4.0GHz @1.312v - Corsair H-60 liquid CPU Cooler - 4x4GB GSkill RipJaws DDR3/1866 CAS8, 8-9-9-24 - Corsair 400-R Case - OCZ FATAL1TY 550 watt Modular PSU - Intel 330 120GB SATA III SSD - WD Black 500GB SATA III - WD Black 1TB SATA III - WD Black 500GB SATA II - 2 Asus DRW-24B1ST DVD-Burner - Sony 420W 5.1 PL-II Surround Sound - GigaByte GTX550/1GB 970MHz Video - Asus VE247H 23.6" HDMI 1080p Monitor
AfterDawn Addict
13. October 2011 @ 09:51
For the CPU argument: I used to have an E5200 at 3.5GHz, which is a dual core chip. I then added an extra 4870 into CrossFire, and in Battlefield: Bad Company 2 I saw near enough no increase, maybe 5fps. I then added a Q6600, clocked it to 3.4GHz, and it damn near doubled my frame rates.
CPU bottleneck indeed.
Bulldozer is only good in insanely multithreaded programs. So what if it wins in the one or two programs you use, compared to the tens to hundreds that end up worse off than on the 1100T?
Not to mention that Sandy Bridge-E will be coming out very soon to widen the gap further, though that's at a different price point.
MGR (Micro Gaming Rig) .|. Intel Q6600 @ 3.45GHz .|. Asus P35 P5K-E/WiFi .|. 4GB 1066MHz Geil Black Dragon RAM .|. Samsung F60 SSD .|. Corsair H50-1 Cooler .|. Sapphire 4870 512MB .|. Lian Li PC-A70B .|. Be Quiet P7 Dark Power Pro 850W PSU .|. 24" 1920x1200 DGM (MVA Panel) .|. 24" 1920x1080 Dell (TN Panel) .|.
AfterDawn Addict
13. October 2011 @ 10:15
Originally posted by sammorris: Originally posted by theonejrs: Sam,
Take a look at the chip lineup on your graph. Look at all the crap they show for AMD CPUs. A 3.0GHz Phenom IIx4 940 BE, an obsolete chip, not even produced anymore. Then there's another obsolete, no longer produced chip in the 2.6GHz Athlon IIx4 620, and yet another in the Phenom IIx2 550! There seems to be a little bias towards Intel, but I'm sure you won't find anything wrong with the way they've done things though. All the Intel chips shown are modern except for the Q9550 and the E8400. Where are the Phenom IIx4s, x3s, and x2s? Where's the 1090T BE, which is a better all-around chip than the 1100T BE?
Then there's the motherboards. They chose an obsolete GIGABYTE GA-MA790GP-DS4H AM2+/AM2 AMD 790GX with DDR2 to test the AMDs, but chose a more modern P55-GD55 motherboard with DDR3 to test the Intels. But it's a fair example, right? Not by a long shot! Get rid of all the junk and give us an honest showing next time!
Russ
The P55-GD55 is as old as LGA1156 gets; it was one of the early boards from late 2009. On top of that, you can't test Sandy Bridge CPUs in an LGA1156 board as it's a different socket!
Meanwhile, in the land of people that can actually read:
Originally posted by bit-tech: Testing
To test the AMD processors we used the Asus Crosshair V Formula motherboard which is based on the AMD 990FX chipset and Socket AM3+ CPU socket.
For the Intel LGA1155 processors, we used the Asus Maximus IV Gene-Z and for the LGA1366 processors, we used the Asus Sabertooth X58. All three motherboards have proved to be excellent overclockers, and yet cost a reasonable amount, so are great options for everyday PCs running overclocked CPUs.
Originally posted by techreport: http://techreport.com/articles.x/21813/5
You'll notice the use of an 890GPA-UD3H for the AMDs, a Crosshair 5 Formula for the Bulldozers, a DX58SO2 for the i7 6-core, a P5E3 Premium for the Core 2s and a P7P55D-E Pro for the Sandy Bridge CPUs. All seems fair to me.
Originally posted by HardOCP: http://hardocp.com/article/2011/10/11/amd_bulldozer_fx8150_desktop_performance_review/3
Here, a Crosshair 5 Formula is used for all the AMDs, and an MSI Z68A-GD65 for the Sandy Bridge CPUs.
Funny, not an MSI P55 or 790 chipset AMD board to be seen!
Sam,
First, please watch your wise mouth and respect your elders. I can read perfectly well, thank you! Please look carefully at the top of the graph!
There's also a Gigabyte MA785GT listed, but it's not a valid motherboard. I forgot the MSI X58 Eclipse Plus, a model that's obsolete as well, but still DDR3! Also note that there is a grand total of one modern AMD CPU, and that's the 1100T; the other three AMD CPUs are all obsolete and out of production. I stand by what I said!
Russ
AfterDawn Addict
4 product reviews
13. October 2011 @ 11:13
Originally posted by shaffaaf: For the CPU argument: I used to have an E5200 at 3.5GHz, which is a dual core chip. I then added an extra 4870 into CrossFire, and in Battlefield: Bad Company 2 I saw near enough no increase, maybe 5fps. I then added a Q6600, clocked it to 3.4GHz, and it damn near doubled my frame rates.
CPU bottleneck indeed.
Bulldozer is only good in insanely multithreaded programs. So what if it wins in the one or two programs you use, compared to the tens to hundreds that end up worse off than on the 1100T?
Not to mention that Sandy Bridge-E will be coming out very soon to widen the gap further, though that's at a different price point.
Same story here Shaff, when I first went crossfire with the HD4870X2 I saw almost no benefit at all with my E4300, even clocked all the way to 75% beyond its normal speed. Once I stuck a Q6600 in there, and latterly a Q9550, the frame rates jumped enormously.
Originally posted by theonejrs: Sam,
First, please watch your wise mouth and respect your elders. I can read perfectly well, thank you! Please look carefully at the top of the graph!
Russ, with all due respect, I have to be brash sometimes to get your attention. Every discussion we have on these matters is littered with mistakes from you, and by the time we've cleared these up, the original argument is lost in pages of back and forth about who said what and who meant what. The only reason I posted that chart from GameGPU was to illustrate that some games can be extremely demanding on the CPU and it's not all just about the graphics! Did you not notice there's no Bulldozer on that chart?
AfterDawn Addict
13. October 2011 @ 11:50
Originally posted by sammorris: Originally posted by shaffaaf: For the CPU argument: I used to have an E5200 at 3.5GHz, which is a dual core chip. I then added an extra 4870 into CrossFire, and in Battlefield: Bad Company 2 I saw near enough no increase, maybe 5fps. I then added a Q6600, clocked it to 3.4GHz, and it damn near doubled my frame rates.
CPU bottleneck indeed.
Bulldozer is only good in insanely multithreaded programs. So what if it wins in the one or two programs you use, compared to the tens to hundreds that end up worse off than on the 1100T?
Not to mention that Sandy Bridge-E will be coming out very soon to widen the gap further, though that's at a different price point.
Same story here Shaff, when I first went crossfire with the HD4870X2 I saw almost no benefit at all with my E4300, even clocked all the way to 75% beyond its normal speed. Once I stuck a Q6600 in there, and latterly a Q9550, the frame rates jumped enormously.
Originally posted by theonejrs: Sam,
First, please watch your wise mouth and respect your elders. I can read perfectly well, thank you! Please look carefully at the top of the graph!
Russ, with all due respect, I have to be brash sometimes to get your attention. Every discussion we have on these matters is littered with mistakes from you, and by the time we've cleared these up, the original argument is lost in pages of back and forth about who said what and who meant what. The only reason I posted that chart from GameGPU was to illustrate that some games can be extremely demanding on the CPU and it's not all just about the graphics! Did you not notice there's no Bulldozer on that chart?
Sam,
With all due respect, I understood you were only using that chart as an example. All I was pointing out was its very definite leaning toward favoring the Intels. It still turned out to be a poor example, though, for an entirely different reason than the one it was posted for. I thought it was rather comical and a bit ironic: here we are complaining about AMD not getting a fair shake, being tested on older AM2+ motherboards and other things wholly intended to make AMD look worse than they really are, and you go and hand us the proof to validate that point on both counts. ;<) Thank you Sam!
Russ
AfterDawn Addict
4 product reviews
13. October 2011 @ 12:37
Favouring the Intels in what? And who from? It's difficult to interpret your point of view, but all I can glean so far is that every single review site favours Intel. If that's the case, then they're probably not biased; Intel are just better than you thought they were!
AfterDawn Addict
7 product reviews
13. October 2011 @ 13:27
Originally posted by AnandTech: AMD also shared with us that Windows 7 isn't really all that optimized for Bulldozer. Given AMD's unique multi-core module architecture, the OS scheduler needs to know when to place threads on a single module (with shared caches) vs. on separate modules with dedicated caches. Windows 7's scheduler isn't aware of Bulldozer's architecture and as a result sort of places threads wherever it sees fit, regardless of optimal placement. Windows 8 is expected to correct this, however given the short lead time on Bulldozer reviews we weren't able to do much experimenting with Windows 8 performance on the platform. There's also the fact that Windows 8 isn't expected out until the end of next year, at which point we'll likely see an upgraded successor to Bulldozer.
I thought this was interesting. It does make one wonder just how much it affected the tests. Probably not much, but it does deserve attention, eh?
To delete, or not to delete. THAT is the question!
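A side note on what "scheduler-aware placement" actually means in practice: Bulldozer pairs two integer cores into a module that shares a front end and L2 cache, so two cooperating threads ideally land on the same module rather than being scattered. Below is a minimal Python sketch of forcing that by hand with CPU affinity. It assumes the psutil package is installed and that logical CPUs 2n and 2n+1 are the two cores of module n; that numbering is an assumption about how the hardware and OS enumerate cores, not something the quoted article states.

```python
# Minimal sketch: restrict the current process to the two cores of one
# Bulldozer-style module so its threads share that module's L2 cache,
# instead of letting the scheduler scatter them across modules.
# Assumes psutil is installed and that logical CPUs 2n and 2n+1 belong
# to module n (hardware-specific; check your own topology first).
import psutil

def pin_to_module(module_index: int) -> None:
    siblings = [2 * module_index, 2 * module_index + 1]  # assumed core pairing
    proc = psutil.Process()            # the current process
    proc.cpu_affinity(siblings)        # only schedule this process on those two cores
    print("Affinity is now:", proc.cpu_affinity())

if __name__ == "__main__":
    pin_to_module(0)                   # keep everything on module 0's shared caches

```

Windows 8's scheduler was expected to make this kind of grouping automatic, which is what the AnandTech quote is getting at.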
AfterDawn Addict
15 product reviews
13. October 2011 @ 13:40
I'm all excited to go Sandy Bridge now. i5 2500k here I come baby. Speeeeeeed!!!
AfterDawn Addict
7 product reviews
13. October 2011 @ 13:47
I'd like to see how much you spend on that system ;) If I go Intel, I'm gonna have to buy ALL new parts. I suppose the same is true for AMD. I want to keep my current rig as a secondary. If I have time today, I'll try to do a comparison for myself.
Senior Member
13. October 2011 @ 15:03
Originally posted by sammorris: Originally posted by Mr-Movies: Pings are directly related to bandwidth; when you ping a network/internet address you are measuring response time between the host and you. The host is a factor as well as bandwidth.
Nope - I had a ping of 6ms on a 4Mbps connection, yet I know plenty of people with a ping of 25ms on 24Mbps connections. How so? ADSL interleaving, for a start. Network capacity, distance from the exchange, distance between you and the remote server, it all adds up.
Even at full light speed (which fibre-optic cables can't make full use of), light can only travel 300km in 1ms. If we take the assumption that fibre optic cable carries information at a third of light speed, then to travel the 5000km between here and the US east coast would take 50ms. As you may notice, latencies between the UK and the US are indeed a minimum of around 60-70ms. This has nothing to do with bandwidth; you could see this with a 40Mbps pipe, or a 100Gbps pipe. It's simple physics.
My ping rate explanation is nuts on, like I stated before. Look the definition up: I'm an electrical engineer, and I used to design optical cabling systems as well as all other communication systems for the military and others. The medium doesn't change how ping works or is measured. Distance is a factor, as are how many connections there are and the host speed. Pinging measures the delay in transmitting between your computer and another (host) IP address and is measured in ms. It is no simpler than that, so you can carry on all you want, but that is the fact, period.
Originally posted by sammorris: All this said, it's important to put CPU speed into perspective for things like gaming. It may be a considerable step ahead of the Phenom 2s, but I recently throttled back my i5 to stock clock speed (thermal paste problem, the cooler needs redoing), and really, I practically don't notice the difference. This, though, is probably because even at stock it equates to a Phenom 2 at 3.6GHz. Considering, then, that the newer i5s are 35% faster to start with and will overclock to almost 50% beyond their stock speeds, you can understand why I tell people to stick i5s in their new gaming systems: $220 isn't a lot to pay for a CPU that can do that sort of work, especially when these people are spending at least that sort of money, often more, on graphics hardware.
If you read my reply properly you would see that I already conceded somewhat to exactly what you loosely reiterated here; only I think I did a better job of explaining it than you, of course.
As for your benchmark links, you treat benchmark sites like they are truth and fact, and of course they are not. I have yet to find a site that isn't contradicted by another site's report. That alone would tell me these sites are nonsense as a whole, either on purpose, through poor testing parameters, or through inept testers. Just because you can find a link that says something on the internet certainly doesn't make it true or fact. That's not to say everything you find is wrong either, so don't take it literally the other way, as I know you would argue back.
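For what it's worth, the propagation arithmetic in the quoted post is easy to sanity-check with a few lines of Python. The one-third-of-light-speed factor and the 5000km distance are the quote's own assumptions rather than measured values, and bandwidth never enters the calculation:

```python
# Back-of-the-envelope floor on latency from propagation delay alone.
LIGHT_KM_PER_MS = 300.0               # light covers ~300 km per ms in a vacuum
FIBRE_FRACTION = 1.0 / 3.0            # the quote's assumption for signal speed in fibre

def min_one_way_ms(distance_km: float) -> float:
    """Lower bound on one-way delay (ms) from propagation alone."""
    return distance_km / (LIGHT_KM_PER_MS * FIBRE_FRACTION)

one_way = min_one_way_ms(5000)        # ~5000 km, roughly UK to US east coast
print(f"one-way floor:    {one_way:.0f} ms")      # about 50 ms
print(f"round-trip floor: {2 * one_way:.0f} ms")  # about 100 ms
```

With the more commonly cited two-thirds-of-light-speed figure for fibre, the floor is roughly half that, which lines up with the real-world 60-70ms UK-US pings mentioned in the quote.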
AfterDawn Addict
7 product reviews
13. October 2011 @ 17:26
I found the Photoshop benchmark laughable. LOL! You're really not gonna see a difference there, thank you very much, unless you're talking about rendering filters. And even then, you're talking milliseconds, eh? I've used countless CPUs within Photoshop, and the differences are nearly negligible.
AfterDawn Addict
4 product reviews
13. October 2011 @ 17:37
Your explanation of ping isn't at fault, but you referred to the term 'bandwidth' when you meant 'latency' - two different things. That's where the confusion came from.
As for the benchmarks, just because they deviate that doesn't mean they're false. This suggests variance in similar systems, which does happen. Any decent site states their testing parameters so you can tell whether they're appropriate or not.
Senior Member
13. October 2011 @ 17:49
Originally posted by sammorris: Your explanation of ping isn't at fault, but you referred to the term 'bandwidth' when you meant 'latency' - two different things. That's where the confusion came from.
As for the benchmarks, just because they deviate that doesn't mean they're false. This suggests variance in similar systems, which does happen. Any decent site states their testing parameters so you can tell whether they're appropriate or not.
Latency was implied by delay, so I did cover that, and bandwidth, connections, and so on still enter in. Latency is the sum of that delay measured in ms, as I stated, so I covered it correctly, just more generally.
Benchmarks can be drastically different, and although it is always better to compare on the same gear with the same test methods, even with different gear the differences shouldn't be so drastic, so my point still stands. If I really wanted to waste my time I could go around and find plenty of huge differences, but again, I'm not going to waste my time, and I doubt it would change your bias.
Believe what you want but it pays to be objective...
AfterDawn Addict
4 product reviews
13. October 2011 @ 17:51
Originally posted by Mr-Movies: Originally posted by sammorris: Your explanation of ping isn't at fault, but you referred to the term 'bandwidth' when you meant 'latency' - two different things. That's where the confusion came from.
As for the benchmarks, just because they deviate that doesn't mean they're false. This suggests variance in similar systems, which does happen. Any decent site states their testing parameters so you can tell whether they're appropriate or not.
Latency was implied by delay, so I did cover that, and bandwidth, connections, and so on still enter in. Latency is the sum of that delay measured in ms, as I stated, so I covered it correctly, just more generally.
Benchmarks can be drastically different, and although it is always better to compare on the same gear with the same test methods, even with different gear the differences shouldn't be so drastic, so my point still stands. If I really wanted to waste my time I could go around and find plenty of huge differences, but again, I'm not going to waste my time, and I doubt it would change your bias.
Believe what you want but it pays to be objective...
Objective to me is taking a summary from various different test sites.
Saying 'they're all wrong' is not objective; it's just perverse and stupid. I'm not saying that you're doing that, but you have to provide a valid reason for criticising sites for being wrong other than 'they're a bit different' - or 'the owner smells'. In the case of Tom's Hardware being paid off by Nvidia it was obvious, but between HardOCP, Bit-tech, Techreport and others, they all paint the same picture. They aren't all biased!
AfterDawn Addict
13. October 2011 @ 18:09
Originally posted by Estuansis: Russ, I don't see any bias there. They're testing CPUs at stock speeds. Also, they're testing the fastest CPUs AMD has ever made against decidedly not Intel's best, disregarding the 990X. If anything, the test is biased heavily in AMD's favor. The AMD CPUs are just slower. This is a fact. I don't know what you keep looking for, Russ. Finding the right graph won't suddenly make AMD CPUs faster.
Also, I see nothing about the boards you mentioned at all, never mind the fact that none of the CPUs in the tests shown would even work in those boards. So I really don't know what angle you're going for.
I mean, why do you keep going on about it not being a fair test? Is it not a fair test unless AMD wins? If I were shown a graph with AMD winning, common sense tells me that's the unfair test. Every single benchmark we've seen so far has shown Bulldozer getting its ass handed to it. Are you trying to say that every single reputable review site is in Intel's pockets and that's why Bulldozer sucks? Sorry Russ, but Intel bribing a review site doesn't suddenly make a CPU better or worse. If Bulldozer sucks, there's nothing a slightly skewed graph is going to do to change it.
Estuansis,
I did not say that it's not a fair test if AMD doesn't win. It's the graph Sam posted that I questioned. All the processors I mentioned, and the motherboards and memory, were just as I posted them. I never mentioned Bulldozer!
It was only supposed to be an example, but look at what was tested from AMD. The only valid high-end chip was the 1100T. The 620 Propus was discontinued two years ago, the 940 BE was dropped after the 955 BE came out, and the Phenom IIx2 550 was dropped almost a year ago. All the AMDs were tested on socket AM2+ motherboards with DDR2. On the Intel side, you have a socket 1366 Core i7 and two socket 1155s, a Core i5 760 and a Core i3 530. All were tested on DDR3 motherboards, and they don't show a memory speed. Nothing is a valid comparison.
I have no idea what this chart has to do with Bulldozer, because it's not on the graph and not part of the question I raised. I didn't ask about Bulldozer. All I was doing was pointing out how these results look skewed, because the quad core should have been a Phenom IIx4 965 BE and the dual core should have been the Phenom IIx2 555 BE. All I know is that in a fair lineup, modern chips vs modern chips, with the obsolete and old-tech chips removed, the Core i3 530 comes out dead last for that particular benchmark, behind both the Phenom IIx2 555 BE dual core and the x4 965 BE. To be completely honest, I didn't even notice right away that they had all those junk (FOS) CPUs in there for both sides! LOL!!
Best Regards,
Russ
AfterDawn Addict
4 product reviews
13. October 2011 @ 18:12
It doesn't have anything to do with Bulldozer, nor was it supposed to. It was brought up because it was claimed that Phenom 2s were adequate for modern games, as games only really needed GPU power. This graph disproves that entirely: forget which CPUs they're comparing to which, the numbers are so low that it proves you can create situations in games where even powerful CPUs produce painfully low frame rates.
They have no reason to include the 955 over the 940; they're both the same tech, one is just faster than the other. They just need to cover all the architectures (at the time - remember, this test came before Bulldozer), which they have done.
Senior Member
13. October 2011 @ 19:06
Well, facts are facts: when everything contradicts, obviously there is something wrong! Coming to that conclusion takes objectivity and some smarts, instead of believing that all stats, benchmarks, and so on are correct fact and represent the big picture well. They're all used for personal reasons these days.
I periodically test security programs, and I do this personally because there is no one out there testing them fairly or correctly. I've learned the hard way that I can't believe what websites are spewing. I have a friend I taught how to read stats and other numbers, but he, like so many, takes them too literally and doesn't see what they are actually telling you, so he blindly follows the BS. He is starting to learn that things aren't always as they are represented, but it is slow coming.
I get a kick out of how you think something in the same family can't be different. If that were true, then why have a 955 when you already have a 940? It is this lack of insight that is a prime piece of the problem with you seeing the big picture. You seem to have narrow vision, not really objective. That is my personal observation, as is my belief that you can't admit it when you're wrong, as you always make excuses instead of saying, "yeah, you're probably right and I was wrong."
Now don't take me too seriously, as I enjoy our banter and, like I said, you bring a lot to the table.
AfterDawn Addict
14. October 2011 @ 10:03
Originally posted by sammorris: It doesn't have anything to do with Bulldozer, nor was it supposed to. It was brought up because it was claimed that Phenom 2s were adequate for modern games, as games only really needed GPU power. This graph disproves that entirely: forget which CPUs they're comparing to which, the numbers are so low that it proves you can create situations in games where even powerful CPUs produce painfully low frame rates.
They have no reason to include the 955 over the 940; they're both the same tech, one is just faster than the other. They just need to cover all the architectures (at the time - remember, this test came before Bulldozer), which they have done.
Sam,
There is every reason to eliminate the 940 and use the 955. They may be the same tech, but that's in name only. You can't use a 940 on an AM3 or AM3+ motherboard, as it only has a DDR2 memory controller. The 940 never had the C3 stepping, that I'm aware of, so with the 940 you are limited to the three-generations-older architecture of socket AM2 or AM2+, and much more heat. It can't cover all the architectures without a DDR3 memory controller! It also can't take advantage of the performance gains the latest motherboard chipsets offer in socket AM3+.
That's roughly a 15% gain as a drop-in with no overclock in my GigaByte 990XA-UD3, and so far it's proven to be that way for any socket AM3 chip I've tried. The Phenom IIx2 3.2GHz 555 Callisto I tried unlocked both of its disabled cores and mirrors the Phenom IIx4 3.2GHz 955 in performance; CPU-Z showed it as a B55 quad core when unlocked. The little "CPU that could", the Athlon IIx4 3.0GHz Propus quad core, overclocks to 4.2GHz on the 990XA and is so much faster than the 2.8GHz 630 Propus in Oxi. I'll soon be taking a look at the new 2.6GHz Athlon IIx4 Llano-based quad, but with no graphics. I have a hunch that the Athlon IIs will all have Llano cores and 4MB of L2 cache in the future. Makes sense, since they're cheaper to make even with the 4MB of L2 cache, and that's twice as much L2 as the Propus has. Should be a killer quad core for $89, and all new tech too! Affordable quad core computing with quality HD graphics has finally come to the masses! Dell, HP/Compaq, Acer, and eMachines/Gateway offer various AMD low-end quads for as little as $328, without a monitor. Game-starved men are going to go nuts this holiday season buying these things! Sales of Crysis will go up! LOL!!
Best Regards,
Russ
AfterDawn Addict
4 product reviews
14. October 2011 @ 10:13
Yeah I'm aware of the performance gains of the new chipsets - that only applies to this GameGPU test anyway though - the other tests, biased or otherwise, use the 990 chipset in new boards, so their results are accurate in that sense.
I hope you're joking about playing Crysis on the integrated graphics chipset...
AfterDawn Addict
14. October 2011 @ 10:52
Originally posted by sammorris: Yeah I'm aware of the performance gains of the new chipsets - that only applies to this GameGPU test anyway though - the other tests, biased or otherwise, use the 990 chipset in new boards, so their results are accurate in that sense.
I hope you're joking about playing Crysis on the integrated graphics chipset...
Sam,
Nope, not joking at all. You just have to turn off all the goodies. You could play it on the old 785G Gigabyte's HD 3200 graphics. About all you could say about it was that it did play! LOL!! The ATI HD 6550D in the top Llano should play far better than the HD 3200.
In that GameGPU test, they only had DDR2 motherboards for the AMDs, no DDR3. There were no 990 or 880 chipset motherboards. The Core i7 was run on a high-dollar MSI X58 Military Spec board and the Core i5 & i3 were run on the Asus Crosshair Formula board.
Russ
AfterDawn Addict
15 product reviews
14. October 2011 @ 11:31
I can agree that the new APU integrated graphics are amazing. Maybe not up to playing a monster like Crysis quite yet, but it certainly plays quite a large number of new games at reasonable speeds. There's a lot to be said for that.
Senior Member
15. October 2011 @ 04:16
Even the new onboard GPUs aren't up to playing the newer intensive games. Sure, you might be able to play them with everything turned off and at low resolutions, but that just isn't the same in my opinion.
Also, I wouldn't want to use onboard graphics for SolidWorks, Pro-E, or even AutoCAD 3D, but I have used good gaming add-on cards for them.
AfterDawn Addict
15 product reviews
15. October 2011 @ 04:24
Yep, this is true, but compared to any integrated graphics to come before it, it's the king of speed. The trick being that most mass-manufactured PCs use integrated graphics, so if you give them a very cheap CPU with great integrated graphics, suddenly any Joe Schmoe can hop on Left 4 Dead or World of Warcraft without a built-up PC. That opens up lots of possibilities. When I was just a young'un, all I had to game with was Intel Extreme Graphics 2. I'd have killed for a proper IGP like the APUs have.
AfterDawn Addict
4 product reviews
15. October 2011 @ 06:28
Yeah, both the high-end Intels and the Llano AMDs have powerful IGPs, and I assume the same is true of the FX series CPUs? The AMD version is indeed faster, but the Intel one (for once) isn't totally hopeless either. Admittedly at minimum detail and a low resolution, but they will play most games out there, which is a vast improvement over old IGPs.