The Official PC building thread - 4th Edition

AfterDawn Addict (15 product reviews) | 13. May 2012 @ 09:49
Quote: Definitely running multiple video cards in SLi or Crossfire is a waste of money as even if all the slots are capable of 16x the GPUs are still master/slave and only 1/4+ of the work is handed off from the master card. It is much better to get an x2 video card than two singles.
That's debatable. There are some marginal gains to be had from a single dual-chip card, but in practice there isn't much difference except heat.
Likewise, 90%+ scaling in quite a few games would argue with that as well. All the way down to x4 slots, the cards perform the same, so running a single card really only saves you space and maybe some heat thanks to more open space around the card.
Also, the dual-chip cards generally use two very high-end GPUs, i.e. out of my price range. I can Crossfire two individual mid-range cards and be much more cost-effective.
Quote: Don't buy that the Bulldozer is inferior and the extra cores are a major plus, again Intel is much better at clock rates and AMD is more core brute force approach. The CPU's need to be compared as CPU's and how much they can handle not to a core-core basis. This is why I don't buy your argument. I agree though if I want to push my CPU to the max clock rates Intel is hands down much better.
Well I AM a gamer, and regardless of how well Bulldozer performs as a CPU, games still need single-core performance as much as multi-core performance. In some games Bulldozer runs just great, but in others it chokes pretty badly. I need both good single- and multi-threaded performance in a gaming CPU.
I'm not saying Bulldozer is inferior my good sir :P Simply that it isn't as ideal for gaming as Intel CPUs currently are.
AMD Phenom II X6 1100T 4GHz(20 x 200) 1.5v 3000NB 2000HT, Corsair Hydro H110 w/ 4 x 140mm 1500RPM fans Push/Pull, Gigabyte GA-990FXA-UD5, 8GB(2 x 4GB) G.Skill RipJaws DDR3-1600 @ 1600MHz CL9 1.55v, Gigabyte GTX760 OC 4GB(1170/1700), Corsair 750HX
Detailed PC Specs: http://my.afterdawn.com/estuansis/blog_entry.cfm/11388
AfterDawn Addict (4 product reviews) | 13. May 2012 @ 09:51
Quote: Video cards are always a bottle neck with the latest games, it costs a lot of money to stay up to date for gaming.
I missed this point. While it's true that graphics are more often the bottleneck than the CPU, this is far from always the case.
SCFA is one good example, but there are several others out there. Even Counter-Strike: Source, despite its age, with the sort of mods that are very popular (thinking mainly GunGame here) - my old Q9550 could not keep up on a 24-player map. The difference between it and my i5 is enormous. Given that CSS is single-threaded, and Bulldozer is even slower per-thread than my Q9550 was, that's one CPU that can't really play that mod well at all, whereas my £150 i5 from February 2010 can with ease.
And before you call insignificance on it being a mod, gungame is if anything more popular than CSS itself, which is still very popular despite its age.
Strategy games in general are another key example.
I'm far from pushing my CPUs to the max. When I overclocked my i5 I literally chose a value that I thought realistic (4.11GHz from a base of 2.66), set it, upped the voltage a bit, and left it there. That was literally it; I haven't touched the settings since the day after I bought the CPU, well over two years ago. More than a 50% performance boost for free, on air cooling? Why not.
We're all technology enthusiasts here, so advocating the CPU that can't do this, and is slower even to begin with is a choice I will never truly understand.
Mr-Movies: I don't wish to be blunt here, but it's clear you don't really understand how multi-GPU technology works. X2 cards work the exact same way as using crossfire with a bridge connector, and the master/slave arrangement hasn't been the case since dongles were done away with back in 2007. GPUs are all interleaved now, and while the technology is far from perfect, it's fairly typical to get a 90%+ gain when using two cards, or two GPUs.
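To illustrate the interleaved arrangement sam describes (alternate frame rendering, the scheme modern SLI/Crossfire defaults to), here's a purely illustrative toy sketch - not vendor code, just the scheduling idea:

```python
# Toy model of alternate-frame rendering (AFR): frames are interleaved
# across GPUs round-robin, rather than a master card offloading a
# fraction of each frame to a slave.
def afr_schedule(n_frames, n_gpus):
    """Return which GPU renders each frame under AFR."""
    return [frame % n_gpus for frame in range(n_frames)]

print(afr_schedule(8, 2))  # [0, 1, 0, 1, 0, 1, 0, 1]
```

With perfect interleaving each GPU renders half the frames, which is where the ~90%+ scaling figure comes from once driver and synchronisation overheads are subtracted.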
AfterDawn Addict (15 product reviews) | 13. May 2012 @ 09:54
Yep, Master/Slave hasn't been around since the X1900 series :P
Quote: As that may be true it is poor programming by the game maker, which is nothing new, or the game is optimized for a given CPU instruction set. In the old days people would program for flaws in a given CPU so that their program or game would perform better.
But that doesn't really matter to the gamer if it is their favorite game you have to go with what works best for your use for sure. Now these days I would think that would be extremely rare.
Thousands of units on-screen at once, each with their own path-finding, AI, attack routines and build routines. Not to mention large battles with huge amounts of particulates and explosions. The CPU is still very important for performance in real-time strategy. First-person shooters are generally another story: graphics cards are much more important for shooter performance. Gaming is very multi-faceted, and no single setup will work best for every game. I've tried to build for all scenarios.
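As a purely illustrative sketch of why this workload lands on the CPU (hypothetical unit fields, not any real engine), an RTS simulation tick touches every unit every frame, and that loop is hard to spread across cores:

```python
# Hypothetical RTS simulation tick: every unit is updated every tick,
# so per-core CPU speed dominates once unit counts climb.
def tick(units):
    """One simulation step: largely serial per-unit work."""
    for unit in units:
        unit["x"] += unit["vx"]          # stand-in for path-finding / movement
        unit["hp"] = max(unit["hp"], 0)  # stand-in for combat bookkeeping

units = [{"x": 0.0, "vx": 1.0, "hp": 100} for _ in range(5000)]
tick(units)
print(units[0]["x"])  # 1.0
```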
AfterDawn Addict (4 product reviews) | 13. May 2012 @ 10:00
This sort of thing is far from uncommon.
AfterDawn Addict (15 product reviews) | 13. May 2012 @ 10:02
That would be a good example of something that relies on single core performance. Very common with many games. Bad programming or not, Intel runs better. Is it a sad truth? YES, but it IS true. I have been using AMD since I was a wee lad of 15 building my first PC, so please Movies, don't take me as anti-AMD in any way. You couldn't be further from the truth :)
AfterDawn Addict (4 product reviews) | 13. May 2012 @ 10:04
Let's not focus too much on one arena though. A lot of dyed-in-the-wool AMD fans concede that Bulldozer is not for gaming, but maintain its superiority in other areas, and that's fair enough, but what has to be spelt out is that the single-core, 2-core and 4-core limitations that hit games so hard also hit other areas equally hard. Outside of when I encode video, nothing I use ever manages to take advantage of all 4 cores on my CPU, let alone if I had 8.
AfterDawn Addict (15 product reviews) | 13. May 2012 @ 10:10
A sad truth. I am very partial to AMD in general but I can't ignore facts. Multi-core performance still gets largely ignored by software developers, so Bulldozer gets screwed, even if it is actually a very good CPU.
That chart is a good example of why I haven't gone to Bulldozer. It simply isn't any faster in my chosen software. Does that mean it's straight-up slower? NO. It's just slower in games.
I will maintain that Bulldozer is a fantastic CPU, but maybe a bit ahead of its time as far as programmers are concerned.
AfterDawn Addict (4 product reviews) | 13. May 2012 @ 10:14
Perhaps, but in my mind it's no better than the X6 1100T, because each core is roughly 75% of the performance of an X6 core, therefore the raw performance, even in a best 8-core scenario, is the same as an X6. If it were faster in an 8-core environment than anything else out there, then sure, it'd be a great CPU, just set back by poor programmers. Unfortunately, however, that's not the case, the raw performance isn't sufficient to take on the i7 2700K. It just wins on costing less than the 2700K.
Even if this were the case, you unfortunately can't sympathise with a company that's built a CPU for a market that doesn't really exist. Server CPUs exist for a reason, and they cost a lot. The FX-8150 is basically a server chip being peddled to standard end users.
AfterDawn Addict (15 product reviews) | 13. May 2012 @ 10:20
I've been back and forth on the issue a lot. But this is also exactly why I expect a revival à la Phenom II to bring Bulldozer up to snuff. Phenom I to Phenom II was a fantastic upgrade.
AfterDawn Addict (4 product reviews) | 13. May 2012 @ 10:27
Ever hopeful, because it's the only thing that will spur Intel on to drop prices and innovate further, if nothing else. Their lead has been so large for so long, it seems like they didn't really bother with Ivy Bridge.
AfterDawn Addict (15 product reviews) | 13. May 2012 @ 10:30
Yes it's very interesting to see Intel release a product that doesn't live up to expectations.
Senior Member | 13. May 2012 @ 10:46
A fair majority of the newer programs I use are capable of using multiple cores, and even if they are not, I can still multitask older programs better with more cores, and Windows 7 as the OS will handle some of that.
There is nothing wrong with server chips; that certainly is not a negative. Besides, Intel and AMD will base their consumer chips on server CPUs from time to time. No big deal, petty excuse... In fact I've had friends that use server boards as their gaming platform and were very happy with the results.
I've said it before, I just don't see the beef in your precious Intels, and I've given them a fair go too. But I also don't use some of the games that prefer the Intels either, and if that was my main focus then I might see the worth.
AfterDawn Addict (15 product reviews) | 13. May 2012 @ 11:21
It's not even an Intel/AMD bias. It's not a "precious" Intel thing or anything like that. Sam, like myself, got his start gaming on AMD hardware as well. We're not trying to be biased, just realistic. You can't pick one and say the other is evil; you just rob yourself of performance. GHz clock speeds are not what's important; it's the speed per core, and Bulldozer just doesn't have it. It doesn't even fully match up to the product it was meant to replace. Yes, it's newer technology, and a very innovative one at that, but it's having teething problems.
Some server platforms would be fine for gaming. Older Socket 939 Opterons were wonderful, as they were essentially Clawhammer/San Diego chips with a different core name and better overclocking. Same case with previous and current generations of Intel CPUs: the server and desktop lines largely share the same basic hardware. The old Opteron and Xeon chips were also literally higher-binned Athlon 64s and Core 2 Duos. But beyond the server chips that share a socket with desktop chips, it's very hard to effectively build a gaming PC using server hardware.
In the case of Bulldozer, it's not a higher binned desktop chip, it is really a server CPU meant for cluster environments such as render farms or super computers. They aren't entirely suited to be used in a desktop environment with so many variable loads. They are probably wonderful when you have many of them together, because a pure multithreaded environment such as a high load server or computing cluster is more suited to the design.
We're not down-talking AMD's product at all. It simply isn't the right product for desktops as it currently is.
The OS will allow excellent multi-tasking, but that is multi-tasking, not single-application performance. As stated many times previously, Bulldozer has many strengths, but they are not in pure performance. I wish the case were different, but it just isn't. Particularly if you do video encoding or Folding or some other heavily multi-threaded task, they are very cost-effective, but they still aren't even the fastest. Most people don't do video encoding, Folding, Photoshop, or anything else very heavily multi-threaded. Regardless of relative value, they are slower in most things. It's not an opinion, it's a fact.
Senior Member | 13. May 2012 @ 12:15
It's truly not hard to build a gaming machine on a server platform. Variable loads on a server should be a piece of cake also; that is what servers are made for, and you're not running a server OS, so that shouldn't hinder one.
I still don't buy that most games don't run fine on an AMD platform. I do game a little still and have games that are very intensive; they all run fine as well.
You are down-talking, but that doesn't matter if it is true, and maybe for the few games you like it is true. From personal experience, I would disagree. My disdain for Intel isn't that they make a bad processor, they don't; it is the cost, as they are not a smart buy, and I totally don't see the bloated performance increases, which are much smaller than typically reported. Again, that is not an opinion, it is backed by personal testing with multiple real-environment apps.
If your argument back is that most people don't even do heavy tasking on their PCs, then that would even more so support the fact that they would not need the extreme performance of the Intel, being sarcastic of course. :)
Like I've said time and again, over and over, if they float your boat, knock yourself out; however, I would argue the point, and a few games that might perform better isn't enough to change my thoughts on the matter.
I won't be wasting my hard-earned cash on an Intel anytime soon.
AfterDawn Addict (4 product reviews) | 13. May 2012 @ 13:47
Originally posted by Estuansis: Yes it's very interesting to see Intel release a product that doesn't live up to expectations.
It has happened before - Pentium 4 and Pentium D being prime examples.
Quote: If your argument back is that most people don't even do heavy tasking on their PC's then that would even more so support the fact that they would not need the extreme performance of the Intel, being sarcastic of course. :)
This is actually the most accurate criticism of the AMD fans out there - often people don't need the performance of the Intels, and cheaper AMDs are a better buy. Of course, cheaper Intels are also a better buy, so it's still a toss up, but the i5/i7s aren't necessary for everyone, far from it. I would far sooner see people with low-end AMDs and SSDs than i5s and mechanical OS drives, for instance.
Quote: and I totally don't see the bloated performance increases which are much smaller then typically reported. Again that is not an opinion it is backed by personal testing with multiple real environment apps.
Actually it's neither fact nor opinion, it's a placebo. This is the single reason why people are still buying Bulldozers essentially, barring the few odd occasions when there are genuine advantages (i.e. people who don't overclock, and use video encoding apps that have a bias towards AMD and use all 8 cores), it is simply because people who use AMDs don't realise what they're using is slower, and when they test Intels, they don't physically notice the difference, because they are predisposed against the idea of it being there.
The human mind does work in odd ways and can often conceal hard evidence displayed before your very eyes - if you start out anticipating something to be true, often when it turns out not to be the case, it can be very easy to ignore that fact.
I've had my scepticisms about several technologies in the past, dual-core CPUs for games, dual graphics, SSDs, and a fair few other things. When you first get them, you don't often realise the benefits even though they're right in front of you.
Fortunately, perhaps long and arduous though the process was, I've learnt to always take a step back and examine things at face value. I've now been doing this long enough to spot dubious results in benchmarks, and trust me, they are there. Benchmarks are far from hard fact in many cases, but when you examine a large number of them from all sorts of angles, and simply think about situations you know from the past, you get a good picture of what's going on.
Ultimately it is irrefutable fact that the original Core i5/i7 far surpassed the performance-per-core at stock of the Phenom II; that was a given. The Bulldozer is slower per core than Phenom II in every case; we know this. Disregarding potentially biased sites, you can always use intra-brand comparisons to work out how much better a CPU is than its predecessor, and from looking through countless sites some very solid figures emerge. Bulldozer is about 75-80% as powerful, per core, per MHz, as Phenom II.
The first Core i5/i7 (Lynnfield/Bloomfield) is 35% more powerful, per core, per MHz.
The second generation i5/i7 (Sandy Bridge) is 10% beyond that, and the third generation (Ivy Bridge) is 5% beyond that.
That gives us (and I've thrown in some legacy results as well):
NETBURST ARCHITECTURE (Intel 90nm, P4 3.4GHz etc.): 46% performance, e.g. P4 3400: 3400x1x46% = 1,564
CORE ARCHITECTURE (Intel 65nm, Q6600 etc.): 87% performance, e.g. Q6600: 2400x4x87% = 8,352
PENRYN ARCHITECTURE (Intel 45nm, Q9550 etc.): 100% performance, e.g. Q9550: 2833x4x100% = 11,333
BLOOMFIELD ARCHITECTURE (Intel 45nm, i7 920 etc.): 120% performance, e.g. i7 920: 2666x4x120% = 12,800
SANDY BRIDGE ARCHITECTURE (Intel 32nm, i7 2600K etc.): 138% performance, e.g. i7 2600K: 3400x4x138% = 18,768
IVY BRIDGE ARCHITECTURE (Intel 22nm, i7 3770K etc.): 145% performance, e.g. i7 3770K: 3500x4x145% = 20,300
AMD MANCHESTER/TOLEDO (X2 4200+ etc.): 85% performance, e.g. X2 4200+: 2200x2x85% = 3,740
AMD DENEB (X4 955BE etc.) 88% performance, e.g. X4 955BE: 3200x4x88% = 11,264
AMD ZAMBEZI (FX-8150 etc.) 66% performance, e.g. FX-8150: 3600x8x66% = 19,008
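sam's figures above can be reproduced with a quick script. The clocks and factors are copied from the list; the scoring formula is his back-of-envelope clock x cores x architecture factor, not a real benchmark:

```python
# Per-architecture efficiency factors relative to Penryn (= 100%),
# as estimated in the post above. Score = clock (MHz) x cores x factor.
cpus = {
    "P4 3400 (Netburst)":      (3400, 1, 0.46),
    "Q6600 (Core)":            (2400, 4, 0.87),
    "Q9550 (Penryn)":          (2833, 4, 1.00),
    "i7 920 (Bloomfield)":     (2666, 4, 1.20),
    "i7 2600K (Sandy Bridge)": (3400, 4, 1.38),
    "i7 3770K (Ivy Bridge)":   (3500, 4, 1.45),
    "X2 4200+ (Manchester)":   (2200, 2, 0.85),
    "X4 955BE (Deneb)":        (3200, 4, 0.88),
    "FX-8150 (Zambezi)":       (3600, 8, 0.66),
}

def score(mhz, cores, factor):
    """Crude aggregate throughput estimate: clock x cores x IPC factor."""
    return round(mhz * cores * factor)

for name, spec in cpus.items():
    print(f"{name:26s} {score(*spec):>7,}")
```

Running it reproduces the list, e.g. 19,008 for the FX-8150 against 18,768 for the i7 2600K, which is the crux of the argument: eight weak cores only roughly match four strong ones even in a perfectly threaded workload.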
Senior Member | 13. May 2012 @ 16:20
Sam, you're a waste, especially your placebo nonsense. Same old argument, and I disagree based on decades of experience.
Oh that's right, placebo again... LOL. Experience and self-testing doesn't count unless it's yours. :)
I could tell you what you can do with your throne and its BS numbers, which I could find counters to, but I won't... Have your fallacy, bask in it...
There is no need to further this discussion, as you feel you're right and I feel I'm right. It really didn't need to even go here from where we were.
AfterDawn Addict (4 product reviews) | 13. May 2012 @ 17:04
It's well established that you don't accept the truth about these CPUs. I have no objective to change that.
Anyone can have decades of experience in the PC industry, but if they stop chasing the latest hardware for as much as 18 months, their experience in the industry is of zero merit unless they start from the beginning and catch up with where hardware is now.
I know countless people who have 30+ years of experience in the computer industry and can provide fascinating detail on the systems of the early 1980s and so on, as well as perform any of the tasks that may be required to maintain current hardware. However, not one of them knows the exact standing of modern CPUs from memory, because they don't follow the market. Since I do, while I may only have 9 years or so of experience actually working on PCs and upgrading them, this is ample given the attention I pay to the market.
What is just infuriating to most, not just people like me, is when people who can't be bothered (or, more often than not, simply refuse) to research the truth of the matter then post nonsense as fact, trying to start rumours that what is in actual fact truth, and has been bandied around for months/years, is actually false.
I could quite happily bandy around the notion that IDE has always been superior to SATA. After all, there's no denying the cables are better quality, so the interface must be better, right? I mean sure, it's only 133MB/s link speed versus the 300MB/s of SATA 2 or 600MB/s of SATA 3, but those are just numbers, who cares?
I've used one PC in my lifetime that had an IDE drive in it, but it was so much faster than the SATA PC that I used - IDE is better. That's real world proof from my wealth of experience, not these BS numbers that every website out there spams.
There is nil difference between that argument and yours.
(also, since I can see it being picked up on a mile off, I have used far more than one PC with an IDE drive in it - that phrase was used for sake of argument).
Senior Member | 13. May 2012 @ 17:07
You are so full of yourself; have fun with that, and I hope you end up frustrated, as you would buy BS stats and then find out for yourself.
This will be the last I say, so knock yourself out, you always do. In America we use thrones in a way you wouldn't like! LOL
AfterDawn Addict (4 product reviews) | 13. May 2012 @ 17:11
Nah, no knocking out needed. I just have to make enough noise to counteract a false post on each page so that anyone reading it calls it into question, is all.
The only real aim here is to prevent people being led down the wrong path by reading false info :P
AfterDawn Addict (7 product reviews) | 13. May 2012 @ 17:19
Guys, both of you do not need the last word. I Need it! LOL!
To delete, or not to delete. THAT is the question!
sytyguy | Senior Member | 13. May 2012 @ 18:18
Originally posted by Estuansis: Lol poor sytyguy. All from a typo :P
But yeah, Movies is exactly right here. Even though USB has very reasonable theoretical specs, in reality it's always going to be a lot slower than SATA. USB3 manages ~40MB/s for me, while my slowest SATA2 drive easily manages 70-75MB/s. My fastest, the WD1001FALS, can pull 90-ish to another fast drive, and these aren't even SATA3 6Gb/s drives.
Just a question sytyguy, does your name refer to the Chevy Syclone and Typhoon? That's the only place I've seen the word "Syty".
Just copied a 19GB Blu-ray movie from one USB3 drive to another; the average speed was about 98MB/s, the high 118MB/s, the low 96MB/s.
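For context, the quoted average works out to a copy of just over three minutes (treating 19GB as 19 x 1024 MB, which is an assumption about the units being reported):

```python
# Rough copy-time check for the transfer described above:
# a 19GB Blu-ray at an average of ~98MB/s.
size_mb = 19 * 1024   # 19GB expressed in MB (assumed binary units)
avg_speed = 98        # MB/s, the reported average
seconds = size_mb / avg_speed
print(f"{seconds:.0f}s (~{seconds / 60:.1f} minutes)")
```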
Yes, I used to own a '93 Typhoon when I first joined here.
AfterDawn Addict (4 product reviews) | 13. May 2012 @ 18:28
Out of curiosity, what boxes are you using for the USB3 transfer? I'm about to buy my first USB3 box next week; will be interested to see how it compares.
AfterDawn Addict | 13. May 2012 @ 19:56
Originally posted by sammorris: I have seen hardly any evidence of FX-8150s selling at all from the online community. They haven't been quite as much of an economical flop as a technological flop, but they still haven't really made much of an impact.
Llano on the other hand has been selling pretty well, and as a low-end all-in-one system for HTPCs and the likes, it's one of the best options out there.
Ivy Bridge has turned out to be a failure; there is literally no reason to buy one if you already have any i5/i7, let alone a Sandy Bridge CPU - the performance/MHz is roughly the same, around a 5% gain, and the base clock speeds are the same too, so out of the box they're no faster. They use less power, but due to a badly designed heatspreader they run hotter than their Sandy Bridge counterparts, and are therefore less easily overclocked, so all the advantages of lower power consumption from 22nm silicon are lost. In effect, Ivy Bridge is almost like Intel's Bulldozer.
Sam,
It's funny, the FX chips sell well in the US, yet don't seem to sell well in the UK. There are 6 available models of the Zambezi (not Bulldozer), and 4 of those sell very well in the US. There have been problems with the FX-4170 4.2GHz Quad core with heat issues, and the FX-6200 3.8GHz Six core generally doesn't overclock very well. As you know, I have a lot of faith in Newegg's reviews, because most times the number of reviews very closely matches the actual number of sales. Sorting those reviews between fact and fiction requires a lot of practice, to weed out purchasers who don't know what they are doing even though they claim to be highly experienced. Newegg also lists the actual number of verified purchases. Here are the Newegg sales numbers for the Zambezi, in sales order.
Zambezi
#1 FX-4100 3.6GHz Quad core 486
#2 FX-8120 3.1GHz Eight core 481
#3 FX-8150 3.6GHz Eight Core 337
#4 FX-6100 3.3GHz Six core 314
#5 FX-4170 4.2GHz Quad core 44
#6 FX-6200 3.8GHz Six core 32
That equals 1694 total sales
Sandy Bridge
#1 i5-2500K 3.3GHz Quad core 2016
#2 i7-2600K 3.4GHz Quad core 1358
#3 i5-2500 3.3GHz Quad core 232
#4 i5-2400 3.1GHz Quad core 204
#5 i7-2600 3.4GHz Quad core 200
#6 i7-2700K 3.5GHz Quad core 146
#7 i5-2300 2.8GHz Quad core 50
#8 i5-2550K 3.4GHz Quad core 34
#9 i5-2405S 2.5GHz Quad core 15
#10 i5-2320 3.0GHz Quad core 13
#11 i5-2400S 2.5GHz Quad core 11
#12 i5-2380P 3.1GHz Quad core 6
#13 i5-2310 2.9GHz Quad core 4
#14 i7-2600S 2.8GHz Quad core 3
#15 i5-2450P 3.2GHz Quad core 2
total 4294
As you can see, after #6 the Sandy Bridge pretty much falls off the table in terms of sales numbers. I didn't bother with the Dual cores at all, but there are 13 of them, with 9 showing sales of 50 or less. Cheap Dual cores, most with graphics inferior to the Llano. That's 17 CPUs that total 242 sales, or an average of about 14.2 sales per chip.
The bottom line is that AMD with Zambezi sold right at 40% of what the Sandy Bridge Quads sold, with far less overhead, and Sandy Bridge has been out much longer than Zambezi, so there has to be some impact felt, especially since the US is the largest market. I've run my figures by friends who work at Newegg, and while they won't give me the actual numbers, they tell me that I am right in the ballpark. Intel needs to drop a number of the 17 chips that aren't selling, and make some concessions to the dealers that have them in stock, and they need to do it right away, because every day they don't just costs Intel more money. The stockholders are not going to like it, but with the biggest sellers, the i5-2500K and the i7-2600K, price-dropped to almost no profit, they are going to have to do something, given the failure of Ivy Bridge. Someone at Intel made a bad decision, and the damage is done, and can't be easily undone. There will be a fire sale to lower inventory; there has to be, otherwise they won't sell!
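The totals and the "right at 40%" figure check out. A quick tally (model names and counts copied from the lists above, with review counts standing in for sales, as in the post):

```python
# Newegg review counts quoted above, used (as in the post) as a rough
# proxy for sales.
zambezi = {"FX-4100": 486, "FX-8120": 481, "FX-8150": 337,
           "FX-6100": 314, "FX-4170": 44, "FX-6200": 32}
sandy = {"i5-2500K": 2016, "i7-2600K": 1358, "i5-2500": 232,
         "i5-2400": 204, "i7-2600": 200, "i7-2700K": 146,
         "i5-2300": 50, "i5-2550K": 34, "i5-2405S": 15,
         "i5-2320": 13, "i5-2400S": 11, "i5-2380P": 6,
         "i5-2310": 4, "i7-2600S": 3, "i5-2450P": 2}

z, s = sum(zambezi.values()), sum(sandy.values())
print(z, s)            # → 1694 4294
print(f"{z / s:.1%}")  # just under 40%
```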
Best Regards,
Russ
GigaByte 990FXA-UD5 - AMD FX-8320 @4.0GHz @1.312v - Corsair H-60 liquid CPU Cooler - 4x4 GB GSkill RipJaws DDR3/1866 Cas8, 8-9-9-24 - Corsair 400-R Case - OCZ FATAL1TY 550 watt Modular PSU - Intel 330 120GB SATA III SSD - WD Black 500GB SATA III - WD black 1 TB Sata III - WD Black 500GB SATA II - 2 Asus DRW-24B1ST DVD-Burner - Sony 420W 5.1 PL-II Suround Sound - GigaByte GTX550/1GB 970 Mhz Video - Asus VE247H 23.6" HDMI 1080p Monitor
sytyguy | Senior Member | 13. May 2012 @ 22:57
Originally posted by sammorris: Out of curiosity, what boxes are you using for the USB3 transfer? About to buy first USB3 box next week, will be interested to see how it compares.
Both are Mygica USB 3.0 Super Speed SATA Hard Drive Docking Stations. Purchased from Meritline for $28.99 with free shipping; however, they are probably cheaper now.
Senior Member (4 product reviews) | 14. May 2012 @ 02:30
Originally posted by Mr-Movies: Definitely running multiple video cards in SLi or Crossfire is a waste of money as even if all the slots are capable of 16x the GPUs are still master/slave and only 1/4+ of the work is handed off from the master card. It is much better to get an x2 video card than two singles.
It's highly dependent on hardware and decent software coding; I can easily get an extra 1/2 the performance, sometimes more, from my two GTX 460s.
However, to sit there and say that you're only going to get 1/4 performance for all SLI/Crossfire configurations is just false. You may only get 1/4 of the performance on your setup, this is true, but that could be due to a plethora of issues; program compatibility/coding and hardware capability are just the common two out of many.