The Multi-core Scam

September 23, 2010

I’d just like to point out what a total scam multi-core/multi-processor Mac Pros are for most users. Others have pointed out the failings of Apple’s Pro line, but I’m talking specifically about multiple processors.

Computer manufacturers have been selling multi-core computers for years now. One reason is that chip makers have run up against practical limits on raw clock speed and have been compensating by putting two or more multi-core chips in a computer rather than adding pure computing speed. So despite the promise of Moore’s law, we haven’t really seen clock speeds increase for years; manufacturers have made up for it by adding more processors. That is why newer computers can actually have slower clock speeds than older ones. For example, all the way back in August 2006 Apple announced a 3 GHz Xeon Mac Pro. And 3 years later they announced… a 2.93 GHz Mac Pro? Even today the fastest processor available for the Mac Pro is 3.33 GHz (and only 2.93 GHz in their top-of-the-line, most expensive 12-core machine). So what gives? Well, the 2006 Mac Pro had two dual-core chips for a total of 4 processors, and today’s Mac Pro comes with up to 12. That’s what they’re selling. TWELVE PROCESSORS!

There would be nothing wrong with this if applications could use all that processing power. But there’s the rub: very few programs correctly, efficiently, and transparently use all of those cores.
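There is a classical way to see why piling on cores pays off so poorly: Amdahl’s law, which says that if only a fraction of a task can run in parallel, extra cores hit a hard ceiling. A minimal sketch in Python, where the 75% parallel fraction is purely an illustrative assumption, not a measured figure for any real app:

```python
# Amdahl's law: overall speedup on n cores when a fraction p of the
# work can run in parallel (the rest stays serial).
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Suppose only 75% of a transcode pipeline parallelizes (illustrative).
for n in (1, 2, 4, 12):
    print(f"{n:>2} cores: {speedup(0.75, n):.2f}x")
```

Under this assumption 12 cores deliver only about a 3.2x speedup, and even infinitely many cores can never beat 1/(1 − 0.75) = 4x. That is why a 12-core box can feel barely faster than a 4-core one.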

It would be one thing if we were on the bleeding edge here, talking about early adopters who just have to be patient and wait for the software to catch up with the hardware. But that is not what we’re talking about. Way back in 2005 I bought a dual 2.0 GHz G5 PowerMac, one of the last before Apple switched to Intel. So I’ve been using multiple processors for 5 years now, an eternity in tech time. And there were multi-processor G4 machines for years before that! So one would think that by today, programs would be taking advantage of all those processors (12!!!!). I mean, I could understand it way back in 2005 when, as an early adopter, I accepted that many programs couldn’t take advantage of multiple processors. But this is 2010.

Yet it is still the case that very, very few programs are written to take advantage of multiple cores, and those that are mostly do so poorly. So why are they selling 12 processors? Just to give you some examples: for work I need to transcode video into different formats, a very processor-intensive task, and one that could greatly benefit from multiple cores. But many programs do this very poorly. Here’s Red Giant’s Grinder, which is designed to use up to eight cores:

8 core machine with 16 virtual cores

It is working on 8 clips at nowhere near full capacity.

And here is Apple’s own Compressor, which does not use multiple cores by default but can be tricked into it by making a ‘virtual cluster’ of machines, where the “machines” are your own processor cores:

8 core machine with 16 virtual cores

It starts off stronger on the same 8 clips, but then tapers off. Admittedly I have gotten better results with Compressor at times, but there is no rhyme or reason to it. No changes. Just try again.

And if you wanted to do this work right from within an editing program like Final Cut Pro, forget it:

8 core/16 virtual cores working on FCP export

Pretty anemic. So any time you export a video out of Final Cut Pro, or do any of many similar daily tasks, you will not be taking advantage of all that processing power.

There are many other examples out there, because in my opinion the vast majority of programs don’t make full use of multiple cores. You can look at some testing here, here, and here. In fact it is the rare exception that DOES efficiently use all the cores.
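The frustrating part is that batch transcoding is embarrassingly parallel at the clip level: even if the encoder itself is single-threaded, a queue manager could keep every core busy by running one encode per core. A sketch of that idea in Python, with a CPU-bound dummy function standing in for a real encoder (the function and its checksum are stand-ins, not any actual transcoding API):

```python
from multiprocessing import Pool, cpu_count

def transcode(clip):
    # Stand-in for a single-threaded, CPU-bound encode of one clip.
    checksum = sum(range(100_000))
    return (clip, checksum)

def transcode_all(clips):
    # One worker process per core: each clip still encodes serially,
    # but the batch as a whole can saturate the machine. This per-clip
    # scheduling is exactly what the tools above fumble.
    with Pool(processes=min(cpu_count(), len(clips))) as pool:
        return pool.map(transcode, clips)

if __name__ == "__main__":
    results = transcode_all([f"clip{i}.mov" for i in range(8)])
    print(len(results), "clips done")
```

Nothing about this requires the encoder itself to be multi-threaded, which is why it is so galling that dedicated batch tools leave cores idle.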

So why have computer companies been selling multi-core computers for so many years when there is almost no use for them? (And in Apple’s case, when even their own software doesn’t make use of them.) And why has the tech press not pointed this out? Look at Apple’s press release: “The new Mac Pro is the most powerful and configurable Mac we’ve ever made,” said Philip Schiller, Apple’s senior vice president of Worldwide Product Marketing. “With up to 12 cores, the new Mac Pro outperforms our previous top-of-the-line system by up to 50 percent, and with over a billion possible configurations, our customers can create exactly the system they want.” (emphasis added) Yeah, 50%, if you set up some test program that can utilize all 12 cores. Good luck using those 12 cores (24 with hyper-threading!) in the real world.

I purchased a 2.26 GHz 8-core Mac in 2009, before I looked closely at this issue, and realized afterward that I should have gotten the fastest single processor offered. This year that would be the single 6-core 3.33 GHz model, which would be much more useful to me in the real world (not to mention cheaper) than the 2.93 GHz 12-core machine. Apple and other developers have been taking steps to use more processors more efficiently in the future. But we’ve been waiting, and buying their products in the meantime, for 6 or 7 years now. Many multi-core computers have gone completely obsolete in that time, and still we haven’t arrived at a place where we can use these things.
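Some back-of-the-envelope arithmetic makes the 6-core-versus-12-core case. Assume, generously, that a typical app scales well to about 4 cores and no further (an assumption for illustration, not a benchmark), and score each machine by clock speed times usable cores:

```python
# Rough "effective GHz": clock speed times the number of cores a
# typical app can actually keep busy. The 4-core cap is an assumed
# illustration, not a measured scaling limit.
def effective_ghz(clock_ghz, cores, usable_cores=4):
    return clock_ghz * min(cores, usable_cores)

six_core = effective_ghz(3.33, 6)     # the cheaper machine
twelve_core = effective_ghz(2.93, 12) # the top-of-the-line machine
print(six_core, twelve_core)
```

Under that cap the 3.33 GHz 6-core comes out ahead (13.32 vs. 11.72 “effective GHz”), and on single-core-bound tasks its raw 3.33 vs. 2.93 GHz advantage favors it even more.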

I mean, think how many product cycles have gone by for G4, G5, and Mac Pro multi-core machines, and still so little software can use these cores. Still! Today! Apple’s Final Cut Studio is still a 32-bit program. All that marketing, and all those reviews saying this year’s machine is x% faster. To do what?! With what program!?

Frankly, I think it’s been a wholesale scam on the consumer for more than half a decade now. This issue should be pointed out in every review of every multi-core machine: “You would have a faster real-world computer (and spend less money) by getting the fastest chip available rather than more cores.”

This goes for the new iMac too. Why pay $2199 for a quad-core 2.93 GHz when you can pay $1899 for a 3.6 GHz machine? Apple has really gotten away scot-free on this.


There has been a lot of news regarding online advertising lately: Apple’s iAd, Apple’s change to its developer agreement that supposedly cripples competing ad companies’ ability to collect analytics, Google’s purchase of AdMob, the FTC looking into both of those, and criticism of Facebook and its data gathering.

The debate hinges on the assumption that gathering these analytics from users is extremely valuable because advertisers can target potential customers very specifically using all that information that they gather about the user.

What I want to know is: if that’s true, why are the ads I see online so terrible and so off base? For all the bajillions of bytes of data they are collecting from us, why are the ads served up to me about as relevant as if I were watching late-night cable? On Facebook I’ve got a mortgage ad (don’t need one), a “4 foods to never eat” ad (meh), and a get-help-for-being-cheated-on ad (er, unless the Facebook analytics are privy to something I don’t know, this one is off base). On my iPhone I’ve got ads for a singles app (I’m married), for the Bing app (I already have it), and for the Bing app again (just because Microsoft wants to pay to put a Bing banner in every app doesn’t mean ad companies are harnessing the power of user analytics).

For all the talk and handwringing and hundreds of millions of dollars being spent on ad companies and FTC investigations, there seems to be very little ability to actually do anything with this information. I think there is quite a bit of programming and engineering work to be done before advertisers can actually use the data to serve up ads that would really be of interest to any individual.

Leica M9 Thoughts

November 10, 2009

As an M6 owner with a few nice pieces of Leica glass, and as someone who would love to use that glass on a digital body, I’ve been following the digital rangefinder with interest since the Epson RD1. (I actually bought the RD1, but returned it within 7 days after realizing it was not going to give me the true Leica M experience.)

From all I’ve read and seen, it sounds like the M9 really does a nice job of providing a full-frame digital experience in a Leica M camera. While there has been a lot of discussion about the price and what the M series Leica is all about, I feel one side of the discussion is missing: the longevity of the digital sensor. I do not mean to jump into the debate over the value of rangefinders and of M Leicas specifically. I own one; I get it; I love it.

But whereas M Leicas have traditionally lasted forever (we all know there are many working M3s out there worth as much as or more than a new camera), this $7000 digital M9 will be obsolete in a year or two. No matter what happened in the technological advancement of 35mm film, you could put it in your M series Leica. When I purchased the M6 at 4 times the cost of a good SLR, I partly justified it as a camera I would have for life. (Just my luck to come of age at the dawn of the digital era. It was also only $2000 and something.) Now you have to decide whether you want to spend $7000 for a couple years of use before a completely upgraded sensor is released.

If the idea were that you could keep the beautifully crafted body for life and somehow upgrade the sensor, that would be one thing. But really, it’s a $7000 disposable body. This is true of DSLRs too, but I feel the $1600 to $3000 I spend every two to three years is justified by the professional work I use them for. The money I would spend on a Leica would, to me, be a longer-term investment… except that in reality it is not.

Just my 2 cents.

