Blackfriars' Marketing

Monday, December 17, 2007

Multi-core software thrust neglects the real challenge: killer apps

Today's New York Times cites an ongoing claim by Microsoft that faster chips are leaving programmers in the dust. Surprisingly for the Times, the article is somewhat breathless in its admiration for the work Microsoft is doing in this area, to the point where it cites no other software company.

To accelerate its parallel computing efforts, Microsoft has hired some of the best minds in the field and has set up teams to explore approaches to rewriting the company’s software.

If it succeeds, the effort could begin to change consumer computing in roughly three years. The most aggressive of the Microsoft planners believe that the new software, designed to take advantage of microprocessors now being refined by companies like Intel and Advanced Micro Devices, could bring as much as a hundredfold computing speed-up in solving some problems.

Microsoft executives argue that such an advance would herald the advent of a class of consumer and office-oriented programs that could end the keyboard-and-mouse computing era by allowing even hand-held devices to see, listen, speak and make complex real-world decisions — in the process, transforming computers from tools into companions.

The flaw in Microsoft's strategy around parallel processing becomes apparent toward the end of the article.

Microsoft sees this as the company’s principal opportunity, and industry executives have said that the arrival of manycore microprocessors is likely to be timed to the arrival of “Windows 7.” That is the name the company has given to the follow-on operating system to Windows Vista.

The opportunity for the company is striking, Mr. Mundie said, because manycore chips will offer the kind of leap in processing power that makes it possible to take computing in fundamentally new directions.

He envisions modern chips that will increasingly resemble musical orchestras. Rather than having tiled arrays of identical processors, the microprocessor of the future will include many different computing cores, each built to solve a specific type of problem. A.M.D. has already announced its intent to blend both graphics and traditional processing units onto a single piece of silicon.

In the future, Mr. Mundie said, parallel software will take on tasks that make the computer increasingly act as an intelligent personal assistant.

“My machine overnight could process my in-box, analyze which ones were probably the most important, but it could go a step further,” he said. “It could interpret some of them, it could look at whether I’ve ever corresponded with these people, it could determine the semantic context, it could draft three possible replies. And when I came in in the morning, it would say, hey, I looked at these messages, these are the ones you probably care about, you probably want to do this for these guys, and just click yes and I’ll finish the appointment.”

So here's a question for skeptical readers: how long do you think the above email response example cited by Mr. Mundie would take on today's dual-core Intel processors? The answer: a few seconds at most. No matter how much semantic analysis you layer in here, email is not by any stretch of the imagination constrained by processor speed. So why does Mundie cite this example? Because Microsoft sells email systems.
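
To make that concrete, here is a minimal sketch of the kind of overnight triage Mundie describes; the inbox, the sender list, and the "semantic" scoring rule are all invented for illustration. Even a naive, single-threaded pass over a few thousand messages finishes in milliseconds on one core of a commodity processor.

    # Hypothetical sketch of Mundie's inbox triage: rank a few thousand
    # synthetic messages by sender familiarity and keyword hits, timed
    # on a single core. Corpus and scoring rule are invented.
    import random
    import time

    KEYWORDS = {"meeting", "deadline", "contract", "urgent", "invoice"}
    KNOWN_SENDERS = {f"colleague{i}@example.com" for i in range(50)}

    def make_inbox(n):
        """Generate n synthetic (sender, body) messages."""
        vocabulary = list(KEYWORDS) + ["lunch", "newsletter", "fyi", "thanks"]
        senders = list(KNOWN_SENDERS) + ["stranger@example.com"]
        return [(random.choice(senders),
                 " ".join(random.choices(vocabulary, k=40)))
                for _ in range(n)]

    def importance(sender, body):
        """Toy 'semantic' score: sender familiarity plus keyword hits."""
        score = 2 if sender in KNOWN_SENDERS else 0
        return score + sum(1 for word in body.split() if word in KEYWORDS)

    inbox = make_inbox(5000)
    start = time.perf_counter()
    ranked = sorted(inbox, key=lambda m: importance(*m), reverse=True)
    print(f"Triaged {len(inbox)} messages in "
          f"{(time.perf_counter() - start) * 1000:.1f} ms")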

The problem articulated in the article isn't chip technology, software design, or parallel processing development. The real underlying problem is marketing: Microsoft and other technology companies haven't identified a consumer segment or application that desperately needs high-performance computing. Said another way, there's no killer consumer application for ten-core processors, to say nothing of the 60- or 100-core processors currently on the drawing board.

Now this revelation isn't exactly new. The technology industry was struggling a decade ago to find a killer application for faster processors. Some killer applications did emerge, but each has problems as a driver of demand for faster processors:

  1. Internet browsing and email. As noted above, these applications are not exactly processor-intensive. Rather, they spend most of their time waiting either for the user to input something or for an Internet web server to respond over a broadband connection. Today's multi-gigahertz processors handle these applications just fine; so fine, in fact, that nowadays mobile phones on fast networks are nearly as responsive as computers (see the sketch after this list).

  2. Advanced photo and video editing. These applications can be very processor-intensive, as anyone who runs Photoshop, Aperture, Final Cut, or Adobe Premiere can attest. The challenge is that these applications don't have a broad base of users. Yes, professional photographers, TV stations, and movie studios need them. But those uses account for sales of a few million computers at most, not the hundreds of millions needed to justify a decade of new software development.

  3. Computer gaming. This application is probably today's biggest consumer of PC computing cycles. Ask any gamer, and they'll tell you they'll buy as much computing power as they can afford. But because of the high cost of computing power to date, this segment has moved toward dedicated gaming hardware, since console manufacturers can optimize their machines for that one task. The result: computer gaming is only a weak driver of parallel processing software development, despite its appetite for as many gigaFLOPS as possible.
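
To illustrate the point in item 1, here is a minimal sketch comparing the CPU work in email against the time spent waiting on the network; the 100 ms server round-trip figure and the toy message-parsing workload are illustrative assumptions, not measurements of any real client.

    # Sketch of why email and browsing are I/O-bound: compare the CPU
    # time to parse 100 messages against one assumed network round-trip.
    # The 100 ms latency and the parsing workload are illustrative only.
    import time

    MESSAGE = ("From: colleague@example.com\r\n"
               "Subject: Q4 numbers\r\n"
               "\r\n" + "word " * 2000)

    start = time.perf_counter()
    for _ in range(100):
        # Toy stand-in for client-side work: split headers from body,
        # build a header dictionary, count the words in the body.
        headers, _, body = MESSAGE.partition("\r\n\r\n")
        fields = dict(line.split(": ", 1) for line in headers.split("\r\n"))
        word_count = len(body.split())
    cpu_ms = (time.perf_counter() - start) * 1000

    print(f"CPU to parse 100 messages: {cpu_ms:.1f} ms")
    print("One assumed network round-trip: 100.0 ms")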


Vendors won't crack the problem of parallel processing software by focusing on broad "everyone needs them" applications like email. Instead, they need to identify early adopter market segments: communities that are desperate for more computing power today and are already rolling their own multi-core software to get it. Some of those communities are obvious, such as pharmaceutical companies that screen countless chemical compounds in search of new drugs, and Wall Street trading firms and hedge funds that model financial markets for profit.
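
For a flavor of what "rolling their own multi-core software" looks like in those segments, here is a minimal sketch, assuming a Monte Carlo valuation of a European call option under Black-Scholes dynamics with made-up parameters; the simulation paths are independent, so the work splits cleanly across cores.

    # Sketch of embarrassingly parallel financial modeling: Monte Carlo
    # pricing of a European call, with simulation paths split across
    # cores. Model and parameters are hypothetical, for illustration.
    import math
    import random
    from multiprocessing import Pool

    S0, K, R, SIGMA, T = 100.0, 105.0, 0.05, 0.2, 1.0  # made-up inputs

    def price_batch(n_paths):
        """Average discounted payoff over one independent batch of paths."""
        total = 0.0
        for _ in range(n_paths):
            z = random.gauss(0.0, 1.0)
            s_t = S0 * math.exp((R - 0.5 * SIGMA ** 2) * T
                                + SIGMA * math.sqrt(T) * z)
            total += max(s_t - K, 0.0)
        return math.exp(-R * T) * total / n_paths

    if __name__ == "__main__":
        cores, paths_per_core = 4, 250_000
        with Pool(cores) as pool:
            estimates = pool.map(price_batch, [paths_per_core] * cores)
        print(f"Estimated option price: {sum(estimates) / cores:.4f}")

Because no path depends on any other, doubling the cores roughly halves the runtime; drug-candidate screening has the same shape, with compounds in place of paths.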

As Geoffrey Moore noted in his classic book, Crossing the Chasm, products never hit a broad-based majority right out of the gate. Instead, they evolve through much smaller market segments: first technology enthusiasts, then a pragmatic early majority that demands a whole product solving a significant problem. Even Microsoft Excel wasn't originally used by most computer users; its early adopters were Mac users (the platform where the product was released first) and accountants who really needed a spreadsheet that ran under Windows. Why would anyone think multi-core software would be different?

I assume that with such a star-studded group of parallel processing researchers, Microsoft will figure out the technical issues fairly quickly. And if it wants suggestions for some very important, highly parallel applications, here's some food for thought: Google was so unsatisfied with the price, performance, and physical density of PCs for its highly parallel applications that it designed and built its own machines for its data centers. Unlike today's PCs, those processors and cores get used whenever and wherever there is demand. And when you have billions of requests per day for your services and applications, as noted by this article in the Sunday New York Times, parallelism is pretty easy to find.
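
That kind of parallelism needs no clever decomposition at all, as this minimal sketch suggests; the "request" here (hashing a payload) is a synthetic stand-in, not Google's actual workload.

    # Sketch of request-level parallelism: independent requests fan out
    # across a worker pool with no shared state to coordinate, so
    # throughput scales with the number of cores.
    import hashlib
    from multiprocessing import Pool

    def handle_request(payload: bytes) -> str:
        """Serve one request; nothing is shared with other requests."""
        return hashlib.sha256(payload).hexdigest()

    if __name__ == "__main__":
        requests = [f"query-{i}".encode() for i in range(100_000)]
        with Pool() as pool:  # defaults to one worker per available core
            responses = pool.map(handle_request, requests, chunksize=1000)
        print(f"Served {len(responses)} requests")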

Full disclosure: the author is long Google at the time of writing.


