effort from chip makers, software developers, and academic com-
puter scientists. Indeed, Illinois's UPCRC is funded by Microsoft
and Intel---the two companies that have the most to gain if multi-
core computing succeeds, and the most to lose if it fails.
INVENTING NEW TOOLS
If software keeps getting more complex, it's not just because more
features are being added to it; it's also because the code is built on
more and more layers of abstraction that hide the complexity of
what programmers are really doing. This is not mere bloat: pro-
grammers need abstractions in order to make basic binary code
do the ever more advanced work we want it to do. When it comes
to writing for parallel processors, though, programmers are using
tools so rudimentary that James Larus, director of software archi-
tecture for the Data Center Futures project at Microsoft Research,
likens them to the lowest-level and most difficult language a pro-
grammer can use.
"We couldn't imagine writing today's software in assembly lan-
guage," he says. "But for some reason we think we can write parallel
software of equal sophistication with the new and critical pieces
written in what amounts to parallel assembly language. We can't."
That's why Microsoft is releasing parallel-programming tools
as fast as it can. F#, for example, is Microsoft's variant of the
general-purpose ML programming language. Not only does it
parallelize certain functions, but it prevents them from interacting
improperly, so parallel software becomes easier to write.
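F#'s particulars aside, the core idea is simple: a function
with no side effects can run on many cores at once, because its
calls have no way of stepping on one another. A rough sketch of
that idea in Python rather than F# (the function and its inputs
are invented for illustration):

    # A pure function's calls cannot interact improperly, so a
    # runtime can safely farm them out to separate cores. A
    # sketch only; F# handles this far more gracefully.
    from multiprocessing import Pool

    def score(word):  # pure: output depends only on the input
        return sum(ord(ch) for ch in word)

    if __name__ == "__main__":
        words = ["multicore", "parallel", "assembly", "software"]
        with Pool() as pool:                  # one worker per core
            results = pool.map(score, words)  # calls run in parallel
        print(results)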
Intel, meanwhile, is sending Ghuloum abroad one week per
month to talk with software developers about multicore archi-
tecture and parallel-programming models. "We've taken the
philosophy that the parallel-programming 'problem' won't be
solved in the next year or two and will require many incremen-
tal improvements---and a small number of leaps---to existing lan-
guages," Ghuloum says. "I also tend to think we can't do this in a
vacuum; that is, without significant programmer feedback, we will
undoubtedly end up with the wrong thing in some way."
In both the commercial and the open-source markets, other
new languages and tools either tap the power of multicore process-
ing or mask its complexity. Among these are Google's MapReduce
framework, which makes it easier to run parallel computations
over clusters of computers, and Hadoop, an open-source imple-
mentation of MapReduce that can distribute applications across
thousands of nodes. Programming languages like Clojure and
Erlang were designed from the ground up for parallel com-
puting. The popular Facebook chat application was written
partly in Erlang.
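To give a feel for the MapReduce model, here is its canonical
example, a toy word count, compressed into single-machine
Python; a real MapReduce or Hadoop job runs the same two phases
across thousands of machines:

    # Sketch of the MapReduce model on one machine: a map phase
    # emits (key, value) pairs, and a reduce phase combines the
    # values for each key.
    from collections import defaultdict

    def map_phase(document):
        return [(word, 1) for word in document.split()]

    def reduce_phase(pairs):
        counts = defaultdict(int)
        for word, n in pairs:
            counts[word] += n
        return dict(counts)

    documents = ["the cat sat", "the cat ran"]
    pairs = [p for doc in documents for p in map_phase(doc)]
    print(reduce_phase(pairs))  # {'the': 2, 'cat': 2, 'sat': 1, 'ran': 1}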
Meanwhile, MIT spinoff Cilk Arts can break programs written
in the established language C++ into "threads" that can be executed
in parallel on multiple cores. And St. Louis--based Appistry claims
that its Enterprise Application Fabric automatically distributes
applications for Microsoft's .Net programming framework across
thousands of servers without requiring programmers to change
a single line of their original code.
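What Cilk and Appistry automate is, at bottom, the fork-join
pattern: split a job into independent pieces, run them on
separate cores, and merge the results. A minimal hand-rolled
sketch in Python, with the workload invented for illustration:

    # Hand-rolled fork-join: split a job into independent chunks
    # ("fork"), run them on separate cores, and combine the
    # partial results ("join"). Tools like Cilk derive this
    # structure from lightly annotated source code.
    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(chunk):
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = list(range(1_000_000))
        chunks = [data[i::4] for i in range(4)]       # fork
        with ProcessPoolExecutor(max_workers=4) as ex:
            total = sum(ex.map(partial_sum, chunks))  # join
        print(total)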
THE LIMITS OF MULTICORE COMPUTING
Just as Intel's dream of 10- and 30-gigahertz chips gave way to the
pursuit of multicore computing, however, multicore itself might
be around for a matter of years rather than decades. The efficiency
of parallel systems declines with each added processor, as cores
vie for the same data; there will come a point at which adding
another core to a chip will actually slow it down. That may well
set a practical limit on the multicore strategy long before we start
buying hundred-core PCs.
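That point can be made concrete with a back-of-the-envelope
model. Amdahl's law caps the speedup from n cores at
1 / ((1 - p) + p/n), where p is the fraction of a program that
can run in parallel; add a small coordination cost for every
extra core, and the curve no longer merely flattens but peaks
and turns down. The numbers below are invented for illustration,
not measurements:

    # Illustrative model only: Amdahl's law plus a per-core
    # coordination cost. p = parallel fraction of the work,
    # c = overhead each extra core adds; both are made up.
    p, c = 0.95, 0.002

    def speedup(n):
        return 1 / ((1 - p) + p / n + c * (n - 1))

    for n in (1, 2, 4, 8, 16, 32, 64, 128):
        print(n, round(speedup(n), 2))
    # Peaks in the low tens of cores, then declines as the
    # coordination cost outweighs each added core.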
Does it matter, though? While there may be applications that
demand the power of many cores, most people aren't using those
applications. Other than hard-core gamers, few people are com-
plaining that their PCs are too slow. In fact, Microsoft has empha-
sized that Windows 7, the successor to the troubled Windows Vista,
will use less processing power and memory than Vista---a move
made necessary by the popularity of lower-power mobile com-
puting platforms and the expected migration of PC applications
to Internet-based servers. A cynic might say that the quest for
ever-increasing processing power is strictly commercial---that
semiconductor and computer companies, software vendors, and
makers of mobile phones need us to buy new gizmos.
So what's the downside if multicore computing fails? What
is the likely impact on our culture if we take a technical zig that
should have been a zag and suddenly aren't capable of using all 64
processor cores in our future notebook computers?
"I can't wait!" says Steve Wozniak, the inventor of the Apple II.
"The repeal of Moore's Law would create a renaissance for software
development," he claims. "Only then will we finally be able to cre-
ate software that will run on a stable and enduring platform."
"In schools," says Woz, "the life span of a desk is 25 years, a text-
book is 10 years, and a computer is three years, tops. Which of
these devices costs the most to buy and operate? Why, the PC, of
course. Which has residual value when its useful life is over? Not
the PC---it costs money to dispose of. At least books can be burned
for heat. Until technology slows down enough for computing plat-
forms to last long enough to be economically viable, they won't
be truly intrinsic to education. So the end of Moore's Law, while
it may look bad, would actually be very good."
ROBERT X. CRINGELY HAS WRITTEN ABOUT TECHNOLOGY FOR 30 YEARS. HE IS
THE AUTHOR OF ACCIDENTAL EMPIRES: HOW THE BOYS OF SILICON VALLEY MAKE
THEIR MILLIONS, BATTLE FOREIGN COMPETITION, AND STILL CAN'T GET A DATE.