Sun blackbox boggles bloggers' brains (and iPod+tubes)

Blink and you'll miss IT Blogwatch, in which Sun unveils its shipping container datacenter. Not to mention a totally tubular iPod dock...

Thus blogged Om Malik:

Sun Microsystems is putting an entire data center in a shipping container, betting that it could help overcome the escalating real estate costs, and can (literally) provide computing on demand. The $500,000 “data center in a box” is going to be available in second half of 2007, reports John Markoff of The New York Times. The water cooled system is painted black, and has seven racks of 35 server computers based on either Sun’s Niagara Sparc [or AMD] Opteron ... The expandable computer system, called Project Blackbox, is based on a standard 20-foot shipping container and can be deployed virtually anywhere there is electricity, chilled water and an Internet connection.

This reminds us of a speculative post by Bob Cringely, back in November 2005. Google hired a pair of very bright industrial designers to figure out how to cram the greatest number of CPUs, the most storage, memory and power support into a 20- or 40-foot box. We’re talking about 5000 Opteron processors and 3.5 petabytes of disk storage that can be dropped off overnight by a tractor-trailer rig ... despite Sun’s optimism, we wonder how many will buy into this version of computing on demand. Connecting data centers to the internet is not a trivial task, and our friends who are savvy in the ways of routers and switches often share their woes over a pint.

Sun CEO Jonathan Schwartz explains why:

As I've been saying for a while, our customers ... face a diversity of tough challenges. What does the CIO in midtown Manhattan do when she runs out of roof space or power? How does an aid agency deliver basic connectivity to 5,000 relief workers in a tsunami stricken metropolis? What does an oil company do when they want to move high performance analytics onto an offshore platform or supertanker? Or a large web services company do when they want to cookie cutter their infrastructure next to a hydroelectric plant for cheap power - within weeks, not years?

None of these are easy problems to solve - especially one computer at a time. They're more commonplace than you'd think across the globe. And now you know the motivation behind our asking a simple question, "what would the perfect datacenter look like?"


The world's transportation infrastructure has been optimized for ... moving containers on rails, roads and at sea ... we turned the rack 90 degrees, and created a vastly more efficient airflow across multiple racks. And why not partially cool with water in addition to air ... vector the air, augment with a water chiller, and cooling expense plummets. As does your impact on the environment ... if you can generate power for less than the power company charges you, why not do so - put a generator next to the chiller in a sister container, and you've got access to nearly limitless cheap power. (Heck, you could run it on bio-diesel.)


in most datacenters I visit ... operators run out of power capacity long before they fill up their datacenters ... In a container ... we jam systems to a multiple of the density level and really scrimp on space. And it can run anywhere, in the basement, the parking garage, or on a rooftop. Where utilities, not people, belong.

Martin MC Brown will take one to go:

Back in my college years I applied for a job at a company that provided computing power for a stock trading company. Understandably here, time was money, and so part of their disaster recovery plan was a complete duplicate of their internal setup that was, and I quote, 'kept in the back of a lorry round the corner'. ... Sun now provides something similar, but perhaps more off-the-shelf than a custom built setup in a lorry, in the form of Project Blackbox.

But Rob Mitchell wants it to stay:

In the movie Goodfellas, the New York mob makes off with millions of dollars in the famous Lufthansa heist. But instead of hijacking trucks, what if criminals could hitch up to your data center and haul it away? That could be a new security concern if IT organizations take to new, modular data centers such as Sun's Project Blackbox.

Sun's Hal Stern adds:

Today, we have a hierarchy of nesting structures that give data center architects a wide variety of choices and control points. The announcement of Sun's Project Blackbox is just the highest-level abstraction in this stack.


How do you manage it? Capture events? Do physical security? Identify the basic unit of work, deployment, and cost for power, cooling, space, and compute/storage density? These are all questions that get asked as part of virtualization at a lower level of abstraction (anyone who wants to implement CPU-level virtualization but hasn't built a model for allocating applications to CPU resources is making work, not progress). Answer these questions about your data center and you can build a box -- literally -- around it. That's Project Blackbox: stimulating conversation about data center abstraction. Generating some interest in literally boxing it up and moving it to where the space, power, cooling, or environmentals are more friendly.

Carl Howe calls it "Sun's most important product":

This is the ultimate computing commoditization play ... in an increasingly environmentally-conscious business environment where power is a major cost, the system breaks some new ground ... since the system is water-cooled, these systems could have a fascinating dual-use application in northern climates. The warmed water can be used as a heat source for adjacent buildings. With 80% of power in a data center being converted directly into heat, the ability to heat an office as well as providing computing services to it provides even more value to business buyers.


With data centers running from $250 million to a billion dollars to build, Sun's Blackbox changes the economics of that market by starting around $500,000. That's like IBM selling $5,000 PCs that could do the work of $500,000 DEC mainframes -- and I predict it will be just as disruptive ... this could easily be one of the most transformational computing products of the decade for business. It changes data centers from a build-it-yourself business to one where they are available off the shelf and on demand. And, like the standardized shipping containers in which they are built, most people will never give them a second thought.

Nick Carr calls it "Trailer park computing":

The containerized data center is one more manifestation of the fundamental shift that is transforming corporate computing - the shift from the Second Age client-server model of fragmented, custom-built computing components to the Third Age model of standardized, utility-class infrastructure. As this shift plays out, the center of corporate computing will move from the personal computer upstream to the data center. And, inevitably, what happened to the PC - standardization and commoditization - will happen to the data center as well. What is Sun's data-center-in-a-box but an early example of the data center as a standardized commodity, an off-the-shelf, turnkey black box?


In many ways, the containerized data center resembles the standardized electricity-generation system that Thomas Edison sold to factories at the end of the 19th century and the beginning of the 20th. Manufacturers bought a lot of those systems to replace their complex, custom-built hydraulic or steam systems for generating mechanical power. Edison's off-the-shelf powerplant turned out to be a transitional product - though a very lucrative one. Once the distribution network - the electric grid - had matured, factories abandoned their private generating stations altogether, choosing to get their power for a monthly fee from utilities, the ultimate black boxes.

But all Techdirt's Joe sees is a publicity stunt:

It's not clear how big the market for this is going to be. It will probably be too much for most small companies, while large companies with massive infrastructure needs probably won't get much use out of an additional shipping container filled with gear. The sweet spot of companies for whom this will be ideal seems small. Its impact on Sun's business won't be as significant as what it represents: the continuing commoditization of corporate infrastructure.

Scott Yang agrees, but then changes his mind:

it is no more than a strategy for Sun to sell more boxes. If people are thinking about ecological impact, reducing energy bills, better power utilisation, etc, they need to start thinking about building better and more efficient software applications and taking advantage of virtualization and clustering to increase CPU utilisation. Sometimes I wonder why many Web 2.0 sites require so much server power. CPU power in servers increases every year, but somehow they can never keep up with the traffic.


[On the other hand] reducing power consumption, increasing software efficiency, minimising server costs — these don't matter. After all, servers are cheap, coders are not, and timing is super-expensive. Don't have time to tighten up the code? Buy more servers and buy more time, and pray that Google, Yahoo, Microsoft or Amazon will buy you before the bank account runs dry. Sun Project Blackbox — helps your incompetent developers to buy more time. Why not?

Sun's Dave Douglas gives us a tour on YouTube:

Hi there ... I'm vice president of advanced technology here at Sun Microsystems. Over the last couple of years, we've heard a lot of customers tell us... [for a transcript, PayPal $50 to]

John Sinteur sums up the thoughts of many:

Crap. Of all the systems they could have used in their “Try&Buy” system, this one isn’t in there.

Buffer overflow:

Around the Net

Around Computerworld

And finally... this iPod dock runs on a series of tubes

Richi Jennings is an independent technology and marketing consultant, specializing in email, blogging, Linux, and computer security. A 20-year, cross-functional IT veteran, he is also an analyst at Ferris Research. Contact Richi at

Copyright © 2006 IDG Communications, Inc.