Full System Simulation: Software Development's Missing Link

The primary challenge in delivering an electronic system is developing and testing the software. Systems aren't just computer-like boxes; they're complex products, from planes to cell phones. The Airbus A380, for example, is expected to contain more than 1 billion lines of code, while a cell phone can contain several million.

The process of delivering a working electronic system has moved away from the traditional one, where designers finalized the hardware design, then completed the software, then performed the final system integration. Instead, driven by cost and time-to-market pressures, software development using full system simulation is becoming the technique of choice.

In the past, two approaches to software development predominated: host-based and hardware-based.

In host-based development, designers would first create a test scaffold on a desktop computer, even though the final product would eventually ship on some other platform. The vain hope was that, for example, code developed on an Intel-based PC using this scaffold would behave the same once it ran on a MIPS processor under a real-time operating system.

Because the weaknesses of this approach wouldn't surface until final system integration, that phase of the project would become unpredictable and prolonged. The advantage of this approach is that the code runs very fast, because it is natively compiled for the Intel processor. The obvious disadvantage is that it doesn't reflect how the real system will behave.

In addition, this approach clearly won't work for developing device drivers, operating system kernels or any other software that interacts intimately with the underlying hardware. The development of the "golden" code -- the code that will actually ship with the product -- doesn't begin until hardware is available.

Conversely, hardware-based development uses either real hardware or a surrogate, typically built using field-programmable gate arrays (FPGAs). Real hardware has the advantage of being as accurate a model as possible.

The big disadvantage is that the hardware has to actually exist, or at least be sufficiently well advanced that the FPGAs can be programmed. Even then, hardware isn't an especially friendly debugging environment. Worse, software and hardware development are unnecessarily serialized.

Furthermore, the device being designed doesn't operate in isolation. For example, a set-top box may need to communicate with a video server, and perhaps a billing server, while the servers need to communicate with a PC used for management control. Because of the need to produce an accurate test environment, systems companies using this approach spend tens or hundreds of millions of dollars on test racks.

Advances in simulation technology, coupled with the inexorable increase in the computing power of off-the-shelf PCs and workstations, are now providing a superior approach for addressing the challenges of system design. The solution is to use full-system simulation to virtualize the product development process.

To take the virtual approach, a software model of the system, known as a "virtual platform," is built and run in a full-system simulation environment. The virtual platform must have both fidelity and performance: fidelity, so that the software "cannot tell the difference" and binaries of the golden code run on the virtual platform unchanged; and performance high enough that software developers prefer the virtual platform, with its superior debugging features, to real hardware.
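To make the idea concrete, here is a minimal sketch of the fetch-decode-execute loop that sits at the heart of any such simulator. The toy instruction set, register file and memory model below are invented purely for illustration; a real virtual platform models an actual processor architecture and its devices with enough fidelity that shipping binaries run unchanged.

```python
# A toy "virtual platform": one CPU with four registers and a flat data memory.
# The instruction set (LOAD, ADD, STORE, HALT) is hypothetical, invented for
# this sketch; a real platform would model a genuine architecture.

class ToyCPU:
    def __init__(self, program, memory):
        self.regs = [0] * 4        # general-purpose registers r0-r3
        self.pc = 0                # program counter (index into program)
        self.program = program     # list of (op, a, b) tuples
        self.mem = memory          # flat data memory (list of integers)
        self.halted = False

    def step(self):
        """Fetch, decode and execute exactly one instruction."""
        op, a, b = self.program[self.pc]
        self.pc += 1
        if op == "LOAD":           # rA <- mem[b]
            self.regs[a] = self.mem[b]
        elif op == "ADD":          # rA <- rA + rB
            self.regs[a] += self.regs[b]
        elif op == "STORE":        # mem[b] <- rA
            self.mem[b] = self.regs[a]
        elif op == "HALT":
            self.halted = True

# Sum two numbers held in memory and store the result in a third cell.
cpu = ToyCPU(program=[("LOAD", 0, 0), ("LOAD", 1, 1),
                      ("ADD", 0, 1), ("STORE", 0, 2), ("HALT", 0, 0)],
             memory=[3, 4, 0])
while not cpu.halted:
    cpu.step()
print(cpu.mem[2])                  # -> 7
```

The same stepwise structure is what makes the debugging features discussed below -- stopping, check-pointing and replaying -- straightforward to provide.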

Performance of full-system simulation can now reach peak speeds of more than 1 billion instructions per second on a commodity Linux workstation. Idle systems simulate even faster. For example, it is possible to simulate a 24-processor, 8GB, 64-bit enterprise server on a "vanilla" laptop; that is, to simulate a $2 million product on a $2,000 PC.

To be effective, the virtual platform must be capable of simulating not only the system being designed, but enough of the environment surrounding it to model real-world use -- a virtual test rack. Fortunately, virtualization is increasingly scalable, and it's now possible to simulate even thousands of systems on a network of inexpensive workstations.
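As a rough sketch of how a single simulation process can host a whole rack of machines, the fragment below (reusing the hypothetical ToyCPU from the earlier sketch) advances a thousand simulated systems in round-robin fashion. A real virtual test rack would also model the network links, servers and management PCs the product talks to, and could spread the machines across many host workstations, but the scheduling idea is the same.

```python
# A toy "virtual test rack": many simulated machines advanced in lockstep by
# one round-robin scheduler. Each machine here is an independent copy of the
# toy system defined above.
rack = [ToyCPU(program=[("LOAD", 0, 0), ("LOAD", 1, 1),
                        ("ADD", 0, 1), ("STORE", 0, 2), ("HALT", 0, 0)],
               memory=[i, i + 1, 0])
        for i in range(1000)]

while not all(machine.halted for machine in rack):
    for machine in rack:           # give each machine one time slice
        if not machine.halted:
            machine.step()

print(rack[0].mem[2], rack[999].mem[2])   # -> 1 1999
```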

The advantages of virtualization fall under two broad headings: First, a virtual approach to system development is less expensive than real hardware, creating an immediate saving in the capital that needs to be dedicated to software development and testing. This is especially true for organizations with large product development teams since the usual economics of software come into play.

While developing the first instance of a virtual platform may be relatively expensive, duplicating it is inexpensive. Every engineer who needs access to the virtual platform can get it, unlike real hardware, where cost constraints mean there is never enough to go around. This makes it easier to raise the bar on quality while continuing to meet schedules.

Second, the virtual approach is superior to real hardware in a number of key ways. The most obvious benefit is that it can be available much earlier since the models aren't overly complex and can be produced quickly. This enables hardware and software development to largely overlap, providing a clear time-to-market advantage.

In addition, it's impossible to stop a real disk drive to investigate its internals, but easy to stop a virtual one. Unlike real hardware, a virtual platform is deterministic, so all bugs that show up can be re-created.

A fully simulated system can be check-pointed and later restarted, making many aspects of system testing much more effective than with real hardware. For example, multiple copies of the system can be launched with different faults injected to ensure that all error cases are handled correctly.
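A hedged illustration of why check-pointing makes fault injection so convenient, again using the hypothetical ToyCPU: snapshot the complete simulated state once, then launch several trials from that same snapshot, each with a different fault injected before resuming. The copy-based checkpoint below is invented for this sketch; real simulators provide much richer checkpoint and fault-injection machinery.

```python
import copy

def checkpoint(cpu):
    """Snapshot the complete simulated state: registers, pc and memory."""
    return copy.deepcopy(cpu)

# Build a fresh toy system and checkpoint it before the run of interest.
cpu = ToyCPU(program=[("LOAD", 0, 0), ("LOAD", 1, 1),
                      ("ADD", 0, 1), ("STORE", 0, 2), ("HALT", 0, 0)],
             memory=[3, 4, 0])
snap = checkpoint(cpu)

# Launch several trials from the same checkpoint, each with a different fault
# injected -- here, corrupting one memory cell before resuming.
for faulty_cell in range(len(snap.mem)):
    trial = copy.deepcopy(snap)        # restore the checkpoint
    trial.mem[faulty_cell] = 99        # inject the fault
    while not trial.halted:
        trial.step()
    print(f"fault in cell {faulty_cell}: result in cell 2 = {trial.mem[2]}")
```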

Many problems in today's systems occur during interactions between different products or multiple instances of the same product. These are precisely the areas that are most difficult to test with real hardware, which is notorious for bugs that disappear when inspected closely.

Being fully deterministic, even when distributed over multiple host computers, virtualization makes tracking these problems much easier, enabling you to go back in time to just before the bug occurs and investigate the internals at any level of detail.
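A final sketch, again with the toy system and checkpoint from above, shows the principle behind this kind of time travel: because a run from a given checkpoint is deterministic, it is enough to note the step at which the failure first appears, then replay from the checkpoint and stop one step earlier to inspect the state just before the bug. Real simulators implement reverse execution far more efficiently, but they rely on the same determinism.

```python
import copy

def run_until(cpu, predicate, max_steps=1_000_000):
    """Step deterministically until the predicate holds (or max_steps pass);
    return the number of steps actually executed."""
    steps = 0
    while not cpu.halted and steps < max_steps and not predicate(cpu):
        cpu.step()
        steps += 1
    return steps

# Reproduce a failing run: restore the checkpoint, inject the same fault, and
# note the step at which the bad value first appears in memory cell 2.
failing = copy.deepcopy(snap)
failing.mem[1] = 99                      # the fault being investigated
failing_step = run_until(failing, lambda c: c.mem[2] > 10)

# Deterministic replay: the same checkpoint plus the same inputs yields the
# same instruction sequence, so stopping one step earlier lands just before
# the bug, with every register and memory cell available for inspection.
replay = copy.deepcopy(snap)
replay.mem[1] = 99
run_until(replay, lambda c: False, max_steps=failing_step - 1)
print("pc:", replay.pc, "regs:", replay.regs, "mem:", replay.mem)
```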

As commodity workstations become ever faster and cheaper, and development simulation technologies continue to improve, the advantages of a virtual approach to system development will be undeniable.
