
Update: Conoco deploys Linux-based supercomputer for use in oil exploration

August 31, 2000 12:00 PM ET

Computerworld - Energy conglomerate Conoco Inc. yesterday announced that it has built and deployed a huge Linux-based supercomputer to analyze massive amounts of seismic data gathered in the process of exploring for oil and gas.

The new geophysical computer -- which Conoco said boasts enough storage capacity to house the complete U.S. Library of Congress -- was built entirely by an internal information technology and engineering team headed by Dr. Alan Huffman, manager of the $27 billion company's Seismic Imaging Technology Center.

Huffman said the Linux-based system cost one-tenth of what Conoco likely would have paid for a conventional supercomputer. He added that the system has already been used to analyze seismic data from the North Sea and the Gulf of Mexico, where Conoco recently discovered oil and is drilling two deep-water wells.

Houston-based Conoco said the supercomputer integrates the open-source Linux operating system and a clustered hardware architecture built around Intel Corp.'s microprocessors with advanced tape robotics, 10 terabytes of hard-disk storage and proprietary seismic software developed by the energy company.

The supercomputer and its accompanying disk farm are located at Conoco's seismic computing facility in Ponca City, Okla. But the new system has been designed in such a way that it's accessible from almost any Conoco substation, according to company officials. Making that possible involved re-engineering the company's seismic software to run under Linux with a Java-based and XML-compatible user interface.

"We jumped on Linux because it had the flexibility to customize to our needs," Huffman said. In addition to re-engineering its software, he added, Conoco designed the hardware so that mini-clusters of processors can be split off for use by geophysicists who want to process data at oil exploration sites.

For example, Conoco's geophysicists need to analyze tapes containing sound waves that are recorded in the field and used to build an image of the sub-surface of the earth, similar to the way physicians use ultrasound data to develop physical pictures of particular body parts. "If a scientist is sitting in central Asia and has a bunch of tapes to be analyzed, he can bring a mini-cluster and process it right there," Huffman said.
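One of the most basic operations in this kind of processing is "stacking": averaging repeated recordings of the same reflection point so that random noise cancels out and the echo from a rock layer stands out. The snippet below is a toy, textbook version of that step, not Conoco's proprietary software.

```python
# Toy illustration of a basic seismic-processing step: "stacking" several
# noisy recordings (traces) of the same reflection point. Averaging sample
# by sample suppresses random noise while reinforcing the real echo.
def stack_traces(traces):
    """Average several equal-length traces sample by sample."""
    n = len(traces)
    return [sum(samples) / n for samples in zip(*traces)]

# Three noisy recordings of the same underlying signal.
noisy = [
    [1, 3, 2],
    [3, 1, 2],
    [2, 2, 2],
]
print(stack_traces(noisy))  # [2.0, 2.0, 2.0]
```

Production seismic codes layer far more sophistication on top (velocity corrections, migration, filtering), but the principle of combining many recordings into one sharper image is the same one that underlies medical ultrasound.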

The new supercomputer should enable Conoco to analyze seismic data faster and more cheaply than it currently can, Huffman said. And workers at the company also should be able to do new kinds of analytical work that "were simply not possible or cost-effective before," he added.
