Woodside pushes cloud’s limit to crunch data

The combination of cloud services to do the heavy lifting and full-wave inversion has led to better-quality seismic imaging and a shorter time between data acquisition and the data being made available to seismic interpreters at Woodside.

However, Woodside’s chief technology officer, Shaun Gregory, says that the processing requirements for the volumes of data involved mean that the resources company is running up against limitations with some of the public cloud services it employs.

The approach is still managing to deliver images in around four to eight weeks, compared to the traditional 12 to 18 months, he told an investor briefing earlier this month.

Woodside has employed both Google Cloud Platform and Amazon Web Services to crunch the seismic data.

“We’re hitting the limits of that at the moment and that’s why it’s still four to eight weeks,” the CTO said. “There is a limit to how much we can scale up computing in this instance and we’ve worked with those providers to increase the amount that we can utilise.”

The barrier is a combination of “being able to move enough data on their cloud to the compute, and balancing memory, compute and the moving of data around,” he said. “We’re getting there: six months ago we could spin up 10,000 nodes. Today we can spin up 100,000.”

“We know people want it from the day the boat finishes [a survey] and [to have] images available the next day – that is our aspiration,” Gregory said. “That is many years off. But in the next few years you’ll be seeing it in weeks.”

Because cloud infrastructure can be scaled up and down as needed, it’s proved a cost-effective solution, he said.

Another key role for public cloud at Woodside is processing data gathered from the 200,000 sensors of the Pluto LNG plant. Every minute the plant’s sensors stream data into a platform built in AWS.

The data is being employed to optimise production at the plant, based on conditions such as temperature. Using several years’ worth of data, the plant’s level of LNG production can be compared to the best level delivered during similar conditions, and the system built by Woodside can recommend potential measures to increase it.

“It runs and recalculates every 10 minutes and re-presents that data back to decision makers in the field,” Gregory said.
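The comparison logic described above can be sketched in a few lines: bucket historical readings by operating conditions (temperature, in this illustration), find the best production rate achieved under similar conditions, and report the gap for decision makers. This is a minimal hypothetical sketch; all names, bucket widths, and data are illustrative assumptions, not Woodside's actual system.

```python
def condition_bucket(temperature_c: float, bucket_width: float = 2.0) -> int:
    """Group readings into temperature bands so 'similar conditions' compare."""
    return int(temperature_c // bucket_width)

def best_by_condition(history: list[dict]) -> dict[int, float]:
    """Best historical production rate observed in each condition bucket."""
    best: dict[int, float] = {}
    for reading in history:
        b = condition_bucket(reading["temperature_c"])
        best[b] = max(best.get(b, 0.0), reading["production_rate"])
    return best

def production_gap(current: dict, history: list[dict]) -> float:
    """How far current production sits below the best under similar conditions."""
    best = best_by_condition(history)
    b = condition_bucket(current["temperature_c"])
    return max(0.0, best.get(b, 0.0) - current["production_rate"])

# Illustrative data: several years of readings would feed the real comparison.
history = [
    {"temperature_c": 21.3, "production_rate": 98.0},
    {"temperature_c": 20.8, "production_rate": 101.5},
    {"temperature_c": 27.1, "production_rate": 95.0},
]
current = {"temperature_c": 21.0, "production_rate": 97.2}
print(production_gap(current, history))  # gap vs the best rate in the same band
```

In practice such a job would rerun on a schedule (every 10 minutes, per the article) over streamed sensor data, but the core idea is this like-for-like comparison against the historical best.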

Woodside’s partnership with IBM for the deployment of Watson technology is another way in which the resources company is leveraging its masses of data, the CTO told the briefing. Woodside currently has “10 Watsons” operating in different areas of the business, Gregory said.

One example is ‘Watson for Wells’. Every time a well is drilled, documentation equivalent to a foot-high stack of paper is generated. Watson is capable of ingesting the documentation from 5000 wells in six hours.

One way the system is used is to deliver insights about possible drilling hazards around a well. “What used to take months, takes minutes,” Gregory said.

Currently around 900 people at Woodside are using Watson. The latest Watson-based system at the company, ‘Health and Safety Watson’, “is probably going to penetrate every user in Woodside,” the CTO said.

Earlier this month Woodside began the hunt for a chief data officer.

“Data is at the core of the way Woodside works from our daily operations right through to our future technology programs like data science and artificial intelligence,” a spokesperson from the company told Computerworld at the time.

“We’re looking for a chief data officer who will provide leadership to champion the creation of value and the safeguarding of our operations from data right across the organisation.”

Copyright © 2017 IDG Communications, Inc.