Researchers at the University of Sheffield are increasingly turning to graphics processing unit (GPU) technology to supercharge their work with complex system simulations.
In March 2017, researchers at the university's Department of Computer Science, together with Transport Simulation Systems (TSS), received an award of over £22,000 from the Department for Transport for their work on road simulations.
The project, which ended in January 2017, used GPUs to improve the accuracy of its road micro-simulations and to speed them up by as much as 33 times.
Micro-simulation is an analytics technique that models individual entities one by one, allowing detailed analysis of activities like traffic flow through an intersection or pathogens spreading disease through a population.
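To make the idea concrete, here is a minimal sketch of a traffic micro-simulation, written in plain Python for readability. It is purely illustrative, not the university's actual code, which targets GPU frameworks; each car is modelled individually and the aggregate flow through a signalised junction emerges from simple per-entity rules.

```python
import random

# Toy micro-simulation of traffic at a signalised intersection.
# Each car is modelled individually; aggregate flow emerges from
# simple per-entity rules. (Illustrative sketch only -- real road
# micro-simulators model lane choice, acceleration, routing, etc.)

GREEN_PERIOD = 30      # seconds the light stays green each cycle
CYCLE = 60             # full light cycle in seconds
ARRIVAL_PROB = 0.4     # chance a new car arrives each second
DISCHARGE_RATE = 1     # cars that can cross per green second

queue = 0
passed = 0
for t in range(3600):  # simulate one hour in one-second steps
    if random.random() < ARRIVAL_PROB:
        queue += 1                       # a new car joins the queue
    if t % CYCLE < GREEN_PERIOD and queue > 0:
        released = min(queue, DISCHARGE_RATE)
        queue -= released                # cars cross on green
        passed += released

print(f"cars through intersection: {passed}, still queued: {queue}")
```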
Speaking to Computerworld UK during Nvidia's GTC conference last week, Dr Paul Richmond, research software engineer at the University of Sheffield, explained how the university is using GPUs for ever more advanced computation.
"We can scale these simulations up to represent human immune systems and run them fast enough and explore all of the different parameters around what type of interventions may produce emergent properties," he said. "In this case, a good patient outcome like disease remission and so on."
Case studies
For example, the university has embarked on a project with Siemens Rail Automation to develop an advanced multi-modal simulator.
When asked how it works, Richmond said that the simulation links existing models of the rail network with a complex systems model of pedestrian behaviour through a station, which in turn links with Transport for London's models of the road network, letting researchers join the pieces into a single coupled simulation.
"So what you see is, when trains get delayed within the station and you end up with congestion on the platform, it can feed directly into the model's train schedule and that may have a knock-on effect to lead to further congestion," Richmond explained.
"These things are really closely coupled together and they've never really been used together in the past, and that's the same with roads and with pedestrian kind of simulations. So we're currently exploring with Siemens ways in which we may take that forward and we've developed that for about a year or so," he added.
Another example is the university's work with the consulting firm Atkins Global to better simulate cities. In partnership with the Highways Agency, Atkins maintains a piece of software used for strategic infrastructure planning and for deciding where to invest in roads.
"We've been working with them to provide GPU acceleration to that code," Richmond said. "So we've taken what was a piece of software that took 12 hours in serial to run, down to eight minutes with GPUs. So I'm really, really happy with that."
GPUs in action
In 2017 the university bought a DGX-1 workstation, described by Nvidia as a personal supercomputer. At the same time it partnered with the Joint Academic Data Science Endeavour (JADE) – the UK's largest GPU facility – and other universities to further its deep learning and AI research.
Richmond began developing his software on consumer-grade GPUs. It was then deployed on more specialised hardware such as P100s, V100s and Titan Vs, and later at JADE, which is a collection of 23 DGX-1 systems.
The university predominantly uses the DGX-1 for AI research, but about 25 percent of the time it is used for other work, such as high-performance computing simulations.
Nvidia announced the DGX-2 system at this year's GTC, but for now the university will probably hold onto its existing setup.
"We go through hardware refreshes in two- to three-year cycles," Richmond said. "Will we deploy DGX-2? They're quite expensive, and given that we've just spent in the UK between £3-4 million on DGX-1, I think it's unlikely that we'll upgrade at this point."
Benefits of Nvidia GPUs
The university has certainly benefitted from the opportunity to conduct GPU-based research in the UK, where many researchers still fall back on CPU-based systems.
"In the US and China, you'd see the big syndicates are already GPU accelerated," Richmond said. "In the UK they're not. Researchers are still using old bits of code that work on CPUs so we invested in these Tier-2 boxes it's kind of an experimental thing.
"So depending on the success of things like JADE, that will influence the kind of policy and procurement to a bigger level, so maybe for the next hardware refresh, maybe [UK supercomputer] ARCHER will be full of DGX-2s or DGX-3."