LAS VEGAS - We no longer need seers, oracles and psychics to tell us about the future. The declining cost of sensor technology and the availability of cloud-based analytical platforms are making predictive analytics accessible to every industry and most products.
These technologies give insights into how products are performing. Sensors record vibration, temperature, pressure and voltage, among other conditions, and provide data for real-time analysis. That data can reveal faulty parts in products weeks before they actually fail.
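How might a faulty part announce itself weeks early? One common approach is trend extrapolation: fit a line to a slowly rising sensor reading and project when it will cross a failure limit. The sketch below illustrates the idea in Python; the failure threshold, sampling rate and units are illustrative assumptions, not values from any vendor's system.

```python
# Minimal sketch: predict days-to-failure from a rising vibration trend.
# Threshold (4.5 mm/s) and daily sampling are assumptions for illustration.

def days_until_threshold(readings, threshold):
    """Fit a least-squares line to daily readings and extrapolate when
    the trend crosses the failure threshold. Returns None if the trend
    is flat or declining (no failure predicted)."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings)) \
            / sum((x - mean_x) ** 2 for x in xs)
    if slope <= 0:
        return None
    intercept = mean_y - slope * mean_x
    crossing = (threshold - intercept) / slope  # day index where line hits limit
    return max(0.0, crossing - (n - 1))         # days beyond the latest reading

# Four weeks of daily RMS vibration (mm/s), creeping upward:
vibration = [2.0 + 0.05 * day for day in range(28)]
print(round(days_until_threshold(vibration, threshold=4.5), 1))  # → 23.0
```

With four weeks of data showing a steady upward creep, the extrapolation gives maintenance crews roughly three weeks of warning; real systems use far more sophisticated models, but the lead-time principle is the same.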
Initial deployments of sensors have been in large and expensive industrial platforms, such as electrical generation systems and jet engines. In time, sensors connected to analytical platforms will be found in nearly every product.
The belief is that this technology will make machinery and systems more reliable. Sensors and analytics will alert users and vendors to problems days, weeks or even months before a failure becomes visible. This insight into performance will also significantly reduce unplanned failures.
"We will know more about when they are going to fail, and how they fail," said Richard Soley, CEO of the Object Management Group, a nonprofit technology standards consortium.
Businesses will also benefit from learning how customers are using their products, which will shape how products are made, Soley said.
Predictive analytics capability in industrial applications is not a new concept. Big machinery has long used sensors. What is new is the convergence of three major trends that will make deployment ubiquitous, say people working in this area.
First, sensor technology is declining in price as it gets smaller and more efficient; second, wireless communication systems have become reliable and global; third, cloud-based platforms for analytics and development are rapidly emerging. Collectively, these trends underpin the Internet of Things.
At IBM's big conference, InterConnect, this week, the University of South Carolina was showing off a sensor-equipped gearbox from an Apache helicopter that is part of a study for the U.S. Army. Four sensors on the gearbox collect temperature and vibration data.
One of the big savings from this technology, aside from predicting failure, is correctly planning maintenance. Many maintenance activities are unnecessary and wasteful, and some even introduce new problems.
"If you can reduce improper maintenance processes and improve the identification of faulty maintenance, you can directly impact safety," said Retired Maj. Gen. Lester Eisner, with South Carolina's National Guard, who is deputy director of the university's Office of Economic Engagement.
In another area, National Instruments has been working with utilities to deploy its sensor technology. Today, many utilities have employees who collect data directly off machines, which is something of a shotgun approach, said Stuart Gillen, principal marketing manager at the company and a speaker at the IBM conference.
All it takes is one or two "catches" – preventing a failure in a large system – to justify the cost of deploying technology that can take in all the data from these systems and provide a more targeted approach to maintaining them, Gillen said.
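The shift Gillen describes, from periodic walk-around readings to continuous, targeted monitoring, amounts to checking every incoming reading against known limits and dispatching crews only where something is actually wrong. A minimal sketch of that filtering step, with made-up machine names and limits for illustration:

```python
# Sketch of targeted condition monitoring: alert only on machines whose
# sensor readings exceed a limit. Machine IDs and limits are hypothetical.

LIMITS = {"temperature_c": 90.0, "vibration_mm_s": 4.5}

def check(machine_id, reading):
    """Return an alert string for each sensor value that exceeds its limit."""
    return [
        f"{machine_id}: {sensor} at {value} exceeds limit {LIMITS[sensor]}"
        for sensor, value in reading.items()
        if sensor in LIMITS and value > LIMITS[sensor]
    ]

# A stream of readings from the field; only one machine needs attention.
stream = [
    ("transformer-7", {"temperature_c": 71.0, "vibration_mm_s": 2.1}),
    ("pump-3",        {"temperature_c": 96.5, "vibration_mm_s": 1.8}),
]
for machine_id, reading in stream:
    for alert in check(machine_id, reading):
        print(alert)
```

Here only pump-3 trips an alert, so a crew visits one machine instead of walking the whole plant; that is the "targeted approach" replacing the shotgun one.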
National Instruments is working with IBM and its recently launched Internet of Things capability, which is part of IBM's Bluemix cloud platform. This platform gives developers the ability to create new ways of working with the machine data.
There is much optimism that this technology will reduce equipment failures. The goal is to see a little further into the future and rely less on hard-learned hindsight. But no one is predicting that this technology will eliminate failure altogether.
"There are a lot of variables" that can contribute to equipment failure, said Sky Matthews, the CTO of IBM's Internet of Things effort, but this technology "can certainly dramatically reduce them."