Going industrial

Himagiri Mukkamala, Head of Engineering for Predix, GE Digital, looks at digital transformation within the industrial space, the industrial internet and the implications for software development.

Unless you’ve been living in a cave, you will have heard the phrase ‘digital transformation’ many times over the past few years – some may say, too many. You may even have used it yourself, to describe the way companies are striving to change not just their IT systems and communication technologies, but also their business processes and practices, both to cope with and to gain advantage from the huge surge in data now routinely generated by the modern world.

Chances are you’ll also have an understanding of how big data, machine learning and the internet of things (IoT) fit into the digital transformation story, and of the many benefits and implications for consumers and businesses alike. You’re less likely, however, to have seen or heard much about how all this is affecting manufacturing, power generation, mass transit, health services and other large, industrial-scale businesses, which, for a variety of reasons, have come late to the digital transformation table.

That, however, is changing and changing fast, providing a whole new set of challenges for those designing, developing and testing the software involved.

Reasons to be cautious

The fact that the technologies employed by the industrial world are lagging behind their consumer and enterprise counterparts should come as little surprise. To start with, whereas the typical business infrastructure might get replaced every two to three years and consumer smartphones upgraded annually, the technology behind the average power station or locomotive could easily be 20‑30 years old, even more in some cases. Additionally, such systems are critically, and often strategically, important, making security the number one concern, ahead of any benefits likely to accrue from the technologies and processes of digital transformation.

Consider, for example, what would happen if power generation equipment or a jet engine were compromised by a denial-of-service attack. Worse still, what if those systems were held to ransom by extremists using CryptoLocker or another form of ransomware? The chaos that would ensue is unthinkable, and it is the main reason why traditional SCADA (Supervisory Control and Data Acquisition) systems are mostly designed to work locally, with so‑called ‘air‑gaps’ isolating them from the wider internet.

The case for bridging those ‘air‑gaps’ and connecting industrial systems directly to the internet is, however, growing ever more pressing, as the devices being controlled (‘assets’ in SCADA speak) are engineered to deliver data beyond that required for simple monitoring and management purposes.

The carrot of commercial exploitation of big data is clearly being dangled here, with companies looking to exploit the information they already have and to develop new and valuable machine learning analytics and applications on the back of the greater connectivity. That, in turn, is leading companies to push old‑style SCADA technology out the door in favour of a more modern industrial internet, designed to use cloud technology to communicate with and manage large‑scale assets.

The same but different

The aim of the industrial internet is, ultimately, to enable digital transformation on an industrial scale, and you could be forgiven for thinking that it bears more than a passing resemblance to the internet of things (IoT), where edge devices are similarly connected to monitoring and management intelligence in the cloud. The similarities, however, are mostly superficial, not least because of the huge amounts of data involved, even compared to the volumes predicted for some of the big data applications being developed to exploit the internet of things. Other differences from the IoT include the complexity of the assets, such as a locomotive with 300,000 parts versus a smartwatch, and the velocity at which data is streamed from the many sensors on those assets.

We’re not talking about switching light bulbs on and off here, or monitoring CCTV cameras and other relatively simple applications, but about managing massive assets that can generate gigantic amounts of data. The sensors in just one jet engine, for example, can produce terabytes of data every day, as can the hundreds of turbines in the average wind farm, water pumping stations and so on.
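To get a feel for why the numbers climb so quickly, the back-of-envelope sketch below uses purely illustrative figures (the article quotes no specific sensor counts or sampling rates); it simply shows how a few thousand high-frequency sensors on a single asset can add up to terabytes per day.

```python
# Back-of-envelope estimate of daily data volume from a single asset.
# All figures below are illustrative assumptions, not measurements.
sensors = 5_000            # assumed number of sensors on the asset
sample_rate_hz = 1_000     # assumed samples per sensor per second
bytes_per_sample = 8       # assumed size of one reading (e.g. a float64)

seconds_per_day = 24 * 60 * 60
bytes_per_day = sensors * sample_rate_hz * bytes_per_sample * seconds_per_day

print(f"{bytes_per_day / 1e12:.1f} TB per day")  # roughly 3.5 TB/day with these assumptions
```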

More than that, the IoT approach would be to push all this data into the cloud and analyse it there, but that would be far too slow and impractical for a lot of industrial data. Far better to collect and pre‑process the data and run local machine learning analytics on local gateways or on the control systems themselves, before making it available in a more digestible format, both for sharing with other gateways and for processing by applications in the cloud. Not only is this more efficient, it also helps address many of the security concerns, as the sensors involved need little in the way of exploitable intelligence and the assets themselves are not directly accessible. It provides a kind of logical ‘air‑gap’, if you like, on top of which it also enables multiple layers of robust security to be put in place, providing protection far greater than is available with vanilla IoT technology.
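As a concrete illustration of that pattern, the sketch below shows the kind of pre-processing a gateway might do, assuming hypothetical sensor readings and a hypothetical publish_to_cloud() hook: it reduces a high-frequency raw stream to a compact per-sensor summary and a simple threshold alert before anything leaves the site.

```python
import statistics
from collections import defaultdict

# Hypothetical raw readings: (sensor_id, timestamp_seconds, value).
# In practice these would stream continuously from the asset's sensors.
raw_readings = [
    ("turbine-7/vibration", 0.0, 1.02),
    ("turbine-7/vibration", 0.1, 1.05),
    ("turbine-7/vibration", 0.2, 4.80),  # spike
    ("turbine-7/temp", 0.0, 341.0),
    ("turbine-7/temp", 0.1, 342.5),
]

VIBRATION_ALERT_THRESHOLD = 3.0  # illustrative limit, not a real specification


def summarise(readings):
    """Collapse raw samples into one compact summary record per sensor."""
    by_sensor = defaultdict(list)
    for sensor_id, _, value in readings:
        by_sensor[sensor_id].append(value)

    summaries = []
    for sensor_id, values in by_sensor.items():
        summaries.append({
            "sensor": sensor_id,
            "count": len(values),
            "mean": statistics.mean(values),
            "max": max(values),
        })
    return summaries


def publish_to_cloud(record):
    # Placeholder for whatever transport the gateway actually uses
    # (MQTT, HTTPS and so on); here we simply print the digest.
    print("publishing:", record)


for summary in summarise(raw_readings):
    # Forward only the digest, plus an alert if a local rule fires.
    if "vibration" in summary["sensor"] and summary["max"] > VIBRATION_ALERT_THRESHOLD:
        summary["alert"] = "vibration above threshold"
    publish_to_cloud(summary)
```

Only the summaries and alerts cross the network; the raw stream, and the asset that produced it, stay behind the gateway.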

A joined-up approach

Of course, there’s the inertia of a very conservative industrial market to overcome, but it’s important not to dismiss the concept of an industrial internet as something that will never happen. Industrial companies may typically be slow to change, but a growing groundswell of enthusiasm for something better than traditional SCADA solutions is driving the technology forward. However, it would not be ideal for every vendor to go it alone and develop its own implementation employing different, possibly proprietary, communication, security and other technologies. GE believes an integrated edge-to-cloud architecture, one that enables GE and non‑GE assets and enterprises alike to participate in this transformation, is critical, just as common platforms have successfully enabled digital transformation in the consumer and enterprise worlds.

It is important for developers to have access to platforms, tools and services designed specifically for building industrial-grade asset performance management (APM) systems, rather than making do with general-purpose alternatives. In this respect, existing platforms for enterprise cloud applications fall a long way short of what’s required, especially in terms of scalability and security. This is why GE has put its weight behind Pivotal Cloud Foundry, a commercial implementation of the open-source PaaS already proving popular with cloud developers.

 

Figure 1. GE Predix. Originally created to enable developers at GE to build custom APM applications for its own customers, the Predix platform has since been made more widely available and is already being used by over 8500 developers in companies such as Boeing, ConocoPhillips, Pitney Bowes and others to build large‑scale industrial APM systems.

 

Not only is GE making a substantial investment in Pivotal Cloud Foundry, it is also using it as the base of its own Predix platform, to host highly secure, industrial-grade APM systems and to provide access to the specialist tools required to develop, test, deploy and manage the software involved.

Through Cloud Foundry, Predix provides access to all the tools needed to develop and deploy cloud‑based applications and analytics, and to do so at scale. It also gives developers access to advanced analytics services to help both automate management processes and exploit the information currently locked up in conventional SCADA silos. Other advantages include the ability to work with any development platform and to apply a modern continuous development methodology, avoiding the inherent delays and potential for errors that can arise with a cyclic code/test approach. This is particularly important in the industrial world, where minor issues can have potentially catastrophic effects, so Predix encourages a team approach to development with integrated testing and quality assurance at every phase of the process.
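By way of illustration only, an analytic hosted on a Cloud Foundry-based platform is typically packaged as a small web service that binds to the port the platform assigns via the PORT environment variable. The Python sketch below follows that general Cloud Foundry convention with a hypothetical anomaly-scoring endpoint; the route, threshold and payload format are assumptions for the example, not part of Predix’s documented API.

```python
import os
from flask import Flask, jsonify, request

app = Flask(__name__)

# Illustrative threshold; a real analytic would be trained on asset history.
VIBRATION_LIMIT = 3.0


@app.route("/score", methods=["POST"])
def score():
    """Score a batch of sensor readings and flag any values over the limit."""
    readings = request.get_json(force=True).get("readings", [])
    anomalies = [r for r in readings if r.get("value", 0.0) > VIBRATION_LIMIT]
    return jsonify({"scored": len(readings), "anomalies": anomalies})


if __name__ == "__main__":
    # Cloud Foundry tells the application which port to listen on via the
    # PORT environment variable; default to 8080 when running locally.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```

A service like this would then be pushed to the platform with the standard `cf push` workflow, and unit tests for the scoring logic can run in the same pipeline before each push, in line with the continuous, test-at-every-phase approach described above.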

Security services are also built in and treated as a priority at every stage, together with a strong DevOps approach to enable continuous development, implementation and operation at the kind of industrial scale that Predix users expect to achieve.

Sure, the industrial world is still playing catch-up with the enterprise in its quest to reap the benefits of digital transformation, but it’s getting there, and solutions such as Predix will see us achieving that goal sooner rather than later.

This article was first published in the July 2016 issue of TEST Magazine.

Edited for web by Cecilia Rehn.
