GE’s clarion call on cloud and analytics with its Industrial Internet initiative should serve notice to organizations everywhere that the question is no longer whether, but how, to extract value from big data. GE will leverage industrial data to drive the three pillars of return on investment (ROI): reducing costs for itself and its customers, raising the productivity of people and equipment, and generating incremental revenues. The Industrial Internet could prove more significant for GE than Six Sigma was in the 1980s.
The most notable aspect of GE’s initiative is the partnerships it has forged with Amazon, Pivotal and Accenture. Rather than building its own Hadoop clusters and an IaaS platform on OpenStack or CloudStack, the company opted for a faster, more efficient path to delivering its services. GE is focusing on creating high-value software based on its knowledge of the equipment it manufactures, rather than getting into the cloud infrastructure and hosting business itself.
The opportunity is enormous. There are over 250,000 pieces of GE equipment deployed globally – ranging from gas turbines and jet engines to locomotives and medical devices. For perspective on the magnitude of data generated, we highlighted in a recent post that Boeing reported jet engines generate 10 terabytes of operational data for every 30 minutes of operation. Hence, a four-engine jumbo jet can create 640 terabytes of data on a single trans-Atlantic flight. Every day, there are approximately 25,000 flights.
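To see how those figures stack up, here is a quick back-of-the-envelope calculation in Python. It assumes a roughly eight-hour trans-Atlantic flight, and the per-day scaling is an upper-bound illustration only, since most of the 25,000 daily flights are shorter and many aircraft have two engines:

```python
# Back-of-the-envelope check of the data-volume figures above.
# Assumptions: 4 engines, 10 TB per engine per 30 minutes, ~8-hour flight.
TB_PER_ENGINE_PER_HALF_HOUR = 10
ENGINES = 4
FLIGHT_HOURS = 8  # rough trans-Atlantic duration (assumption)

half_hour_periods = FLIGHT_HOURS * 2
tb_per_flight = TB_PER_ENGINE_PER_HALF_HOUR * ENGINES * half_hour_periods
print(f"Data per flight: {tb_per_flight} TB")  # -> 640 TB

# Scaling to ~25,000 daily flights is an illustrative upper bound only.
daily_upper_bound_pb = tb_per_flight * 25_000 / 1_000
print(f"Illustrative daily upper bound: {daily_upper_bound_pb:,.0f} PB")
```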
For perspective on the efficiency potential, consider the following statistics from GE’s research on what just a 1% improvement can achieve: in aviation, a 1% reduction in fuel consumption could result in $30 billion in savings over 15 years; a 1% improvement in the efficiency of gas-fired power plants could produce $66 billion in fuel savings globally; and a 1% improvement in healthcare process efficiency could generate $63 billion in savings throughout the healthcare system.
Driving ROI and profitability in a return to vertical integration
GE Intelligent Platforms will help customers collect and analyze data from GE equipment either in their own private clouds or on Amazon Web Services. The new Proficy Monitoring & Analysis software suite is an integrated solution for collecting and analyzing big industrial data. The suite consists of six products, including the new Proficy Historian HD and the Proficy Knowledge Center.
• Proficy Historian – the flagship data collection software
• Proficy Historian Analysis – for data mining and visualization
• Proficy SmartSignal – predictive analysis for condition-based monitoring
• Proficy CSense – process-oriented monitoring, troubleshooting and optimization
• Proficy Historian HD – leverages Hadoop clusters to store massive data sets
• Proficy Knowledge Center – a console that integrates all the components
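To make the Hadoop angle concrete, here is a minimal, purely illustrative sketch of the kind of analysis a Hadoop-backed historian such as Proficy Historian HD enables: flagging machines whose sensor readings drift from their historical baseline. The PySpark code, column names, file paths and thresholds below are our own assumptions, not Proficy’s actual API or schema:

```python
# Illustrative only: a generic PySpark job showing the style of condition-based
# monitoring a Hadoop-backed historian makes possible. Paths, columns and the
# 3-sigma threshold are hypothetical, not Proficy's actual schema or API.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("condition-monitoring-sketch").getOrCreate()

# Assume sensor readings have landed in HDFS as CSV: machine_id, timestamp, vibration
readings = spark.read.csv("hdfs:///historian/turbine_readings/*.csv",
                          header=True, inferSchema=True)

# Baseline vibration statistics per machine over the full history
baseline = readings.groupBy("machine_id").agg(
    F.avg("vibration").alias("avg_vibration"),
    F.stddev("vibration").alias("std_vibration"),
)

# Flag readings more than 3 standard deviations above a machine's baseline,
# a crude stand-in for a condition-based monitoring alert
alerts = (readings.join(baseline, "machine_id")
          .where(F.col("vibration") >
                 F.col("avg_vibration") + 3 * F.col("std_vibration"))
          .select("machine_id", "timestamp", "vibration"))

alerts.write.mode("overwrite").parquet("hdfs:///historian/alerts/")
```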
The ROI on data about equipment availability is compelling. Customers can deploy the suite to remotely monitor an individual plant or their entire global operations. Benefits include reduced downtime, higher production yields, lower defect rates, improved inventory management and significant maintenance cost savings.
The platform will strengthen GE’s services business while also helping the company design and manufacture future products. Better data on equipment performance can drive service contract value and renewal rates while improving manufacturing efficiencies. This, in turn, begets more service and maintenance contracts. The result is a more predictable stream of revenues and cash flow.
One example of how data analytics can influence GE’s business is in wind turbines. By virtue of the work it has done in monitoring and testing different combinations of materials and speeds, GE has been able to optimize the performance of its turbines. This intelligence has allowed GE to become the global market leader in an industry that appeared to be slipping away to foreign competition.
Data analytics also has the potential to reverse the manufacturing outsourcing trend of the past few decades. Leveraging data across the product lifecycle – from machinery in operation to 3-D parts printing, shop-floor production, supply chain management, and service and maintenance contracts – could give rise to a new, data-driven vertical integration.
Standards, yes, but defined by whom?
GE is initially targeting existing customers. However, with the help of its partners, one can envision future offerings that extend beyond GE’s installed base. The company’s VP of Global Software, Bill Ruh, summed up the opportunity and challenges last week: the key will be architecting systems that can capture data, bring it to the cloud, and provide actionable intelligence based on it.
Interoperability, the ability to share data among different platforms such as GE’s, is a major objective. In Ruh’s words, “What made the Internet successful is a few simple protocols…I think that’s what’s going to be key here; that we figure out where the standards have to occur”.
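As a concrete illustration of what “a few simple protocols” can look like in practice, here is a minimal sketch of a gateway pushing a sensor reading to a cloud ingestion endpoint as JSON over plain HTTPS. The endpoint, field names and function are hypothetical; GE has not published the protocols or APIs it intends to standardize on:

```python
# Hypothetical sketch: sending machine telemetry to the cloud over HTTPS + JSON.
# The endpoint and payload schema are placeholders, not a published GE API.
import json
import time
import urllib.request

INGEST_URL = "https://example-cloud-endpoint/ingest"  # placeholder endpoint

def publish_reading(machine_id: str, sensor: str, value: float) -> None:
    """Send one sensor reading as a JSON document over HTTPS."""
    payload = {
        "machine_id": machine_id,
        "sensor": sensor,
        "value": value,
        "timestamp": time.time(),
    }
    req = urllib.request.Request(
        INGEST_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        resp.read()  # a real client would check the status and retry on failure

publish_reading("turbine-042", "exhaust_temp_c", 512.7)
```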
The OpenStack and CloudStack contingents should pay attention. Working with GE, AWS and Eucalyptus, along with Pivotal, are in the catbird seat to define standards at this early stage of the game.