GE Tools Target Industrial Big Data Problems
Proficy Historian 5.0 and Proficy Historian Analysis help industrial enterprises cut costs and shrink their big data storage footprints.
GE Intelligent Platforms has announced Proficy Historian 5.0 and Proficy Historian Analysis, a pair of big data management tools designed to collect, archive, distribute, and analyze industrial information across a global enterprise.
Proficy Historian manages massive volumes of real-time information at very high speeds, and captures data from multiple sensors and systems. It provides an integrated view of an entire industrial operation and offers instant access to historical data. Its compression model can help companies dramatically reduce their data footprint, the company said.
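GE doesn't describe the compression model's internals, but industrial historians typically rely on deadband-style techniques that discard samples staying within a tolerance of the last archived value. A minimal Python sketch of the generic deadband idea, with an invented tolerance and sample format:

```python
# Deadband compression: archive a sample only when it moves more than a
# configured tolerance away from the last archived value. This is a
# generic illustration of the technique, not GE's actual algorithm.

def deadband_compress(samples, tolerance):
    """samples: iterable of (timestamp, value); returns the archived subset."""
    archived = []
    last_value = None
    for timestamp, value in samples:
        # Keep the first sample; afterward, keep a sample only if it
        # deviates from the last archived value by more than the tolerance.
        if last_value is None or abs(value - last_value) > tolerance:
            archived.append((timestamp, value))
            last_value = value
    return archived

# Example: a slowly drifting sensor reading compresses heavily.
raw = [(t, 100.0 + 0.001 * t) for t in range(10_000)]
kept = deadband_compress(raw, tolerance=0.5)
print(f"kept {len(kept)} of {len(raw)} samples")  # large reduction
```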
In addition to promising faster performance, version 5.0 supports multiple data stores, allowing companies to separate regulatory data from process data. It also archives less frequently used information and provides fast access to high-fidelity data for frequent analysis.
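The announcement doesn't describe how those stores are configured; as a purely hypothetical illustration of the separation concept, incoming tags could be routed by a table like the one below (store names and tags are invented, and this is not Proficy's API):

```python
# Invented illustration of routing tags to separate data stores so that
# regulated records live apart from high-churn process data.
DATA_STORES = {"regulatory": [], "process": []}

TAG_ROUTES = {
    "batch_release_record": "regulatory",  # kept under compliance rules
    "turbine_exhaust_temp": "process",     # operational, heavily sampled
}

def store_sample(tag, timestamp, value):
    store = TAG_ROUTES.get(tag, "process")  # default to the process store
    DATA_STORES[store].append((tag, timestamp, value))

store_sample("batch_release_record", 1700000000, 1.0)
store_sample("turbine_exhaust_temp", 1700000000, 540.2)
```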
Historian's new supervisory control and data acquisition (SCADA) buffer makes it easier for industrial users to gather information for real-time trending and to move data to Historian's archives, according to GE.
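GE hasn't published the buffer's design, but the behavior it describes matches a familiar pattern: a bounded in-memory buffer that serves recent samples for live trending while batching writes to the archive. A hypothetical sketch:

```python
from collections import deque

class ScadaBuffer:
    """Hypothetical sketch of a SCADA-style buffer: recent samples stay
    in memory for real-time trending; full batches flush to an archive."""

    def __init__(self, archive_writer, batch_size=1000):
        self.recent = deque(maxlen=batch_size)  # newest samples, for trending
        self.pending = []                       # samples awaiting archival
        self.archive_writer = archive_writer    # e.g. a bulk database insert
        self.batch_size = batch_size

    def ingest(self, tag, timestamp, value):
        sample = (tag, timestamp, value)
        self.recent.append(sample)
        self.pending.append(sample)
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.pending:
            self.archive_writer(self.pending)  # one bulk write per batch
            self.pending = []

    def trend(self, tag):
        """Return buffered samples for one tag, for live trend charts."""
        return [s for s in self.recent if s[0] == tag]
```

Batching writes this way keeps archive inserts cheap while the in-memory deque always holds the freshest samples for trend displays.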
Proficy Historian Analysis is a Web-based visualization and analysis tool that works with data captured by Proficy Historian. It's designed to help industries analyze their historical data for actionable insights, such as finding ways to implement process improvements.
Historian Analysis users can view multiple trend charts at one time, create reports that show trends for a specific period of time, and share these reports and screen layouts with other users.
Eric Pool, principal engineer at GE Energy, said his company is using Proficy Historian to reduce its data footprint.
"We monitor roughly 1,500 gas and steam turbines around the world, and we generate 6 to 8 terabytes of data a year that we store centrally in our data center," said Pool in a phone interview with information.
As in other industries, the volume of data GE Energy generates is growing rapidly.
"As we add units, we add capability, new types of sensors, new monitoring processes, and new kinds of analytics," Pool said. "We have over a million process parameters that we're measuring. We stream that data into our system and store it our database, using the Proficy Historian compression model."
Proficy Historian has enabled GE Energy to cut its data footprint from 50 terabytes to 10 terabytes. It has also allowed the company to greatly reduce the time it takes to complete year-over-year fleet analysis, which requires access to very large time-series data sets. For instance, degradation analysis on a turbine used to take two weeks to complete, but Proficy Historian has helped GE Energy finish the task in one hour.
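Year-over-year fleet analysis of this sort is, at bottom, aggregation over very large time-series sets. A minimal pandas sketch of the general approach (the file path and column names are hypothetical, not GE Energy's schema):

```python
import pandas as pd

# Hypothetical fleet dataset: one row per sensor reading, with columns
# unit_id, timestamp, and efficiency. File path and schema are assumed.
readings = pd.read_parquet("fleet_readings.parquet")
readings["year"] = pd.to_datetime(readings["timestamp"]).dt.year

# Year-over-year comparison: average efficiency per unit per year,
# then the change from one year to the next.
yearly = (readings
          .groupby(["unit_id", "year"])["efficiency"]
          .mean()
          .unstack("year"))
yoy_change = yearly.diff(axis=1)
print(yoy_change.describe())
```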
GE Energy's data growth is primarily in two areas. "We're adding new units and equipment every year, so that adds to the scale. And then we're adding more sensor capability every year," said Pool.
"It provides us better access to the information," said Pool of Proficy Historian. "Instead of having to extract data from the system and pull it into a file system, and then doing some sort of analysis external to the database, we can do a lot of that work as the data's coming into the system."
Big data management tools such as Proficy Historian 5.0 and Proficy Historian Analysis are part of General Electric's strategy to build Internet-connected software and analytics: tools capable of extracting meaning from a variety of data sources, including machines.
In a recent guest post for Forbes, Bill Ruh, VP and global technology director at General Electric, wrote:
"By connecting machines to the Internet via software, data is produced and insight is gained, but what's more is that these machines are now part of a cohesive intelligent network that can be architected to automate the delivery of key information securely to predict performance issues. This represents hundreds-of-billions of dollars saved in time and resources across major industries."