In other words: how can industrial companies strike a balanced approach to data management and archiving, combining local management of their plants with optimization programs in the Cloud, aided by Edge Computing?


Many industrial companies increasingly want to use Cloud data analysis to optimize plant efficiency and management, and they see value in having a single platform for data storage and analysis.

The question is: how can these analyses be applied to the “Industrial Big Data” collected by the sensors and systems on the plant?

The answer may be a hybrid model that connects cloud and on-premise solutions to obtain the benefits of both systems, and here CIOs see the potential value. Pilot projects often start out aimed entirely at the Cloud; then, a couple of months in, someone in the finance department sees the invoices from the Cloud providers for ingesting and storing data and starts questioning the initial choices.

In fact, the Cloud can be very expensive, and it is poorly suited to archiving the high-resolution time series generated by the machines and equipment of industrial plants.


So how can we harness the power of cloud-based analytics tools on big data created by modern machines?

Fortunately, the answer is clear: a hybrid data management model that uses existing technology close to the data source (on the plant floor or in the corporate data center) and moves the “relevant data” at the right frequency to the Cloud for analysis. Machine and process data collected on plants is often acquired and stored at one-second intervals (or even faster). If you do the math, each sensor generates around 2.6 million individual value samples per month, which need to be processed and potentially archived.
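
As a quick check on that figure, here is the arithmetic behind it, a minimal sketch assuming one sample per second and a 30-day month:

```python
# Samples generated by a single sensor sampled once per second
# over a 30-day month.
SAMPLES_PER_SECOND = 1
SECONDS_PER_MONTH = 60 * 60 * 24 * 30  # 2,592,000 seconds

samples_per_month = SAMPLES_PER_SECOND * SECONDS_PER_MONTH
print(f"{samples_per_month:,} samples per sensor per month")  # 2,592,000 (~2.6 million)
```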

Here is an example, referring to a major cloud analytics platform provider that charges for both processing (i.e. loading data) and data storage. Using only their logging option, the cost of processing and archiving data for a year is around €200 per sensor, and that option is limited to approximately 120 high-speed sensors. Another storage option is slightly cheaper and can store 10 times as much data, but is limited to 1,200 sensors. At $150 per sensor per year, 1,200 sensors would cost $180,000 a year.

A traditional Data Historian is specially designed to archive time series data very efficiently.

For a Data Historian license of 10,000 tags (10 times larger than the aforementioned cloud analytics platform's limit), the cost of the license and the server on which to install the software would be less than €90,000, calculated over a three-year period.

That is a savings of over 80% compared to using the Cloud for storage. (Yes, there are electricity costs and IT overhead to consider, but at an 80% saving, those are marginal.)
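
To make the comparison concrete, here is the arithmetic, treating the euro and dollar figures above as roughly interchangeable for illustration:

```python
# Rough cost comparison based on the example figures above
# (illustrative pricing only).
sensors = 1_200
cloud_cost_per_sensor_year = 150           # $ per sensor per year (cheaper storage option)
cloud_cost_per_year = sensors * cloud_cost_per_sensor_year   # $180,000

historian_cost_3_years = 90_000            # license + server, over three years
historian_cost_per_year = historian_cost_3_years / 3         # $30,000

savings = 1 - historian_cost_per_year / cloud_cost_per_year
print(f"Cloud: ${cloud_cost_per_year:,}/year")
print(f"Historian: ${historian_cost_per_year:,.0f}/year")
print(f"Savings: {savings:.0%}")           # ~83%
```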

The missing element here is the analytics suite in the Cloud: with a hybrid model, industrial companies can connect these two technologies to get the benefits of both.

In a hybrid model, the Historian and the Cloud work together to meet the needs of IT and OT teams. The Historian is highly efficient at archiving time series data at scale; users today can store tens of thousands of tags per facility for local reporting and analysis purposes. Historian technology is designed for this purpose, with key features such as economy, efficiency and security.

One of the key technologies used to store large amounts of data is compression, which minimizes the data stored on disk or transferred between servers. A Historian's technology also improves query performance and has built-in aggregation capabilities.
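
To illustrate the idea, here is a minimal sketch of deadband (“exception”) compression in Python. Real Historians use more sophisticated algorithms (such as swinging-door trending), but the principle of discarding samples that carry no new information is the same:

```python
def deadband_compress(samples, deadband):
    """Keep a (timestamp, value) sample only when the value differs
    from the last stored value by more than the deadband."""
    stored, last = [], None
    for t, value in samples:
        if last is None or abs(value - last) > deadband:
            stored.append((t, value))
            last = value
    return stored

# 1 Hz temperature readings: jitter around 20.0 is filtered out,
# while the genuine step at t=3 is kept.
raw = [(0, 20.00), (1, 20.01), (2, 19.99), (3, 20.30), (4, 20.31)]
print(deadband_compress(raw, deadband=0.1))  # [(0, 20.0), (3, 20.3)]
```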

Additionally, Historians have collectors, which transfer data from a source to a destination. These collectors can use the compression and aggregation functionality to dramatically reduce the amount of data transferred. Using a Historian's collectors, users can define both compression ratios and aggregations, so that only significant changes in data values are sent to the Cloud, at the rate needed for analysis. This allows very large amounts of high-resolution data to be stored on the plant while only a fraction of it is sent to the Cloud.
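
Here is a hypothetical sketch of that collector-side reduction: 1 Hz samples are aggregated into per-minute averages before being forwarded, a 60x reduction in transferred data. The function and window names are illustrative, not any vendor's API:

```python
from collections import defaultdict
from statistics import mean

def aggregate_for_cloud(samples, window_s=60):
    """Reduce raw (timestamp, value) samples to one average per
    time window before forwarding them to the Cloud."""
    buckets = defaultdict(list)
    for t, value in samples:
        buckets[(t // window_s) * window_s].append(value)
    return [(t, mean(values)) for t, values in sorted(buckets.items())]

# One hour of 1-second samples -> 60 one-minute averages.
raw = [(t, 20.0 + (t % 7) * 0.01) for t in range(3_600)]
print(len(raw), "->", len(aggregate_for_cloud(raw)))  # 3600 -> 60
```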

The hybrid data model

This combination of on-premise Historians and cloud storage for the data required for analysis, the hybrid model, gives industrial companies a balanced approach to data storage: it is compatible with local plant management and optimization applications, and it minimizes overall costs.

For the on-premises technology in the hybrid data model, it is important to note how Historians compare with relational databases (RDBs). RDBs help companies gain insights by supporting simple operational queries, answering questions such as: which customer ordered the largest batch? Relational databases are designed to manage relationships and are ideal for storing contextual or genealogical information about manufacturing processes, but they are rarely the best approach for collecting and optimizing large amounts of process data.

Historians, on the other hand, are designed for production, process data acquisition and presentation. They maximize the power of time-series data and excel at answering the questions typical of manufacturing people who need to make real-time decisions, such as: what was today's average hourly production compared to a year ago, or two years ago?
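
As a sketch of how such a question might be answered once the time series is in hand (using pandas purely for illustration; the series and dates are made up):

```python
import pandas as pd

# Hypothetical hourly production series, as a historian client
# might return it for two years of operation.
idx = pd.date_range("2021-01-01", periods=2 * 365 * 24, freq="h")
production = pd.Series(100.0, index=idx)  # placeholder values

today = production.loc["2022-12-30"].mean()     # average hourly production today
year_ago = production.loc["2021-12-30"].mean()  # same day, one year earlier
print(f"today: {today:.1f} u/h vs a year ago: {year_ago:.1f} u/h")
```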

Historians offer key advantages over RDBs, namely: built-in data collection capabilities, faster speeds, higher data compression, robust redundancy, higher data security, and faster time-to-value.

In a hybrid cloud model, compression is especially important. The powerful compression algorithms of Plant- or Enterprise-level Historians allow users to easily and securely archive years of data online, which improves performance, reduces maintenance and lowers costs. Files can be created, backed up and deleted automatically, allowing for extended use without the need for a designated database administrator.

As a result, industrial companies can leverage greater process visibility for better and faster decisions, increased productivity, and reduced costs for a sustainable competitive advantage.

Take, for example, Asset Performance Management (APM) solutions, which typically leverage Historian technology on-premises and then send the relevant data to the Cloud. The APM solution accesses the data in the Cloud for analysis and optimization. The hybrid model reduces costs and maintenance while ensuring that process engineers have the data they need for analysis.

In another example, a food manufacturer uses HMI/SCADA and MES solutions together with a Historian to manage time series and A&E (Alarms and Events) data in a hybrid on-premise/Cloud data model. The Historian collects data at high speed from multiple data sources and aggregates and archives it efficiently and securely. A subset of the data is then sent to the Cloud and used by the analytics software. This solution has reduced raw material costs and cut customer complaints by 33%.

Conclusion

With the rise of analytics, industrial companies cannot fully predict today what data they will need to answer tomorrow's questions. Fortunately, the hybrid data model (Cloud and on-premise) lets companies use Historian technology as a flexible and cost-effective way to collect all their data and make it available for sending to the Cloud for subsequent analysis, reprocessing and extraction of valuable KPIs.

Source: Using the hybrid-data model to bridge cloud & on-premises solutions