Machine learning is transforming operations, enabling dramatic gains in efficiency and productivity. However, to fully reap these benefits, you need a way to analyze the high volume of diverse streaming data coming through your machines in real time with full fidelity (not downsampled) and interpret it for actionable insight. Increasingly, this means deploying edge computing, but the question is how to do so.
Cloud computing has merit as a data modeling and machine learning portal, but it suffers from latency issues and cannot provide the real-time responsiveness needed in applications for industrial enterprises. Instead of moving data to the compute location, the challenge is to move the compute to the point where data is generated, in real time. Only then will companies have the intelligence in time to impact operations.
Foghorn is at the forefront of “edgifying” machine learning models and has succeeded in bringing cloud-class machine learning down to small-footprint, compute-constrained edge devices.
Real-time Insights Means Machine Learning at the Edge
Schindler Elevator wanted to put an end to routine problems, such as friction in the doors. As part of this effort, Schindler worked with Foghorn to create a predictive maintenance solution. By analyzing sensor data at the source, Schindler can now determine maintenance needs well in advance, without the cost, latency, security, and other issues associated with transferring large amounts of data outside the building. It can thus efficiently schedule service before anomalies impact performance.
This business value has been seen with numerous other clients. As another example, oil and gas firms working in sites far from city centers can use machine learning at the edge to analyze data, including video and audio. Among other things, this data can be used to predict the pressure across pumps and alert operators about abnormal operating parameters, again all while processing the great majority of the data locally on-site.
Drowning in Tons of Sensor Data
The digital transformation has prompted organizations to install audio, video, and 3D sensors across their operations. With a tsunami of data coming through, it is now hard for organizations to get actionable insight from that data in an efficient and timely manner.
The solution is to move the compute and processing to the edge. According to Gartner, within the next four years 75% of enterprise-generated data will be processed at the edge (versus the cloud), up from less than 10% today. The move to the edge will be driven not only by the vast increase in data, but also by the need for higher-fidelity analysis, lower latency, stronger security, and substantial cost advantages.
Real-Time Analysis is a Necessity
While the cloud is a good place to store data and train machine learning models, it is often unsuitable for real-time data collection or analysis. Bandwidth is a particular challenge, as industrial environments typically lack the network capacity to ship all sensor data up to the cloud. Thus, cloud-based analytics are limited to batch or micro-batch analysis, making it easy to miss blips in the numbers.
In contrast, edge technology can analyze all raw data. This delivers the highest-fidelity analytics, and increases the likelihood of detecting anomalies, enabling immediate reactions—reducing downtime and cutting maintenance costs.
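The full-fidelity, streaming analysis described above can be sketched as a rolling z-score detector that scores every raw reading as it arrives, instead of waiting for a batch. This is an illustrative sketch, not part of any FogHorn product; the window size and threshold are arbitrary choices.

```python
from collections import deque
import math

class RollingAnomalyDetector:
    """Flags readings that deviate sharply from a rolling baseline.

    Window size and z-score threshold are illustrative defaults.
    """

    def __init__(self, window=100, z_threshold=3.0):
        self.window = deque(maxlen=window)  # keeps only the latest readings
        self.z_threshold = z_threshold

    def update(self, value):
        """Return True if `value` is anomalous relative to the current window."""
        anomaly = False
        if len(self.window) >= 10:  # wait for a minimal baseline
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(var)
            if std > 0 and abs(value - mean) / std > self.z_threshold:
                anomaly = True
        self.window.append(value)
        return anomaly
```

Because every reading updates the window, a brief blip that batch analysis would average away is still visible to the detector.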
While the cloud has become more secure over time, numerous risks are still involved in transferring and storing data in the cloud. These security issues often hinder organizations from working in the cloud. With edge computing, businesses have more control over their security, giving them another reason to embrace this approach.
How Edgification Works
But moving analytics to the edge is not simply a matter of changing where the processing happens. The typical machine learning model in use today was developed with assumptions that make sense only in a cloud environment.
- They were developed for batch or micro-batch data, which does not work well with high velocity/volume of streaming data from sensors, etc.
- They were developed with an assumption of unlimited compute, so no constraints were put on model size or weights, making them ill-suited for edge devices, many of which are compute constrained.
- They include preprocessing (alignment, conditioning, filtering, etc.) and post-processing (aggregation, alert generation, etc.) logic as part of the model, which creates major code bloat and does not bode well for constrained edge devices.
- Runtime environments and language of implementation are also not an issue in the cloud, but they are at the edge.
Since these assumptions do not hold true at the edge, the machine learning models need to be adapted for their new environments. In other words, they need to be edgified:
- They need to be connected to streaming data.
- They need data to be preprocessed/enriched (cleansed, filtered, normalized, and contextualized), best accomplished through a complex event processor (CEP).
- The pre- and post-processing logic needs to be extracted out of the model and executed in the CEP engine for a smaller computing load.
- The models can then be tuned, including their weights; together with eliminating the preprocessing elements, this brings the size and computing memory required down by more than 80% in some cases.
- The models finally need to be converted into an expression language designed specifically for the edge. This enables fast and efficient execution in resource-constrained environments.
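Two of the steps above, pruning near-zero weights and then quantizing the remainder so each value fits in a single signed byte, can be sketched as follows. The function names and thresholds are hypothetical illustrations, not a FogHorn API.

```python
def edgify_weights(weights, prune_below=1e-3):
    """Shrink a trained layer's weights for a compute-constrained edge device.

    Step 1: prune weights whose magnitude is below `prune_below` to zero.
    Step 2: quantize the rest to the int8 range [-127, 127] with one
    shared scale factor, so each weight occupies one byte instead of four.
    """
    pruned = [0.0 if abs(w) < prune_below else w for w in weights]
    scale = max((abs(w) for w in pruned), default=0.0) / 127.0
    if scale == 0.0:
        scale = 1.0  # avoid division by zero for an all-zero layer
    quantized = [round(w / scale) for w in pruned]
    return quantized, scale

def dequantize(quantized, scale):
    """Recover approximate float weights on the device at inference time."""
    return [v * scale for v in quantized]
```

Pruning and 8-bit quantization together account for most of the size reduction; the roughly 1% precision lost per weight is usually acceptable for anomaly-detection workloads.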
The cloud plays a critical role in ML model creation and training, especially for deep learning models as significant compute resources are required. Once the model is trained, it can then be “transformed” through edgification and pushed to the edge.
Ultimately, edge inferences will frequently be sent to the cloud to further tune the models, and the updated models pushed back to the edge in a highly iterative, closed-loop fashion. So "AI" in IIoT can be summarized as this closed-loop, edge-to-cloud machine learning and model edgification.
Of course, edge computing alone is not enough. To distribute and analyze data at an enterprise scale, machine learning systems must span from the edge to the cloud. Foghorn has taken a three-layer approach to data processing:
- Enrichment. The enrichment layer gets the data ready for future processing through decoding, filtering, interpolation, and more. In short, this layer enhances data quality to ensure that the other layers achieve good results.
- Complex Event Processing (CEP). This layer is used by the many businesses that already know the problems and patterns they face. These companies can express their patterns and problems in the CEP engine to generate a tool for data analysis.
- Machine Learning Engine. The Machine Learning Engine is pre-packaged with models that assist with anomaly detection, such as decision trees, regressions, and clustering. This layer is where the edge and cloud overlap.
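A minimal sketch of the three layers chained together might look like the following, assuming a pressure sensor emitting PSI readings. The field names, the 550 kPa rule, and the stand-in model are all hypothetical, chosen only to show how data flows through the layers.

```python
def enrich(reading):
    """Enrichment layer: filter out corrupt samples and normalize units
    (simplified; real enrichment also decodes, interpolates, etc.)."""
    if reading is None or reading["psi"] < 0:
        return None                          # drop corrupt samples
    reading["kpa"] = reading["psi"] * 6.89476  # PSI -> kPa
    return reading

def cep_rule(reading):
    """CEP layer: a known problem pattern expressed as an explicit rule.
    The 550 kPa limit is an illustrative threshold."""
    return reading["kpa"] > 550

def ml_score(reading, model):
    """ML engine layer: score the enriched reading with a packaged model.
    Here `model` is just a callable standing in for a real model."""
    return model(reading["kpa"])

def process(stream, model):
    """Run each sensor reading through all three layers in order."""
    alerts = []
    for raw in stream:
        r = enrich(raw)
        if r is None:
            continue
        if cep_rule(r) or ml_score(r, model) > 0.9:
            alerts.append(r)
    return alerts
```

The design point is the ordering: enrichment guarantees clean, normalized input, so both the CEP rules and the ML models downstream can assume well-formed data.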
The Machine Learning Engine uses a combination of supervised and unsupervised learning. If a company already has sufficient historical data, the supervised learning can take place in the cloud. If not, the model can be developed at the edge as the data starts coming in.
But there are times when unsupervised learning techniques are needed to revise the model. Through unsupervised learning, the model can teach itself, applying incremental updates as new data arrives over time.
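One way such self-revision can work is an online baseline that updates its statistics one reading at a time, with no labeled or historical data required. The sketch below uses Welford's online algorithm for the running mean and variance; it is an illustration of the idea, not FogHorn's implementation.

```python
class IncrementalBaseline:
    """Unsupervised model that revises itself as unlabeled readings arrive.

    Welford's online algorithm updates the mean and variance one sample
    at a time, so no historical dataset is needed up front and memory
    use stays constant regardless of how much data has streamed past.
    """

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x):
        """Incorporate one new reading into the running baseline."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        """Population variance of everything seen so far."""
        return self.m2 / self.n if self.n > 1 else 0.0
```

Because each update is O(1) in time and memory, this kind of model fits comfortably on a compute-constrained edge device while it waits for enough data to train something richer.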
Edgification Provides Real Operational Benefits
Edgification promises numerous benefits, including:
- Massive reduction of data. When analytics move to the edge, there is a massive decrease in the amount of data pushed across the network. This reduces data storage and data-handling costs.
- Better real-time insights. By keeping the computing close to the data source, edgified machine learning can detect emerging patterns and enable immediate responses.
- Predictive maintenance for all. Because an edge-based system can handle all incoming machine data, it can predict maintenance needs across all equipment in the operation.
- Improved yield. Manufacturers can increase productivity and reduce downtime by rapidly detecting and addressing suboptimal performance.
With the benefits mentioned above, edgification appears poised to lead the next wave of the IoT market: making real-time analysis easier, increasing operational efficiency, and decreasing the costs of handling and storing data.