Machine learning is transforming operations, enabling dramatic gains in efficiency and productivity. However, to fully reap these benefits, you need a way to analyze the high volume of diverse streaming data coming through your machines in real time with full fidelity (not downsampled) and interpret it for actionable insight. Increasingly, this means deploying edge computing, but the question is how to do so.
Cloud computing has merit as a data modeling and machine learning portal, but it suffers from latency issues and cannot provide the real-time responsiveness needed in applications for industrial enterprises. Instead of moving data to the compute location, the challenge is to move the compute to the point where data is generated, in real time. Only then will companies have the intelligence in time to impact operations.
Foghorn is at the forefront of “edgifying” machine learning models and has succeeded in bringing cloud-class machine learning down to small-footprint, compute-constrained edge devices.
Drowning in Tons of Sensor Data
The digital transformation has prompted organizations to install audio, video, and 3D sensors across their operations. The resulting tsunami of data makes it hard to extract actionable insight efficiently and in time to act on it.
The solution is to move compute and processing to the edge. According to Gartner, within the next four years 75% of enterprise-generated data will be processed at the edge (versus the cloud), up from less than 10% today. The move to the edge will be driven not only by the vast increase in data, but also by the need for higher-fidelity analysis, lower latency, stronger security, and significant cost advantages.
Real-Time Analysis Is a Necessity
While the cloud is a good place to store data and train machine learning models, it is often unsuitable for real-time data collection or analysis. Bandwidth is a particular challenge, as industrial environments typically lack the network capacity to ship all sensor data up to the cloud. Thus, cloud-based analytics are limited to batch or micro-batch analysis, making it easy to miss blips in the numbers.
In contrast, edge technology can analyze all raw data. This delivers the highest-fidelity analytics, and increases the likelihood of detecting anomalies, enabling immediate reactions—reducing downtime and cutting maintenance costs.
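As a minimal sketch of why full fidelity matters, the toy stream below (all numbers invented for illustration) contains a three-sample transient that per-sample checking at the edge catches but a batch-averaged, downsampled view misses entirely:

```python
# Sketch: a short transient spike in a sensor stream is caught by
# full-fidelity per-sample checking but hidden when the stream is
# downsampled before analysis. All values here are illustrative.

THRESHOLD = 10.0  # assumed alert threshold for this hypothetical sensor

# Raw stream: steady baseline with one 3-sample spike
raw = [1.0] * 100
raw[47:50] = [15.0, 18.0, 14.0]  # transient fault signature

def full_fidelity_alerts(samples, threshold):
    """Check every raw sample, as an edge system can."""
    return [i for i, v in enumerate(samples) if v > threshold]

def downsampled_alerts(samples, threshold, factor=10):
    """Average every `factor` samples before checking (batch-style)."""
    means = [sum(samples[i:i + factor]) / factor
             for i in range(0, len(samples), factor)]
    return [i for i, v in enumerate(means) if v > threshold]

print(full_fidelity_alerts(raw, THRESHOLD))  # the spike is detected
print(downsampled_alerts(raw, THRESHOLD))    # the spike is averaged away
```

The spike survives per-sample inspection but disappears once it is averaged into its ten-sample window, which is exactly the "blip" a batch pipeline misses.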
While the cloud has become more secure over time, transferring and storing data there still carries numerous risks, and these security concerns often keep organizations from moving workloads to the cloud. With edge computing, businesses retain more control over their security, giving them another reason to embrace this approach.
The cloud plays a critical role in ML model creation and training, especially for deep learning models, which require significant compute resources. Once a model is trained, it can be “transformed” through edgification and pushed to the edge.
Ultimately, edge inferences will frequently be sent to the cloud to further tune the models, and the updated models pushed back to the edge in a highly iterative, closed-loop fashion. “AI” in IIoT can thus be summarized as this closed-loop, edge-to-cloud machine learning and model edgification.
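The closed loop can be sketched in miniature. The edge and cloud sides below are just local stand-in functions, and the threshold "model" and its tuning rule are invented for illustration; only the loop structure is meant to carry over:

```python
# Sketch of the closed loop: the edge flags samples with a simple
# threshold model, ships the flagged samples to a (simulated) cloud
# step that retunes the model, and receives the updated model back.
# The "cloud" here is just a local function; all values are invented.

def edge_infer(model, samples):
    """Edge inference: flag samples above the model's threshold."""
    return [s for s in samples if s > model["threshold"]]

def cloud_retrain(model, flagged):
    """Cloud tuning: nudge the threshold toward the flagged data.
    A stand-in for real retraining on accumulated edge inferences."""
    if flagged:
        model = dict(model, threshold=0.5 * (model["threshold"] + max(flagged)))
    return model

model = {"threshold": 5.0}
stream_batches = [[1.0, 2.0, 9.0], [1.5, 7.0, 2.0], [1.0, 2.0, 3.0]]

for batch in stream_batches:
    flagged = edge_infer(model, batch)    # inference at the edge
    model = cloud_retrain(model, flagged) # tuning "in the cloud"
    print(model["threshold"])             # updated model pushed back
```

In a real deployment the cloud step would be a training service and the "push back" a model download to the edge runtime; the iterative edge-to-cloud-to-edge cycle is the point.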
Of course, edge computing alone is not enough. To distribute and analyze data at an enterprise scale, machine learning systems must span from the edge to the cloud. Foghorn has taken a three-layer approach to data processing:
- Enrichment. The enrichment layer gets the data ready for further processing through decoding, filtering, interpolation, and more. In short, this layer enhances data quality to ensure that the other layers achieve good results.
- Complex Event Processing (CEP). Many businesses already know the problems and patterns they are looking for; this layer lets them express those patterns as rules in the CEP engine and turn them into real-time data analysis.
- Machine Learning Engine. The Machine Learning Engine comes pre-packaged with models that assist with anomaly detection, such as decision trees, regressions, and clustering. This layer is where the edge and cloud overlap.
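The three-layer flow might be sketched as a tiny in-process pipeline. The layer names follow the text above, while the field names, rule, and scoring model are illustrative assumptions, not Foghorn's actual implementation:

```python
# Tiny sketch of the three-layer flow: enrichment decodes and
# gap-fills the raw readings, a CEP-style rule checks a known
# pattern, and a simple model scores each value. All names and
# thresholds are illustrative.

raw_readings = [{"temp_c": "21.0"}, {"temp_c": None},
                {"temp_c": "85.5"}, {"temp_c": "22.4"}]

def enrich(readings):
    """Enrichment layer: decode strings to floats, interpolate gaps."""
    vals = [None if r["temp_c"] is None else float(r["temp_c"])
            for r in readings]
    for i, v in enumerate(vals):
        if v is None:  # fill a missing reading from its neighbours
            prev = vals[i - 1] if i > 0 else vals[i + 1]
            nxt = vals[i + 1] if i + 1 < len(vals) else prev
            vals[i] = (prev + nxt) / 2
    return vals

def cep_rule(vals, limit=80.0):
    """CEP layer: a known pattern expressed as a simple rule."""
    return [i for i, v in enumerate(vals) if v > limit]

def ml_score(vals):
    """ML-engine stand-in: z-score each value against the batch."""
    mean = sum(vals) / len(vals)
    std = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5 or 1.0
    return [(v - mean) / std for v in vals]

clean = enrich(raw_readings)  # decoded, gap-filled readings
print(cep_rule(clean))        # indices matching the known rule
print(ml_score(clean))        # anomaly scores from the ML layer
```

Note how the layers build on each other: the rule and the model both consume the enriched values, which is why data quality in the first layer matters.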
The Machine Learning Engine uses a combination of supervised and unsupervised learning. If a company already has sufficient historical data, the supervised learning can take place in the cloud. If not, the model can be developed at the edge as the data starts coming in.
At other times, unsupervised learning techniques are needed to revise the model incrementally: the model teaches itself from incoming data, refining its updates over time.
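One common way to realize such self-teaching incremental updates (an illustrative choice, not necessarily Foghorn's implementation) is an online estimator such as Welford's algorithm, which refines a running normality model with each arriving sample:

```python
# Illustrative incremental update: an unsupervised detector keeps a
# running mean/variance (Welford's algorithm) and flags samples more
# than `k` standard deviations from the mean. The model refines
# itself as data arrives -- no labelled history needed.

class OnlineAnomalyDetector:
    def __init__(self, k=3.0):
        self.k, self.n, self.mean, self.m2 = k, 0, 0.0, 0.0

    def update(self, x):
        """Fold one sample into the running statistics (Welford)."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_anomaly(self, x):
        """Flag x if it lies more than k std-devs from the mean."""
        if self.n < 2:
            return False  # not enough history yet
        std = (self.m2 / (self.n - 1)) ** 0.5
        return std > 0 and abs(x - self.mean) > self.k * std

det = OnlineAnomalyDetector(k=3.0)
flags = []
for x in [10.0, 10.2, 9.9, 10.1, 10.0, 25.0, 10.1]:
    flags.append(det.is_anomaly(x))  # score against current model...
    det.update(x)                    # ...then incrementally update

print(flags)  # only the outlier is flagged
```

A production detector would also guard against anomalous samples contaminating the running statistics (e.g. by skipping the update for flagged values), but the incremental, self-updating shape is the same.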
Edgification Provides Real Operational Benefits
Edgification promises numerous benefits, including:
- Massive reduction of data. When analytics move to the edge, there is a massive decrease in the amount of data pushed across the network. This reduces data storage and data-handling costs.
- Better real-time insights. By keeping the computing close to the data source, edgified machine learning can detect emerging patterns and enable immediate responses.
- Predictive maintenance for all. Because an edge-based system can handle all incoming machine data, it can predict maintenance needs across all equipment in the operation.
- Improved yield. Catching process deviations as they happen lets operators correct problems before they produce scrap, improving overall production yield.
With these benefits, edgification is poised to lead the next wave of the IoT market: by making real-time analysis practical, it increases operational efficiency while reducing the costs of handling and storing data.