Our Journey Towards Building the Most Advanced Edge Intelligence Platform for IIoT
By Abhi Sharma, Head of Analytics and Machine Learning, FogHorn Systems
Happy 2018! After ending 2017 on a high note with numerous awards, I’m excited to continue this blog series.
In part one of this four-part series, we discussed edge computing and why it’s indispensable for IIoT applications. We concretely described our vision for analytics and machine learning at the edge.
In part two, we dove into the specific technical details of why complex, non-trivial analytics must be performed in real time at the edge, across heterogeneous devices. Shipping all this data to a centralized cloud computing environment simply doesn’t work for the fully instrumented, autonomous computing environment the industry is heading towards. To that end, we brought to market the most advanced, extremely low-footprint, low-latency complex event processing (CEP) engine. Our CEP engine works much like the fast-processing part of the human brain.
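To make the CEP idea concrete, here is a minimal, hypothetical sketch (in Python, not FogHorn's actual implementation) of complex event processing over a sensor stream: a bounded sliding window detects a pattern, such as a sustained temperature rise, using only a few readings' worth of memory.

```python
from collections import deque

def detect_sustained_rise(readings, window=5, threshold=2.0):
    """Flag timestamps where temperature rose by more than
    `threshold` degrees across the last `window` readings."""
    buf = deque(maxlen=window)   # bounded memory: edge-friendly
    alerts = []
    for t, temp in readings:
        buf.append(temp)
        if len(buf) == window and buf[-1] - buf[0] > threshold:
            alerts.append(t)
    return alerts

stream = [(0, 20.0), (1, 20.5), (2, 21.0), (3, 22.5), (4, 23.5), (5, 23.6)]
print(detect_sustained_rise(stream))  # → [4, 5]
```

The key property is that state is bounded and each reading is processed as it arrives, so latency and footprint stay constant no matter how long the stream runs.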
In this post, we evaluate another inevitable first principle about IIoT:
No tangible business outcome in IIoT can ever be achieved without incorporating the domain expertise and tribal knowledge of people into the analytics and machine learning software at the edge. This includes having the ability to swiftly iterate and improve on solutions with augmentation from automated intelligence software.
Reasoning from the above first principle:
We have discussed at great length in part one and part two why an advanced, distributed, low-latency analytics engine is foundational for the edge. But foundational technology alone does not create adoption.
As an example, let’s go back to the 1970s–1980s and think about the computer itself. People had no idea why they would ever need a computer. When computers were first introduced, they were considered esoteric and had little adoption until Excel spreadsheets (along with many other things) came along and changed the whole story for everyone across so many domains and industries. It suddenly made sense to use computers to develop all kinds of interesting solutions. An unstoppable burst of applications followed from there on, and the rest is history as we know it.
IIoT is very much at the same stage today: the enabling technology exists, but the applications that make it indispensable across domains are still taking shape.
One might ask: why is that? Why is IIoT at such an early stage?
Answer: One part domain knowledge, one part automated intelligence
IIoT applications are very domain-centric. Even if the domain is the same from business to business, each company’s physical environment varies so much that it requires localized context and human knowledge to solve real problems. For example, even though someone might be able to write great C++ code or create an interesting machine learning model, in general, they most likely lack the expertise to know what it takes to manufacture an electric capacitor or understand the mechanics of a running elevator. On the other hand, a factory manager or a technician probably understands the vagaries of the machines and can intuitively predict issues. That human intuition is a combination of domain expertise + real-time intelligence.
Combining the power of automated intelligence with domain knowledge helps identify why certain sensor readings matter more than others, what needs to be instrumented and measured at the right time, and where advanced analytics should be applied to derive the higher-level insights that matter. The ability to combine technology with domain expertise is therefore the keystone to achieving successful business outcomes in IIoT. The conclusion: edge-computing technologies must be usable by domain experts, not just by software engineers and data scientists.
And now to touch on the third factor in the equation, the one that closes the loop:
Iterative Learning and Improvement
If there is one constant in technical and business advancement, it is iterative learning and improvement. This can only happen when tools let domain experts experiment, observe outcomes and refine their analytics quickly.
Eventually, intelligent automation then drives the whole loop along with higher-order insights coming from humans. This evolution is the most important, but one of the most underestimated and misunderstood ground truths for IIoT. We observe that a lot of technology leaders today tend to trivialize the IIoT effort by suggesting that IIoT use cases can be solved by simply pointing random machine learning algorithms to data, performing threshold alerting, and finally storing data in the cloud. Unfortunately, this approach only lends credence to the old saying, “Garbage in, garbage out.”
So how did FogHorn solve this to enable true edge intelligence for OT users?
Engineering the platform
Based on the understanding above, our platform needed to be super-efficient, but also easy for OT specialists to use for analytics and machine learning. To that end, we created a purpose-built analytics expression language called VEL, paired with a visual authoring workbench.
VEL allows people (even those without computer science or IT backgrounds) to describe complex analytics as easy-to-understand mathematical formulas, rather than traditional if-then-else programming logic. This paradigm shifts the mental model from a programming problem to one of data-flow analysis: less code and more analytics.
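Since VEL syntax is proprietary, here is a rough illustration of the same mental-model shift in Python (the names are hypothetical): instead of stateful if-then-else control flow, the analytic is stated as a single formula-like expression over the data.

```python
# Imperative style: explicit if-then-else control flow.
def overtemp_imperative(temps, limit=100):
    alerts = []
    for t in temps:
        if t > limit:
            alerts.append(t)
    return alerts

# Declarative, formula-like style: the analytic is one expression
# over the data flow, closer to how an engineer states a rule.
def overtemp_declarative(temps, limit=100):
    return [t for t in temps if t > limit]

print(overtemp_declarative([95, 101, 99, 104]))  # → [101, 104]
```

Both compute the same alerts, but the declarative form reads as the rule itself ("temperatures above the limit") rather than as instructions for a machine.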
Now, one key point to note: it is not just the visual workbench that solves the problem. A common misconception in the IIoT market today is that slapping on a drag-and-drop tool where you don’t have to write code makes a platform OT-centric. It only hides the complexity underneath.
What really bubbles up as simplicity to the user is the set of fundamental language constructs that are designed in from the ground up.
In our upcoming release this summer, our CEP engine will also run pre-trained machine learning models from the cloud right at the edge in an extremely efficient way, simply by importing them and binding them to live streaming sensor data. One of the most respected VC firms in Silicon Valley, a16z, seems to deeply agree with our approach; check out their video for more information.
VEL is optimized for development ease and efficiency:
- Significantly reduced programming effort to develop IIoT use cases with VEL. You require zero infrastructure or boilerplate code; just pure business logic expressions.
- Compile-time syntax and semantic checking (unlike Python), catching entire classes of errors before a single line runs at the edge.
- OT-centric tools for visual analytic authoring, development and data-flow debugging. The tools follow a simple recursive mental model of input-compute-output at any layer of complexity, allowing domain experts to think simply. This resonates well with the OT-centric toolchain: PLCs, LabVIEW, Ladder Logic programming and the like.
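That recursive input-compute-output idea can be sketched in Python (hypothetical names, standing in for VEL): each block takes a stream in, computes, and emits a stream out, so blocks compose into bigger blocks with exactly the same shape.

```python
def smooth(stream, k=3):
    """Compute block: moving average over the last k inputs."""
    out, buf = [], []
    for x in stream:
        buf = (buf + [x])[-k:]          # keep only the last k values
        out.append(sum(buf) / len(buf))
    return out

def threshold(stream, limit):
    """Compute block: boolean alert stream."""
    return [x > limit for x in stream]

def vibration_alert(stream):
    """A composite block: input -> smooth -> threshold -> output.
    Same input-compute-output shape as the blocks it contains."""
    return threshold(smooth(stream), limit=5.0)

print(vibration_alert([1.0, 2.0, 9.0, 9.0, 9.0]))
# → [False, False, False, True, True]
```

Because the composite has the same shape as its parts, the model stays simple at every layer of nesting, much like wiring function blocks in a PLC environment.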
Here’s a real example. One customer built an error-prone IoT program in Python that required over 3,000 lines of code. When they converted the program to VEL, it shrank to about 60 lines, with semantic checking and data simulations available at authoring time.
VEL enables faster improvement iterations:
- VEL enables tinkering and exploratory analysis. This allows quick iterations without having to worry about productionizing a solution. VEL takes care of all the runtime production efficiencies, offering highly optimized runtime performance.
- VEL also lets you import a machine learning model trained with any tool (such as Python, IBM, SAS, R or KNIME) and run it directly at the edge through VEL’s efficient code generation. This gives you a tool- and language-agnostic way to run machine learning models right at the edge.
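To illustrate the tool-agnostic idea with a hypothetical sketch (not FogHorn's actual import path): a model trained offline in any framework is reduced to its learned parameters, and a small scoring function applies those parameters to live readings at the edge, so no training framework is needed at runtime.

```python
# Learned offline in any tool; only the coefficients travel to the edge.
model = {"weights": [0.8, -0.5], "bias": 0.1, "threshold": 0.0}

def score(model, features):
    """Apply a simple linear model to one sensor reading vector."""
    z = model["bias"] + sum(w * x for w, x in zip(model["weights"], features))
    return z > model["threshold"]

readings = [[1.0, 0.2], [0.1, 1.5]]
print([score(model, r) for r in readings])  # → [True, False]
```

The same separation underlies real interchange formats such as PMML or ONNX: training is tool-specific, but the exported parameters can be executed by any compact runtime.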
All of the above analysis, design and engineering serves the singular purpose for which we started FogHorn in the first place. In the next and final blog of this four-part series, I will talk about that purpose, the practical aspects and actual meaning of machine learning at the edge, and why bringing buzzwords like ML and AI to the edge starts with making data humanly consumable.
VEL is a trademark of FogHorn Systems.