Building an Edge Intelligence Platform for IIoT—Part 3 of 4

Our Journey Towards Building the Most Advanced Edge Intelligence Platform for IIoT


By Abhi Sharma, Head of Analytics and Machine Learning, FogHorn Systems

Happy 2018! After ending 2017 on a high note with numerous awards, I’m excited to pen the third part of my blog series. Let’s quickly recap what we discussed in the earlier posts.

In part one of this four-part series, we discussed edge computing and why it’s indispensable for IIoT applications. We concretely described our vision for analytics and machine learning at the edge.

In part two, we dove into the specific technical details of why complex, non-trivial analytics must be performed in real time at the edge, across heterogeneous devices. Shipping all this data to a centralized cloud computing environment simply doesn’t work for the fully instrumented, autonomous computing environment the industry is heading towards. To that end, we brought to market the most advanced, extremely low-footprint, low-latency complex event processing (CEP) engine. Our CEP engine works much like the fast-processing part of the human brain, enriching, characterizing, and summarizing data in flight. For the first time, FogHorn’s technology makes it possible to run cloud-like analytics and trained machine learning models right at the edge.
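To make “summarizing data in flight” a little more concrete, here is a minimal, hypothetical Python sketch of the general idea behind windowed stream summarization. It is only an illustration of the CEP concept, not FogHorn’s engine or the VEL language, and the sensor values and window size are invented.

```python
from collections import deque
from statistics import mean

WINDOW_SIZE = 10  # number of recent readings to keep per summary (made-up choice)

def summarize_stream(readings, window_size=WINDOW_SIZE):
    """Yield one enriched summary per incoming reading, computed over a sliding window."""
    window = deque(maxlen=window_size)
    for value in readings:
        window.append(value)
        # Instead of forwarding every raw reading, emit a compact characterization of it.
        yield {
            "latest": value,
            "window_mean": mean(window),
            "window_min": min(window),
            "window_max": max(window),
        }

if __name__ == "__main__":
    simulated = [0.12, 0.15, 0.11, 0.14, 0.95, 0.13, 0.12]  # fake vibration readings
    for summary in summarize_stream(simulated):
        print(summary)
```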

In this post, we evaluate another inevitable first principle about IIoT:

No tangible business outcome in IIoT can ever be achieved without incorporating the domain expertise and tribal knowledge of people into the analytics and machine learning software at the edge. This includes having the ability to swiftly iterate and improve on solutions with augmentation from automated intelligence software.

[Diagram: Intelligent Edge Platform for IIoT, blog 3 of 4]

Reasoning from the above first principle:

We discussed at great length in parts one and two why an advanced, distributed, and localized software intelligence platform is so important as sensors instrument almost everything in the physical world. The issue is that any new, advanced platform technology is useless unless you can build valuable applications on it.

As an example, let’s go back to the 1970s and 1980s and think about computers themselves. People had no idea why they would ever need a computer. When computers were first introduced, they were considered esoteric and saw little adoption until Excel spreadsheets (along with many other things) came along and changed the whole story for everyone across so many domains and industries. It suddenly made sense to use computers to develop all kinds of interesting solutions. An unstoppable burst of applications followed, and the rest is history as we know it.
IIoT is at much the same stage, but in a more modern and complex setting. First, you need an advanced edge intelligence platform (just as you needed the computer itself), which in its own right requires meticulous craftsmanship. But to truly bring out business value across numerous domains, we also have to build the Excel equivalent for such an advanced IIoT platform, allowing people everywhere to build interesting solutions for the modern industrial age, and to build them fast.

One might ask: why is that? Why is IIoT at the same stage I describe above?
Answer: One part domain knowledge, one part automated intelligence

IIoT applications are very domain-centric. Even if the domain is the same from business to business, each company’s physical environment varies so much that solving real problems requires localized context and human knowledge. For example, someone who can write great C++ code or create an interesting machine learning model most likely lacks the expertise to know what it takes to manufacture an electric capacitor or understand the mechanics of a running elevator. On the other hand, a factory manager or a technician probably understands the vagaries of the machines and can intuitively predict issues. That human intuition is a combination of domain expertise and real-time intelligence. The answer, then, is simple: you need both.

Combining the power of automated intelligence with domain knowledge helps identify why certain sensor readings are more important than others, what needs to be instrumented and measured at the right time, and where advanced analytics should be applied to derive the higher-level insights that matter. The ability to combine technology with domain expertise is therefore the keystone to achieving successful business outcomes in IIoT. The conclusion: edge-computing technologies must be usable not only by IT personnel but also by OT people. Slowly, the physical world is becoming the domain of IT, not just centralized data centers and servers.

And now to touch on the third factor in the equation (the circle):

Iterative Learning and Improvement
If there is one constant in technical and business advancement, it is iterative learning and improvement. This can only happen when tools make it insanely easy to capture tribal knowledge from domain experts and then let you leverage data in real time. Here we see the birth of useful applications, which leads to numerous iterations and improved automation that accelerates as more insights are learned over time.

Eventually, intelligent automation drives the whole loop, together with higher-order insights coming from humans. This evolution is the most important, yet one of the most underestimated and misunderstood, ground truths of IIoT. We observe that many technology leaders today tend to trivialize the IIoT effort by suggesting that IIoT use cases can be solved simply by pointing arbitrary machine learning algorithms at data, performing threshold alerting, and storing data in the cloud. Unfortunately, this approach only lends credence to the old saying, “Garbage in, garbage out.”
So how did FogHorn solve this to enable high-value apps in IIoT? FogHorn’s software platform had to do for IIoT what Excel did for computers in order to make a massive impact on the industry, and we are doing exactly that.

Engineering the Platform

Based on the understanding above, our platform needed to be super-efficient, but also easy for OT specialists to use for analytics and machine learning. To that end, we created a script-like, reactive, functional language and platform called VEL™ that lets you run analytics and machine learning right at the edge.

VEL allows people (even those without computer science and IT backgrounds) to describe complex analytics in terms of easy-to-understand mathematical formulas, rather than traditional if-then-else programming logic. This paradigm changes the mental model from a programming problem to one of data-flow analysis: less code and more math, if you will. Pair this with our visual workbench and you’ll see that, following the analogy above, we deliver the “Excel” for IIoT.
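To illustrate the shift in mental model, here is a small, hypothetical comparison written in Python rather than VEL; the readings, window length, and threshold are invented, and the point is only the difference between spelling out control flow and stating the analytic as a formula over the data.

```python
from statistics import mean

temps = [71.2, 73.8, 88.9, 90.4, 91.7, 72.5]  # made-up temperature readings

# Imperative, if-then-else style: the programmer manages state and control flow.
alerts_imperative = []
recent = []
for t in temps:
    recent.append(t)
    if len(recent) > 3:
        recent.pop(0)
    if len(recent) == 3 and sum(recent) / 3 > 85.0:
        alerts_imperative.append(recent[-1])

# Formula style: "alert when the mean of the last three readings exceeds 85."
windows = [temps[i - 2:i + 1] for i in range(2, len(temps))]
alerts_declarative = [w[-1] for w in windows if mean(w) > 85.0]

print(alerts_imperative)   # [91.7]
print(alerts_declarative)  # [91.7]
```

Both snippets raise the same alert, but the second states what the analytic means rather than how to compute it, which is the style of thinking VEL is built around.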

Now, one key point to note: it is not just the visual workbench that solves the problem. A huge misconception in the IIoT market today is that slapping on a drag-and-drop tool, so that nobody has to write code, solves the problem of being OT-centric. Such tools help only with very simple, trivial proof-of-concept problems, which are far from real-world needs. Most of them turn out to be nearly useless when the rubber hits the road.

What bubbles up to the user as simplicity is really the fundamental language constructs that are designed in as first-class elements. As an analogy: imagine building an app without the iOS or Android platform. Sure, you could do it, but it would be massively hard, and you could create only a few apps. That is why we went to great lengths to solve the problem from the ground up, taking a language-driven approach and writing a compiler and language called VEL. OT-centric means that you can articulate a complex problem in simple English and then turn it into a program just as easily, without having to change your mental model, reframe the level of abstraction, constrain the expression of your analytic intent, or give up advanced capabilities.

In our upcoming release this summer, our CEP engine will also allow running pre-trained machine learning models from the cloud right at the edge in an extremely efficient way, by simply importing them and binding them to live streaming sensor data. One of the most respected VC firms in Silicon Valley, a16z, seems to deeply agree with our approach. Check out their video for more information.
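As a rough, hypothetical sketch of that train-in-the-cloud, score-at-the-edge pattern, here is what the workflow can look like in plain Python with scikit-learn; this is not FogHorn’s import mechanism or VEL, and the features, data, and labels are all invented.

```python
from sklearn.linear_model import LogisticRegression

# --- Cloud side (normally done elsewhere): train a model on historical data ---
X_train = [[101.0, 0.10], [102.0, 0.12], [140.0, 0.90], [138.0, 0.85]]  # pressure, vibration
y_train = [0, 0, 1, 1]                                                  # 0 = normal, 1 = anomaly
model = LogisticRegression().fit(X_train, y_train)

# --- Edge side: bind the trained model to a live stream of readings ---
def sensor_stream():
    """Stand-in for live sensor data arriving at the edge (made-up values)."""
    yield {"pressure": 101.3, "vibration": 0.11}
    yield {"pressure": 141.2, "vibration": 0.88}

for reading in sensor_stream():
    features = [[reading["pressure"], reading["vibration"]]]
    if model.predict(features)[0] == 1:   # score each reading as it arrives
        print("anomaly detected:", reading)
```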

VEL is optimized for development ease and efficiency:

  • Significantly reduced programming effort to develop IIoT use cases with VEL. You need zero infrastructure or boilerplate code; just pure business-logic expressions.
  • Syntax and semantic checking at compile time (unlike Python), catching whole classes of errors before an analytic is ever deployed.
  • OT-centric tools for visual analytic authoring, development, and data-flow debugging. The tools follow a simple, recursive mental model of input-compute-output at any layer of complexity, allowing domain experts to think simply (see the sketch after this list). This resonates well with familiar OT toolchains such as PLC, LabVIEW, and Ladder Logic programming.
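Here is a minimal, hypothetical Python sketch of that recursive input-compute-output idea (again an analogy, not VEL): every block consumes a stream and produces a stream, so a composite pipeline is itself just another input-compute-output block.

```python
def to_fahrenheit(stream):
    """Enrich block: convert raw Celsius readings to Fahrenheit."""
    for celsius in stream:
        yield celsius * 9 / 5 + 32

def rate_of_change(stream):
    """Compute block: difference between consecutive readings."""
    previous = None
    for value in stream:
        if previous is not None:
            yield value - previous
        previous = value

def spike_alerts(stream, limit=10.0):
    """Output block: keep only jumps larger than the limit."""
    for delta in stream:
        if abs(delta) > limit:
            yield delta

def pipeline(stream):
    """The composite is itself just input -> compute -> output, one layer up."""
    return spike_alerts(rate_of_change(to_fahrenheit(stream)))

if __name__ == "__main__":
    celsius_readings = [21.0, 21.4, 29.8, 30.1, 21.9]  # made-up sensor input
    for jump in pipeline(celsius_readings):
        print("temperature jump (F):", round(jump, 1))
```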

Here’s a real example. One customer built an error-prone IoT program in Python that required over 3000 lines of code. When they converted their program to VEL, they reduced the code to about 60 lines of VEL with semantic checking and data simulations available at the time of authoring.

VEL enables faster improvement iterations:

  • VEL enables tinkering and exploratory analysis, allowing quick iterations without having to worry about productionizing a solution. VEL takes care of the production runtime concerns for you, delivering highly optimized runtime performance.
  • VEL also lets you import a machine learning model trained with any tool (such as Python, IBM, SAS, R, KNIME, etc.) and run it directly at the edge through efficient code generation in VEL. This gives you a tool- and language-agnostic way to run machine learning models right at the edge.

All of the above analysis, design, and engineering serves the singular purpose for which we started FogHorn in the first place. In the next and final blog of this four-part series, I will talk about that purpose, the practical aspects and actual meaning of machine learning at the edge, and why enabling fancy words like ML and AI at the edge starts with making ‘data’ humanly consumable.

VEL is a trademark of FogHorn Systems.