About This File
The Intelligent Equipment Accelerator provides a reference architecture and code assets for building telemetry monitoring solutions within equipment hierarchies. It is primarily configuration-driven, which allows a flexible object hierarchy based on the generic concept of Entities. Attached to these Entities are Devices, which represent data-producing sensors. The platform illustrates how capturing sensor telemetry can be used to gain business insights.
Business Scenario
Most modern equipment is instrumented in some way, with a variety of telemetry captured from sensors: from cars to electronics to lightbulbs. Gathering this data and making sense of it is a key problem for equipment owners. Once data is captured, either on edge devices or within a core infrastructure, the challenge becomes detecting patterns and meaningful behaviours in the noise. Through the use of rule-based systems and data science models, actionable insights can be gleaned. These insights allow operators to take action in developing situations, or simply to capture the data to refine models for future improvements to the system.
Concepts
The Intelligent Equipment Accelerator has a generic data model that is configuration driven. At the top level there are two main concepts:
Devices -- are anything that produces a stream of data, also known as sensors. They typically produce data triplets at high frequency, consisting of a unique identifier, a timestamp, and a data value. A Device is attached to a single Entity, but an Entity can have multiple Devices.
Entities -- are anything else: factories, production lines, equipment, aircraft, buses, ovens, drilling rigs... anything. Entities are organized into hierarchies, where each Entity may have at most one parent but multiple children.
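As a minimal sketch of this relationship, the following Java models the hierarchy rules described above: each Entity has at most one parent and any number of children, and each Device is attached to exactly one Entity. The class names are illustrative assumptions, not the Accelerator's actual types.

```java
import java.util.ArrayList;
import java.util.List;

public class Hierarchy {
    static class Entity {
        final String name;
        Entity parent;                                  // at most one parent
        final List<Entity> children = new ArrayList<>();
        final List<Device> devices = new ArrayList<>();

        Entity(String name) { this.name = name; }

        Entity addChild(Entity child) {
            child.parent = this;
            children.add(child);
            return child;
        }

        Device attach(Device device) {
            device.entity = this;                       // a Device belongs to a single Entity
            devices.add(device);
            return device;
        }
    }

    static class Device {
        final String id;
        Entity entity;
        Device(String id) { this.id = id; }
    }

    public static void main(String[] args) {
        Entity plant = new Entity("Power Plant");
        Entity line = plant.addChild(new Entity("Generation Line 1"));
        line.attach(new Device("temp-sensor-001"));
        System.out.println(line.devices.get(0).id + " -> " + line.name
                + " -> " + line.parent.name);
    }
}
```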
To help with configuration, the Accelerator separates configured objects into Templates and Instances.
Instances -- are physical examples of Devices or Entities, equivalent to an object instance in object-oriented programming. Each Instance is linked to a single Template, has a physical location, and has a unique identifier such as a serial number.
Templates -- define the common properties for all Instances of a given Template, equivalent to a class in object-oriented programming; a Template may also be called a type. A Template has no physical location or unique identifier like a serial number (though it may have a unique model number).
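The Template/Instance split can be pictured with a small Java sketch. The record names and properties below are assumptions for illustration, not the Accelerator's actual configuration objects: the Template carries the properties shared by every Instance, while each Instance adds a serial number and a location.

```java
public class Templating {
    // Common properties shared by all Instances of this Template (a "class")
    record DeviceTemplate(String modelNumber, String unit,
                          double minValid, double maxValid) {}

    // A physical example of the Template (an "object instance"), with a
    // unique serial number and a physical location
    record DeviceInstance(DeviceTemplate template, String serialNumber,
                          double latitude, double longitude) {}

    public static void main(String[] args) {
        DeviceTemplate espTempSensor =
                new DeviceTemplate("ESP-T100", "degC", -40.0, 150.0);

        // Many instances can share the single template definition
        DeviceInstance pump7Sensor =
                new DeviceInstance(espTempSensor, "SN-000123", 29.76, -95.37);

        System.out.println(pump7Sensor.serialNumber()
                + " is a " + pump7Sensor.template().modelNumber());
    }
}
```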
Since Devices often send only a single data point at a time, it is useful to aggregate these points into virtual rows of data for processing.
Features -- are linked to a single Device Instance, or to a Device Template associated with an Entity Template. Each Feature represents a single value in a virtual row.
Feature Sets -- are logical groupings of Features into a single virtual row. This virtual row can then be passed to rules and data science models to evaluate multivariate conditions and states.
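To make the virtual-row idea concrete, here is a hypothetical Java sketch that accumulates single-value readings into the row a Feature Set represents. The class names and the simple "complete once every Feature has reported" logic are assumptions for illustration.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class VirtualRow {
    record Reading(String featureId, long timestamp, double value) {}

    static class FeatureSet {
        final List<String> features;                    // one Feature per column
        final Map<String, Double> row = new LinkedHashMap<>();

        FeatureSet(List<String> features) { this.features = features; }

        // Keep the latest value per Feature; report true once every
        // Feature in the set has contributed at least one value.
        boolean accept(Reading r) {
            if (features.contains(r.featureId())) row.put(r.featureId(), r.value());
            return row.size() == features.size();
        }
    }

    public static void main(String[] args) {
        FeatureSet pumpState = new FeatureSet(List.of("intake-pressure", "motor-temp"));
        pumpState.accept(new Reading("intake-pressure", 1000L, 212.4));
        boolean complete = pumpState.accept(new Reading("motor-temp", 1001L, 87.9));
        if (complete) System.out.println("virtual row ready: " + pumpState.row);
    }
}
```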
The configuration model looks like this:
In addition, users can configure Modules, which link to physical EventFlow application modules implementing specific business rules or interfaces. These may be implemented as Validation Modules, Cleansing Modules, or Rule Modules. Modules are then linked to Devices, Device Templates, and Feature Sets so that they are invoked during the processing of data from those sources.
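One way to picture the three module roles is as plain interfaces applied in sequence to a virtual row. In the Accelerator these are EventFlow modules, so the Java below is only a conceptual sketch with assumed names.

```java
import java.util.Map;
import java.util.Optional;

public class Modules {
    interface ValidationModule {                        // reject bad data
        boolean isValid(Map<String, Double> row);
    }
    interface CleansingModule {                         // repair or normalize values
        Map<String, Double> cleanse(Map<String, Double> row);
    }
    interface RuleModule {                              // optionally raise an alert
        Optional<String> evaluate(Map<String, Double> row);
    }

    public static void main(String[] args) {
        ValidationModule inRange = row -> row.values().stream().allMatch(v -> v >= 0);
        CleansingModule identity = row -> row;
        RuleModule hotMotor = row ->
                row.getOrDefault("motor-temp", 0.0) > 120.0
                        ? Optional.of("Motor over temperature") : Optional.empty();

        Map<String, Double> row = Map.of("intake-pressure", 212.4, "motor-temp", 131.0);
        if (inRange.isValid(row)) {
            hotMotor.evaluate(identity.cleanse(row)).ifPresent(System.out::println);
        }
    }
}
```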
The Accelerator captures data feeds from external systems as reports.
Alert -- an alert condition reported by an external system.
Reading -- a device reading consisting of a triplet: unique identifier, reading date and time, and value.
Status -- a condition status for a given Device or Entity; for example, a production line or pump operating status.
Part -- a report of a part being produced, used in operational metrics.
Position -- a physical location for a given Entity that may change over time.
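These inbound report shapes can be pictured as simple records, as in the Java sketch below. The field names are illustrative assumptions; the Accelerator's canonical schemas may differ.

```java
import java.time.Instant;

public class Reports {
    record Alert(String sourceId, Instant time, String message) {}
    record Reading(String deviceId, Instant time, double value) {}  // the triplet
    record Status(String id, Instant time, String state) {}
    record Part(String lineId, Instant time, String partId) {}
    record Position(String entityId, Instant time, double lat, double lon) {}

    public static void main(String[] args) {
        System.out.println(new Reading("temp-sensor-001", Instant.now(), 87.9));
    }
}
```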
After processing the inbound reports, the Accelerator produces external actions.
Alerts -- similar to Alert reports; indicate an alert condition on a Device or Entity.
Status -- similar to Status reports; indicates that the status of a given Device or Entity has changed.
Readings -- certain rules may produce additional Feature values as part of the rule execution. For example, an autoencoder rule may generate a cluster number and a reconstruction error. These Feature values are produced as new Readings.
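The pattern of a rule emitting derived Feature values as new Readings might look like the following sketch. The model outputs are stand-in constants; only the emission pattern, not real scoring logic, is shown.

```java
import java.time.Instant;
import java.util.List;
import java.util.Map;

public class DerivedReadings {
    record Reading(String featureId, Instant time, double value) {}

    // A rule scores the virtual row and emits its outputs as new Readings,
    // the way an autoencoder rule might emit a cluster number and a
    // reconstruction error. The values here are faked placeholders.
    static List<Reading> scoreRow(Map<String, Double> row) {
        Instant now = Instant.now();
        double clusterNumber = 2.0;
        double reconstructionError = 0.042;
        return List.of(
                new Reading("line1-cluster", now, clusterNumber),
                new Reading("line1-recon-error", now, reconstructionError));
    }

    public static void main(String[] args) {
        scoreRow(Map.of("motor-temp", 87.9, "intake-pressure", 212.4))
                .forEach(System.out::println);
    }
}
```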
The dynamic data model looks like this:
Benefits and Business Value
Most modern equipment is instrumented with some sort of sensor. The streaming data from these sensors, combined with context information from various systems, provides a complete real-time view of all operations, making it possible to rapidly resolve current issues and intervene to address preventable problems before they occur.
Technical Scenario
The Accelerator provides a generic data model for building entity and device hierarchies with a configuration interface. The included demos capture sensor data from a number of devices installed on equipment in their respective environments. These demo scenarios are:
- Production oilfield with a series of wells using electric submersible pumps (ESPs); the Accelerator captures telemetry, attempts to identify a failure pattern, and alerts when a failure looks likely.
- Heavy equipment monitoring of engine signals for preventative maintenance.
- Power plant where the overall state of the generation lines is computed using both an R model with a K-means clustering algorithm and an H2O model with an autoencoder algorithm.
- Servers demonstrating monitoring of an IT infrastructure hierarchy at the infrastructure, platform, and service levels.
- Widgets demonstrating operational analytics that monitor the production of parts from various factories and production lines.
The Accelerator is based around a single TIBCO Streaming engine called the Event Manager. This engine receives a defined set of reports from multiple sources, either through directly enqueued stream data or through a JMS receiver. In the demos, the Simulator connects to the Event Manager through the internal messaging bus. In a real implementation, integrating the data sources is always a project in itself and will likely require developing adapters and ingress EventFlow to transform the data into the Accelerator's canonical formats.
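As an example of the kind of ingress transformation such a project involves, here is a hypothetical Java sketch that maps a CSV payload from a source system into a canonical Reading triplet. The CSV layout and class names are assumptions; a real adapter would then enqueue the result via JMS or a client API.

```java
import java.time.Instant;

public class IngressAdapter {
    record Reading(String deviceId, Instant time, double value) {}

    // Assumed source layout: "2024-01-15T10:00:00Z,pump-07/motor-temp,87.9"
    static Reading toCanonical(String csvLine) {
        String[] fields = csvLine.split(",");
        return new Reading(fields[1], Instant.parse(fields[0]),
                Double.parseDouble(fields[2]));
    }

    public static void main(String[] args) {
        Reading r = toCanonical("2024-01-15T10:00:00Z,pump-07/motor-temp,87.9");
        System.out.println(r);  // would be enqueued to the Event Manager in practice
    }
}
```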
As device readings flow through the Event Manager, they pass through several stages of processing: validation to ensure the data is correct, cleansing, business rules, summarization, and statistics calculation. The results are pushed to Live Datamart as appropriate, where the contents can be viewed through a fully custom HTML5 application as well as through Spotfire.
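The summarization and statistics stage can be pictured as running aggregates kept per device, as in this illustrative sketch; the Accelerator itself implements these stages in EventFlow and publishes results to Live Datamart tables.

```java
import java.util.DoubleSummaryStatistics;
import java.util.HashMap;
import java.util.Map;

public class RunningStats {
    final Map<String, DoubleSummaryStatistics> byDevice = new HashMap<>();

    // Fold each reading into the running count/min/max/average for its device
    void accept(String deviceId, double value) {
        byDevice.computeIfAbsent(deviceId, k -> new DoubleSummaryStatistics())
                .accept(value);
    }

    public static void main(String[] args) {
        RunningStats stats = new RunningStats();
        stats.accept("motor-temp", 87.9);
        stats.accept("motor-temp", 92.3);
        DoubleSummaryStatistics s = stats.byDevice.get("motor-temp");
        System.out.printf("count=%d min=%.1f max=%.1f avg=%.1f%n",
                s.getCount(), s.getMin(), s.getMax(), s.getAverage());
    }
}
```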