IT, OT, and Kaizen

The software and hardware systems used in manufacturing fall under Information Technology (IT) if they interact only with humans, and under Operational Technology (OT) if they also interact with machines and facilities. Industry 4.0 is mostly about OT, but manufacturing has traditionally focused more on IT.

IT produces reports on delivery performance; OT issues alarms when a gas pipe springs a leak. The distinction is sharp between extreme cases but blurry where the two meet. In principle, all the systems should form a functional stack, with each layer activating and exchanging data with the layer below. The top layer supports management decisions, and the bottom layer interacts with operators and machines. In reality, it often does not work as it should. The key to making it work is continuous improvement/Kaizen, with technology retrofits, rather than a radical “digital transformation.”

The IT/OT Structure

In practice, the separation between IT and OT is in the organization’s structure. Factories and manufacturing companies usually have IT departments responsible not only for IT functions but also for infrastructure consisting strictly of system software and hardware.

The IT department looks after networks, servers, routers, and security. It provisions users with equipment and access credentials, none of which has to do with the applications’ ability to serve the business.

OT, on the other hand, is usually in the hands of the engineers who design and update production lines.

The following stack shows the content of the different layers and, on the left side, the organization, with the uncomfortable overlap between the purviews of IT and OT. The engineers complain that the IT department does not understand the needs of OT, and IT complains that it cannot support the “nonstandard” tools the engineers want to use.

[Figure: the IT/OT functional stack, with the organizational overlap between IT and OT]

Managerial Versus Technical Issues

Technically, the separation between IT and OT should not exist. The flow of data and information between the layers should be seamless. It has never been seamless, and it is not today, despite the technological advances of the past 30 years.

Thirty years ago, the excuse was that the VAX machines from DEC used in OT “could not talk” to the AS/400 machines from IBM used in IT. They had different operating systems and even encoded characters into bits differently (EBCDIC on the AS/400, ASCII on the VAX). So “obviously,” the only way to exchange data was to have a human operator read off one system’s screen and type into the other.

They didn’t want a bridge.

While commonly believed among manufacturing professionals, this was not true. You could buy a box, plug a VAX into one end and an AS/400 into the other, and have data flow freely back and forth. The truth was that the people on the OT side of the chasm didn’t want to share their data with managers on the IT side. The alleged technical incompatibility was an excuse to make sure no one built the bridge.

Thirty years later, the technology excuse is even less credible than it was then, but the question remains of whether the natives want the bridge.

IT and OT Integration

The IT and OT systems already in place in a manufacturing organization are its legacy systems. In planning for IT and OT in manufacturing, many managers believe in digital transformation: replacing their legacy systems with a standard stack of new hardware and software for all factories. This approach had been sold to generation after generation of managers for decades before “Industry 4.0” was coined, and it has regularly failed to produce the desired results.

Continuous improvement/Kaizen, applied to legacy systems, yields better results faster. It does, however, require the organization to learn more powerful software tools.

Issues with System Replacement

Not only does the complete replacement of legacy systems generate conversion costs and delays, but the resulting systems merely replicate the functions of the legacy systems with new technology. They usually fail to meet the needs of operations, which continue to rely on workarounds. This is a common pattern worldwide, observed even in companies that are otherwise sophisticated in the use of IT and OT in their products or services. We encountered several examples during the Van-of-Nerds tour:

  • Shipment owners. The company makes expensive, high-technology products in small series, and each customer relationship has a manager who needs to be kept informed of the status of every shipment. The ERP system is supposed to excel at commercial transaction processing, but it either has no place to attach a manager to a shipment, or the implementers of the system don’t know where it is.

    As a result, Shipping maintains the data about shipments and their managers manually on a whiteboard. The shipment managers are part of the company’s business model, and it’s not the place of ERP suppliers to decide whether to support them.

  • Paper travelers. The company makes state-of-the-art aircraft components with equally advanced processes, but the progress of each unit through Assembly is tracked with a paper traveler that carries manually entered measurements, timestamps, and signatures, like it’s 1970. Once the product is shipped, the filled-out traveler is archived on paper for decades, in case it is needed for traceability, as mandated by aviation authorities.

    Manual data entry is error-prone, and retrieving and analyzing documents after a plane crash is slower on paper than in electronic form. Manufacturing Execution Systems (MES) were developed 40 years ago to eliminate paper travelers, and data lakes are databases designed to hold documents that are unlikely ever to be retrieved but must be kept around in case of need.

Example: Replacing a 25-Year-Old MES

Assume a production line that has been using an MES for 25 years. Its operators are trained to process shop floor transactions and view status information with it. The system has its limitations, but Production is used to working around them, and it is embedded in daily operations. The team can’t envision running the line without it.

This MES, however, is obsolete, and you want to replace it with the state of the art. This is major surgery on your production system, involving the following steps:

  1. Evaluate, select, and acquire a new system, and likely a system integrator to help implement it. 
  2. Install the new system on a separate server, to set it up without disrupting operations.
  3. Extract all the master data from the old system, and convert it to the new one. This usually requires contractors familiar with the new system.
  4. Extract and load the status and history data from the old system.
  5. Train the production organization in the use of the new system.
  6. Run the old and the new systems in parallel to validate the new system in terms of both function and responsiveness.
  7. Turn off the old system, delete all company-private information, and dispose of it. 

It takes so much time and so many resources that, for the better part of a year, you will not be able to do anything with this line other than implement the new MES. Once done, you have a new system performing the same functions as the old one; if you want it to do more, this is only your starting point. It’s worth pondering the alternative of improving the old system through retrofits.

Focus on the Data

If operations on the shop floor require workarounds to the official system, this system fails to provide actionable data. Industry 4.0 purportedly “integrates processes vertically, across the entire organization,” but we did not see much of this. We mostly saw clever local initiatives. 

To avoid the dysfunctions described above, manufacturing professionals need to think about the data, particularly where data meet materials, and not assume that owning any particular vendor’s product is a solution.

To illustrate this issue, assume you have standardized the company on the same stack of ERP, MES, SCADA, and machine control systems but have made no effort to define a shared data model. Then three factories performing the same coating operation may call it, respectively, “CT1,” “Coat1,” and “Coating-1.”

Using the same software does nothing to help you match technical data on this operation across the company. Multiply this by thousands of operations in hundreds of processes, and you have a Tower of Babel, where the multiple dialects of different sites impede the joint use of technical data in troubleshooting and improvement. A shared data model, on the other hand, would enable this sharing, even if the factories used different systems.
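
As a minimal sketch of the idea, assuming made-up site names, local operation codes, and a canonical identifier, the following Python fragment maps each site’s local code to a single company-wide operation ID so that records from the three factories can be pooled:

    # Hypothetical shared data model: each site's local operation code maps
    # to one canonical operation ID defined once for the whole company.
    OPERATION_ALIASES = {
        ("FactoryA", "CT1"): "COATING-01",
        ("FactoryB", "Coat1"): "COATING-01",
        ("FactoryC", "Coating-1"): "COATING-01",
    }

    def canonical_operation(site: str, local_code: str) -> str:
        """Translate a site-specific operation code into the shared data model."""
        try:
            return OPERATION_ALIASES[(site, local_code)]
        except KeyError:
            raise ValueError(f"No canonical ID for {local_code!r} at {site}")

    # With the mapping in place, records from different factories can be pooled:
    records = [
        {"site": "FactoryA", "operation": "CT1", "thickness_um": 12.1},
        {"site": "FactoryB", "operation": "Coat1", "thickness_um": 11.8},
    ]
    for r in records:
        r["operation_id"] = canonical_operation(r["site"], r["operation"])

The specific table does not matter; what matters is the agreement on canonical identifiers, which lets each site keep its local names while the company pools the technical data.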

Connecting OT and IT

Replacing legacy systems in existing factories is not the way to resolve the disconnect between bottom-up OT and top-down IT. Retrofit solutions exist to make different systems play together.

What they have in common is a focus on the data rather than the systems, and a spirit of incremental improvement. It matches what production does with materials and processes.

Middleware

A key concept is middleware to facilitate data exchanges. In the publish-and-subscribe model, each system publishes messages identified by subject, to which other systems subscribe as needed. The messages are self-describing in terms of a shared data model. This enables subscribers to decode them and respond to their content.
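
The sketch below is a toy, in-process publish-and-subscribe broker in Python, intended only to illustrate the pattern; in a real plant, the role is played by messaging middleware such as an MQTT or AMQP broker, and the subject name and message fields here are assumptions:

    import json
    from collections import defaultdict

    class Broker:
        """Toy publish-and-subscribe broker: systems register callbacks by subject."""
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, subject: str, callback) -> None:
            self._subscribers[subject].append(callback)

        def publish(self, subject: str, payload: dict) -> None:
            # Messages are self-describing: the subject and field names come
            # from the shared data model, not from any one system's internals.
            message = json.dumps({"subject": subject, "data": payload})
            for callback in self._subscribers[subject]:
                callback(message)

    broker = Broker()
    # A hypothetical MES subscribes to machine status messages published by SCADA.
    broker.subscribe("machine/status", lambda msg: print("MES received:", msg))
    broker.publish("machine/status", {"machine_id": "CNC-07", "state": "down"})

Replacing point-to-point interfaces between specific systems with subjects and self-describing messages is what allows new subscribers to be added without touching the publishers.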

Data Stores

You retain data in multiple types of databases:

  • Master data. You store, retrieve, update, and control revisions of the data model with Master Data Management (MDM), and you keep this data in a document database.
  • Status and future. You keep data about the current state and schedules of future events in an in-memory database.
  • Structured historical data. You keep historical data that you are likely to use in forecasting or problem-solving in a data warehouse. You periodically update the data warehouse from all the legacy systems through Extract, Transform, and Load (ETL) tools.
  • Unstructured historical data. You keep historical data that you are unlikely to need but must retain in a data lake. There, it stays in its original form, with just enough administrative data attached to retrieve it.

Some suppliers combine the data warehouse and data lake functions in data lakehouses.
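
To make the data warehouse side concrete, here is a minimal ETL sketch in Python; SQLite stands in for the warehouse, the extracted rows and the canonical-name mapping are made up, and a real implementation would pull from the legacy systems’ actual interfaces:

    import sqlite3

    # Extract: rows as they might come out of a hypothetical legacy MES export.
    extracted = [
        {"site": "FactoryA", "operation": "CT1", "thickness": "12.1"},
        {"site": "FactoryB", "operation": "Coat1", "thickness": "11.8"},
    ]

    # Transform: apply the shared data model before loading.
    CANONICAL = {"CT1": "COATING-01", "Coat1": "COATING-01", "Coating-1": "COATING-01"}

    # Load: SQLite stands in for the data warehouse.
    warehouse = sqlite3.connect("warehouse.db")
    warehouse.execute(
        "CREATE TABLE IF NOT EXISTS coating_history "
        "(site TEXT, operation_id TEXT, thickness_um REAL)"
    )
    for row in extracted:
        warehouse.execute(
            "INSERT INTO coating_history VALUES (?, ?, ?)",
            (row["site"], CANONICAL[row["operation"]], float(row["thickness"])),
        )
    warehouse.commit()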

Current Use of These Tools

Some of our hosts brought up publish-and-subscribe and data lakes. Sometimes, however, they did not seem to understand what these tools do. For example, we heard “We need to structure the data to put it into the data lake.” No, you don’t! The intent of data lakes is specifically to free you from the need to do this. The ETL of a data warehouse structures the data upfront; in a data lake, you only need to do it when and if you retrieve the data.
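
A minimal sketch of the difference, assuming a hypothetical scanned traveler: the document goes into the lake as-is, with just enough administrative data attached to find it again, and any structuring is deferred to retrieval:

    import json
    import shutil
    from datetime import date
    from pathlib import Path

    LAKE = Path("data_lake")
    LAKE.mkdir(exist_ok=True)

    def store_raw(source_file: str, product_serial: str) -> None:
        """Drop a document into the lake unchanged, with minimal admin metadata."""
        destination = LAKE / Path(source_file).name
        shutil.copy(source_file, destination)       # no parsing, no schema
        metadata = {
            "file": destination.name,
            "product_serial": product_serial,       # enough to find it again
            "archived_on": date.today().isoformat(),
        }
        (LAKE / (destination.name + ".meta.json")).write_text(json.dumps(metadata))

    # Structure is imposed only at retrieval, if a traceability request ever comes:
    # locate the file by serial number via the metadata, then read its content.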

Improvement

These retrofits let a manufacturing organization continuously improve the use of its data without the trauma of uprooting legacy systems. New systems go into greenfield sites, where there is nothing to uproot, and improvements made to the legacy systems inform the selection and implementation of these new systems.

Conclusions

The approach described above is Kaizen for IT and OT. Like Kaizen applied to production lines, it not only improves performance but also develops skills that enhance the organization’s ability to select effective software and hardware.

#it, #ot, #kaizen, #legacysystems