There is a lesson that manufacturing leaders seem determined to learn the hard way: flooding factories with new technology does not improve their performance.
Roger Smith learned it at GM in the 1980s. Elon Musk, for all his other achievements, admitted by tweet to making the same mistake at Tesla last year.
To really improve manufacturing performance, you start, as Crispin Vincenti-Brown put it, with “what happens when the guy picks up the wrench.” You work with that person to make the work easier, faster, safer, and less prone to deviations and errors. In doing this, you apply, as needed, technology you can afford and that operators can work with.
This is hard work but it pays off. It is a key lesson learned from Toyota, TPS, and many companies that implemented it under the “Lean” label. But it’s an eat-your-vegetables message. The lure of a technological shortcut is irresistible.
The latest version to hog the limelight is Industry 4.0, a German government program started in 2011. Within eight years, Industry 4.0 has succeeded in redirecting the conversation about manufacturing away from the now-stale Lean and into a pure technology play.
Meaning of Industry 4.0
The German word “Industrie” is not usually applied to activities other than manufacturing. In English, you talk about “the banking industry”; in German, you don’t. It’s called the “Bankbranche.” “Industrie 4.0” is more specific than “Industry 4.0.”
Beyond that, however, “4.0” is not descriptive. It’s not clear that manufacturing has neatly defined revision numbers at all, let alone the ones in the Industry 4.0 literature.
“Digital Transformation”
In its Spring 2019 special edition, the MIT Sloan Management Review avoids the “Industry 4.0” label, discussing instead “digital transformation.” But the digital transformation of manufacturing occurred decades ago when CNC machines replaced copy milling machines and calculators replaced slide rules.
Key Technologies
“Digital transformation” is not descriptive of what is happening today. Fran Yáñez described the content of the Industry 4.0 “revolution” in an unstructured list of “key technologies.” We present it below with more structure, as a stack, with the addition, in blue, of items that strike us as missing from Yáñez’s list. The items in the stack need the items below and are needed by the items above.
Industry 4.0 Key Technologies, and More
These items are to manufacturing as instruments are to making music: possession does not equate to proficiency. A factory can have all of them and still be unable to compete.
Some of the items, in fact, require substantial knowledge and skills, so much so that it is difficult to envision how any individual could master the whole list, from advanced analytics to cybersecurity.
What makes the list less daunting is that many of the items in it are not new. MES, CMMS, and SCADA systems, for example, have existed for decades.
Advanced analytics have been used in a few high-technology industries, like semiconductors, but most manufacturing managers don’t use anything beyond pie charts and stacked bars.
Strategy Considerations
We will return to Yáñez’s list in Part 2. First, we discuss a general approach to the successful deployment of this kind of technology in manufacturing.
Industry 4.0, IT, and Process Control
For lack of a better term, I like to use Information Technology (IT) as an umbrella term for all the software supporting manufacturing, except for the elements that directly drive physical changes, which is the realm of process control.
Within IT, apps interact with human users and with other apps; process controllers, in addition, take input from sensors and issue output to actuators. The bundle marketed as Industry 4.0 spans both IT and process control.
Consider the following examples:
Telling an AGV to move from A to B is an IT function. Production control identifies this move as necessary and gives the order to do it. In response, the computer embedded in the AGV plots a path from A to B and drives the AGV while dodging fixed and moving obstacles. This is the job of a controller.
Failure analysis on a product unit after final test is an IT function. Making a machine perform a sequence of steps while keeping its vital signs within specified limits and taking in-situ measurements is a process control function.
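The AGV example can be sketched in a few lines of Python. This is illustrative only: the function names, the grid world, and the breadth-first search standing in for a real path planner are assumptions for the sketch, not anything an actual AGV vendor ships.

```python
from collections import deque

def issue_move_order(agv_id, origin, destination):
    """IT-side function: production control decides a move is needed
    and issues the order; it knows nothing about obstacles."""
    return {"agv": agv_id, "from": origin, "to": destination}

def plan_path(grid, start, goal):
    """Controller-side function: plot a collision-free path.

    grid: 2D list, 0 = free cell, 1 = obstacle.
    Breadth-first search stands in for the AGV's real planner.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (r, c), path = frontier.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                frontier.append(((nr, nc), path + [(nr, nc)]))
    return None  # no path: report back to the IT layer

order = issue_move_order("AGV-7", (0, 0), (2, 2))
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = plan_path(grid, order["from"], order["to"])
```

The point of the split is visible even at toy scale: the IT function is a business decision expressed as a message, while the controller deals with the physical world, cell by cell.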
The line between IT and process control is blurry, but there are several reasons to draw it:
The stakes in getting the systems to work right are different. In IT, errors may lead to bad business decisions; in process control, they may destroy products and equipment, and even injure people.
The technical skills required for IT and process control are different.
IT is managed by a specialized department while control is within the purview of manufacturing or production engineering.
Dysfunctions of Manufacturing IT
The key reason given for IT to be managed centrally is that the IT apps share an infrastructure of networks, servers, workstations of different types, and a set of approved and supported software tools.
Legacies of the Mainframe Age
To a large extent, however, this organization structure is a legacy of the mainframe age. All the data processing was done in a single computer for the whole plant, interacting with dumb terminals keystroke by keystroke.
Innovation since the mainframe era has enabled the various departments to go their own way and acquire their own systems. IT departments have been resisting this evolution, leading to the paradox that employees of major corporations often have more modern tools in their private lives than at work.
Two Leaders’ Perspectives
In a 2019 interview, retired GE CEO Jeffrey Immelt said:
Jeffrey Immelt
“Harsh though it may sound, the IT functions in manufacturing companies aren’t staffed by digital technologists. IT engineers buy hardware, outsource software development, and excel at managing projects and customizing vendor-developed software to improve operational efficiency. Reimagining products and services with proprietary software for customers requires very different capabilities.”
In a 1947 lecture (p. 12), Alan Turing warned about the response to continuing innovation by an IT profession that didn’t exist yet and therefore could not be offended by harsher words than Immelt’s. He said:
Alan Turing
“They may be unwilling to let their jobs be stolen from them in this way. In that case they would surround the whole of their work with mystery and make excuses, couched in well chosen gibberish, whenever any dangerous suggestions were made.”
More than 70 years later, engineers and managers in Production, Materials Management, Maintenance, or Quality may recognize in Turing’s words their experience with corporate IT.
Whenever, in operations, they try to do anything innovative with data, the IT department is in the way, blocking instead of supporting them. Second only to Accounting, corporate IT is perceived as the main obstacle to improvement.
The Perspective of Corporate IT
From the perspective of corporate IT, on the other hand, there are reasons to try to prevent a free-for-all. If every department is allowed its own information system, the plant and the company become a tower of Babel, where data generated in multiple systems cannot be cross-referenced. As of 2019, however, IT departments have been unsuccessful in this pursuit.
The Dysfunctional Top-Down Strategy
ERP systems, supposedly all-in-one solutions for manufacturing IT, have been the main applications supported by IT departments. But they have turned out to be insufficient to serve the needs of all the support functions.
ERP
The original sin of MRP/Closed-Loop MRP/MRP-II/ERP is a top-down approach that starts from planning at the top level and cascades down. The cascading down never worked because the models used at the top were simplistic and did not reflect the reality of the shop floor.
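The explosion step at the heart of that cascade can be sketched in a few lines of Python. This is a deliberately naive illustration: the bill of materials and quantities are invented, and real MRP adds inventory netting, lot sizing, and time phasing on top of it.

```python
# Hypothetical bill of materials: parent -> list of (component, qty per parent).
bom = {
    "car":    [("engine", 1), ("wheel", 4)],
    "engine": [("piston", 4)],
}

def explode(item, qty, requirements):
    """Cascade a top-level quantity down through the BOM,
    accumulating gross requirements per component."""
    for component, per_parent in bom.get(item, []):
        requirements[component] = requirements.get(component, 0) + qty * per_parent
        explode(component, qty * per_parent, requirements)
    return requirements

# Master schedule says: build 100 cars.
gross = explode("car", 100, {})
```

The sketch also shows why the output is only as good as the model at the top: a flat table of fixed coefficients knows nothing about scrap, setups, breakdowns, or anything else that happens on the shop floor.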
TPS
This is the opposite of the approach followed in TPS/Lean, that starts with the details of tasks at the control level where instructions directly translate into actions on machines and materials, and builds higher-level functions on this foundation.
ERP Workarounds
Where ERP is used, workarounds proliferate. They are specialized systems or internally generated Excel spreadsheets that, in addition to being security hazards, have created the very towers of Babel central IT wanted to avoid.
In his famous management audit of the Tower of Babel, Frederick Brooks pointed out that the project did not fail because its goal was impossible or for lack of resources but because participants couldn’t communicate.
Focus on the Information Content
In 2019, restricting the hardware, system software and apps used in a manufacturing organization to a standard set defined and maintained by IT is neither necessary nor sufficient to ensure that the participants communicate. It is ineffective.
Don’t Standardize What Doesn’t Need To Be
It doesn’t matter whether individual computers run Windows, Linux, OSX, iOS, or Android, any more than it matters which brand of ballpoint pen employees use. What does matter is that the devices should all be able to exchange messages and read from and write to repositories of shared data, while protecting proprietary information.
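A platform-neutral message format is what makes the operating system irrelevant. The sketch below uses JSON from Python’s standard library; the field names and the “equipment status” record are assumptions for illustration, not a real plant schema.

```python
import json

# A shared, versioned message format: any OS, any language can
# produce and consume this. Field names are invented for the example.
message = {
    "schema_version": "1.2",
    "sender": "production",
    "record_type": "equipment_status",
    "payload": {
        "machine": "M-01",
        "status": "down",
        "since": "2019-04-01T06:30:00",
    },
}

wire = json.dumps(message)   # serialized on one device...
received = json.loads(wire)  # ...parsed on any other, unchanged
```

What the organization has to standardize is the content of `message`, not the brand of computer that emits it.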
A Human Challenge
It is a human challenge, not a technical challenge. Individuals who share data — like production operators and materials handlers — must use the same formats, relationships, and versions. Departments that share data with other departments — like Production and Maintenance with equipment status — must do the same for what they share, etc.
Departments have reasons to group products in different ways. For Engineering, it may be by process and feature similarity; for Production, by volume and stability; in Sales, by market segment; in Accounting, by revenue or profit.
The different groupings, however, must be properly cross-referenced for the whole organization, so that you can retrieve the data about products made with a process P, at volume ≥V, for 16 to 24-year old females, with a value added ≥25%… The IT of most manufacturing plants today is incapable of producing this kind of information on the fly.
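What such cross-referencing takes is not exotic. Below is a minimal SQLite sketch, with table names, groupings, and thresholds invented for the example: each department keeps its own grouping, but all of them share the product key, so one query can join them on the fly.

```python
import sqlite3

# Each department's grouping lives in its own table, cross-referenced
# through the shared product_id key. Schema and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE products       (product_id TEXT PRIMARY KEY, name TEXT);
CREATE TABLE eng_grouping   (product_id TEXT, process TEXT);
CREATE TABLE prod_grouping  (product_id TEXT, annual_volume INTEGER);
CREATE TABLE sales_grouping (product_id TEXT, market_segment TEXT);
CREATE TABLE acct_grouping  (product_id TEXT, value_added_pct REAL);
""")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [("A1", "widget"), ("A2", "gadget")])
conn.executemany("INSERT INTO eng_grouping VALUES (?, ?)",
                 [("A1", "P"), ("A2", "Q")])
conn.executemany("INSERT INTO prod_grouping VALUES (?, ?)",
                 [("A1", 120000), ("A2", 500)])
conn.executemany("INSERT INTO sales_grouping VALUES (?, ?)",
                 [("A1", "16-24 female"), ("A2", "industrial")])
conn.executemany("INSERT INTO acct_grouping VALUES (?, ?)",
                 [("A1", 31.0), ("A2", 12.0)])

# On-the-fly query across all four departmental groupings:
# process P, volume >= 100,000, 16-24 female segment, value added >= 25%.
rows = conn.execute("""
SELECT p.product_id
FROM products p
JOIN eng_grouping   e USING (product_id)
JOIN prod_grouping  v USING (product_id)
JOIN sales_grouping s USING (product_id)
JOIN acct_grouping  a USING (product_id)
WHERE e.process = 'P' AND v.annual_volume >= 100000
  AND s.market_segment = '16-24 female' AND a.value_added_pct >= 25
""").fetchall()
```

The hard part in practice is not the query; it is getting the departments to maintain the shared key and agree on what the fields mean.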
Improving Existing Activities Versus Inventing New Ones
Every new generation of information technology has created many more opportunities than the manufacturing industry has been able to use; it lags, in particular, behind financial services. It should be no surprise to see this happen with the latest generation.
According to Immelt, “Many CEOs miss the fact that a digital transformation isn’t the same as the digitalization of an existing business.”
Referring to the IoT or machine learning as “digital transformation” is like calling a car a “horseless carriage.” While using quaint, archaic language, however, Immelt is warning against using the new technology just to improve the efficiency of current practices and advocates taking it as an opportunity to change strategy. This is what Marshall McLuhan called the horseless-carriage syndrome.
Immelt points out that the “digital transformation” can “alter what a manufacturing company sells” but this is true of any innovation. One year into an automotive client’s Lean implementation, working with the management of the plant to take stock of what had been accomplished, I was surprised to find that a Sales manager was the most enthusiastic.
In this business, you start by providing sample quantities of about 50 units of a product a customer wants. Happy customers follow up with orders in the hundreds of thousands per year for the next several years.
“Before Lean,” the manager said, “production was always too busy to bother with samples. Now, when I ask for a sample, they are somehow able to work it into the schedule. Orders for my line of products have doubled in the past 12 months.”
They had changed line designs and production control methods but the changes had not involved any IT or automatic control system.
A Bottom-Up Approach
If the top-down, ERP-based approach does not work, what is the bottom-up alternative?
Don’t Wait For The New System
A common, mistaken belief is that there is no point in trying to improve the performance of legacy systems: a new all-in-one system, to be implemented two years from now, will solve all the problems, and the supplier’s consultants will work with you to reengineer your business model to fit the system’s capabilities… This is the next iteration of the top-down strategy and not likely to deliver on its promises.
Apply Continuous Improvement to IT and Process Control
Instead, the whole range of IT and process control should be the object of continuous improvement. Many of the “Industry 4.0 enablers” can be retrofitted to legacy systems, yielding benefits in months rather than years, at a much lower cost than replacing the legacy systems.
In fact, some of the enablers, like Data Warehouses, are specifically intended to wrangle value out of the data in legacy systems.
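The warehouse step is essentially a mapping from each legacy layout onto one shared schema. Below is a minimal ETL sketch, with record layouts and field names invented for the example (not any real MES or CMMS product): two legacy systems describe the same machine under different names, and the warehouse normalizes them into one fact table.

```python
# Hypothetical legacy extracts: same machine, different field names.
legacy_mes = [
    {"MACH": "M-01", "DWNTIME_MIN": 42, "DT": "2019-04-01"},
]
legacy_cmms = [
    {"equipment": "M-01", "repair_minutes": 35, "date": "2019-04-02"},
]

def to_fact(record):
    """Map either legacy layout onto the shared warehouse schema."""
    if "MACH" in record:  # MES layout
        return {"machine": record["MACH"],
                "minutes": record["DWNTIME_MIN"],
                "date": record["DT"],
                "source": "MES"}
    return {"machine": record["equipment"],  # CMMS layout
            "minutes": record["repair_minutes"],
            "date": record["date"],
            "source": "CMMS"}

warehouse = [to_fact(r) for r in legacy_mes + legacy_cmms]

# A question neither legacy system could answer alone:
# total recorded downtime on machine M-01 across both sources.
total_downtime = sum(f["minutes"] for f in warehouse
                     if f["machine"] == "M-01")
```

The legacy systems keep running untouched; the value comes from the normalized copy that can finally be queried across sources.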
As in production, this continuous improvement effort develops skills that make the organization a savvier buyer when actually replacing the legacy systems or designing systems for new plants/lines.
Leverage Existing Talent
As with production lines, part of the talent needed to pursue continuous improvement in IT and process control is already present in most manufacturing organizations, in employees who like to tinker with software. Left on their own, they are the ones producing the problematic Excel spreadsheets discussed above.
Additional talent is needed to provide the tinkerers with better tools, train them, and organize their efforts so that their contributions coalesce into a coherent system. Recruiting this talent may be the most difficult challenge. As Immelt points out, manufacturing companies “aren’t considered employers of choice by software engineers.”
The Industry 4.0 literature is long on technology but short on management enablers.
Excellent description of the current state of the industry, Michel.
Just having returned from #HannoverMesse, I 100% agree. In the end, it is a people business in which technology can play A (significant) role but certainly NOT THE role. That is too often overlooked by top- and mid-level management, which is incentivized by short-term KPIs.
“Whenever, in operations, they try to do anything innovative with data, the IT department is in the way, blocking instead of supporting them. ”
Very true. However, I feel the new trend is even worse. IT departments — hearing that the new buzz in their field is IoT, Industry 4.0, Business Intelligence, etc. — try to “help” by applying these concepts to a process with a shaky foundation to start with, often making things worse by putting significant effort into an activity that shouldn’t exist in the first place.
Hi Michel,
Congratulations on your article, it is very inspiring and you are a great thinker. I agree with you: flooding factories with new technology does not improve their performance. The best way to face the “digital transformation” and “Industry 4.0” is by first applying LEAN Manufacturing, using a bottom-up approach. In my book I mention this idea in the chapter “The role of LEAN in the Factory of the Future” (“The Goal is Industry 4.0: Technologies and Trends of the Fourth Industrial Revolution”, Amazon books). Applying new technologies without taking into account “what happens when the guy picks up the wrench” is to build, as you rightly say, a carriage without the horse. But it is also true that horse carriages were later replaced by motor vehicles. That is what will happen in industry in the coming years, and far from causing fear or uncertainty in us, it has to cause excitement and motivation in us: We should go for it with all our strength. Thanks! Fran Yáñez
Without question, the idea/concept of Industry 4.0 has become a real hodgepodge of technologies and “digital solution” offerings; one that is being very widely touted under the hyperbolized banner of “DIGITAL TRANSFORMATION.”
Not unlike what has happened with other mega-business-trends of the past, whether they be technology-based, managerial-based, or sociologically-based, they tend to accrete more and more content as more and more potential players jump on the bandwagon with their own interpretations.
Yes, as Michel points out in his article, the original notion of Industry 4.0 had its origins in Germany as a national program focused on modernization of its industrial base by making greater use of available computer technologies.
But the birth of that notion/terminology took place back in 2011; and over the 8-year period since then, the notion has taken on what can be likened to a steroid-enhanced physique. How so, you might ask? Well, just doing a Google search for graphic representations of I4.0 will bring up a plethora of depictions, some of which appear to be compendiums of computer-based technologies… many of which date back decades.
Interestingly, I just attended a small vendor gathering that was promoted under the guise of introducing/explaining Industry 4.0 to the attendees. And the opening presenter utilized a graphic very similar to one shown below…
And rather than depict an eye-straining list of individual technologies, it does a fairly nice job of categorizing those considered to be in play when it comes to building new and/or enhancing existing capabilities that are in some way, shape, or form related to an organization’s end-to-end supply-chain operations.
Needless to say, that’s an extremely wide range of technologies to consider. That said, there’s no reason why any organization should not be looking at its operations and asking the question as to whether or not some available technology might be leveraged in ways that could enhance the overall performance of the SYSTEM (aka the enterprise as a whole).
And from my POV (which includes decades of work with state-of-the-market computer-based technologies) that famous adage coined by Michael Hammer at the birth of the reengineering revolution… “TECHNOLOGY SHOULD NEVER BE USED IN AN ATTEMPT TO PAVE OVER THE COW PATHS”… still holds true today. In other words, pursuing “better sameness” with the addition of expensive (and often complex) technologies (à la GM and Tesla) is NOT an advisable course of action.
On the other hand, there appears to be a significant amount of opportunity that many companies might pursue in those mission-critical areas of their operations where they could gain some advantage through an increase in speed and/or flexibility (i.e., NPPD/I, flexible delivery, product/service customization, field service/repair, order processing and fulfillment, etc.).
In this regard, I see significant SYSTEM performance improvement potential through the proper application of technologies such as:
1) integrated CAD/CAE/CAM technologies that make use of additive and subtractive manufacturing techniques (i.e., to accelerate new product and process development, testing, and implementation),
2) ER (Extended Reality) systems that can provide an operator and/or service technician with real-time information/instructions on how to go about performing some production, maintenance, or repair function (i.e., using Augmented Reality goggles or a phone or tablet to see an overlay of information on some physical piece of equipment),
3) AI-based or enhanced order/offering configuration and pricing systems that are capable of rapidly responding to customer product/service inquiries such that multiple alternatives/options can be quickly assessed and then translated into accurate and complete orders for direct submission to the order fulfillment operations/process.
In this picture, I wonder why IoT is linked to “Big Data and Analytics” and not “Cybersecurity,” or what the different colors mean. My suspicion is that the links and colors are just decoration. I think of meaningless graphic elements as pollution.
Also, why are “Big Data” and “Analytics” bundled? Big Data is a term from e-commerce, where sites like Amazon or eBay accumulate terabytes of data every day, while manufacturing databases, including years of history, are at most in the hundreds of gigabytes.
Of course, if you polled hundreds of sensors every millisecond and recorded on video everything that happens in a factory 24×7, you would end up with Big Data. Just because you can generate Big Data doesn’t mean you should.
If you do, you will need farms of servers and tools like Google’s MapReduce or Apache’s Hadoop just to deal with the volume, prior to doing any Analytics. And analytical tools can be applied to small, medium, or large datasets.
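For scale, here is a plain-Python sketch (the readings are invented): a defect-rate summary by machine over a year of hourly data is a few thousand rows, and needs nothing beyond the standard library, let alone a server farm.

```python
from collections import defaultdict

# Hypothetical (machine, defect_rate) readings. A year of hourly
# readings per machine is ~8,760 rows -- nowhere near Big Data.
readings = [
    ("M-01", 0.012), ("M-01", 0.015), ("M-02", 0.041),
    ("M-02", 0.038), ("M-01", 0.011), ("M-02", 0.044),
]

# Group the readings by machine...
by_machine = defaultdict(list)
for machine, defect_rate in readings:
    by_machine[machine].append(defect_rate)

# ...and compute the mean defect rate per machine, in memory.
mean_rate = {m: sum(v) / len(v) for m, v in by_machine.items()}
```

The same analytical question on the same data does not change because a vendor relabels it “Big Data Analytics.”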
I know what you are saying because I was up to my neck in CIM for most of the 1980s. It wasn’t my choice but I went along with it and did my best to make it work. While what’s happening today looks similar, we need to remember that history does not repeat itself; it stutters.
In 1981, relational databases were the state of the art but so clunky and slow that we gave up on using them for shop-floor transaction processing. Now they are everywhere and their performance is not an issue.
What is an issue today is that the relational model isn’t as great as we thought it was before we could actually use it. That has led to the appearance of new types like NoSQL (“Not only SQL”) and graph databases. And then you have in-memory databases for status data, and data warehouses or data lakes for history pulled from multiple sources…
The robots of the 1980s were dangerous beasts kept in cages. Today, some are cobots and can work side-by-side with people.
The technology has changed; the people, not so much. In the 1980s, you had computer scientists who knew nothing about manufacturing and manufacturers who had no notion of what software or automation could do. Now, you have new generations of both.
Again, the graphic is – IMO – a good example of what’s been going on around the topic/notion of Industry 4.0. Many – if not all – of the bandwagoneers are picking up on the notion in any way they see fit and translating the notion into whatever context might best fit their interest/intent.
Fortunately, when I was exposed to the graphic for the first time, it had a somewhat meaningful context which I have not shared. In essence, the graphic was used as a “talking” tool that allowed the various vendors to fold their respective technologies – by category – into a discussion or overview of Industry 4.0 as they tend to view it.
Because I did not develop the graphic, and because my exposure to it had a limited context, I cannot provide any answers to the valid questions you’ve posed. My only response is to say that this sort of thing has happened many times in the past, with the advent of newly hyperbolized trends such as this one, and the marketplace in which much of the information pertaining to the trend appears is a bit like the old Wild West; where there was little to no law and order (i.e., standardization and/or agreement over what’s on and what’s not and why).
I can tell you that one of the vendors who presented their offering in the context of Industry 4.0/Big Data Analytics happened to be Siemens. Based on what was presented, it was clear that Siemens believes many companies can benefit from instituting Big Data Analytics capabilities.
I did not have time to explore their offering in detail; but I’m sure many of the issues you’ve raised relative to this aspect should be part of the overall calculus that goes into assessing the appropriateness of the technology. In keeping with Michel’s focus above on the Big Data and Analytics portion of the Industry 4.0 equation, the following is a just-released IW article describing a presentation by the CEO (Lou Rassey) of Fast Radius, an additive manufacturing solutions provider.
The title of Jill Jusko’s article is: Data Is the Heart of Smarter Manufacturing. And the content of her article focuses on what Rassey had to say about the need for a paradigm shift that he believes manufacturers MUST embrace.
And that shift is toward viewing and treating data as an asset… whatever that might mean and in whatever context might be implied. [Remember: DATA without a CONTEXT is MEANINGLESS.]
Here’s the link to the article for anyone interested in reading it…
It’s definitely important to separate out the bandwagon and hype to get down to the nitty-gritty of real research and applications (and what’s been going on for the last 10 years). There are pretty cool advancements, especially inside manufacturing.
Dig a little more into Siemens, they’ve worked some interesting stuff inside the model based enterprise, decent presentations and work on integration. You may find it interesting.
I’ll take digitization, Industry 4.0, however anyone wants to generalize or categorize it. It’s all in how it’s utilized, and the smart folks can identify hype over real application. It’s been a long time coming, and we are just in the infancy stage.
I definitely applaud your enthusiasm and optimistic outlook for Industry 4.0… If you’re able to stick with this trend long enough, I have a sneaking suspicion that eventually, you’re going to need every bit of that enthusiasm and optimism you can muster.
Why so, you may ask? Well, based on past experience, I’ve been through a very similar mega-trend back in the ’80s and ’90s when the big computer-based technologies were first coming into vogue. That push involved the likes of CAD/CAE/CAM/CIM/CIE, DNC/CNC machining, intra- and interplant networking, relational databases, robotics/automation, lights-out manufacturing, and the still-pervasive ERPs. Interestingly, that’s all the cool stuff that was happening in the manufacturing arena before the off-shoring trend took hold.
Need I mention that the number of vendors participating in this arena back then was through the roof.
Also, believe it or not, stereolithography (a form of additive manufacturing) was just coming into generalized use back then. And – at that time – I was of the opinion that it was one of the leading-edge technologies that held great promise in helping to accelerate the NPD process… I’m still waiting to see that promise come to fruition.
I like that thought about history stuttering… After all, an actual repeat of history would be a violation of physics; that is, the arrow of time only travels in one direction. Ergo, history can never actually repeat itself.
So, when it comes to stuttering, a trend such as Industry 4.0 is more than likely to move forward in fits and starts – and maybe even require taking a step backward in some areas. I can live with that notion.
In fact, it’s that stuttering quality that makes these trends so frustrating at times. One can never really know for sure what’s most likely to take hold, what’s not, and when. Ergo, what you’ve highlighted relative to the current use of relational databases and robots is a good example of how an early vision is transformed by actual experience in utilizing a particular technology and by how the technology evolves over time, thereby changing its “usability” profile.
Another good example is how the notions of data warehouses and data marts have given way to data lakes, data ponds, and even data swamps. What’s actually happening goes beyond merely changing the terminology; there are also changes taking place, and likely to continue, in the general surrounding technology “ecosystem,” which ultimately impacts/alters the way a particular technology (i.e., databases and specialized storage configurations) can be, is being, and might someday be utilized.
Automation in general and robotics in particular make for a great example of where this combined specific and general milieu interaction is taking place, currently across a wide range of industries and applications. In other words, the future is typically so dynamic that it’s extremely difficult to prognosticate about it with any degree of accuracy, other than to predict that computer-based technologies are likely to continue to evolve at a very fast pace. Doing otherwise, such as trying to predict in which direction(s) and to what extent a technology will evolve, is more of a crapshoot over the longer term than a reliable piece of information or knowledge.
But as we all (should) know, marketing information [aka hype] is not always about providing an accurate and reliable picture of reality.
Bottom line: if people are the slowest-changing element within the overall technology ecosystem, the ability to apply technology in more efficient and effective ways going forward still requires/demands a way of THINKING AND BEHAVING that is NOT the result of genetic evolution, but rather the result of good ol’ hands-on learning and experimentation.
This is an excellent post. Sooooo informative.
However, with all this Industry 4.0 “hype” (if I may say so) and its immense capabilities, I am still fumbling for an answer on how to correlate the ground realities with the worth of such advanced tools. I mean, are we really jumping the gun?
In India, not even 5 percent of businesses are yet thinking about this, though we as consultants are almost constantly showcasing its worth and immense capabilities. But the job is to bridge the gap, rather than offer end-of-the-road solutions, especially in smaller industries.
Let’s debate this, friends…
Clearly, Industry 4.0 [and 5.0] is a fad – however, it is also an agenda with government and industry support. Thus, fad or not, the agenda has had real impact on industry in Europe for several years now.
The distinction between Industries 1, 2, and 3 and Industry 4 is meaningful in the context of ambitions to modernize and develop more sophisticated technologies that can be better integrated and interact with each other.
This is also why concepts such as Big Data and IoT are often mentioned: we are not just talking about more automation but also meta-level integration and interaction in support of better-quality coordination of different technologies, and so on [simple version].
The difference between 4.0 and 5.0 is in the agenda and purpose. 4.0 is a continuation of efforts to develop more sophisticated and coordinated technologies [with the help of AI, etc.] and to replace employees at all levels.
Robots are seen as tools to be used in organizations to support their activities *instead of* employees. The 5.0 agenda deviates mainly in that AI and other sophisticated technologies, such as [collaborative] robots, are developed and implemented with the purpose of *supporting* employees and customers in their activities as *partners*, not replacing them.
My understanding of the 4.0 or 5.0 (whatever) concept is very different from an “improvement.” Although many of the same elements are involved, the full-automation concept cannot be achieved by taking the traditional approach and improving it.
This has to be designed in. I think Elon Musk simply did not have enough time and resources, and, most importantly, people, to fully realize the concept. As funny as it sounds, given the fantastic achievements of Tesla, he could not be innovative enough in the business context of trying to change one of the most conservative industries.
Also, the application of software-engineering concepts to the manufacturing of cyber-physical systems may have specific limitations. My bet is that Elon has a small group of people taking on the car manufacturing of the future… which we have not seen yet 🙂
April 2, 2019 @ 11:41 pm
Excellent description of the current state of the industry, Michel.
Just having returned from #HannoverMesse I 100% agree. In the end it is a people business in which technology can play A (significant) role but certainly NOT THE role. That is too often overlooked by top- and mid-level management, which is incentivized by short term KPIs.
April 5, 2019 @ 10:18 am
“Whenever, in operations, they try to do anything innovative with data, the IT department is in the way, blocking instead of supporting them. ”
Very true. However, I feel the new trend is even worse. IT departments, hearing that the new buzz in their field is IoT, Industry 4.0, Business Intelligence, etc., try to “help” by applying these concepts to processes that have shaky foundations to start with, often making things worse by putting significant effort into an activity that shouldn’t exist in the first place.
April 6, 2019 @ 2:55 am
Hi Michel,
Congratulations on your article; it is very inspiring and you are a great thinker. I agree with you: flooding factories with new technology does not improve their performance. The best way to face the “digital transformation” and “Industry 4.0” is to first apply Lean Manufacturing, using a bottom-up approach. In my book, I discuss this idea in the chapter “The role of LEAN in the Factory of the Future” (“The Goal is Industry 4.0: Technologies and Trends of the Fourth Industrial Revolution,” Amazon books).

Applying new technologies without taking into account “what happens when the guy picks up the wrench” is to build, as you rightly say, a carriage without the horse. But it is also true that horse carriages were later replaced by motor vehicles. That is what will happen in industry in the coming years, and far from causing fear or uncertainty, it should spark excitement and motivation in us: we should go for it with all our strength. Thanks! Fran Yáñez
April 8, 2019 @ 8:56 am
What is the position of the Australian government? Is it involved in any way? Does it promote Industry 4.0?
April 8, 2019 @ 8:58 am
I am not suggesting anything. I am asking a question because I don’t know the answer.
April 8, 2019 @ 9:00 am
In this picture, I wonder why IoT is linked to “Big Data and Analytics” and not “Cybersecurity,” or what the different colors mean. My suspicion is that the links and colors are just decoration. I think of meaningless graphic elements as pollution.
Also, why are “Big Data” and “Analytics” bundled? Big Data is a term from e-commerce, where sites like Amazon or eBay accumulate terabytes of data every day, while manufacturing databases, even with years of history, are at most in the hundreds of gigabytes.
Of course, if you polled hundreds of sensors every millisecond and recorded on video everything that happens in a factory 24×7, you would end up with Big Data. Just because you can generate Big Data doesn’t mean you should.
If you do, you will need farms of servers and tools like Google’s MapReduce or Apache’s Hadoop just to deal with the volume, prior to doing any Analytics. And analytical tools can be applied to small, medium, or large datasets.
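The arithmetic behind this is easy to sketch. The figures below (sensor count, sampling rate, sample size) are illustrative assumptions, not numbers from any specific plant:

```python
# Back-of-envelope arithmetic: data volume from millisecond sensor polling.
# Assumed figures: 500 sensors, each producing one 8-byte reading per millisecond.
SENSORS = 500
SAMPLES_PER_SECOND = 1_000           # one reading per millisecond
BYTES_PER_READING = 8                # e.g., one double-precision value
SECONDS_PER_DAY = 24 * 60 * 60

bytes_per_second = SENSORS * SAMPLES_PER_SECOND * BYTES_PER_READING
bytes_per_day = bytes_per_second * SECONDS_PER_DAY

print(f"{bytes_per_second / 1e6:.0f} MB/s")   # 4 MB/s
print(f"{bytes_per_day / 1e9:.0f} GB/day")    # 346 GB/day
```

At roughly 4 MB/s, a single plant would accumulate terabytes within a week or two, which is the scale at which distributed tooling like Hadoop starts to matter; sampled at sane rates instead, the same plant stays comfortably in ordinary-database territory.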
April 8, 2019 @ 9:04 am
I know what you are saying because I was up to my neck in CIM for most of the 1980s. It wasn’t my choice, but I went along with it and did my best to make it work. While what’s happening today looks similar, we need to remember that history does not repeat itself; it stutters.
In 1981, relational databases were the state of the art but so clunky and slow that we gave up on using them for shop-floor transaction processing. Now they are everywhere and their performance is not an issue.
What is an issue today is that the relational model isn’t as great as we thought it was before we could actually use it. That has led to the appearance of new types like NoSQL (“Not only SQL”) and graph databases. And then you have in-memory databases for status data, and data warehouses or data lakes for history pulled from multiple sources…
The robots of the 1980s were dangerous beasts kept in cages. Today, some are cobots and can work side-by-side with people.
The technology has changed; the people, not so much. In the 1980s, you had computer scientists who knew nothing about manufacturing and manufacturers who had no notion of what software or automation could do. Now, you have new generations of both.