Sensor Fusion, PLCs, and Low-Power Components Enable Innovations in Manufacturing 4.0


03 May, 2023


Article #1 of Transforming Industrial Manufacturing with Industry 4.0 Series: Advancements in less-glamorized technologies like sensing, Programmable Logic Controllers, low-power components, and vision systems have played important roles in the rapid progression of Manufacturing 4.0.

This is the first article in a 7-part series featuring articles on Transforming Industrial Manufacturing with Industry 4.0. The series looks at technological developments and emerging trends in the manufacturing industry that drive growth and innovation. This series is sponsored by Mouser Electronics. Through their sponsorship, Mouser Electronics shares its passion and support for engineering advancements that enable a smarter, cleaner, safer manufacturing future.

Like other areas of the 4.0 era, Manufacturing 4.0 is all about using data and connectivity to make processes efficient and lean, enabling intelligent systems to make the right decisions. While Industry 4.0 is a broad concept that encompasses various areas, Manufacturing 4.0 is focused on optimizing the manufacturing process from the design and planning stages to the delivery of the final product. Many technologies have enabled such growth and capabilities. Among them, artificial intelligence, machine learning, big data, cloud computing, and augmented reality tend to monopolize the spotlight; however, a few other supporting technologies are vital in accomplishing these goals. 

This article explores the role of sensors, programmable logic controllers (PLCs), low-power components, and vision systems as vital—albeit under-recognized—technologies that are helping to advance Manufacturing 4.0.

Sensors and Sensor Fusion

In addition to capturing data used for insights and decision-making, sensors capture the data needed to move a product through the manufacturing process. This information can be used to identify bottlenecks, streamline operations, and ensure that products meet quality and safety standards. Here are some examples of sensor applications in Manufacturing 4.0:

  • Positional sensors measure mechanical position, such as whether a cylinder has returned to its home position before it actuates on the next product on the assembly line.

  • Presence-detection sensors are similar; an optical presence-detection sensor emits a light or laser beam that is detected continuously unless something blocks its path, indicating ‘presence’ or ‘no presence’ of an object in its view.

  • Size-detection sensor data can be used for quality control and to determine whether an assembly can safely move through equipment down the line.

  • Contact sensors might be used to sense whether a compartment door is open or closed, or to trigger a hard stop that prevents equipment damage; both have safety implications.

  • Vibration sensors are often used to determine equipment health; for example, rising vibration from a servo motor indicates that parts are wearing out. From this data, maintenance needs can be predicted before problems occur; a simple threshold check along these lines is sketched after this list.
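As a concrete illustration of the vibration example, the short Python sketch below flags a servo motor for maintenance when the root-mean-square of a window of accelerometer readings drifts well above a healthy baseline. The baseline, alert factor, and sample values are hypothetical and would come from commissioning data for the actual motor.

```python
import math

# Hypothetical values: a real baseline and alert factor would come from
# commissioning data for the specific servo motor being monitored.
BASELINE_RMS_G = 0.12   # RMS vibration (in g) recorded when the motor was healthy
ALERT_FACTOR = 1.5      # flag maintenance once RMS exceeds 150% of the baseline

def rms(samples):
    """Root-mean-square of a window of accelerometer readings."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def needs_maintenance(samples):
    """Return True when the vibration level suggests parts are wearing out."""
    return rms(samples) > BASELINE_RMS_G * ALERT_FACTOR

# Example window of readings from a vibration sensor mounted on the servo motor
window = [0.10, 0.21, -0.18, 0.25, -0.22, 0.19, -0.20, 0.23]
if needs_maintenance(window):
    print("Schedule maintenance: vibration RMS is above the alert threshold")
```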

Without sensors, manufacturing automation simply could not happen. Take, for example, a press that applies a 45,000 kg force onto another surface: you must know that the area is clear before it closes. Sensors are the digital eyes, ears, nose, and fingers of automation that remove guesswork and assumptions, leading to much safer, more consistent, and more efficient conditions. What's more, they require rugged designs that may need to withstand heat, moisture, oil, dust, and various other harsh conditions.

From a data-collection standpoint, sensors are the gateway to the insights you seek, as they provide the raw data used to tell the story of what's happening along the production line. Many manufacturers are still in the process of retrofitting their legacy equipment; here, there's a lot of effort going into developing data access points that can tie into older PLCs without affecting their functionality. Others have streams of data coming in from multiple sources and struggle to figure out how to turn them into useful insights.

Sensor fusion is the process of integrating data from multiple sensors to improve the accuracy, reliability, and robustness of the measurements. It has advanced the quality and type of data that can be collected and the certainty of insights derived from the data. Sensor fusion algorithms use advanced mathematical techniques, such as Kalman filtering, particle filtering, and Bayesian inference, to merge the sensor data and estimate the state of the system being monitored. Sensor fusion is widely used in various applications to improve situational awareness, navigation, and control. For example, you might use a laser sensor to detect a part's height and a vision sensor to confirm the measurement.
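To make the laser-plus-vision height example concrete, the sketch below combines the two readings by weighting each with the inverse of its variance, which is the simplest Bayesian (static Kalman-style) way to fuse two independent noisy measurements of the same quantity. The readings and variances are hypothetical.

```python
def fuse_two_sensors(z1, var1, z2, var2):
    """Inverse-variance weighted fusion of two measurements of the same quantity.
    This is the closed-form Bayesian (static Kalman-style) update for two
    independent Gaussian readings."""
    fused = (z1 / var1 + z2 / var2) / (1.0 / var1 + 1.0 / var2)
    fused_var = 1.0 / (1.0 / var1 + 1.0 / var2)
    return fused, fused_var

# Hypothetical readings of a part's height (mm): the laser is precise, the vision
# system noisier, so the fused estimate leans toward the laser measurement.
laser_height, laser_var = 25.3, 0.01
vision_height, vision_var = 25.6, 0.25

height, height_var = fuse_two_sensors(laser_height, laser_var, vision_height, vision_var)
print(f"Fused height: {height:.2f} mm (variance {height_var:.4f})")
```

The fused estimate carries a lower variance than either sensor alone, which is the practical payoff of fusion over simple redundancy.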

In many cases, using two or more different sensors provides redundancy in gathering data; sensor fusion, however, enables you to combine data based on the various sensors' strengths. Still, the challenge is understanding how to apply today's tremendous sensor technology to gain useful insights.

Programmable Logic Controllers

When collecting information from machines, the communication is not between sensors but from a sensor to a PLC along the manufacturing network. PLCs are ruggedized, solid-state industrial computers used to control manufacturing processes. They are the brains of manufacturing, where logic and process information are stored and where network communication begins.

The main function of PLCs is to receive inputs, make real-time, logic-based decisions, and send operating instructions through the outputs, ultimately determining the order of operations in complex processes. PLC input can come from switches, sensors, vision systems, and other sources, while output destinations can include sirens, relays, indicator lights, cylinders, solenoids, analog outputs, robots, and even other controllers. PLCs also ensure that the correct inputs are received. For instance, a machine might have a big red button that initiates the conveyor belt; however, the PLC checks that the power-on signal is present, and perhaps that a safety feature is engaged, before the conveyor belt can be turned on.
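The conveyor example can be sketched as a single PLC scan cycle. A real controller would express this in ladder logic or IEC 61131-3 structured text; the Python below only illustrates the interlock logic, and the input/output names are hypothetical.

```python
def plc_scan(inputs):
    """One scan cycle of the conveyor interlock described above. The I/O names are
    hypothetical digital points; a real PLC would express the same logic in ladder
    logic or IEC 61131-3 structured text."""
    start_requested = inputs["start_button"]       # operator pressed the start button
    power_ok = inputs["power_on_signal"]           # machine power confirmed
    guard_closed = inputs["safety_guard_closed"]   # safety feature engaged
    area_clear = inputs["presence_sensor_clear"]   # presence sensor sees no obstruction

    safe_to_run = power_ok and guard_closed and area_clear
    return {
        # The conveyor runs only when every interlock condition is satisfied
        "conveyor_run": start_requested and safe_to_run,
        # A fault lamp warns the operator when a start request is blocked
        "fault_lamp": start_requested and not safe_to_run,
    }

# Guard open: the start request is refused and the fault lamp turns on
print(plc_scan({
    "start_button": True,
    "power_on_signal": True,
    "safety_guard_closed": False,
    "presence_sensor_clear": True,
}))
```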

PLCs are also helping improve manufacturing efficiency by providing operational data for closed-loop digital twins (CLDTs). CLDTs are virtual models that are used to simulate and optimize the performance of physical systems in real time. They are an advanced form of digital twin that incorporates closed-loop control and feedback mechanisms, allowing them to interact with the physical system they represent. The idea of CLDTs is to use a virtual model that, ideally, accounts for all systems and variables that affect production efficiency. Here, the PLCs provide historical and real-time I/O data that, along with data from other systems, can be used to fine-tune machine settings, staffing, material storage, and other operational aspects. CLDTs can be implemented at any scale, from a single piece of equipment to an entire production line or an entire manufacturing operation.
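As a loose sketch of the closed-loop idea, the snippet below compares cycle times streamed from a PLC against the value the twin expects and nudges a hypothetical feed-rate setpoint accordingly. Real CLDTs model far more variables; this only illustrates the feedback structure.

```python
# Hypothetical numbers: the expected cycle time and the adjustment gain would come
# from the virtual model and process-engineering limits.
EXPECTED_CYCLE_S = 12.0   # cycle time the digital twin predicts for current settings
ADJUST_GAIN = 0.1         # how aggressively to correct the setpoint per unit of error

def closed_loop_update(measured_cycle_s, feed_rate_setpoint):
    """Nudge a feed-rate setpoint based on the gap between the measured cycle time
    (taken from PLC I/O history) and the cycle time the twin expects."""
    error = measured_cycle_s - EXPECTED_CYCLE_S
    # Slow cycles (positive error) raise the feed rate slightly, and vice versa
    return feed_rate_setpoint * (1 + ADJUST_GAIN * error / EXPECTED_CYCLE_S)

setpoint = 100.0  # hypothetical feed rate, in percent of nominal
for measured in [12.4, 12.6, 12.1]:  # cycle times streamed from the PLC
    setpoint = closed_loop_update(measured, setpoint)
    print(f"measured {measured:.1f} s -> new feed-rate setpoint {setpoint:.1f}%")
```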

In edge-enabled environments, virtualized PLCs are gaining momentum, eliminating the need for physical PLCs and enclosures. Otherwise, PLCs for an entire manufacturing floor can be housed in a single case or designed modularly, grouped by function (power, processing, I/O selection, etc.). Modular PLCs have the advantage of easier maintenance because they're not tied to other systems.

Whether physical or virtualized, PLCs are the masterminds of manufacturing processes.

Low-Power Components and Subsystems

Also relevant among under-appreciated technologies is the use of low-power components in every electronic device, electrical device, and subsystem, including PLCs. Low-power design exists at the lowest level of components such as transistors, resistors, printed circuit boards (PCBs), field-programmable gate arrays (FPGAs), etc. And as far as enabling Manufacturing 4.0 is concerned, low power is a big enabler because it significantly reduces the size of components. Without advancements in low-power components, an electro-mechanical system might still need a whole box of relays that could be as big as a wall in your house.

As components and subsystems have gotten smaller, they've also gotten better, producing less heat and running more efficiently. These improvements have enabled designers to build out complex machinery and processes without taking up an entire manufacturing facility. And, of course, smaller size has also resulted in lower costs. We're now able to apply various technologies and methods that have long been available but were cost-prohibitive in the past. In a 2016 interview at the International Astronautical Congress (IAC) in Guadalajara, Mexico, Elon Musk estimated a cost of $140,000 per ton to send cargo to Mars.[1] The point is that even when the technology to embark on such a mission exists, the cost can make financing such an endeavor impractical.

Vision Systems

Robot vision systems enable robots to perceive and interpret the surrounding environment using cameras, sensors, and other imaging devices. These systems utilize various techniques such as image processing, machine learning, and computer vision algorithms to capture and analyze visual data, allowing the robot to make decisions and perform tasks autonomously or with human assistance. Applications of vision systems generally fall into one of the following four categories:

  • Guidance: Guidance involves using robot vision systems to locate the position and orientation of a part or object to guide the robot's movement or operation. For example, in an industrial assembly line, a robot may use a vision system to locate the position of a component and align itself accordingly for assembly.

  • Identification: Robot vision systems can be used to identify objects, parts, or other items based on visual characteristics such as shape, color, or texture. This can include tasks like recognizing bar codes, sorting inventory, or identifying parts for manufacturing.

  • Gauging: Gauging involves using robot vision systems to measure the distance between two points or the dimensions of an object and to determine whether the measurement meets specifications. It can include tasks like checking the size or position of a part during manufacturing or inspecting a product for quality control (a minimal pixel-based gauging check is sketched after this list).

  • Inspecting: Vision systems can be used to detect defects, abnormalities, or other issues that may affect the quality or functionality of a product or part. This can include tasks like detecting cracks or imperfections in materials or inspecting products for damage or defects before they are shipped.
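As a minimal sketch of the gauging case, the snippet below converts a width measured in pixels between two detected edges into millimetres using a calibration factor and checks it against a tolerance. The calibration factor, nominal dimension, and tolerance are hypothetical.

```python
MM_PER_PIXEL = 0.05  # hypothetical calibration factor obtained from a known target

def gauge_width(edge_left_px, edge_right_px, nominal_mm, tol_mm):
    """Convert a pixel distance between two detected edges into millimetres and
    check it against the drawing specification."""
    width_mm = (edge_right_px - edge_left_px) * MM_PER_PIXEL
    return width_mm, abs(width_mm - nominal_mm) <= tol_mm

# Hypothetical edge positions found by the vision system, checked against a
# 13.8 mm nominal width with a +/-0.2 mm tolerance
width, in_spec = gauge_width(edge_left_px=112, edge_right_px=388, nominal_mm=13.8, tol_mm=0.2)
print(f"Measured width: {width:.2f} mm -> {'in spec' if in_spec else 'out of spec'}")
```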

In manufacturing environments, vision systems can be trained to recognize objects that can be measured, counted, decoded, or positioned. As with other machine learning applications, training requires large datasets in which characteristics of shape, size, orientation, edges, patterns, colors, and the like are labeled. 

For example, in training a system to identify defects in a finned tube, the system might be trained to identify tubes of a specified length and circumference with strip fins (not wire fins) that are plain (not serrated) with welds placed 0.14" apart. The trained system stores an image, a collection of pixels in a distinct formation, that is used as the basis for comparison.

In use, vision systems deliver pass/fail results. Continuing the example, the system's camera would acquire an image of the finned tube as it completes its manufacturing journey. The reflected light, with areas of black, white, gray, and possibly color, reaches an image sensor that converts it into pixels in a distinct formation. The system then interprets the image and determines whether it matches the formation of pixels it has been trained to recognize.
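A bare-bones sketch of that pass/fail comparison, assuming the trained reference and the acquired image are small grayscale pixel grids, is shown below. Production vision systems normalize for lighting, position, and scale and use far more robust matching; this only illustrates the idea of comparing against a trained formation of pixels.

```python
def passes_inspection(acquired, reference, pixel_tol=10, match_ratio=0.98):
    """Compare an acquired grayscale image (rows of 0-255 values) against the
    trained reference formation and return a pass/fail result. Real systems
    normalize for lighting, position, and scale; this is a bare-bones check."""
    total = matched = 0
    for row_acquired, row_reference in zip(acquired, reference):
        for pixel_a, pixel_r in zip(row_acquired, row_reference):
            total += 1
            if abs(pixel_a - pixel_r) <= pixel_tol:
                matched += 1
    return (matched / total) >= match_ratio

# Hypothetical 3x4 grayscale patches around one weld on the finned tube
reference = [[ 20,  20, 200, 200],
             [ 20,  20, 200, 200],
             [ 20,  20, 200, 200]]
acquired  = [[ 22,  19, 198, 205],
             [ 21,  18, 202, 199],
             [ 90,  95, 201, 200]]  # the dark fin region reads bright: possible defect

print("PASS" if passes_inspection(acquired, reference) else "FAIL")
```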

Vision systems have advanced manufacturing in several ways, such as improving product quality, reducing waste (materials and time), reducing downtime, creating traceability and accountability, and facilitating compliance.

Conclusion

The rapid progression of Manufacturing 4.0 is largely attributed to technologies such as artificial intelligence, big data, cloud computing, and additive manufacturing. While these cutting-edge technologies often capture the headlines, a number of less-glamorized technologies have also played important roles:

  • Sensors capture data used for insights and decision-making, keeping the product moving smoothly through the manufacturing process.

  • PLCs are the brains of manufacturing; they are where programs, information, and backups are stored and how communication occurs.

  • Low-power components and subsystems enable us to apply technologies and methods that were previously cost- and/or size-prohibitive.

  • Vision systems have allowed robots to see objects, communicate critical information to other systems, and perform tasks.

Flashy technologies might steal the spotlight, but these less-glamorized technologies are also vital to Manufacturing 4.0 and beyond.

This article is based on: Why Manufacturing 4.0 Will Succeed?, a blog by Mouser Electronics. It has been substantially edited by the Wevolver team and Electrical Engineer Ravi Y Rao. It's the first article from the Transforming Industrial Manufacturing with Industry 4.0 Series. Future articles will introduce readers to some more trends and technologies transforming industrial automation.


The introductory article presented the different topics covered in the Transforming Industrial Manufacturing with Industry 4.0 Series.

The first article discusses Sensor Fusion, PLCs, Low-Power Components, and Vision Systems and their impact on the progression of Manufacturing 4.0.

The second article examines the expanding and evolving roles of systems, process, and design engineers within the design chain of bringing new industrial automation products to fruition.

The third article takes a look at the development of smart factories, their characteristics, benefits, and challenges that need to be addressed for a successful digital transformation.

The fourth article focuses on technologies like Robot Operating Systems, edge computing, and new software solutions that are improving robotics in industrial and commercial environments.

The fifth article explores some challenges in accessing information in the manufacturing sector and how AI-driven AR has the potential to overcome them.

The sixth article explains how digital twins are helping bridge the gap between design and manufacturing.

The seventh article covers how manufacturing environments are adapting to evolving customer needs and expectations.


About the sponsor: Mouser Electronics

Mouser Electronics is a worldwide leading authorized distributor of semiconductors and electronic components for over 1,200 manufacturer brands. They specialize in the rapid introduction of new products and technologies for design engineers and buyers. Their extensive product offering includes semiconductors, interconnects, passives, and electromechanical components.

References

[1] Loren Grush, ‘The biggest lingering questions about SpaceX's Mars colonization plans’, 28 Sept. 2016, The Verge, [Online], Available from: https://www.theverge.com/2016/9/28/13087110/spacex-elon-musk-mars-plan-habitat-radiation-funding-questions

More by Mario Sheppard

Mario Sheppard is a United States Navy submarine veteran who is currently the lead engineer for automation and manufacturing technology at Supernal, the e-VTOL division of Hyundai Motor Group. Over the years he’s led successful automation-robotics projects in a diverse group of industries, with mos...